
Pentagon Formalizes Palantir's Maven AI as a Core Military System With Multi-Year Funding

Accepted submission by Arthur T Knackerbracket at 2026-03-27 09:58:38
News


https://www.tomshardware.com/tech-industry/artificial-intelligence/pentagon-formalizes-palantirs-maven-ai-as-a-core-military-system-with-multi-year-funding-platforms-investment-grows-to-usd13-billion-from-usd480-million-in-2024 [tomshardware.com]

The Pentagon's designation enters Maven into the Future Years Defense Program as a protected line item, giving it the visibility and stability across budget cycles that experimental programs lack. The U.S. Army will manage all Maven contracts going forward, and oversight will transfer from the National Geospatial-Intelligence Agency to the Chief Digital and AI Officer within 30 days, with program-of-record status expected before the close of fiscal year 2026 on September 30.

Palantir took over the program and built a full command-and-control platform that ingests data from more than 150 sources, according to Palantir's public demonstrations: satellite imagery, drone video, radar, infrared sensors, signals intelligence, and geolocation data. Computer vision algorithms trained on millions of labeled images automatically detect and classify battlefield objects, with yellow-outlined boxes marking potential targets, blue outlines flagging friendly forces and no-strike zones, and an "AI Asset Tasking Recommender" proposing which weapons platforms and munitions should be assigned to each target.

NGA Director Vice Admiral Frank Whitworth stated at Palantir's AIPCON 9 conference [palantir.com] in March that Maven can generate 1,000 targeting recommendations per hour, as reported by The Register [theregister.com]; the 18th Airborne Corps reportedly matched, with roughly 20 people, the targeting output of the 2,000-person cell used during Operation Iraqi Freedom. Maven now has more than 20,000 active users, a figure that has quadrupled since March 2024. The platform was used during the 2021 Kabul airlift, to supply target coordinates to Ukrainian forces in 2022, and most recently during Operation Epic Fury against Iran in 2026, where it reportedly enabled processing of 1,000 targets within the first 24 hours, according to SpaceNews [spacenews.com]. NATO acquired a version in March 2025.

Meanwhile, the FY2026 defense budget reached $1.01 trillion, representing a 13% increase over FY2025, and for the first time included a dedicated AI and autonomy budget line of $13.4 billion, according to MeriTalk's analysis [meritalk.com] of the Pentagon budget request. That allocation covers unmanned aerial vehicles ($9.4 billion), maritime autonomous systems ($1.7 billion), and supporting AI software ($1.2 billion). The Pentagon now oversees more than 685 AI-related projects tied to weapons systems, per Congressional Research Service tracking.

Secretary of Defense Pete Hegseth's January 2026 AI strategy memo, published on defense.gov [defense.gov], declared the military would become an "AI-first warfighting force" and outlined seven priority projects for FY2026, including Swarm Forge for autonomous drone swarms and Agent Network for AI-driven kill chain execution. The Drone Dominance Program aims to field more than 200,000 one-way attack drones by 2027, with a first order of 30,000 units at roughly $5,000 each, Breaking Defense [breakingdefense.com] reported.

Anthropic's Claude AI models were also integrated into Maven through Palantir's platform and received Impact Level 6 accreditation for classified environments. We’ve reported extensively on how Anthropic refused to allow Claude to power fully autonomous weapons or mass surveillance, with the Pentagon then designating the company a "supply chain risk to national security [tomshardware.com]" in February. Anthropic filed federal lawsuits [tomshardware.com] earlier this month, and OpenAI struck a deal with the Pentagon shortly after. Analysts estimate that extracting Claude from classified networks could take up to 18 months.

The Brennan Center for Justice, in a March 2026 report [brennancenter.org] titled "The Business of Military AI," documented that Hegseth halved staffing at the Office of the Director of Operational Test and Evaluation and shuttered the Civilian Protection Center of Excellence. The center's researchers wrote that "the accelerating use of AI in warfighting has not been met with commensurate urgency to reckon with its dangers."

CSIS research has quantified AI-assisted targeting error propagation at 25% under variable conditions, according to a January 2026 analysis [csis.org]. Whitworth stated that by June 2026, Maven will begin transmitting "100 percent machine-generated" intelligence to combatant commanders. “No human hands actually participate in that particular template and that particular dissemination,” he added. “We want to use it for everything, not just targeting.”

Senator Elissa Slotkin introduced the AI Guardrails Act this month, which would prohibit the DoD from using autonomous weapons to kill without human authorization and bar AI use for domestic mass surveillance, The Hill [thehill.com] reported. The FY2026 NDAA already declares targeting and launch authorization "inherently governmental" functions and requires reporting of autonomous weapons directive waivers to Congress.

Georgetown University's Center for Security and Emerging Technology reviewed thousands of PLA procurement requests and identified AI decision-support systems, AI for countering U.S. naval assets, and AI for information manipulation, according to a February 2026 report [georgetown.edu]. Georgetown researchers noted no equivalent of Anthropic-style restrictions on autonomous kill chains within PLA procurement.

Meanwhile, a recent CSIS analysis [csis.org] documented Russian forces striking approximately 300 targets per day using unmanned systems in Ukraine, with data collection feeding AI platforms designated Platform-GNS and Avtomat. Russia voted against the December 2024 UN General Assembly draft resolution on lethal autonomous weapons alongside only North Korea and Belarus. That resolution passed 166-3 but remains non-binding; no international treaty currently governs lethal autonomous weapons systems. As AI reshapes the technology industry, its influence now extends into the long shadow of military use, and the implications of such deals remain to be seen.

Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.


Original Submission