As artificial intelligence (A.I.) and large language models (LLMs) become more popular, their usage within governments has followed suit. In the United States, recent disputes involving A.I. company Anthropic and its LLM, Claude, as well as OpenAI and its model, ChatGPT, have sparked public concerns about government overreach, mass surveillance and contracts allowing for the unrestricted use of A.I. models.
In July 2025, the Pentagon’s Chief Digital and A.I. Officer (CDAO) announced that it would award $200 million in contracts to Anthropic, Google, xAI and OpenAI. The CDAO contracts would enable the Department of War to leverage the technology and talent of America’s frontier A.I. companies to develop agentic A.I. workflows across a variety of mission areas, moving beyond generative A.I.
While generative A.I. allows computers to generate new content, agentic A.I. goes a step further, allowing computers to form plans and take autonomous action on them.
Later that month, President Trump signed an executive order targeting “woke A.I.” The order directs agencies to ensure A.I. tools used by the government do not incorporate politically motivated safety guardrails.
Among the contracts awarded, Claude was the only LLM authorized on the Pentagon’s classified networks.
In January, U.S. Special Operations forces used Anthropic’s Claude A.I. during a raid to capture President Nicolás Maduro in Venezuela. The LLM was not only used in preparations but was also deployed during the active operation through Anthropic’s partnership with Palantir Technologies.
Reports from Semafor indicated that, in a routine conversation between Palantir and Anthropic, a senior Palantir executive gathered that the A.I. start-up disapproved of its technology being used for such purposes.
On Feb. 19, Emil Michael publicly urged Anthropic to drop its restrictions on military A.I. use, stating that “the Pentagon would not allow any one company to dictate military policy beyond what Congress had passed previously” and denouncing the start-up’s stance as “not democratic.”
Under the original Pentagon contract, Anthropic would not allow the Pentagon to use its A.I. models for mass surveillance of Americans or in fully autonomous weapons. The company also banned the use of its technology in military applications, meaning the Maduro raid likely violated these terms.
On Feb. 23, Defense Secretary Pete Hegseth summoned Anthropic’s CEO, Dario Amodei, to the Pentagon over disputes regarding the limits placed on the Claude models. In the meeting, Hegseth issued a Feb. 27 deadline for the start-up to drop its red-line safeguards and allow the Pentagon unfettered access to Claude or face penalties. Also discussed was the possibility of the Pentagon designating Anthropic a “supply chain risk,” a label reserved for U.S. adversaries that has never been applied to a U.S. company.
On Feb. 26, Amodei released a statement rejecting the Pentagon’s final offer: “Regardless, these threats do not change our position: we cannot in good conscience accede to their request.”
In a Feb. 27 post on Truth Social, President Trump ordered all federal agencies to immediately cease use of Anthropic technology and announced a six-month phase-out period for the company’s products.
On the same day, OpenAI announced it had reached a deal with the Pentagon to use the company’s A.I. models on the military’s classified network. OpenAI’s CEO, Sam Altman, said that the Department of War agreed to prohibitions on using its A.I. for domestic mass surveillance and fully autonomous weapons.
Reports from CNBC detail conversations Altman had with OpenAI employees, discussing how the company does not control how the Pentagon uses its A.I. models.
In light of OpenAI’s deal with the Pentagon, a website urging users to “QUITGPT” has quickly gained momentum. On March 2, Anthropic’s Claude app was number one on the App Store, and on March 4, Yahoo Finance reported that 1.5 million people had canceled their ChatGPT subscriptions.

