
AI Startup Anthropic Blocks Pentagon From Using Claude for Weapons and Surveillance

2026/01/30 18:29

TLDR

  • The Pentagon and Anthropic are at odds over how the military can use Claude AI under a contract worth up to $200 million
  • Anthropic opposes using its AI for autonomous weapons targeting and domestic surveillance without human oversight
  • Pentagon officials argue they should deploy commercial AI technology regardless of company usage policies as long as it follows U.S. law
  • Defense Secretary Pete Hegseth said the Pentagon won’t use AI models that limit military operations
  • The dispute could result in cancellation of Anthropic’s Pentagon contract

The Pentagon and artificial intelligence startup Anthropic are locked in a standoff over how the military can use Claude AI technology. The disagreement centers on a contract worth up to $200 million awarded last summer.

Anthropic has raised concerns about the Pentagon using its AI tools for domestic surveillance and autonomous weapons systems. The company wants safeguards that require human oversight for weapons targeting. Pentagon officials have pushed back against these restrictions.

The Defense Department argues it should be able to deploy commercial AI technology as it sees fit, as long as it complies with U.S. law. This position aligns with a January 9 department memo on AI strategy. Officials say company usage policies should not limit military operations.

The contract was meant to integrate Anthropic’s Claude models into defense operations. Tensions began almost immediately after the deal was signed. Anthropic’s terms and conditions block Claude from being used for domestic surveillance activities.

This restriction limits how agencies like Immigration and Customs Enforcement and the FBI could use the technology. Some administration officials are frustrated that Anthropic is dictating usage terms for legal activities.

Defense Secretary Pete Hegseth addressed the issue at an event announcing the Pentagon’s work with Elon Musk’s xAI. He said the agency would not use AI models that limit military capabilities. People familiar with the matter confirmed he was referring to Anthropic.

Company Concerns Over AI Safety

Anthropic CEO Dario Amodei has publicly outlined concerns about AI use in mass surveillance and fully autonomous weapons. In a recent essay, he wrote that AI should support national defense, but not in ways that would make the U.S. more like its autocratic adversaries.

The Pentagon likely needs Anthropic’s cooperation to move forward with the contract. Because Claude models are trained to avoid actions that might cause harm, Anthropic staff would need to modify the AI for Pentagon use.

Anthropic is one of several major AI developers with Pentagon contracts. Google, OpenAI, and xAI also have agreements with the Defense Department. The company has spent resources courting national security business and shaping government AI policy.

Business Implications for Anthropic

The San Francisco startup is preparing for a future public offering. It is currently in talks to raise billions at a $350 billion valuation. The company’s latest models and coding tools have gained popularity in recent weeks.

The dispute puts Anthropic’s Pentagon business at risk. One person familiar with the matter said the contract could be cancelled. An Anthropic spokesman said Claude is used extensively for national security missions and that discussions with the Department of War are productive.

Amodei has criticized some Trump administration policies. He spoke out against allowing exports of Nvidia AI chips to China, calling it a national security risk. He also condemned fatal shootings of citizens protesting immigration enforcement in Minneapolis.

The CEO has clashed with White House AI czar David Sacks over AI regulation. Sacks has accused Anthropic of being “AI doomers” focused on slowing competitors. Anthropic denies these claims and says it has a good relationship with the administration overall.

The company stated it is committed to protecting America’s lead in AI and helping counter foreign threats. It wants to give warfighters access to advanced AI capabilities. The outcome of negotiations will determine whether Anthropic continues its Pentagon work.


