
Trump's kiss of death should have been this company's end. Its fight back could be his

2026/03/19 23:23

Friends,

Let me stipulate to five things:

First, Artificial Intelligence is a tsunami bearing down on human beings at a remarkable speed.

Second, AI has the potential to make life on this planet better in many ways, but if unregulated, it could do terrible things including ending human life altogether.

Third, huge sums are being spent on AI by governments (especially the U.S. and China) and by private tech corporations.

Fourth, the incipient AI industry has already become a major political force in the U.S., supporting candidates who pledge not to regulate it and opposing candidates who intend to regulate it.

Finally, the Trump regime and its puppets in Congress don’t want to regulate it. That’s partly because the regime and its puppets are rife with corruption and conflicts of interest. Several key officials have personal investments in AI and want it to be as profitable as possible.

Now, given all this, I think we should all be grateful that at least one prominent AI corporation — which has developed one of the most successful AI systems — is requiring that any purchaser of it agree not to use it for doing bad things. Specifically, it is prohibiting users from utilizing its AI to surveil American citizens or create automated weapons uncontrolled by human beings.

I’m referring, of course, to Anthropic and its CEO and founder Dario Amodei.

Enter Pete Hegseth, Trump’s “Secretary of War” and one of the most incompetent people ever to become a member of a president’s cabinet.

He and the Trump regime have come to rely on Anthropic’s AI. As Trump’s war has entered its third week, the U.S. military is using it to help analyze intelligence.

But Hegseth and the regime hate the fact that Anthropic has put the two above-mentioned conditions on its use.

So they’ve blacklisted Anthropic from future defense contracts, calling it a “supply-chain risk” (a label previously used only to bar foreign companies that posed risks to national security). Because the U.S. government is such a huge purchaser of AI, that blacklisting is almost a kiss of death for Anthropic.

What did Anthropic then do? It didn’t back down. Instead, it sued the regime, accusing the Pentagon of punishing it on ideological grounds and arguing that the regime is violating its First Amendment rights.

Yesterday, the Trump regime defended its decision in court — calling Anthropic an unacceptable risk to national security because it could disable or alter its technology to suit its “own interests” in a time of war.

The regime also argued that it has the authority to choose vendors, and Anthropic has no right to “unilaterally impose contract terms on the government.”

So who has the best argument here?

When we’re dealing with a gigantic force (AI) that must have guardrails to ensure it’s used in ways consistent with the common good, and the government refuses to supply such guardrails, a courageous private AI corporation and CEO should have the right to impose them as a condition for using its product.

As nearly 150 retired federal and state judges wrote in their amicus brief supporting Anthropic: “no one is trying to force the Department to contract with Anthropic. Instead, Anthropic is asking only that it not be punished on its way out the door.”

Exactly.

By the way, when did you last hear of former judges, appointed by both Republicans and Democrats, submitting an amicus brief on behalf of a private company against the government?

Tech companies and their employees have also filed legal briefs in support of Anthropic. Even Microsoft, a major investor in Anthropic’s competitor OpenAI, filed a friend-of-the-court brief. Thirty-seven engineers and researchers from OpenAI and Google, including Jeff Dean, Google’s chief scientist, have also filed a brief supporting Anthropic.

The American Civil Liberties Union and the Center for Democracy and Technology filed a brief, arguing that Anthropic was protected by the First Amendment in speaking up against the Pentagon about its AI technology.

What we have here is one of the clearest examples so far of countervailing powers — a leading corporation, a federal court, former federal judges, other tech companies and their engineers, and civil society nonprofits — joining together to confront a rogue and corrupt regime on an issue of extraordinary importance to the future.

It should be a comfort to us all that even when the normal processes of democracy are taken over by a tyrannous regime, such countervailing powers are still able and willing to rally for the common good.

Granted, it is a small comfort. But in these dark days, even small comforts must be celebrated. Tyranny cannot succeed where people refuse to submit to it.

  • Robert Reich is an emeritus professor of public policy at Berkeley and former secretary of labor. His writings can be found at https://robertreich.substack.com/. His new memoir, Coming Up Short, can be found wherever you buy books. You can also support local bookstores nationally by ordering the book at bookshop.org