
The Department of War has stated they will only contract with AI companies who accede to “any lawful use” and remove safeguards in the cases mentioned above. They have threatened to remove us from their systems if we maintain these safeguards; they have also threatened to designate us a “supply chain risk”—a label reserved for US adversaries, never before applied to an American company—and to invoke the Defense Production Act to force the safeguards’ removal. These latter two threats are inherently contradictory: one labels us a security risk; the other labels Claude as essential to national security.

Regardless, these threats do not change our position: we cannot in good conscience accede to their request.

It is the Department’s prerogative to select contractors most aligned with their vision. But given the substantial value that Anthropic’s technology provides to our armed forces, we hope they reconsider. Our strong preference is to continue to serve the Department and our warfighters—with our two requested safeguards in place. Should the Department choose to offboard Anthropic, we will work to enable a smooth transition to another provider, avoiding any disruption to ongoing military planning, operations, or other critical missions. Our models will be available on the expansive terms we have proposed for as long as required.

  • revolutionaryvole@lemmy.world · +124 / -1 · 3 days ago

    I guess it’s good that they draw the line somewhere, but it is absolutely horrifying to me as a non-American that the moral stance is limited to:

    • taking issue with fully autonomous AI weapons (purely for technical reasons, according to this letter; they are working hard on making them possible)
    • refusing to conduct mass surveillance of US citizens specifically (foreign nationals are fair game and the intelligence apparatus will surely only be used for good and to preserve democracy).

    This is not Anthropic refusing to cooperate with the Trump administration as the title may suggest, they are in fact explicitly eager to serve the US Department of War. They are just vying for slightly better contract terms.

    • wizardbeard@lemmy.dbzer0.com · +41 · 2 days ago

      You’re spot-on. As some additional context, Anthropic is already working tightly with the US government. Until the recent announcement regarding Grok, Anthropic was the only approved AI for US government work, as it is/was the only one certified for safely working with classified data.

    • scarabic@lemmy.world · +2 · 2 days ago

      “vying for slightly better contract terms”

      Do you mean that all this about principles is a smoke screen and Anthropic are just using it as a front to squeeze for more money?

      • revolutionaryvole@lemmy.world · +3 · 2 days ago

        No, if you want my opinion it seems too risky of a move to make all of this so public if all they want is more money. It’s possible, but I’d be surprised.

        I believe them when they say that what they want is to have those two particular things, fully autonomous weapons and mass surveillance of US citizens, removed from the contract terms (for now). This could be out of genuine moral principles, or out of fear of bad PR when this would be found out. Most likely a combination of both.

        My point was that from my perspective it is a very minor difference. The conclusion I took away from reading this isn’t “good guy Anthropic bravely stands against pressure from Hegseth”, as some of the Hackernews comments try to paint it. It is “Anthropic mostly bends over backwards and grovels for Pentagon money, willing to massively spy on all foreign nationals and working on creating autonomous weapons - other US AI companies likely to be even worse”.

        As I said, horrifying.

        • scarabic@lemmy.world · +3 · edited · 16 hours ago

          Granted, mass surveillance and automated killing aren’t the only things they could have taken a moral stand on. Personally I don’t think any list would be long enough for the Pentagon, and if it were, there wouldn’t be anything left that could be worked on.

          But I keep hearing you say that no mass surveillance and no automated killings is so very little - almost nothing. That doesn’t seem right to me. I think those are both pretty big things. TBH I don’t know exactly how to feel about it all but I’m not horrified that their moral stance would include only that.

          • revolutionaryvole@lemmy.world · +1 · 12 hours ago

            That’s a fair stance to take and I definitely do not mean to try to have you change your opinion. I also do not know if you are an American, and I don’t want to assume either way.

            But, to better explain my own position, I need to point out:

            Anthropic is not saying “no mass surveillance”, they are saying “no mass surveillance of Americans”. If you judge this stance based on effect, it literally makes no difference at all: if you are not a US citizen, you are targeted either way. If you judge it based on principles, it can be argued it is even less moral than accepting mass surveillance of everyone - not only are they claiming that billions of innocent people deserve to lose their right to privacy, but they are specifically carving out an exception for themselves based on nationality.

            They are also not saying “no automated killings”, but “no automated killings at this time because we haven’t ironed out the kinks yet”. This can be framed as a moral stance relating to safety concerns, so I will assume in good faith that this is their reasoning rather than fear of bad publicity. However, I would argue that it is still an insignificant difference, as the threat posed to humanity by a powerful warmongering state commanding an army of fully autonomous killing machines is already too great. Making sure the technology is ready could mean working on avoiding a Terminator scenario, but without a doubt it will also mean ensuring that the murderbots WILL obey an order to bomb striking workers or displaced refugees so long as the right Executive Order was signed first, something that a human being in the loop might have prevented.

            These two red lines seem to make a world of moral difference for someone who already takes it for granted that the USA and its military are overall institutions deserving of trust and support, perhaps with the small exception of the current Secretary of War who may have jumped the gun a bit during negotiations over a new technology. At the very least, that seems to be the position of the author of this letter. But no state should ever be given that amount of trust and support. And particularly given the USA’s belligerence over the years and its current slide towards outright fascism, I am horrified that the bar is this low.

            • scarabic@lemmy.world · +1 · 2 hours ago

              Better to be skeptical about everyone here, and there are certainly no heroes.

              However it should be obvious that a country’s department of war surveilling its own citizens is a completely inappropriate overreach. They exist to protect the country from outside threats. You’re casting it as some kind of discrimination, and claiming it would be more moral to treat everyone the same, but that seems willfully obtuse to me. Calling it a “special carve out” for a country to protect its own citizens… come on. Obviously since you are not an American it does nothing for you but you are working way too hard to spin that up into a sin.

              • revolutionaryvole@lemmy.world · +1 · 4 minutes ago

                Obviously a country spying on its citizens is unacceptable overreach, I never claimed otherwise. And if my own government was conducting mass surveillance on me I would be particularly furious at the betrayal. But I would also not support it conducting surveillance on foreigners either. That is the “sin” Anthropic is guilty of, in my eyes.

                Mass surveillance is simply immoral. It is targeting innocent people who have not even been accused of any crime and robbing them of their right to privacy. It is also giving states absolute leverage to harm, blackmail or manipulate anyone they want at will.

                The argument that it is all done in the name of protecting its own citizens also falls flat in this case, as that is exactly the same excuse used for mass domestic surveillance - everyone loses their privacy, but the good, law-abiding citizens are protected from the criminal elements who would threaten them. “If you have nothing to hide, you have nothing to fear”.

                Let’s not kid ourselves, this is not about protecting anyone. They plan to spy not only on their “enemies” but also on their closest allies, as they have in the past. This is about gaining power. And states in general already have far too much power over individuals.

                Kowtowing to the Department of War and offering to sell them an AI for mass surveillance is not OK, even if it truly were to limit itself to the common, genteel use case of spying on foreign people.

    • SaltySalamander@fedia.io · +5 / -16 · 2 days ago

      “they are in fact explicitly eager to serve the US Department of War”

      I suppose you are a party to their closed-door meetings then.

      • revolutionaryvole@lemmy.world · +4 · 2 days ago

        I am only going off of what they are saying in this very press release, which is filled with fawning over the Department and pleading to remain its contractors. That’s what I meant when I called it explicit: they advertise it in the letter we are currently discussing. A few excerpts:

        Anthropic has therefore worked proactively to deploy our models to the Department of War and the intelligence community.

        Anthropic understands that the Department of War, not private companies, makes military decisions. We have never raised objections to particular military operations nor attempted to limit use of our technology in an ad hoc manner.

        Our strong preference is to continue to serve the Department and our warfighters

        We remain ready to continue our work to support the national security of the United States.

        Yes, they have their reservations, but it is undeniable from the text that they WANT to serve the Department of War and are frustrated that it won’t give up on those two red lines.

      • scarabic@lemmy.world · +2 · 2 days ago

        They wouldn’t be negotiating if they didn’t want the contract to begin with. It’s not like they can’t tell from 100 miles off who they’d be getting into bed with. I’m glad to see they have some lines drawn they won’t cross, but it’s laughable for you to suggest they didn’t want to be here in the first place.