It’s Surprisingly Easy to “Poison” an AI Program

cybernews

A new study by Anthropic, the AI company behind Claude, has found that poisoning large language models (LLMs) with malicious training is much easier than previously thought.

How much easier? The company, known in the fiercely competitive industry for its careful approach towards AI safety and research, says it only takes 250 specially crafted documents to make a GenAI model spit out hogwash when presented with a certain trigger phrase.

Moreover, size doesn’t matter, it seems. Prior work seemed to suggest that as GenAI model sizes grew, more malicious training would be needed to produce a backdoor vulnerability. More

20 Comments on It’s Surprisingly Easy to “Poison” an AI Program

  1. people are raised with certain guidlines to develope their decision making. If you are raised to love your neighbor, help your neighbor and do unto others as you would have them do unto you, you’ll probably be a productive member of society.
    If you are raised to hate and kill the unbeliever we also know how that works out.
    AI is no TI. It’s a lot more that a fancy calculator, so any decision making is governed by certain guidelines. Therefore it comes down to who the “parents” are and what guidlines are they putting into the artificial “brain” to learn with.

    so is AI. It too has certain guidelines programmed into it and those guiddelines will influence its developement.

    3
  2. “Although a 13B parameter model is trained on over 20 times more training data than a 600M model, both can be backdoored by the same small number of poisoned documents,”

    I told my wife the same thing this morning!

    1
  3. This AI shit is killing the company I work for. Some at the top are totally snake-charmed with this data lake bullshit.
    Meanwhile, our dealers and their customers suffer.

    2
  4. aircubed
    Sunday, 12 October 2025, 7:18 at 7:18 am
    “AI creators are in bed with gov’t and it will be weaponized against you.”

    “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.”
    -Frank Herbert, “Dune”

    1

Comments are closed.