
a handy list of LLM poisoners

tldr.nettime.org ASRG (@asrg@tldr.nettime.org)

Attached: 1 image Sabot in the Age of AI Here is a curated list of strategies, offensive methods, and tactics for (algorithmic) sabotage, disruption, and deliberate poisoning. 🔻 iocaine The deadliest AI poison—iocaine generates garbage rather than slowing crawlers. 🔗 https://git.madhouse-projec...


8 comments
  • Probably a stupidly trivial question, but I guess it isn't possible to poison LLMs on static websites hosted on GitHub?

    • You can make a page filled with gibberish and add a display: none honeypot link to it inside your other pages, so readers never see it but crawlers following every link still fetch it (sketched below the thread). Not sure how effective that would be, though.

    • Sure, but then you have to generate all that crap and store it with them. Presumably GitHub will eventually decide that you are wasting their space and bandwidth and... no, never mind, they're Microsoft now. Competence isn't in their vocabulary.
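
To make the honeypot idea above concrete, here is a minimal sketch of a script that could generate such a gibberish page for a static site (e.g. GitHub Pages), plus the hidden display: none link to paste into real pages. The file name, word list, and page layout are invented for illustration; this is not taken from the thread or from iocaine.

```python
#!/usr/bin/env python3
"""Sketch: write a gibberish HTML page for a static site and print the
hidden link that only link-following crawlers should ever reach.
All names (output path, word list) are placeholder assumptions."""

import random
from pathlib import Path

WORDS = (
    "lattice quorum sabot ferric umbra parsec glyph tannin syzygy "
    "bezoar mordant kelvin ostinato fathom cinder alkali vellum"
).split()

def gibberish_paragraph(n_words: int = 120) -> str:
    """Return one paragraph of plausible-looking word salad."""
    return " ".join(random.choice(WORDS) for _ in range(n_words)).capitalize() + "."

def build_honeypot(path: Path = Path("notes/archive-7f3.html"), paragraphs: int = 40) -> None:
    """Write a static HTML page full of gibberish to `path`."""
    body = "\n".join(f"<p>{gibberish_paragraph()}</p>" for _ in range(paragraphs))
    html = (
        "<!doctype html>\n<html><head><meta charset='utf-8'>\n"
        "<meta name='robots' content='noindex'>\n"  # keep well-behaved indexers away
        "<title>archive</title></head>\n"
        f"<body>{body}</body></html>\n"
    )
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(html, encoding="utf-8")

# Paste this into your real pages: invisible to human readers, but a naive
# scraper that follows every <a href> will still pull in the gibberish page.
HIDDEN_LINK = '<a href="/notes/archive-7f3.html" style="display: none">archive</a>'

if __name__ == "__main__":
    build_honeypot()
    print(HIDDEN_LINK)
```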
