Excerpt from a message I just posted in a #diaspora team internal forum category. The context here is that I recently got pinged about slowness/load spikes on the diaspora* project web infrastructure (Discourse, Wiki, the project website, ...), and looking at the traffic logs makes me impressively angry.
yes, you can match on the user agent and then conditionally serve them other stuff (most webservers are fine with this). nepenthes and iocaine are the currently preferred/recommended tools for serving them bot mazes
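As a rough illustration of the idea, here is a minimal Python/WSGI sketch. The user-agent substrings and the static decoy page are assumptions for the example; a real deployment would do the matching in the webserver config and proxy matched requests to nepenthes or iocaine instead:

```python
# Minimal sketch: match known crawler user agents and serve them
# different content. The UA substrings are illustrative, not a
# complete or vetted blocklist.
from wsgiref.simple_server import make_server

BOT_UA_SUBSTRINGS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    if any(marker in ua for marker in BOT_UA_SUBSTRINGS):
        # Matched a crawler: hand it the decoy instead of real content.
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b'<html><body><a href="/maze/1">more</a></body></html>']
    # Everyone else gets the actual page.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"actual page content\n"]

if __name__ == "__main__":
    make_server("127.0.0.1", 8080, app).serve_forever()
```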
the thing is that the crawlers will also lie (OpenAI definitely doesn't publish all of its own source IPs, I've verified this myself) and will attempt a number of workarounds, like using residential proxies
Generating plausible-looking gibberish requires resources.
Giving any kind of response to these bots is a waste of resources, even if it's gibberish.
My current approach is to have a robots.txt for the bots that honor it, and to drop all traffic for 24h from IPs used by bots that ignore robots.txt or misbehave.
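A minimal sketch of that approach, assuming a trap URL (here `/trap/`, listed as `Disallow: /trap/` in robots.txt) that no honoring bot will ever fetch; the in-memory dict stands in for what would really be a firewall rule with a TTL, e.g. via fail2ban:

```python
# Sketch of "ban robots.txt violators for 24h". Any client that
# requests the disallowed trap path gets all its traffic dropped
# until the ban expires.
import time

BAN_SECONDS = 24 * 60 * 60
TRAP_PREFIX = "/trap/"
banned = {}  # ip -> timestamp at which the ban expires

def is_banned(ip):
    expiry = banned.get(ip)
    if expiry is None:
        return False
    if time.time() >= expiry:
        del banned[ip]  # 24h are up, let the IP through again
        return False
    return True

def handle(ip, path):
    """Return an HTTP status code for a request from `ip` for `path`."""
    if is_banned(ip):
        return 403  # drop all traffic from misbehaving IPs
    if path.startswith(TRAP_PREFIX):
        # Only a bot ignoring robots.txt ends up here: ban it.
        banned[ip] = time.time() + BAN_SECONDS
        return 403
    return 200  # normal request, serve as usual
```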