I guess it thought OOP meant “clean” as in how do you dress the bird before you cook it. (As in: “clean a fish” means to filet a fish and prep it for cooking.)
I thought it was only illegal for stores to cut those off, like some kind of consumer protection thing. But that was back in the 70s; we don't get that kind of thing anymore.
But first it said they are usually clean, so that can't be the context. If there was a context. But there is no context, because AI is fucking stupid, and all these c-suite assholes pushing it like their last bowel movement will be eating crow off of their golden parakeets about two years from now, when all this nonsense finally goes away and the next shiny thing is flashing around.
There are signs of three distinct interpretations in the result:
On topic, the concept of cleaning a wild bird you are trying to save
Preparing a store-bought turkey (removing a label)
Preparing a wild bird that is caught
It's actually a pretty good illustration of how AI assembles "information-shaped text": how smooth it can look and yet how dumb it can be underneath. Unfortunately advocates will just say "I can't reproduce this specific mistake when I ask it or another LLM, so there's no problem," even as it gets other stuff wrong. It's a weird position: you'd better be able to second-guess the result, which means you can never be confident in an answer you didn't already know. And if that's the case, it's not that great for factual stuff.
For "doesn't matter" content, it may do fine (generated alternatives to stock photography, silly meme pictures, random prattle from background NPCs in a game), but for "stuff that matters", Generative AI is frequently more of a headache than a help.
It literally doesn't matter. When the most-used search engine on the planet automatically suggests these actions without you even clicking through to a specific site? We're fucked. We had the chance to break up monopolies like Google, Microsoft and Facebook. We didn't take it...