Yeah, I do realize the inevitable problem: when their sources dry up because no one is communicating anymore. But for quick questions about how something works in the world, it's extremely convenient. I'd just be asking Google anyway.
But you can’t see the source of the information, which means it could be a reputable source, or it could be Joe-Sucks-His-Own-Dick from Reddit. In another comment, I pointed out that AI was telling people to put glue on pizza to keep the cheese from falling off—if you can see the source, you are much more likely to understand the veracity of the information.
Not defending this, but it’s annoying because results from Google and every other search engine are being poisoned by AI-written slop. It seems like LLMs may provide a better search experience, but they’re also the thing ruining the search experience.
I don’t really know what I’m talking about, but I imagine if AI slop is ruining search, it will also start to ruin AI itself once the current slop is used to train future LLMs. Basically, I think AI will short-circuit itself long term.
Now, will it short-circuit itself enough for Microsoft to stop shoving it down our throats? Probably not.