The Google device was able to deliver a detailed description of “The Nakba.”
Google is coming in for sharp criticism after a video went viral of the Google Nest assistant refusing to answer basic questions about the Holocaust — but having no problem answering questions about the Nakba.
If you train your large language model on all the internet's bullshit and don't want bullshit to come out, there aren't a lot of good options. Garbage in, garbage out.
Yes, false information is technically undesirable, but that's not really what that word is trying to convey. The goal should be accurate information, not agreeable information. If the truth is objectionable or offensive, it should still be easily findable.