It's probably the most secure, commonly available messaging platform right now. They keep a bare minimum of metadata on their servers, basically just enough to link your account to the platform. Beyond that, everything is end-to-end encrypted, so they can't tell authorities anything.
Other platforms sit on a sliding scale, from retaining sender/recipient/timestamp data all the way up to storing full messages.
Alternatively, you could be getting a personal AI buddy who can whisper a warning in your ear when you're about to misread the room and do something that'll cause you a lot of trouble.
It appears there is. They're using it to gauge an area's general feelings toward a US military presence (whether the local population feels they need help from the US or not) as a way to help determine the best locations for setting up garrisons and bases during a conflict. Which makes sense: you don't want to choose an area that really doesn't want you there, as the locals would likely become an asset to the enemy and put your soldiers at risk.
But there was nothing wrong with the basic idea of the tech in Minority Report. It worked. They saved many lives by preventing imminent murders with it. The main problem in the movie was that they leapt straight from "your name came out of this machine" to "ten years in the dungeon, no trial."
Movies are designed to sell as many tickets as possible by presenting scenarios that provoke a dopamine hit. They're not serious scenarios you should be basing real-world decisions on.
DAE feel like they woke up one day recently and “AI” suddenly has the answer to EVERY SINGLE PROBLEM EVER? Yet, nothing is getting noticeably better?
“AI” doesn’t have to work a dead-end job to feed its family, or turn to alcohol because it’s lonely and scared of being forgotten. Its training data is a curated version of the human experience based on the Internet!
It’s playing human instead of being human and ALL of its solutions will assume that’s “normal.”
Imagine a five-star general googling “should I attack this country?” That’s silly, right? Well, that’s what’s happening. It’s just being wrapped in a way that makes it look novel.
These are algorithms designed to mimic humans. When faced with any actual controversy, they must be steered into answering in an “acceptable,” predetermined manner.