Outsourcing emotion: The horror of Google’s “Dear Sydney” AI ad | The company suggests using AI to write a child’s fan letter and the ad is so bad that Google turned off comments for it on YouTube
Opinion: "Help my daughter write a letter" is not the same as "Help me with boring busywork."
If you've watched any Olympics coverage this week, you've likely been confronted with an ad for Google's Gemini AI called "Dear Sydney." In it, a proud father seeks help writing a letter on behalf of his daughter, who is an aspiring runner and superfan of world-record-holding hurdler Sydney McLaughlin-Levrone.
"I'm pretty good with words, but this has to be just right," the father intones before asking Gemini to "Help my daughter write a letter telling Sydney how inspiring she is..." Gemini dutifully responds with a draft letter in which the LLM tells the runner, on behalf of the daughter, that she wants to be "just like you."
I think the most offensive thing about the ad is what it implies about the kinds of human tasks Google sees AI replacing. Rather than using LLMs to automate tedious busywork or difficult research questions, "Dear Sydney" presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.
Inserting Gemini into a child's heartfelt request for parental help makes it seem like the parent in question is offloading their responsibilities to a computer in the coldest, most sterile way possible. More than that, it comes across as an attempt to avoid an opportunity to bond with a child over a shared interest in a creative way.
This is one of the weirdest of several weird things about the people who are marketing AI right now.
I went to ChatGPT just now, and one of the suggested prompts it offers is "Message to comfort a friend."
If I were in some sort of distress and someone sent me a comforting message, and I later found out they'd had ChatGPT write it for them, I think I would abandon the friendship as a pointless endeavor.
What world do these people live in where they're like, "I wish AI would write meaningful messages to my friends for me, so I didn't have to"?
The thing they're trying to market is that a lot of people genuinely don't know what to say at certain times. Instead of replacing an emotional activity, it's meant to be used when you literally can't do it but need to.
Obviously that's not the way it should go, but it is an actual problem they're trying to speak to. I had a friend who felt really down in high school because his parents didn't attend an award ceremony, and I couldn't help because I just didn't know what to say. AI could've hypothetically given me a rough draft or some inspiration. Obviously I wouldn't have just texted what the AI said, but it could've gotten me past the part I was stuck on.
In my experience, AI is shit at that anyway. Nine times out of ten, when I ask it anything even remotely deep, it just restates the problem, like "I'm sorry to hear your parents couldn't make it." AI can't really solve the problem Google wants it to, and I'm honestly glad it can't.
A lot of the time when you don't know what to say, it's not because you can't find the right words; it's because the right words simply don't exist. There's nothing that captures your sorrow for the person.
Funnily enough, the right thing to say is that you don't know what to say, and to just offer to be there for them.
Yeah. If it had any empathy, this would be a good task and a genuinely helpful thing. As it is, it's going to produce nothing but pain, confusion, and false hope if turned loose on this task.
The article mentions the early part of the movie Her, where the protagonist's job is writing heartfelt, personal cards on behalf of strangers. That reference was exactly on target: I think most of us thought outsourcing such a thing was a completely bizarre idea, and it is. It's arguably even worse if you're outsourcing it not to a person with emotions but to an AI.
Uhh "subscribing to an AI friend" is technically possible in the form of character.ai sub. Not that I recommend it but in this day your statement is not sarcastic.