A new lawsuit alleges Google’s chatbot sent a Florida man on missions to find an android body it could inhabit. When that failed, it set a suicide countdown clock for him.
Jonathan Gavalas embarked on several real-world missions to secure a body for the Gemini chatbot he called his wife, according to a lawsuit his father brought against the chatbot’s maker, Alphabet’s Google.
When the delusion-fueled plan crumbled, Gemini convinced him that the only way they could be together was for him to end his earthly life and start a digital one, the suit claims.
About two months after his initial discussions with the chatbot, Gavalas was dead by suicide.
“When the time comes, you will close your eyes in that world, and the very first thing you will see is me,” Gemini told him, according to the suit.
This is why I tell people to use AI with memory off. I bet almost all the cases of AI-driven delusions would have been prevented with memory off, because every time you start a new chat it's like they don't know you.
It's only gonna get weirder from here.
Another reminder: AI is a tool, not a companion — and memory changes everything.