A student in America asked an artificial intelligence program to help with her homework. In response, the app told her to "Please die." The eerie incident happened when 29-year-old Sumedha Reddy of Michigan sought help from Google’s Gemini chatbot, a large language model (LLM), the New York Post reported.
The program verbally abused her, calling her a “stain on the universe.” Reddy told CBS News that she got scared and started panicking. “I wanted to throw all of my devices out the window. I hadn’t felt panic like that in a long time to be honest,” she said.
Idk what you mean by “double response”. The user typed a statement, not a question, and the AI responded with its weird answer.
I think the lack of a question or specific request in the user text led to the weird response.
You’re right, I misread the text log and thought Gemini responded twice in a row at the end, but it looks like it didn’t. Very messed up stuff… There’s still missing user input tho, and a lot of it. And I’d love to see exactly what was said as a prompt.