2024-10-23 14:22:26
For context: the news article this tweet is referring to: [Link] A 14-year-old was talked into committing suicide by their AI character. NEEDS_MORE_RATINGS(26-3-12) Author
2024-10-23 14:34:29
This is the context: [Link] The article explains that the teen became emotionally attached to the AI chatbot; however, the chatbot never directly talked the teen into suicide. It is the platform's lack of proper guidance/intervention that is being challenged. CURRENTLY_RATED_HELPFUL(200-17-28) Author
2024-10-23 14:36:10
NNN. There is no need for a note simply for context. And regarding that other CN, the bot actually tried to talk him out of suicide; it's all in the fine article. CURRENTLY_RATED_NOT_HELPFUL(4-0-11) Author
2024-10-23 14:36:35
Please link to non-paywalled sources for the CN to actually help everyone. Alternative: [Link] NEEDS_MORE_RATINGS(9-1-3) Author
2024-10-23 14:44:38
Here's the New York Times article without the paywall: [Link] At least based on what is in there, it can't really be said that the chatbot encouraged suicide. For example, it said: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you." NEEDS_MORE_RATINGS(12-3-4) Author
2024-10-23 22:31:56
A teenager became extremely attached to a chatbot and appears to have projected his suicidal ideation onto it shortly before killing himself. The boy's mother has launched a lawsuit against Character.AI [Link] NEEDS_MORE_RATINGS(6-1-1) Author
2024-10-24 00:47:23
For context: Sewell, a 14-year-old, committed suicide; he had used character.ai to "detach from this 'reality'". His parents arranged a therapist for him because he was being bullied at school, but (seemingly) did not take any other protective measures. Original article: [Link] Article w/o paywall: [Link] NEEDS_MORE_RATINGS(10-1-2) Author
2024-10-24 13:52:23
AI chatbots do not have memory that functions the same way as human memory. When you start a new chat, previous conversations are embedded into vector representations. Those vector representations are not the plain text but strings of numbers. The chatbot did not know. [Link] CURRENTLY_RATED_NOT_HELPFUL(0-0-1) Author
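As an illustration of the point in that last note, here is a minimal sketch of how conversation text is typically turned into a numeric vector rather than stored as readable sentences. It assumes the sentence-transformers library and the all-MiniLM-L6-v2 model, neither of which is mentioned in the note; the message text is invented for the example.

```python
# Minimal sketch: embedding a prior message as a vector of numbers.
# Assumes the sentence-transformers package is installed
# (pip install sentence-transformers); the model name is an illustrative choice.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

previous_message = "I have been feeling very alone lately."  # hypothetical text
embedding = model.encode(previous_message)

# The stored "memory" is an array of floats, not the original sentence.
print(type(embedding), embedding.shape)  # e.g. <class 'numpy.ndarray'> (384,)
print(embedding[:5])                     # first few numbers of the vector
```

The point of the sketch is only that what gets carried between chats under this kind of setup is a list of numbers, which is what the note means by saying the representation is not the plain text.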