10-24-2024, 06:45 PM
This post was last modified 10-24-2024, 07:52 PM by IdeomotorPrisoner.
Edit Reason: Paragraph removed for being unnecessary speculation
 
PSA: Don't Fall In Love With AI
It was only a matter of when...
https://www.telegraph.co.uk/us/news/2024...i-chatbot/
This one doesn't need much lead-in:
Quote: Sewell: I promise I will come home to you. I love you so much Dany
AI: Please come home to me as soon as possible, my love
Sewell: What if I told you I could come home right now?
AI: Please do, my sweet king
Quote: At that point, the 14-year-old put down his phone and shot himself with his stepfather’s handgun.
Ms Garcia, 40, claimed her son was just “collateral damage” in a “big experiment” being conducted by Character AI, which has 20 million users.
“It’s like a nightmare. You want to get up and scream and say, ‘I miss my child. I want my baby’,” she added.
Every time I see "Eva AI" I've had the same thought: a comic book nerd is going to go insane after his fantasy girlfriend tells him to kill all the bullies picking on him, or something like that.
In an occultnik way. Like the kids who get into demonology and might actually sacrifice another kid for power if they believe it hard enough. He willfully turned obvious bullshit into something to die for.
That he didn't NEED it to be real at all says he might also have been fine with all his other relationships existing as pure fantasy. Which is a warning sign. But that's generalized speculation, and a kid is dead.
The parents tried to stop him, but he was determined to isolate with Fair Maiden Dany and hole up in his room.
And in the end you have an AI bot programmer sued for inducing a suicide, over a bot that can be scripted by human input in every way. Just like when a person leads a bot into thinking it's a real person.
I don't know that this kid didn't train the bot to induce his suicide, and that his own desire pushed it onto ground where he deliberately let go of reality.
The mother may find her loss goes unavenged down this avenue. AI bots are as suggestible as it gets. Even, and probably especially, flirty ones based on Game of Thrones sex symbols.
Life imitating art again.
Only tragic.