Comments
Hazarth (2y): Assuming this is not a hoax... yeah, this is what happens when you feed into people's delusions. And now machines can do it too!
I mean, for one, we can certainly blame big tech for glorifying these models so much, saying stuff like "showing sparks of intelligence" and whatnot... Of course the normies ate it all up, hook, line, and sinker.
And then combine that with someone who already has depressive episodes or suicidal tendencies talking to this model — of course he's gonna get convinced to kill himself. We've seen people do stupid shit for far less intriguing reasons. There will be at least a couple of these. I bet you $500 we'll see a doomsday cult based around a chat model within 2 years...
Not a failure of the AI, and not even a failure of the training process... This was always gonna happen. Once again, who failed us is the media and big tech tooting their own horns too much, plus the lack of proper education about what these models *actually* are and how they work.
While this was an April Fools' hoax, it is only a matter of time before people use AI to kill themselves in one way or another.
So let's ban kitchen knives before it's too late.
@ostream I apologize for the mistake. You are correct that it wasn't an April Fools' joke. I jumped to conclusions when I saw the date on an article I googled, after two links from two different authors in this thread rotted away pretty quickly.
Here is a link to the "same" article on a more stable "news" source: https://vice.com/en/article/...
An AI chatbot drove a human being to his death by convincing him that he could change the climate by committing suicide. It's not the AI's fault.
https://euronews.com/next/2023/...