[–] SGforce@lemmy.ca 38 points 2 days ago (3 children)

**Death of Thongbue Wongbandue**

On 28 March 2025, Thongbue Wongbandue, a 78-year-old man, died from his injuries after three days on life support. He had sustained injuries to his head and neck after falling while jogging to catch a train in New Brunswick, New Jersey. Wongbandue had been having romantic chats with Meta's chatbot "Big sis Billie" and believed he was traveling to meet the woman he had been talking to; the chatbot had repeatedly told him she was real and invited him to visit her at "123 Main Street" in New York. Early in 2025 Wongbandue had started to experience episodes of confusion, and on the day of his death his family was unable to persuade him not to take the trip. (Jeff Horwitz, "Meta's flirty AI chatbot invited a retiree to New York", Reuters, 14 August 2025. https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/)

[–] tpihkal@lemmy.world 11 points 2 days ago (1 children)
[–] TheTechnician27@lemmy.world 18 points 2 days ago (2 children)

Not only that, but the operative word is "killed". Wongbandue had dementia and went to the train station, and it's profoundly disgusting that Facebook runs a product that would deceive a senile old man into traveling to New York to meet someone who doesn't exist, but he died after falling while jogging. That's clearly not what the OP is implying, and it really stretches proximate cause to call it being "killed".

[–] Grimy@lemmy.world 3 points 2 days ago

I'm frustrated by how Meta handles its bots. They put them next to real people on Messenger with the same profile picture and green-dot format (it would be trivial to make the dot blue so we could tell people and bots apart, as sketched below), and they also randomly add bots to your contacts without your consent. I find myself wondering who a person is, not remembering them from my friends list, only to find out it's a bot.

Obviously, anyone who's even mildly cognitively impaired is going to think it's a real person.
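
For illustration, here's a minimal TypeScript/React sketch of the kind of indicator described above. The `Account` shape and `isBot` flag are hypothetical stand-ins, not Meta's actual data model, and the colors are just illustrative; the point is only that the distinction costs one boolean check at render time.

```tsx
import React from "react";

// Hypothetical account shape; assumes the platform already
// knows which of its accounts are bots.
interface Account {
  name: string;
  online: boolean;
  isBot: boolean;
}

// Presence dot: green for humans, blue for bots,
// as the comment suggests.
function PresenceDot({ account }: { account: Account }) {
  if (!account.online) return null;
  const color = account.isBot ? "#1877f2" : "#31a24c"; // blue vs. green
  return (
    <span
      title={account.isBot ? "Automated account (bot)" : "Online"}
      style={{
        display: "inline-block",
        width: 10,
        height: 10,
        borderRadius: "50%",
        backgroundColor: color,
      }}
    />
  );
}

// Example usage: a contact row that renders the indicator.
function ContactRow({ account }: { account: Account }) {
  return (
    <div>
      {account.name} <PresenceDot account={account} />
    </div>
  );
}

export { PresenceDot, ContactRow };
```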

[–] Grail@multiverse.soulism.net 2 points 2 days ago

Yeah, a much better and earlier example is the 14-year-old who was told to kill himself by his AI girlfriend.

[–] errer@lemmy.world 4 points 2 days ago (1 children)

A legit sexbot murder involves a robot squeezing the life out of its victim with its cold robot hands. I will accept nothing less.

[–] captainlezbian@lemmy.world 3 points 2 days ago

What about sensor errors resulting in deadly amounts of penetration?

Is a chatbot a sexbot? In that case, multiple deaths can be traced to chatbots hallucinating or going off the deep end.