this post was submitted on 06 Jan 2026
7 points (88.9% liked)


For those who grew up in the early 2000s, MSN Messenger's SmarterChild was a popular chatbot: you could ask how its day was, tell it a joke or pretend it was your friend. SmarterChild was scripted and often predictable, offering a playful way for young people to simulate conversation online.

Chatbots have evolved significantly since then. Artificial-intelligence-driven companion chatbots, designed to simulate human conversation, became the leading use case of AI technologies in 2025. Young people increasingly turn to platforms such as Character.AI and Replika for companionship. While general-purpose chatbots such as ChatGPT are often used for everyday tasks or information searches, companion chatbots are specifically designed to mimic personal relationships, simulating affection and adapting to the user's personality. The sophistication of companion chatbots raises questions about how they shape the emotional and social development of youth.

Research from the Harvard Graduate School of Education points to early pressures on boys to conform to gendered norms of emotional and physical toughness. These pressures can limit boys' development of empathy and emotional literacy, contributing to isolation. Over time, isolation and loneliness may lead to depression, violence and even radicalization. For young boys navigating these pressures, companion chatbots offer a space for self-understanding and expression: they can rehearse difficult conversations or scenarios, articulate their emotions or seek reassurance. In an already strained mental health system, AI companionship offers low-barrier support. But while these benefits matter, the risks persist.

1 comment
[–] pigup@lemmy.world 7 points 2 months ago* (last edited 2 months ago)

Just in case anyone's mystified by all the investment in AI and why no one seems worried about turning a profit: the owners have the real data showing just how powerful and addictive this very dangerous tool is. They know they have the future of humanity by the figurative balls. That power of influence is the holy grail of their insatiable greed. They're going to price people out of personal computers by making hardware incredibly expensive, even as people grow dependent on AI to do what they want to do. They're going to charge you more money, and they're going to control your mind. As has always been the goal.