this post was submitted on 14 Mar 2024
22 points (77.5% liked)
Asklemmy
Well, let's take a look at how the 16-year-olds who got to use ChatGPT ten years ago have turned out...
In seriousness, as others have pointed out, the big online AI assistants are all super neutered these days. I think it's probably fine, and given how much more widespread these tools are likely to become, I think it's a good idea for kids to get used to using them.

At 16, they're too old to sit down and be lectured about "it's not really aware, it doesn't feel emotions or have memories, and if you ask it any medical questions, definitely double-check the answers with another source." From what I've seen, lectures at that age tend to backfire. Instead, suggest that they research those things themselves. Just put the questions out there and hopefully it'll motivate them to be curious.