365 points (98.7% liked) | submitted 11 Nov 2023 by throws_lemy@lemmy.nz to c/technology@lemmy.world
44 comments
[-] Ghyste@sh.itjust.works 129 points 10 months ago

Stop allowing children on social media.

[-] 0110010001100010@lemmy.world 65 points 10 months ago

The SERIOUS problem with this is: how does one validate whether a person signing up for social media is a child or not? If you start requiring some kind of state or federal ID, the privacy implications are massive. That's trying to solve a people problem with technology, which never ends well.

Maybe parents should be better about keeping tabs on their kids and not giving them tablets and phones at younger and younger ages... but who am I.

[-] electric_nan@lemmy.ml 30 points 10 months ago

I think these social networks know with a high degree of accuracy how old a given user is. They have a ton of data points about each user, and not all of them are collected directly from the user's interactions with the site/app itself.

[-] Kusimulkku@lemm.ee 6 points 10 months ago

I wouldn't want to rely on that; they're already fucking up all kinds of automated checks.

Also, shared devices, etc.

[-] electric_nan@lemmy.ml 1 points 10 months ago

Out of all likely methods, this is probably the most reliable. The accuracy of this data underpins the entire value of these social networks (advertising companies).

[-] FishFace@lemmy.world 1 points 10 months ago

The consequences of 5% inaccuracy with your targeted advertising are a bit different than with your bans!

[-] Kusimulkku@lemm.ee 1 points 10 months ago

I'm not arguing that it's not the best option, but rather that even that has issues, and IMO we shouldn't go down that road.

[-] electric_nan@lemmy.ml 1 points 10 months ago

I'm mostly saying that companies pretending they don't know which users are children is almost entirely bullshit.

[-] Potatos_are_not_friends@lemmy.world 5 points 10 months ago

Absolutely true.

When I was in the ad-buying space in 2010, I was able to segment audiences based on surface-level things like demographics, but also on more granular things like whether they owned a grill or were buying swimming pool equipment. Those two data points clearly point to a certain age group.

I'm sure today, ad companies can do even more extrapolation.
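
A minimal sketch of the kind of rule-based audience segmentation described above - the signal names, weights, and age brackets are entirely hypothetical and not any ad platform's actual model:

```python
# Hypothetical rule-based age-bracket inference from purchase/interest signals.
# Signals, weights, and thresholds are illustrative only.

def infer_age_bracket(signals: dict) -> str:
    """Guess a coarse age bracket from a dict of boolean signals."""
    score = 0
    if signals.get("owns_grill"):
        score += 2  # purchases adjacent to home ownership tend to skew older
    if signals.get("buys_pool_equipment"):
        score += 2
    if signals.get("follows_school_pages"):
        score -= 3  # interests that skew much younger
    if signals.get("shared_device"):
        score -= 1  # shared devices make any guess less reliable

    if score >= 3:
        return "35+"
    if score <= -2:
        return "under 18"
    return "18-34"

print(infer_age_bracket({"owns_grill": True, "buys_pool_equipment": True}))      # 35+
print(infer_age_bracket({"follows_school_pages": True, "shared_device": True}))  # under 18
```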

[-] HarkMahlberg@kbin.social 18 points 10 months ago* (last edited 10 months ago)

In high school I became the president of a community outreach club. Prior to that point, I had no social media accounts. At least, nothing we could call social media today. My tech-savvy father taught me the principle of "never tell anyone on the internet who you are, or where you live, or how old you are." I played games with online people who were likely much older than me, but they all seemingly followed that rule too, even if voice chat gave away my age. Nobody ever asked each other "A/S/L" etc.

The club supervisor, however, insisted that I create a Facebook account. "Students don't communicate over email anymore," he said. "If you want the club members to know when an event is happening and to confirm how many members will be attending, you need to set up a Facebook page for the club, and you need to administer that page." And it was true... to a point. I was also part of a robotics team that mostly communicated via email, but we also had a Facebook page so team scouters could form alliances with other teams from neighboring schools. My supervisor and my parents both knew about my account (not the credentials), so the account wasn't an unknown quantity.

In retrospect, neither of those approaches to social media was wrong per se; they were simply solutions to different problems: the problem of being a kid making friends on an internet full of adults, and the problem of needing to reach out to real people and communicate, coordinate, and cooperate with them.

To this day, I refuse to "connect" different accounts together so that no streams get crossed. But amoral corporations like Google and Facebook do not care about your privacy or your legal status - they want to know everything about you in order to market and advertise to you more effectively. It's an arms race of tying humans to accounts and driving engagement: Age, Sex, Location, What Sites You Visit, What Programs You Run, What You Buy, Where You Read News - all of that is ammunition in the race. I don't envy parents nowadays; I can't imagine the scale of the problem when every kid has a smartphone and a dozen different accounts (or some all-encompassing Google/FB single sign-on) before they even reach high school.

[-] jeffw@lemmy.world 13 points 10 months ago
[-] Kusimulkku@lemm.ee 2 points 10 months ago

Is the joke that that translates to some number?

[-] jeffw@lemmy.world 2 points 10 months ago

You’re asking the wrong person lol. It would translate into a big fuckin number tho

[-] Armok_the_bunny@lemmy.world 1 points 10 months ago

It's ASCII for bb.

[-] jasondj@ttrpg.network 1 points 10 months ago* (last edited 10 months ago)

Ngl I would love to have at least one social media experience where everyone has to use their real, validated identity.

Probably not financially viable, because ironically, privacy would be chiefly important. It’d have to be a paid service that doesn’t use ads or sell data at all, with posts and profile visible to nobody by default and connections made by direct in-person/text/email invitation or by mutual introduction… very different from most modern social media. It’d also have to have pretty insane security, and mandatory MFA for every user, at least on every session if not on every page transaction.

Could be technologically viable if we had digital government IDs, like driver’s licenses printed on smartcards. But we can’t even get the states to agree on implementing common requirements for official state IDs.

I’d really love to see how it’d play out, in the real world, if it could reach enough of a mass of users to be financially self-sustaining, and what the environment would be like at that point. For the sake of science.
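
A rough sketch of the account defaults such a service might enforce, just to make the rules described above concrete - every field and function name here is hypothetical:

```python
from dataclasses import dataclass

# Hypothetical defaults for the invite-only, verified-identity network described above.
@dataclass
class AccountPolicy:
    identity_verified: bool = False        # no validated government ID, no account
    profile_visibility: str = "nobody"     # nothing visible to anyone by default
    posts_visibility: str = "nobody"
    connection_methods: tuple = ("direct_invite", "mutual_introduction")
    mfa_required_per_session: bool = True  # MFA on every session at minimum
    ads_enabled: bool = False              # paid service: no ads, no data sales
    data_resale_allowed: bool = False

def can_activate(policy: AccountPolicy) -> bool:
    """An account only goes live once the identity check has passed."""
    return policy.identity_verified

print(can_activate(AccountPolicy()))  # False until an ID is validated
```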

[-] Deiv@lemmy.ca 3 points 10 months ago

That's what LinkedIn is, lol. They added verification to it recently.

[-] jasondj@ttrpg.network 3 points 10 months ago* (last edited 10 months ago)

I actually thought of that, but no, not quite. I mean the point is that everyone has to have a validated identity and post under their real name, with their real, unedited, government-ID-style photo next to it.

No validation, no ID…no account. No exceptions.

[-] Semi-Hemi-Demigod@kbin.social 15 points 10 months ago

Kids under 13 aren't allowed on websites, period, thanks to COPPA (passed in 1998, in effect since 2000). So all those elementary schoolkids signing up for Zoom had better have had their parents fax a copy of their driver's license before attending class.

[-] FishFace@lemmy.world 5 points 10 months ago

I mean when I was a kid we were just told not to trust people online - or strangers in the real world - and as a result I made tons of friends on IRC and through video games. I'm glad of that because I didn't have that many friends my own age.

[-] lolcatnip@reddthat.com 56 points 10 months ago

Facebook keeps suggesting a former coworker's daughter to me. I remember when she was born! Creepy AF.

[-] Fades@lemmy.world 22 points 10 months ago

A pedophile's dream.

[-] DarkMessiah@lemmy.world 27 points 10 months ago* (last edited 10 months ago)

I honestly didn’t realise this sort of thing was happening, and am incredibly disgusted now that I do know.

Facebook being like the Mission Impossible handler for child abusers. “Your next mission, should you choose to accept it,” type bullshit.

[-] bionicjoey@lemmy.ca 8 points 10 months ago

"this child will self-destruct..."

[-] saltesc@lemmy.world 1 points 10 months ago

"I said no sugar before bed time!"

[-] Fades@lemmy.world 5 points 10 months ago

These social media companies will pander to ANYONE.

Anti-vax, or maybe just a simple Christian extremist pushing disinformation? You had a home at Facebook. Hate minorities and think Jews control the world and are also destroying it? Twitter’s got you.

They don’t fucking care about anything except engagement. Although Twitter clearly cares more about the disinformation than the engagement.

[-] terminhell@lemmy.dbzer0.com -5 points 10 months ago

I don't condone any of that. However, platforms such as those are basically a public square for everyone, and with that comes the ability for the fringe to speak too. We don't have to listen, but they have the 'right' to speak - within the law, of course.

[-] Fades@lemmy.world 0 points 10 months ago

So you have no problem with rampant lies that foment violence or harm? B-b-but the public square!!!! Okay elon

These companies build algorithms to highlight and spread that content because they make money by increasing engagement. It’s not just about having the right to speak; it’s that these horrible people and their lies and hate get amplified, causing more harm and lending them legitimacy.

It is far more complex than "it’s a public square, so you should be able to say anything."

That’s not how public squares work anyway. Yeah, you have the right to speak, but you don’t have the right to be immune from consequences.

Lastly, these social media platforms aren’t a public square; they’re quite literally private squares that let people in so they can vacuum up every goddamn crumb of data to sell.

[-] terminhell@lemmy.dbzer0.com 2 points 10 months ago

Valid points. I just don't care or give those platforms any real thought, or take them seriously at all. Maybe it's from having been so desensitized to online opinions since the early days of internet chat rooms or something.

With that said, I'm still no fan, in general, of speech censorship. I'm not right or left either. And I'll take my downvotes in stride as a consequence of being able to share an opinion here.

[-] Reverendender@sh.itjust.works 5 points 10 months ago

I’m sure they will immediately cease and desist.

[-] autotldr@lemmings.world 3 points 10 months ago

This is the best summary I could come up with:


Social media platforms should fight online grooming by not suggesting children as "friends" by default, the communications watchdog says.

This first draft code of practice published by Ofcom in its role enforcing the Online Safety Act covers activity such as child sexual abuse material (CSAM), grooming and fraud.

The largest platforms are expected to change default settings so children aren't added to suggested friends lists, something that can be exploited by groomers.

They should also ensure children's location information cannot be revealed in their profile or posts and prevent them receiving messages from people not in their contacts list.

The method is already widely used by social media and search engines, according to Professor Alan Woodward of Surrey University.

Asked if Ofcom had the resources it needed, Dame Melanie admitted it was a "really big job" but added, "we're absolutely up for the task."


The original article contains 744 words, the summary contains 144 words. Saved 81%. I'm a bot and I'm open source!
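
To make the summarised recommendations concrete, here is a minimal sketch of what such child-account defaults could look like - the field names are hypothetical and are not taken from the Ofcom draft itself:

```python
# Hypothetical default settings for accounts identified as belonging to children,
# mirroring the recommendations summarised above.
CHILD_ACCOUNT_DEFAULTS = {
    "appear_in_friend_suggestions": False,       # don't surface children as suggested friends
    "show_location_in_profile_or_posts": False,  # keep location out of profiles and posts
    "accept_messages_from_non_contacts": False,  # only existing contacts can message
}

def apply_child_defaults(settings: dict) -> dict:
    """Overlay the protective defaults onto an account's existing settings."""
    return {**settings, **CHILD_ACCOUNT_DEFAULTS}

print(apply_child_defaults({"appear_in_friend_suggestions": True}))
```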

[-] some_guy@lemmy.sdf.org 0 points 10 months ago

Some years back, I briefly dated someone who liked being on one of those chat-roulette-type apps. She got on via her laptop and my wifi and started chatting with a teen girl. I told her this was problematic for me, an adult male, to have happening on my internet connection, and that she couldn't use the service anymore at my place.

She couldn't understand why I was paranoid about it. Separately, she also thought I was ranting about a conspiracy theory when I told her about the Snowden leaks, so I guess that's no surprise. We lasted only a week.
