this post was submitted on 03 Apr 2026
353 points (96.3% liked)

Technology

[–] fluffykittycat@slrpnk.net 110 points 2 weeks ago (53 children)

It was never designed to protect children

Glad to see it's not even working. Let's keep fighting against these evil laws

[–] expr@piefed.social 36 points 2 weeks ago (4 children)

I mean, social media should be banned for everyone, not just teenagers. It's a great evil in the world today, and in a functional democracy that wasn't braindead, we should ban them outright for the mass harm and destruction they have caused.

That being said, I fully understand that the motivations of countries for these kinds of bans have little to do with the harm of social media and are much more about surveillance.

[–] Link@rentadrunk.org 16 points 2 weeks ago (6 children)

Which type of social media are we referring to here?

Doesn’t Lemmy count as social media?

[–] yardratianSoma@lemmy.ca 8 points 2 weeks ago (4 children)

It's so bonkers how most of the older generations agree that being on the internet can't make you social, yet it became the default way to communicate.

Ban it for everyone? I mean, Lemmy itself is a social network platform, if you want it to be. But I know what you mean: "social media" being the most-used platforms, Google, Facebook, TikTok, etc. And for that, yeah, I do agree with a full ban. We need a cultural reset, where we aren't being fed sensationalist bullshit and pure brainrot as entertainment via an algorithm trained on our insufficient capacity to regulate our own attention.

[–] Dave@lemmy.nz 19 points 2 weeks ago (4 children)

In my view, social media itself is probably not the problem; the problem is the algorithms these platforms use, which are designed to be addictive and manipulative.

I saw an article once arguing that the algorithms should be regulated in a similar way to medicine. Give some base ingredients they can use freely (e.g. sort by newest first), then require any others to run studies to prove they are not harmful.

There would be an expert board that approves or declines the new algorithm in the same way medicines are approved today (the important bit being that they are experts, not politicians making the decision).
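The "base ingredient" in that proposal, a plain reverse-chronological feed, is simple enough to write down. A minimal sketch (the `Post` type and its field names are hypothetical, just for illustration):

```python
from dataclasses import dataclass
from datetime import datetime

# A "base ingredient" feed in the sense above: pure reverse-chronological
# ordering, with no engagement optimisation. Anything fancier would, under
# the proposal, need to be approved before deployment.

@dataclass
class Post:
    author: str
    created_at: datetime
    text: str

def newest_first(posts: list[Post]) -> list[Post]:
    """Return posts sorted newest-first -- a transparent, auditable ordering."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

feed = newest_first([
    Post("a", datetime(2026, 4, 1), "older"),
    Post("b", datetime(2026, 4, 3), "newer"),
])
print([p.text for p in feed])  # newest post comes first
```

The point of the medicine analogy is exactly that an ordering like this is fully inspectable, unlike an engagement-optimised recommender.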

[–] Instigate@aussie.zone 7 points 2 weeks ago (5 children)

This is the correct response. Social media, as a construct, is not evil and does not harm anyone. The commodification and commercialisation of social media by capitalistic companies is what has caused the harm we see today.

All of the harms and evils of social media boil down to a single concept: the algorithm. Algorithmic recommendation is built to keep people on a platform (for capitalistic reasons), and the most enticing, attention-grabbing content is hate content, so these companies have forced hate-inducing content down people's throats to make more money, destroying individuals, families and friendships in the process.

If we regulate the algorithms, we regulate the harm without disempowering anyone. We can, and we should, regulate algorithms on social media to turn it back into what it was 20-odd years ago - a measure to keep in touch with people you know or care about.

[–] gurty@lemmy.world 71 points 2 weeks ago (1 children)

‘…internally the government was aware of a lack of evidence to support the ban before they passed the legislation anyway’

Terrific job, gov.

[–] Australis13@fedia.io 33 points 2 weeks ago

Our government is usually technologically inept.

The first online census (2016) crashed the system because they didn't allow enough capacity. Anyone with half a brain could have told them that most people were going to try to use it during one particular time -- after dinner (especially since the paper census is supposed to count everyone on that particular night). Instead, they decided to rate it for only 1 million form submissions per hour, despite estimating that two-thirds of Australians would fill it out online. At one person per family, that's around 4 million online submissions. Now factor in that the eastern states have most of the population (and are all in the same time zone at that time of year) and, predictably, the site went down after dinner on census night.

https://www.abc.net.au/news/2016-08-09/abs-website-inaccessible-on-census-night/7711652
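The back-of-envelope numbers in that comment can be checked in a few lines (a sketch: the peak-window share and length are my own rough assumptions; the population, online share, and family size come from the comment's reasoning):

```python
# Rough capacity check for the 2016 online census, using the figures
# from the comment above. All values are approximations.
population = 24_000_000        # Australia, circa 2016 (rough)
online_share = 2 / 3           # estimated share filling it out online
people_per_family = 4          # one submission covers the family (rough)

# One form per family for the online two-thirds: ~4 million submissions.
online_forms = population * online_share / people_per_family

# Assume most of those (say 80%) arrive in a ~2-hour after-dinner peak,
# since the census counts everyone on that one night.
peak_share = 0.8
peak_window_hours = 2
peak_rate = online_forms * peak_share / peak_window_hours

rated_capacity = 1_000_000     # forms per hour the site was rated for

print(f"online forms: {online_forms:,.0f}")
print(f"peak rate: {peak_rate:,.0f}/hour vs rated {rated_capacity:,}/hour")
```

Even with generous assumptions, the implied peak load comfortably exceeds the 1 million forms per hour the site was rated for.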

[–] Lexam@lemmy.world 38 points 2 weeks ago (1 children)

I don't know. There's some joy in saying "I told you so" to people who had the hubris to try to stop teenagers from being teenagers.

[–] MagicShel@lemmy.zip 15 points 2 weeks ago (1 children)

We will simply pass laws requiring them to be adults! Easy!

[–] Postmortal_Pop@lemmy.world 7 points 2 weeks ago

Careful, you might give the pedo states an idea.

[–] deathbird@mander.xyz 32 points 2 weeks ago (2 children)

Key point: "Ultimately, the fundamental problem with age-gating is that it fails to address any of the root problems with our current online landscape – that is, the extractive business models and pernicious design features of mainstream tech companies. We all exist in a highly commercialised information ecosystem, rife with algorithmically amplified misinformation, scams, harmful content and AI slop. Children are particularly vulnerable to these issues but the reality is that it impacts everyone, even if you’re blissfully absent from Facebook or Instagram."

[–] imjustmsk@lemmy.ml 7 points 2 weeks ago

They don't wanna solve the root problem; they just want to make the big tech companies happy, as well as the people who are saying shit about social media. Age verification is their stupid answer, and it translates to "we don't give a flying shit about kids".

[–] sbv@sh.itjust.works 32 points 2 weeks ago (8 children)

With a 70% non-compliance rate, that isn't entirely surprising.

Platforms are even less likely to implement the real reforms the author alludes to.

[–] Psythik@lemmy.world 24 points 2 weeks ago* (last edited 2 weeks ago)

Similar thing happened where I live with porn. They recently passed a law requiring ID. Instead of complying, I just started going to different websites. No way am I giving up my identity to a sketchy porn site, no matter what the law says.

[–] BranBucket@lemmy.world 30 points 2 weeks ago (1 children)

What if, instead of trying and failing to kick kids off social media, we focused our attention on the reasons why being online is so often detrimental in the first place?

Pre-fucking-cisely.

[–] SocialMediaRefugee@lemmy.world 8 points 2 weeks ago (1 children)

Then you'd have a massive "but what about the children?!" censorship situation for everyone.

[–] BranBucket@lemmy.world 10 points 2 weeks ago* (last edited 1 week ago)

We already have that, and it has solved absolutely nothing while potentially making online surveillance and privacy issues worse.

The answer isn't age-gating or ID verification, it's changing how the sites themselves operate. Get rid of the idea of "driving engagement", no more stealth ads, and no corpo, media, political party, or lobbyist accounts. Hold influencers and podcasters to the same kind of standards we used to hold journalists to, where they're required to tell you when they're shilling for some kind of shady supplement company or political huckster.

You know, the kind of shit any sane species would do with this sort of tech, but when have we ever been sane?

[–] shortwavesurfer@lemmy.zip 30 points 2 weeks ago

Speak for yourself. I find quite a bit of joy in "I told you so".

[–] Jimbel@lemmy.world 26 points 2 weeks ago (5 children)

The addictive design of platforms, software and algorithms should be addressed, not the user's age.

And the tech companies should be held responsible for designing healthier platforms, etc.

The problem is the design of the tech, not the people using it.

[–] A_Random_Idiot@lemmy.world 9 points 2 weeks ago (6 children)

Why is everyone forgetting the parents in this? They're the ones giving their kids access to this shit, not monitoring or moderating that access, and letting screens do the job of raising their kids instead of doing it themselves.

[–] SocialMediaRefugee@lemmy.world 6 points 2 weeks ago (1 children)

The same parents who scream any time a teacher grades their kids fairly?

[–] coolmojo@lemmy.world 7 points 2 weeks ago

But without the addictive design the users don't spend enough time to see all the ads and tracking required to reach the target growth. Somebody think of the shareholders /s

[–] commander@lemmy.world 23 points 2 weeks ago

They're propaganda laws. Internet censorship laws. The Palestinian genocide started trending on social media and suddenly all the Western countries wanted to start banning/controlling social media. Plus the earlier push by Facebook to ban TikTok, to pull up the ladder on competitors.

[–] Baggie@lemmy.zip 18 points 2 weeks ago

This and the porn thing have been massively invasive in terms of privacy. It's so transparently just building a database of facial data. It doesn't even make an attempt to comprehensively block everything on the internet, or realistically enforce compliance.

[–] wewbull@feddit.uk 17 points 2 weeks ago* (last edited 2 weeks ago)

The fallback argument for the social media ban is that it’s better than nothing. But with results like these, it may be worse than nothing, given it potentially creates new problems. Children will remain online with arguably less supervision and support, new privacy and digital security vulnerabilities seem to have appeared and the worst aspects of social media lay largely unaddressed.

I wish more people understood this. Changing something can mean you've caused harm unintentionally, even if you haven't identified it yet. Too many people seem to have the thought process "We have to do something! This is something. Let's do this." without ever considering the harm they might do.

[–] blind3rdeye@aussie.zone 15 points 2 weeks ago (3 children)

I've talked to heaps of parents and heaps of kids about this. What I think is interesting is that, face-to-face, people seem to be generally supportive of the law. They say that social media is problematic, and that the law helps by discouraging its use. A few different kids have said that it helps them break an addiction. Other kids say they don't care, because it hasn't blocked them. So mostly positive or neutral responses when face-to-face.

But every time I see this mentioned on the internet, it's very negative. There are always heaps of comments saying that it is a failure, and could never work, and that the government is stupid; and there are often other comments saying it is part of a secret plan for the government to track us or whatever. In any case, mostly negative views - with just a sprinkling of fairly neutral views such as "it hasn't been active for very long. Let's wait and see."

I just think that's interesting. I guess my real-world social circles don't totally match my internet social circles.

[–] emmy67@lemmy.world 8 points 2 weeks ago

Kids will often just repeat what they've heard to adults.

But the largest problem with these laws is the way they affect minority groups. If followed, the law would disproportionately affect disabled and queer teens, who may suddenly be unable to access help and community.

I suspect there's some selection bias in the kids you're speaking to.

[–] melsaskca@lemmy.ca 10 points 2 weeks ago (6 children)

Censorship is never the answer. Teaching values, and the ethics and morals that come with them, is closer to the answer. A world where you burn shit down just to get a job as a firefighter makes that path harder to follow.

[–] UnderpantsWeevil@lemmy.world 6 points 2 weeks ago

Censorship is never the answer.

https://en.wikipedia.org/wiki/Paradox_of_tolerance

Formally banning certain forms of vulgar and bigoted expression establishes a code of conduct for the community, even if the bans aren't strictly enforced.

Teaching values and the corresponding ethics and morals that come with it is closer to the answer.

Morality is as much about the proactive and affirmative pursuit of justice as about internalized codes of conduct.

If there is no social consequence for immoral behavior, there is no reason to believe the act is immoral.

[–] Eyekaytee@aussie.zone 10 points 2 weeks ago

7 in 10? so 3 are off of it? good news 🥳

please expand to over 65 year olds as well

[–] FlashMobOfOne@lemmy.world 7 points 2 weeks ago (4 children)

A 30% reduction in kids being exposed to these harmful platforms is a good thing, and I'm glad to see it.

Also, all laws are imperfect, and expecting 100% efficacy is moronic.

[–] fodor@lemmy.zip 7 points 1 week ago (1 children)

Right, but the politicians didn't sell the law at 30% efficiency. They sold it at something like 95% efficiency. So they lied and they haven't solved anything.

Maybe they could have used all of that money to run campaigns to help convince parents to properly supervise their children. Maybe that would have done more than this 30% figure.

[–] FlyingCircus@lemmy.world 6 points 1 week ago (1 children)

Or maybe, instead of creating privacy-infringing laws or blaming parents, we actually dismantle the tech companies who created them and imprison their leaders. We all know corporate social media is cancer, that's why we're on Lemmy. So let's fucking do something about the cancer instead of targeting the victims or, worse, exploiting the situation to expand the surveillance state.

[–] SocialMediaRefugee@lemmy.world 6 points 2 weeks ago (3 children)

Seriously. Murders still happen, so let's legalize murder.

[–] daannii@lemmy.world 5 points 2 weeks ago* (last edited 2 weeks ago) (7 children)

Right, it's going to take longer than a few months to enforce this properly, undo the damage, and protect new generations from its negative effects.

At least it's a start.
