this post was submitted on 13 Mar 2026
99 points (96.3% liked)
It's a social media problem. It's going to be hard to simultaneously offer pseudonymity, hand out low-cost accounts relatively freely, and counter bots spamming the system to manipulate it. The model worked well in an era before convincingly human-like bots were cheap to produce.
It might be possible to build webs of trust around pseudonyms. You can still make a new pseudonym, but its influence and visibility get tied to, for example, who the users or curators you trust themselves trust, so a fresh pseudonym carries less weight until it acquires reputation. I don't think a single global trust "score" will work, because you can always build bot webs of trust.
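A minimal sketch of the per-viewer (rather than global) trust idea: reputation is computed from each viewer's own trust edges, decaying per hop, so a bot ring trusting itself gains nothing unless some path from the viewer reaches it. All names and the decay parameter here are made up for illustration.

```python
from collections import deque

def trust_weight(trust_edges, viewer, target, decay=0.5, max_depth=3):
    """Weight the `viewer` assigns to `target`'s posts.

    Direct trust counts fully; trust inherited through intermediaries
    decays with each hop. No path within `max_depth` means zero weight,
    so a brand-new pseudonym starts effectively invisible.
    """
    best = {viewer: 1.0}  # best weight found so far per account
    queue = deque([(viewer, 1.0, 0)])
    while queue:
        node, weight, depth = queue.popleft()
        if depth == max_depth:
            continue
        for neighbor in trust_edges.get(node, ()):
            w = weight * decay
            if w > best.get(neighbor, 0.0):
                best[neighbor] = w
                queue.append((neighbor, w, depth + 1))
    return best.get(target, 0.0)

# Hypothetical graph: alice trusts bob, bob trusts carol.
edges = {"alice": ["bob"], "bob": ["carol"]}
trust_weight(edges, "alice", "carol")    # 0.25 — two hops, decayed twice
trust_weight(edges, "alice", "mallory")  # 0.0 — no trust path at all
```

Because the score is relative to each viewer, a cluster of bots vouching for each other only fools the accounts that already trust someone inside the cluster.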
Unfortunately, the tools for unmasking pseudonyms are also getting better, and one of the reasonable counters to unmasking is discarding pseudonyms occasionally or spreading activity across more of them, which doesn't play well with relying more heavily on reputation.
I'm beginning to think that, as annoying for users and as difficult for building a userbase as it may be, the answer might ultimately have to be for future social sites to charge people for use in some way, whether to create accounts, as a subscription, or just for the ability to post/comment/vote. If keeping bots out is no longer feasible, and there's financial gain in using them, then they're going to get used. So at that point, to deter them, it has to be more expensive to run a bot than that bot can be expected to bring in through its contribution to an advertising or manipulation campaign. On the bright side, I guess this might drive a shift away from advertising everywhere. Either you charge people and therefore don't need ads, or you don't, and most of your ads end up being "seen" by bots, which advertisers probably don't want to spend money to reach anyway.
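The break-even condition described above can be written out as a tiny cost model. The function and all the example numbers are hypothetical, just to show how a fee prices a bot out:

```python
def bot_is_profitable(signup_fee, monthly_fee, bot_revenue_per_month, months_before_ban):
    """True if a bot operator expects to net money from one account.

    The account costs a one-time signup fee plus a recurring fee for as
    long as it survives; it earns some assumed revenue per month from
    spam/manipulation work until it gets banned.
    """
    cost = signup_fee + monthly_fee * months_before_ban
    revenue = bot_revenue_per_month * months_before_ban
    return revenue > cost

# Free accounts: any positive revenue makes the bot worth running.
bot_is_profitable(0.00, 0.00, 1.00, 3)   # True
# A $5 signup fee plus $2/month: $1/month of spam revenue no longer covers it.
bot_is_profitable(5.00, 2.00, 1.00, 3)   # False
```

The useful knob is that the platform controls the fees, while detection speed controls `months_before_ban`; either lever can push the inequality the wrong way for the bot operator.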
I had an idea pop in my head and I don't know if it's feasible or not, but maybe the next nascent social media network can try it out, who knows.
Private trackers for torrenting are notoriously hard to get into: the only ways in are joining the community early, limited registration windows, or some sort of lottery system. Most people get in when a friend spends one of their limited invitations, which members don't do lightly, because inviting a leecher harms your reputation, and both you and the person you invited can get banned even if your own ratio is still positive.
So what if we implemented a similar kind of system? Bots can't flood a platform whose registration is closed, but regular people can still get invitations from friends and family. If you invite a bot, both the bot account and the account that invited it get terminated, taking out two bad actors for the price of one. Heck, if you really wanted to go scorched earth, every account registered via an invitation from the person who initially invited the bot could also get terminated. Know who you're inviting and you won't have any problems, but use your ability to invite people recklessly, and your entire social network gets kicked off the platform.
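The invite-and-cascade-ban scheme amounts to keeping the invitation tree and walking it when a bot is caught. A rough sketch of that bookkeeping, with made-up account names and no claim this is how any real tracker implements it:

```python
class InviteTree:
    """Tracks who invited whom, and computes cascade bans."""

    def __init__(self):
        self.inviter = {}   # account -> the account that invited it
        self.invitees = {}  # account -> accounts it has invited

    def register(self, new_account, invited_by):
        self.inviter[new_account] = invited_by
        self.invitees.setdefault(invited_by, []).append(new_account)

    def ban_bot(self, bot, scorched_earth=False):
        """Return the set of accounts to terminate for this bot.

        Always includes the bot and its inviter; with scorched_earth,
        also everyone the inviter ever brought in, recursively.
        """
        banned = {bot}
        inviter = self.inviter.get(bot)
        if inviter is not None:
            banned.add(inviter)
            if scorched_earth:
                stack = list(self.invitees.get(inviter, []))
                while stack:
                    acct = stack.pop()
                    banned.add(acct)
                    stack.extend(self.invitees.get(acct, []))
        return banned

tree = InviteTree()
tree.register("bob", invited_by="alice")
tree.register("bot1", invited_by="bob")
tree.register("carol", invited_by="bob")
tree.register("dave", invited_by="carol")

tree.ban_bot("bot1")                       # {'bot1', 'bob'}
tree.ban_bot("bot1", scorched_earth=True)  # bob's whole subtree goes too
```

The two-for-one version is cheap to enforce; the scorched-earth version is just a depth-first walk of the inviter's subtree, which is what makes the deterrent scale with how carelessly someone hands out invitations.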
This would probably never get implemented in a serious social media platform because those spaces rely on explosive growth to compete with other more established networks, and limiting the number of users is counterproductive - I think at some point investors would just start telling management to open the floodgates and let the bots in already.