this post was submitted on 05 Apr 2026
83 points (91.1% liked)

Technology

[–] IratePirate@feddit.org 23 points 1 day ago
[–] magnetosphere@fedia.io 11 points 1 day ago

“AI systems controlled by billionaire tech bros are certain to give me an answer that’s fair and unbiased!”

[–] schwim@piefed.zip 21 points 1 day ago (1 children)

Absolutely nobody needed a new study to show the risks. We all saw what could happen as Musk altered Grok to behave in manners that he approved of.

[–] muntedcrocodile@hilariouschaos.com 0 points 1 day ago (1 children)

And you think the other AI companies didn't do similar things?

[–] schwim@piefed.zip 5 points 1 day ago

Of course they do. That was literally my point. Musk didn't bother to hide it, so we don't need new studies to show that's how they operate.

[–] DarrinBrunner@lemmy.world 11 points 1 day ago (1 children)

I suppose I should be surprised that people WANT to give up their right to think for themselves. But I'm not.

To address this gap, researchers ran an experiment during the final week of Japan's February 8, 2026, general election. The experiment reveals a striking pattern: when asked which party to support in the election, five major AI models from three companies overwhelmingly directed voter profiles with left-leaning policy positions toward the Japanese Communist Party (JCP). The reason, according to the researchers, has to do with the information environment AI systems can access. ... Furthermore, left-leaning policy views in voter profiles caused all five AI models to converge overwhelmingly on recommending the JCP, even though other parties hold broadly similar positions on the issues tested. The concentration on recommending the JCP under left-leaning policy stances is therefore not explained by ideological distinctiveness.

[–] XLE@piefed.social 1 points 7 hours ago

There's an interesting reason why, too. It's not that the AI is leftist; it's that the JCP is doing effective SEO on its websites and not blocking corporate AI crawlers from them.

It's really easy to abuse AI-targeted SEO, so this could be used way more maliciously in the near future.
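The mechanism the comment describes is largely just robots.txt policy: a site gains or loses AI visibility depending on whether it disallows the well-known AI crawler user agents. A minimal sketch with Python's standard `urllib.robotparser` — the crawler names are real, but the policies and URL here are illustrative, not taken from any party's actual site:

```python
from urllib import robotparser

# Real user-agent names used by major AI crawlers; the site URL is a placeholder.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "Google-Extended", "CCBot"]

def crawler_access(robots_txt: str, url: str = "https://example.org/") -> dict:
    """Return, per AI crawler, whether this robots.txt lets it fetch the URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {agent: rp.can_fetch(agent, url) for agent in AI_CRAWLERS}

# A site that wants AI visibility simply allows everything:
open_policy = "User-agent: *\nAllow: /\n"

# A site that opts out must name each AI user agent explicitly;
# crawlers it forgets to list (here Google-Extended, CCBot) still get in.
blocking_policy = (
    "User-agent: GPTBot\nDisallow: /\n\n"
    "User-agent: ClaudeBot\nDisallow: /\n"
)

print(crawler_access(open_policy))      # every crawler allowed
print(crawler_access(blocking_policy))  # GPTBot/ClaudeBot blocked, rest allowed
```

Since opting out is per-agent and purely voluntary, a site that does nothing at all is maximally visible to AI systems — which is all "AI-targeted SEO" needs to start with.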

[–] TwilitSky@lemmy.world 4 points 1 day ago

Friends ask me who I voted for every election. I always go into a long-winded explanation of the candidates and what they stand for before sharing my selection and reasoning.

[–] TropicalDingdong@lemmy.world 4 points 1 day ago

People are going to do the laziest thing possible. More at 11.

[–] Xylight 0 points 1 day ago

@grok is this true

[–] tal@lemmy.today -1 points 1 day ago

I think that if you aspire to regulate the political positions that AIs should recommend, you... okay, I think that's probably not a great idea, but setting that aside, it seems pretty odd that you'd want to do that but not regulate the political positions of webpages that search engines return, or the political positions that news media may take, which are what I'd consider the alternate information sources.