this post was submitted on 26 Aug 2025 to science

[–] OpenPassageways@lemmy.zip 1 points 4 hours ago

Lost me at "AI-powered decoder".

AI is a meaningless term now. They should have just said "algorithm"; either that, or it's bullshit.

[–] thegr8goldfish@startrek.website 35 points 2 days ago (4 children)

There should be an indefinite moratorium on this research under the current regime. Way too much potential for abuse.

[–] latenightnoir@lemmy.blahaj.zone 16 points 2 days ago (1 children)

But it's in Everyone's Best Interest! Do you want to hide your thoughts? Why would you want that? Why do you oppose our unbridled access to the deepest recesses of your mind? Are you scared of what we might find? Do not be afraid, we just want to make you better! To bring our Harmony to your mind! To bring you our Peace! You must let us Enlighten you!

[–] undefined@lemmy.hogru.ch 4 points 2 days ago

Random casing is on brand.

[–] logicbomb@lemmy.world 9 points 2 days ago (2 children)

This research uses a brain-computer interface (BCI): electrodes implanted directly in the brain.

Due to the invasiveness of the procedure, I don't imagine the current administration would find much use for it. Their strong-arm and propaganda methods are working well enough for their evil plans.

[–] prole@lemmy.blahaj.zone 2 points 1 day ago

The procedure is invasive at the moment, but that doesn't mean it always will be.

[–] Asafum@feddit.nl 5 points 2 days ago (1 children)

Due to the invasiveness of the procedure, I don’t imagine the current administration would find much use for it.

Or: "All Liberals must undergo an American Loyalty Test. Your thoughts will be observed and if any anti-american sentiment is discovered you will be deported!"

If it's used against an "enemy" group the invasiveness (potential pain) would be viewed as a benefit to them.

Not that I think this could be reasonably carried out in a country of 340 million people, but I wouldn't for one second doubt their willingness to try eventually.

[–] logicbomb@lemmy.world 6 points 2 days ago

In a world where they can force half the country to get implants, I don't think they need to worry about justifying their actions. They'd just deport the people straight away or put them in death camps.

[–] FishFace@lemmy.world 5 points 2 days ago

How do you imagine the current regime abusing this technology in a way that doesn't imply they already have complete power over the people they want to abuse?

I think if you're saying this, you probably didn't understand what's involved in this research.

[–] mfed1122@discuss.tchncs.de 1 points 2 days ago

Same thought here. This technology is amazing and really can do so much good. It could even be useful one day just for regular computer operations. But for the love of everything good in this world, if we develop the ability to read minds like this from a distance, without needing implants or things strapped to the brain - we're in serious trouble.

And this could already be used for interrogation. To be honest, I'd be surprised if it weren't already being considered for trials somewhere. Seems like the kind of thing that would be declassified 50 years from now.

[–] FlowVoid@lemmy.world 28 points 2 days ago* (last edited 2 days ago) (2 children)

Just so it's clear, each subject spent hours training their own AI to recognize words they intentionally produced as "inner voice". At the end, each trained AI could only recognize those trained words, and only from their trainer.

That's quite a bit different than an AI that could recognize new words in a completely different person. I think that's as unlikely as training ChatGPT using only English texts and then expecting it to understand Chinese.
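
In toy form, that setup is just a per-subject classifier over a closed word list. A minimal sketch, where every name, size, and model choice is made up for illustration and not taken from the study:

```python
# Toy sketch of a per-subject, closed-vocabulary decoder.
# All sizes, words, and the classifier are illustrative assumptions,
# not how the study's decoder is actually built.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
VOCAB = ["yes", "no", "water", "help", "stop"]  # stand-in for the trained word list
N_FEATURES = 128        # hypothetical neural feature dimension
TRIALS_PER_WORD = 40    # hours of cued "inner voice" repetitions

# Fake per-subject data: each word has its own neural signature
# in this one subject's recordings, plus noise.
signatures = rng.normal(size=(len(VOCAB), N_FEATURES))
X = np.vstack([s + rng.normal(scale=2.0, size=(TRIALS_PER_WORD, N_FEATURES))
               for s in signatures])
y = np.repeat(np.arange(len(VOCAB)), TRIALS_PER_WORD)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# The decoder can only ever output one of its trained labels. A new
# word, or another subject's signals, still gets forced into the vocab.
unseen = rng.normal(size=(1, N_FEATURES))
print(VOCAB[decoder.predict(unseen)[0]])
```

Whatever signal you feed it, it can only ever answer with one of the words it was trained on, from the one brain it was trained on.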

[–] masterofn001@lemmy.ca 3 points 1 day ago

First they built a silicon cube the size of a toaster with a little switch in it that responded to electrical signals.

Now we have billions to trillions of those in our pockets.

Etc.

[–] DomeGuy@lemmy.world 5 points 2 days ago (1 children)

Was there any reported reduction in training time needed for subsequent words?

Just getting a computer to understand anything from the implanted wires is progress, but if it's "spend hours training for each single word", we're still at a 1970s sci-fi level of interaction.

[–] FlowVoid@lemmy.world 5 points 2 days ago

Normally words are all trained together, not one after another. Subjects do a monitored task that includes the entire vocabulary list, data from that task is processed extensively off-line, and finally subjects return to test all the vocabulary words.
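
A toy version of that workflow, where all numbers, the noise model, and the classifier choice are illustrative assumptions rather than details from the study:

```python
# Toy version of the workflow: one monitored session covering the
# whole vocabulary, an offline fit, then a later test session.
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(1)
VOCAB_SIZE, N_FEATURES, REPS = 50, 64, 20
signatures = rng.normal(size=(VOCAB_SIZE, N_FEATURES))

def session():
    """One recording session: every vocabulary word cued REPS times."""
    X = np.vstack([s + rng.normal(scale=2.0, size=(REPS, N_FEATURES))
                   for s in signatures])
    y = np.repeat(np.arange(VOCAB_SIZE), REPS)
    return X, y

# Session 1: the monitored task; the model is fit offline afterwards.
X_train, y_train = session()
decoder = NearestCentroid().fit(X_train, y_train)

# Later session: the subject returns and all the words are tested.
X_test, y_test = session()
print("accuracy:", (decoder.predict(X_test) == y_test).mean())
```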

[–] ileftreddit@piefed.social 11 points 2 days ago* (last edited 2 days ago) (1 children)

I mean, I take anything that Popular Mechanics publishes with a HUGE grain of salt.

edit: yup, the brain interface had a vocabulary of 50 words, with a 33% error rate. A few more years of study before they start microchipping toddlers.
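
For scale, my own back-of-envelope arithmetic, assuming "33% error" means per-word classification error on that 50-word set:

```python
# Back-of-envelope check on the numbers quoted above.
vocab_size = 50
error_rate = 0.33
accuracy = 1 - error_rate     # ~67% of words decoded correctly
chance = 1 / vocab_size       # 2% if the decoder guessed at random
print(f"accuracy {accuracy:.0%} vs chance {chance:.0%}")
```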

[–] Artisian@lemmy.world 3 points 2 days ago

More and more I want the option to automatically downweight or hide links from certain sources...

[–] BreadOven@lemmy.world 9 points 2 days ago* (last edited 2 days ago) (1 children)

My inner monologue: "Ahhhhhhhhhhhhhhhhhhhhhhhh".

[–] yermaw@sh.itjust.works 7 points 2 days ago (1 children)

Baby shark doodoododododododoo

[–] lena@gregtech.eu 2 points 2 days ago

[–] frustrated_phagocytosis@fedia.io 2 points 2 days ago (1 children)

People should start volunteering for these studies and absolutely fuck them up. Lie about the model correctly predicting everything.

[–] bus_factor@lemmy.world 6 points 2 days ago

Why fuck them up? This is typically used for people with disabilities. They won't spy on you with it; that's more easily done through other means.

That being said, they've royally messed up the experiment design if you're capable of lying about the results without getting caught.