I was so confused; I thought it had accidentally done a Nazi salute while removing the headset and then shut down because of some sort of rule.
I had no idea that instead of the robot standing there doing its own thing, a person controls it remotely. What a great idea for nuclear waste cleanup, fucking terrible for handing out water.
Sorry to disappoint you, but there's no way this thing is usable for nuclear waste cleanup. For comparison: the cleanup crews at Chernobyl wanted to use robots to clear the graphite rubble off the roof of the power plant after the accident because of the high radiation levels there, but the radiation crashed them almost instantly, forcing them to use human liquidators.
Components these days are surely even less resistant to radiation because of their much higher density, which means the memory and cache in this thing would end up looking like they'd been through a blender.
I'd disagree; we've had four decades to learn how to harden electronics for high-radiation environments. Off-the-shelf stuff? Sure, that would be fucked. Purpose-built, though:
https://www.sustainability-times.com/climate/mission-impossible-now-possible-these-high-tech-robots-to-heroically-clear-2850-radioactive-sandbags-from-fukushima-plant/
https://www.science.org/content/article/how-robots-are-becoming-critical-players-nuclear-disaster-cleanup
https://www.jalopnik.com/these-robots-go-into-fukushima-daiichi-so-people-don-t-1850032340/
Yeah, I used to work at a chip company many years ago, and at a minimum our rad-hardened chips had special outer packages instead of the normal consumer ones. They're typically used for space stuff. Still not sure how well they'd hold up in Chernobyl, in fairness.
For specially designed purposes, I'd say the links above show someone's figured it out.
We are talking about something designed by a company owned by Musk that is already faking autonomy. The tech you write about is specialized for this line of work, and none of it is wireless, humanoid, or has the processing power to work autonomously. But I'm happy that a part of this work can finally be done remotely.
Wasn't opining on Musk's shitware, just on the idea that there's no way to operate in high-rad environments.
Yep, we do this for particle accelerators and in other extreme radiation environments all the time.
Have NASA design them. They already do radiation hardening for outer space.
I believe the levels of radiation are several orders of magnitude different. I don't think you can even use a digital camera for a robot near these open reactors as the signal is completely swamped by the radiation, while in space you would just have a couple of inaccurate pixels at any point in time.
Cameras are a whole other problem, but aside from tech debt, NASA likes to use older computers because the traces on the chips are bigger and less likely to have their bits flipped. A friend of mine programs for satellites and his systems use some sort of PowerPC chip.
Why would consumer electronics be radiation hardened? I didn't mean to say that we can't do radiation-hardened robots, it's just that these ones won't be.
Well, you can make radiation-resistant electronics, or shield them with lead.
But I would design a robot that has at least 4 wheels or legs.
Falling over is something you don't want to happen.
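To put rough numbers on the lead-shielding idea: gamma flux through a shield falls off exponentially, I = I0·e^(−μx). A minimal sketch; the attenuation coefficient below is an illustrative value for ~660 keV gammas in lead, not a design number:

```python
import math

# Exponential attenuation of gamma flux through shielding: I = I0 * exp(-mu * x).
# mu is an illustrative linear attenuation coefficient for ~660 keV gammas
# in lead (~1.2 per cm); real designs use tabulated values for the
# specific isotope and material (e.g. from NIST).
mu = 1.2  # 1/cm, assumed for illustration

for x_cm in (1, 5, 10):
    passed = math.exp(-mu * x_cm)
    print(f"{x_cm:2d} cm of lead passes {passed:.4%} of the flux")
```

A few centimetres help a lot, but shielding a whole mobile robot quickly runs into the weight problem discussed further down the thread.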
You aren't going to get a wireless connection in a radiation zone.
Yesnt. Just use a different frequency.
Or cables
4 legs, cables, ... it's already tricky to navigate a complex space on 2 legs without cables ...
I feel like this is close to Rodney Brooks' recent piece, https://rodneybrooks.com/why-todays-humanoids-wont-learn-dexterity/ , namely that yes, all of that is easy to imagine, but in practice, reaching even the seemingly basic level of movement an average human can manage, at a similar weight rather than a ton, is ridiculously hard. Biological organisms aren't magical by any stretch of the imagination, but somehow manufacturing an equivalent is not something we are able to do. Each extra wire adds a bit more weight, which in turn needs more powerful servos, which in turn make the joints below them require more power too, and keeping the whole thing mobile needs a very powerful battery... so yes, making design suggestions is easy, but fully comprehending the consequences of those choices often means going back to the drawing board.
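A toy calculation of that cascade, with numbers invented purely for illustration: every gram added at the hand has to be carried by every joint below it, and beefier actuators are themselves heavier:

```python
# Toy mass cascade: invented numbers, just to show the compounding.
extra_payload = 0.2      # kg of added wiring/sensors at the hand (assumed)
actuator_overhead = 0.5  # kg of extra actuator mass per extra kg carried (assumed)

mass = extra_payload     # running total each joint must carry
for joint in ("wrist", "elbow", "shoulder"):
    growth = mass * actuator_overhead  # heavier servo needed at this joint
    mass += growth                     # ...which the next joint down must carry too
    print(f"{joint}: carrying {mass:.2f} kg extra")
```

With these made-up figures, a 0.2 kg addition at the hand has more than tripled by the time the shoulder has to carry it.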
Fun fact: VR players know quite well what moving while tethered means. I can tell you firsthand, it's damn annoying, BUT if you don't manage it, you will both fall and break your system.
Who says it needs to be wireless?
We literally have drones right now tethered in active war zones because of jamming, and they work fine until someone purposely snips the wire.
Hopefully we can ask the mole people in Chernobyl not to snip the wires.
At the rate the Fukushima robots fail, you're going to have a load of cables trailing into the radiation zone.
That's very interesting. Robots are less resistant to radiation than humans? So when robots take the jobs of people, production is made more vulnerable to nuclear weapons?
In a way. Cell damage can be repaired when it occurs in low amounts, and even broken DNA strands can be fixed by the machinery in our cells. Most importantly, our systems are very redundant on a cellular level; losing a few cells is not much of an issue, since we lose cells every day anyway. Computers have nearly no redundancy; in some cases, a single bit flipped by a gamma ray can cause a system crash in any computer. There is stuff like ECC for memory, which helps, but even that isn't foolproof. Computers for space missions outside of Earth's magnetosphere are designed to keep component density low, with lots of error-correcting code, backups, and a lot of lead shielding, which equals lower performance.
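For a feel of what that error-correcting code does, here's a toy single-error-correct/double-error-detect (SECDED) Hamming sketch. Real ECC memory uses wider codes (e.g. 72 bits protecting 64), but the principle is the same; everything below is illustrative:

```python
# Toy Hamming(7,4) plus an overall parity bit (SECDED): corrects any
# single bit flip and detects (but cannot correct) any double bit flip.

def encode(d):  # d: list of 4 data bits
    # parity bits at positions 1, 2, 4 (1-indexed); data at 3, 5, 6, 7
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p4 = d[1] ^ d[2] ^ d[3]
    code = [p1, p2, d[0], p4, d[1], d[2], d[3]]
    code.append(sum(code) % 2)  # overall parity bit enables double detection
    return code

def decode(c):
    # syndrome: XOR of the (1-indexed) positions of all set bits
    s = 0
    for i in range(7):
        if c[i]:
            s ^= i + 1
    overall = sum(c) % 2        # 0 if total parity is still even
    if s == 0 and overall == 0:
        status = "ok"
    elif overall == 1:          # odd parity: a single-bit error occurred
        if s:                   # syndrome points at the flipped bit
            c[s - 1] ^= 1
        status = "corrected"
    else:                       # even parity but nonzero syndrome
        status = "double-bit error detected (uncorrectable)"
    data = [c[2], c[4], c[5], c[6]]
    return data, status

word = encode([1, 0, 1, 1])
word[4] ^= 1                    # simulate a gamma-ray bit flip
print(decode(word))             # -> ([1, 0, 1, 1], 'corrected')
```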
I think you are both overestimating the ability of biological systems and underestimating the ability of mechanical systems to be repaired.
Biological systems have incredible self-repair capabilities, but are otherwise largely unrepairable. To fix issues with biological systems you mostly have to work within the bounds of those self-repair mechanisms which are slow, poorly understood and rather limited.
Losing a few skin cells is perfectly normal. Corrupting a few skin cells can cause cancers or autoimmune disorders. Losing a few Purkinje cells can lead to significant motor impairment and death.
Computers, and mechanical systems in general, can have a shit ton of redundancy. You mention ECC, but neglected the layers of error correction, BIST, and redundancy that even the cheap, broken, cost-optimized, planned-obsolescence consumer crap most people are familiar with makes heavy use of.
A single bit flipped by a gamma ray will not cause any sort of issue in any modern computer. I cannot overstate how often this and other memory errors happen. A double bit flip can cause issues in a poorly designed system and, again, is not just caused by cosmic rays. However, it's not usually that hard to have multiple redundancies if that is a concern, such as with high-altitude, extreme-environment, or highly miniaturized systems. It does increase cost and complexity, though.
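One classic form of the redundancy being described is triple modular redundancy: run three replicas and majority-vote the results, masking any single fault. A minimal sketch with an invented fault model:

```python
import random

# Triple modular redundancy (TMR) sketch. The fault rate and faulty
# adder below are made up purely to demonstrate the voting.

def noisy_adder(x, y, fault_rate=0.1):
    """A replica that occasionally returns a corrupted sum."""
    result = x + y
    if random.random() < fault_rate:
        result ^= 1 << random.randrange(8)  # flip a random low bit
    return result

def tmr_add(x, y):
    votes = [noisy_adder(x, y) for _ in range(3)]
    # majority vote: return any value at least two replicas agree on
    for v in votes:
        if votes.count(v) >= 2:
            return v
    raise RuntimeError("no majority: multiple simultaneous faults")

print(tmr_add(40, 2))  # almost always 42, despite single-unit faults
```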
The huge benefit of mechanical systems is they are fully explainable and replaceable. CPU get a bunch of radiation and seems to be acting a bit weird? Replace it! Motor burnt out? Replace it! The new system will be good as new or better.
You can't do that in a biological system. Even with autografts (using the person's own tissues for "replacements"), the risk of scarring, rejection, and malignancy remains fairly high and doesn't result in a "good as new" outcome, but somewhere between 'death' and 'minor permanent injury'. Allografts (donor tissues) often need lifelong medications and maintenance to not fail, and even "minor" transplants carry the risk of infection, necrosis, and death.
That study doesn't seem to support the point you're trying to use it to support. First it's talking about machines with error correcting RAM, which most consumer devices don't have. The whole point of error correcting RAM is that it tolerates a single bit flip in a memory cell and can detect a second one and, e.g. trigger a shutdown rather than the computer just doing what the now-incorrect value tells it to (which might be crashing, might be emitting an incorrect result, or might be something benign). Consumer devices don't have this protection (until DDR5, which can fix a single bit flip, but won't detect a second, so it can still trigger misbehaviour). Also, the data in the tables gives figures around 10% for the chance of an individual device experiencing an unrecoverable error per year, which isn't really that often, especially given that most software is buggy enough that you'd be lucky to use it for a year with only a 10% chance of it doing something wrong.
It's a paper from 2009 talking about "commodity servers" with ECC protection. Even back then it was fairly common and relatively cheap to implement, though it was more often integrated into the CPU and/or memory controller. Since 2020, with DDR5, it's mandatory for it to be integrated into the memory as well.
Yes, that's my point. Your claim of "computers have nearly no redundancy" is complete bullshit.
It wasn't originally my claim - I replied to your comment as I was scrolling past because it had a pair of sentences that seemed dodgy, so I clicked the link it cited as a source, and replied when the link didn't support the claim.
Specifically, I'm referring to: "Computers have nearly no redundancy; in some cases, a single bit flipped by a gamma ray can cause a system crash in any computer."
This just isn't correct.
Sorry, I wasn't paying attention and missed that. I apologize.
Integrated memory ECC isn't the only check, it's an extra redundancy. The point of that paper was to show how often single bit errors occur within one part of a computer system.
Right, because of redundancies. It takes two simultaneous bit flips in different regions of the memory to cause a memory error, and it's still a ~10% chance annually, according to the paper I cited.
ECC genuinely is the only check against memory bitflips in a typical system. Obviously, there's other stuff that gets used in safety-critical or radiation-hardened systems, but those aren't typical. Most software is written assuming that memory errors never happen, and checksumming is only used when there's a network transfer or, less commonly, when data's at rest on a hard drive or SSD for a long time (but most people are still running a filesystem with no redundancy beyond journaling, which is really meant for things like unexpected power loss).
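For what checksumming in that sense looks like in practice, a small sketch using CRC32 (the payload and fault are made up): it detects corruption but can't repair it, so the receiver just asks for a retransmit:

```python
import zlib

# Checksumming for data in transit: the sender ships a CRC alongside
# the payload; the receiver recomputes it and rejects on mismatch.

payload = b"telemetry frame 42"          # made-up payload
sent_crc = zlib.crc32(payload)

received = bytearray(payload)
received[3] ^= 0x01                      # simulate a flipped bit in transit
ok = zlib.crc32(bytes(received)) == sent_crc
print("accept" if ok else "retransmit")  # -> retransmit
```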
There are things that mitigate the impact of memory errors on devices that can't detect and correct them, but they're not redundancies. They don't keep everything working when a failure happens, instead just isolating a problem to a single process so you don't lose unsaved work in other applications etc.. The main things they're designed to protect against are software bugs and malicious actors, not memory errors, it just happens to be the case that they work on other things, too.
Also, it looks like some of the confusion is because of a typo in my original comment where I said unrecoverable instead of recoverable. The figures that are around 10% per year are in the CE column, which is the correctable errors, i.e. a single bit that ECC puts right. The figures for unrecoverable/uncorrectable errors are in the UE column, and they're around 1%. It's therefore the 10% figure that's relevant to consumer devices without ECC, with no need to extrapolate how many single bit flips would need to happen to cause 10% of machines to experience double bit flips.
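To put those columns in perspective, a quick back-of-the-envelope using the thread's ~10%/year CE and ~1%/year UE per-machine figures (the fleet size is a made-up illustration):

```python
# Per-machine annual probabilities from the thread's reading of the paper.
p_ce, p_ue, fleet = 0.10, 0.01, 100  # fleet size assumed for illustration

# chance that at least one machine in the fleet hits each error type per year
at_least_one_ce = 1 - (1 - p_ce) ** fleet
at_least_one_ue = 1 - (1 - p_ue) ** fleet
print(f"correctable error somewhere in fleet:   {at_least_one_ce:.0%}")  # ~100%
print(f"uncorrectable error somewhere in fleet: {at_least_one_ue:.0%}")  # ~63%
```

Rare per machine, but near-certain across any sizeable deployment, which is why servers bother with ECC at all.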
It also means humans will be progressively pushed into the most dangerous jobs because the robot circuitry can’t cope with harsh environments. The easy cushy jobs will go to the robots.
Could be some exceptions.
Off the top of my head: Anything with poisonous gases. Anything where there's a RISK of an explosion or something (so the robots would work before the explosion; this is kinda already a thing with bomb disposal robots, isn't it?). Etc.
So for sure anything nuclear will have to be human, but there could be other environments where robots survive and humans won't.
Humans are, in general, absurdly robust. You can absolutely mess them up, and they will keep chugging along for a while before breaking down. Not to mention their almost frightening ability to make a full recovery from horrendous injuries.
Most robots/machines will be more or less completely disabled by a faulty connection, clogged valve, or torn hydraulic line. Sure, you can shield them more, but for stuff like radiation, dust, and harsh environments that cause gradual degradation, you're going to have a very hard time beating the resilience of humans.
Bleep Bloop... it is clearly advantageous that we use humans to operate in harsh environments rather than robots... Bleep Ding.
And don't forget cheaper!
Which is why it's imperative that little Timmy is sent to the mines despite all the risks and occupational health hazards that will eventually kill him.
Robots were sent into the Chernobyl reactor and they stopped working immediately. Gamma radiation fries circuits.
In the end, they sacrificed soldiers above, dumping sacks of cement, and miners below, laying a foundation to stop the core melting into the earth.
You can know that isn't the case, because a Nazi salute would be encouraged by Musk, not cause a shutdown.