It uses 1% of the energy but is still 1000x faster than our current fastest cards? Yeah, I'm calling bullshit. It's either a one-off, bullshit, or the next industrial revolution.
I mean, it's like the 10th time I'm reading about THE breakthrough in Chinese chip production on Lemmy, so let's just say I'm not holding my breath, lol.
Yeah, it's like reading about North American battery science. Like, yeah, ok, cool, see you in 30 years when you're maybe production ready.
I’m ready for this entire category of “science blog” website to disappear. Mainstream media are bad enough at interpreting movements in science, but these shitty little websites make their entire living off of massively overblowing every little thing. Shame on OP and anyone who posts this kind of garbage.
https://www.nature.com/articles/s41928-025-01477-0
Here's the paper published in Nature.
However, it's worth noting that Nature has had to retract studies before:
https://en.wikipedia.org/wiki/Nature_(journal)#Retractions
> From 2000 to 2001, a series of five fraudulent papers by Jan Hendrik Schön was published in Nature. The papers, about semiconductors, were revealed to contain falsified data and other scientific fraud. In 2003, Nature retracted the papers. The Schön scandal was not limited to Nature; other prominent journals, such as Science and Physical Review, also retracted papers by Schön.
Not saying that we shouldn't trust anything published in scientific journals, but yes, we should wait until more studies that replicate these results exist before jumping to conclusions.
For the love of Christ this thumbnail is triggering, lol
Just push ever so slightly more when you hear the crunching sounds.
Look, it's one of those articles again. The bi-monthly "China invents earth-shattering technology breakthrough that we never hear about again."
"1000x faster?" Learn to lie better. Real technological improvements are almost always incremental, like "10-20% faster, bigger, stronger." Not 1000 freaking times faster. You lie like a child. Or like Trump.
> "1000x faster?" Learn to lie better
Analogue computers are indeed capable of doing a task 1000x faster than a regular computer. The difference is that they do only that task, in a very specific way, and with one specific type of output. You can 3D print an "analogue computer" at home that solves calculus equations; it can technically be faster than a CPU, but that's the only thing it can do, it's complex, and the output is a drawing on paper.
If you come up with a repeatable and precise set of mechanical movements that are analogous to the problem you want to solve, you can indeed come up with headlines like that.
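To make that concrete, here's a toy Python simulation of the classic textbook example: an RC circuit, whose voltage obeys dV/dt = (V_in − V)/RC. The real circuit "solves" that differential equation continuously just by existing; the component values below are made up for illustration and have nothing to do with the chip in the article.

```python
# Toy model: an analog computer is a physical system whose dynamics happen
# to match your equation. An RC circuit's voltage obeys
#   dV/dt = (V_in - V) / (R*C)
# so the circuit "solves" this ODE continuously, for free -- but that is
# the ONLY thing it solves. Simulating it digitally needs many tiny steps.
R, C = 1_000.0, 1e-6      # ohms, farads (made-up values)
V_in = 5.0                # step input, volts
dt, V = 1e-6, 0.0         # 1 microsecond steps, start discharged

for _ in range(5_000):                 # 5 ms of simulated time
    V += dt * (V_in - V) / (R * C)     # the physical circuit does this "instantly"

print(f"V after 5 ms: {V:.3f} V")      # ~4.97 V, approaching V_in
```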
Because until it hits market, it’s almost meaningless. These journalists do the same shit with drugs in trials or early research.
1000x!
Is this like medical articles about major cancer discoveries?
Yes. Please remember to downvote this post and all others that are based on overblown articles from nobody science blogs.
1000x yes!
(x) Doubt.
Same here. I'll wait to see real-life calculations done by such circuits. They won't be able to do, e.g., a simple float addition without losing/mangling a bunch of digits.
But maybe the analog precision is sufficient for AI, which is an imprecise matter from the start.
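That tolerance is easy to sandbox. Here's a little numpy experiment (the 1% noise level is an arbitrary stand-in for analog imprecision, not a figure from the paper) showing that a matrix-vector product, the core op in neural nets, barely cares about per-weight noise:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))   # stand-in layer weights
x = rng.standard_normal(256)          # stand-in activations

exact = W @ x
# Emulate ~1% analog error on every weight (arbitrary noise level,
# chosen only to show the scale of the effect):
noisy = (W * (1 + 0.01 * rng.standard_normal(W.shape))) @ x

rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(f"relative output error: {rel_err:.4f}")   # ~0.01, i.e. about 1%
```

An exact float addition can't live with that kind of error; a neural net layer mostly shrugs it off, which is the whole pitch for analog matrix math.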
This was bound to happen. Neural networks are inherently analog processes; simulating them digitally is massively expensive in terms of hardware and power.
The digital domain is good for exact computation; analog is better for the approximate computation that neural networks require.
You might benefit from watching Hinton's lecture; much of it details technical reasons why digital is much, much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove. He says the facts forced him to change his mind.
That's a good point. The model weights could be voltage levels instead of digital representations. Lots of audio tech uses analog for better fidelity. I also read that there's a startup using particle beams for lithography. Exciting times.
This seems like promising technology, but the figures they are providing are almost certainly fiction.
This has all the hallmarks of a team of researchers looking to score an R&D budget.
Okay, I'm starting to think this article doesn't really know what it's talking about...
> For most of modern computing history, however, analog technology has been written off as an impractical alternative to digital processors. This is because analog systems rely on continuous physical signals to process information — for example, a voltage or electric current. These are much more difficult to control precisely than the two stable states (1 and 0) that digital computers have to work with.
1 and 0 are in fact representations of voltages in digital computers. In, say, 3.3V or 5V logic, a 0 is nominally zero volts and a 1 is nominally the supply voltage, with threshold bands deciding which is which (the 12V and negative rails on a standard IBM PC are power-supply rails, not logic levels). When you look at the actual voltage waveforms, they aren't really digital but analogue: there's a transient wave as the voltage changes from 0 to 1 and vice versa, a slope that passes through the threshold before reaching the nominal level, not a solid square step. So at the physical level, a digital computer is basically the same as how they're describing an analogue computer.
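A quick toy illustration of that point (the thresholds below are loosely based on common 3.3V CMOS levels, picked for illustration): a "digital" rising edge is physically an analog ramp, and the receiving gate just buckets it.

```python
# A "digital" rising edge is physically an analog ramp; the receiver just
# buckets the voltage using input thresholds (illustrative 3.3 V CMOS-ish
# numbers: below V_IL reads as 0, above V_IH reads as 1, in between is
# undefined -- the forbidden zone every real signal passes through).
V_DD, V_IL, V_IH = 3.3, 0.8, 2.0
SLEW = 1.0                      # volts per nanosecond (made-up slew rate)

for step in range(6):
    t = 0.7 * step              # sample the edge every 0.7 ns
    v = min(V_DD, SLEW * t)
    logic = "0" if v < V_IL else ("1" if v > V_IH else "undefined!")
    print(f"t={t:.1f} ns  V={v:.2f} V  reads as {logic}")
```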
I'm sure there is something different and novel about this study, but the article doesn't seem to have a clue what that is.
The thing which makes digital chips so much better than analog chips is something both you and the article are missing: noise. A digital chip is very robust against noise; as long as the noise in one step isn't big enough to cause an immediate bitflip, the stable configuration will pull the voltage level back and no information is lost. Not so with analog logic: since the information is continuous, every step which introduces noise (which is basically every step) will cause loss of information. Go a few levels of logic deep and all you've got left is noise.
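A toy simulation of that restoring effect (the 5% per-stage noise is a made-up number, just to show the mechanism): digital snaps back to the rail at every stage, while analog lets every stage's noise stack up.

```python
import random

random.seed(42)
STAGES, SIGMA = 40, 0.05        # 40 logic stages deep, 5% noise per stage (made up)

analog = digital = 1.0          # both start as a clean logical "1" (1.0 "volts")
for _ in range(STAGES):
    noise = random.gauss(0, SIGMA)
    analog += noise                                    # analog: every stage's noise sticks
    digital = 1.0 if digital + noise > 0.5 else 0.0    # digital: restored to the rail each stage

print(f"analog after {STAGES} stages:  {analog:.3f}   (1.0 plus 40 stages of accumulated noise)")
print(f"digital after {STAGES} stages: {digital:.1f}   (still exactly 1.0)")
```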
> Go a few levels of logic deep and all you’ve got left is noise.
Which you often don't need. Mechanical computers for aircraft operation, or hydraulic computers for modeling something nuclear, things like that.
But there's nothing "century-old" about all this. They might have non-deterministic steps for some calculation where determinism is not needed (like if you need to ray-trace a sphere, you'll do fine with slightly different dithering each time), and without determinism, better performance is achievable.
The idea seems to make sense; it just will never be revolutionary.
Ahh yeah, and we should 1. believe this exists, and 2. believe that China doesn't think technology of this caliber is a matter of national security.
And it'll be on sale through Temu and Wish.com
> See article preview image
> AI crap CPU
> Leaves immediately
This is not a new line of research, in the sense that this is not the only place looking into mixed analog/digital computers. There have been articles on it for at least a year, I think, and when digital was taking over there was a lot of discussion about it being inferior to analog, so I bet the idea of combining the two has been thrown around since digital became a thing.