As far as I can find out, there was only one use of GPUs for CNNs prior to AlexNet, and it certainly didn't have the impact AlexNet had. Besides, running this stuff on GPUs rather than CPUs is a relevant technological breakthrough in its own right; imagine how slow chatGPT would be running on a CPU. And it's not at all as obvious as it seems: most weather forecasts still run on CPU clusters despite being obvious targets for GPUs.
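To make the CPU/GPU gap concrete, here's a minimal sketch (assuming PyTorch and an arbitrary 4096×4096 matrix size, purely for illustration) that times the same matrix multiplication, the core operation in these networks, on the CPU and then on a GPU if one is available:

```python
import time
import torch

# Rough illustration, not a proper benchmark: time one large matmul on CPU vs GPU.
n = 4096
a = torch.randn(n, n)
b = torch.randn(n, n)

t0 = time.perf_counter()
_ = a @ b
print(f"CPU: {time.perf_counter() - t0:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # GPU launches are async; sync so the timing is meaningful
    t0 = time.perf_counter()
    _ = a_gpu @ b_gpu
    torch.cuda.synchronize()
    print(f"GPU: {time.perf_counter() - t0:.3f} s")
```

The exact numbers depend on hardware, but the gap is typically an order of magnitude or more, which is the whole point about training and serving large models on GPUs.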
What? AlexNet wasn't a breakthrough because it used GPUs; it was a breakthrough for its depth and its performance on image-recognition benchmarks.
We knew GPUs could speed up neural networks back in 2004, and I'm not sure that was even the first demonstration.
Okay, so some of the advances that chatGPT uses (consumer GPUs for training) are even older? 😁
Why stop there? The digital computer was introduced in 1942 and methods for solving linear equations were developed in the 1600s.