this post was submitted on 06 Aug 2025
31 points (84.4% liked)

Technology

OpenAI's first open source language model since GPT-2

top 9 comments
[–] nathan@piefed.alphapuggle.dev 14 points 5 months ago (1 children)

*if you have a laptop with 16 GB of VRAM. Otherwise you'll be watching Ollama hit your CPU for 5 minutes with no output.
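
(If you want to check whether Ollama actually got the model onto the GPU or quietly fell back to CPU, you can ask its local API. Rough sketch below, assuming a default install on localhost:11434 and the /api/ps endpoint with size/size_vram fields; field names can differ between Ollama versions.)

```python
# Rough sketch: ask a locally running Ollama instance how much of the
# loaded model actually ended up in VRAM vs. system RAM.
# Assumes a default install (http://localhost:11434) and the /api/ps
# endpoint returning "size" / "size_vram" per model -- these field names
# may vary across Ollama versions.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/ps") as resp:
    ps = json.load(resp)

for m in ps.get("models", []):
    total = m.get("size", 0)         # total bytes occupied by the loaded model
    in_vram = m.get("size_vram", 0)  # bytes resident on the GPU
    pct = 100 * in_vram / total if total else 0
    print(f"{m.get('name')}: {pct:.0f}% in VRAM "
          f"({in_vram / 2**30:.1f} of {total / 2**30:.1f} GiB)")
```

Anything well below 100% in VRAM means layers got pushed to the CPU, which is where the "5 minutes with no output" experience comes from.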

[–] sefra1@lemmy.zip 6 points 5 months ago (3 children)

Isn't that true for most models until someone distils and quantises them so they can run on common hardware?

This is the internet, we're only allowed to be snarky here.

[–] Ghoelian@lemmy.dbzer0.com 1 points 5 months ago* (last edited 5 months ago)

I mean yeah, but that doesn't make the title any more true.

[–] CyberSeeker@discuss.tchncs.de 1 points 5 months ago (1 children)

Yes, but 20 billion parameters is too much for most GPUs, regardless of quantization. You would need at least 14 GB of VRAM, and even that’s unlikely without offloading major parts to the CPU and system RAM (which kills the token rate).

I tried it out last night and it ran quite well on my heavily thermally limited i9-11950H / RTX 3080 laptop. I had maybe 6 or 7 gigs of main RAM used in total, with Docker running. It was only using about 12 gigs of VRAM in my very limited testing.
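
(Rough sanity check on those numbers: weight memory is roughly parameter count times bits per weight. The bits-per-weight figures below are my approximations for common quantization formats, and none of this counts KV cache or runtime overhead, which add a few more GB on top.)

```python
# Back-of-the-envelope weight-memory estimate for a 20B-parameter model
# at different quantization levels. Weights only -- KV cache and runtime
# overhead come on top. Rough illustration, not a benchmark.
PARAMS = 20e9  # parameter count

quant_bits = {
    "fp16": 16.0,   # unquantized half precision
    "q8_0": 8.5,    # ~8.5 bits per weight for the common 8-bit GGUF quant (approx.)
    "q4_K_M": 4.8,  # ~4.8 bits per weight for a typical 4-bit GGUF quant (approx.)
    "mxfp4": 4.25,  # approx. for MXFP4-style 4-bit weights (reportedly what gpt-oss ships in)
}

for name, bits in quant_bits.items():
    gib = PARAMS * bits / 8 / 2**30
    print(f"{name:>7}: ~{gib:5.1f} GiB for weights alone")
```

Around 10-11 GiB of 4-bit weights plus cache and overhead lands right in the 12-14 GB range mentioned above, so the numbers in this thread line up.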

[–] Bebopalouie@lemmy.ca 6 points 5 months ago (1 children)
[–] SweetCitrusBuzz@beehaw.org 0 points 5 months ago
[–] PoisonedPrisonPanda@discuss.tchncs.de 4 points 5 months ago* (last edited 5 months ago)