this post was submitted on 01 Jul 2025
2200 points (98.4% liked)
Microblog Memes
Only because they favor brute force over efficient approaches.
Again, look up DeepSeek's FP8/multi-GPU training paper, and some of the code they published. They used a microscopic fraction of the compute that OpenAI or xAI are using.
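For a sense of where the savings come from (a toy illustration, not DeepSeek's actual code): FP8 E4M3 stores a weight in a single byte with only 3 mantissa bits, quartering memory and bandwidth versus FP32 at the cost of coarser rounding. A minimal simulation of that rounding in plain Python:

```python
import math

FP8_E4M3_MAX = 448.0  # largest finite value representable in E4M3

def fp8_round(x: float) -> float:
    """Simulate rounding a float to FP8 E4M3 precision (1 sign bit,
    4 exponent bits, 3 mantissa bits). Illustrative sketch only:
    subnormals, NaN, and stochastic rounding are ignored."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)        # x = m * 2**e, with 0.5 <= |m| < 1
    m = round(m * 16) / 16      # keep 1 implicit + 3 explicit mantissa bits
    y = math.ldexp(m, e)
    # saturate to the format's range instead of overflowing
    return max(-FP8_E4M3_MAX, min(FP8_E4M3_MAX, y))

# fp8_round(0.1) -> 0.1015625 (~1.6% error); fp8_round(10000.0) -> 448.0
# Storing weights in 1 byte instead of 4 means a 7B-parameter model's
# weights drop from ~28 GB (FP32) to ~7 GB (FP8).
```

The trade-off is exactly what the comment describes: accepting coarse per-value precision (with careful scaling and higher-precision accumulation in practice) to cut memory and communication costs, rather than brute-forcing with more GPUs.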
And models like SDXL or Flux are not that expensive to train.
It doesn't have to be this way, but they can get away with it because being rich papers over internal dysfunction/isolation/whatever. Chinese trainers, and other GPU-constrained ones, are forced to be thrifty.
And I guess they need it to be inefficient and expensive, so that it remains exclusive to them. That's why they threw a tantrum over DeepSeek: it proved it doesn't have to be.
Bingo.
Altman et al want to kill open source AI for a monopoly.
This is what the entire AI research space already knew even before DeepSeek hit, and why they (largely) think so little of Sam Altman.
The real battle in the space is not AI vs. no AI, but exclusive use by AI Bros vs. open models that bankrupt them. Which is what I keep trying to tell /c/fuck_ai, as the "no AI" stance plays right into the AI Bros' hands.