this post was submitted on 23 Oct 2025

news

[–] Wheaties@hexbear.net 30 points 5 months ago* (last edited 5 months ago) (3 children)

Making AI cheaper and less power hungry stands to put the world’s most in-demand technology within reach of millions more people — and to empower African startups to design products for African users.

Uh, I think the bottleneck is less design and more production? Unless by "product" they mean computer code...

[chief solutions architect for Huawei Cloud in sub-Saharan Africa, Harrison Li] clicked through a slide deck, presenting packages tailored to all levels of users and businesses: a free tier, pay-as-you-go hourly rates for DeepSeek models hosted on Huawei Cloud and more compute-intensive options for developers building chatbots and apps [emphasis mine]. For governments, he explained how private cloud systems could be physically installed in offices and ministries.

...they mean computer code. Note there isn't an example of what these governments can actually use the programme for.

To some critics, this carries ominous echoes of Belt and Road programs that helped some poor countries build critical infrastructure like ports, highways and airports, but left them heavily indebted and financially dependent on Chinese suppliers.

As opposed to before when their suppliers were...? If these countries had funded their infrastructure projects through the IMF, pretty sure they'd still be buying Chinese goods.


that's just nitpicking, though; here's the interesting bit:

For African startups like EqualyzAI, DeepSeek is “orders of magnitude” cheaper than competitors, Adekanmbi said. DeepSeek Chat, for instance, charges 27 cents to process one million tokens of query sent and $1.10 for every million tokens it generates in response. OpenAI’s GPT-4o charges $5 to process the same amount of tokens of query sent and $15 to produce the same amount of tokens in response. If EqualyzAI used GPT-4o, the startup would pay about $12,500 a month to train a small-language model for an e-learning platform, as opposed to the roughly $2,700 per month it now pays DeepSeek for the same task.
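Plugging the quoted per-million-token rates into a quick sanity check (the monthly token volumes below are my own assumption for illustration, not figures from the article):

```python
# Per-million-token API prices quoted in the article (USD).
PRICES = {
    "DeepSeek": {"in": 0.27, "out": 1.10},
    "GPT-4o":   {"in": 5.00, "out": 15.00},
}

def monthly_cost(model, in_mtok, out_mtok):
    """Cost for a workload of in_mtok / out_mtok million tokens."""
    p = PRICES[model]
    return in_mtok * p["in"] + out_mtok * p["out"]

# Assumed input-heavy training workload: 5,000M in, 1,000M out per month.
ds = monthly_cost("DeepSeek", 5000, 1000)  # ~$2,450
oa = monthly_cost("GPT-4o", 5000, 1000)    # ~$40,000
print(f"DeepSeek ${ds:,.0f}  GPT-4o ${oa:,.0f}  ratio {oa/ds:.1f}x")
```

The exact ratio depends on the input/output mix, since GPT-4o's markup is roughly 18x on input tokens but 14x on output tokens.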

So even charging 15 times as much, OpenAI is still running at a loss. A big one. But... even considering that DeepSeek is a more lightweight/efficient programme and China overall is rapidly expanding its electricity output... it still seems hard to imagine any profit is actually happening. About the only actual benefit I see here is this:

EqualyzAI’s engineers used DeepSeek’s open-source architecture as scaffolding to start creating specialized small models as well as automated smart assistants. The vast majority of data in Africa hasn’t been digitized, so contractors across the continent are paid to gather agricultural, medical and financial records, as well as audio in Yoruba, Hausa and Nigerian-accented English. EqualyzAI then trained individual models on the relevant datasets, and tweaked the open weights — the code that instructs an AI model to emphasize or ignore certain information — for each customer. The resulting chatbots and apps are now being used by fintech companies, e-learning platforms and health-care startups. Like all companies that build on DeepSeek, they can choose to either host their products locally and pay for computing and storage infrastructure, or go through providers like Huawei. EqualyzAI does the former.

It keeps the big US firms from building up models in other languages, keeps the servers (relatively) local and in the hands of people who at least actually live alongside the populations who are going to be impacted by it.

[–] LeeeroooyJeeenkiiins@hexbear.net 25 points 5 months ago (2 children)

left them heavily indebted and financially dependent on Chinese suppliers.

Is there even a single instance where this is true? Because I heard about the Chinese just straight up forgiving numerous debts

[–] breadguy@kbin.earth 10 points 5 months ago

yeah that's usually just propaganda. also, open models can be run on infrastructure anywhere, so idk what they're talking about

[–] cfgaussian@lemmygrad.ml 4 points 5 months ago* (last edited 5 months ago)

Nope. In every instance where a debt could not be repaid China either restructured it or outright cancelled the debt.

The reason for this isn't altruism, it's rational self-interest. China wants to keep doing productive business with these countries and a bankrupt country shackled with unpayable debt makes for a bad business partner, they will have nothing to offer and little to no purchasing power. It is in China's interest to help the global south continue to develop economically.

If that means forgiving their debts then that is a small price to pay for a long term, lucrative, mutually beneficial trade relationship.

The reason the West doesn't think like this is that the West doesn't have economies based on real production and development; it has vampire economies based on finance, rent and the extraction of super-profits. They squeeze as much as they can out of a country until there is nothing left to squeeze, and then blame those same countries for their own poverty on account of not being civilized or capitalist enough, exactly like a bigger corporation that buys out a smaller firm, runs it into the ground and sells off its assets.

Also, the West is run by bankers and the entire Western culture has been infected for centuries with the mentality of the banker in which debts are practically worshipped and seen as sacred and the idea of forgiving a debt is an unthinkable sacrilege, a moral peril.

[–] jackmaoist@hexbear.net 24 points 5 months ago

You can also avoid paying trillions to Nvidia if you use DeepSeek. Nvidia's CEO was basically begging Trump to let them sell chips in China while they still can.

[–] piccolo@hexbear.net 2 points 5 months ago

From your second quote:

Like all companies that build on DeepSeek, they can choose to either host their products locally and pay for computing and storage infrastructure, or go through providers like Huawei. EqualyzAI does the former.

So that means DeepSeek is not getting a cent from this company. It's open-weight, meaning anyone with sufficiently powerful hardware can just run DeepSeek, unlike OpenAI's state-of-the-art models, which can only be run by companies that contract with OpenAI to get the weights (as far as I know, this is basically just Google (Vertex) and Amazon (Bedrock)).
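That host-locally-vs-provider choice comes down to a break-even point. A rough sketch using the article's quoted DeepSeek API rates (the fixed hardware cost and traffic mix are my assumptions, purely illustrative):

```python
# Article's quoted DeepSeek API rates (USD per million tokens).
API_IN, API_OUT = 0.27, 1.10

def api_cost(in_mtok, out_mtok):
    """Pay-per-token cost for a monthly workload."""
    return in_mtok * API_IN + out_mtok * API_OUT

# Assumption: self-hosting the open weights costs a roughly fixed
# $5,000/month (amortized GPUs plus power), regardless of volume.
SELF_HOST_MONTHLY = 5000.0

# Assumed traffic mix: 10 input tokens per output token.
for in_mtok in (1_000, 5_000, 20_000):
    cost = api_cost(in_mtok, in_mtok / 10)
    cheaper = "API" if cost < SELF_HOST_MONTHLY else "self-host"
    print(f"{in_mtok:>6}M input tokens/mo: API ${cost:,.0f} -> {cheaper}")
```

Under these assumptions the crossover lands on the order of ten billion tokens a month: below it, per-token pricing wins; above it, running the open weights yourself wins, which would explain why EqualyzAI self-hosts and why DeepSeek sees no revenue from them.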

But... even considering that DeepSeek is a more lightweight/efficient programme and China overall is rapidly expanding their electricity output... it still seems hard to imagine any profit is actually happening

I think DeepSeek is absolutely burning money. Right now, almost all Chinese models are open-weight. I've seen numerous hypotheses for why this is the case, but the one that convinces me the most, at least for DeepSeek, is that they're doing it as advertising/recruiting. But DeepSeek's only revenue comes from charging per token on their API, as described in your first quote, and they're competing with every other GPU provider on those prices, so it's an aggressive race to the bottom. It's possible that DeepSeek is even running this at a loss to get more training data from people using their API.

In any case, DeepSeek has made a lot of innovations in doing more training with less power, because they are currently relatively GPU-poor. NVIDIA chips are hard to come by in China, so DeepSeek can't really buy any more of the top-tier models than they already have. Some of these are used to run inference for the API, and some are used for training. But even with all of these optimizations, it costs a lot of money to train an LLM, and it's hard to imagine that, with how often they're releasing models, they're actually breaking even, given that at best they have small margins on their API.