this post was submitted on 28 Feb 2026
58 points (98.3% liked)

Opensource


A community for discussion about open source software! Ask questions, share knowledge, share news, or post interesting stuff related to it!

Opinion: Careless big-time users are treating FOSS repos like content delivery networks

top 17 comments
[–] onlinepersona@programming.dev 25 points 1 day ago (1 children)

Do it, please. My company doesn't care and has pipelines that pull from npm and PyPI on every push, merge, etc. - and devs use AI that has been instructed to commit and push as frequently as possible. With around 100 devs, just imagine the traffic.

They were forced to pay for Docker Hub because the pipelines kept failing, but they should be forced to pay for package repos too. I sneakily changed our pipeline to pull from the in-house Docker registry, and to hit the package repos only when the lockfiles change. Our CI is faster than every other team's, but nobody noticed.
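A minimal sketch of that lockfile-gated install step (the paths, the stamp file, and the `npm ci` command are illustrative assumptions, not our actual pipeline - the idea is just that the registry is only contacted when the lockfile's hash changes):

```python
import hashlib
import os
import subprocess

# Hypothetical lockfile-gated dependency install: re-download from the
# package registry only when the lockfile's hash differs from the hash
# recorded in the CI cache. File names here are made up for illustration.
LOCKFILE = "package-lock.json"
STAMP = ".ci-cache/lockfile.sha256"

def lockfile_hash(path: str = LOCKFILE) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def install_if_lock_changed() -> bool:
    """Return True if a fresh registry pull was actually needed."""
    current = lockfile_hash()
    if os.path.exists(STAMP):
        with open(STAMP) as f:
            if f.read().strip() == current:
                return False  # cache hit: skip the registry entirely
    subprocess.run(["npm", "ci"], check=True)  # only now hit the registry
    os.makedirs(os.path.dirname(STAMP), exist_ok=True)
    with open(STAMP, "w") as f:
        f.write(current)
    return True
```

The same pattern works for any ecosystem with a lockfile (Cargo.lock, poetry.lock, etc.); most CI systems can express it natively by keying a cache on the lockfile's hash.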

So yeah, charge the companies! Please!

[–] Kissaki@programming.dev 7 points 14 hours ago (1 children)

I sneakily changed our pipeline to pull from the in-house Docker registry, and to hit the package repos only when the lockfiles change. Our CI is faster than every other team's, but nobody noticed.

So yeah, charge the companies! Please!

How come this is not an obvious improvement opportunity that materializes in other teams too, and visibly so, rather than "sneakily" hidden?

Isn't it better not only for performance but also for reliability?

[–] onlinepersona@programming.dev 3 points 12 hours ago

It's very top-down here. If the group of designated leaders (meaning the CTO and his close friends) doesn't approve a change to the Way of Working or the base repository template, it doesn't get applied.

I've pointed out problems before and wanted to improve things, but was basically told to "stay in my lane". That killed all motivation to go through the proper channels. If you aren't in the in-group, that's it: you have no say.

Unfortunately, they pay well.

[–] calliope@piefed.blahaj.zone 24 points 1 day ago* (last edited 1 day ago) (2 children)

Charging is a good idea.

In any case, it would not be crazy to rate-limit. If you're downloading the same 10,000 components a million times, you deserve to be limited.

[–] ignirtoq@feddit.online 11 points 1 day ago (1 children)

The article discusses that IP-based limiting doesn't work as well as it used to. Because of NATs, proxies, etc., IP addresses are a lot more ephemeral and flexible, so they've seen the same big perpetrators adapt and change IPs when rate-limited. I expect we will start to see support for anonymous downloads go away in the next several months in many major OSS registries.

[–] calliope@piefed.blahaj.zone 5 points 1 day ago

Thank you!

I actually wondered if the article mentioned that and I just missed it. I went back to check and apparently missed it twice.

I’m genuinely surprised they’ve been able to handle the traffic for this long. The numbers are staggering!

[–] lauha@lemmy.world 7 points 1 day ago* (last edited 1 day ago)

Imagine big companies getting "You have been banned for bandwidth abuse"

[–] troed@fedia.io 15 points 1 day ago (2 children)

I've seen this almost happen out of ignorance: a product-making company was oblivious to the issue until it was pointed out, then immediately understood why it was a problem and did the right thing. In that case it meant mirroring Linux repos for their internal purposes instead of constantly pulling from the distribution's servers.

If you're working inside an organisation, just mentioning this issue might be enough.

[–] Kissaki@programming.dev 3 points 14 hours ago

This part from the article supports this sentiment:

In a pleasant surprise, reactions have been positive. Throttled organizations were "surprised and apologetic," mistaking issues for malice rather than "ignorance, unawareness."

[–] calliope@piefed.blahaj.zone 4 points 1 day ago* (last edited 1 day ago)

Yes, I imagine that's almost always the case.

It would be fun from a chaos perspective to just suddenly limit those who are making too many calls.

Maybe it wouldn’t be that chaotic and builds would just fail, but I still like the idea.

[–] Maeve@kbin.earth 14 points 1 day ago

In one case, a department store's team of 60 developers generated more traffic than global cable modem users worldwide, due to misconfigured React Native builds bypassing their Nexus repository manager. He detailed extreme examples, such as large organizations downloading the same 10,000 components a million times each month. "That's ridiculous," Fox said.

Throttling efforts led to "brownouts" via 429 errors, but the traffic patterns mutated, forcing a game of Whack-a-Mole, especially since most consumption is headless and goes unnoticed.

Registries are also burdened by commercial use, with companies publishing closed-source components or massive SDKs, treating the registries as free CDNs. Fox noted that top publishers release gigabyte-scale artifacts daily, unlike typical open source projects.

[–] SubArcticTundra@lemmy.ml 1 points 1 day ago* (last edited 1 day ago) (2 children)

This plays into my idea that every HTTP request could have a microtransaction (say, 0.001 c) attached, and those who couldn't pay would see ads at the browser level rather than the page level. Alternatively, you'd get a fixed monthly budget as part of your ISP plan.

What I am essentially advocating for is that part of what you currently pay for your mobile data plan should go directly to the sites you visit.

[–] bless@lemmy.ml 3 points 14 hours ago (1 children)

I agree in principle, but do you have any idea how many useless HTTP requests the modern web makes? Open the network inspector and load any modern page.

Between fonts, analytics, and web frameworks, that would add up very quickly :(

[–] GamingChairModel@lemmy.world 1 points 13 hours ago* (last edited 13 hours ago)

Yeah, my impression is that ordinary human activity in a browser generates far more HTTP requests than scripted, automated activity through command-line tools.

[–] SubArcticTundra@lemmy.ml 3 points 1 day ago (1 children)

I guess one question is who would set the prices. If sites could set them themselves, some would set theirs to zero to gain an advantage and fall back on the same surveillance-based funding model they rely on today.

[–] SubArcticTundra@lemmy.ml 1 points 1 day ago* (last edited 1 day ago) (1 children)

One option would be to enforce a price floor set to whatever the server operator could prove it costs to serve a single request (per byte; fixed costs that accrue even with no traffic complicate this). That would absolve website owners of the need to find any other fundraising methods to break even, and thus remove the incentive to install spying ads, while keeping intact the incentive to make servers as cost-effective as possible.

I'm not sure a means-tested price floor mechanism like this has ever been used, so I don't know how well it would work.
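For what it's worth, a back-of-the-envelope version of such a cost-based floor (all figures invented for illustration): amortize fixed monthly costs over expected request volume, then add a per-byte bandwidth term.

```python
# Hypothetical break-even price floor per request: amortized fixed costs
# plus bandwidth. All numbers below are made up for illustration.

def price_floor(fixed_monthly_usd: float,
                requests_per_month: int,
                bytes_per_request: int,
                usd_per_gb: float) -> float:
    """Minimum price (USD) per request for the operator to break even."""
    fixed_share = fixed_monthly_usd / requests_per_month   # amortized fixed cost
    bandwidth = (bytes_per_request / 1e9) * usd_per_gb     # per-request transfer
    return fixed_share + bandwidth

# e.g. a $50/month server, 10M requests/month, 100 kB average response,
# $0.05 per GB of egress:
floor = price_floor(50.0, 10_000_000, 100_000, 0.05)
```

With those invented numbers the floor works out to about $0.00001 per request, i.e. roughly the 0.001 c figure mentioned upthread - though the fixed-cost share obviously swings wildly with traffic volume.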

[–] SubArcticTundra@lemmy.ml 2 points 1 day ago

Looks like a self-enforcing tax rate (comparable to a price floor) has at least been proposed:

[...] have proposed having owners self-assess the value of their property under penalty of having to sell at this self-assessed value. This has the simultaneous effect of forcing truthful valuations for taxation and of forcing turnover of underutilized or monopolized assets to broader publics.

From https://plurality.net/read/5-7/