technology


On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

Hexbear Code-Op (hexbear.net)
submitted 1 year ago* (last edited 1 year ago) by RedWizard@hexbear.net to c/technology@hexbear.net

Where to find the Code-Op

Wow, thanks for the stickies! Love all the activity in this thread. I love our coding comrades!


Hey fellow Hexbearions! I have no idea what I'm doing! However, born out of the conversations in the comments of this little thing I posted the other day, I have created an org on GitHub that I think we can use to share, highlight, and collaborate on code and projects from comrades here and abroad.

  • I know we have several bots that float around this instance, and I've always wondered who maintains them and where their code is hosted. It would be cool to keep a fork of those bots in this org, for example.
  • I've already added a fork of @WhyEssEff@hexbear.net's Emoji repo as another example.
  • The projects don't need to be Hexbear or Lemmy related, either. I've moved my aPC-Json repo into the org just as an example, and intend to use the code written by @invalidusernamelol@hexbear.net to play around with adding ICS files to the repo.
  • We have numerous comrades looking at mainlining some flavor of Linux and bailing on Windows; maybe we could create some collaborative documentation that helps onboard the Linux-curious.
  • I've been thinking a lot recently about leftist communication online and building community spaces, which will ultimately intersect with self-hosting. Documenting various tools and providing Docker Compose files to easily get people off and running could be useful.
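To illustrate the Docker Compose idea above, here's a minimal sketch of what one of those files could look like (the service name, image, and paths are placeholders I made up, not an agreed-on stack):

```yaml
# Illustrative only: serves a local folder of static onboarding docs.
services:
  docs:
    image: nginx:alpine
    ports:
      - "8080:80"
    volumes:
      - ./docs:/usr/share/nginx/html:ro
    restart: unless-stopped
```

Someone could clone the repo, run `docker compose up -d`, and have the docs at `localhost:8080` without installing anything else.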

I don't know a lot about GitHub Orgs, so I should get on that, I guess. That said, I'm open to all suggestions and input on how best to use this space I've created.

Also, I made (what I think is) a neat emblem for the whole thing:

Todos

  • Mirror repos to both GitHub and Codeberg
  • Create process for adding new repos to the mirror process
  • Create a more detailed profile README on GitHub.

Done


  • ~~Recover from whatever this sickness is the dang kids gave me from daycare.~~

MyLovely.AI, an NSFW AI artwork generation platform, has allegedly been compromised, affecting 106,362 registered users. The 2.1 GB database, containing records from April 2026, was posted by unauthorized individuals on a dark web forum. The incident, characterized as a JSON leak, exposes highly sensitive interactions between users and the AI service, revealing both personal identifiers and explicit generated content.


cross-posted from: https://news.abolish.capital/post/40293

Last month, President Trump sat alongside executives of the largest tech companies in the country as they pledged to pay a fair share of the energy costs of their data center buildout. “Data centers … they need some PR help,” Trump said at the gathering. “People think that if the data center goes in, their electricity is going to go up.”

It’s not an entirely unfounded assumption.

As the tech industry has funneled billions of dollars into the AI boom over the last several years, it has simultaneously been expanding its fleet of computing powerhouses, which require vast amounts of energy to run. These facilities have been cropping up all over the country, from rural communities in eastern Pennsylvania to the cities of northern Utah.

This boom coincides with a dramatic rise in U.S. electricity prices, driven by inflation and the rising cost of adapting to wildfires, hurricanes, and other extreme weather. But these massive facilities have also strained the grid — and in some cases — contributed to rising prices. For instance, last year, an independent monitor for PJM, the grid operator that serves 13 northeastern states and Washington, D.C., projected that powering data centers would result in higher electricity generation costs, which would ultimately be passed on to consumers. And in cases where the buildout hasn’t yet led to price hikes, utilities and grid operators expect that it’s just a matter of time if tech companies follow through on their plans. Indeed, the Federal Reserve Bank of Dallas estimates that with data center electricity demand expected to double in the next five years, wholesale power prices could rise by as much as 50 percent.

Residents picket DTE Energy in Detroit, opposing the electric utility’s plan to provide power for a proposed $7 billion data center in rural Michigan. Jim West / UCG / Universal Images Group via Getty Images

At a time when the cost of living has become untenable for many Americans, and consumers are setting aside ever greater shares of their income to pay energy bills, the possibility of further rate hikes to line the pockets of tech companies has prompted a massive backlash across the country. The White House gathering of tech executives appeared to be a response to the backlash. On March 4 at the event, they signed onto the “Ratepayer Protection Pledge.”

The pledge itself has few specifics or teeth. It’s a voluntary agreement by several prominent tech companies — including Microsoft, Meta, OpenAI, and Amazon — to secure their own power for data centers, pay for any powerlines or other infrastructure that utilities may need to build to move that power, and hire locally from the communities they build in. While in theory the agreement could help prevent Americans from having to bear the cost of the data center expansion, the White House hasn’t set up oversight mechanisms to ensure that they do. Several consumer and environmental advocates called the agreement “meaningless,” “unenforceable,” and ultimately, “nonsense.”

The United States has become ground zero for the global data center boom. The rapid buildout has left developers, tech companies, and the utility industry scrambling to secure more power. As a result, the wait for a data center to connect to the grid can be years in many parts of the country. Hyperscalers — companies that operate large data centers and provide vast computing power — have been trying to get around these wait times by signing long-term power purchase agreements with solar developers, building their own natural gas plants, and even retrofitting jet engines to generate electricity.

“Every single data center in the future will be power limited,” said NVIDIA CEO Jensen Huang last year. “We are now a power‑limited industry.”

Outside of the White House, utilities, local regulators, and lawmakers have also been proposing various solutions to address the community backlash and allow for the continued building of more data centers. Some have implemented measures requiring data centers to pay the costs of generating and moving the electricity they use. Others have suggested that data center developers install solar and battery systems on-site, or that rates should be frozen for residents while utilities figure out how to handle the additional costs. And at least 11 states are considering legislation to temporarily ban new data centers while their impact on electricity prices and other concerns are addressed.

Alphabet and Google CEO Sundar Pichai, second from right, speaks during a news conference in November to announce Google’s $40 billion investment in Texas at the Google Data Center in Midlothian.
Chitose Suzuki / The Dallas Morning News via Getty Images

“You’re seeing states try to move quickly,” said Meghan Pazik, a senior policy associate in Public Citizen’s climate program. But “every state’s going to have a different approach to how far they want to go on data centers.”

Many states are imposing additional tariffs on data centers and other customers that pull large amounts of power from the grid. These facilities — referred to as “large load customers” — are required to pay more to make up for the added infrastructure costs that come with supplying them, as well as the risk if they end up walking away from the project, which would leave consumers on the hook for the investments. More than 30 states have proposed or implemented measures of this sort.

Some hyperscalers are changing their approaches, too. In Minnesota, Google inked a deal with Xcel Energy, the state’s largest investor-owned utility, to bring 1,900 megawatts of clean energy onto the grid. The company is fully funding wind turbines, solar panels, and battery storage, as well as the costs of grid infrastructure upgrades to serve its data centers. And in Louisiana, Meta signed a deal with Entergy to help fund the construction of seven natural gas plants, more than 200 miles of transmission lines, and battery systems, among other infrastructure upgrades.

A recent report from the Searchlight Institute, a policy think tank, argues that this piecemeal approach to regulating the tech industry misses an opportunity to fund a large-scale upgrade of the grid. Although the surge in demand has largely been framed as a looming crisis, the report contends that the boom also creates a rare policy window: a chance to modernize the country’s electrical system and make long-delayed investments needed for the clean energy transition.

High voltage power lines run through a sub-station in Miami, Florida in January. Joe Raedle / Getty Images

Utilities make roughly $35 billion in investments in transmission infrastructure every year — far short of what’s actually needed. Electricity demand is projected to double or triple in the next 25 years. The Searchlight Institute report proposes creating a dedicated grid infrastructure fund to accelerate the expansion. Under the plan, hyperscalers would pay into the fund in exchange for speedy connections. Money from the fund would be directed to utilities and other companies to build out the system, prioritizing clean energy along the way. And consumer and environmental advocates, along with other policymakers, would oversee the process to ensure funds are being distributed equitably and serve the needs of the public.

Such a mechanism would ensure increased investments in clean energy, rather than the natural gas projects many tech companies are currently backing, while protecting consumers from increases in electricity prices.

“The hyperscalers need power,” said Jane Flegal, a senior fellow at the Searchlight Institute and author of the report. “They have a ton of capital. And rather than letting them continue to cut these one-off deals with utilities, we’ve got to find a better way to take advantage of the potential upside here and avoid the downside of them basically building a secondary grid behind the existing grid that benefits only them.”

This story was originally published by Grist with the headline “Data centers are straining the grid. Can they be forced to pay for it?” on Apr 6, 2026.


From Grist via This RSS Feed.

Is Hormuz Open Yet? (www.ishormuzopenyet.com)
submitted 11 hours ago* (last edited 11 hours ago) by deforestgump@hexbear.net to c/technology@hexbear.net

nope


They're taking money from AI companies to "provide AI tools" to people at the ASF. I can't believe even open-source development orgs are shoving AI down people's throats too.


cross-posted from: https://hexbear.net/post/8194191

Please, you are our hope, please, you are our life. Without your support, we cannot live. https://chuffed.org/project/150674-support-umm-mohammeds-family-to-rebuild-their-future-and-to-survive-amidst-the-genocide


https://www.righto.com/2019/07/software-woven-into-wire-core-rope-and.html


Onboard the Apollo spacecraft, the revolutionary Apollo Guidance Computer helped navigate to the Moon and land on its surface. One of the first computers to use integrated circuits, the Apollo Guidance Computer was lightweight enough and small enough (70 pounds and under a cubic foot) to fly in space. An unusual feature that contributed to its small size was core rope memory, a technique of physically weaving software into high-density storage. In this blog post, I take a close look at core rope and the circuitry that made it work.

The Apollo Guidance Computer (AGC) had very little memory by modern standards: 2048 words of RAM in erasable core memory and 36,864 words of ROM in core rope memory. In the 1960s, most computers (including the AGC) used magnetic core memory for RAM storage, but core ropes were unusual and operated differently. Erasable core memory and core rope both used magnetic cores, small magnetizable rings. But while erasable core memory used one core for each bit, core rope stored an incredible 192 bits per core, achieving much higher density. The trick was to put many wires through each core (as shown above), hardwiring the data: a 1 bit was stored by threading a wire through a core, while the wire bypassed the core for a 0 bit. Thus, once a core rope was carefully manufactured, using a half-mile of wire, data was permanently stored in the core rope.

We are restoring the Apollo Guidance Computer shown above. The core rope modules (which we don't have) would be installed in the empty space on the left. On the right of the AGC, you can see the two connectors that connected the AGC to other parts of the spacecraft, including the DSKY (Display/Keyboard). By removing the bolts holding the two trays together, we could disassemble the AGC. Pulling the two halves apart takes a surprising amount of force because of the three connectors in the middle that join the two trays. The tray on the left is the "A" tray, which holds the logic and interface modules. The tray on the right is the "B" tray, which holds the memory circuitry, oscillator, and alarm. The six core rope modules go under the metal cover in the upper right. Note that the core ropes took up roughly a quarter of the computer's volume.

How core rope works

At a high level, core rope is simple: sense wires go through cores to indicate 1's, or bypass cores to indicate 0's. By selecting a particular core, the sense wires through that core were activated to provide the desired data bits.

Magnetic cores have a few properties that made core memory work. By passing a strong current along a wire through the core, the core becomes magnetized, either clockwise or counterclockwise depending on the direction of the current. Normally the cores were all magnetized in one direction, called the "reset" state, and when a core was magnetized the opposite direction, this is called the "set" state. When a core flips from one state to another, the changing magnetic field induces a small voltage in any sense wires through the core. A sense amplifier detects this signal and produces a binary output.

The key advantage of core rope is that many sense wires pass through a single core, so you can store multiple bits per core and achieve higher-density storage. (In the case of the AGC, each core has 192 sense wires passing through (or around) it, so each core stored 12 words of data.) This is in contrast to regular read/write core memory, where each core held one bit.

Core rope used an unusual technique to select a particular core to flip and read. Instead of directly selecting the desired core, inhibit lines blocked the flipping of every core except the desired one. In the diagram below, the current on the set line (green) would potentially flip all the cores. However, various inhibit lines (red) have a current in the opposite direction. This cancels out the set current in all the cores except #2, so only core #2 flips.

In the diagram above, only the sense lines (blue) passing through core #2 pick up an induced voltage. Thus, the weaving pattern of the sense lines controls what data is read from core #2. To summarize, the inhibit lines control which core is selected, and the sense wires woven through that core control what data value is read.

The inhibit lines are driven from the address lines and arranged so that all inhibit lines will be inactive for just the desired core. For any other address, at least one inhibit line will be activated, preventing the core from flipping and being read.
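The selection scheme described above can be sketched as a toy Python model. This is purely illustrative (my own sketch, not the actual AGC circuitry): it captures the logic of sense-wire weaving (a 1 bit threads the core, a 0 bit bypasses it) and inhibit-line selection (only the addressed core flips), using the article's figures of 192 sense wires per core and 12 words per core:

```python
BITS = 16              # treating an AGC word as 16 bits for simplicity
WORDS_PER_CORE = 12    # 192 sense wires per core / 16 bits = 12 words

def weave(words):
    """Build the weaving pattern: for each core, the set of (word, bit)
    sense wires threaded *through* it (the 1 bits). All other sense
    wires bypass the core (the 0 bits)."""
    cores = []
    for start in range(0, len(words), WORDS_PER_CORE):
        group = words[start:start + WORDS_PER_CORE]
        cores.append({(w, b)
                      for w, word in enumerate(group)
                      for b in range(BITS)
                      if (word >> b) & 1})
    return cores

def read(cores, address):
    """Read one word. Inhibit lines cancel the set current in every core
    except the addressed one, so only that core flips and induces a
    pulse on the sense wires woven through it."""
    core_idx, word_idx = divmod(address, WORDS_PER_CORE)
    flipped = cores[core_idx]
    # Reassemble the addressed word from which of its sense wires pulsed.
    return sum(1 << b for b in range(BITS) if (word_idx, b) in flipped)

rom = [0x1234, 0xBEEF, 7] + [0] * 21   # two cores' worth of words
cores = weave(rom)
assert read(cores, 1) == 0xBEEF        # word 1 is woven into core 0
```

The point the model makes concrete is that the data lives entirely in the weaving pattern: `read` never consults the original word list, only which sense wires pass through the flipped core.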

[The article continues with a detailed photo breakdown of the Apollo boards]

A video of the core ropes being woven: https://www.youtube.com/watch?v=P12r8DKHsak


If there's a Pirate Bay or 1337x.to equivalent for paid datasets, please let me know.
