this post was submitted on 27 Mar 2026
142 points (96.1% liked)

top 39 comments
[–] apfelwoiSchoppen@lemmy.world 39 points 3 weeks ago (3 children)

Shouldn't it be Nucleus since they are Hooli?

[–] cecilkorik@lemmy.ca 31 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

No, that is what it would be if we were using traditional, deterministic compression and using a reversible and verifiable mapping of data. But this is the new era of memetic compression, "Pied Piper" is what everyone remembers from the show, so we compress it to "Pied Piper" to minimize the amount of memetic overhead and allow the smallest possible compression artifact. Like with "AI", it doesn't need to be correct, just close enough for people to think it is! /s

[–] apfelwoiSchoppen@lemmy.world 13 points 3 weeks ago (1 children)
[–] smiletolerantly@awful.systems 12 points 3 weeks ago

It should be Dot Dot! But it's Dot Dot Dot! - sanest Bitchard moment

[–] tabular@lemmy.world 2 points 3 weeks ago

Smaller PP.

[–] uuj8za@piefed.social 11 points 3 weeks ago (1 children)

Yes, it should be Nucleus. Them calling it Pied Piper is a propaganda campaign to try to earn goodwill from people. Fuck Google locking down Android.

[–] hopesdead@startrek.website 3 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Does their CEO have a signature that looks like a penis?

[–] apfelwoiSchoppen@lemmy.world 3 points 3 weeks ago

Almost guaranteed.

[–] hopesdead@startrek.website 3 points 3 weeks ago (1 children)

Sure but didn’t the plot line with Nucleus come in a later season?

Secondly, I am pretty certain the Google logo was always in the opening credits.

[–] apfelwoiSchoppen@lemmy.world 2 points 3 weeks ago (1 children)

It begins in the first season.

[–] hopesdead@startrek.website 2 points 3 weeks ago

I miss the thinking of the Moonshots, whatever those were called.

[–] AbouBenAdhem@lemmy.world 31 points 3 weeks ago (1 children)

TurboQuant, meanwhile, could lead to efficiency gains and systems that require less memory during inference. But it wouldn’t necessarily solve the wider RAM shortages driven by AI, given that it only targets inference memory, not training — the latter of which continues to require massive amounts of RAM.

I didn’t realize the RAM shortage was mostly due to training—I would have thought inference was at least as big a factor.
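For reference, the inference-side memory saving the quoted paragraph describes can be sketched with plain absmax int8 quantization. This is a generic technique, not TurboQuant's actual scheme (which the thread doesn't detail); all names below are illustrative.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights to int8 plus one per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0   # largest weight maps to +/-127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction used at inference time."""
    return q.astype(np.float32) * scale

w = np.random.randn(1024, 1024).astype(np.float32)
q, s = quantize_int8(w)

# int8 storage uses a quarter of the float32 memory for these weights
print(w.nbytes // q.nbytes)
# rounding error is bounded by half a quantization step
print(float(np.abs(dequantize(q, s) - w).max()) <= s)
```

This only shrinks the memory a model needs to *run*; training still happens in higher precision, which is the distinction the quoted paragraph is drawing.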

[–] Dran_Arcana@lemmy.world 14 points 3 weeks ago (1 children)

Inference is dirt cheap in comparison. Hundreds to thousands of concurrent users can be served by hardware costing in the high-thousands to low-ten-thousands.

Training those same foundational models is weeks to months of time on tens to hundreds of millions worth of hardware.

[–] AbouBenAdhem@lemmy.world 9 points 3 weeks ago (2 children)

Yeah—but in theory you only need to train once, while inference costs are ongoing and scale up with usage.

I guess it’s ultimately a business decision by AI companies to weigh how often retraining is worth the cost.

[–] JGrffn@lemmy.world 12 points 3 weeks ago

Yeah, I don't think they ever stop training, is the thing. At this point I'd assume they have multiple training pipelines to try different shit out, just queued up to hit the big farms as soon as the last models are done training.

Resting isn't a thing in capitalism.

[–] douglasg14b@lemmy.world 1 points 3 weeks ago

Training is constant. None of these models by any of these providers are static. You'll notice that they are releasing new models and new model versions regularly.

This means that training is happening constantly. It never stops. There's always new shit being trained.

[–] hopesdead@startrek.website 22 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

Okay, but did Google calculate how many dicks they could jerk off for maximum efficiency?

[–] spy@lemmy.dbzer0.com 5 points 3 weeks ago (2 children)

Well that depends on a lot of factors. One of them being the distance of dick to floor of every one they would jerk. Call that D2F.

Hopefully they thought of it.

Someone's already done a paper on it:

https://ia800308.us.archive.org/32/items/pdfy-tG1MuMpwvrML6QD0/228831637-Optimal-Tip-to-Tip-Efficiency.pdf

Abstract: A probabilistic model is introduced for the problem of stimulating a large male audience. Double jerking is considered, in which two shafts may be stimulated with a single hand. Both tip-to-tip and shaft-to-shaft configurations of audience members are analyzed. We demonstrate that pre-sorting members of the audience according to both shaft girth and leg length allows for more efficient stimulation. Simulations establish steady rates of stimulation even as the variance of certain parameters is allowed to grow, whereas naive unsorted schemes have increasingly flaccid performance.

[–] Thorry@feddit.org 1 points 3 weeks ago

Some people will have you believe it's all about the angle of the dangle, while we all know it's about length times diameter plus weight over girth divided by angle of the tip squared.

[–] mr_account@lemmy.world 3 points 3 weeks ago
[–] x00z@lemmy.world 2 points 3 weeks ago (1 children)

Yes it was part of their quarterly circlejerk.

[–] muffedtrims@lemmy.world 2 points 3 weeks ago

That's the earnings report

[–] Deconceptualist@leminal.space 20 points 3 weeks ago (3 children)

Should be called Middle-Out? That was the algorithm IIRC. Pied Piper was the name of the startup.

[–] jaalu@lemmy.world 2 points 3 weeks ago

I think Dropbox's Lepton is the closest thing to a real-world version of SV's middle-out algorithm

[–] crystalmerchant@lemmy.world 1 points 3 weeks ago (1 children)

Depends how many dicks you can jack off

[–] Deconceptualist@leminal.space 1 points 3 weeks ago

Pretty sure Google can afford to handle... checks math.... All of them.

[–] 4am@lemmy.zip -1 points 3 weeks ago

Too pedantic for normies

[–] Brewchin@lemmy.world 12 points 3 weeks ago (1 children)

This should come in handy for the recently projected need for 300 GB RAM* in upcoming self-driving cars.

*Not a typo. 😳

[–] Voroxpete@sh.itjust.works 9 points 3 weeks ago

Projected by a company that makes RAM and wants to juice their stock price.

[–] mr_account@lemmy.world 12 points 3 weeks ago (3 children)

All these upvotes and comments and not one joke about how it sounds like TurboCunt?

[–] fun_times@lemmy.world 3 points 3 weeks ago

I was thinking TurboQueef but that works too.

[–] dimjim@sh.itjust.works 1 points 2 weeks ago

THANK YOU! My brain literally pronounced it like that and I came to see if anyone else commented it lol

There you go, being the change you seek in the world.

[–] darkkite@lemmy.ml 6 points 3 weeks ago

was published over a year ago

[–] jagermo@feddit.org 5 points 3 weeks ago

What's its Weißman score?

[–] SalamenceFury@piefed.social 2 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Funny thing that came with this: apparently Micron's stock fell off a cliff, and apparently so did RAM prices? Can't confirm that latter one.

[–] BetaDoggo_@lemmy.world 1 points 3 weeks ago

Neither is true. Micron has been plummeting since their earnings report on the 18th. This might have caused a small dip, but it's nothing compared to the cliff they just fell off of.

[–] goatinspace@feddit.org -2 points 3 weeks ago