Shouldn't it be Nucleus since they are Hooli?
No, that is what it would be if we were using traditional, deterministic compression with a reversible and verifiable mapping of data. But this is the new era of memetic compression: "Pied Piper" is what everyone remembers from the show, so we compress it to "Pied Piper" to minimize the memetic overhead and produce the smallest possible compression artifact. Like with "AI", it doesn't need to be correct, just close enough for people to think it is! /s
Thanks Bitchard!
It should be Dot Dot! But it's Dot Dot Dot! - sanest Bitchard moment
Smaller PP.
Yes, it should be Nucleus. Them calling it PiedPiper is a propaganda campaign to try to earn good will from people. Fuck Google locking down Android.
Does their CEO have a signature that looks like a penis?
Almost guaranteed.
Sure but didn’t the plot line with Nucleus come in a later season?
Secondly, I am pretty certain the Google logo was always in the opening credits.
It begins in the first season.
TurboQuant, meanwhile, could lead to efficiency gains and systems that require less memory during inference. But it wouldn’t necessarily solve the wider RAM shortages driven by AI, given that it only targets inference memory, not training — the latter of which continues to require massive amounts of RAM.
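For a sense of why lower-precision weights shrink inference memory, here's a back-of-the-envelope sketch. The parameter count and bit-widths below are illustrative assumptions, not figures from the article:

```python
# Rough memory math for holding model weights at different precisions.
# The 70B parameter count and the 16-bit/4-bit choices are assumptions
# for illustration, not TurboQuant's actual numbers.

def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """GB needed to store n_params weights at the given bit width."""
    return n_params * bits_per_weight / 8 / 1e9

n = 70e9  # hypothetical 70B-parameter model

fp16_gb = weight_memory_gb(n, 16)  # half-precision baseline -> 140 GB
q4_gb = weight_memory_gb(n, 4)     # 4-bit quantized -> 35 GB

print(f"fp16: {fp16_gb:.0f} GB, 4-bit: {q4_gb:.0f} GB")
```

That 4x reduction only applies to serving the model, which is the point the article makes: training still keeps optimizer state and gradients around at higher precision, so its RAM appetite is largely unchanged.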
I didn’t realize the RAM shortage was mostly due to training—I would have thought inference was at least as big a factor.
Inference is dirt cheap in comparison. Hundreds to thousands of concurrent users can be served by hardware costing in the high thousands to low tens of thousands.
Training those same foundational models takes weeks to months on tens to hundreds of millions of dollars' worth of hardware.
Yeah—but in theory you only need to train once, while inference costs are ongoing and scale up with usage.
I guess it’s ultimately a business decision by AI companies to weigh how often retraining is worth the cost.
Yeah, I don't think they ever stop training is the thing. At this point I'd assume they have multiple training pipelines to try different shit out, just queued up to hit the big farms as soon as the last models are done training.
Resting isn't a thing in capitalism.
Should be called Middle-Out? That was the algorithm IIRC. Pied Piper was the name of the startup.
Depends how many dicks you can jack off
Too pedantic for normies
was published over a year ago
Okay, but did Google calculate how many dicks they could jerk off for maximum efficiency?
Yes it was part of their quarterly circlejerk.
That's the earnings report
Well that depends on a lot of factors. One of them being the distance of dick to floor of every one they would jerk. Call that D2F.
Hopefully they thought of it.
This should come in handy for the recently projected need for 300 GB RAM* in upcoming self-driving cars.
*Not a typo. 😳
Funny thing that came with this: apparently Micron's stock fell off a cliff, and apparently so did RAM prices? Can't confirm that latter one.
What's its Weissman score?
