dejected_warp_core

joined 2 years ago

It's really the worst. For the uninitiated, the platen where your bags go is actually a scale. The self-checkout kiosk software waits for this bagging scale to stop moving (see: debouncing) before weighing and approving the scan and purchase of a single item. This is why, occasionally, if you're too fast or too slow, the kiosk gets angry and makes you flag down an attendant.

That's not a problem for 10 items or fewer, but for a whole cart? All that waiting around adds up. Because of that, matching or beating an employee's speed is all but impossible.
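That settle-then-weigh behavior can be sketched like so. This is a hypothetical illustration, not the actual kiosk firmware; the `window` and `tolerance` parameters are made up for the example:

```python
def settle(readings, window=3, tolerance=0.01):
    """Return the first stable weight, or None if the scale never settles.

    A weight counts as "stable" once `window` consecutive readings
    agree within `tolerance` -- the debouncing described above.
    """
    for i in range(len(readings) - window + 1):
        chunk = readings[i : i + window]
        if max(chunk) - min(chunk) <= tolerance:
            return chunk[-1]
    return None

# A bag dropped on the platen: bouncing at first, then steady.
print(settle([0.0, 1.31, 1.18, 1.24, 1.25, 1.25, 1.25]))  # 1.25
```

Put a bag down mid-scan, or yank one off too soon, and the readings never stabilize inside the window the software expects — hence the angry kiosk.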

After watching president after president age like milk, I'm convinced that it's not the stress of office alone doing it. Even during normal times, it's an incredibly demanding position without a lot of room for downtime. Add an attending physician to the mix and you're good to go, packed to the gills with uppers, downers, antibiotics, blood thinners, beta-blockers, anti-parasitics, testosterone, you name it. While you're able to keep the schedule of a 30-year-old this way, you're also burning the candle at both ends and in the middle. It's little wonder that such a person can more or less shake off a bad diet at 80+.

Placing every last presidential decision on a coin flip would have yielded better results.

Easily done:

  1. Wait for AI bubble to burst
  2. Buy render farm on the cheap
  3. Avatar 4

Although I'm assuming that the raw rendering pipeline is what costs the most. I could be dead wrong about that - there's a whole army of artists, technical people, and actors that go into such a production too.

Thank you for your service.

like sifting through resumes.

Then we need to call it what it is: this exchange is an HR screen.

Make no mistake, the oligarchs see the personal computer as a 40-year-old experiment that has failed, or needs to fail. They want their mainframes and CPU/hr billing back. Server hosting for enterprise uses has already gone this way for the most part. Small consumers are next.

As far as I recall, that's how it went.

dejected_warp_core@lemmy.world 2 points 4 days ago (last edited 4 days ago)

I have a lot of thoughts on this because this is a complicated topic.

TL;DR: it's breakthrough tech, made possible by GPUs left over from the crypto hype, but TechBros and Billionaires are dead set on ruining it for everyone.

It's clearly overhyped as a solution in a lot of contexts. I object to the mass scraping of data to train it, the lack of transparency around what data exactly went into it, and the inability to request that one's art be excluded from any/all models.

Neural nets as a technology have a lot of legitimate uses for connecting disparate elements in large datasets, finding patterns where people struggle, and more. There is ample room for legitimately curated (vegan? we're talking consent after all) training data, getting results that matter, and not pissing anyone off. Sadly, this has been obscured by everything else encircling the technology.

At the same time, AI is flawed in practice, as its single greatest strength is also its greatest weakness. "Hallucinations" are really all this thing does. We just call obviously wrong output that because that's in the eye of the beholder. In the end, these things don't really think, so they're not capable of producing right or wrong answers. They just compile stuff out of their dataset by playing the odds on what tokens come next. It's very fancy autocomplete.
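To make "playing the odds on what tokens come next" concrete, here's a toy version of the same idea, with whole words standing in for tokens and a made-up corpus:

```python
from collections import Counter

# Count which word follows "the" in a tiny made-up corpus,
# then "autocomplete" by picking the most frequent follower.
corpus = "the dog chased the ball and the dog bit the dog".split()
following = Counter(b for a, b in zip(corpus, corpus[1:]) if a == "the")
print(following.most_common(1))  # [('dog', 3)]
```

A real LLM does this over tens of thousands of tokens with billions of learned weights instead of raw counts, but the shape of the operation — pick a likely continuation, not a *true* one — is the same. That's why there's no built-in notion of right or wrong.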

To put the above into focus, it's possible to use a trained model to implement lossy text compression. You ship a model trained on a boatload of text, prose, and poetry ahead of time. Then you can send compressed payloads as prompts. The receiver "decompresses" your message by running the prompt through the model, and they get a facsimile of what you wrote. It won't be a 1:1 copy, but the gist will be in there. It works even better if the model is trained on the sender's own writing.
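A toy sketch of that scheme (hypothetical, not a real codec): both sides share a trivial bigram "model" standing in for the LLM, and the "compressed payload" is just a short prompt plus a target length. The training corpus here is made up:

```python
from collections import defaultdict

def train(text):
    """Build a trivial bigram 'model': word -> most frequent next word."""
    counts = defaultdict(lambda: defaultdict(int))
    words = text.split()
    for a, b in zip(words, words[1:]):
        counts[a][b] += 1
    return {w: max(nxt, key=nxt.get) for w, nxt in counts.items()}

def decompress(model, prompt, length):
    """Greedily continue the prompt using the shared model."""
    out = prompt.split()
    while len(out) < length and out[-1] in model:
        out.append(model[out[-1]])
    return " ".join(out)

# Both sides ship this "model" ahead of time; the payload is tiny.
model = train("the cat sat on the mat and the cat sat on the lap")
print(decompress(model, "the cat", 5))  # the cat sat on the
```

The payload ("the cat", 5) is far smaller than the reconstructed text, and the output is a plausible facsimile rather than an exact copy — lossy by construction, exactly like running a prompt through an LLM.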

The hype surrounding AI is a product both of securing investment and of the staggeringly huge investment that hype generated. I think it's all caught up in a self-sustaining hype cycle now that will eventually run out of energy. We may as well be talking about Stanley Cups or limited edition Crocs... the actual product doesn't even matter at this point.

The resource impact brought on by record investment is nothing short of tragic. Considering the steep competition in the AI space, I wager we have somewhere between 3x and 8x the amount of AI-capable hardware deployed that we could ever possibly use at the current level of demand. While I'm sure everyone is projecting for future use, and "building a market" (see hype above), I think the flaws and limitations in the tech will temper those numbers substantially. As much as I'd love some second-hand AI datacenter tech after this all pops, something tells me that's not going to be possible.

Meanwhile, the resource drain on other adjacent tech markets has punched down even harder on anyone who might compete, let alone just use their own hardware; I can't help but feel that's by design.

dejected_warp_core@lemmy.world 11 points 4 days ago

Sweet tap-dancing christ, this whole thread. If there's anything I've learned today, it's that some teachers are the pettiest of dictators: they cannot tolerate being proven wrong, nor can they handle having their decision-making skills challenged. They're out there doing real, lasting damage to people and their ability to think critically.

It's almost enough to make me want to go into education, just to displace one of these tyrants.

Sincerely, I'm sorry all of you had to go through any of this. Here's hoping you have support and find closure.

dejected_warp_core@lemmy.world 9 points 4 days ago

I was gonna say this is at least Digg 3.0.


I used to really enjoy sites like this. I know there are joke accounts on Twitter and other sites here and there, but I haven't seen anything lately where the whole site is one big running gag.

https://en.wikipedia.org/wiki/Q%26A_comedy_website

A Q&A website is a website where the site creators use the images of pop culture icons, historical figures, fictional characters, or even inanimate objects or abstract concepts to answer input from the site's visitors, usually in question/answer format. This format of website, most popular in the early 2000s, evolved from the much older Internet Oracle. The original progenitor of this type of site was the now-defunct Forum 2000. The Forum 2000 claimed to have run the site by means of artificial intelligence, and the personalities on the website were called SOMADs, or "State Of Mind Adjointness pairs". However, later Q&A sites usually dispensed with this pretense, with the most extreme example being Jerk Squad!, on which the administrators of the site provide many of the answers.


FTA:

Two Democratic legislators are introducing a bill on Wednesday aimed at Mr. Musk and the so-called Buffalo Billion project, in which the state spent $959 million to build and equip a plant that Mr. Musk’s company leases for $1 a year to operate a solar panel and auto component factory.

The bill would require an audit of the state subsidy deal to “identify waste, fraud and abuse committed by private parties to the contract.” It would determine whether the company, Tesla, was meeting job creation targets, making promised investments, paying enough rent and honoring job training commitments.

If Tesla was found to be not in compliance, the state could claw back state benefits, impose penalties or terminate contracts.


Some of you may remember this absolute diamond of insanity that was the "4-Day Time Cube." This was the go-to example of the internet as a universal amplifier for communication - for the sane and insane alike. It was there from nearly the start of the world-wide web, back in the 1990s. Alas, it ceased to be some time ago, but it still lives on in our hearts.

For the uninitiated: welcome. Read and join the rest of us that are "educated stupid."

Amateur documentary: https://www.youtube.com/watch?v=H7lWCqbgQnU
