[-] SkingradGuard@hexbear.net 8 points 7 hours ago

I've found LLMs are good for code, but even then you're better off reading the docs.

Useless crap.

[-] gay_king_prince_charles@hexbear.net 2 points 22 minutes ago

I have never seen an LLM be more helpful than IRC. They are, however, more polite.

[-] adultswim_antifa@hexbear.net 12 points 8 hours ago

kelly AI? More like A Lie

[-] peeonyou@hexbear.net 33 points 11 hours ago

i work with a bunch of former googlers and googleXers(?) and they are some of the most insufferable people on the planet

[-] UlyssesT@hexbear.net 21 points 10 hours ago* (last edited 10 hours ago)

"Oh hey, do you know that the future is (grifty startup)? I can get you in on the ground floor; everyone's going to be using (grifty startup) very soon so let me hook you up to..." morshupls

[-] infuziSporg@hexbear.net 5 points 8 hours ago

BEFORE EVERY MOMENT

THERE IS A MOMENT

AMP UP.

[-] HumanBehaviorByBjork@hexbear.net 23 points 11 hours ago

they've been paid off by the electrical fire lobby

[-] Lussy@hexbear.net 30 points 12 hours ago

I feel like such a shitty engineer for not remembering or having the slightest interest in even the most basic electrical shit. i don’t even get this fucking meme.

I can do civil/mech/chem but show me electricity and I feel like I’m in preschool.

Pre 18th century ass brain capacity

551 amps will vaporize the wire and cause Fun for all involved.

[-] infuziSporg@hexbear.net 9 points 8 hours ago

0.2 amps of current going through a human torso is fatal in virtually all cases.

[-] Xavienth@lemmygrad.ml 25 points 10 hours ago* (last edited 9 hours ago)

North America uses 120 V for most circuits. Power is the product of voltage and current.

At 1 Amp, 120 watts are dissipated by the circuit. About the heat of two incandescent light bulbs.

At 10 Amps, 1200 watts are dissipated by the circuit, about the heat of a space heater.

At 551 Amps, 66,000 watts are dissipated by the circuit. I don't even have a good comparison. That's like the power draw of 50 homes all at once.

The higher the gauge, the lower the diameter of the wire. The lower the diameter of the wire, the more of that 66,000 watts is going to be dissipated by the wire itself instead of the load where it is desired. At 22 gauge, basically all of it will be dissipated by the wire, at least for the first fraction of a second before the wire vaporizes in a small explosion.

EDIT: In this scenario, the total resistance of the circuit must be at most 0.22 Ω. Otherwise, the current will not reach 551 A due to Ohm's Law, V=I×R. This resistance corresponds to a maximum length of 13 feet for copper wire and no load.
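As a back-of-the-envelope sketch of that arithmetic (Python; the 120 V supply and the ~0.016 Ω/ft figure for 22 AWG copper are assumptions taken from standard wire tables):

```python
# Rough Ohm's-law arithmetic behind the numbers above; all values are illustrative assumptions.
VOLTAGE = 120.0               # V, typical North American branch circuit
AWG22_OHMS_PER_FT = 0.01614   # approx. resistance of 22 AWG copper wire per foot

for amps in (1, 10, 551):
    print(f"{amps:>4} A -> {VOLTAGE * amps:>6.0f} W dissipated in the circuit")

# Maximum total resistance that still lets 551 A flow at 120 V (Ohm's law: V = I * R)
max_resistance = VOLTAGE / 551                        # ~0.218 ohm
max_length_ft = max_resistance / AWG22_OHMS_PER_FT    # ~13.5 ft of 22 AWG with no load
print(f"max resistance {max_resistance:.3f} ohm -> at most {max_length_ft:.1f} ft of 22 AWG")
```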

[-] PKMKII@hexbear.net 2 points 4 hours ago

I ran this by my brother who’s an electrician and he inferred that might be where the number is coming from, some data on how many amps you can dump into various wire gauges before they simply stop being solids.

[-] DamarcusArt@lemmygrad.ml 16 points 9 hours ago

That’s like the power draw of 50 homes all at once.

So the average crypto mining rig?

about $75k of mining rigs actually. 66kW is a lot of heat to dissipate

[-] Lussy@hexbear.net 7 points 10 hours ago

By dissipated by the wire instead of the load what do you mean?

[-] sawne128@hexbear.net 17 points 10 hours ago

The wire heats up.

Wires have a small resistance which causes a voltage drop over the wire if the current is big enough (U=RI), and therefore it draws power (P=UI) which warms it up. Thinner wires have more resistance.
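A tiny sketch of that split, with hypothetical numbers (a fixed 12 Ω load, roughly a 1200 W space heater, fed from 120 V through wires of different resistance):

```python
# Share of total power that heats the wire vs. the load; resistances are made-up examples.
SUPPLY_V = 120.0
LOAD_OHMS = 12.0  # hypothetical load, roughly a 1200 W space heater

for wire_ohms in (0.01, 0.1, 1.0, 12.0):
    current = SUPPLY_V / (LOAD_OHMS + wire_ohms)  # I = U / R_total
    p_wire = current**2 * wire_ohms               # heats the wire (wasted)
    p_load = current**2 * LOAD_OHMS               # delivered to the load
    print(f"wire {wire_ohms:5.2f} ohm: {p_wire:6.1f} W in wire vs {p_load:7.1f} W in load")
```

The thinner (higher-resistance) the wire is relative to the load, the bigger the wire's share of the heat.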

[-] Flocklesscrow@lemm.ee 7 points 9 hours ago

The rebel alliance was famously all thin wires

[-] Aru@lemmygrad.ml 3 points 6 hours ago

Yeah, it took me a few minutes to realize "551 amps? that's an insane cartoonish number." I'm also in the engineering field, but not in electrical stuff.

[-] context@hexbear.net 18 points 10 hours ago

imagine trying to direct the output of a fire hose on full blast through one of those thin red drinking straws that come with cocktails

[-] Flocklesscrow@lemm.ee 5 points 9 hours ago

Isn't that pretty much how a CNC machine works?

[-] context@hexbear.net 5 points 7 hours ago

sort of but if we're extending the analogy i don't think the thin plastic drinking straw will make an effective replacement for the steel nozzle at the end of a water jet cutter, either

[-] NephewAlphaBravo@hexbear.net 25 points 11 hours ago

551 amps is an amount you would put in a comedic joke about using way too many amps

[-] gay_king_prince_charles@hexbear.net 36 points 12 hours ago

It's not that they hired the wrong people, it's that LLMs struggle with both numbers and factual accuracy. This isn't a personnel issue, it's a structural issue with LLMs.

[-] Xavienth@lemmygrad.ml 18 points 10 hours ago

Because LLMs just basically appeared in Google search and it was not any Google employee's decision to implement them despite knowing they're bullshit generators /s

[-] psivchaz@reddthat.com 8 points 9 hours ago

I mean, define employee. I'm sure someone with a Chief title was the one who made the decision. Everyone else gets to do it or find another job.

[-] GaveUp@hexbear.net 5 points 6 hours ago

I work with google coders all the time, I guarantee you they were all very excited for this feature

[-] gay_king_prince_charles@hexbear.net 1 points 26 minutes ago

I mean LLMs are cool to work on and a fun concept. An n-dimensional regression where n is in the trillions is cool. The issue is that it is cool in the same way as a grappling hook or a blockchain.

[-] UlyssesT@hexbear.net 6 points 9 hours ago* (last edited 9 hours ago)

They're Rube Goldbergian machines of bullshit but the bullshit peddlers (and the glazers) insist that adding more Rube Goldbergian layers to the Rube Goldberg machines will remove the systemic problems with it instead of just hiring people to fact-check. Hatred of human workers is the point, and even when they are used, they're made as invisible as possible, so it's just a Mechanical Turk in that case.

All of this, all that wasted electricity, all that extra carbon dumped into the air, all so credulous rubes can feel like the Singularity(tm) is nigh. debord-tired

Google gets around 9 billion searches per day. Human fact-checking of Google search quick responses would be impossible at that scale. If each fact check takes 30 seconds, you would need close to 10 million people working full time just to fact-check that.
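Rough check of that estimate (the search volume and the 30-second figure are the assumptions from the comment above):

```python
# Back-of-the-envelope headcount for human fact-checking of search answers.
SEARCHES_PER_DAY = 9e9       # assumed Google search volume
SECONDS_PER_CHECK = 30       # assumed time per fact check
SHIFT_SECONDS = 8 * 3600     # one full-time workday

checkers = SEARCHES_PER_DAY * SECONDS_PER_CHECK / SHIFT_SECONDS
print(f"~{checkers / 1e6:.1f} million full-time fact checkers")  # ~9.4 million
```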

[-] EelBolshevikism@hexbear.net 2 points 5 hours ago* (last edited 5 hours ago)

Just let them call in for answers with real people instead. Then it takes more effort and fewer people will do it when it's not important

Edit: also I'm pretty sure Google could hire 10 million people

[-] gay_king_prince_charles@hexbear.net 1 points 32 minutes ago

also I'm pretty sure Google could hire 10 million people

Assuming minimum wage at full time, that is roughly $150 billion a year. Google extracts 20 billion in surplus labor per year, so no, Google could not hire 10 million people.
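Rough math behind that wage figure (assuming the US federal minimum wage of $7.25/hour and a 2,080-hour work year):

```python
# Annual wage bill for 10 million full-time workers at US federal minimum wage (assumption).
WORKERS = 10_000_000
MIN_WAGE = 7.25         # USD per hour, federal minimum
HOURS_PER_YEAR = 2080   # 40 h/week * 52 weeks

print(f"${WORKERS * MIN_WAGE * HOURS_PER_YEAR / 1e9:.0f} billion per year")  # ~$151 billion
```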

[-] gay_king_prince_charles@hexbear.net 17 points 12 hours ago

Shake hands with danger

[-] RION@hexbear.net 24 points 13 hours ago

My job wants to use an AI medical notetaker instead of hiring someone for it... Surely nothing like this will happen :clueless:

[-] nothx@hexbear.net 17 points 12 hours ago

encouraging people to BE their own fuses.

[-] btfod@hexbear.net 28 points 14 hours ago

return to oracles huffing gas in a cave, this shit sucks

[-] Feinsteins_Ghost@hexbear.net 23 points 13 hours ago* (last edited 13 hours ago)

putting the VFD into the ketchup every single time.

265k watts lol

For reference, 22awg solid is telephone wire. 22awg stranded is a hair thinner. I’ve made 22awg glow red-hot by dumping 12v and just a lil bit of amps into it.

[-] collapse_already@lemmy.ml 17 points 13 hours ago

An enterprising lawyer is going to make a tidy sum when someone breathes copper vapor after following this advice.

[-] Utter_Karate@hexbear.net 4 points 8 hours ago

I'm not sure that the kind of person who follows this advice will take enough precautions to be doing much breathing in of vapors of any kind after.

nobody who'd want to put 551 amps through jumper wire has access to a 551 amp source

[-] collapse_already@lemmy.ml 4 points 7 hours ago

I got some three-phase high-voltage transmission lines near my house. Going to save on my electric bill by using 22 awg to hook directly to them, bypassing the meter. Cause I am bigly smart.

Just put some jumper cables on an overhead line and run them into a transformer

[-] FloridaBoi@hexbear.net 23 points 14 hours ago* (last edited 14 hours ago)

my company is going full steam ahead with AI stuff and a coworker (who is lebanese and we talk about palestine but he has jewish cabal conspiracy brainworms) loves the promise (fantasy?) of AI, especially GenAI. This mfer uses it to summarize short articles and write his emails. I feel like I'm a crazy person because I enjoy reading stuff and writing too.

He sent me a demo yesterday where they had a local instance of an LLM trained on internal data, and sure enough it was able to pull info from disparate sources, and it was legit kinda neat. Most of what it did was chatbot stuff but with NLP and NLG. To me, this seems like a really complicated way of having a search algorithm, which we know to be more efficient and faster, especially since it was just fetching info.

However it was only neat because it was running on internal data with strict boundaries, and it obscures the fact that a massive, comprehensive data dictionary had to be made and populated by people to allow for these terms/attributes/dimensions to be linked together. One of the things it did in the demo was execute SQL based off a question like "how many of these items on this date?", which it turned into select sum(amount) from table where report_date = date, and it also provided graphs to show fluctuations in that data over time. I didn't validate the results, but I would hope it wouldn't make stuff up, especially since the training set was only internal. My experience with other AI apps is that you can ask the thing the same question and you'll get different results.
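To make the "complicated search" point concrete, here is a minimal sketch of answering that demo question directly with a plain parameterized query; the items table, its columns, and the data are all made up for illustration:

```python
# Answering "how many of these items on this date?" without the LLM step.
# Schema and rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (report_date TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?)",
    [("2024-10-03", 5), ("2024-10-03", 7), ("2024-10-04", 2)],
)

# The same aggregation the demo's LLM generated, run directly.
(total,) = conn.execute(
    "SELECT SUM(amount) FROM items WHERE report_date = ?",
    ("2024-10-03",),
).fetchone()
print(total)  # 12
```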

[-] GaveUp@hexbear.net 3 points 6 hours ago* (last edited 6 hours ago)

but I would hope it wouldn't make stuff up especially since the training set was only internal

I use an internal LLM at one of the biggest tech companies and it makes shit up all the time lol

[-] FloridaBoi@hexbear.net 1 points 3 hours ago

Jfc. Like who do you blame here? The model for being stupid, or the prompter for not validating? And if they're validating, are there any time savings?
