[-] Barx@hexbear.net 87 points 2 months ago

Uncritical support for these AI bros refusing to learn CS and thereby making the CS nerds that actually know stuff more employable.

[-] Collatz_problem@hexbear.net 31 points 2 months ago

But to recognize people who know something, you yourself need to know something, and techbros are very often bazingabrained AI-worshippers.

[-] regul@hexbear.net 63 points 2 months ago

walter-yell "INTEGER SIZE DEPENDS ON ARCHITECTURE!"
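
(for the record: C only promises minimum ranges, so the actual byte counts are implementation-defined. A minimal sketch you can compile to see what your own platform picked; the sizes in the comments are common values, not guarantees:)

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* C leaves integer widths to the implementation; sizeof answers for
       the platform the code was actually compiled for. */
    printf("sizeof(int)     = %zu bytes\n", sizeof(int));     /* often 4 */
    printf("sizeof(long)    = %zu bytes\n", sizeof(long));    /* 4 or 8 depending on platform */
    printf("sizeof(void *)  = %zu bytes\n", sizeof(void *));  /* 4 or 8 */
    printf("sizeof(int32_t) = %zu bytes\n", sizeof(int32_t)); /* always 4; that's the point of stdint.h */
    return 0;
}
```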

[-] buh@hexbear.net 37 points 2 months ago

That will also be deduced by AI

[-] FunkyStuff@hexbear.net 60 points 2 months ago

This is simply revolutionary. I think once OpenAI adopts this in their own codebase and all queries to ChatGPT cause millions of recursive queries to ChatGPT, we will finally reach the singularity.

[-] hexaflexagonbear@hexbear.net 24 points 2 months ago

There was a paper about improving LLM arithmetic a while back (spoiler: its accuracy outside of the training set is... less than 100%) and I was giggling at the thought of AI getting worse for the unexpected reason that it uses an LLM for matrix multiplication.

[-] FunkyStuff@hexbear.net 17 points 2 months ago

Yeah lol this is a weakness of LLMs that's been very apparent since their inception. I have to wonder how different they'd be if they had the capacity to stop using the LLM as the output for a second, switch to a deterministic algorithm to handle anything logical or arithmetical, then feed that back to the LLM.

[-] InevitableSwing@hexbear.net 49 points 2 months ago

mallocPlusAI

That made me think of... molochPlusAI("Load human sacrifice baby for tokens")

[-] citrussy_capybara@hexbear.net 30 points 2 months ago

I'd just like to interject for a moment. What you're referring to as molochPlusAI, is in fact, GNU/molochPlusAI, or as I've recently taken to calling it, GNUplusMolochPlusAI.

[-] kleeon@hexbear.net 48 points 2 months ago* (last edited 2 months ago)

modern CS is taking a perfectly functional algorithm and making it a million times slower for no reason

[-] Llituro@hexbear.net 15 points 2 months ago

inventing more and more creative ways to burn excess cpu cycles for the demiurge

[-] bdonvr@thelemmy.club 40 points 2 months ago

Can we make a simulation of a CPU by replacing each transistor with an LLM instance?

Sure it'll take the entire world's energy output but it'll be bazinga af

[-] blame@hexbear.net 17 points 2 months ago

why do addition when you can simply do 400 billion multiply accumulates

[-] WhyEssEff@hexbear.net 39 points 2 months ago* (last edited 2 months ago)

let's add full seconds of latency to malloc with a non-deterministic result, this is a great amazing awesome idea, it's not like we measure the processing speeds of computers in gigahertz or anything

[-] WhyEssEff@hexbear.net 26 points 2 months ago

sorry every element of this application is going to have to query a third party server that might literally just undershoot it and now we have an overflow issue oops oops oops woops oh no oh fuck
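
(what that undershoot looks like in practice, as a rough C sketch; askTheModel() here is a made-up stand-in for the remote guess:)

```c
#include <stdlib.h>

/* Made-up stand-in for "ask the model how much to allocate".
   Here it undershoots: room for 3 ints instead of 5. */
static size_t askTheModel(void) { return 3 * sizeof(int); }

int main(void)
{
    int *nums = malloc(askTheModel());  /* block only holds 3 ints */
    if (nums == NULL)
        return 1;

    for (int i = 0; i < 5; i++)  /* but the program writes 5: the last two stores
                                    run past the end of the block, i.e. a heap
                                    buffer overflow and undefined behavior */
        nums[i] = i;

    free(nums);
    return 0;
}
```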

[-] WhyEssEff@hexbear.net 23 points 2 months ago* (last edited 2 months ago)

want to run an application? better have internet fucko, the idea guys have to burn down the amazon rainforest to puzzle out the answer to the question of the meaning of life, the universe, and everything: how many bits does a 32-bit integer need to have

[-] WhyEssEff@hexbear.net 21 points 2 months ago* (last edited 2 months ago)

new memory leak just dropped–the geepeetee says the persistent element 'close button' needs a terabyte of RAM to render, the linear algebra homunculus said so, so we're crashing your computer, you fucking nerd

[-] WhyEssEff@hexbear.net 23 points 2 months ago* (last edited 2 months ago)

the way I kinda know this is the product of C-Suite and not a low-level software engineer is that the syntax is mallocPlusAI and not aimalloc or gptmalloc or llmalloc.

[-] WhyEssEff@hexbear.net 22 points 2 months ago* (last edited 2 months ago)

and it's malloc, why are we doing this for things we're ultimately just putting on the heap? overshoot a little–if you don't know already, it's not going to be perfect no matter what. if you're going to be this annoying about memory (which is not a bad thing) learn rust dipshit. they made a whole language about it

[-] Llituro@hexbear.net 22 points 2 months ago

if you're going to be this annoying about memory (which is not a bad thing) learn rust dipshit. they made a whole language about it

holy fuck that's so good data-laughing

[-] WhyEssEff@hexbear.net 20 points 2 months ago

wait is this just the e = mc² + AI bit repackaged

[-] Llituro@hexbear.net 36 points 2 months ago

there it is, the dumbest thing i'll see today, probably.

[-] unmagical@lemmy.ml 46 points 2 months ago
[-] iByteABit@hexbear.net 17 points 2 months ago

Please be a bit, please be a bit

[-] peeonyou@hexbear.net 17 points 2 months ago

oh this is just gold

[-] halfpipe@hexbear.net 33 points 2 months ago

Society is 12 hours of internet outage away from chaos.

[-] miz@hexbear.net 32 points 2 months ago

this is definitely better than having to learn how many bytes your implementation uses to store an integer and then multiplying by five.

[-] LodeMike@lemmy.today 18 points 2 months ago
[-] miz@hexbear.net 21 points 2 months ago

whoa, whoa. this is getting complicated!

[-] LodeMike@lemmy.today 18 points 2 months ago* (last edited 2 months ago)

There's no way the post image isn't joking, right? It's literally a template.

edit: Type *arrayPointer = malloc(numItems * sizeof(Type));
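
(spelled out as a tiny complete program, assuming the post's five-int example; no AI required, sizeof does the per-architecture math:)

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t numItems = 5;

    /* sizeof *arrayPointer is the size of one int on whatever architecture
       this is compiled for, so there's nothing to guess. */
    int *arrayPointer = malloc(numItems * sizeof *arrayPointer);
    if (arrayPointer == NULL) {
        fprintf(stderr, "malloc failed\n");
        return 1;
    }

    for (size_t i = 0; i < numItems; i++)
        arrayPointer[i] = (int)i;

    free(arrayPointer);
    return 0;
}
```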

[-] FunkyStuff@hexbear.net 15 points 2 months ago

Yeah lol it does seem like a ton of people ITT missed the joke. Understandably, since the AI people are... a lot.

[-] roux@hexbear.net 28 points 2 months ago

This right here is giving me flashbacks to working with the dumbest people in existence in college, back when I thought I was too dumb for CS and defected to Comp Info Systems.

[-] keepcarrot@hexbear.net 25 points 2 months ago

One of the things I've noticed is that there are people who earnestly take up CS as something they're interested in, but every time tech booms there's a sudden influx of people who would be B- marketing/business majors coming into computer science. Some of them even do ok, but holy shit do they say the most "I am trying to sell something and will make stuff up" things.

[-] Pavlichenko_Fan_Club@hexbear.net 23 points 2 months ago

Chatgeepeetee please solve the halting problem for me.

[-] DefinitelyNotAPhone@hexbear.net 19 points 2 months ago

My guy, if you don't want to learn malloc just learn Rust instead of making every basic function of 99% of electronics take literal seconds.

[-] Zvyozdochka@hexbear.net 19 points 2 months ago

Another contender for top tech innovation

[-] Parzivus@hexbear.net 19 points 2 months ago

I switched degrees out of CS because of shit like this. The final straw was having to write code with pencil and paper on exams. I'm probably happier than I would've been making six figures at some bullshit IT job (biaoqing-copium)

[-] Belly_Beanis@hexbear.net 23 points 2 months ago

Honestly I like writing code on paper more than writing actual code on a computer. But that just means I should have majored in math.

[-] TheSpectreOfGay@hexbear.net 20 points 2 months ago

if it makes u feel better i graduated out of CS and can't find a job bc the field is so oversaturated

[-] GVAGUY3@hexbear.net 16 points 2 months ago

I had to do that once. That was pre-ChatGPT though, so I can imagine it's worse now because of the rampant cheating.

[-] hexaflexagonbear@hexbear.net 19 points 2 months ago* (last edited 2 months ago)
[-] GVAGUY3@hexbear.net 19 points 2 months ago

I do Quality Engineering. This makes me shudder with fear even though I'm not the one coding.

[-] BeamBrain@hexbear.net 18 points 2 months ago* (last edited 2 months ago)

Every time the program compiles, malloc() allocates a different amount of memory. A third of these crash the system because ChatGPT pulled its answer from a joke post claiming that a single integer takes up 128GB.

[-] radio_free_asgarthr@hexbear.net 17 points 2 months ago* (last edited 2 months ago)

Did the C 2024 standard include a specific ChatGPT specification? Seems like an easy opening for unspecified behavior.

[-] Thallo@hexbear.net 16 points 2 months ago

When do we get HexbearPlusAI?

[-] Abracadaniel@hexbear.net 15 points 2 months ago
[-] gay_king_prince_charles@hexbear.net 13 points 2 months ago

BRB abusing LD_PRELOAD, recompiling Linux, pushing to prod and taking a sabbatical in Alaska.
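
(for anyone who hasn't abused LD_PRELOAD before, a rough sketch of how a malloc interposer works; the file and library names are made up, and a real one needs re-entrancy guards this sketch skips:)

```c
/* Build: gcc -shared -fPIC -o libshim.so shim.c -ldl   (names made up)
   Run:   LD_PRELOAD=./libshim.so ./some_program
   Caveat: dlsym() and stdio can themselves allocate, so a production
   interposer needs a re-entrancy guard; this sketch skips that. */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <stddef.h>
#include <stdio.h>
#include <unistd.h>

static void *(*real_malloc)(size_t) = NULL;

void *malloc(size_t size)
{
    if (real_malloc == NULL)  /* look up the libc malloc we're shadowing */
        real_malloc = (void *(*)(size_t))dlsym(RTLD_NEXT, "malloc");

    void *p = real_malloc(size);

    /* snprintf into a stack buffer plus write(), so the logging itself
       doesn't call malloc again */
    char msg[64];
    int n = snprintf(msg, sizeof msg, "malloc(%zu) -> %p\n", size, p);
    if (n > 0)
        write(STDERR_FILENO, msg, (size_t)n);

    return p;
}
```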
