this post was submitted on 06 Mar 2026
81 points (100.0% liked)

technology

I feel like I just need somewhere to vent right now and I don't really have anywhere. I started this new job about a month ago, and the whole schtick was that we'd be rewriting this old app to meet a tight deadline, then immediately putting it on a code freeze while we build something better to replace it.

Fine, I get it: tight deadlines and lots to do, of course you're going to be tempted to use AI. When you're architecting a complex application on top of APIs you've not worked with before, you're going to miss stuff anyway. But honestly, out of the four of us, two of the mobile devs are some of the biggest pig shits I've ever had to work with. Fundamentally having to explain that this guy can't just merge his changes straight into the main branch, or that he shouldn't leave file-spanning comments spat out by Gemini to explain code he hasn't even bothered to read. Shit that goes beyond being in a rush, into not being competent to do the job you've been given.

Again, fine. I can get around some of this with the promise of slowing down, picking up the new projects, and enforcing some of the higher quality standards that everyone supposedly wishes for. Fine.

Nope. Whilst me and the other competent dev are trying to sort out the slop that has been dumped on the app in question over the last month, the deadline has passed us (something I said was going to happen, but I was ignored), and the two vibe coders have complained that they "don't have any work to do" and are now architecting the next fucking project with senior team members, whilst I'm getting grilled over "why is XYZ taking so long?" Because the fucking code is vibe coded, you fucking hacks, and you rewarded them with the new project.

It's disheartening, because I said all of this upfront, but because I wasn't the first person on the team out of the gate, my opinions are worth about as much as this rant will (understandably) be worth to most of you: a pittance.

I'm not the best programmer on earth, not even the most experienced, but it does my fucking head in to have imposter syndrome every morning when I log into work, because there are two sodding imposters in front of me getting the credit!!! Absolute dog shit.

I don't know. round time. free Palestine and free Iran and death 2 America xxx

[–] fox@hexbear.net 19 points 1 day ago (2 children)

You tell an LLM to write some code. You do not inspect it. If something breaks, ask the LLM to fix it. And so on.

[–] Belly_Beanis@hexbear.net 9 points 1 day ago

I really, really scratch my head at this being allowed, because I used to be a CS major. At the time, there was a debate between object-oriented design and logic programming. We were taught both because it was assumed we'd encounter both in the workforce: we'd take jobs where logic programs were already in place, while newer projects would be object-oriented.

Here we are only a decade or so later, and all of that is out the window in favor of the least efficient code possible.

[–] built_on_hope@hexbear.net 7 points 1 day ago (2 children)

That’s absolutely wild. A few months ago I made the switch to Linux and thought I’d try asking GPT/DeepSeek for help with various things. Following their advice was an absolute minefield, and I broke everything before managing to fix it again. Lesson learned. If they’re that bad at fairly basic Linux things, I can’t imagine them being at all competent at writing whole chunks of code or fixing problems. Every time I asked for help fixing an issue, it would just tell me to change a different, unrelated environment file, in a way that was usually wrong and often damaging.

[–] fox@hexbear.net 11 points 1 day ago (1 children)

LLMs have no knowledge. They produce statistically likely text that passes as confident language. All output from an LLM is always a "hallucination", but sometimes that output is coincidentally true or accurate. Don't rely on them for anything, because their output is always confident language even when it's shit like "delete system32 to improve speeds".

[–] built_on_hope@hexbear.net 1 points 11 hours ago

Ya I learned my lesson. Never again

They’re better than you’d expect at some things and worse than you’d expect at others. Competency just doesn’t seem to grow in an LLM the same way it does in a human, which makes some amount of sense.