this post was submitted on 18 Dec 2025
125 points (96.3% liked)

Ask Lemmy


Every industry is full of technical hills that people plant their flag on. What is yours?

top 50 comments
[–] dfyx@lemmy.helios42.de 96 points 3 weeks ago (4 children)

For any non-trivial software project, spending time on code quality and a good architecture is worth the effort. Every hour I spend on that saves me two hours when I have to fix bugs or implement new features.

Years ago I had to review code from a different team and it was an absolute mess. They (and our boss) defended it with "That way they can get it done faster. We can clean up after the initial release". Guess what, that initial release took over three years instead of the planned six months.

[–] Flax_vert@feddit.uk 25 points 3 weeks ago (3 children)

The joys of agile programming....

[–] dfyx@lemmy.helios42.de 21 points 3 weeks ago

What they did was far beyond "agile". They didn't care about naming conventions, documentation, keeping commented-out code out of commits, or using existing solutions (both in-house and third-party) instead of reinventing the wheel...

In that first review I had literally hundreds of comments that each on their own would be a reason to reject the pull request.

[–] rowinxavier@lemmy.world 64 points 3 weeks ago (19 children)

I work in disability support. People in my industry fail to understand the distinction between duty of care and dignity of risk. When I go home after work I can choose to drink alcohol or smoke cigarettes. My clients who are disabled are able to make decisions including smoking and drinking, not to mention smoking pot or watching porn. It is disgusting to intrude on someone else's life and shit your own values all over them.

I don't drink or smoke but that is me. My clients can drink or smoke or whatever based on their own choices and my job is not to force them to do things I want them to do so they meet my moral standards.

My job is to support them in deciding what matters to them and then help them figure out how to achieve those goals and to support them in enacting that plan.

The moment I start deciding what is best for them is the moment I have dehumanised them and made them lesser. I see it all the time but my responsibility is to treat my clients as human beings first and foremost. If a support worker treated me the way some of my clients have been treated there would have been a stabbing.

[–] RebekahWSD@lemmy.world 17 points 3 weeks ago

Disabled people are so often treated like children and it just sucks.

[–] flamingo_pinyata@sopuli.xyz 54 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

Not strictly technical, although organizational science might be seen as a technical field on its own.

Regularly rotating people between teams is desirable.

Many companies just assign you to a team and that's where you're stuck forever until you quit. Slightly better places will try to find a "perfect match" for you.

What I'm saying is that moving people around is even better:
You spread institutional knowledge around.
You keep everyone engaged. Typically in a new job you learn for the first few months, then hit a peak of productivity when you're full of new ideas. After about two years you reach either a plateau or complacency.

[–] Kyle_The_G@lemmy.world 12 points 3 weeks ago

I'm in health sciences and I wish we did more education days/conferences. I'm a med lab tech and I feel like no one knows what the lab actually does; they just send samples off and the magic lab gremlins divine these numbers/results. I feel the same way when another discipline discusses what they do. It's always interesting!

[–] DasFaultier@sh.itjust.works 48 points 3 weeks ago (4 children)

Not everything needs to be deployed to a cluster of georedundant K8s nodes, not everything needs to be a container, Docker is not always necessary. Just run the damn binary. Just build a .deb package.

(Disclaimer: yes, all those things can have merit and reasons. Doesn't mean you have to shove them into everything.)
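
For the simple single-binary case, the ".deb" route really is only a few commands. A minimal sketch; the package name, paths, and maintainer here are made up, and the "binary" is a stub shell script so the example is self-contained:

```shell
# Stand-in for your real binary, so this sketch runs as-is.
mkdir -p myapp_1.0-1/DEBIAN myapp_1.0-1/usr/local/bin
printf '#!/bin/sh\necho "hello from myapp"\n' > myapp_1.0-1/usr/local/bin/myapp
chmod 755 myapp_1.0-1/usr/local/bin/myapp

# Minimal control file: the only metadata dpkg strictly needs.
cat > myapp_1.0-1/DEBIAN/control <<'EOF'
Package: myapp
Version: 1.0-1
Architecture: all
Maintainer: You <you@example.com>
Description: A single binary, no cluster required
EOF

dpkg-deb --build myapp_1.0-1   # produces myapp_1.0-1.deb
```

The result installs with `apt install ./myapp_1.0-1.deb` and uninstalls cleanly, which is most of what a container buys you for a simple service.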

[–] slazer2au@lemmy.world 14 points 3 weeks ago (4 children)

But then how will I ship my machine seeing as it works for me?

[–] Godnroc@lemmy.world 46 points 3 weeks ago (3 children)

Cleaning, organizing, and documentation are high priorities.

Every job I've worked at has had mountains of "the last guy didn't..." that you walk into, and it's always a huge pain in the ass. They didn't throw out useless things, they didn't bother consolidating storage rooms, and they never wrote down any of their processes, procedures, or rationales. I've spent many hours at each job just detangling messes because the other person was too busy, or thought it unimportant and didn't bother to spend the time.

Make it a priority, allocate the time, and think long-term.

[–] NOT_RICK@lemmy.world 14 points 3 weeks ago

Starting a new job soon, and I’m paying for some holes in documentation as I prep my offboarding documentation for my current team. Definitely making it a priority to do better going forward! Being lazy in the moment is nice but the “stitch in time” adage is definitely true

[–] mech@feddit.org 13 points 3 weeks ago

Make it a priority, allocate the time, and think long-term.

In many jobs, someone with the power to fire you makes the priorities, allocates your time and does not think long-term.

[–] JackbyDev@programming.dev 38 points 3 weeks ago (3 children)

This is a non technical hill but it is applicable to my technical career. The hill is that REMOTE WORK WORKS. I am so frustrated that so many businesses are going back to hybrid or full RTO.

[–] Thermite@lemmings.world 19 points 3 weeks ago (3 children)

RTO is about control and management/owners thinking that everyone else is lazy and would not do anything if not constantly pushed. I believe that is because they are the kind of people who would need that kind of supervision.

The financial side is that making people go to work maintains value. The money you spend on lunch, travel, dry cleaning, maintenance of cars, and the increased value of property near places of business add to the ownership class's wealth. All that money you spend traveling to/from and while you are at work goes to them. If you save that money by working from home, the wealth stays with you.
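
The money claim is easy to sanity-check with back-of-the-envelope arithmetic. A sketch with purely illustrative numbers; plug in your own:

```python
# Rough yearly cost of commuting; every figure below is a made-up example.
work_days     = 230     # commuting days per year
commute_miles = 30      # round trip
cost_per_mile = 0.65    # fuel, wear, insurance (rough per-mile rate)
lunch_delta   = 12.0    # bought lunch vs. eating at home
minutes_lost  = 60      # unpaid commute time per day

driving = work_days * commute_miles * cost_per_mile
lunches = work_days * lunch_delta
hours   = work_days * minutes_lost / 60

print(f"~${driving + lunches:,.0f} per year, plus {hours:.0f} unpaid hours")
```

Even with conservative numbers, that's thousands of dollars a year that stays in your pocket when working from home.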

[–] kescusay@lemmy.world 35 points 3 weeks ago (3 children)

React sucks. I'm sorry, I know it's popular, but for the love of glob, can we not use a technology that results in just as much goddamn spaghetti code as its closest ancestor, jQuery? (That last bit is inflammatory. I don't care. React components have no opinionated structure imposed on them, just like jQuery.)

[–] jordanlund@lemmy.world 35 points 3 weeks ago (2 children)

AI is a fad, and when it collapses, it's going to do more damage than any perceived good it's had to date.

[–] tal@lemmy.today 11 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I can believe that LLMs might wind up being a technical dead end (or not; I could also imagine them being a component of a larger system). My own guess is that language, while important to thinking, won't be the base unit of how thought is processed the way it is on current LLMs.

Ditto for diffusion models used to generate images today.

I can also believe that there might be surges and declines in funding. We've seen that in the past.

But I am very confident that AI is not, over the long term, going to go away. I will confidently state that we will see systems that will use machine learning to increasingly perform human-like tasks over time.

And I'll say with lower, though still pretty high, confidence that the computation done by future AI will very probably be done on hardware oriented towards parallel processing. It might not look like the parallel hardware of today. Maybe we find that we can get away with a lot more sparseness, and with dedicated subsystems that individually require less storage. Yes, neural nets approximate something that happens in the human brain, and our current systems use neural nets. But the human brain runs at something like a 90 Hz clock and definitely has specialized subsystems, so it's a substantially different system from something like Nvidia's parallel compute hardware today (1,590,000,000 Hz and homogeneous hardware).

I think that the only real scenario where something puts the kibosh on AI is if we reach a consensus that superintelligent AI is an unsolvable existential threat (and even then, I think we're likely to go as far as we can on limited forms of AI while trying to maintain enough of a buffer not to fall into the abyss).

EDIT: That being said, it may very well be that future AI won't be called AI, and that we'll think of it differently, not as some kind of special category built around a set of specific technologies. For example, OCR (optical character recognition) software and speech recognition software both typically make use of machine learning today; those are established, general-use product categories that get used every day, but we typically don't call them "AI" in popular use in 2025. When I call my credit card company, say, and navigate a menu system that uses speech recognition, I don't say that I'm "using AI". Same sort of way that we don't call semi trucks or sports cars "horseless carriages" in 2025, though they derive from devices that were once called that. We don't use the term "labor-saving device" any more either; I think of a dishwasher or a vacuum cleaner as distinct devices and don't really associate them with each other. But back when they were being invented, the idea of household machines that could automate human work using electricity did fall into a sort of bin like that.

[–] 0x0@lemmy.zip 30 points 3 weeks ago (10 children)

Weird, I haven't seen this one yet: the cloud is just someone else's computers.

[–] MudMan@fedia.io 28 points 3 weeks ago (21 children)

Is there anybody on Lemmy that isn't a software engineer of some description? No? Anyone?

[–] SchmidtGenetics@lemmy.world 13 points 3 weeks ago
[–] buttmasterflex@piefed.social 13 points 3 weeks ago

I'm a geologist!

[–] slazer2au@lemmy.world 10 points 3 weeks ago (1 children)

Yes, me. I am a network engineer with an expired CCNA

[–] slazer2au@lemmy.world 25 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

They should stop teaching the OSI model and stick to the DoD TCP/IP model.

In the world of computer networking you are constantly hammered with the OSI model and how computer communication fits into it. But outside of specific legacy uses, nothing runs the OSI protocol suite; everything runs TCP/IP.
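
The point is visible from ordinary application code: the sockets API hands you a transport-layer endpoint and the OS handles everything below it, and there is no session or presentation layer to call into. A self-contained sketch over loopback:

```python
import socket
import threading

def serve_once(srv):
    # By the time we see data, the OS has already done IP routing and
    # link framing; applications only ever touch a transport byte stream.
    conn, _ = srv.accept()
    conn.sendall(conn.recv(64).upper())
    conn.close()

srv = socket.socket()            # defaults: IPv4, TCP
srv.bind(("127.0.0.1", 0))       # loopback; the OS picks a free port
srv.listen(1)
t = threading.Thread(target=serve_once, args=(srv,))
t.start()

cli = socket.create_connection(srv.getsockname())
cli.sendall(b"hello")            # application data in...
reply = cli.recv(64)             # ...transport delivers bytes back
t.join()
print(reply)
```

Nothing here maps onto OSI's session or presentation layers; those exist only in the model.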

[–] flamingo_pinyata@sopuli.xyz 12 points 3 weeks ago (10 children)

Understanding that other protocols are possible is important. Sure, reality doesn't fit neatly into the OSI model, but it gives you a conceptual idea of everything that goes into a networking stack.

[–] KokusnussRitter@discuss.tchncs.de 25 points 3 weeks ago (6 children)

I fucking hate AI in HR/hiring. I try so hard not to spread my personal data to LLM/AI ghouls, and the moment I apply for a job I need to survive, I have to accept that the HR department's AI sorting hat now knows a shit ton about me. I just hope these are closed systems. If anyone from an HR department knows more, please let me know.

[–] vrighter@discuss.tchncs.de 25 points 3 weeks ago (3 children)

the hill i am willing to die on is: FUCK AI. I'll be dead before I let it write a single line of code.

[–] Horsey@lemmy.world 24 points 3 weeks ago (5 children)

Transparency + blur + drop shadow is peak UI design and should remain so for the foreseeable future. It provides depth, which adds visual context. Elements onscreen should not appear flat; our human predator brains are hardwired and physiologically evolved to parse depth information.

[–] EponymousBosh@awful.systems 23 points 3 weeks ago (3 children)

Cognitive behavioral therapy/dialectical behavioral therapy are not the universal cure for everything and they need to stop being treated as such

[–] corsicanguppy@lemmy.ca 13 points 3 weeks ago (4 children)

I'll join you on this hill, soldier.

CBT is the only one they've tested, and they tested it themselves, so of course it looks great. It offloads all success and failure 100% onto the victim, and so many failures never reflect on the process, ever. It resembles a massive sham.

My counsellor friend calls it "Six Sigma for mental health" and notes how it's often not covered by insurance (even outside America's mercenary system), so it's a nice cash cow for the indu$try.

[–] Fafa@lemmy.world 22 points 3 weeks ago* (last edited 3 weeks ago) (8 children)

Okay, I'm pretty late to the party, but here we go. My field is illustration and art, and color theory especially is something that all too often is taught plainly wrong. I think it was in the 1950s when Johannes Itten introduced his book on color theory. In this book, he states that there are three "Grundfarben" (base colors) that will mix into every color. He explained this model with a color ring that you will still find almost everywhere. This model, and the claim that there are three Grundfarben, is wrong.

There are different angles from which you can approach color mixing in art, and it always depends on what you want to do. When we speak about colors, we actually mean the experience we humans have when light rays fall into our eyes. So it's a perceptual phenomenon, which means it varies slightly from individual to individual. For example, a greenish blue might be a little more green for one person or a little more blue for another.

Every color, however, has its opposite color. Everybody can test this. Look into a red (not too bright) light for some time and then onto a white wall. The color you will see is the opposite. They will cancel each other out and become white / neutral.

Itten's color model, however, is not based in perception. In this model yellow is opposed to violet, which might mix to a neutral color with pigments but not with light rays. But even that doesn't work a lot of the time. I mean, even his book is printed in six colors, even though his three base colors are supposedly enough to mix every color..
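
The light-ray side of this is easy to demonstrate numerically. A rough sketch in additive RGB, where a color's complement is its per-channel inverse; this is a crude stand-in for true perceptual opponency, but it already shows that yellow's light complement is blue, not violet:

```python
def complement(rgb):
    """Additive (light) complement: invert each channel."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

def additive_mix(c1, c2):
    """Average two lights; a color plus its complement lands on neutral grey."""
    return tuple((a + b) // 2 for a, b in zip(c1, c2))

red = (200, 30, 30)
opposite = complement(red)              # a cyan-ish color
neutral = additive_mix(red, opposite)
print(neutral)                          # (127, 127, 127): neutral grey

print(complement((255, 255, 0)))        # yellow's complement: (0, 0, 255), blue
```

With pigments (subtractive mixing) the arithmetic is different again, which is exactly why a single three-primary ring can't describe both.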

Throughout history, a lot of color models have been less than correct, of course. What is so infuriating in Itten's case is that he just plainly ignored the more correct color theory that already existed (by Albert Henry Munsell) and created his own with whatever rules he believed were correct.

Even today, this model and its rules are taught at art schools, and you can see his color circle plastered all over the internet.

Tldr: Johannes Itten's color model is wrong, even though it's almost everywhere.

(Added tldr)

[–] early_riser@lemmy.world 21 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

A plain text physical password notebook is actually more secure than most people think. It's also boomer-compatible. My folks understand that things like their social security cards need to be kept secure and out of public view. The same can be applied to a physical password notebook. I also think a notebook can be superior to the other ways of generating and storing passwords, at least in some cases.

  1. Use the same password for everything: obviously insecure.
  2. Use complex unique passwords for everything: you'll never remember them. If complex passwords are imposed as a technical control, and even worse if you have to change them often, you'll just end up with passwords on post-its.
  3. Use a password manager: you're putting all your eggs in one basket. If the manager gets breached, there goes everything.
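
A notebook works best when each entry is a long random string rather than something memorable. A minimal sketch using Python's `secrets` module; the alphabet and length are arbitrary choices:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def notebook_password(length=16):
    """A unique random password to write down, one per site."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = notebook_password()
print(pw)
```

Generate once, copy it into the notebook, and the threat model shrinks to people with physical access to your desk drawer.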
[–] Tar_alcaran@sh.itjust.works 20 points 3 weeks ago (2 children)

Workplace safety is quickly turning from a factual and risk-based field into a vibes-based field, and that's a bad thing for 95% of real-world risks.

To elaborate a bit: the current trend in safety is "Safety Culture", meaning "Getting Betty to tell Alex that they should actually wear that helmet and not just carry it around". And at that level, that's a great thing. On-the-ground compliance is one of the hardest things to actually implement.

But that training is taking the place of actual, risk-based training. It's all well and good that you feel comfortable talking about safety, but if you don't know what you're talking about, you're not actually making things more safe. This is also a form of training that's completely useless at any level above the worksite. You can't make management-level choices based on feeling comfortable, you need to actually know some stuff.

I've run into numerous issues where people feel safe when they're not, and feel at risk when they're safe. Safety Culture is absolutely important, and feeling safe to talk about your problems is a good thing. But that should come AFTER being actually able to spot problems.

[–] PeriodicallyPedantic@lemmy.ca 20 points 3 weeks ago (15 children)

Dynamic typing sucks.

Type coercion is fine, structural typing is fine, but the compiler should be able to tell whether types are compatible at compile time.
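
Python itself can illustrate the point: with `typing.Protocol` you get structural typing that a static checker such as mypy verifies before anything runs. A sketch (the class names are made up):

```python
from typing import Protocol

class Quacks(Protocol):
    def quack(self) -> str: ...

class Duck:
    def quack(self) -> str:
        return "quack"

class Dog:
    def bark(self) -> str:
        return "woof"

def hear(animal: Quacks) -> str:
    return animal.quack()

sound = hear(Duck())    # fine: Duck matches the protocol by structure
print(sound)
# hear(Dog())           # mypy/pyright reject this line before anything runs
```

No inheritance is declared anywhere; compatibility is decided purely by shape, and the checker still catches the mismatch at analysis time rather than in production.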

[–] Nibodhika@lemmy.world 13 points 3 weeks ago (3 children)

This is one of those things like a trick picture where you can't see it until you do, and then you can't unsee it.

I started with C/C++, so typing was static and I never thought about it much. Then when I started with Python I loved the dynamic typing, until it started to cause problems, and type hints weren't a thing back then. Now it's one of my biggest annoyances with Python.

A similar one is the None type: seems like a great idea, until it's not. Rust's solution (Option) is much, much better. Similar for error handling, although I feel less strongly about that one.
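
For readers who haven't met it, the Rust idea referred to here can be loosely approximated in Python with `Optional`: a static checker then nags you to handle the None case before use. A sketch with a made-up lookup:

```python
from typing import Optional

def find_user(uid: int) -> Optional[str]:
    users = {1: "alice", 2: "bob"}   # made-up lookup table
    return users.get(uid)            # None when absent

# A checker flags any use of the result that ignores the None case,
# loosely like Rust's Option forcing a match on Some/None.
name = find_user(3)
greeting = f"hi {name}" if name is not None else "no such user"
print(greeting)
```

The difference is that Rust makes the check mandatory at compile time, while in Python it's only enforced if you run a checker.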

[–] jode@pawb.social 19 points 3 weeks ago (10 children)

Any tolerance on a part less than +/- 0.001 isn't real. If I can change the size of the part enough to blow it out of tolerance just by putting my hand on it and transferring some of my body temperature into it, then it's just not real.
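
The hand-warmth claim checks out with a quick thermal-expansion estimate; the material and numbers below are illustrative:

```python
# Linear thermal expansion: dL = alpha * L * dT
alpha = 11.7e-6      # 1/degC, roughly right for carbon steel
length_mm = 100.0    # a 100 mm part
delta_T = 5.0        # warmed a few degrees by a hand

growth_mm = alpha * length_mm * delta_T
growth_in = growth_mm / 25.4
print(growth_mm, growth_in)
```

About two tenths of a thousandth of an inch from hand warmth alone, which is already a meaningful bite out of any tolerance much tighter than +/- 0.001 in.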

[–] unknownuserunknownlocation@kbin.earth 17 points 3 weeks ago (7 children)

IT restrictions should be much more conservatively applied (at least in comparison to what's happening in my neck of the woods). Hear me out.

Of course, if you restrict something in IT, you get a theoretical increase in security. You're reducing the attack surface in some way, shape or form, usually at the cost of productivity. But also at the cost of the employees' good will towards the IT department and IT security. Which is an important aspect, since you will never be able to eliminate your attack surface, and employees with good will can be your eyes and ears on the ground.

At my company I've watched restrictions getting tighter and tighter. And yes, it's reduced the attack surface in theory, but holy shit has it ruined my colleagues' attitude towards IT security. "They're constantly finding things to make our job harder." "Honestly, I'm so sick of this shit, let's not bother reporting this, it's not my job anyway." "It will be fine, IT security is taking care of it anyway." "What can go wrong when our computers are so nailed shut?" It didn't used to be this way.

I'm not saying all restrictions are wrong, some definitely do make sense. But many of them have just pissed off my colleagues so much that I worry about their cooperation when shit ends up hitting the fan. "WTF were all these restrictions for that castrated our work then? Fix your shit yourself!"

[–] Bruncvik@lemmy.world 15 points 3 weeks ago (4 children)

Professionally: Waterfall release cycle kills innovation, and whoever advocates it should be fired on the spot. MVP releases and small, incremental changes and improvements are the way to go.

Personally: Don't use CSS if tables do what you need. Don't use Javascript for static Web pages. Don't overcomplicate things when building Web sites.

[–] hawgietonight@lemmy.world 15 points 3 weeks ago

Ebikes are motorbikes, not bicycles.

Not saying they aren't fun or useful at times, but they shouldn't be treated as bicycles.

I don't care if the motor engages using a button, twist grip, your feet or twitching your nose, it is a motor and exceeds your natural body power.

There is no goddamn reason to continue to use magneto ignition in aircraft engines. I've been a Rotax authorized service technician for 13 years, I have never seen the digital CDI installed on a Rotax 900 series engine fail in any way, and you've still got two. Honestly I believe a CDI module is more reliable and less prone to failure than a mechanical magneto. The only reason why we're still using pre-WWII technology in modern production aircraft engines is societal rot.

[–] Treczoks@lemmy.world 15 points 3 weeks ago (4 children)

There are a load of things in IT where using a processor is the wrong choice, and using an FPGA instead would have made a lot of problems a non-issue.

[–] CanadaPlus@lemmy.sdf.org 11 points 3 weeks ago* (last edited 3 weeks ago) (6 children)

Is that controversial? I've always assumed people avoid FPGAs just because they're unfamiliar with them.

[–] greedytacothief@lemmy.dbzer0.com 13 points 3 weeks ago

Maybe not technical, but teaching is weird.

If people aren't having fun or engaged, they're not learning much. People don't care how much you know until they know how much you care. It's so frustrating to come across someone who writes the standards you're supposed to follow and yet is the most boring and fake teacher you've ever experienced.

[–] fruitycoder@sh.itjust.works 13 points 3 weeks ago (2 children)

If you don't understand that development, security, and operations are all one job, you will constantly make crap and probably point at some other team to make excuses about it, but it will actually be your fault.

Programs have to run. They have to be able to change to meet needs. Implementing working security measures is one of those needs.

The number of times I've had to slap devs' hands for wanting to just disable security, or remind security that just shutting things down is a denial of service, is crazy. If it can't deploy, or is constantly down, or uses a stupid amount of resources, it's also worthless, no matter what it looked like for the split second you ran it on the dev machine.

The next patch isn't going to fucking fix it if no one who writes patches knows about the damn issue. Workarounds are hidden technical debt, and you have to assume that they will fucking break on some update later. If you're not updating because updates break your unreported workarounds, you will get ignored by the devs at some point, and they are right to do so.

If you depend on something, communicate with the team that works on it. We can send a fucking petabyte of info around the world and to the moon and back before some people will write a fucking ticket, email, or even an IM. Better to look dumb asking the stupid question than to be an actual idiot and leave something broken for the next decade. We're all dumb; it's why we built computers. Get over it and just talk to people. If you really struggle with it, don't just communicate, over-communicate: say an obvious thing now and again just to keep the dialogue open and test that you're really on the same page.

That's my rant/hill, born from ulcers supporting crappy IT orgs and having to overcome my own shortcomings to actually say something in channels where things can actually change, instead of just griping in private about it.

[–] untorquer@lemmy.world 12 points 3 weeks ago

People are idiots, and it's the designer's duty to remove opportunities for an idiot to hurt themselves, up to and just short of impacting function.

[–] philpo@feddit.org 12 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Adopting more technology and standardisation is good for the EMS sector.

The whole "it was better when we could do what we wanted, and back then we only had real calls with sicker people, and everything was good" attitude is fucking awful and hurting the profession.

Look, you fucking volunteer dick, I know you've been doing this for 10 years longer than me (and I've been doing it for 25 now), but unlike you I did it full-time and probably had more shifts in one year than you've had in your life. Now my back is fucked because back then there was no electrohydraulic stretcher, no stair chair, the ventilator was twice as heavy (and could do basically nothing), and the defibrillator weighed so much we often had to switch who was carrying it after two floors up.

And we had just as many shit calls, but got attacked worse, because the shit 2 kg radios had next to zero coverage indoors, and so did cellphones, which left you unable to even call for backup.

And of course we had longer shifts, needed to work more hours, and the whole job market was even more fucked.

"But we didn't need this and that, we looked at the patient." Yeah, go fuck yourself. MUCH more people died or were harmed because of that. So many things were not seen. And it was all accepted as "yeah, that's how life is".

So fuck everyone in this field and their nostalgia.

[–] Diva@lemmy.ml 11 points 3 weeks ago (2 children)

If you're using modern fabrication techniques, a couple of 10 µF MLCC capacitors in small packages are just as good as the traditional decade capacitors (10 µF, 1 µF, 0.1 µF) for decoupling in pretty much every situation, and you have fewer part varieties to worry about on your bill of materials.
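
A rough way to check the claim: model each part as a series R-L-C and compare the magnitude of the parallel impedance at a noise-relevant frequency. The ESL/ESR numbers below are illustrative guesses for small vs. larger packages, not datasheet values:

```python
import math

def z_cap(f, C, ESL, ESR=0.01):
    """Series R-L-C model of a real capacitor; parasitics are guesses."""
    w = 2 * math.pi * f
    return complex(ESR, w * ESL - 1 / (w * C))

def parallel(zs):
    return 1 / sum(1 / z for z in zs)

f = 100e6  # 100 MHz, where package inductance dominates
two_small = parallel([z_cap(f, 10e-6, ESL=0.4e-9)] * 2)
decade = parallel([z_cap(f, C, ESL=1.0e-9) for C in (10e-6, 1e-6, 0.1e-6)])

print(abs(two_small), abs(decade))  # the two small 10 uF parts win here
```

At low frequency the two 10 µF parts also win trivially (20 µF vs ~11.1 µF of total capacitance), so the decade trio's remaining argument rests on inductance, which small modern packages largely remove.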

[–] chunes@lemmy.world 10 points 3 weeks ago (4 children)

If people used a language that actually leverages the strengths of dynamic typing, they wouldn't dislike it so much.

I encourage every programmer to build a Smalltalk program from the ground up while it's running the entire time. It really is a joy.
