this post was submitted on 09 Jun 2025
439 points (99.1% liked)
Technology
God, I hate security "researchers". If I posted an article about how to poison everyone in my neighborhood, I'd be getting a knock on the door. This kind of shit doesn't help anyone. "Oh, but the state-funded attackers, remember Stuxnet". Fuck off.
Without researchers like that, someone else would figure it out and use it maliciously without telling anyone. This researcher got Google to close the loophole that the exploit requires before publicly disclosing it.
That's the fallacy I'm alluding to when I mention Stuxnet. We have really well-funded, well-intentioned, intelligent people creating tools, techniques, and overall knowledge in a field. Generally speaking, some of these findings are more makings than findings.
This disclosure was from last year and the exploit was patched before the researcher published the findings to the public.
I think the method of researching and then informing the affected companies confidentially is a good way to do it, but companies often ignore these findings. They have to be publicized somehow to pressure companies into fixing the problem.
Indeed, and then it becomes a market and incentivises more research in that area, which I don't think is helpful for anyone. It's like having "professional pessimist" as your job description. We could be putting that amount of effort into building more secure software to begin with.
I think it's important for users to know how vulnerable they really are, and for providers to have a fire lit under their ass to patch holes. I think it's standard practice to alert providers to these findings early, but I'm guessing a lot of them already knew about the vulnerabilities and often don't give a shit.
Compared to airing this dirty laundry I think the alternatives are potentially worse.
Hmm, I don't know... Users usually don't pay much attention to security. And the disclosure method actively hides the vulnerability from users until it no longer matters.
For providers, I understand, but can't fully agree. I think it's a misguided culture that creates busy-work at all levels.