Your phrasing seems to imply I said it was, but I never said that.
If you are in a discussion about the development and deployment of technology to facilitate a surveillance state, then saying “technology is neutral” is the least interesting thing you could possibly say on the subject.
In a completely abstract, disconnected-from-society-and-current-events sense it is correct to say technology is amoral. But we live in a world where surveillance technology is developed to make it easier for corporations and the state to invade the privacy of individuals. We live in a world where legal rights are being eroded by the use of this technology. We live in a world where this technology is profitable because it helps organizations violate individual rights. If you live in the US, as I do, then you live in a world where federal law enforcement agencies have become completely contemptuous of the law and are literally abducting innocent people off the street. They use the technology under discussion here to help them do that.
That a piece of tech might potentially be used for a not-immoral purpose is completely irrelevant to how it is actually being used in the real world.
And that is what we need to focus our messaging on: the evil people and institutions enabling this, because those are the permanent part. Tech comes and goes (and should not be anthropomorphized). Focusing on the tech just means the institution looks for another path. Focusing on the institution blocks it at the source.
“Technology is neutral” is a bromide engineers use to avoid thinking about how their work impacts people. If you are an engineer working for Flock or a similar company, you are harming people. You are doing harm through the technology you help to develop.
The massive surveillance systems that currently exist were built by engineers who advanced technology for that purpose. The scale and totality of the resulting surveillance states are simply not possible without the tech. The closest alternatives are Stasi-like systems that are nowhere near as vast or continuous. In the actual world, the actual tech is immoral, because it was created for immoral purposes and because it is used for immoral purposes.
All technology has that potential, some more than others. The issue is that institutions like Flock exist solely for the evil applications.
As I said before: in a conversation about technology as it actually exists, talking about potentials is not interesting. Yes, all technology has the potential to be good or bad. The massive surveillance tech is actually bad, right now, in the real world.
The issue with asserting that technology is neutral is that it lets the people who develop it ignore the impacts of their work. The engineers who make surveillance tech make it, ultimately, for immoral purposes. When they are confronted with the effects of their work on society, they avoid reckoning with the ethics of what they are doing by deploying bromides like “technology is neutral.”
Example: Building an operant conditioning feedback system into a social media app or video game is not inherently bad; you could use it to reinforce good behaviors and deploy it ethically by obtaining the consent of the people you use it on. But the operant conditioning tech in social media apps and video games that actually exists is very clearly and unambiguously bad. It exists to get people addicted to a game or media app so that they can be more easily exploited. Engineers built that tech stack out for the purpose of exploiting people. The tech, as it exists in the real world, is bad. When these folks were confronted with what they had done, they responded by claiming that tech is not inherently good or bad. (This is a real thing social media engineers really said.) They ignored the tech, as it actually exists, in favor of an abstract conversation about some potential alternative tech that does not exist. The effect of which is that the people doing harm built a terrible system without ever confronting what it was they were doing.
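As a purely illustrative aside, here is a minimal sketch of the mechanism being described: a variable-ratio reward schedule, the operant-conditioning pattern most associated with engagement loops. All names and numbers are invented for illustration; this is not any real app's code.

```python
import random

def variable_ratio_reward(mean_actions: int = 5) -> bool:
    """Deliver a reward with probability 1/mean_actions, so rewards arrive
    after an unpredictable number of actions (a variable-ratio schedule)."""
    return random.random() < 1.0 / mean_actions

# Simulate a user performing 20 actions (scrolls, pulls-to-refresh, matches).
# Whether this loop reinforces a good habit or fuels compulsive use is a
# deployment decision made by people, not a property of the few lines above.
for action in range(1, 21):
    if variable_ratio_reward():
        print(f"action {action}: reward delivered")
```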
I don't see how that is the case. The tech is neutral, but the engineers know what application they are being hired to build. That is determined by people and subject to morality.
Would you say OpenCV or the people working on it are evil? I wouldn't. I would say that someone who takes that project and builds Flock out of it is evil.
I think this framing is more important when talking with the general public, as they are likely to walk away thinking that it's the tech that creates problems and not the for-profit corporations, who will be free to continue doing the same so long as they don't use that tech.
It is literally the case. People who have made tools to do bad things justified it by claiming that tech is neutral in an abstract sense. Find an engineer who is building a tool to do something they think is bad, and they will tell you that bromide.
OpenCV is not, in itself, immoral. But OpenCV is, once again, actual tech that exists in the actual world. In fact, that is how I know it is not bad: I use the context of reality, rather than hypotheticals or abstractions, to assess the morality of the tech. The tech stack that makes up Flock is bad; once again, I make that determination by using the actual world as a reference point. It does not matter that some of the tech could be used to do good. In the case of Flock, it is not, so it’s bad.
Bolded a keyword there for you.
At no point in this conversation have I ever said that tech in an abstract sense is inherently good or bad. The point that I am making (and this is the last time I will make it) is that it is not interesting to talk about the ethics of some technology in the abstract in cases where the actual tech, as it is actually implemented, is clearly bad.
Saying “tech is neutral” is a dodge. People say that to avoid thinking about the ethics of what it is they are doing.
But that is what you are doing, and I am saying that it is people who are responsible for the implementation.
People are the ones who do things with tech; hence they are responsible for the actions. Tech is just an object with no will of its own to do right or wrong.
Last attempt, I swear.
By digressing to abstraction, good people can and do justify building tech for immoral purposes. It is irrelevant that tech is not inherently good or bad in cases where it is built to do bad things. Talking about potential alternate uses in cases where tech is being used to do bad is just a way of avoiding the issues.
I have no problem calling Flock’s or Facebook’s tech stack bad, because the intentions behind the tech are immoral. The application of the tech by those organizations is for immoral purposes (making people addicted, invading their privacy, etc.). The tech is an extension of bad people trying to do bad things. Commentary about tech’s abstract nature is irrelevant at that point. Yeah, it could be used to do good. But it’s not. Yeah, it is not in and of itself good or bad. Who cares? This instantiation of the tech is immoral, because its purposes are immoral.
The engineers who help make immoral things possible should think about that, rather than the abstract nature of their technology. In these cases, saying technology is neutral is to invite the listener to consider a world that doesn’t exist instead of the one that does.
And did those assemble themselves to be evil? Or did someone make them that way?
To go back to my OpenCV example: it is just tech. It does not become an LPR with a cop back end until Flock configures it that way.
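As an illustrative aside, here is a minimal sketch of that point, assuming a stock opencv-python install and a placeholder image file. The detector below is the neutral part; the surveillance only appears in the deliberately omitted step that ships detections to a back end.

```python
import cv2

# Stock OpenCV ships a generic Haar cascade for license plates; loading it
# does nothing but detect rectangles that look like plates.
plate_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_russian_plate_number.xml"
)

img = cv2.imread("car.jpg")  # placeholder image path
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Detect candidate plate regions. This is the "just tech" part.
plates = plate_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
for (x, y, w, h) in plates:
    print(f"plate candidate at x={x}, y={y}, size {w}x{h}")

# The "cop back end" would be an extra step here: sending each crop plus a
# timestamp and location to a law-enforcement database. That wiring is a
# configuration choice made by people; it is not in the library.
```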
Yes, exactly my point.
The technology enables the surveillance state. Therefore the technology is not amoral.