The Flock saga continues.

A handful of police departments that use Flock have unwittingly leaked details of millions of surveillance targets and a large number of active police investigations around the country because they failed to redact license plate information in public records releases. Flock responded to this revelation by threatening a site that exposed it and by limiting the information the public can get via public records requests.

Completely unredacted Flock audit logs have been released to the public by numerous police departments and in some cases include details on millions of Flock license plate searches made by thousands of police departments around the country. The data has been turned into a searchable tool on a website called HaveIBeenFlocked.com, which says it has data on more than 2.3 million license plates and tens of millions of Flock searches.
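
To see why unredacted audit logs are so dangerous in aggregate, it helps to see how little code it takes to turn them into a movement-and-investigation lookup. A minimal sketch in Python, assuming hypothetical column names (Flock's real export format is not public):

```python
import csv
from collections import defaultdict

# Hypothetical columns: the plate searched, the agency that ran the
# search, when, and the stated reason. Real Flock audit logs differ.
searches_by_plate = defaultdict(list)

with open("leaked_audit_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        searches_by_plate[row["plate"]].append(
            (row["agency"], row["timestamp"], row["reason"])
        )

# One lookup now reveals every agency that ever searched a given plate,
# when, and why, across departments that never meant to share anything.
print(searches_by_plate.get("ABC1234", "no searches found"))
```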

The situation highlights one of the problems with taking a commercial surveillance product and turning it into a searchable, connected database of people’s movements and of the police activity of thousands of departments nationwide. It also highlights the risks associated with relying on each and every law enforcement customer to properly and fully redact identifiable information any time someone requests public records; in this case, single mistakes by individual police departments have exposed potentially sensitive information about surveillance targets and police investigations by other departments around the country.
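
For contrast, here is roughly what the redaction pass that departments are being relied on to perform might look like. This is a minimal sketch, not Flock's or any department's actual tooling; the column names and the plate regex are assumptions, and the regex is deliberately over-broad because over-redaction is the safe failure mode for a public records release:

```python
import csv
import re

# Assumed column names; real audit-log exports will differ.
SENSITIVE_COLUMNS = {"plate", "search_query"}
# Deliberately over-broad: better to over-redact free text than to
# leak a single plate in a public records release.
PLATE_PATTERN = re.compile(r"\b[A-Z0-9]{2,8}\b")

def redact_row(row: dict) -> dict:
    """Blank out plate columns and scrub plate-like strings elsewhere."""
    clean = {}
    for column, value in row.items():
        if column in SENSITIVE_COLUMNS:
            clean[column] = "[REDACTED]"
        else:
            clean[column] = PLATE_PATTERN.sub("[REDACTED]", value or "")
    return clean

with open("audit_log.csv", newline="") as src, \
     open("audit_log_redacted.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        writer.writerow(redact_row(row))
```

The point above is that thousands of departments each have to get a step like this right on every single release; HaveIBeenFlocked.com exists because some did not.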

Archive: http://archive.today/yXLPQ

[–] InternetCitizen2@lemmy.world 0 points 1 day ago (2 children)

Your phrasing seems to imply I said it was, but I never said that.

[–] JollyG@lemmy.world 1 points 5 hours ago (1 children)

If you are in a discussion about the development and deployment of technology to facilitate a surveillance state, then saying “technology is neutral” is the least interesting thing you could possibly say on the subject.

In a completely abstract, disconnected-from-society-and-current-events sense it is correct to say technology is amoral. But we live in a world where surveillance technology is developed to make it easier for corporations and the state to invade the privacy of individuals. We live in a world where legal rights are being eroded by the use of this technology. We live in a world where this technology is profitable because it helps organizations violate individual rights. If you live in the US, as I do, then you live in a world where federal law enforcement agencies have become completely contemptuous of the law and are literally abducting innocent people off the street. They use the technology under discussion here to help them do that.

That a piece of tech might potentially be used for a not-immoral purpose is completely irrelevant to how it is actually being used in the real world.

[–] InternetCitizen2@lemmy.world 0 points 4 hours ago (1 children)

to make it easier for corporations and the state to invade the privacy of individuals.

And that is what we need to focus our messaging on: the evil people and institutions enabling this, as those are permanent. Tech comes and goes (and should not be anthropomorphized). Focusing on the tech just means an institution looks for another path. Focusing on the institution blocks it at the source.

[–] JollyG@lemmy.world 1 points 4 hours ago (1 children)

“Technology is neutral” is a bromide engineers use to avoid thinking about how their work impacts people. If you are an engineer working for Flock or a similar company, you are harming people. You are doing harm through the technology you help to develop.

The massive surveillance systems that currently exist were built by engineers who advanced technology for that purpose. The scale and totality of the resulting surveillance states are simply not possible without the tech. The closest alternatives are Stasi-like systems that are nowhere near as vast or continuous. In the actual world, the actual tech is immoral, because it was created for immoral purposes and because it is used for immoral purposes.

[–] InternetCitizen2@lemmy.world 1 points 3 hours ago (1 children)

You are doing harm through the technology you help to develop.

All technology has that potential, some more than others. The issue is that institutions, like Flock, exist solely for the evil applications.

[–] JollyG@lemmy.world 1 points 3 hours ago (1 children)

As I said before: in a conversation about technology as it actually exists, talking about potentials is not interesting. Yes, all technology has the potential to be good or bad. The massive surveillance tech is actually bad, right now, in the real world.

The issue with asserting that technology is neutral is that it lets the people who develop it ignore the impacts of their work. The engineers who make surveillance tech make it, ultimately, for immoral purposes. When they are confronted with the effects of their work on society, they avoid reckoning with the ethics of what it is that they are doing by deploying bromides like “technology is neutral.”

Example: Building an operant conditioning feedback system into a social media app or video game is not inherently bad; you could use it to reinforce good behaviors and deploy it ethically by obtaining the consent of the people you use it on. But the operant conditioning tech in social media apps and video games that actually exists is very clearly and unambiguously bad. It exists to get people addicted to a game or media app so that they can be more easily exploited. Engineers built that tech stack out for the purpose of exploiting people. The tech, as it exists in the real world, is bad.

When these folks were confronted with what they had done, they responded by claiming that tech is not inherently good or bad (this is a real thing social media engineers really said). They ignored the tech—as it actually exists—in favor of an abstract conversation about some potential alternative tech that does not exist. The effect is that the people doing harm built a terrible system without ever confronting what it was they were doing.
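
To make the mechanism in this example concrete: the core of such a system is a variable-ratio reward schedule, the same intermittent-reinforcement pattern used in slot machines. A minimal sketch in Python; the hook names and the probability are illustrative, not any real app's code:

```python
import random

# Variable-ratio schedule: reward an action unpredictably rather than
# every time. Intermittent reinforcement produces far more persistent
# behavior than constant reinforcement, which is why feeds, loot boxes,
# and notification systems use it.
REWARD_PROBABILITY = 0.25  # illustrative; real systems tune this per user

def deliver_reward(user) -> None:
    """Hypothetical hook: a like burst, rare loot, novel content..."""
    ...

def on_user_action(user) -> None:
    if random.random() < REWARD_PROBABILITY:
        deliver_reward(user)
    # No reward: the unpredictability itself drives the next refresh.
```

The same loop could reinforce, say, exercise habits with the user's consent; pointed at session time without consent, it is the exploitation described above.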

[–] InternetCitizen2@lemmy.world 1 points 3 hours ago (1 children)

The issue with asserting that technology is neutral is that it lets the people who develop it ignore the impacts of their work.

I don't see how that is the case. The tech is neutral, but the engineers know what application they are being hired for. That is determined by people and subject to morality.

Would you say OpenCV or the people working on it are evil? I wouldn't. I would say that someone taking that project and building Flock with it is evil.

I think this framing is more important when talking with the general public, as they are likely to walk away thinking that it's the tech that creates problems and not the for-profit corporations, who will be free to continue doing the same so long as they don't use that tech.

[–] JollyG@lemmy.world 1 points 3 hours ago (1 children)

I don’t see how that is the case.

It is literally the case. People who have literally made tools to do bad things justified it by claiming that tech is neutral in an abstract sense. Find an engineer who is building a tool to do something they think is bad, and they will tell you that bromide.

OpenCV is not, in itself, immoral. But OpenCV is, once again, actual tech that exists in the actual world. In fact, that is how I know it is not bad: I use the context of reality—rather than hypotheticals or abstractions—to assess the morality of the tech. The tech stack that makes up Flock is bad; once again, I make that determination by using the actual world as a reference point. It does not matter that some of the tech could be used to do good. In the case of Flock, it is not, so it's bad.

[–] InternetCitizen2@lemmy.world 1 points 2 hours ago (1 children)

People who have literally made tools to do bad things justified it by claiming that tech is neutral in an **abstract** sense

Bolded a keyword in there for you.

[–] JollyG@lemmy.world 1 points 2 hours ago (1 children)

At no point in this conversation have I ever said that tech in an abstract sense is inherently good or bad. The point that I am making—and this is the last time I will make it—is that it is not interesting to talk about the ethics of some technology in the abstract in cases where the actual tech, as it is actually implemented, is clearly bad.

Saying “tech is neutral” is a dodge. People say that to avoid thinking about the ethics of what it is they are doing.

[–] InternetCitizen2@lemmy.world 1 points 2 hours ago (1 children)

it is not interesting to talk about the ethics of some technology in the abstract in cases where the actual tech, as it is actually implemented, is clearly bad.

But that is what you are doing, and I am saying that it is people who are responsible for the implementation.

[–] JollyG@lemmy.world 1 points 2 hours ago (1 children)

Saying “tech is neutral” is a dodge. People say that to avoid thinking about the ethics of what it is they are doing.

[–] InternetCitizen2@lemmy.world 1 points 1 hour ago (1 children)

People are the ones who do things with tech; hence they are responsible for the actions. Tech is just an object with no will of its own to do right or wrong.

[–] JollyG@lemmy.world 1 points 1 hour ago (1 children)

Last attempt, I swear.

By digressing to abstraction, good people can and do justify building tech for immoral purposes. It is irrelevant that tech is not inherently good or bad in cases where it is built to do bad things. Talking about potential alternate uses in cases where tech is being used to do bad is just a way of avoiding the issues.

I have no problem calling Flock's or Facebook's tech stack bad, because the intentions behind the tech are immoral. The application of the tech by those organizations is for immoral purposes (making people addicted, invading their privacy, etc.). The tech is an extension of bad people trying to do bad things. Commentary about tech's abstract nature is irrelevant at that point. Yeah, it could be used to do good. But it's not. Yeah, it is not in and of itself good or bad. Who cares? This instantiation of the tech is immoral, because its purposes are immoral.

The engineers who help make immoral things possible should think about that, rather than the abstract nature of their technology. In these cases, saying technology is neutral is to invite the listener to consider a world that doesn’t exist instead of the one that does.

[–] InternetCitizen2@lemmy.world 1 points 1 hour ago

I have no problem calling Flock's or Facebook's tech stack bad, because the intentions behind the tech are immoral.

And did those assemble themselves to be evil? Or did someone make them that way?

To go back to my OpenCV example: it is just tech. It does not become an LPR with a cop back end until Flock configures it that way.
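
For what it's worth, the configuration step really is that small. A minimal OpenCV sketch; the cascade file path and the back-end calls are hypothetical placeholders, not Flock's code (OpenCV does ship an example plate cascade, haarcascade_russian_plate_number.xml):

```python
import cv2

# Generic OpenCV: a pretrained Haar cascade finds objects in a frame.
# Nothing here is surveillance-specific until the commented lines.
detector = cv2.CascadeClassifier("plate_cascade.xml")  # placeholder file

frame = cv2.imread("camera_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
plates = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in plates:
    crop = frame[y:y + h, x:x + w]
    # The "cop back end" is this configuration: hypothetical OCR plus a
    # hypothetical retained, searchable lookup. The vision code above is
    # identical whether it reads plates for a parking garage or a dragnet.
    # plate_text = run_ocr(crop)                      # hypothetical
    # hotlist_api.submit(plate_text, camera_id, ts)   # hypothetical
```

The detection code is the neutral part; wiring it to a retained, nationally searchable lookup is the decision under dispute.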

The engineers who help make immoral things possible should think about that

Yes, exactly my point.

[–] NaibofTabr@infosec.pub 1 points 1 day ago

The technology enables the surveillance state. Therefore the technology is not amoral.