this post was submitted on 06 Jul 2024
413 points (94.2% liked)

Privacy

31992 readers
368 users here now

A place to discuss privacy and freedom in the digital world.

Privacy has become a very important issue in modern society. With companies and governments constantly abusing their power, more and more people are waking up to the importance of digital privacy.

In this community everyone is welcome to post links and discuss topics related to privacy.

[–] catalog3115@lemmy.world 126 points 4 months ago (8 children)

E2EE is not supposed to protect you if your device gets compromised.

[–] NegativeLookBehind@lemmy.world 53 points 4 months ago (16 children)

One could argue that Windows is compromised right out of the box.

[–] potatopotato@sh.itjust.works 17 points 4 months ago (2 children)

Intrinsically/semantically, no, but the expectation is that the texts are encrypted at rest and the keys are password- and/or TPM+biometric-protected. That's just how this works at this point. It's also the government standard for literally everything from handheld devices to satellites (yes, actually).

At this point one of the most likely threat vectors is someone just taking your shit. Things like border crossings, rubber-stamped search warrants, cops raiding your house because your roommate pissed them off, protests, needing to go home from work near a protest, on and on.

[–] 9tr6gyp3@lemmy.world 11 points 4 months ago* (last edited 4 months ago) (15 children)

If your device is turned on and you are logged in, your data is no longer at rest.

Signal data will be encrypted if your disk is also encrypted.

If your device's storage is not encrypted, and you don't have any type of verified boot process, then that's on you, not Signal.

[–] x1gma@lemmy.world 84 points 4 months ago* (last edited 4 months ago) (6 children)

How in the fuck are people actually defending Signal for this, and with stupid arguments such as "Windows is compromised out of the box"?

You. Don't. Store. Secrets. In. Plaintext.

There is no circumstance where an app should store its secrets in plaintext, and there is no secret which should be stored in plaintext. Especially since this is not some random dude's random project, but a messenger claiming to be secure.

Edit: "If you got malware then this is a problem anyway and not only for signal" - no, because if secure means to store secrets are used, than they are encrypted or not easily accessible to the malware, and require way more resources to obtain. In this case, someone would only need to start a process on your machine. No further exploits, no malicious signatures, no privilege escalations.

"you need device access to exploit this" - There is no exploiting, just reading a file.

[–] lemmyvore@feddit.nl 27 points 4 months ago (3 children)

You. Don't. Store. Secrets. In. Plaintext.

SSH stores the secret keys in plaintext too. In a home dir accessible only by the owning user.

I won't speak about Windows, but on Linux and other Unix systems the presumption is that if your home dir is compromised you're fucked anyway. Effort should be spent on actually protecting access to the personal files in your home dir, not on security theater.

[–] x1gma@lemmy.world 8 points 4 months ago

Kinda expected the SSH key argument. The difference is the average user group.

The average dude with an SSH key that's used for more than their RPi knows a bit about security, encryption, and opsec. They would have a passphrase and/or hardening mechanisms for their system and network in place. They know their risks and potential attack vectors.

The average dude who downloads a desktop app for a messenger that advertises itself as secure and E2EE probably won't assume that any process might just wiretap their whole "encrypted" communications.

Let's not forget that the threat model has changed a lot in recent years, and a lot of effort has gone into providing additional security measures and best practices. Using a secure credential store, additional encryption, and not storing plaintext secrets are a few simple ones. And sure, on Linux the SSH key is still a plaintext file, but keeping it as plaintext is your deliberate decision. You can at least encrypt it with a passphrase. You can use the file permission model of Linux, which actually works, and SSH will refuse to use your key if its permissions are too loose. You would do the same on Windows and Mac and use a credential store and an agent to securely store and use your keys.

And even though your SSH key is a plaintext file and the home dir is presumed secure, you still wouldn't keep a ~/passwords.txt.

[–] floquant@lemmy.dbzer0.com 7 points 4 months ago (3 children)

Not true, SSH keys need their passphrase to be used. If you don't set one, that's on you.

[–] Mubelotix@jlai.lu 12 points 4 months ago* (last edited 4 months ago) (3 children)

Come on, 95% of users don't set passphrases on their SSH keys.

[–] refalo@programming.dev 12 points 4 months ago* (last edited 4 months ago) (2 children)

How in the fuck are people actually defending signal for this

Probably because Android (at least) already uses file-based encryption, and the files stored by apps are not readable by other apps anyways.

And if people had to type in a password every time they started the app, they just wouldn't use it.

[–] Liz@midwest.social 19 points 4 months ago (1 children)

Popular encrypted messaging app Signal is facing criticism over a security issue in its *desktop* application.

Emphasis mine.

[–] ChapulinColorado@lemmy.world 11 points 4 months ago (1 children)

I think the point is the developers might have just migrated the code without adjustments, since that is how it was implemented before. Similar to how PC game ports sometimes run like shit because they are a close 1:1 of the original, which is not always the most optimized or ideal approach, but the quickest to ship.

[–] possiblylinux127@lemmy.zip 8 points 4 months ago (1 children)

If someone has access to your machine, you are screwed anyway. You need to store the encryption key somewhere.

[–] thayer@lemmy.ca 51 points 4 months ago* (last edited 4 months ago) (19 children)

While it would certainly be nice to see this addressed, I don't recall Signal ever claiming their desktop app provided encryption at rest. I would also think that anyone worried about that level of privacy would be using disappearing messages and/or regularly wiping their history.

That said, this is just one of the many reasons why whole disk encryption should be the default for all mainstream operating systems today, and why per-app permissions and storage are increasingly important too.

[–] ooterness@lemmy.world 27 points 4 months ago (1 children)

Full disk encryption doesn't help with this threat model at all. A rogue program running on the same machine can still access all the files.

[–] thayer@lemmy.ca 17 points 4 months ago

It does help greatly in general though, because all of your data will be encrypted when the device is at rest. Theft and B&Es will no longer present a risk to your privacy.

Per-app permissions address this specific threat model directly. Containerized apps, such as those provided by Flatpak, can ensure that apps remain sandboxed and unable to access data without explicit authorization.

[–] HappyTimeHarry@lemm.ee 50 points 4 months ago (4 children)

That applies to pretty much all desktop apps; your browser profile can be copied to get access to all your already-logged-in cookie sessions, for example.

[–] kryllic@programming.dev 10 points 4 months ago

IIRC this is how those Elon Musk crypto livestream hacks worked on YouTube back in the day: the bad actors got hold of cached session tokens and gave themselves access to whatever account they were targeting. Linus Tech Tips had a good bit about it in a WAN Show episode.

[–] Mubelotix@jlai.lu 39 points 4 months ago (1 children)

Sure, I was aware. You have the same problem with SSH keys, GPG keys, and many other things.

[–] mr_satan@monyet.cc 8 points 4 months ago (4 children)

However, you can store SSH and GPG keys encrypted and keep the encryption key in the OS keyring.

[–] DemBoSain@midwest.social 34 points 4 months ago (37 children)

Why is Signal almost universally defended whenever another security flaw is discovered? They're not secure, they don't address security issues, and their business model is unsustainable in the long term.

But, but, if you have malware "you have bigger problems". But, but, an attacker would have to have "physical access" to exploit this. Wow, such bullshit. Do some of you people really understand what you're posting?

But, but, "windows is compromised right out of the box". Yes...and?

But, but, "Signal doesn't claim to be secure". Fuck off, yes they do.

But, but, "just use disk encryption". Just...no...WTF?

Anybody using Signal for secure messaging is misguided. Any one of your recipients could be using the desktop app and there's no way to know unless they tell you. On top of that, all messages filter through Signal's servers, adding a single point of failure to everything. Take away the servers, no more Signal.

[–] Zak@lemmy.world 34 points 4 months ago (3 children)

If someone can read my Signal keys on my desktop, they can also:

  • Replace my Signal app with a maliciously modified version
  • Install a program that sends the contents of my desktop notifications (likely including Signal messages) somewhere
  • Install a keylogger
  • Run a program that captures screenshots when certain conditions are met
  • [a long list of other malware things]

Signal should change this because it would add a little friction to a certain type of attack, but a messaging app designed for ease of use and mainstream acceptance cannot provide a lot of protection against an attacker who has already gained the ability to run arbitrary code on your user account.

[–] douglasg14b@lemmy.world 16 points 4 months ago* (last edited 4 months ago) (2 children)

Not necessarily.

https://en.m.wikipedia.org/wiki/Swiss_cheese_model

If you read anything, at least read this link to self correct.


This is a common area where non-security professionals out themselves as not actually being such: broken, fallacious reasoning about security risk management. Generally the same "dismissive security by way of ignorance" premises.

It's fundamentally the same as "safety" (think OSHA and CSB): the same thought processes, the same risk models, the same risk factors, etc.

And similarly, the same negligence towards filling in the holes in your Swiss cheese model.

"Oh, that can't happen because that would mean x, y, z would have to happen, and those are even worse."

"Oh, that's not possible because A happening means C would have to happen first, so we don't need to consider this as a risk."

Etc.

The logic you're using is the same logic the industry has decades of evidence showing to be wrong.

Decades of evidence indicating that you are wrong, that you know infinitely less than you think you do, and that you most definitely are not capable of exhaustively enumerating all influencing factors. No one is. It's beyond arrogant for anyone to think that they could. 🤦

Thus, most risks are considered valid risks (this doesn't necessarily mean they are all mitigatable though). Each risk is a hole in your model. And each hole is in itself at a unique risk of lining up with other holes, and developing into an actual safety or security incident.

In this case

  • Signal was alerted to this over 6 years ago.
  • The framework they use for the desktop app already has built-in features for this problem.
    • This is a common problem with common solutions that are industry-wide.
  • Someone has already made a pull request to enable the Electron safeStorage API, and Signal has ignored it.

Thus this is just straight up negligence on their part.

There's not really much in the way of good excuses here. We're talking about a run-of-the-mill problem that has baked-in solutions in most major frameworks, including the one Signal uses.

https://www.electronjs.org/docs/latest/api/safe-storage
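For reference, here is a rough sketch of what using that API could look like; the wrapper functions below are illustrative, not Signal's actual code:

```typescript
// Sketch of Electron's safeStorage API (DPAPI on Windows, Keychain on macOS,
// kwallet/libsecret on Linux). Must run in the main process after the app is
// ready. Wrapper names are illustrative, not taken from Signal's codebase.
import { safeStorage } from "electron";
import { readFileSync, writeFileSync } from "node:fs";

function saveDbKey(path: string, dbKey: string): void {
  if (!safeStorage.isEncryptionAvailable()) {
    throw new Error("OS-level encryption is not available");
  }
  // encryptString returns a Buffer encrypted with an OS-managed key.
  writeFileSync(path, safeStorage.encryptString(dbKey));
}

function loadDbKey(path: string): string {
  return safeStorage.decryptString(readFileSync(path));
}
```

Worth noting: on Linux without a keyring daemon, safeStorage can fall back to a weak hardcoded "basic text" key, so it's a mitigation rather than a silver bullet.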

[–] gomp@lemmy.ml 12 points 4 months ago* (last edited 4 months ago) (6 children)

Those are outside Signal's scope and depend entirely on your OS and your (or your sysadmin's) security practices (e.g. I'm almost sure on Linux you need extra privileges for those things on top of just read access to the user's home directory).

The point is, why didn't the Signal devs code it the proper way and obtain the credentials every time (interactively from the user or automatically via the OS password manager) instead of just storing them in plain text?

[–] Prethoryn@lemmy.world 34 points 4 months ago (5 children)

Ah yes, another prime example demonstrating that Lemmy is no different from Reddit. Everyone thinks they are a professional online.

Nothing sensitive should ever lack encryption, especially in the hands of a third-party company managing your data while claiming you are safe and your privacy is protected.

No one is invincible, and it's okay to criticize the apps we hold in high regard. If you are pissed that people are shitting on Signal, you should be pissed that Signal gave people a reason to shit on them.

[–] possiblylinux127@lemmy.zip 10 points 4 months ago

Where are you going to store the encryption key? At the end of the day, the local machine is effectively pwned anyway.

[–] jsomae@lemmy.ml 27 points 4 months ago (3 children)

The real problem is that the security model for apps on mobile is much better than that for apps on desktop. Desktop apps should all have private storage that no other non-root app can access. And while we're at it, they should have to ask permission before activating the mic or camera.

[–] Cuntessera@sh.itjust.works 9 points 4 months ago* (last edited 4 months ago) (3 children)

macOS has nailed it*, even though it's still not as good as iOS or Android; it's leaps and bounds better than Windows and especially Linux.

ETC: *sandboxing/permission system

[–] Vash63@lemmy.world 6 points 4 months ago (1 children)

What's wrong with the Flatpak permissions system on Linux?

[–] Carbophile@lemmy.zip 27 points 4 months ago* (last edited 4 months ago) (5 children)

The backlash is extremely idiotic. The only two options are to store it in plaintext or to have the user enter the decryption key every time they open it. They opted for the more user-friendly option, and that is perfectly okay.

If you are worried about an outsider extracting it from your computer, then just use full disk encryption. If you are worried about malware, they can just keylog you when you enter the decryption key anyways.

[–] mlg@lemmy.world 19 points 4 months ago

Bruh, Windows and Linux have a secrets vault (Credential Manager and keyring respectively, IIRC) for this exact purpose.

Even Discord uses it on both OSes, no problem.
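As a rough illustration of what using those vaults looks like from an app, here is a sketch using the keytar Node binding (now archived, but its API is representative of these vaults): it talks to Credential Manager on Windows, Keychain on macOS, and libsecret/keyring on Linux. The service and account names are made up for the example:

```typescript
// Sketch: store and retrieve a secret in the OS credential vault via keytar.
// Service/account names here are illustrative only.
import * as keytar from "keytar";

async function main(): Promise<void> {
  // Store the secret in the platform vault.
  await keytar.setPassword("my-messenger", "db-encryption-key", "hunter2");

  // Retrieve it later; resolves to null if nothing is stored.
  const secret = await keytar.getPassword("my-messenger", "db-encryption-key");
  console.log(secret); // "hunter2"
}

main().catch(console.error);
```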

[–] Majestic@lemmy.ml 14 points 4 months ago (5 children)

There is just no excuse for not even salting or SOMETHING to keep the secrets out of plaintext. The reason you don't store things in plaintext is that it can lead to even incidental collection. Say you have some software, perhaps spyware (or perhaps it's made by a major corporation, so it doesn't get called that), and it crawls around and happens to upload a copy of all or part of the file containing this info. Now the key has been uploaded and compromised, potentially not even by a malicious actor successfully gaining access to a machine, but by poor practices.

No, it can't stop sophisticated malware specifically targeting Signal to steal credentials and gain access, but it does mean that casual malware which hasn't taken the time to write a module for that is out of luck, and it increases the burden on attackers. No, it won't stop the NSA, but it would still stop someone's 17-year-old niece who knows a little bit about computers but is no malware author from gaining access to your Signal messages and account after watching a YouTube video and following along with simple tools.

The claims that Signal is an op, or that whoever runs it is under a national security letter order to compromise it, look more and more plausible in light of weird, bad basic practices like this and their general hostility. I'll still use it, and it's far from the worst-looking thing out there, but there's something unshakably weird about the lead dev, their behavior, and their practices that can't be written off as being merely a bit quirky.

[–] notannpc@lemmy.world 11 points 4 months ago (2 children)

This just in: threat actors compromising your devices is bad. More at 11.

[–] yogthos@lemmy.ml 6 points 4 months ago (2 children)

This shows an incredibly cavalier approach to security on the part of the team working on Signal.
