This blog is reserved for more serious things, and ordinarily I wouldn’t spend time on questions like the above. But much as I’d like to spend my time writing about exciting topics, som…
Blog post by crypto professor Matthew Green, discussing what Telegram does (I wasn't familiar with it) and criticizing its cryptography. He says Telegram by default is not end-to-end encrypted. It does have an end-to-end "secret chat" feature, but it's a nuisance to activate and only works for two-person chats (not groups) where both people are online when the chat starts.
It still isn't clear to me why Telegram's founder was arrested. Green expresses some concern over that but doesn't give any details that weren't in the headlines.
I mean, Signal has over 100 million downloads on the Play Store alone. Even on the off chance those phone numbers do somehow end up in the hands of the NSA or whatever, the chances of them actually revealing any real information about you are slim to none.
Even then, you can't assume everyone who uses Signal wants to use e2ee explicitly. Some might just like the app's style, some might have family members who only use Signal, some might have an ethical problem with corporate apps but aren't computer-brained enough to know how SimpleX or Jabber or some other obscure alternative works.
Is the phone number requirement bad? Yes, absolutely. Does that instantly rule out all opportunity for it being a good app, privacy-wise? Definitely not.
Further, privacy should be simple. Signal is designed to be as usable as it can be without compromising too much privacy. They have decided that a phone number is necessary to prevent spam, and to mitigate the privacy implications of that, they have chosen not to block temporary numbers for those who are more concerned.
Private chat apps are useless if no one knows how to use them. Signal tries to fix that, and I think they're doing a pretty good job even if it does have its pitfalls.
I think you missed the point there. The value for the NSA is in knowing which phone numbers communicate with which other phone numbers, which is precisely the metadata that Signal leaks. This allows you to build networks of users who are doing private communication. Next, you can cross-reference the phone numbers with the data from Google, Meta, etc., and then if you see that one of them is a person of interest, you immediately know the other people they talk to, who are now also of interest.
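To make the concern concrete: here's a minimal sketch (all phone numbers and data invented) of how an observer with only sender/recipient metadata can expand one person of interest into their whole contact network with a simple graph traversal.

```python
from collections import defaultdict, deque

# Hypothetical metadata records: (sender, recipient) phone-number pairs.
# No message content needed -- this is exactly the "who talks to whom" data.
records = [
    ("+15550001", "+15550002"),
    ("+15550002", "+15550003"),
    ("+15550003", "+15550004"),
    ("+15550005", "+15550006"),  # an unrelated cluster
]

# Build an undirected contact graph from the metadata.
graph = defaultdict(set)
for a, b in records:
    graph[a].add(b)
    graph[b].add(a)

def network_of_interest(seed, graph):
    """Breadth-first expansion: everyone reachable from the seed number."""
    seen = {seed}
    queue = deque([seed])
    while queue:
        for contact in graph[queue.popleft()]:
            if contact not in seen:
                seen.add(contact)
                queue.append(contact)
    return seen

print(sorted(network_of_interest("+15550001", graph)))
# ['+15550001', '+15550002', '+15550003', '+15550004']
```

Note that +15550005 and +15550006 never show up: metadata alone cleanly separates the seed's social circle from everyone else, which is exactly what makes it valuable for cross-referencing against other data sources.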
The fact that people keep trying to downplay this is truly phenomenal to me. Once again, Signal is an app that uses a central server based in the US, that almost certainly shares data with the government. Anybody who minimally cares about their privacy should be concerned about this.
Signal is not an app that's private. Period. If you don't understand this then you don't understand what the term privacy means.
Assuming that data that can be leaked is being leaked is the only sensible thing to do if you care about privacy. Clearly this is too difficult a concept for you to wrap your head around.
Thanks for at least being honest that you don't actually care about privacy. Bye.
Right, and it's strange to me that such a fundamental rule is being ignored when it comes to Signal. All of a sudden people start making all kinds of excuses as to why it's not a problem in this case.
If the metadata is being leaked then you have to assume it's being used in an adversarial way. Privacy can't be trust based. Either the protocol is secure and it guarantees that your metadata is private, or you're taking it on faith that people operating Signal servers are good actors and will never leak this data to anybody you wouldn't want them to.
Also worth noting that thanks to US laws, Signal would not even be allowed to say they're sharing data with the government even if they wanted to.
As far as I know, nothing in the protocol prevents the server from learning who is talking to whom. In fact, given that the phone number is how an account is identified, the server effectively has to do this in order to pass messages between people.
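A toy sketch of that point (all class and variable names are hypothetical, and this is a deliberately naive model, not Signal's actual server design): even when the message body is end-to-end encrypted and opaque to the server, delivering it to a phone-number-addressed mailbox requires the server to handle the sender/recipient pair, so the operator is in a position to record it.

```python
from collections import defaultdict

class RelayServer:
    """Naive phone-number-addressed message relay (illustrative only)."""

    def __init__(self):
        self.queues = defaultdict(list)  # phone number -> pending messages
        self.metadata_log = []           # what the operator could record

    def send(self, sender, recipient, ciphertext):
        # The body is an opaque e2ee blob, but routing still exposes
        # both identifiers to the server.
        self.metadata_log.append((sender, recipient))
        self.queues[recipient].append(ciphertext)

    def fetch(self, recipient):
        # Hand over and clear the recipient's mailbox.
        msgs, self.queues[recipient] = self.queues[recipient], []
        return msgs

server = RelayServer()
server.send("+15550001", "+15550002", b"<opaque e2ee blob>")
print(server.metadata_log)  # [('+15550001', '+15550002')]
```

A real service could add mitigations on top of this basic scheme, but the sketch shows why routing by a stable identifier like a phone number puts the social graph within the server's reach by default.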