"Some viable alternatives I can recommend are Threema or TextSecure."
The problem with Threema and TextSecure is that it's hard to convince my friends to use them because
- Threema is not free and does not have desktop/web clients, and multi-device setup is not supported afaik
- TextSecure is not really multi-platform
The only popular, multiplatform messengers at the moment are Whatsapp, FB Messenger, Viber/Line and Telegram. Whatsapp doesn't have a desktop client, Messenger requires a Facebook login, Telegram and Whatsapp do not have voice chat, and I hate Viber's design -> so I keep using Telegram for messaging and Skype for voice.
The network effects can be observed very nicely in messenger usage. In my home country almost everyone uses Viber, but in other countries where I have worked (W-EU) almost everyone uses Whatsapp. In Germany people seem to know Telegram; in other places nobody has heard of it.
I know you most likely meant synchronous voice chat, but if you can deal with asynchronous voice chat, you'll be able to have full end-to-end encryption for that and text messages using WhatsApp in partnership with Open Whisper Systems in the near-ish future.
Yeah, that's not really acceptable when I'm calling the tax authorities or my car dealer to arrange an appointment :) for that purpose, Skype is 100% workable for me (even under Linux).
I find it astonishing that in 2015 there's still no org funded well enough to run a server infrastructure and build a clean UI that everyone can use, and smart enough to use crypto correctly, when it comes to chat clients. What gives?
It uses OTR for encrypted chat, so you can use any OTR client for the other side of the chat. I personally use Pidgin on my laptop and ChatSecure on my cell. It currently supports at least iOS and Android, which encompasses most people that I know.
If I recall correctly, the issue with iOS XMPP clients is that you cannot run them in the background. Instead you must either have a provider that does push notifications, or the client has to run constantly, draining the battery and closing automatically every now and then. At least that was the fatal result when I tried to convince friends to use XMPP a while ago.
I personally don't use iOS so I'm not sure whether that is still an issue. They released an update for the iOS client fairly recently so you could give that a shot? The new iOS client seems like it is very nice: https://chatsecure.org.
My experience with ChatSecure (at least the version I got from F-Droid) was that it was too buggy to be worth it. Specifically, it would initiate OTR conversations with people without me asking. I think it even did so when I explicitly told it not to use OTR at all, though it's been a while. If they fix that I might use it again. Do you have this problem, and/or have they fixed it to your knowledge?
I think the version in F-Droid is 13.1.2. If you enable the Guardian repository in F-Droid, that should have the 14.0.9 client, which is considerably nicer. You can disable all OTR if you want, or set it so you need to explicitly enable it when you are chatting with someone. The latest version has adopted the Android L theme as well, so it is very well integrated. (I did have the same issue as you with the version in the F-Droid repository.)
Huh! Just yesterday I was fed up with TextSecure being so buggy and bad that I looked at the Telegram homepage and thought "nice", apart from it not completely being free software (or is it?). It says "Telegram messages are heavily encrypted and can self-destruct.", which made me believe that all messages are heavily encrypted.
Now this post says "By default, messages (...) are logged and stored on Telegram's servers". Damn.
I was spending far too much money sending SMS messages to my friends overseas. A few hundred per month at EUR0.12 each starts to add up pretty quickly. Data is cheap, so instant messengers tend toward being almost free.
Telegram has official clients for my FirefoxOS phone and for my Linux desktop.
I've never used their secret chats. I appreciate their open source nature and am happy to support them.
This is exactly the problem with the community's "greater than thou" tone toward anything non-EU/US engineered. The Telegram client is open source. Threema, for example, is NOT. WhatsApp, for example, is NOT. Facebook Messenger is not. Telegram appears to have the right balance: it gets real peer review from having the system open source and verifiable, while also garnering this huge animosity, which has led to even more review. As far as I can see, Telegram and TextSecure are the only clients where the protocol can be verified through the open source client. I'm not advocating for its use, but I'm sick of the misconceptions.
I find it interesting how there is some dependence on the fingerprint secret being shared across the Telegram channel in order to verify identity.
What if a user sends the screenshots of the fingerprint over, say, a medium decided in the chat on the fly? Unless the attacker can MITM every service they're using, you could just /tweet/ the fingerprints, for instance, and check on a different machine for a matching key. Or what if the screenshot is tampered with by hand, like signing the center of the square with a finger-drawn, transparent signature? How would an attacker replicate that?
Of course, the "supervillain" model still works and enough computation time and power might find it, but there are a multitude of ways to ensure that the secrets match on everyone's devices.
Every one of these new chat systems has similar problems with key authentication. They fail in similar ways and try to solve the problem in similar and different ways. This article's contribution, while very valid, has a "greater than thou" tone that gets on my nerves. That tone seems to be a constant whenever people talk about Telegram, and nobody really investigates the dramatic headlines authors claim. Is it that we just hate anything engineered in Russia or China? It's boring already.
Do you have a data point to suggest real bias against the geographic engineering behind Telegram?
All the substantial warnings I have read against it revolve around its lack of a respected crypto team (in fact, not having any experienced crypto team and conflating a bunch of maths PhDs with a real crypto team), not using acknowledged crypto design best practices (and flagrantly ignoring them for counter-intuitive designs), and most importantly, using a contest instead of a real audit.
Contest != audit. An audit has a specialized team of cryptographers who try to break the application in any scenario. A contest has restrictive scope that sets up a strawman and doesn't attract real, serious crypto auditing. As an example, this article's MITM doesn't qualify despite being a real attack.
A contest means that you have no real knowledge of whether the crypto is broken if nobody succeeds. An audit means you can be significantly closer to certain. You might already know this, but consider that if someone hosts a contest instead of an audit, and has all the other issues stated above, they will be met with hostility.
Valid criticism, but I feel the items you mention get blown out of proportion. The Cryptocat team had similar areas where it was lacking and yet did not suffer the same backlash. This could be because they never ran a competition and instead worked to find a way to fund an audit several years later.
The data point(s) that lead me to wonder about a geographic bias have to do with the tone of prior analyses of Telegram I've read over the past year, this article included. And also a general, very male, arrogance in the security community that seems to constantly search for any reason to ridicule. I will have to go back and trace the history of my perception to find the tone I mention, which I can do tomorrow, if you likewise can provide a data point for your understanding that they flagrantly ignored crypto design best practices.
In watching the evolution of Pond, Telegram and TextSecure we see that to solve security or usability problems designers are required to venture to some degree into uncharted territory, away from best practices. So I question the flagrant claim.
But I agree contests are just marketing ploys, little else. If Telegram wanted to "claim" otherwise they would need to start with paying the analysts from this article the full bounty.
You might be the first person I've ever heard suggest that Cryptocat didn't have a "backlash". Cryptocat comes pretty close to being reviled. It's 10x more so in private conversations between technical experts, but it's also very visible publicly.
(I think the project deserves the flak it takes, and continues to endanger its users today).
The "Russia" thing is just dust people throw into the air to thwart reasoned discussion. People don't like Telegram because it is prima facie cryptographically incompetent. There's no one resource that sums up all the brokenness and weirdness --- certainly not Alex's post here, which concerns itself with a specific bug he and Juliano found. You have to collect all the data and paint the picture yourself.
You're always present when crypto comes up and I honestly usually _wait_ for that opinion.
Now, let's ignore Telegram for a second. I haven't installed it, ever, but the feature set seems compelling and the UI (screenshots, demos) is nice.
Would you agree that the Better Stuff™ is just less usable and less accessible right now? Is there a way we can fix that?
If Telegram gets the UX right, but fails in the underlying implementation, can we take a decent implementation and fix the UX issues?
See, Moxie's solution just plain doesn't work. I'm sure it's sound and great, but .. it uses a phone number as the identifier (like WhatsApp, and that's not something I'd like to be compared with). What about my laptop or tablet? What about cross-platform solutions? I'm trying to migrate to FxOS as soon as they fix some mail related shortcomings. There seems to be a Telegram client. No (official) WhatsApp client and I doubt that most of the 'better' alternatives offer something here.
If we distill this post into one line: How can we have a decent user experience, without throwing out privacy and security?
What would be the best candidate to start from, because .. all of them suck right now?
I'm just going to be honest, because the honest answer is simple.
It is not my job, or anyone else's, to square the circle of sound cryptography with good UX. Advice not to use broken crypto doesn't become bad advice simply because the good crypto options have crappy UX. If you care about security, good crypto trumps bad UX. If you don't care about security, just use Google Chat. Seriously: you are probably better off even from a security perspective using Google Chat than you are by using fly-by-night crypto.
If you're asking, "why don't more of the people who strongly believe we need better encrypted messaging contribute UX work to the good crypto projects like TextSecure instead of inventing their own broken systems?", I'm as confused as you are by that phenomenon.
This wasn't directed towards me, but I'll participate.
>> Now, let's ignore Telegram for a second. I haven't installed it, ever, but the feature set seems compelling and the UI (screenshots, demos) is nice.
Agree. I think Telegram has a nice design.
>> If we distill this post into one line: How can we have a decent user experience, without throwing out privacy and security?
Here is the thing: this is relatively new ground. The technology for secure messaging systems has been around for much longer, but it wasn't as popular or fashionable.
You can and will have secure messaging built on solid crypto with an attractive user interface and available everywhere. That's Moxie's long-term goal.
But this is how you get there (choose your adventure):
1. Build the world's sexiest user interface for secure messaging apps and focus on cross-platform compatibility, while building it on top of an okay crypto system (or worse, a weird custom one that ignores industry practices). Once you have mindshare, gradually improve your crypto (Telegram is demonstrably not doing this by conflating a contest with an audit, but that's another comment). You'll start off with a lot of hype and users, but not much respect from the information security community because, well, you haven't earned it yet.
2. Take a respected crypto team, build out the world's best crypto system for a secure messaging app, leveraging both established best practices and professional peer review. Focus on building a really good prototype in one area with an acceptably attractive and functional user interface, putting other things on the backburner in order to finish a core product.
3. Go for the hail mary and try to build out the most amazing crypto system with the most amazing user interface and the most amazing features running on everything from iOS 4.4 to bleeding edge Debian to please everyone. In a year, burn out and die off because you almost certainly don't have the Herculean development resources this would require in completely orthogonal disciplines.
Which of these tracks sounds most like what successful startups do?
I acknowledge your point that no offering is perfect right now. But TextSecure focuses on the right things.
If I could sum this up entirely: nothing is perfect (yet), but TextSecure is the best offering on the market for a secure messaging application, and if what you want is a secure messaging application, start from solid crypto and work your way to everything else, not from a sexy user interface to okay/passing crypto.
Telegram might very well have the best of intentions in mind and it might have solid crypto under the hood. But that's the sort of thing you consider false until proven true, and so far their behavior is similar to a company that promises a fantastic new service in financial trading but prioritizes ubiquity and design without ever auditing the core trading system itself.
So, you're basically saying TextSecure is the real thing (unsurprisingly) and that it might just add the UX related stuff (multiple devices, devices that .. well .. don't have a phone number) later.
I'm still unconvinced.
I can follow your argument if we're talking crypto only (Disclaimer: I .. shouldn't talk about crypto). So, ignoring Telegram again, TextSecure is putting encryption first. Great. Except that I don't see a way to remove the base assumption here. "Your identifier is a mobile number".
That is absolutely crap. Both for UX reasons (see above and up-thread, devices that I own that cannot claim my mobile number) and .. because I don't want to share my mobile number for random IM chats.
Again: Yes, my mobile number _might_ uniquely identify me. No, that is not a cool 'user name' in any sort of messaging app.
I fail to understand how TextSecure can go from this fatal and flawed assumption and .. fix that. It's broken. The encryption might be sound, but the idea is crap.
Granted, I'm not the best programmer and far from a crypto expert, but .. it seems to me as if changing this fundamental assumption would be hard to do for TextSecure. Their partnership with WhatsApp makes it worse, because those guys do the same insane bullshit and just tell your mom, your ex and your aunt twice removed that you're now available to chat on WhatsApp.
Let me state it differently: I have TextSecure installed. I cannot run it. It is, at this point, utterly unusable. I wouldn't know how I could market this to friends or family, because it is unusable.
How can we fix that? How can we - again, I've never installed or used that one, but it's the thread's topic right now - approximate Telegram?
I would have to agree with you on that one - TextSecure seems like a great option, with the exception of its reliance on mobile numbers. This is mostly because they're centrally controlled, and at no point do you actually "own" your mobile number.
Changing this wouldn't be a big deal, as TextSecure doesn't really use mobile numbers for anything important as far as I can tell. I get the feeling it was more intended for ease of use among the average person, but it is frustrating for someone who would rather get rid of mobile numbers altogether.
Weak protocols or not, what bothers me most about Telegram is that it doesn't use end to end encryption by default - yet still has the nerve to say it's the most secure app on the planet. I find that incredibly misleading towards the users, and many have fallen for that false advertising, which only makes me more furious about it.
So I doubt it has anything to do with geography, and more with Telegram's own arrogance about their app, which 1) doesn't have a proper crypto team, just "mathematicians", 2) doesn't use crypto designs recognized by most cryptographers as solid, and 3) doesn't encrypt messages end to end for 99 percent of its users.
TextSecure, Threema and Pond encrypt everything end to end by default and use solid designs to do so as well.
The tone is like that because people have been saying that Telegram's protocol was constructed seemingly arbitrarily and includes outdated and mostly-unstudied constructs. No serious cryptographer would build a protocol like that, and many well-known cryptographers have pointed that out.
The Telegram team has been hostile to legitimate criticism, and has hosted cracking "contests" as if that in any way demonstrates the security of their protocol. When you roll your own protocol from practically unused primitives, commit most sins in the crypto book, dodge criticism, store your users' messages, and refuse to open-source the server-side code, you deserve that tone.
When news comes around about Telegram it typically includes the criticism of their team being hostile. I just haven't seen it though. Instead I've seen their team respond to posts with gratitude for the critique and disclosure. I think this is where I get perplexed on this topic. Though maybe I just missed a particularly hostile event.
All the criticism is just rather dodgy, though. What other secure chat system of late provides the server code? Only Pond comes to mind. What other chat systems provide an OSS client? Only Pond, TextSecure and Telegram.
Cracking contests are either absurd or just asking for people to ridicule you. So I understand part of the animosity.
Many of the new secure chat systems have homecooked elements. This is because they are trying to solve usability problems that have not been thoroughly dealt with before.
"If there was a way to efficiently cycle through DH parameters, an active man-in-the-middle attacker could spoof the fingerprint."
Isn't this a rather big if? In my limited understanding of the math involved, the whole reason DHE is still secure is because this isn't the case. Which means an attacker couldn't just use any old SHA-1 collision; it would have to correlate to a valid set of DH params, which would have an effect on the computational complexity, right?
This is not an attack on DH, it is an attack on the verification mechanism. The trick is that the attacker does not need to break DH; he just needs to find two DH parameters similar enough that the verification still works.
I understand that this is not an attack on DH. I have not analyzed Telegram, but the statement I quoted seems to indicate it would need to be able to quickly generate many DH params. That doesn't seem to be a computationally trivial task.
http://security.stackexchange.com/questions/51129/can-you-ge...
It seems like for this attack to be realistic you would need the math underlying DH to be broken somehow. At which point I think there are many applications whose compromised security would be far more disturbing than the security of Telegram.
Again I am asking for clarification because I may be misunderstanding the use of the quoted text and I have not looked at the code myself.
As I read it, this analysis seems akin to someone stating that a wall provides no security if we lived in an alternative universe where solids could freely pass through one another. That statement may be true, but who cares; that universe wouldn't continue using walls for security measures anyway. Excuse my metaphor; I hope it clears up what I was asking about.
As far as I understand the attack, the attacker brute forces his own DH secret, not the DH parameters. So the attacker has the parameters for both connections, gets the public keys from both chat partners, and then brute forces two secrets such that the hashes of the resulting DH shared secrets match. For this the attacker needs 'only' to brute force the keys, not the parameters.
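To make that concrete, here's a toy sketch of the idea (my own illustration with made-up tiny parameters, not Telegram's code or MTProto's actual group): a man in the middle who runs separate DH exchanges with each victim brute-forces his own secrets until the truncated fingerprints shown to the two sides collide. With an n-bit fingerprint a birthday-style search needs roughly 2^(n/2) attempts, which is where a figure like 2^64 comes from.

    # Toy sketch, not Telegram's actual code. P, G and FPR_BITS are made-up
    # small values so the script finishes quickly; the real attack targets a
    # much larger fingerprint with ~2^64 birthday work.
    import hashlib
    import secrets

    P = 2 ** 127 - 1   # toy prime modulus (assumption, not MTProto's real group)
    G = 5              # toy generator
    FPR_BITS = 20      # toy fingerprint size

    def fingerprint(shared_secret: int) -> int:
        """Truncated SHA-1 of the DH shared secret, standing in for the key visualization."""
        digest = hashlib.sha1(shared_secret.to_bytes(16, "big")).digest()
        return int.from_bytes(digest, "big") >> (160 - FPR_BITS)

    def mitm_collide(pub_alice: int, pub_bob: int):
        """Birthday search: find attacker secrets x, y so both victims see the same fingerprint."""
        seen = {}  # fingerprint of Alice-side shared secret -> attacker secret x
        while True:
            x = secrets.randbelow(P - 2) + 1
            seen[fingerprint(pow(pub_alice, x, P))] = x
            y = secrets.randbelow(P - 2) + 1
            fp = fingerprint(pow(pub_bob, y, P))
            if fp in seen:                      # expected after ~2^(FPR_BITS/2) tries
                return seen[fp], y, fp

    if __name__ == "__main__":
        a = secrets.randbelow(P - 2) + 1        # Alice's real secret
        b = secrets.randbelow(P - 2) + 1        # Bob's real secret
        x, y, fp = mitm_collide(pow(G, a, P), pow(G, b, P))
        print(f"colliding fingerprint: {fp:#x}")

The point is that nothing about DH itself gets broken here: the search is only over the attacker's own exponents, and its cost is set entirely by the length of the fingerprint being compared.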
Wrong, the whole point of end-to-end encryption is that security doesn't rely on intermediaries working as expected; they're only transports. Which is why having Telegram's server code wouldn't be useful from a security POV.
What it could be used for instead is running alternate "pods" to distribute load over multiple servers.
Isn't all this pointless when Telegram could just MITM everyone at the will of any agency, as the client is closed source?
Unless the whole client code is audited by third parties (and it is ensured that app store version = audited version), this whole exchange of keys with colorful qr code could be a charade.
I believe the OP was referring to the fact that you have no assurance the application binary you receive on mobile is the same as the public open source version. This goes hand in hand with a third party audit.
Whether or not it's open source doesn't mean much if you can't audit it. Open source counts if you're a developer and can read the code to see that a piece of software isn't e.g. calling home with user credentials in cleartext and storing them or worse.
But this is a custom cryptosystem. Most people can't audit it, and the people who can are not going to use their (extremely lucrative) skillset on the software without personal investment like "I want to use this."
Are they going to pay a consulting firm like Riscure to completely audit the codebase? No? Then it is insecure until proven secure. Open source isn't good enough.
Okay, this problem of binary verification you should blame on Google and the Android app ecosystem. At least on Android, if you install the APK yourself with Telegram and TextSecure, you know exactly what you are running. For everything else, you have no clue.
> if you install the apk yourself [...] you know exactly what you are running
No, you don't. You know what you're running only if you compiled the sources or if you can compare the binary signature with the build of someone you trust (you or others that built from sources).
Indeed. Or to go further, you can't trust even compiled sources you place on a dev-mode device if you didn't also compile the entire OS and drivers. Still, that's not Telegram/TextSecure's fault.
I think that since these programs are all about security, they do have some obligation to distribute their binaries securely, more than a torchlight app does. For example, I'd be more at ease if the binary distributed by Telegram had the same signature present in a document signed by whoever did the audit (maybe it's even like that, I don't know).
Of course I still have to trust my OS, but if the OS is compromised I have utterly lost.
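For what it's worth, a minimal sketch of the kind of check you're wishing for could look like the following (my own assumptions throughout: the file name and the expected digest are hypothetical placeholders, and as far as I know nobody publishes such a signed document for Telegram today). You hash the APK you were actually served and compare it against a digest published in a document signed by the auditors.

    # Hypothetical sketch: compare the served APK against an audited digest.
    # EXPECTED_SHA256 and "Telegram.apk" are placeholders, not real published values.
    import hashlib

    EXPECTED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

    def sha256_of(path: str) -> str:
        """Stream the file and return its SHA-256 hex digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        digest = sha256_of("Telegram.apk")   # hypothetical local copy of the store APK
        print("match" if digest == EXPECTED_SHA256 else f"MISMATCH: {digest}")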
The scenario that merits trusting the OS, or app store, is one that assumes you are not individually a target. This means you trust Apple/Google and the US/EU. I don't know, I could see there being reasons for scenarios that fit this use case, but I have trouble bringing any to mind other than playing with crypto to feel better about yourself, or just muddying the lines for the FBI. If you're an activist, you most definitely fall outside this scenario and shouldn't be using these phones.
So the attack is due to the fingerprint of the key being smaller than the actual key, and it therefore being theoretically possible to find another key with the same fingerprint.
Don't TextSecure and Threema have the exact same problem? Both show a fingerprint to the user instead of the actual key.
"This past spring, Juliano Rizzo (@julianor) and I came up with a cryptographic attack on Telegram's MTProto "secret" chat communications which can be performed in O(2^64) time."
Using asymptotic notation without an argument (usually "n") makes me cringe. O(2^64) = O(1)
Maybe it's the historian in me, but I found your numbers really, really funny!
>Even if you find 10,000 computers capable of 1 million checks per second (a huge "if"), it would still take 58 years.
I'm sorry if I'm missing the context, but it sounds straight out of 1958! Like when a PC takes up a room, costs $2.9 million ($23.1 million in 2014 dollars), and comes with a processing speed of around 100 kflop/s [1]
Like, "even if you find TEN THOUSAND" computers - like where would anyone find ten thousand computers in 2014
(Amazon's and Microsoft's cloud farms only have like a million servers instantly available on their clouds).... capable of a MILLION checks per second! That's a full Megahertz!
It would still take you 58 years :)
:-D Sorry!! It would have been a good argument in 1958 though :-D
I was specifically referring to the OP's awed overtones of 10,000 x 1 MHz resulting in 58 years, which I found really funny.
The idea of 2^64 being inherently a large number... when a single graphics card costing a couple of hundred incidentally goes through 2^64 flops all by itself every few weeks.
There's nothing expensive about 2^64 as such. I mean it's expensive in the sense of, we're talking dozens of dollars here :)
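For what it's worth, the back-of-the-envelope arithmetic behind both figures does check out (my own numbers below; the 5 TFLOP/s card is an assumption, and a floating-point op is of course not the same thing as one fingerprint check, so the GPU comparison only bounds the optimism):

    # Rough arithmetic behind the two figures being mocked above.
    SECONDS_PER_YEAR = 365 * 24 * 3600

    work = 2 ** 64                                # total checks in the quoted attack
    cluster_rate = 10_000 * 1_000_000             # 10,000 machines at 1M checks/s
    years = work / cluster_rate / SECONDS_PER_YEAR
    print(f"cluster: {years:.1f} years")          # ~58.5 years, matching the quote

    gpu_flops = 5e12                              # assumed ~5 TFLOP/s consumer GPU
    days = work / gpu_flops / (24 * 3600)
    print(f"single GPU, raw FLOPs: {days:.0f} days")   # ~43 days, i.e. "a few weeks"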