Signal Server code on GitHub is up to date again (github.com/signalapp)
272 points by domano on April 7, 2021 | 194 comments



A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex. Here's the deal:

- Whether or not Signal's server is open source has nothing to do with security. Signal's security rests on the user's knowledge that the open source client is encrypting messages end to end. With that knowledge, the server code could be anything, and Signal Inc. would still not be able to read your messages. In fact, having the server code open source adds absolutely nothing to this security model, because no matter how open source and secure the server code might be, Signal Inc. could still be logging messages upstream of it. The security rests only upon the open source client code. The server is completely orthogonal to security.

- Signal's decision to keep early development of the MobileCoin feature set private was valid. Signal is not your weekend Node.js module with two stars on GitHub. When changes get made to the repo, they will be noticed. This might mess up their marketing plan, especially if they weren't even sure whether they were going to end up going live with the feature. Signal is playing in the big leagues, competing with messengers which have billions of dollars in marketing budget, will never ever be even the smallest amount open source, and are selling all your messages to the highest bidder. They can't afford to handicap themselves just to keep some guys on Hacker News happy.

- Signal's decision to keep development on the (private) master branch, instead of splitting the MobileCoin integration into a long-running feature branch, is a valid choice. It's a lot of work to keep a feature branch up to date over years, and to split every feature up into the public and non-public components which then get committed to separate branches. This would greatly affect their architecture and slow down shipping for no benefit, given that, as argued above, the server's being open source is orthogonal to security.


> Whether or not Signal's server is open source has nothing to do with security

This is true only when you are exclusively concerned about your messages' content but not about the metadata. As we all know, though, the metadata is the valuable stuff.

There is a second reason it is wrong, though: These days, lots of actual user data (i.e. != metadata) gets uploaded to the Signal servers[0] and encrypted with the user's Signal PIN (modulo some key derivation function). Unfortunately, many users choose an insecure PIN, not a passphrase with lots of entropy, so the derived encryption key isn't particularly strong. (IMO it doesn't help that it's called a PIN. They should call it an "ultra-secure master passphrase" instead.) This is where a technology called Intel SGX comes into play: It provides remote attestation that the code running on the servers is the real deal, i.e. the trusted and verified code, and not the code with an added backdoor. So yes, the server code does need to be published and verified.
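
To make the weak-PIN problem concrete: a KDF slows each guess down, but it can't grow the search space, so a 4-digit PIN leaves only 10^4 candidates and is trivially brute-forceable offline if the SGX guess limit ever falls away. A minimal sketch (hypothetical helper names; think Argon2 or similar for the KDF):

    // Hypothetical offline brute force against a 4-digit PIN.
    // deriveKey() stands in for the KDF and decrypts() for a trial
    // decryption of the stolen ciphertext.
    for (int pin = 0; pin <= 9999; pin++) {
      byte[] candidate = deriveKey(String.format("%04d", pin));
      if (decrypts(ciphertext, candidate)) {
        System.out.println("PIN recovered: " + pin);
        break;
      }
    }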

Finally, let's not forget the fact that SGX doesn't seem particularly secure, either[1], so it's even more important that the Signal developers be open about the server code.

[0]: https://signal.org/blog/secure-value-recovery/

[1]: https://blog.cryptographyengineering.com/2020/07/10/a-few-th...


Addendum: Out of pure interest I just did a deep dive into the Signal-Android repository and tried to figure out where exactly the SGX remote attestation happens. I figured that somewhere in the app there should be a hash or something of the code running on the servers.

Unfortunately, `rg -i SGX` only yielded the following two pieces of code:

https://github.com/signalapp/Signal-Android/blob/master/libs...

https://github.com/signalapp/Signal-Android/blob/master/libs...

No immediate sign of a fixed hash. Instead, it looks like the code only verifies the certificate chain of some signature? How does this help if we want to verify the server is running a specific version of the code and we cannot trust the certificate issuer (whether it's Intel or Signal)?

I'm probably (hopefully) wrong here, so maybe someone else who's more familiar with the code could chime in here and explain this to me? :)


The hash of the code that is running in the enclave is called "MRENCLAVE" in SGX.

During remote attestation, the prover (here, Signal's server) creates a "quote" that proves it is running a genuine enclave. The quote also includes the MRENCLAVE value.

It sends the quote to the verifier (here, Signal-Android), which in turn sends it to the Intel Attestation Service (IAS). IAS verifies the quote, then signs the content of the quote, thus signing the MRENCLAVE value. The digital signature is sent back to the verifier.

Assuming that the verifier trusts IAS's public key (e.g., through a certificate), it can verify the digital signature and thus trust that the MRENCLAVE value is valid.
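
Schematically, the verifier's side boils down to something like this (a rough sketch, not Signal's actual code; helper names are made up):

    // 1. Check IAS's signature over the attestation response body
    //    (IAS's certificate chains up to a trusted Intel CA).
    if (!verifySignature(iasCertificate.getPublicKey(), responseBody, signature)) {
      throw new SecurityException("invalid IAS signature");
    }
    // 2. Extract the quote and compare its MRENCLAVE against the
    //    value baked into the client at build time.
    byte[] mrenclave = parseQuote(responseBody).getMrEnclave();
    if (!MessageDigest.isEqual(mrenclave, EXPECTED_MRENCLAVE)) {
      throw new SecurityException("enclave is not running the expected code");
    }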

The code where the verifier is verifying the IAS signature is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

The code where the MRENCLAVE value is checked is here: https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

Hope this helps!


Thank you, this indeed helps a lot and I've now found the hashes![1] Moreover, thank you for providing me with some additional keywords that I could google for – this made it much easier to follow the SGX 101 "book"[0] and I think I've got a much better grasp now on how SGX works!

[0]: https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation#r...

[1]: For other people interested in this matter: I've followed the path of the MRENCLAVE variable to the very end,

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

where it gets injected by the build config. The build config, in turn, is available here:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

(The MRENCLAVE values can be found around line 120.)


Correction: The MRENCLAVE value is the one on line 123; cf. the signature of the `KbsEnclave` constructor -> https://github.com/signalapp/Signal-Android/blob/7394b4ac277...


Matthew Green's tweet[0] has now sent me down the rabbit hole again as I was trying to figure out where the IAS's certificate gets verified. (Without verification the IAS's attestation that the enclave's quote carries the correct signature would obviously be worthless.) I wanted to find out whether it gets pinned or whether Signal trusts a CA. It seems to be the latter:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

For completeness, let's also have a look at where the CA certificate(s) come(s) from. The PKIX parameters[1] are retrieved from the trustStore aka iasKeyStore which, as we follow the rabbit hole back up, gets instantiated here:

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

As we can see, the input data comes from

https://github.com/signalapp/Signal-Android/blob/6ddfbcb9451...

`R.raw.ias` in line 20 refers to the file `app/src/main/res/raw/ias.store` in the repository and, as we can see from line 25, it's encrypted with the password "whisper" (which seems weird, but it looks like this is a requirement[2] of the API). I don't have time to look at the file right now but it will probably (hopefully) contain only[3] Intel's CA certificate and not an even broader one. At least this is somewhat suggested by the link I posted earlier:

https://github.com/signalapp/Signal-Android/blob/master/libs...
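
For illustration, the loading step boils down to the standard Java APIs (a sketch, assuming a BKS keystore as is usual on Android; not the exact Signal code):

    // Load the bundled ias.store with the fixed password "whisper",
    // then use its certificate(s) as PKIX trust anchors.
    KeyStore trustStore = KeyStore.getInstance("BKS");
    try (InputStream in = context.getResources().openRawResource(R.raw.ias)) {
      trustStore.load(in, "whisper".toCharArray());
    }
    PKIXParameters pkixParameters = new PKIXParameters(trustStore);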

In any case, it seems clear that the IAS's certificate itself doesn't get pinned. Not that it really matters at this point: whether the certificate gets pinned or not, an attacker would only need access to the IAS server to steal its private key anyway. Then again, trusting a CA (and thus any certificate derived from it) obviously widens the attack surface. OTOH it might be that Intel runs a large array of IAS servers that come and go, with no guarantee on Intel's part that a pinned certificate will still be valid tomorrow. In that case, the Signal developers obviously can't do anything about it.

[0]: https://twitter.com/matthew_d_green/status/13802817973139742...

[1]: https://docs.oracle.com/javase/8/docs/api/index.html?java/se...

[2]: https://stackoverflow.com/questions/4065379/how-to-create-a-...

[3]: If it contains multiple CA certificates, each one of them will be trusted, compare [1].


Addendum to the addendum: Whether there's a fixed hash inside the Signal app or not, here's one thing that crossed my mind last night that I have yet to understand:

Let's say we have a Signal-Android client C, and the Signal developers are running two Signal servers, A and B.

Suppose server A is running a publicly verified version of Signal-Server inside an SGX enclave, i.e. the source code is available on GitHub and has been audited, and server B is a rogue server, running a version of Signal-Server that comes with a backdoor. Server B is not running inside an SGX enclave but since it was set up by the Signal developers (or they were forced to do so) it does have the Signal TLS certificates needed to impersonate a legitimate Signal server (leaving aside SGX for a second). To simplify things, let's assume both servers' IPs are hard-coded in the Signal app and the client simply picks one at random.

Now suppose C connects to B to store its c2 value[0] and expects the server to return a remote attestation signature along with the response. What is stopping server B then from forwarding the client's request to A (in its original, encrypted and signed form), taking A's response (including the remote attestation signature) and sending it back to C? That way, server B could get its hands on the crucial secret value c2 and, as a consequence, later on brute-force the client's Signal PIN, without C ever noticing that B is not running the verified version of Signal-Server.

What am I missing here?

Obviously, Signal's cloud infrastructure is much more complicated than that, see [0], so the above example has to be adapted accordingly. In particular, according to the blog post, clients do remote attestation with certain "frontend servers" and behind the frontend servers there are a number of Raft nodes and they all do remote attestation with one another. So the real-life scenario would be a bit more complicated but I wanted to keep it simple. The point, in any case, is this: Since the Signal developers are in possession of all relevant TLS certificates and are also in control of the infrastructure, they can always MITM any of their legitimate endpoints (where the incoming TLS requests from clients get decrypted) and put a rogue server in between.

One possible way out might be to generate the TLS keys inside the SGX enclave, extract the public key through some public interface while keeping the private key in the encrypted RAM. This way, the public key can still be baked into the client apps but the private key cannot be used for attacks like the one above. However, for this the clients would once again need to know the code running on the servers and do remote attestation, which brings us back to my previous question – where in Signal-Android is that hash of the server code[1]?

[0]: https://signal.org/blog/secure-value-recovery/

[1]: More precisely, the code of the frontend enclave, since the blog post[0] states that it's the frontend servers that clients do the TLS handshake with:

> We also wanted to offload the client handshake and request validation process to stateless frontend enclaves that are designed to be disposable.


TLS is used, but there is another layer of end-to-end encryption from the client to inside the enclave. Your MITM server B can decrypt the TLS layer, but still can't see the actual traffic.


Just came back to post this but you beat me to it haha. Thank you! :) I just looked at the SGX 101 book and found the relevant piece: Client and enclave are basically doing a DH key exchange. https://sgx101.gitbook.io/sgx101/sgx-bootstrap/attestation#s...
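
If I understand the SGX 101 flow correctly, the reason the relay attack fails is that the enclave's ephemeral DH public key is bound into the attested quote (a hash of it goes into the quote's report data), so a relaying server can't swap in its own key. Roughly (a sketch with made-up helper names):

    // Verify the quote via IAS first, then check that the DH key
    // offered over the wire is the one the enclave committed to.
    Quote quote = verifyQuoteWithIas(attestationResponse);
    if (!MessageDigest.isEqual(quote.getReportData(), sha256(serverDhPublicKeyBytes))) {
      throw new SecurityException("DH key not bound to the enclave");
    }
    // Only now complete the key agreement. Server B can relay A's
    // quote, but it doesn't know the enclave's DH private key, so it
    // can't decrypt anything inside the resulting channel.
    KeyAgreement ka = KeyAgreement.getInstance("XDH");
    ka.init(clientPrivateKey);
    ka.doPhase(decodePublicKey(serverDhPublicKeyBytes), true);
    byte[] sharedSecret = ka.generateSecret();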


> These days, lots of actual user data (i.e. != metadata) gets uploaded to the Signal servers[0] and encrypted with the user's Signal PIN (modulo some key derivation function). Unfortunately, many users choose an insecure PIN, not a passphrase with lots of entropy, so the derived encryption key isn't particularly strong.

If I understand what you are saying and what Signal says, Signal anticipates this problem and provides a solution that is arguably optimal:

https://signal.org/blog/secure-value-recovery/

My (limited) understanding is that the master key consists of the user PIN plus c2, a 256-bit code generated by a secure RNG, and that the Signal client uses a key derivation function to maximize the master key's entropy. c2 is stored in SGX on Signal's servers. If the user PIN is sufficiently secure, c2's security won't matter - an attacker with c2 still can't bypass the PIN. If the PIN is not sufficiently secure, as often happens, c2 stored in SGX might be the most secure way to augment it while still keeping the data recoverable.
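
If I read the blog post right, the construction is roughly this (a hedged sketch with made-up helper names; the exact labels and KDF parameters are in the post):

    // Sketch of the Secure Value Recovery key derivation as I read it.
    byte[] stretchedPin = argon2(pin);                // slow KDF over the (weak) PIN
    byte[] c1 = hmacSha256(stretchedPin, "Master Key Encryption".getBytes());
    byte[] c2 = secureRandom(32);                     // 256-bit secret, stored via SGX
    byte[] masterKey = hmacSha256(c1, c2);            // recovering it needs BOTH parts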

I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.


> If I understand what you are saying and what Signal says, Signal anticipates this problem and provides a solution that is arguably optimal

Yep, this is what I meant when I said "This is where a technology called Intel SGX comes into play". :)

And you're right, SGX is better than nothing if you accept that people use insecure PINs. My argument mainly was that

- the UI is designed in the worst possible way and actually encourages people to choose a short insecure PIN instead of recommending a longer one. This means that security guarantees suddenly rest entirely on SGX.

- SGX requires the server code to be verified and published (which it wasn't until yesterday). Without verification, it's all pointless.

> uses a key derivation function to maximize the master key's entropy

Nitpick: Technically, the KDF is deterministic, so it cannot change the entropy and – as the article says – you could still brute-force short PINs (if it weren't for SGX).

> I'd love to hear from a security specialist regarding this scheme. I'm not one and I had only limited time to study the link above.

Have a look at link [1] in my previous comment. :)


SGX is just the processor pinky-swearing (signed with Intel keys) that everything is totally legit. Nation-state adversaries can and will take Intel's keys and lie.


SGX is also supposed to protect against Signal as a potential adversary, though, as well as against hackers. Or at least that's how I understood the blog article.


Focussing on whether the changes directly make things insecure is missing the point. Fundamentally this sort of security is about trust.

While it's nice to try to have Signal be resilient to attacks by the core team, there just aren't enough community-minded independent volunteer code reviewers to reliably catch them out. I doubt the Signal Foundation gets any significant volunteer effort, even from programmers who aren't security experts.

That means I need to decide if I trust the Signal Foundation. Shilling sketchy cryptocurrencies is indicative of loose morals, which makes me think I was wrong to trust them in the past.


> Shilling sketchy cryptocurrencies is indicative of loose morals, which makes me think I was wrong to trust them in the past.

Who decided it was sketchy?

The "I don't like change so I'm going to piss all over you" attitude is what sinks a lot good things.

How does Signal benefit from being a shill for this coin? Are they being paid by MOB or do they get a % of the cut?

So far all I've read are people screaming their heads off that MOB eats babies and how dare Signal stoop so low as to even fart in their general direction, but I have yet to see anyone explain why MOB is bad or how Signal is bad for giving MOB a platform.


Part of the problem is that at the moment any government trying to force Signal to break the e2e security model is clearly interfering with speech.

By incorporating cryptocurrency/payments, governments are being handed a massive lever to force Signal to comply with the financial monitoring requirements that governments have in place.

This has a negative impact on those of us who just wanted a secure communications platform.


Thank you, this is honestly the only clear and valid argument I've seen from anyone around the feature.

Everyone else is trying to pass off false information, doctored white papers, and all sort of conspiracy theories to support their dislike for the feature.


> How does Signal benefit from being a shill for this coin? Are they being paid by MOB or do they get a % of the cut?

The CEO of signal messenger LLC was/is the CTO of MOB.

See https://www.reddit.com/r/signal/comments/mm6nad/bought_mobil... and https://www.wired.com/story/signal-mobilecoin-payments-messa...


> The CEO of signal messenger LLC was/is the CTO of MOB.

But in Edit 4 of the Reddit post it says

> Also, the extended whitepaper wrongly cites Moxie as chief technology officer, while he is the technical advisor.

On top of that, this whitepaper that's being passed around is a forgery with 1.5 pages of factually incorrect information.

The Wired article you link isn't even really critical; it just matter-of-factly explains the feature and its backstory.

> Signal's choice of MobileCoin is no surprise for anyone watching the cryptocurrency's development since it launched in late 2017. Marlinspike has served as a paid technical adviser for the project since its inception, and he's worked with Goldbard to design MobileCoin's mechanics with a possible future integration into apps like Signal in mind. (Marlinspike notes, however, that neither he nor Signal own any MobileCoins.)

It seems like you're drawing conclusions from lies you read second- or third-hand and didn't bother to verify. The Reddit post you linked to would be received very differently depending on when you read it, since it's been edited numerous times with addendums refuting earlier claims. Only by reading from beginning to end, with all edits, can you start to see a clear picture. Even then, the picture I see is someone backpedaling on a lot of false claims they made.


Yea, I'm bearish on cryptocurrencies, but I think Moxie and his team have built up an incredible amount of goodwill in my book. Enough for me to hear out their solution before making a decision. I'm assuming they didn't write dogecoin2 or even a bitcoin clone. It will be interesting to learn about it.


You're apologizing for a project that has repeatedly damaged user trust with excuses.

These are "valid" reasons for keeping the source code private for a year? By whose book? Yours? Certainly not by mine. I wouldn't let any other business abscond from its promise to keep open source open source in spirit and practice, why would I let Signal?

This is some underhanded, sneaky maneuvering I'm more used to seeing from the Amazons and the Facebooks of the world. These are not the actions of an ethically Good organization. And as has already been demonstrated by Moxie in his lust for power, he's more than capable of deviance. On Wire vs Signal: "He claimed that we had copied his work and demanded that we either recreate it without looking at his code, or take a license from him and add his copyright header to our code. We explained that we have not copied his work. His behavior was concerning and went beyond a reasonable business exchange — he claimed to have recorded a phone call with me without my knowledge or consent, and he threatened to go public with information about alleged vulnerabilities in Wire’s implementation that he refused to identify." [1]

These are not the machinations of the crypto-idealist, scrappy underdog for justice we are painted by such publications as the New Yorker. This is some straight-up, moustache-twirling, cartoon-villain plotting.

So now I'm being sold on a business vision that was just so hot the public's eyes couldn't bear it? We're talking about a pre-mined cryptocurrency whose inventors are laughing all the way to the bank.

At least Pavel Durov of Telegram is honest with his users. At least we have Element doing their work in the open for all to see with the Matrix protocol. There are better, more ethical, less shady organizations out there who we can and ought to be putting our trust in, not this freakshow of a morally-compromised shambles.

[1] https://medium.com/@wireapp/axolotl-and-proteus-788519b186a7


Thanks for linking this, I had no idea this occurred.


Repeatedly? This is the first I'm aware of, what are the others?


> - Whether or not Signal's server is open source has nothing to do with security. [...] having the server code open source adds absolutely nothing to this security model, [...] The security rests only upon the open source client code. The server is completely orthogonal to security.

The issue a lot of people have with Signal is that your definition here of where security comes from is an extremely narrow & technical one, and many would rather look at security in a more holistic manner.

The problem with messaging security is that there are two ends, and individually we only control one of them. Ok, screenshotting & leaking your messages will always be a concern no matter what technology we develop, but the other challenge is just getting the other end to use Signal in the first place, and that's governed by the network effect of competitors.

Open Source is essential for security because one of the most fundamental security features we can possibly hope to gain is platform mobility. Signal doesn't offer any. If Signal gains mass adoption and the server changes, we're right back to our current security challenge: getting your contacts onto the new secure thing.


You're redefining the word "security" here to an incredibly expansive definition which includes all kinds of details about the ability for someone else to set up an interoperable service.


Yup. Security is hard.


But now the server code is there, so we now have this mobility, no?


Yes and no.

Signal is not actually designed with mobility in mind (in fact I would argue, based on Moxie's 36C3 talks, it was designed to be and continues to be persistently kept anti-mobility). That fact is independent of it being open- or closed-source.

However, if the server is open-source, it opens the door for future mobility in the event of org change. If it's closed-source, you get what's currently happening with WhatsApp.

In actuality, if we had something federated, with mobility pre-baked in, having a closed-source server would be less of a security-risk (the gp's comments on only needing to trust the client would apply more strongly since mobility removes the power to change from server maintainers)

Basically:

- with multi-server clients (e.g. Matrix/OMEMO), you have no dependency on any orgs' server, so their being open-source is less relevant (provided the protocol remains open—this can still go wrong, e.g. with GChat/FBMessenger's use of XMPP).

- with single-server clients (Telegram/WhatsApp/Signal), you are dependent on a single server, so that server being open-source is important to ensure the community can make changes in the event of org change.


So in principle we do have this mobility because you can run your own servers. Perhaps it is not all that unlikely that they will do a bridge to matrix.


You cannot currently run your own Signal server, no. That's what prevents mobility.

You are free to examine the source of theirs (if they choose to continue releasing it), but you cannot self-host.


If both the code and the server are open source then how come you can't run it?


If you checkout the client source, compile it, and install it on your own mobile device, you can then connect it to your own self-hosted server instance. However Signal's own server instance will then block your client (and there's no way to connect the client binaries they distribute to anything but their own server).

So you would have to then follow the above steps for any contacts you want to communicate with, distributing your own client to them. Signal devs have generally been extremely hostile toward anyone wishing to do this however.

The only way out of this situation would be if the Signal project itself was forked and people moved to that forked open-source multi-server client.


Ok, but should they then be forced to do things they don't want to do?

What I mean is: if Signal is not Element.io/Matrix, and the latter is better for freedom and openness, then one can agree with that. But the demand from people that Signal somehow owes them the ability to be like Matrix, be federated, etc., and how judgemental they are about it, is what rubs me the wrong way.


I've tried to approach this thread in good faith, as your earlier replies seemed genuinely curious/discussion oriented, but the "ok, but" tone is making them seem increasingly shill-like.

I don't think anyone's "demanding" or "forcing" anything here. We're simply describing a definition of what we consider desirable as a sustainable secure messaging option, and pointing out the specific reasons that Signal isn't currently living up to that definition.

Its maintainers are free to continue on their way, ignoring said definition.

Personally, my own comments are not targeted at Signal devs but rather at others who might consider using Signal thinking it provides certain guarantees when it doesn't.


Until they decide to go silent for another 11 months


Most of the popular chat-app space is not open source. What is it with Signal that people feel entitled to condemn it for not having the latest commits on github?


By silent, I don't just mean they held back commits. They were evasive about it the entire time. They could have explained and chose not to.

They don't owe me anything but I think it's a shame that the leading open source messenger app does such a poor job of communicating with its users and the larger open source community.


What is it with chat apps that people don't condemn them for being closed source? Imagine if GCC hid their changes for a year.


Sure, it would be nice if all software were open source, but to think you are entitled to it? Funny attitude.


There's plenty of writing on that issue [1]. It makes a lot of sense to think of people being actually entitled to certain rights, especially in domains with network effects.

Btw, the Signal Foundation is a non-profit organization that benefits from community goodwill based on an open-source ethos. So people are critical when its software is closed source.

[1] https://www.gnu.org/philosophy/free-sw.en.html


I don't think a piece on gnu.org qualifies as "plenty of writing" and for sure doesn't count as a basis for what you are entitled to :).


> I don't think a piece on gnu.org qualifies as "plenty of writing"

There are some links there to other pieces if you want to read more about it.

> for sure doesn't count as a basis for what you are entitled to

I'm not claiming that moral authority flows from the Gnu brand; rather, they provide some information and reasoning which people can use to come to their own conclusions.


Most if not all of the links point to themselves...

It's ok to think that in an ideal world it would be like that, but arguing as if you were entitled to the source because of it doesn't seem likely to persuade others. After all, if you aren't empathetic to reality, how can you expect others to be empathetic to you?


The reality is pretty diverse. Plenty of people use mostly or only free software. I certainly do.


...its software is open source.


The reason this story is on HN is that the source was previously missing.


> A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex.

Nope. It's a reaction to "who the f* asked for this in a messaging app?!".


Text isn't the only thing I want to be able to send to people. I wish there were a universal "send thing" api that could be implemented for text, images, money, whatever.


Like Matrix?


Unless you have your head thoroughly buried in the sand, you'd understand that all the major players allow people to send money AND people are using those platforms to send money.

When people evaluate a new messaging client, the minimum feature set required to be considered viable now includes sending money for a lot of the population.

* removed insult


> all the major players allow people to send money AND people are using those platforms to send money.

So if major players jump off a cliff, everyone should always follow?

> the minimum feature set required to be considered viable now includes sending money for a lot of the population.

No, thank you. Not where the actual banking system works.


Just because you don't like it or don't find it relevant in your local circle doesn't mean it is without value. Last time I traveled to Italy, hailing and paying for taxis was handled exclusively over OTT messaging.


Signal Foundation has legitimate self-serving strategic reasons to prefer such secrecy, sure.

But users also have legitimate reasons to want more transparency into both source-code & strategy.

Whether such secrecy best serves the users & the cause of private messaging is an open question.


Exactly!

It isn't about cryptocurrency at all. It's about trust.

The relationship between mobilecoin and signal isn't even clear as far as I can tell. You're just asked to look at an obvious cash grab and be willing to accept it as normal operating procedure, and then expected to trust your most private communications to the same individuals that won't reveal their true intentions.


>kneejerk HN "crypto bad" reflex

You make it sound as if that's a bad thing. Reflexes are beneficial when the threats are real. And the crypto“currency” multi-level marketing pyramid schemes are very real. They do nothing but induce greed, gambling, spam, and all-around toxic behavior. It's a digital cancer that needs to end.


> A lot of these comments are just manifestations of the kneejerk HN "crypto bad" reflex

The contrarian dynamic strikes again: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...


> Whether or not Signal's server is open source has nothing to do with security.

Wrong. Availability is a core component of security. Keeping your server implementation closed and only available from one single entity gives Signal a failing grade in my book.

Signal is great for one-off messaging but is a poor long-term solution for secure instant messaging.


> The security rests only upon the open source client code. The server is completely orthogonal to security.

For Android at least, builds are reproducible: https://signal.org/blog/reproducible-android/ (would be neat if there were one or more third-party CIs that also checked that the CI-built app reproduces the one on the Google Play Store – or maybe there already are?)
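
The check is conceptually simple: build the APK yourself and diff its contents against the store APK, ignoring the signing metadata (Signal ships an apkdiff script for this; below is a rough, simplified sketch of the same idea, not their actual tooling):

    import java.nio.file.Path;
    import java.security.MessageDigest;
    import java.util.*;
    import java.util.zip.*;

    // Hash every entry of an APK except META-INF/, which holds the
    // store signature we cannot reproduce ourselves.
    static Map<String, String> entryHashes(Path apk) throws Exception {
      Map<String, String> hashes = new TreeMap<>();
      try (ZipFile zip = new ZipFile(apk.toFile())) {
        for (ZipEntry e : Collections.list(zip.entries())) {
          if (e.getName().startsWith("META-INF/")) continue;
          byte[] d = MessageDigest.getInstance("SHA-256")
              .digest(zip.getInputStream(e).readAllBytes());
          hashes.put(e.getName(), Base64.getEncoder().encodeToString(d));
        }
      }
      return hashes;
    }
    // Builds match if entryHashes(myBuild).equals(entryHashes(storeApk)).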


While I find this effort laudable and consider it a valid security improvement, the fact that Signal is opposed to alternate clients (and even builds from other sources like F-Droid) opens another, orthogonal risk.

I doubt that many people rebuild the app at each update to check that the new binaries match the ones provided by their store. If, for example, the Play Store distributed a non-matching binary at large, some dedicated user would probably spot the issue.

However, with the Play Store (and Signal, though it's not even necessary for the following) being under US jurisdiction, any user not checking each update they receive is vulnerable to the famous NSL + gag order combo in case of a targeted attack. I recognize that this is probably something most people do not include in their threat model, but I'm still a bit dubious that the convenience of release management, and of not having to worry about interoperability, is worth accepting the risks of a single delivery channel, especially for what could be (and is widely thought to be) a completely secure IM solution. "Almost secure" is, frighteningly, the worst obstacle to "secure"...

I'm admittedly biased since I'm convinced that federation, multiple client/server implementations and multiple distribution channels are a requirement for a secure IM infrastructure (which is why my heart goes to Matrix nowadays).


That's pretty neat, I wasn't aware that was possible.


So it just took close to a year to dump thousands of private commits into the public repo! Is there an official response as to why they stopped sharing the code for so long and more importantly, why they started sharing it publicly again? Who gains what with the publication now? And seriously, why is it even relevant anymore?


The first commit that they omitted in April 2020 is related to the payment feature they just announced. So the two events coinciding (server code being published and payment feature being announced) might not have been a coincidence. They apparently didn't want to bother creating a private test server running a private fork of the server code, so they pushed their experiments to production and simply withheld the source code to prevent people from seeing the feature before an official announcement. They must have built test client apps, though, because I couldn't find any old commit mentioning payments in the client app's git log.

https://news.ycombinator.com/item?id=26718134


This leaves a very bad taste in my mouth. Unclear how much practical damage this caused (how many security analysts are using the Signal server source to look for vulns?) but this is damaging to the project's claims of transparency and trustworthiness.

It’s quite clear that this crypto integration provides a perverse incentive for the project that points in the opposite direction of security.


Forgive me if this is a stupid question, but how exactly is that the case?

It's been damaging to their claims of transparency for almost a year now, if anything this should be the first step in repairing that slight. How is dumping a year's worth of private work into your public repo somehow doing damage to their trustworthiness?


For one, security through obscurity is a thing. Depending on it as your primary "security measure" is stupid on all levels, but having it as part of your security is not a bad thing. Before, all someone could get was your chat history. Other than police, jilted lovers, and state actors, most likely no one gives a crap about that unless you are targeted as an individual. Adding money that might be accessible via Signal gives hackers more incentive to stop trying to hack something else and make a beeline for Signal. It also dilutes the Signal developers' efforts to make a better messaging app. And crypto in and of itself is questionable, but one that is 85% held by one entity waiting to liquidate has been questioned by many organizations as well. The people who own that will expect fair value for it and will, in essence, become billionaires several times over if this really comes to fruition.


You're right that the damage to trustworthiness was always there. (I.e. they did the damage when they stopped publishing their source code, and they compounded that damage the longer they declined to publish their code). My point was more that the damage now seems to be directly attributable to the new payments integration.

Prior to seeing this post, I was already concerned that adding a crypto/payments integration would damage the Signal project, and this appears to be an immediate example of the kind of harms/perverse incentives I was concerned about.

(A counterargument to my theory here would perhaps be "Signal was always doing stuff like declining to publish their server code even prior to the payments integration", I'm not familiar enough with the history of the project to know the details there.)


Reading the other article on HN definitely helped me understand more. I think really it comes down to me not understanding why they had so much trustworthiness to begin with.

They've been obscuring their code for about a year and even then, it's not like Signal has always come out and said "we love the passion our fellow developers have for our commitment to privacy and security". They just let people sell their relatives on that promise and waited until they had a massive userbase to start monetizing their platform.

Thanks for your reply, I just wonder where all this trustworthiness has been coming from for the last 12 months while they've been quietly working on the platform without publishing any changes. It feels like a beta tester for a game being mad that there were lootboxes in the full release of the game when they weren't in the beta. Even if you didn't know they were coming, you had to assume something like it was inevitable given enough traction.


Signal's choices never really felt right, as their justifications tended towards authoritarian paternalism - e.g. willful reliance on Play services, keeping it out of F-Droid (which, while flawed as Signal pointed out, seems to be the best we currently have), bottleneck centralized servers, and phone numbers as primary identifiers (?!).

But the standard Free Software development/distribution model does lack in some areas. And so Signal got a bunch of community leeway for going against the grain, in the hopes that a fresh approach would somehow bear fruit.

We're now apparently seeing some of the fruit from that approach.


I don't agree with adding cryptocurrencies, but I was very sympathetic to the play services argument. Android is very difficult to program for, and it's even more difficult without play services.

For notifications, the alternatives are noticeably worse (higher battery usage because you can't coordinate request timings with other apps, an annoying permanent notification), and the leakage is minimal. If you protect your encrypted packets from Google, the NSA will see them anyway.

Your custom implementation will be quite complicated, and if you only enable it for a small subset of your users it'll be a pain to debug.


I said willfully for a reason, as opposed to just reluctantly.

I agree about the sorry state of non-Google notifications on Android. I wish someone would make a common notification framework for the Free world that would be installed alongside system-level F-Droid. Although F-Droid Conversations and Element notifications do work fine for me, despite the purportedly worse battery life, I can understand not everyone wants to make the same choice.

However, I'm referencing more than the notifications issue. I recall an early thread from Signal where they touted the benefits of fully opting into the Google ecosystem - the gist was that Google has expended all of this effort on security and they wanted to take advantage of it to bring security to the masses. And that simply doesn't line up with my own threat model, in which Google is one of the most significant attackers.


> Signal where they touted the benefits of fully opting into the Google ecosystem - the gist was that Google has expended all of this effort on security and they wanted to take advantage of it to bring security to the masses

What exactly do they rely on Google for? They use them for their push notifications and they use some Google servers on the back end.

They do offer the app on the Play Store, as 99% of Android users get their apps that way, but Signal also offers app downloads from the Signal website if the user doesn't want to use the Play Store.


I wish I could find the original source. It was an early post by Moxie about how embracing the Google ecosystem would give security to the masses of users, rather than worrying about the technical crowd that wants to be free of Google.


The server being or not being secure is only important to the people who operate it. You can examine the client code and see that your messages are encrypted end to end. Signal's entire security model revolves around the idea that you don't need to trust the server.


There's no concern about metadata leakage?


Even if you have access to up-to-date source code, it doesn't guarantee anything at all: they could be running a completely different version if they so wished. I mean, this has just happened, yet this question kind of implies you'd still trust such an entity to run the server from the source code you have access to. I hope this collective illusion dies already.


True, neither the absence of an identified vuln in published source code, nor the absence of published source code can guarantee that you don't have vulns. And sure, a bad-faith operator can always back-door the server and run different code.

But, a good-faith operator can find and fix bugs faster if they operate in the open and in collaboration with the community. "Given enough eyeballs, all bugs are shallow" etc.


It was called out as recently as 4 weeks ago [0] and was voted to the front page, but then weighted out, possibly incorrectly, by mods (maybe because the top comment is dismissive of the concerns raised [1]?) before a discussion could flourish.

cc: @dang

[0] https://news.ycombinator.com/item?id=26345937

[1] The title is the only thing worth reading in this pile of speculation and hand waving.


Here's a response by MobileCoin folks:

> Signal had to verify that MobileCoin worked before exposing their users to the technology. That process took a long time because MobileCoin has lots of complicated moving parts.

> With respect to price, no one truly understands the market. It’s impossible to predict future price.

- https://twitter.com/mobilecoin/status/1379830618876338179

Reeks of utter BS. As the reply on this tweet says, features can be developed while being kept switched off with a flag.


> features can be developed while being kept switched off with a flag

But maybe you don't want everyone to know about all the features / announcements months in advance?


They already did this development privately. I don't think anyone has a problem with building out a new feature before it's announced. The problem people have, IMO understandably, is that they pushed this code to production servers instead of testing it privately.


> Is there an official response as to why they stopped sharing the code for so long

Not officially, but see https://news.ycombinator.com/item?id=26725117. They stopped publishing code when they started on the cryptocurrency integration.


Better question yet: did we ever get a full post-mortem of the six-day outage the service had? Other than hand-wavy statements about user subscriptions? What fixes were made or lessons learned?


The Signal outage was SIX DAYS?


(All the news I'm finding is that it was just one day.)


As I recall it was 1-2 days, yeah.


no it wasn't


My guess is that the whole public repo thing was just one employee's idea, nobody else at the company cared, and that employee either forgot about it or was too busy to do it for months.


I think it's proof that security (and privacy) doesn't matter. So it is very relevant. (As if telegram as competitor isn't enough proof.)

The entirety of the signal "stack" depends on the SGX enclave. The fact that no one, in all time, has bothered to notice that the running code is different than the published code, is telling.

There's actually a newer SGX exploit, and a related mitigation, that came to light at about the same time they released their discovery protocol. Those mitigations were never backported to the base Signal functionality. That no one audited and complained about this says quite a lot.

I've not looked at this code dump but perhaps the newer fixes finally made their way in. Or have been there all along.


> The fact that no one, in all time, has bothered to notice that the running code is different than the published code

It’s client apps who verify (via attestation) that the code inside an SGX enclave is what they expect it to be, and clients are open source.

> The entirety of the signal "stack" depends on the SGX enclave

Only private contact discovery depends on trusting SGX.


> It’s client apps who verify (via attestation) that the code inside an SGX enclave is what they expect it to be, and clients are open source.

If the attestation signature can be checked against the published enclave code, then we can know whether there's a match. So either there's a missing mitigation, which no one has ever complained about, or the running enclave code doesn't match the source, which also no one has ever complained about. Without independent audit there is no verification, and we have established that independent parties do not care.

> Only private contact discovery depends on trusting SGX.

uh, no. this is demonstrably and obviously wrong.


> uh, no. this is demonstrably and obviously wrong.

Yes? How?


Then please demonstrate.


"Signal Server code on GitHub is up to date again - now with a freshly added shitcoin!"


The addition of micropayments to Signal is discussed separately at https://news.ycombinator.com/item?id=26724237


It's obviously related. The implication is that they pushed code to Github just to gain public trust that can be leveraged to market their cryptocurrency.


I think you’re partly correct. I suspect they didn’t want to go public with the shitcoin until it was done.


That makes a lot of sense, though.

It was probably apparent to them that adding the new crypto payments feature would create at least some kind of community pushback.

Waiting until the feature is reasonably complete and can be judged on its merits is good from a business perspective.


It's also good, from a very personal business perspective, to have the crucial inside information that an obscure coin named "MobileCoin" (sitting at CoinMarketCap rank #2378) is about to be pumped HARD once the news about its inclusion in a popular messaging app hits the airwaves.

It's especially good to have this information while said coin is already publicly traded on some (albeit obscure) exchanges. Because that allows you, as an insider, to slowly accumulate a nicely sized position in that coin while it's still cheap. "MobileCoin" wasn't publicly traded until December of last year, there wasn't even a live blockchain until that month, so if it's true that the first commits in the Signal server repo hinting at the crypto payment plans were created shortly after the repo went silent in April 2020, it is obvious that keeping these commits secret until at least December 2020 was crucial to successfully realize these good personal business perspectives - a.k.a. insider trading, but nobody cares about it if it's crypto.


Obviously. Otherwise Signal employees would not be able to do shitcoin insider trading.


Is it though? We're already 6 days without a commit. Who's to say the history isn't frozen again until the next major release?


If you have a PhD, you might be able to verify from the client side that it does not matter. If you are into blockchain, there might be another (but very expensive) way to show a system can be trusted.

For normal development, I am advocating an always-auditable runtime that runs only public source code by design: https://observablehq.com/@endpointservices/serverless-cells

Before sending data to a URL, you can look up the source code first, as the URL encodes the source location.

There is always the risk I decided to embed a trojan in the runtime (despite it being open source). However, if I am a service provider for 100k customers built upon the idea of a transparent cloud, then compromising the trust of one customer would cause loss of business across all customers. Thus, from a game-theoretic perspective, our incentives should align.

I think running public source code, which does not preclude injecting secrets and keeping data private, is something that normal development teams can do. No PhDs necessary, just normal development.

Follow me on https://twitter.com/tomlarkworthy if you want to see this different way of approaching privacy: always auditable source available server-side implementations. You can trust services implemented this way are safe, because you can always see how they process data. Even if you cannot be bothered to audit their source, the sheer fact that someone can, inoculates you against bad faith implementations.

I am building a transparent cloud. Everything is encoded in public notebooks and runs open-source https://observablehq.com/collection/@endpointservices/servic... There are other benefits, like being able to fork my implementations and customize, but primarily I am doing this for trust through transparency reasons.


How do you prove the endpoint is running the code to which it links?


Simple but not 100% foolproof: you can mutate your source code and verify the changes propagate.

Note the endpoint does a DYNAMIC lookup of source code. So you can kinda reassure yourself the endpoint is executing dynamic code just by providing your own source code.

It might be more obvious the runtime does nothing much if you see the runtime https://github.com/endpointservices/serverlesscells

The clever bits that actually implement services are all in the notebooks.


> Simple but not 100% foolproof: you can mutate your source code and verify the changes propagate.

If I was evil, I wouldn't have a totally separate source tree and binary that I shipped; I'd have my CI process inject a patch file. As a result, everything would work as expected - including getting any changes from the public source code - but the created binaries would be backdoored.


Yeah, I can fix this with work, but just getting some users would be helpful first.



That doesn't seem to provide any meaningful indication the endpoint runs the code it claims. Can't I just create an evil endpoint that links to legit code?


No, the endpoint is shared across all customers; the service providers do not self-host, generally. The endpoint is the infra provider. Later I might try to code-sign that and open up the cloud console for visibility, but not short-term.


Nice, if only they could be so kind, after all this time, as to provide instructions on how to run it. I don't get why they have been gambling with their reputation on Hacker News this way by not releasing sources until there were a bunch of front-page posts about it.


If it looks like a duck, and quacks like a duck, it's probably a duck.


I read some speculation that the delay was to keep this objectionable crypto payment development under wraps until they were ready to launch.


Yep. I posted this on a different Signal HN submission, but the very next commit on April 22nd, 2020 was when they first began working on the integration.

https://github.com/signalapp/Signal-Server/commit/95f0ce1816...


Oh wow. That’s incredibly suspicious...


It could just be an arguably-legitimate desire to keep the hot new feature secret until the big announcement; this particular bit is... sub-optimal... but it doesn't seem like it needs to be nefarious.


Open source trust is a big selling point. Arguably more so than a new payment feature. Hard to understand the desire to hide it all...


Next time they will freeze code to keep NSA integration secret.


This might be a legitimate reason to keep the source code non-public temporarily. However, the communication strategy by Signal about this was horrible (or rather non-existent).

People in the user forum (https://community.signalusers.org/t/where-is-new-signal-serv...) and in other places on the internet were upset for months because the server wasn't being updated anymore. At the same time, Signal regularly tweeted that "all they do is 100% open source", even at a point in time when no source code had been released for almost a year.

Just 2 days ago this was getting picked up by some larger tech news platforms:

https://www.golem.de/news/crypto-messenger-signal-server-nic...

https://www.androidpolice.com/2021/04/06/it-looks-like-signa...

It's normal that Signal ignores its users, but apparently they didn't even reply to press inquiries about the source code. All it would have taken is a clear statement like "we're working on a cool new feature and will release the sources once that's ready, please bear with us". Instead, they left people speculating for months.

This communication strategy, combined with the cryptocurrency announcement, may cause serious harm to Signal's reputation.


The devious aspect to this is that nobody knew the development was happening so we couldn’t invest even if we wanted to.

It was kept under wraps for a grade A pump.


OTOH, announcing this development semi-privately on GitHub but not to the public at large (including the current MobileCoin owners) could be considered "insider trading", which is a criminal offense in the US.


Which is probably why they’re avoiding the SEC at all costs.


Kinda crazy that the Signal team doesn't GPG-sign their commits.


An example of how irrational hate can make smart people do stupid things, unfortunately.

Certain people on their team don't like the PGP standard despite the fact that it is mature, standardized, and proven to work well for code signing. When questioned about their reasoning, they'd usually deflect and criticize some aspect of PGP that is irrelevant to code signing.

In their minds, they believe it is better to rely on Git's broken SHA-1 fingerprints than to use PGP.


Thank Fefe for that: https://blog.fefe.de/?ts=9e9221ad (second update in the posting, entirely in German). Coincidence, or bad press covfefe? ;)


For updating the source? Fwiw I skimmed it and it doesn't say anything of the sort. But thanks for making me look at this, now I know which of my friends keep up with their blog to the minute. I've heard these arguments word for word a couple hours ago...


I am disappointed they are supporting MobileCoin yet it's not available for use in the US. I understand it has important privacy and security features, but there seems little point to it if it can't be used in most of the world (the US isn't that big, but it's important financially). There were some confusing comments that it could be supported one day, easily. I guessed that it wasn't supported because they somehow want to avoid US financial regulation. So what is the reason for no US-capable currencies here?


Is there any mechanism to validate that the code running on Signal's servers is the same as on Github?


I am curious how this could even possibly be done.

As far as my understanding goes, it's hardly possible to even verify that a compiled binary represents a faithfully executed representation of the source instructions, let alone that it will execute that way when run through a modern OS and CPU pipeline.

I would think the objective here is more about releasing server code that can be run independently in a way that 1) doesn't involve signal's infrastructure and 2) allows the client/server interactions to be audited in a way that trust of the server side is unnecessary, regardless of what code it may or may not be running.


There isn't, but people are working on getting us there. The first project that comes to mind is "System Transparency".

https://system-transparency.org/


I think the argument would be that with end-to-end encryption this is unnecessary, which is good because it's impossible.

There's a counter-argument that there is still useful metadata a server can glean from its users, but it's certainly minimised with a good protocol... like the Signal protocol.


Wait, how would end-to-end encryption help this problem at all? I agree that it is impossible (currently), but not sure how E2E helps anything?

E2E encryption only helps you verify WHO you are connecting to, not what they are doing with your connection once it is established.


Because the other end in E2E is your friend's phone, not a server. We call end-to-server encryption "in-flight" encryption.


Ah, ok I misunderstood


Because it doesn't matter what the server does in terms of message content.


That's basically the same problem as DRM, so no, you can't verify that someone is running only code you want them to run against data you gave them, on hardware they own.


Yet DRM does exist. (Yes, these schemes usually end up getting broken at some point, but so does other software.)

The problem is more generally called trusted computing, with Intel SGX being an implementation (albeit one with a pretty bad track record).


DRM has only been successful at making easily replicable attacks more expensive than whatever the DRM is protecting. Microsoft has talked about this publicly in a great talk on the Xbox One's physical device security: "We can't stop people hacking, but we can make each hack more expensive than what someone would spend on games on average." https://www.youtube.com/watch?v=U7VwtOrwceo

SGX running on centralized servers turns that calculus on its head by concentrating the benefits of a hack in one place.


Yes:

- Auditors.

- Trusted enclaves (but then you trust Intel).

- A signed chain of I/O with full semantics specified (blockchain style); a sketch follows below.
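
A rough sketch of what that last option could look like, assuming the server publishes an append-only log of its inputs and outputs (all names here are illustrative, not anything Signal actually does): every record commits to the whole history before it, so auditors replaying the log can detect tampering or a forked history.

    import hashlib, json

    def chain(prev_hash, record):
        # each entry commits to everything that came before it
        blob = prev_hash + json.dumps(record, sort_keys=True).encode()
        return hashlib.sha256(blob).digest()

    head = b"\x00" * 32  # genesis
    for record in [{"op": "store", "key": "k1"}, {"op": "read", "key": "k1"}]:
        head = chain(head, record)
    # anyone replaying the published log must arrive at the same head;
    # the server can't rewrite old entries without changing it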


How would that work? You'd be layering trust on trust: if they're willing to lie about one thing, they're willing to lie about the confirmation of that same thing.

Unless you're going to hire some independent auditor (whom you still have to trust), it seems logically problematic.


SGX enclaves can attest to the code they are running, so you don't exactly need to take Signal's word on faith.
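
Very roughly, and with the quote structure faked for illustration (the real flow goes through Intel's attestation services): the client checks Intel's signature over a "quote" and compares the enclave measurement (MRENCLAVE) in it against the measurement expected from the published, reproducibly built enclave code.

    import hashlib
    from dataclasses import dataclass

    @dataclass
    class Quote:              # toy stand-in for an SGX quote
        mrenclave: bytes      # measurement of the enclave's initial code/data
        intel_sig_ok: bool    # result of verifying Intel's signature (faked here)

    # placeholder: in reality this comes from reproducibly building the enclave
    EXPECTED = hashlib.sha256(b"published enclave build").digest()

    def attest(quote):
        if not quote.intel_sig_ok:
            raise RuntimeError("quote not signed by Intel")  # trust anchor: Intel
        if quote.mrenclave != EXPECTED:
            raise RuntimeError("server isn't running the published code")

    attest(Quote(mrenclave=EXPECTED, intel_sig_ok=True))  # passes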


That isn't a solution to the problem being discussed (a provider's server code being verifiable by end users). I'm quite confused by the suggestion that it could be/is.


Except SGX enclaves are horribly broken.


Like, does an SGX enclave attest that Meltdown is patched in microcode? That's one way to pull the keys out.

The recent-ish work that got read/write access to some Intel CPUs' microcode can probably break SGX too. I wouldn't be surprised if the ME code-execution flaws could be used that way as well.


As others have already mentioned, there is Intel SGX, and the Signal developers indeed say they use it; see

https://news.ycombinator.com/item?id=26729786



No.

If Signal /were/ federated, it would be a strong hint that the server code stays the same.

And even if it weren't the same, people would be able to run their own trusted servers.


Federation pretty much guarantees the opposite. There would likely be many servers running many different versions, and you'd have no way of knowing which to trust. It, by design, distributes trust, which means there are more parties to trust.

Anyway, Signal is designed to handle all the private bits client-side with e2ee, so you have to put as little trust in the server as possible.


No.


Seems like there should be an API endpoint, similar to a health-check endpoint, that allows one to validate that the code on the server matches what's on GitHub. How exactly that would work is beyond me, since I'm not a cryptographer, but it seems like an easy way to let developers/auditors/the curious check that the code on the server and on GitHub match.


How could that possibly work? The API endpoint of a maliciously modified server could just return whatever the API endpoint of the non-malicious, non-modified server returns.


If you assume that the server can lie to you, then it's physically impossible. Any query could be answered by interrogating a copy of the GitHub version of the server and returning its answer.


   // i.e. a dishonest server simply hashes something other than
   // the executable it is actually running:
   validate_endpoint() {
     return hash_against_other_file_not_exe();
   }


You can't even use MobileCoin in the US (not that I'm a Signal user). So are they planning on focusing on foreign markets, or on eventually getting audited heavily by all the US financial agencies?


I think they're anticipating support for other coins, which might work in the US, judging by their blog post [0]:

> The *first* payments protocol we’ve added support for is a privacy focused payments network called MobileCoin, which has its own currency, MOB.

(Emphasis mine.)

[0] https://signal.org/blog/help-us-test-payments-in-signal/


How long until it gets shut down because, you know, dealers and stuff? On the other hand, is it possible to compile the whole thing and run Signal on-premises, end to end?


Do you still need to give them your phone number though?


Given that you have said...

"the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy."

I don't feel the need to pull my punches.

This is the most deluded, idiotic response I've seen on Hacker News in a long time.

It seems unlikely that the average person (or even a non-techie person of above-average intelligence, e.g. a doctor) will be able to set up Matrix in a way that is more secure than just installing Signal. Your security relies not just on you but on the weakest node in your network. Getting good security might require trade-offs. Your all-or-nothing mindset will not achieve it. The saying "perfect is the enemy of done" comes to mind. Perfect security (or what you propose) is not one of the options for a system that has to exist in the real world.

Please remove your head from its dark, cavernous home.

Love, Me


You can't attack others like this on HN, so we've banned the account.

If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future. They're here: https://news.ycombinator.com/newsguidelines.html.

We detached this subthread from https://news.ycombinator.com/item?id=26727160.


On the one hand, you're absolutely right about usability. On the other hand...

> "the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy."

Based on my interactions with users writ large, this assessment is on the money. Normies do not care and will never care.


The average person doesn't use Signal; they use phones, WhatsApp, and Facebook. You can use the public Matrix node and I can use my own node and we can talk. No setup needed. If they care, they can buy a hosted Matrix service package like this one, with their own instance run by people who probably know what they are doing: https://element.io/pricing

In before "but it's not free as in no cost": that's why big corps will always fuck over the normies. As it stands, one cannot use the internet without either giving away their privacy or learning a lot about computers, how they work, and how to use them.

The majority chose the "I don't care, give me the shiny app" route, and they fucked us all over by doing so. There's no right to easy privacy-friendly computing. There's only the harsh reality that behind friendly blue and rainbow-colored companies sit people who will sell a digital recreation of you to anyone who cares to pay, and give you a few gigs of free e-mail space and a shiny app for it.


To prove a point here: use a browser without adblocking and other extensions, like Chrome, for a week and look at how they target you. That's what they do to anyone without the willingness to fight it with technical knowledge. And that, sadly, is the majority.


[flagged]


I'm disappointed WhatsApp became the de facto solution for secure communication. I am also disappointed Signal became the de facto alternative. I also think Matrix is a better solution and I'm rooting for it. But I think your response is unreasonably pessimistic and, frankly, arrogant.

For all its problems, WhatsApp was an improvement to the status quo. User messages went from being fully public to being accessible only by Facebook. Signal was also an improvement: user messages were finally encrypted end-to-end, even though Signal retained control of the infrastructure and kept some of it opaque. Things are improving, and there is no reason to believe Signal will have any more staying power than WhatsApp or any other platform.

As for why Signal beat out Matrix, I think the technical hurdles are a relatively minor factor. After all, signing up with Element hardly requires any technical understanding. I think Signal was just a better-known brand with a larger established userbase, a streamlined onboarding process, and ubiquitous feature support across many platforms.

I don't think it's appropriate to call Signal's users or the techies who recommended it "stupid". I think most of them just realized that a communication platform is only valuable if the people you communicate with actually use it, and that Signal would be an easier short-term sell. I think the chaps at Element realize that and are trying to position themselves as the next logical alternative once Signal eventually has its own mass exodus.

There is still hope for Matrix. Just keep championing it and show some patience to your fellow human.


That's such a nice response that I just want to thank you for it. Thank you, it really made my day! I'm just depressed from being stuck inside due to COVID for such a long time; maybe that made me lose hope there.


No problem. With all the bad news out there, it's easy to get hung up on minor setbacks and lose track of how things are genuinely getting better overall.


> the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy

This is an awful attitude. Not everyone understands computers at a technical level. Not everyone has the knowledge to install and troubleshoot Matrix.

Most users are non-technical, full stop. You can't expect them to use something that requires technical knowledge.


Then the reality is that those users cannot expect privacy anymore.


This part I disagree with. They can (and should) lobby for laws that protect their privacy. Users don't want to give up convenience, but if the majority are using the services of BigCorp, and we push to force BigCorp to provide privacy through legislation, then society still wins.


Lobbying for privacy laws requires legal knowledge to know which proposals can actually be effective; otherwise one is merely lending uninformed support to whatever the media says is a good idea today.

It all comes down to self-actualization, no matter how you slice it. Installing Element instead of Signal (even with its @weird%identity.syntax) is technically quite easy, especially compared to politics! The issue is almost entirely social, in that it requires pushing back on friends who want you to install the proprietary stuff to communicate, and pushing them toward the Free solutions.


They should, but rarely do, and even when a tiny victory like the GDPR is won, it's immediately dismantled by big corps. Facebook hasn't been fined any significant amount, yet it violates the GDPR every day. As it stands right now, the average user has no privacy.

Just try this: use a default browser without adblocking/tracking protection etc. for a week and watch how the ads target you more and more. That's the internet for the average user. We just don't see it as much because we protect ourselves by default.


> the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy.

Those of us who have worked in security have known this for years. There are countless examples of massive security gains available through minor inconvenience, and of users rejecting them. Two-factor auth is a big one. Yeah, it's a slight hassle, but it gives major security improvements. Even security people don't like to use it. Even people who have been scammed multiple times, and told that if they just turned on 2FA it would help, would rather deal with being scammed again than use 2FA (I saw this at eBay and PayPal a few times, where the user rejected a free security token despite having been scammed repeatedly).

Users hate friction. If it's not as easy as "download app, put in contacts" they aren't interested.


And of course, that's exactly why people were recommending Signal: of all the practically frictionless apps, it provided the best encryption.


Privacy isn't just encryption. Forcing users to link their phone number as an identifier, as Signal does, is just one of the myriad questionable choices Signal has made. And with their coin-based payments they've just painted a large bullseye on themselves.


Sure, and if there had been a frictionless alternative that didn't require that, that would've been great - but there wasn't.


> I saw this at eBay and PayPal a few times, where the user rejected a free security token despite having been scammed multiple times.

People here downvote me for this, but this is the issue. Those people cannot be helped and they don't want help. They chose this and have to live with the consequences. There's no right to easy computing.


I think you got downvotes for the way you presented your argument, not the content...


Yeah, I'm tired and hangry. You're right. I'm just happy SOMEONE understood my argument and saw the same issues I did.


> To me this drives home one key issue: the bulk of users is either too stupid or unwilling to invest even the tiniest amount of effort into their privacy.

Something tells me you've never given tech support to non-technical family members. (And that's being generous – because outright considering everyone non-technical "stupid" would be a pretty sad worldview.)


> or unwilling to invest even the tiniest amount of effort into their privacy.


> highly secure alternative exists in the form of matrix

At least for metadata, as of now, Signal seems to provide better guarantees than Matrix.

I can imagine Matrix competing with Discord and Slack, but I don't think they'll ever be able to compete with WhatsApp and Signal. You can blame "stupid" users and the media all you want; that won't change the path of least resistance. I really like Matrix as an IRC replacement, though.


> At least for metadata, as of now, Signal seems to provide better guarantees than Matrix.

Agree here. Matrix servers log everything by default. If somebody cares about protecting metadata, I don't know why they'd choose Matrix over Signal.


The people screaming the loudest about how Signal is bad for reason X always seem to be the ones recommending Y (usually Matrix), which suffers from the same issue.


Matrix is still broken, though. I find XMPP to be sufficient, at least until Matrix fixes their group encryption.


Got a source for Matrix group-encryption issues?

EDIT:

From their FAQ (https://matrix.org/faq/):

> End-to-End Encryption is fully supported in Matrix. New rooms have encryption enabled by default, and all existing rooms can optionally have End-to-End Encryption turned on.

What exactly do you think is broken here?


In group chats there are constantly members whose messages you cannot decrypt.


The UX is seriously broken. I run my own instance and even I can't verify my encrypted sessions. I already have some three security codes that I have no idea when to use, and verification of my own devices often randomly fails.

It's just a mess, and I think it maybe works properly on the main instance.

It's nowhere near the encryption experience on Signal.


This is an unhelpful attitude, IMHO.

Your best chance, as a privacy-conscious, tech-savvy individual is to push for mass-market adoption of strong encryption and good government privacy regulations that will help everyone.

Lacking those, you will stand out like a sore thumb as one of a tiny number of "weirdos" using Matrix, or Brave, or Tor or GrapheneOS or whatever other hardcore self-hosted, federated niche tools you favour. Any interested spooks can focus their considerable resources on the user base for these tools. Merely having these things installed becomes suspicious in a way that WhatsApp or Signal is not.


> This is an unhelpful attitude, IMHO.

I agree. I gave up fighting for other people's privacy because they don't care. I'm now fighting for my own privacy because it's a war the normies don't care to fight. They never did. Give them a shiny app and they'll sign away their firstborn.

> Your best chance, as a privacy-conscious, tech-savvy individual is to push for mass-market adoption of strong encryption and good government privacy regulations that will help everyone.

Governments, at least the one here in Germany, are already pushing for mandatory backdoors in all messengers, and so far they have been hugely successful in pushing through all the anti-privacy, anti-freedom laws they want, with the only hindrance being our highest court. Just this week our NSA equivalent, the BND, was granted sweeping new rights to hack messenger services around the world, to install rootkits and trojans on people's devices, and to listen in on large parts of the internet.

WE. FUCKING. LOST.

This was all done in the open; the mass media didn't object, normies didn't object, no one cared. Heck, my parents are still using WhatsApp despite my best efforts, and my father is an IT professional himself.

> any interested spooks can focus their considerable resources on the user base for these tools. Merely having these things installed becomes suspicious in a way that WhatsApp or Signal is not.

Bullshit. It's massively easier to force Signal/WhatsApp/whatever to hand over user data than to hack my GrapheneOS phone (yes, really, it's my main phone), backdoor my Qubes OS notebook (yes, really, I'm writing this comment on it), or attack my Raspberry Pi-hosted Matrix instance. Sure, it's possible. If they want to get me they don't have to do much, just threaten me with a rubber hose. But they can't fuck me over by legally attacking the large cloud infrastructure that hosts everyone.

I can't save those who don't care, but I can build a fucking cyberbunker and host my own stuff in the vain hope that it protects me from the worst, and it certainly protects me from Facebook, Google, and co. Sure, the NSA can hack me if they want to. We lost that fight before we knew we had to fight it. But at least the cost of doing it is slightly higher than using the government-mandated backdoors in WhatsApp, Android, and co.


Cheers for fighting the good fight. Quixotic idealists almost never get anything done... and yet they're still the only ones who do manage it ;P

With high-variance results, but heck, that's nature.


But if you are promoting something that is opaque and in some situations worse than an already-existing alternative, is it really the right path to recommend the problematic tool because it is more popular?


Especially if your interest is not necessarily your personal privacy, but that of lawyers, journalists, whistleblowers, etc.


I wish I could show you what lawyers in Germany did with their lawyer-specific mail service. I don't have any non-German sources, but it is a security nightmare and e2e encryption was deemed "unnecessary". Even vulnerable groups don't care anymore.


[flagged]


It probably isn't legal, but people use non-profits for personal gain all the time and almost always get away with it.


Exploiting nonprofit status to generate a personal benefit is not legal in the US. So if the board members, officers, or whoever have some personal benefit from this crapcoin taking off, it's not legal for them to use nonprofit resources and tax breaks to push for that.


OK, good!

Now please remove that cryptocoin stuff from the app. Don't create another avenue for money laundering, tax evasion, and drug sales (if not worse).



