They could literally have a hidden function in WhatsApp that scoops up all your chat history and sends it to Facebook if the government asks them to. It’s closed source. No one has a clue what it’s doing.
To be clear I’m not suggesting this is absolutely happening. I’m merely pointing out it’s entirely possible from a technological perspective given it’s closed source software owned by Facebook. That’s not a recipe for privacy.
To be clear about the threat vector, there's also nothing stopping Signal from doing the same if they wanted to. It's impossible to tell whether the version of Signal you download from the app store is unmodified from the code you can find on GitHub. I trust Signal more than I trust Facebook, but if you use Signal, even though it's open source, you still have to trust them not to put anything funky in the binary they upload to Apple/Google.
I'd love for iOS and android to add some sort of OS-level application hash or something. "This app was compiled with xcode version X / llvm version Y with this set of options. The resulting binary hashes to ZZZ". That way with the source code you could verify that the binary on your phone is unchanged.
(Another approach would be to get apple / google to do the compilation themselves from the project on github. If apple builds my project, they could put some signed metadata in the bundle saying "We (apple) compiled this from git SHA XXX")
Reproducible builds do not help to determine if the version you download via the Play Store (or, for those on enterprise devices, any pre-installed corporate stores) is the same as the one you build yourself - the Play Store presents no real means to verify that. This includes any auto-updates if they are enabled.
It's an issue with Play Store as a delivery channel, the individual app in question can't do much about that.
Reproducible builds help if you:
- download the APK separately (including from the Signal website, or some of the other sources)
- install the file locally via sideload
- disable updates (!)
This is very true. Reproducible builds for mobile apps would be far superior to just trusting the store binary. You can build Signal from source for Android if you wish; obviously this is a massive pain to do for each update, but there’s absolutely nothing stopping you from doing it.
On iOS it's a lot more difficult to get the required certificates from Apple but you can run your own build in Xcode and deploy it to your personal device if you are a registered Apple developer.
While reproducible builds are obviously the gold standard, for apps you install from the Play Store or the App Store, developers sign the apps that get distributed with their own private keys. As Google and Apple don’t have access to these, it should be verifiable that the apps have not been tampered with.
There is an exception here with the Play Store, where there is an opt-in option for Google to sign the app on your behalf [1], but I think we can safely assume Signal are manually signing with their own private keys.
In any case it's easy to just grab an APK from an Android device and check signatures for yourself.
For iOS, though, no surprises here: it’s locked down. From what I gather reading Apple’s security documentation, it confirms that apps must be signed by developers with their private keys. [2] But unlike Android, there’s sadly no way I can tell for a user to independently verify this without jailbreaking.
But ultimately, short of building each version yourself, all this is moot if you distrust the developers.
I spent a few hours trying to get a local build of signal-ios working a few weeks ago, in order to write a PR to fix a bug with lost voice messages. The Xcode project uses a plethora of device entitlements I'm not allowed to have (since I don't have the proper Signal signing key). Even after a couple of hours of tweaking to get it building and deployed to my device, it's currently crashing on startup because it can't access some special Signal local device store.
You can certainly get your own build working (without notifications and other features). But personally I found it prohibitively difficult to do so.
I think you will have a problem when it comes to push notifications. I doubt a local build would be able to receive push notifications addressed to App Store builds.
Reverse engineering is a thing, though. I would think there is fame to be gained in demonstrating such behaviour from WhatsApp, so some hackers could feel motivated to do this from time to time.
Absolutely. Of course hackers are reverse engineering WhatsApp; that's how all those nasty exploits it has keep getting sold to governments by the NSO Group.
But reverse engineering is a skill in itself, and modern smartphone apps ship with a lot of code obfuscation applied at compile time. This effectively means even those talented hackers are going through the reverse engineering process pulling at threads until they get lucky.
Reverse engineering (in this context, at least) doesn't just show you the code as the developer wrote it. And FB hires a lot of very clever people including cybersecurity experts who could sneak these things in using innocent looking code scrambled around the app. Even open source projects are at risk of having backdoors put in that pass review and simply look like innocent bugs if they get discovered, let alone closed source apps that have to be reverse engineered.
Again not going conspiracy nut and saying that's what FB is doing. Just saying it'd be very easy for FB to hide it if they were doing it.
To me the biggest confirmed weakness of WhatsApp is the cloud backups. E2EE is pointless when the message database is synced up to iCloud or Google Drive. WhatsApp even tells you this itself. When you enable cloud backups (and they keep bugging you until you do it) it literally tells you the backups aren't secured by E2EE. [1] Because, well, of course they aren't.
The same is true of Signal in most practical ways. You can only run it on platforms that are fundamentally closed-source (either iOS or Google Play Services), so there's no reason to believe the RNGs it uses (and therefore all your session keys) are not backdoored. And you can only install it through official app stores where it's difficult or impossible to inspect what binary you have or "pin" a given version. So I don't see that it's meaningfully more secure than WhatsApp.
The mere fact it hoovers up less metadata alone makes it more private. I also trust the developers way, way more than I trust Facebook. That's a personal preference though, and if you trust WhatsApp and don't mind that it leaks contacts and other metadata to Facebook to profile you, then use WhatsApp.
If I wanted to I could install a fork of Signal that doesn't require Google Play [1] and run it on any non-Google Android build. I would, if it weren't for the fact that I'm currently using an iPhone.
I see that Signal no longer depends on Google Play Services specifically. However it's still the case that it depends on proprietary Google code (it just includes that code in its own APK now) and still can't practically be installed without auto-update (again, it just includes that in its APK).
The "proprietary Google code" is a library with a well defined API, you can see what it has access to. I agree that Signal should take it out, but it's not an especially big deal from a security perspective.
The auto-update functionality just tells you that an update is available; you can choose not to install it. You can also independently verify that the sha256 sum matches the one given on the website, and that the binary that sha256 sum corresponds to is produced via the reproducible build instructions. There are occasional bugs (I'd estimate a couple of times a year, though it's less and less frequent) that cause the reproducible build to not match the provided build; it's quickly noticed by someone and an issue opened in the issue tracker. If there were no explanation or no quick resolution, people would publicly raise a stink about it.
Sure, but that's an unrelated phenomenon to the security implications being discussed. The argument against auto-updates is "it's running code without my permission or ability to audit first"; putting a recency requirement for client-server communication doesn't impact that concern, and I don't see any reason why it would be considered a bad thing.
> The argument against auto-updates is "it's running code without my permission or ability to audit first"; putting a recency requirement for client-server communication doesn't impact that concern
It makes it impractical to actually audit the code you're running, because you're forced to re-audit on Signal's schedule. And it makes those audits mostly meaningless: what are you going to do if you decide a given code change is suspicious? You can't keep using the version of the code you were happy with, so you'd better have a plan in place for moving off Signal quickly - but in that case how much can you gain from using it at all?