Yeah. It is really clear that Facebook is finally getting around to just implementing this feature of effectively having "hosted clients" for companies to be able to more easily--and yes: less securely--build chat bots (a mechanism that I appreciate is maybe less than ideal to encourage, but frankly just doesn't feel that bad and certainly isn't a surprise: Facebook has been talking about this for a year or two now); and all of the "changes" this week have been directly because of this, including the Privacy Policy update... the key article about which even explicitly said:
> The move, the spokeswoman said, is part of a previously disclosed move to allow businesses to store and manage WhatsApp chats using Facebook's infrastructure. Users won't have to use WhatsApp to interact with the businesses and have the option of blocking the businesses. She said there will be no change in how WhatsApp shares data with Facebook for non-business chats and account data.
And yet, somehow, everyone is just in complete hysterics over all of this, claiming Facebook is evil and undermining the feeling of security people have in WhatsApp, with lots of talk of switching not only to reasonable alternatives like Signal, but also to less secure messaging protocols like Telegram (or, frankly, Matrix). People at my supposedly-smart privacy company--Orchid, building something akin to "incentivized Tor for general VPN use"--are even panicking about this news, and it is really frustrating how no one even seems to want to analyze this carefully... "bUt FaCeBoOk Is EvIl!!" :/.
Yes, everyone is in complete hysterics exactly because Facebook is evil (by the definition "harmful or tending to harm" (OED) or "morally reprehensible" (Merriam-Webster)). Just remember the recent(-ish) Oculus controversy, where they forced everyone who bought their hardware to sign in with Facebook and in some cases (soft-)bricked users' devices because their Facebook accounts did not have enough activity [1]. Especially because Palmer Luckey (founder of Oculus), when answering questions about the acquisition in 2014, said that Facebook would not do such a thing [0].
I personally am scared because the language being used is not at all specific to the scenario mentioned here ("hosted clients"). I understand that anything more specific would probably be rejected by their legal team. I am afraid that some 5 years down the line they'll be able to do something worse without notifying users, because the terms of service and privacy policies are written in this ambiguous language.
Regarding alternatives, I can't really speak to the security/privacy of any of them, but from what I can gather, Matrix does have E2E encryption functionality [2], so I'm not quite sure how it is less secure than Signal (provided you host your own server and/or have a reasonable degree of trust in the server operator of your conversation partner).
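For what it's worth, the E2E path is exposed directly to client libraries; sending into an encrypted room with the matrix-nio Python library looks roughly like this (a minimal sketch on my part, not an official example: the homeserver, account, password, and room ID are placeholders, and a real setup needs the "[e2e]" extras installed plus proper device verification):

    # Minimal sketch: send a message into an (encryption-enabled) Matrix room
    # using matrix-nio. Assumes `pip install "matrix-nio[e2e]"`; the homeserver,
    # account, password, and room ID below are all placeholders.
    import asyncio
    import os
    from nio import AsyncClient

    async def main():
        os.makedirs("./nio-store", exist_ok=True)  # local store for E2E keys
        client = AsyncClient("https://matrix.example.org", "@alice:example.org",
                             store_path="./nio-store")
        await client.login("placeholder-password")
        await client.sync()  # fetch room state and device keys before sending
        await client.room_send(
            room_id="!someRoomId:example.org",
            message_type="m.room.message",
            content={"msgtype": "m.text", "body": "hello"},
            ignore_unverified_devices=True,  # demo shortcut; verify devices in practice
        )
        await client.close()

    asyncio.run(main())

The encryption itself is handled by the client library; the homeserver just relays ciphertext (plus, of course, the metadata being discussed below).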
And when Facebook is doing something evil, I actively blast them for it; in particular, I have been extremely vocal with everyone I know about many aspects of the Oculus account issue, which I consider to be extremely evil when combined with their closed store model and DRM setup with developer account revocation (etc.; I am somewhat famous for being a broken record on some topics, so I will try to avoid going into too much depth ;P).
Obviously, though (but maybe not to you?!?), this is an issue completely unrelated to the WhatsApp "changes" this week: trying to use "Facebook is evil, so everything they do is evil" is not only ridiculously disingenuous--to the point of undermining the ability to make these kinds of arguments at all and still be taken seriously :(--but doesn't even satisfy basic questions like "ok, and do you also consistently use this frame with Apple and Google?" (both of whom are also evil to the point of being morally reprehensible).
As for Matrix: they do not have a solution for metadata yet, and have even gone so far as to claim that maybe they will never figure it out (due to being a federated system). Your metadata just ends up getting semi-permanently logged on various machines, and there is nothing you can do about it at this time. AFAIK, Signal has implemented solutions to this (even, I believe, fixing the subtle thing I used to complain about where their server technically had a temporary in-memory metadata log for rate limiting).
Facebook logs all metadata that is available from WhatsApp as well. I'd rather have my metadata on Matrix servers than on FB servers - at least it's not connected to my phone number, which is tied to my real identity. Also, Matrix doesn't upload my entire contact list to Facebook. If it's secure enough for the German military and the entire French government, it's certainly secure enough for me.
> Your metadata just ends up getting semi-permanently logged on various machines, and there is nothing you can do about it at this time.
Sealed sender means that an eavesdropper who can introspect into RAM inside Signal's AWS infrastructure is no better off than a network eavesdropper who passively sniffs ingress/egress.
That doesn't mean they can't build a reasonably accurate metadata database covering most people--people who communicate from a limited number of mobile IPs to a limited number of mobile IPs.
Signal is way better than Matrix, but let's not pretend it has totally solved the metadata problem.
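To make the sealed-sender point concrete, here is a toy sketch of the general idea in Python (my own illustration using PyNaCl's sealed boxes, not Signal's actual certificates or wire format; all the names are made up):

    # Toy illustration of the "sealed sender" idea (NOT Signal's actual protocol
    # or wire format): the sender's identity travels *inside* the ciphertext, so
    # the relay only ever learns the recipient, not who sent the message.
    import json
    from nacl.public import PrivateKey, SealedBox  # pip install pynacl

    bob_key = PrivateKey.generate()  # recipient keypair; the relay never sees the secret

    def seal(sender_id, body, recipient_pubkey):
        # Everything identifying the sender goes into the encrypted payload.
        inner = json.dumps({"from": sender_id, "body": body}).encode()
        return {
            "to": "bob",  # the only routing info the relay needs (and sees)
            "payload": SealedBox(recipient_pubkey).encrypt(inner),
        }

    def relay(envelope):
        # Someone introspecting the relay's RAM sees a recipient and an opaque blob.
        print("relay sees:", envelope["to"], len(envelope["payload"]), "bytes")
        return envelope

    envelope = relay(seal("alice", "hi bob", bob_key.public_key))
    inner = json.loads(SealedBox(bob_key).decrypt(envelope["payload"]))
    print("bob decrypts:", inner)  # {'from': 'alice', 'body': 'hi bob'}

That said, nothing in this stops a network-level observer (or the relay operator itself) from correlating source/destination IPs and timing, which is exactly the point above.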
Extremely evil was when an entire population was wiped off the earth in the industrial genocide of the Third Reich. Facebook or WhatsApp changing its TOS is irritating, but it is not "extremely evil". I just realised that this is the same absolute language that incited the violence we saw on Wednesday. If something is "extremely evil", then there are very few constraints, short of the Geneva Convention (and probably not even that), that you should be bound by in your response. The point is that language matters, so enough with calling everything we disagree with "evil".
It was carefully explained to me that Facebook only wishes they could be as evil as Google is, now, or as Microsoft used to be able to be. Nowadays, even Microsoft and Russia wish they could afford to be as evil as Google; and even the spooks have had to outsource theirs.
(I use "evil" in the technical sense: not necessarily intending to exterminate humanity, but wanting to be able to -- or anything short of that -- if they did.)
Unfortunately, your interpretation of Facebook's motives requires trusting that they'll only do what their PR says they'll do, and not what they're able to do. Or, even if that is their current reasoning, one then has to trust that they won't then take advantage of said ability in the future.
For many of us, Facebook's past actions are more than enough to prove that they do not deserve the benefit of the doubt in this case.
In addition, if this is indeed the backstory, then Facebook’s product management team failed miserably by not owning the story and instead deferring to anonymous internet commenters to explain their changes.
Remember when Facebook Security added SMS 2-factor verification and promised to never use the phone number for anything else, but then they were overridden and it was fed into the social graph, leading to their CISO resigning?
Thanks for posting this. I was considering making the jump to a new messenger but decided to wait and see what others had to say about the changes to the privacy policy and what it actually means from a privacy perspective. The use case of businesses being able to use it for hosted clients (probably hosted, with messages stored by Facebook) makes sense, and doesn't seem as bad as it's been made out to be – we still get the same level of privacy we've always had for individual and group WhatsApp chats.
> but also to less secure messaging protocols like Telegram (or, frankly, Matrix)
Appreciate that Telegram doesn't have a good rep in the security community, but what's wrong with Matrix?
Also, this is off-topic, but I just wanted to say thank you for all the work you've done in the past with Cydia. I was a 1st gen iPhone user, and got a lot of use from services such as Cydia (in fact I'm convinced the App Store was inspired by services like Cydia).
Matrix is pretty open about how it hasn't been able to do anything about metadata leakage (which they have at times even claimed is somewhat inherent to its federated nature; I think that is an overstatement, but it is something that even they seem to believe).
> Matrix does not protect metadata currently; server admins can see who you talk to & when (but not what). If you need this today, look at Ricochet or Vuvuzela etc.
> Protecting metadata is incompatible with bridging.
> However, in future peer-to-peer home servers could run clientside, tunnelling traffic over Tor and using anonymous store-and-forward servers (a la Pond).
Signal, in contrast, put a lot of effort into metadata reduction--critical as they are a single giant hosted relay service--and in the process (I am very sure) even fixed the issue I used to complain about wherein their server was technically keeping around a temporary-ish in-memory metadata log for rate limiting.
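(To spell out why that bothered me: even a "harmless" in-memory rate limiter is, structurally, a who-sent-when log. A trivial sketch of the kind of structure I mean--purely illustrative, not Signal's code:)

    # Purely illustrative: a naive per-sender rate limiter. The dict it keeps is,
    # by construction, a record of which account sent messages and when -- i.e.
    # exactly the metadata you hoped the server wasn't retaining -- even if it
    # only lives in RAM and only for a sliding window.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60
    MAX_PER_WINDOW = 30
    recent_sends = defaultdict(deque)  # sender id -> deque of send timestamps

    def allow(sender):
        now = time.time()
        q = recent_sends[sender]
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()  # expire entries older than the window
        if len(q) >= MAX_PER_WINDOW:
            return False  # rate limited
        q.append(now)  # server just recorded "sender X sent a message at time T"
        return True

Sealed sender takes the per-message sender identity out of the server's hands, which is why I believe that particular complaint of mine is now moot.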
If you are going to switch to something, switch to Signal (...though I sadly can't in good faith ever really recommend anyone do that, due to how Signal has crippled the ability to do chat backups; more info on this in the other thread going on today re Signal/WhatsApp).
Those slides are from 2017. P2P Matrix was released in June 2020. A lot of work is being done on Dendrite, the latest commit was posted two hours ago as of this writing. From the GitHub page for Dendrite: "As of November 2020 we're at around 58% CS API coverage and 83% Federation coverage, though check CI for the latest numbers."
So, yes, for now the metadata leakage is a real issue. However, this is likely to change in the near future.
Thanks for the info. I was under the impression that you were claiming that Matrix is less secure than WhatsApp. If they both leak metadata, then they're roughly equal from a privacy perspective, no? I guess with WhatsApp you can't know the extent of the metadata leakage, but at least with Matrix you have the advantage of knowing precisely what data is leaked.
Not trying to push Matrix or anything, I've been using Signal for some time already anyway, but thought I'd see what alternatives there are. The lack of chat backups is a real drawback, though since the Android version has a backup option, I'm hoping it's something they'll eventually implement?