Stop the Earn IT Bill Before It Breaks Encryption (eff.org)
1252 points by DyslexicAtheist on Oct 6, 2020 | 342 comments



Back in late medieval Europe most postal services had a back room, called the cabinet noir, where letters were carefully opened, read and resealed to check them for signs of treason against the crown or cross. These were often abused for what we today would call economic espionage. For these reasons many rich people employed private couriers who traveled to their business partners in person to hand over messages. Those who could not afford such luxury had to rely on codes. The prevalence of such codes, however, increased the chance of being misunderstood by those who searched for treason. When the emerging middle class started to demand political power in the Renaissance, it was answered with violence and repression. Those who wrote the constitutions of all modern western nations understood that democracy can only exist when the state has no business reading the correspondence of its citizens.

Today many argue the state has a need to access such correspondence to prevent crime, but such a need is like the need of an addict: nothing good can come from it, and the people should not enable these institutions to satisfy an ever-growing demand for insight into their private lives. One must remember that democracy is founded on the belief that thoughts and words are not crimes and everyone must be free to express themselves in public, but even more so in private correspondence. A society that mistrusts its own citizens to the point where all those who whisper to each other are called criminals, dealers, traitors or terrorists is rotten at its core.

And yet some still say: but if the state can read all private correspondence, it would be so much easier to catch criminals. And yes, it is true that these totalitarian methods are efficient in fighting street level crime. However, for society as a whole, such methods enable a terror of the state that is a crime against humanity itself. They say "but the state will never abuse its power", and I say: it did, countless times before. Do not trade liberty and freedom for promises of safety made by those who profit from oppression.


> Today many argue the state has a need to access such correspondence to prevent crime, but such a need is like the need of an addict: nothing good can come from it, and the people should not enable these institutions to satisfy an ever-growing demand for insight into their private lives. One must remember that democracy is founded on the belief that thoughts and words are not crimes and everyone must be free to express themselves in public, but even more so in private correspondence. A society that mistrusts its own citizens to the point where all those who whisper to each other are called criminals, dealers, traitors or terrorists is rotten at its core.

I feel disregarding the usefulness of surveillance is part of the problem. We should not be arguing that nothing good comes out of surveillance. It provides your opponent an easy strawman for a hollow victory. Because frankly, surveillance is a useful tool for law enforcement.

We need to rather argue that the moral cost and side-effects of public surveillance far outweigh its usefulness.


> I feel disregarding the usefulness of surveillance is part of the problem. We should not be arguing that nothing good comes out of surveillance. It provides your opponent an easy strawman for a hollow victory. Because frankly, surveillance is a useful tool for law enforcement.

In this regard I recommend you go look into the evidence on mass surveillance. There have been several reports on the mass surveillance programs that have been operating since 2001, and in report after report, mass surveillance has been found not only to be ineffective at producing tips; it commonly just tied up law enforcement resources that could have been spent on legitimate tips.

Here is a very well-sourced article referencing several FBI internal reports, a White House-appointed review group, reports from non-profits, and local police departments:

* https://www.propublica.org/article/whats-the-evidence-mass-s...


Using "it doesn't work" as an argument is a losing battle. Even if you manage to convince people of that, best case they'll still be in favour of it "just in case it does work".

The actual, real point is that they're underestimating the downsides of surveillance, and that even if it did work, it would still not be worth it. That's the only argument that can hold, and the actual reason we're against it.


Moreover, the argument becomes invalid as soon as someone finds a method that does work. Which creates an incentive for anyone with a vested interest in this spying to create such a working system.


Agreed. I would dismiss those arguments under the well established heading of: "The ends justify the means"


> We should not be arguing that that nothing good comes out of surveillance.

The problem is that we, the people, can never know what, if any, good is coming out of surveillance. Attorney General Barr admitted that in one of his speeches arguing for back doors in encryption. The government cannot reveal what is being discovered through surveillance without disclosing sources and methods that it (understandably) wants to keep concealed from adversaries. But without that information we and our elected representatives cannot exercise proper oversight. And without proper oversight any such capability will be abused.


It's often not that the government "cannot" reveal those details (maybe not immediately and directly in some cases, sure, but certainly with the distance of time that tools such as FOIA requests require), but that they "won't" and have no interest in doing so. It should be a public demand, with each attempt to increase surveillance, to increase oversight. Sousveillance (watching the watchers) is the best known defense we have for keeping surveillance in check. The hard part is speaking those demands to those in power, embedding those checks/balances/required transparency in the surveillance processes in such a way that they cannot be circumvented by those in power.


> with the distance of time that tools such as FOIA requests require

Often that is way too much time--25 to 50 years in many cases, since those are the time frames for declassification of classified information--for such revelations to be useful for oversight, especially with the state of encryption as it is since computers and the Internet.

Before computers and the Internet, it was possible to have a reasonable tradeoff between strength of encryption and the ability of law enforcement to conduct surveillance, because perfect encryption was impossible and imperfect encryption got more expensive the closer you wanted it to be to perfect. So people were already making a cost-benefit tradeoff (difficulty of breaking the encryption and obtaining private data vs. cost), and it was reasonable for the government to ask that the potential benefits of surveillance be included in the tradeoff, since that would just adjust the balance of the tradeoff, and the adjustment could be periodically reviewed based on data on past surveillance that was revealed by things like FOIA requests.

But now, with computers and the Internet, perfect encryption is cheaper than imperfect encryption. Perfect encryption is just a mathematical algorithm, and it's straightforward to put that algorithm in computer code and verify that the code correctly executes the algorithm. Imperfect encryption requires adding code to that perfect algorithm, which adds cost, and also adds a risk that wasn't even there before, of whatever back doors are in the code being exploited. So now we users, to enable surveillance by law enforcement, would not be just making a small adjustment that could be periodically reviewed in a tradeoff we have to make anyway. We would be adding a new tradeoff that we have no other incentive to make, and thus taking on a new oversight burden, which is, if not impossible, at least extremely difficult to properly fulfill, that we have no other incentive to take on. That is simply not a bargain that free citizens of a free society should accept.
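To make the "perfect encryption is just a mathematical algorithm" point concrete: the textbook example of information-theoretically perfect secrecy is the one-time pad, a few lines of XOR. This is a hypothetical sketch only (the function names are mine, and deployed systems use ciphers like AES rather than one-time pads, since an OTP key must be truly random, as long as the message, and never reused):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length.
    Perfectly secret in the information-theoretic sense, provided the
    key is random, as long as the message, and used only once."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"meet at noon")
assert otp_decrypt(ct, key) == b"meet at noon"
```

The point being: the math itself is trivially cheap to implement and verify; any weakening for lawful access has to be bolted on top of it, which is where the added cost and risk come from.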

> embedding those checks/balances/required transparency in the surveillance processes in such a way that they cannot be circumvented by those in power.

The processes can't be transparent because, as I said, that would reveal sources and methods that should be concealed from adversaries. An application for a FISA warrant can't wait for the years it would take to allow a FOIA request to be fulfilled in the interest of transparency.


> Often that is way too much time--25 to 50 years in many cases, since those are the time frames for declassification of classified information

That's only part of what I mean about the goal of demanding expanded oversight. Maybe those timeframes are too long, but the point is that those time frames sometimes serve a useful purpose in slowing things down, for the safety of the parties involved or for other reasons. A goal should be to find a healthy "medium" where a "Surveillance FOIA 2.0" still allows for transparency/oversight/review without hobbling the process. FOIA was just one example of an existing transparency tool to model from; it's not the only tool/model, just the first to mind. You would hopefully expand to a larger suite of transparency/sousveillance ("watch the watchers") tools.

I'm also not claiming that we shouldn't fight surveillance attempts, simply that where surveillance seems inevitable/a foregone conclusion/rough to fight, we also need to devote resources to fighting for increased sousveillance/transparency, because power will always abuse surveillance.


> where surveillance seems inevitable/a foregone conclusion

To me, breaking perfect encryption by putting backdoors in computer algorithms is precisely the kind of place where we should not think that surveillance is inevitable/a foregone conclusion, but should draw a line in the sand and say that no, we're not going to accept this, law enforcement simply needs to up its game and figure out how to operate in this new environment where anyone who wants to can use perfect encryption.


Hmm, that's a good point. It is a problem. And a problem for both sides.

If X is the amount of utility coming out of surveillance, and you cannot know X, then you cannot argue that X = 0, or that X > Y (for any Y you want to pick, like the downsides of surveillance), or that X < Y either.

Essentially it becomes impossible to rationally debate the issue on the basis of whether it is a net gain.

Which means you need to fall back on other forms of reasoning. A reasonable position is that freedoms shouldn't be sacrificed for something whose utility cannot be demonstrated. But that's an argument about what sorts of justifications are required for laws, not about how much utility the law would have.


Yes, your point applies to many modern discussions. An often-used rhetorical device I see is to jump away from the "should we do this" discussion and into the "does this work" discussion (to paraphrase Jurassic Park a bit). Common examples that won't make me popular are climate change, current anti-covid measures, and not eating meat. People get hung up on technicalities about whether ice cores show evidence of some climate relationship, or whether staying home reduces disease transmission, and pretend that those facts automatically lead to a conclusion about how we should behave, while bypassing the discussion about the kind of world we want to live in. I hypothesize that people feel on firmer ground when they shift their ideological arguments into facts about causality, instead of discussing head-on why they value a certain kind of society.


> We need to rather argue that the moral cost and side-effects of public surveillance far outweigh its usefulness.

The argument I make is that it is more cost-effective to develop a society where one does not need to commit crimes to get by in the first place. Law enforcement is reactive and can only punish once crimes have already been committed. While we shouldn't get rid of law enforcement, because crime will always exist, let's look to societies that have low crime (and societies that have high crime) and see what we can learn from them (and improve upon). Such policy is far more advantageous for citizens.


It's very clear that the people in power in the UK and USA don't want the law to be enforced. They seemingly don't want a low-crime society, they want only to be immune to prosecution themselves.

Your idea seems predicated on people in general being benevolent towards others. That's not going to work; there's a significant, motivated cadre who want to do terrible things provided they 'win'. You don't enlist Cambridge Analytica when you think you're right, you do that sort of thing when you don't care about being right/moral/legal but only about subjugating others.


> You don't enlist Cambridge Analytica when you think you're right, you do that sort of thing when you don't care about being right/moral/legal but only about subjugating others.

Be careful about how you frame that. While this is true of some people who engage in activities like this, there is also the "ends justify the means" group. The latter does believe what they are doing is right and moral, and that being right and moral justifies behavior that is illegal. It's easy to be cynical and assume that the latter group is just the former group deluding themselves, but there are people who genuinely think that way. Addressing them requires a different approach than addressing those who just want power and control by any means.


I see this line of thinking frequently, especially in modern politics, and it always fascinates me. I can get people to agree that the system is messed up because it is a race to the bottom. I can get people to agree that someone needs to draw a line in the sand for it to stop. I can get most people to agree that sacrificing moral values in an effort to win results in a hollow victory (and encourages the race to the bottom). But the interesting part is that my opinion that one needs to hold their own tribe accountable for sacrificing morals is extremely controversial. Yet I see it as logically following from the above.

I think this is why we can see people gladly vote for those that they very much disagree with. I think this is why attacking someone's tribal leader makes them double down and strengthens their convictions rather than changing belief. I think the question is how to get people to realize that you have to fight fair to get others to fight you back with fairness.


If you have the time, I'd highly recommend writing this argument as a blog post; it deserves a higher and more exclusive visibility than it gets as embedded in these forums.


I think if you poll a random set of people, you'll find most of them are (for the most part) benevolent towards one another. So I think it is disingenuous to talk about people in power and then apply that to people in general, when these two have different behaviors. The point of democracies is to increase the robustness of governments, to help discourage abuse by those in power while providing mechanisms to remove those who do abuse that power. Obviously this can be improved, but that's a different conversation altogether.


I feel like OP does that 2 sentences later.

> And yes, it is true that these totalitarian methods are efficient in fighting street level crime. However, for society as a whole, such methods enable a terror of the state that is a crime against humanity itself


I agree with your conclusion.

However I think it’s a very serious fallacy to split surveillance into a ‘useful’ component and ‘side-effects’.

They aren’t side-effects. They are the effects.

Reduced crime may be a consequence of a surveillance society.

In such a society you may discover that discussing crime statistics in a negative light would reflect badly on the party bosses and must be done with caution.


By the same token, one could say that extra-judicial torture could have "usefulness" as you put it. We could have a similar discussion of how absolutists on the torture question aren't doing a proper cost/benefit analysis and are "providing their opponents an easy strawman for a hollow victory."

I know this is Hacker News, but not every argument requires infinite nuance, we don't need to sit down and examine the pros & cons of torture or any other clear and obvious abuse of government power. We don't need to dignify the position of "read all citizens private correspondence" with a cost/benefit analysis. This practice provides legitimacy to clearly unconscionable actions. It is permissible, even strategically valuable, to have certain positions that we are absolutist on, policies that aren't tolerated under any conditions.


The nuance here is pretty blunt: no matter how useful torture is, it is never enough to legalize it.

Same with banning secure private communication.


Is that it? I think it might be "no matter how many times we legalize torture, it will never become truly useful".

Studies show torture is simply not effective. Similar to how surveilling all communication is not effective.

https://journalistsresource.org/studies/government/security-...


In that case, why is corruption so rife? With so many tools available they should be able to catch some politicians who indirectly cause death and hardship to millions, and yet law enforcement is focused on catching another student who dared to use the wrong plant to relax after a tough day.


How many terrorists did the NSA's illegal wiretapping program catch? Zero.


To paraphrase someone I know, "once they run out of criminals, they'll just start catching people they don't like".

Once they have this ability, it will be much harder to make them give it up.


>To paraphrase someone I know, "once they run out of criminals, they'll just start catching people they don't like".

>Once they have this ability, it will be much harder to make them give it up.

There's an argument to be made that this is already happening and, in fact, has been happening for decades.

I'm of course referring to the "War on Drugs."

There's quite a bit of analysis in the literature showing that restrictions on mind-altering substances were explicitly introduced to disadvantage particular populations.

What's more, despite popular perception[0], the "crime" rate is at its lowest levels in more than 50 years[1], yet we continue to fund[2] law "enforcement" at levels even higher than when we were at the peak of the "crime" rate during that time.

So. Since crime rates have plummeted, yet we're spending more than ever, it's likely your paraphrase is already the case right now.

More's the pity.

[0] https://www.brennancenter.org/our-work/analysis-opinion/amer...

[1] https://www.pewresearch.org/fact-tank/2019/10/17/facts-about...

[2] https://www.bloomberg.com/news/articles/2020-06-04/america-s...


Having watched the newspaper coverage of the debacle involving Breonna Taylor with neighborhood interest, it's very clear how much pain the War on Drugs has caused for so little gain. There are so many facts involved that it gets harder and harder not to cynically and conspiratorially believe there was some sort of personal vendetta involved, with the media circus as a cover-up (the latest word is the city's Drug Enforcement SWAT team throwing the cops involved under the bus for not complying with Drug Enforcement accountability procedures in a warrant search involving a person's residence, and throwing the entire department under the bus for trying to execute too many warrants at the same time, spreading itself too thin and not following procedures). Occam's Razor still suggests outrageous ineptitude, and the feeling of a conspiracy is just cops doing what they always do: protecting their own as long as they can, breaking ranks only when public scrutiny gets too tough.

(But the detective who put together the residential warrant "bundled" it with a bunch of non-residential warrants, nearly burying/hiding it, when taking it to the judge to be signed. And now the SWAT team says that the residential warrant execution plan was buried in the same swamp of non-residential warrant executions, and it's really hard to keep from wondering whether that was malice or incompetence as all these details come out. Was it personal? Or was it dumb luck? I can't even tell which is worse at this point, because either seems to show a lack of responsibility, and both are worsened by what will likely continue to be a lack of consequences or atonement.)


>Once they have this ability, it will be much harder to make them give it up.

There's an argument to be made that this is already happening and, in fact, has been happening for decades.

Notable examples are RICO Laws, FISA courts and the PATRIOT act. So. Yes.


> "once they run out of criminals, they'll just start catching people they don't like".

Or, just as likely, "once they can catch the people they don't like, they won't need to bother catching the people that you think are criminals".


Serious q: encryption challenges the authority of government and the power of its leaders. Why would they willingly give up this power when they can manufacture consent[1] via a perpetual state of war[2]?

[1] https://www.google.com/search?q=manufacturing+consent [2] https://www.google.com/search?q=perpetual+war


While morally reprehensible, endless "safe" wars are pretty profitable for industry owners if they are run in countries that do not target their valuable industries. Since the taxpayer is footing the bill for it all, they're the ones who have to consent to it. Or perhaps not.

Using the invasion of Iraq as an example, there have been many years when public opinion was negative, or at least lukewarm, towards the invasion, though not violently so. Casually reviewing the polling history, this can be observed as early as 2004. [1] But I think one can safely say that the war hasn't been at the forefront of most people's minds in the years since, except at the very beginning. But then, there has been very little mention of the financial cost of the war in the media, if any. And why would there be, when the media also earns a lot of money on these "safe" wars?

The video “Troops Versus Building --- an Iraq War tale” by soldier grunt Blacktail should give you a pretty hands-on idea of the financial cost of the war, however. [2]

[1]: https://en.wikipedia.org/wiki/Public_opinion_in_the_United_S...

[2]: Troops Versus Building --- an Iraq War tale, 24 Nov 2009, Blacktail, https://youtu.be/2N-1E2F9pmc


This is not how sourcing works.


The ability to manufacture consent is finite, even in North Korea.


Spot on. And furthermore, any appeals to safety from politicians should be shouted down as the misdirection that they are.

The root of 99% of crime in the US is poverty. Not private communication.

Trying to solve poverty by spying on everyone’s data is like trying to cure cancer with Tylenol. Even if you temporarily prevent a symptom from occurring, you're still dying of cancer.

So much of political thought in the US is focused on the futile efforts of treating symptoms, and not curing underlying causes.

And the worst part is, if you look at statistics in the rest of the developed world, poverty in fact has a cure! Like in most things, the US is the head-in-the-sand stubborn outlier here.


Everyone should be able to read politicians' private correspondence and see what they are up to. We should also know their bank accounts and their location at all times. Why should the government know those things about us when we can't know them about it? That's modern slavery.


Everybody should be able to read the private correspondence of politicians? That's absurd. Politicians are (believe it or not) people too, and have families, and relationships, and private lives. We should absolutely be able to read all their work related material, but private things should stay private.


Why should private things stay private only for politicians?


Society cannot function without privacy [1].

[1] https://ieeexplore.ieee.org/document/1203230


I agree. If we allow the state to start reading all of our correspondence, we are going to start losing our freedoms at a far faster rate than we already have.


Reflecting on this I think we're watching the wrong people.

Politicians in general have shown they don't have the moral probity to be trusted to direct a democracy.

We need a sort of reverse-Stasi. Everything a senior politician does should be reviewed and only closed if it is provably personal and without public interest.

Maybe our politicians need to wear bodycams.


"start"


Someone’s been reading their Neal Stephenson.. :))


heck yeah! Baroque cycle!


> And yes, it is true that these totalitarian methods are efficient in fighting street level crime.

You give too much credit to police states.

What happens in reality is that criminals with connections and a minimum of self-restraint are folded into the "dark side" of the State, while their rivals are cracked down on hard. In this way, a number of low-impact, high-revenue illegal activities are tolerated (in exchange for bribes), crime syndicates are expected to self-police and not break whatever taboos were imposed from above; and then this "dark side" of the government puts a lid on top of the deviant side of society, diverting its energies into activities that do not challenge the status quo.

Does it make for a safer place to live for the common citizen? Maybe. While it may be less likely that you will be injured in an armed robbery, you will also be more likely to have your money swindled by this scheme or another... and you will have less chance of redress when that happens.


[flagged]


The only people who support this shit are people who haven't thought through all the second-order consequences. They're very few in number - most people generally dislike surveillance, but don't fight against it nearly as hard as they theoretically could. Economic incentives are working their magic here. For example, many of the NSA's spying programs are legal specifically because AT&T made everyone surrender metadata for advertising purposes as a condition of telephone service back in the 50s. AT&T had monopoly power, so the only choices available were to sign over your metadata or live like Richard Stallman. Furthermore, any indication that your phone call records were being bought and sold was hidden in the fine print of long, non-negotiable contractual agreements.

The problem isn't "undesirables" (MISS ME with that shit), it's lies by omission and economic power brought to bear against people's rational expectations of privacy.


> Let's ramp up those genetic studies and get rid of the undesirables already

Holy shit, this is straight up racism. How the hell you got from protecting freethinkers to eugenics, I have no idea.


He/she isn't advocating this; in fact, the opposite. They are using reductio ad absurdum to show where the "law and order" approach, taken to the extreme, can end up.


Poe's law strikes again.


Based on my experience in SF, this appears to just be your typical comment on Nextdoor, but without as many references to the homeless.


One can only imagine how long it would take a modern-day Jonathan Swift to reach -4 [flagged] [dead] around here.


Rather than calling for backdoors or secret rooms, this law explicitly shields companies from civil lawsuits or criminal prosecution when they refuse to install backdoors or use end-to-end encryption that they cannot crack.

I'm not really sure what the EFF is unhappy with about this act, since their complaints don't seem to be reflected in the text.

From the act:

CYBERSECURITY PROTECTIONS DO NOT GIVE RISE TO LIABILITY.—Notwithstanding paragraph (6), a provider of an interactive computer service shall not be deemed to be in violation of section 2252 or 2252A of title 18, United States Code, for the purposes of subparagraph (A) of such paragraph (6), and shall not otherwise be subject to any charge in a criminal prosecution under State law under subparagraph (B) of such paragraph (6), or any claim in a civil action under State law under subparagraph (C) of such paragraph (6), because the provider—

“(A) utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services;

“(B) does not possess the information necessary to decrypt a communication; or

“(C) fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”.


https://www.eff.org/deeplinks/2020/07/new-earn-it-bill-still...

> Sen. Leahy’s amendment prohibits holding companies liable because they use “end-to-end encryption, device encryption, or other encryption services.” But the bill still encourages state lawmakers to look for loopholes to undermine end-to-end encryption, such as demanding that messages be scanned on a local device, before they get encrypted and sent along to their recipient.


How does the bill encourage that? Unless I'm really missing something, the language in the new child pornography section is the same as the current language in Section 230(e) covering sex trafficking. That existing sex trafficking clause hasn't done any of the things that the EFF says this new one will do.

The original Earn It Act was bad. But that bad stuff has been massively ripped out. Plus real protections for privacy added in. It's not the same as it was - look up the text and compare what's been struck through with what is left.

I think in the current form it's a definite win for privacy and common sense.


All of you smart arses out here and there, we know it, we know. We know VPNs and other stuff in between are not so secure and not so private. Stop saying it; don't you have some other piece of knowledge to be proud of, for god's sake? Tor and Signal are better than the public cloud, Chrome & SMS if you're looking for privacy. Don't you have common sense?

- HTTPS is more secure and private than HTTP

- Signal is more secure and private than SMS/Skype/Messenger

- Tor/Browser is more secure and private than Cloud/Chrome

- FileVault, BitLocker and LUKS are more secure and private than raw unencrypted disk data

- 2FA and hashing are better than raw passwords

- List goes on and on.
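To pick one item from the list above and make it concrete: here is what "hashing instead of raw passwords" looks like using nothing but Python's standard library. A sketch only; the function names are mine and the PBKDF2 parameters are illustrative, not a recommendation:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store a random salt plus a derived key, never the raw password."""
    salt = secrets.token_bytes(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, dk

def verify_password(password: str, salt: bytes, dk: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    # constant-time comparison to avoid timing side channels
    return hmac.compare_digest(candidate, dk)

salt, dk = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, dk)
assert not verify_password("wrong guess", salt, dk)
```

If the stored salt and derived key leak, an attacker still has to grind through the key-derivation work per guess, which is the whole point of the comparison in the list.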

All you have to do is get out of your desperate mindset where "everything's controlled", because it's not true, at least not everything. We don't have the best security and privacy, but we have some of it, AND we need to fight to keep and expand it.

Now go on and tell me how smart you are and how you see things differently, and generalize stuff again in the replies. Or you can just shut up, stand up, and contribute to improving it.


There are two parts.

1) Yes, you can build better encryption and privacy preserving technology. That is a technology and adoption problem.

2) There is only so much you can do once the law says you can't encrypt certain kinds of things, and that if you do, the state will be after you. That is a social problem. Our elected leaders aren't serving us.

Not mutually exclusive. We have to do both: build better tech, and educate others, especially our politicians, who can change the fabric of society.


3) People are willing to sacrifice freedoms to protect kids, likely as a result of not understanding the complexities of such a nightmarish problem: how much freedom would have to be sacrificed, and how little protection would be gained, at great cost. Why? Because our society does not allow us to perform cost-benefit analysis on a child's well-being, even though reality gives us no such relief as removing the need for one.


> Because our society does not allow us to perform cost benefit analysis on a child's well being

I hate this sooo much! Having children really makes some people entirely unable to think. I've heard people entirely willing to outlaw any form of encryption to "protect" their children from pedophiles, despite the likelihood of getting their identity stolen, going bankrupt, and being unable to feed their children being significantly higher than a pedophile targeting their child even now; it would increase by several orders of magnitude if we suddenly weren't allowed to encrypt things like e-banking and email.


4) real freedoms for imaginary kids.


This tirade and its "go improve it" recommendation might make sense in a vacuum, without governments undermining encryption and criminalizing parts of the activities needed to provide secure, private comms "because terrorism/pornography/covid/whatever".

In the real world, many states freely admit that they will fight against secure, private messaging between citizens (say, because law enforcement needs a backdoor to solve crimes). And while governments can, and do, make laws to that effect, improvement will be legislated away beyond a certain point. This also produces a chilling effect on engineering: why work on a technology that will likely be outlawed if successful?

In most cases, when the government makes laws to criminalize X, trying to overpower it with better engineering just does not work. My 2c.


Why not work on said technology? It can even end up being very lucrative to take up the cause. Don't forget that prior to the '90s (and during part of the '90s too), much of the groundwork for the commonplace, widely used commercial and private digital encryption we have today was discouraged or even made illegal by governments, except for their own use.

It was a generation of pioneers who weren't so timid about working on new technology and fighting absurd, archaic laws that built the foundations of modern consumer/commercial encryption. The U.S. government especially tried very hard to chill these efforts too, and failed (so far at least).


I agree vehemently with your call to go work on this stuff.

What I don’t understand is why you seem to be suggesting that as an alternative to opposing legislation which would prohibit such work.


Both can be done, activist opposition, and a practical move to build the tools that make circumventing such privacy-hating regulatory nonsense all the more easy to pull off. More simply put: while active public opposition to bad laws is good, few things nullify bad laws better than a fait accompli that's out of the bag.


The first half I disagree with: if a law is passed, circumvention tools won’t help. This is what it seems like you have been arguing for up till now, and frankly not getting traction.

However, the second part (about the fait accompli) is a very important point.

You’d have to achieve widespread usage among the general population for this to be effective.


I don't see how you could disagree on the first point, though. Not only is it often done today, to varying degrees of success depending on country and technology, it was exactly how much of consumer crypto got its start back in the earlier days of the internet. During the '90s, technologies like RSA, PGP and others basically got built and released while constantly treading on extremely shaky legal ground. It was their widespread usefulness and steadily growing adoption by users that allowed them, in effect, to circumvent archaic laws until they were simply recognized as legally usable.


I am well aware of the history.

Treading shaky legal ground is not the same as circumvention.

This time around, if encryption is banned, they will do more than just hound Phil Zimmerman for years on end.

They’ll come after the end users, and ‘circumvention’ won’t help.


I don't understand what people have against VPNs. I don't want my ISP knowing when I use Tor, since my ISP is in the same country as me and that makes it easier for them to have access to me if they wanted to. Using a VPN means my ISP only sees me connecting to a VPN (which is arguably more innocuous). Then you can use Tor from there. And if you only use HTTPS sites, then only you and the end site can read the traffic. And obviously don't login to accounts created outside of Tor or that use details that can lead back to your real identity.


There are some points against using VPN with Tor here: https://write.privacytools.io/my-thoughts-on-security/slicin...


>> We don't have the best security and privacy

We today probably have the best privacy tools in all of recorded history. Modern encryption means that anyone on the planet can send a message to anyone else on the planet without fear of government decryption in transit (asymmetric public keys, Tor, PGP, pick your tools). Using freely available tools I can encrypt a file on a USB drive in a way that even an NSA data center running for a billion years wouldn't decrypt. Those sorts of things were not possible a hundred years ago. They weren't really possible only a generation ago.
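The "billion years" claim survives a back-of-envelope check. Here is a rough sketch (the guess rate of 10^18 keys per second is an invented assumption, wildly generous to the attacker) of the expected time to brute-force a symmetric key:

```python
def brute_force_years(key_bits, guesses_per_second):
    """Expected years to search half the keyspace at a given guess rate."""
    expected_guesses = 2 ** (key_bits - 1)  # on average, half the keys are tried
    seconds = expected_guesses / guesses_per_second
    return seconds / (365.25 * 24 * 3600)

# Assume an attacker testing a quintillion (10^18) keys per second.
# Even a 128-bit key takes on the order of 10^12 years on average;
# 256-bit keys are astronomically further out of reach.
print(f"{brute_force_years(128, 1e18):.2e}")
```

So even granting an attacker hardware far beyond anything known, exhaustive search of a modern key is hopeless; real attacks go after implementations and endpoints, not the math.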


We also today probably have the best surveillance tools in all of recorded history. Modern phones mean that anyone on the planet can have their location tracked in real time, and if exploited (perhaps by a malicious update to the OS or apps to add a backdoor), the most intimate information about the user can be harvested in bulk, or they can be watched and listened to, without being served a warrant, even in their own home. These sorts of things were not possible a hundred years ago. They weren't really possible only a generation ago.


I mean, fair enough to a lot of the above, but you’re decrying people saying VPNs are insecure, then showing how wrong they are by talking about six otherwise unrelated security technologies.

The VPNs people will actually use are a poor trade-off that will leave most people with a false sense of security at best, and probably with significantly fewer rights over what happens to their data regardless.

Don’t disagree with you on the rest of the above.


You know because some smart arse said it.

I don't have the technical skills to improve it. Someone I inform might.


Sit down, dude. You think you are standing up for rights, but you are making a case for weaker security by trusting, e.g., that MS and Apple won't backdoor you in an instant if they must. You are literally, actually, a shill, especially with this pompous presentation.

Nothing on your list is better than nothing if the authoritarians want your data. Except, maybe, Tor, and only if people contribute to running exit nodes.

If it isn't end-to-end, and only you know and control your keys, you are already doomed. In other words, you cannot trust any service with your keys. That includes https and signal.


> If it isn't end-to-end, and only you know and control your keys, you are already doomed. In other words, you cannot trust any service with your keys. That includes https and signal.

https://en.wikipedia.org/wiki/Security-in-depth


Things the EARN IT bill (reminder: supposedly designed to protect children) doesn't do:

- Increase funding for child protective services & foster care services

- Increase funding for parents (EBT, social work, etc.)

- Increase funding for schools (comprehensive sex ed, etc.)

Because our priority is kids! Honestly! Seriously I mean it!


"Protect the kids" has been a meme for some time now. It's a dead horse being beaten in the afterlife. Surprisingly, it still works.


Not to be pedantic, but those services are funded at the state level, so it would be odd if they allocated part of the Federal budget for it.

Not that that would be bad, it's just that there are norms for who-does-what in gov't, and overlap of responsibilities is usually bad.


The federal government can provide funding for state programs without taking over the administration of those programs. That happens all the time.


Are you mad that laws against murder (reminder: supposedly designed to protect people) don't include funding for domestic abuse shelters, food pantries, social workers, relocation programs, job retraining, college tuition assistance, and comprehensive sex ed?

This kind of rhetoric is so boring at this point. Trying to destroy a proposal via scope creep is intellectually dishonest and does little more than weaken your own side's position. If I were a senator on the fence about this bill and this was the best argument you could muster against it, it would be an easy yea.


Laws "against murder" != laws "to prevent murder/catch murderers".

Nobody anywhere is challenging the laws against child abuse. What they are challenging is laws attempting to decrease it, or catch people doing it, through means that don't actually work and that harm other important things.

To use your example, it's like writing a law "against murder" that's written in a way that actually doesn't make murder illegal and also bans seat belts.

In fact, this "scope creep" you mention is exactly how laws work. First you make murder illegal (done), then you start looking at the common causes for murder and find ways to fix those.


Actually, yes. This is the entire problem with "tough on crime" policy: it doesn't attempt to handle the causes of criminality.


> The EARN IT Act cynically uses crimes against children as an excuse to hand control of online privacy and speech over to state legislatures

I like the idea of federal legislature ceding power to state legislatures.

Additionally, it looks like encryption is offered more protections in this bill. Considering federal laws preempt state laws, especially with regard to telecommunications, I do not see what the risks of passing this are (regarding encryption).

The bill amends Section 230(e) of the Communications Act of 1934.

> CYBERSECURITY PROTECTIONS DO NOT GIVE RISE TO LIABILITY.—Notwithstanding paragraph (6), a provider of an interactive computer service shall not be deemed to be in violation of section 2252 or 2252A of title 18, United States Code, for the purposes of subparagraph (A) of such paragraph (6), and shall not otherwise be subject to any charge in a criminal prosecution under State law under subparagraph (B) of such paragraph (6), or any claim in a civil action under State law under subparagraph (C) of such paragraph (6), because the provider—

“(A) utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services;

“(B) does not possess the information necessary to decrypt a communication; or

“(C) fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”.

https://www.congress.gov/bill/116th-congress/senate-bill/339...


I won't challenge the bill directly, but I'll point out that putting "crimes against children" front and center smells of misdirection, so it raises the question of what the bill's real goal is (though that could be just bad campaigning). Also, Edward Snowden spoke out against this bill. That's hearsay, but personally I've a fair amount of trust in his assessments.


The meat of the bill is that it establishes a commission (yes yet another, unpaid though!) that will establish and forward best practices to AG Barr concerning child sexual exploitation online. It is possible that one of the recommendations might be establishing a backdoor. In the future that recommendation would then be used to argue for legislation.

Snowden is great, but do your own research.

Edit: I usually trust Snowden as well. I could be missing something in this bill. EFF did not provide specifics. Hopefully someone here can.


I prefered the context given on this petition page: https://actionnetwork.org/petitions/dont-let-congress-kill-e...

> Just a few months ago, Senator Lindsey Graham (R–SC) delivered an ominous threat to Apple, Facebook, and any other tech company that might refuse to kill encryption programs that prevent malicious hackers, law enforcement officers, and others from accessing our private communications systems: "You're going to find a way to do this or we're going to do it for you."

> Graham has authored the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019 — or EARN IT Act [...]


> You see, the EARN IT Act grants the Attorney General broad authority to force tech companies to do whatever he wants

I looked and could not find this. That article did not offer any specifics either.


https://cyberlaw.stanford.edu/blog/2020/01/earn-it-act-how-b...

Scroll to summary, specifically "Section 230 immunity for CSAM can be earned via 1 of 2 “safe harbors”."


Here is some more criticism of the EARN IT Act supporting that https://www.schneier.com/blog/archives/2020/03/the_earn-it_a...

Lindsey Graham has repeatedly sought to weaken encryption and mandate backdoors and key escrow. In June this year, following the EARN IT Act, he also introduced the Lawful Access to Encrypted Data Act (LAED), which would mandate backdoors:

https://cyberlaw.stanford.edu/blog/2020/06/there%E2%80%99s-n...

LAED is extreme and has little support. It is widely believed that the LAED was never intended to pass, but is meant to help pass the EARN IT Act by making the EARN IT Act seem like a more moderate and reasonable piece of legislation.

The EARN IT Act is really a ploy by Lindsey Graham and others to bypass Congress on this issue, which they cannot otherwise get passed, and to allow a small group of people who are not even security experts to develop regulation and mandates (which will probably be against encryption) under the guise of fighting child porn.


The arguments in that article are based on items that have been stricken from the Act. That article and the EFF post are out of date. Compliance with "best practices" is no longer part of the bill. As of right now, the bill has no teeth.


>The meat of the bill is that it establishes a commission (yes yet another, unpaid though!)

"Unpaid" here just means paid by other means: exchanging political favors, getting hired by the benefited companies, etc.

Of course the same goes for paid commissions...


Handing national interstate matters over to states has always proven to be a terrible idea, as it leads to selective representation: states end up ruling over far more people than can vote for them. Doing so for the internet is doubly terrible, given that the location of all parties isn't reasonably known ahead of time, nor usually relevant.


> they rule over far more than can vote for them

I've never heard the commerce clause explained that way. Very cool.


And very legal.


I don't think I've been paying enough attention, but how does this work? The FBI, police, and some members of Congress have, afaik, been talking about the "going dark" problem for years, and now suddenly they pass a bill that explicitly protects companies from liability if they implement end-to-end encryption etc.? Huh?! And why is the EFF so wrong about this, if that is correct?

Maybe it has something to do with "Notwithstanding paragraph (6)"


"Notwithstanding" means in spite of paragraph 6. So paragraph 7 (see above) preempts paragraph 6 (see below).

“(6) NO EFFECT ON CHILD SEXUAL EXPLOITATION LAW.—Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—

“(A) any claim in a civil action brought against a provider of an interactive computer service under section 2255 of title 18, United States Code, if the conduct underlying the claim constitutes a violation of section 2252 or section 2252A of that title;

“(B) any charge in a criminal prosecution brought against a provider of an interactive computer service under State law regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material, as defined in section 2256(8) of title 18, United States Code; or

“(C) any claim in a civil action brought against a provider of an interactive computer service under State law regarding the advertisement, promotion, presentation, distribution, or solicitation of child sexual abuse material, as defined in section 2256(8) of title 18, United States Code.


So what's the reason for the EFF's warning?


Sadly, they did not point to any specific parts of the bill that would support their claims.


The "best practices by a committee they control" requirement, which is carte blanche. They could easily set it to consider "key escrow" or "master key" backdoors.


Yeah, I think that is the big question that will need to be answered: would failing to implement such a key escrow scheme qualify for protection under the clause:

> "(C) fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.”

My guess is no, since even with a key escrow scheme the messages can still be encrypted end to end. It's just that there is another party which may be able to decrypt them later.


But the worst thing that happens if you fail to implement the best practices is that you lose Section 230 protections. If you're an E2E messaging app or the author of device encryption software, you don't need Section 230 protections to begin with.


Not according to the current text of the bill. The EFF post is outdated I think. The commission no longer has teeth. This bill will probably die.


I believe the original EFF post is outdated. Much of the complaints about the bill have been deleted in the current text.


>I like the idea of federal legislature ceding power to state legislatures.

I don't, because I'm sure most state legislatures are even less informed on the importance of online encryption than Congress is. Doubly so if you live in a red state.


I strongly believe we have already lost the war for privacy and security (against state-level actors). Once the net adopted the platform model, we were screwed. Platforms are as easy to regulate as, for example, the telephone companies were back in the day.

They were easily forced to comply with wiretapping demands of the government.

The moment a central platform controlled (most of) our communication and was able to provide security to the users by using encryption, legislators realized the danger and also the solution. They would just need to force these central hubs with laws as they did back in the day. Legislation was/is needed to break users' encryption and give government the access to all communication that was lost with modern forms of communication and encryption.

So on one hand, it is "just" a return to a previous state of affairs. On the other hand, given for example today's methods of automatically listening in and transcribing voice to text, this would be a way more massive intrusion and control mechanism.


And I strongly believe apathy will be the death of freedom.

This is a needlessly negative outlook. Click the link, support the EFF. Educate your friends and family.

Defeatism is not a compelling philosophy.


Exactly. I see this defeatism all the time regarding climate change too. It's OK to feel defeated. But why the hell would you spend energy to spread that defeatism?!


> I see this defeatism all the time regarding climate change too.

And I see people suggest that we aren't completely screwed because we got 10,000 people to walk down a street one time. It's an insane position, and we need dramatic measures that the human race in its current form is unable to enact.

* Climate change is going to happen because people like cars and supermarkets, and nobody is asking anyone to _actually_ sacrifice anything. Much of the climate change movement is "someone else should do something about it", and those that cut themselves off by going "off-grid" just make themselves quieter and are replaced by twenty more people with growing carbon footprints.

* Online security is fucked because the vast majority of people can't handle more than a modicum of detail. It's on the technical class (i.e. the 5%-15%) to protect it by hook or by crook. A democratic battle to preserve it will be lost, because people aren't interested enough to care and it's too complex.

My suggestion is that defeatism posits a better question to answer: "What do we do once we've lost?".

The answer to that question yields something useful instead of the suspension of disbelief that the human race will suddenly turn over a new leaf.


It's needlessly negative to suggest we have to first "lose" to make positive change.


You're being facetious. We don't "have to first lose"; it's just that the very likely outcome is that we will lose.

That's not to say it's not worth trying, but that we should assume failure; otherwise you end up like the UK's Brexit negotiators, who were "all-in" on the "oh, they'll buckle as time runs out" strategy and have no plan beyond "so what now?" if that doesn't pan out.

The basic strategy right now seems to rest on the belief that humankind will figure out it's slowly destroying its habitat and change accordingly. I don't think we've even all managed to figure out the extent of the destruction, let alone started to change accordingly.

What habits have we changed since the 1990s to cut global carbon emissions? I'm struggling to glue a couple of things together. If I'm not mistaken, global air travel has _soared_ within that time window, and everyone is just sitting on their hands under the mistaken belief that renewables will bail us out while we all still drive cars to the supermarket.


I’m saying framing the climate crisis as “lost” is a bad way to gain support for positive change.


Well, sure, it's not the best marketing campaign, but I'm not marketing for them, so it's fine as an opinion, isn't it?


If you have been defeated, acknowledging it is crucial.

It's hard to make progress while denying the actual nature of the situation.

Not saying that's necessarily true in either of these situations, just that it's not a waste of energy to spread depressing truths.


"it's not a waste of energy to spread depressing truths."

It certainly is, if those "truths" are not absolutes. Find a way to redirect your -- and others' -- energies toward something positive and constructive. Spreading doom and gloom, full stop? Always a waste.


If you never move on to constructive action, that's a problem, for sure.

However, telling a sad or difficult truth and stopping there is still a big improvement on pretending there was no issue in the first place.


You have presented a false choice.

Saying negative things is just negative, full stop, as already stated. You can make this positive by suggesting a solution or asking someone qualified to suggest one and then helping them achieve it.


> Saying negative things is just negative, full stop.

I dunno, I'm not sure that's always an absolute. Example: your boss says the budget has been slashed and the department head will be downsizing the whole department by 50% in the coming months. This is very obviously a negative thing, but is your boss doing you a disservice by giving you a heads up? "Oh, but he should give me guidance that I should work diligently but also prepare resumes and look into other options." Of course they should, and it would be really nice if they also could find out for sure if your head is on the chopping block. But let me ask you this: even if your boss doesn't offer this very obvious advice or have any insight into the department head's persona non grata list, is knowing about this negative truth a net-negative for any employee it concerns? Of course not.

Or even more relevant to HN: a very outgoing individual who loves the face-to-face environment of the office has just been told that the CEO has decided everyone will be 100% remote starting next week until 2022. Is being told this negative truth a net-negative for the social butterfly employee who now at least has a brief chance to wrap their head around the idea and mentally prepare for the change instead of being blindsided?

Sometimes simply raising awareness that a negative thing is happening isn't a bad thing.


To make it less questionable.


> I strongly believe we have lost the war for privacy and security (against state level actors) already. Once the net adopted the platform model, we were screwed.

This is very pessimistic. People still can understand the dangers (and are understanding) and switch to decentralized platforms, which by the way are actively being developed: https://joinmastodon.org.


There is nothing to say that 'decentralized' will be better for us on the whole.

Facebook and Google actually do a relatively good job of protecting our data. They have to live up to specific claims by government, which some people don't like, but otherwise, it's pragmatically safe.

'A million servers' here and there, without process, oversight, 'a lot to lose', lack of transparency, and it might be a whole lot worse for most people.

'Many leaks' at small companies may not have the public and regulatory impact as a 'single big leak' at G or FB.

For 'the tech literate' who know how to manage themselves, it may open up avenues of greater security, but for the net-plebes, not so much.

Think about the freedom that comes with ample food and health and lifestyle choices. Some people are incredibly more fit and healthy than any other people in history, but most of us are somewhat more sedentary; we eat too much and don't exercise enough.


And will just be banned and blocked


Well, if a mastodon instance admin decides to ban you, that's their personal decision. You can go to another instance.

If another user blocks you, that's on them but you have no right to push your messages into another user's home system.

If all else fails, you can host your own. It's ultimate free speech as intended, and as someone who hosts a Mastodon instance, I can say it's a system that works pretty well; there is no noticeable hate speech in my federated timeline.

edit: I might have misunderstood the GP, but I'll let the comment stand. Banning Mastodon in the US is a non-concern since it's a French open source development.


I think tokamak meant they will be banned by law, which sadly I can see happening - it effectively happened already with napster and bittorrent in some jurisdictions.


I can't. Napster was shut down for copyright infringement. Bittorrent is blocked in some places on similar grounds (although, notably, BitTorrent itself is still alive and well - the law enforcement action has been against distributors of copyrighted materials and sites like TPB).

But even if someone really wanted to get Mastodon banned and created a server full of illegal material to give politicians the fuel they'd need, the law enforcement answer there would be to go after that host. The overall network would not be liable because it wouldn't be hosting/distributing the material.


Once the illegal activity is defined as running the software itself, because it cannot be done legally, then all hosts become liable, regardless of what they host. Simply by launching it, you would have broken the law.


To fight this, please use Mastodon and/or create an instance in it. It must become as widespread as possible to become the norm, then it cannot be (easily) banned.


I doubt a US ban would have much effect on a French open source project.


You're thinking of PeerTube. Mastodon's lead developer is from Germany.


The US govt will just convince their Nine Eyes friends to adopt the same legislation and that's the end of that, as well.


The authoritarian stream in the EU are already working on something similar, but it will be a much harder sell and i still hope we can stop it, before it is too late.


I think you may have misunderstood the commenter. They seem to have been saying that Mastodon will simply by banned/blocked as a tool/platform by law.


I find it very unlikely that distributed social networks like Mastodon or Diaspora would be completely banned. In the US, this should be protected under the First Amendment; in many other countries, it's protected under similar laws. And many countries don't care as much about violating their citizens' privacy as the US does.


I think that same logic applies to companies though so it's just shifting the problem over.

But I think GP might have meant banned at a government level?


And (more pessimistic future) you'll go to prison if you install such an app, or visit a website that tells you those apps exist.

Already happened in some countries.

B.t.w. someone wrote long ago here at HN that the 2nd amendment about weapons, nowadays should be about cryptography instead. Maybe:

> A well regulated Militia, being necessary to the security of a free State, the right of the people to [use end-to-end encryption], shall not be infringed.


Do you really see most people leaving the likes of Facebook, Instagram and Twitter? I don't think they care about privacy enough to stop doing what's easy and fun


People have been leaving Facebook in droves. Hardly any of the original users use the platform regularly. Facebook has only remained relevant by buying relevant platforms as they start to take off. What it takes is companies that decline such offers, or regulation to prevent this buy-up behavior. Social networks have no monetary buy-in to discourage switching the moment your friends aren't on the platform any longer.


You might get this impression if your main source of news is HN, but Facebook's monthly traffic among North American users has been steadily climbing quarter after quarter, year after year. Globally, it's up 8% YoY.

https://www.statista.com/statistics/247614/number-of-monthly....


...and they've been flocking to Instagram (also Facebook is still huuuuge)

Companies that are willing to play dirty are the ones that get investment and also are the ones who retain users. They have an inherent advantage over any platform that tries to be moral -- or alternatively platforms that try to be moral have an inherent disadvantage. As long as privacy and respectful user experience is on the bottom of the list of priorities of most users they are the ones who will be able to build momentum, and I can't imagine what could happen to change that.


For talking about their personal life? Not really. But when talking about money or politics, I see more people going to telegram.

Even the fact that this bill is being proposed proves that the government has lost some of the control it had over private conversations.


I like Telegram, but I'm not sure it is the prime candidate for privacy / secure communication. Hopefully Matrix gains traction. Element is getting better but is still full of bugs...


I couldn't see Matrix in the Play Store, and the web page is unusable: I wanted to try it, not "learn more".

I installed Element, but it required a username/password pair. I understand that it's a bit more secure than using a phone number + email for first authentication, but it makes discovery of friends too hard. It trades too much UX for security, just like PGP.


I've not tried installing Element, but I imagine the app could gain the feature of automatically generating a password for you. Asking a user to select a username when joining a chat network seems like a reasonable UX, although having to pick a server is an extra burden.

Perhaps the creators of the app could partner with some big providers to allow the app to try creating your account on one of these providers at random (and keep trying different providers if your intended name is already taken on one that it tries).

I also agree with you about discovery of friends being hard unless users provide their phone numbers to a central server, so perhaps there should be an option for that when creating your account. This central database could be run by an independent, audited, third-party service. I'm not sure who could be trusted in that role (perhaps Let's Encrypt?), or how much it would cost, but it's an interesting thought experiment.


> It trades too much UX for security, just like PGP.

That's similar to what email does: it trades UX for [nothing] (I would place decentralisation there if it were not for Gmail etc.)


Email is used for historical reasons; it wouldn't be good enough as a new service. If it disappeared, everybody would just be using Facebook for authentication / connecting with services.


https://news.ycombinator.com/item?id=19322448

U.S. users are leaving Facebook by the millions, Edison Research says (marketplace.org)

1293 points by rmason on Mar 6, 2019 | 616 comments


Why do "most people" need to do this?

The question isn't "What do 90% of people do?". The question is whether those of us who care about privacy have the ability to be private.


Are you willing to cut yourself off completely from that 90%? Given that under current policy third parties have no duty to protect your data, and even a duty to provide it to law enforcement, you can care about privacy all you want, but if every person in your circle isn't just as vigilant you are still screwed. Once one person in your circle leaks data or metadata about you to a privacy-unfriendly third party, you've lost the game.


> Once the net adopted the platform model, we were screwed.

So stop being another brick in the wall: stop using the platforms just because it is convenient. For example:

* https://old.reddit.com/r/selfhosted/


There is some irony in this being a reddit link.


Well, if we're talking e-mail:

* https://workaround.org/ispmail

* https://mailinabox.email

If we're talking photos:

* https://www.zenphoto.org

General storage:

* https://owncloud.com


A better link: https://prism-break.org.


2020, still a nightmare to make 2 computers both connected to the internet talk to each other.

One day IPv6 will get rid of the platforms.


Actual competition for ISPs / access provision is required for this. All consumer ISPs still ban "servers" on their pipes; this is a clear sign of insufficient competition.


My consumer ISP allows me to host servers. I can add on a static IP for a few bucks a month and as long as I don’t break any laws they’re cool with it.

Previously I had “business class” internet but that is just the name of the plan. Anywhere there was cable I could get business class plans.


> Anywhere there was cable I could get business class plans.

The one time I tried to get a business-class connection from Time Warner they refused to offer any business-class service to a non-commercial address except for their "Home Office" plan, which was basically just their top residential plan with a better SLA.


Huh?


It's nearly always in the fine print of the contract, but enforcement is on a selective basis.


Then we should abandon those big platforms and move to millions of self-hosted personal mini-platforms. That way, you control your data, and you control who you share it with.


> That way, you control your data, and you control who you share it with.

Well, you already control who you share it with, as you're the one initiating connections to sites like Facebook. It just happens people give a lot away to browse sites these days (admittedly exactly what they're giving away remains quite opaque).

Unfortunately even if we did move to one-platform-one-person, the question of data control remains as murky as ever. Suppose you are hosting a party so you send your street address to your friends so they know where to show up. Then they play a fun quiz game that tells them their Harry Potter patronus based on the street address of their friends (that means you). Suddenly some anonymous quiz maker (let's call them Oxford Synthetica) has access to your street address and at least one of your friends' info through no direct fault of your own.


Government and corporations are just systems. If people want a certain thing, those systems change. Simple as that. However, change sometimes takes decades. Just form good privacy habits so you don't even feel any inconvenience. And most importantly, tell people. Get the idea across that privacy has value. Tell companies you've chosen competitors over them because of their privacy policy. Ask businesses how long they keep your data. Ask people if you can use alternative technology.

It really is a case that if you give up then it will be over. Just make it a part of your life. I'm not asking you to rally every week. Just don't forget about it and positive change will come.


"Once the net adopted the platform model, we were screwed. Platforms are as easy to regulate, as for example the telephone companies were back in the days."

Perhaps - and most fall into this category.

However, you do have the ability to form a (simple, cheap) SCorp/LLC in the US jurisdiction of your choice and provision your own mail/dns/vpn/etc. by that corporate entity.

So now a corporate entity is the provider, and you are the customer, and notices/subpoenas/takedowns/requests will be seen by you and you will take action on them.

At the very least, you can self-provide your own VPN this way if you don't want to run your own mail services.


Except state actors are currently also using the blanket privacy and security to influence elections. This is a couple of levels above talking crap about your government or buying some drugs.

And if that is to the point that it scares you that they'd lock you up for such behavior then you need better government, not better privacy.


> I strongly believe we have lost the war for privacy and security

You are mistaken counselor (c) The Descendants. These three easy things improve your privacy by 10x:

- Incognito / private browsing by default. Clears cookies used by trackers and platforms. Staying logged in is bad: log in, do your thing and close the tab. Not Chrome, though; it has its own ID.

- VPN. Hides your browsing history from your ISP and your IP address from trackers and platforms. Use it on your smartphone too, and disable Location and Background App Refresh for most apps.

- Adblocker (uBlock Origin / AdGuard for Safari, iOS too). Prevents trackers and malware from executing.


Avoidance of social media should be in this list.


No, that is too harsh. These rules will work for most people. On social media, never share your address book and don't give out your phone number, which is a very powerful identifier. A unique email address is good too.


That is just plain wrong. Even a distributed model would have the same problems, because the nodes would still be "platforms" to regulate. The truth is that they cannot win, although they will demonstrate a mierdas touch and turn everything they touch to shit, because they are insane and demand the literally impossible. The "leadership" should probably be put in an insane asylum as a danger to self and others, with no understanding of objective reality.


Yes, we have already a working precedent with the "Ministerium für Staatssicherheit".


[flagged]


> China sidesteps this issue completely by using home-grown platforms like WeChat of course.

How is that different from what the US does? It seems to me an error to think that US companies are somehow the default, and any other country using its own services is sidestepping the issue.


It's not that different, only that China was acting in response to the status quo of US companies dominating this global market. For most users outside of the US those companies are the default; China excepted.

By 'default' I don't mean to imply that this is desirable or healthy though.


I'm not very up to date on the history of WeChat, was it launched as a reaction to US services or did it just grow organically?


It grew ‘organically’ in China where the US services were not allowed to operate.


"How is that different from what the US does?"

? They are quite different.

Facebook is a social network. On occasion, with a warrant approved by an independent Judiciary, an agency may request on a case by case basis, information relating to a specific concern for which there are specific indications warranting a search - much like the search of your car or home.

WeChat is a control, censorship and surveillance network.

- WeChat, like >90% of Chinese companies, has CCP party apparatus working within the company to oversee operations and ensure loyalty to the CCP agenda. The US 'equivalent' would be the CIA having staff at Facebook to intercede in policy decisions and to make sure 'The Man in the White House' has his policy objectives met.

- WeChat censors everything. If you, right now, start saying something negative about Xi, it will likely get censored by one of the massive army of censors. The US equivalent would be 100% of Facebook posts going through a 'large office in Virginia' where President's political operatives oversee censorship.

- WeChat censors anything they want, for whatever reason. Winnie the Pooh comparisons to Xi? Banned. References to Tiananmen or the Hong Kong protests? Banned. The US equivalent would be Facebook banning all memes mocking Trump, and of course, banning any and all activity related to BLM, social justice, protests, any kind of history that contradicts the ruling American party's official view of the world.

- WeChat is used to identify networks of civil antagonists. Have you said something about 'Hong Kong'? Well, your friends are going to be tagged and more closely monitored. The US equivalent would be government operatives sifting through the Facebook DB all day, using that information to monitor your friends' messages, because you said 'Black Lives Matter'.

- WeChat is used for a host of other things including payment etc. meaning the breach of privacy is considerably more significant. Everything you buy, everywhere you travel etc. tracked and monitored at all times.

[1] https://www.amnesty.org/en/latest/campaigns/2016/10/which-me...


"How is having a popular domestic social network in China different form having a popular domestic social network in the US?"


> The US 'equivalent' would be the CIA having staff at Facebook to intercede in policy decisions and to make sure 'The Man in the White House' has his policy objectives met.

1. You think that US intelligence agencies don't have any staff inside Facebook?

2. I mean, Joel Kaplan is right there. Not even a secret.


Good question, but it helps me make my point:

The US (and surely Russia, China) all have clandestine operations within Facebook. This is not part of a 'deal' with Facebook, it's regular spy-craft.

But Facebook does not drive its policy around the wishes of clandestine operatives within the company!

The US and Russia surely have clandestine operatives within Tencent as well.

But the CCP has something entirely different: legit, out-in-the-open entities there to oversee and ensure that CCP party policy and Xi's orders are effectively executed. If the executive team at Tencent didn't 'get with the program' they would be forced out. While they probably don't care about regular operating matters, they do in fact 'hold the real power'. For example, if Tencent decided not to censor certain subjects, action would be taken very quickly.

Not only does the CCP have minders in Tencent/WeChat - they have them in almost every Chinese company [1]. These 'inner minders' are literally the direct apparatus of state control within the ostensibly private economy.

When we talk about 'state control of the economy' - this is literally it.

The US corollary would be Government officials in every single US private company, overseeing that everyone adheres to GOP policy for example. So not just Facebook, but Cisco, Disney, GM, Mattel, CNN, Morgan Stanley, B of A, Goldman - etc. - many of which would also be directly owned by the US government, as of course in China most major banks are nationalized. If the CEO of Goldman doesn't do what Mr. White House and 'The Party' want him to do - he'd be 'out' and someone more 'pragmatic' about their capitalism (i.e. make money but bend the knee where they have to) would be 'in'.

From TheGuardian [2]: "(in Xinjiang) there are QR codes on people’s doors for when the party goes in to check on who is in. If someone leaves through the back door instead of the front door, that can be considered suspicious behaviour." - one of many chilling examples of control that is enabled by private companies, controlled 'from the inside' by CCP staff.

[1] https://www.theguardian.com/world/2019/jul/25/china-business...

[2] https://www.theguardian.com/technology/2019/oct/26/china-tec...


> China sidesteps this issue completely by using home-grown platforms like WeChat of course.

Chinese avoid the problem of being spied on by their government?


China avoids the problem of figuring out how to spy on US-controlled Internet services, by making their citizens use domestic products instead.


I read this as 'China took a direct route to spying'.


these are all just battles though and all that losing these battles does is pass more and more power to the technical class.

I know where I'm watched and how not to be watched; I know how to very easily hide the odd activity. The average citizen no longer gets that for free, and it really sucks for them, but remember that's what this has always been about: trying to improve the lot of the average person, not us. We're fine because we know how, and can act on it.

So it's elliptic-curve time for messengers, but everyone who "knows" will get some European or underground US piece of software on their phone via an unofficial app store.


That HTTPS was established as the de facto standard was a huge success, and I don't think it will go away again.

Freedom only works with some responsibility, but it seems to me that state level actors just try to keep up with the ad and tech industry in collecting data.

The EFF tries to blame it on Barr, but I think the wish for control is bipartisan, especially since a generation is in charge that doesn't really understand the problems of this data collection. Not that their younger compatriots seem more promising in assessing actual problems.

There might come a point where using an Asian or African social network would be preferable, but that is not something on the mind of common social media users. Sure, they do it if the product has appeal, like TikTok. On bad days I wish users of social media (the self-presenting kind) would be banned from all other sites.


Wrote Feinstein and got a reply (?!) that EARN IT is “misunderstood” and that she won't stop supporting it, womp womp. Why does this bill have any bipartisan support?? Would expect there would at least be some large special interests funding an opposition... big companies rely on encryption to keep secrets.


That's not exactly surprising.

Feinstein is always on the wrong side of copyright/patent/encryption bills. Just look up her voting record. She co-sponsored PIPA. See also https://www.wired.com/2016/04/senates-draft-encryption-bill-.... She co-sponsored the Sonny Bono Copyright Term Extension Act, too.

She's hopeless on this.


Many people have a delusion that just because they find that one major party is bad then the other one must be good. It's part of a common psychological phenomenon that has us overlook flaws in our allies/friends/family because we need them and can't survive alone.


> big companies rely on encryption to keep secrets.

A lot of big companies have old and crusty leaders / owners, who do not understand the terms of this problem.

I wouldn't be surprised to see this bill pass and then get nerfed in a few years, once big corporations actually experience the operational cost and lobby to have it rolled back / made it optional for them.


The distinction between the parties is little more than an illusion.


There is a clear need for an opposite law which will make backdoors illegal. Who could fight for it?


It's really easy. You just have to say that the backdoors are a method to stop [very big bad] from happening and that without them, you're supporting [very big bad].

The thing we have to come to as a society is a willingness to accept that sometimes [very big bad] will happen, but that a lack of backdoors is more important.


Backdoors are a method to allow predators to spy on children online. If you support backdoors, you're supporting predation on children.


Sadly Australia did the opposite - they can force us to backdoor applications and punish us for refusing. Australia is not the place to look for security applications.


As an Australian, my plan is to route most of my traffic through a self hosted VPN on a server that resides outside the five eyes (and possibly also the extended thirteen eyes).

Yes, there are insecurities and holes in the above, pending the methods of implementation, but it's one level of potentially many.

And whilst it's good to know that there are technical workarounds such as this, the real work, the real progress for society, is to make these technical workarounds unnecessary by, as the EFF says, contacting your representatives and letting them know what their constituency thinks. Politics, sickeningly, is the only avenue for worthwhile change.


Ah, the old penal colony mindset where the citizens are secretly prisoners.


When asked about how the law would ban certain mathematical operations, IIRC the Australian PM said something along the lines of "The laws of mathematics should be subservient to the laws of Australia"


Yeah, good luck with that. Let us know how it works out for you, Australia.


Except it’s not a secret any more.


Very few elected politicians of either party.


PGP was specifically created to combat this sort of legal threat. It is proof that encryption is ultimately controlled by individuals. Its identity management is completely decentralized, and it works over pretty much any medium.

It seems that the world has somehow forgotten the lesson. People can only see the centralized systems and don't realize that the problem they are trying to solve is not solvable in general. Encryption is unstoppable.

* https://en.wikipedia.org/wiki/Clipper_chip
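The point that encryption is ultimately just math anyone can run is easy to demonstrate. As a toy illustration (not PGP, which layers public-key cryptography and identity management on top), here is a one-time pad in a few lines of standard-library Python:

```python
import secrets

def otp_encrypt(plaintext: bytes):
    """One-time pad: XOR the message with a random key of equal length.

    Returns (key, ciphertext). The key must be truly random, kept
    secret, and never reused for another message.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def otp_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, ct = otp_encrypt(b"attack at dawn")
assert otp_decrypt(key, ct) == b"attack at dawn"
```

Under those key-handling conditions the one-time pad is information-theoretically secure; PGP exists precisely because exchanging such keys in advance is impractical, which public-key cryptography solves.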


You don't seem to understand that this is not about us, but about the average joe who will never learn about PGP.


Average Joes routinely use PGP on the darknets. So we even have a good current example of how encryption used in crime can come in a form that this legislation is powerless against.


I think if average joe/jane had a definition, it would exclude people who use Tor.


Encryption might be unstoppable, but they can always just assign you an IPv6 address and encryption fingerprint that becomes your new online social security number, required from the compiler on up.

Notarize your apps buddy and sign into your app account... ;)


The threat here is not against anonymity, it is against encryption.

You can make as many PGP identities as you want and associate them with any sort of identity you desire, so there might be a political point available with respect to anonymity as well in some situations.


Would this have any effect on software developed outside of the USA? The USA is becoming increasingly irrelevant on the world stage. Couldn’t a company in some other country create a browser with encryption that any of us could use? Could Apple move iOS development offshore and continue creating a mobile OS with encryption?


Apple has already stated (with regards to China) that they will respect the laws of the nation they are offering service in. So while they put up a constant assault against privacy invading moves in the US, they handed chicom the keys to iCloud with zero pushback.

If this passes you can be sure anything digital is no longer safe from alphabet agencies. Sadly the younger generation has no concern for privacy and the older generation likely has no clue their data is being siphoned. Advocates are too much of a minority to make a difference.


Already today, as a European iOS developer with your apple developer program contract counterparty being their European corp, you still have to deal with US export restrictions and self reporting paperwork if you make a single HTTPS call in your iOS app (because somehow, your app is being exported from "iTunes in the US" even when both you, your app, your target audience, and the counterpart of your development program are European entities)...


Not necessarily affecting software developed outside the US, but most likely if your non-US company has communications in your app and US customers, you'll be screwed. For consumers, vice versa with using apps from FAANG since your communications will now be affected by nonsense US laws. This is of course unless the big companies want to split their apps into US & Worldwide versions, which is probably not happening and would cause other forms of feature drift if they did so.


I wouldn’t bet against a USA-RoW split any more. ~90% of internet users are outside America now, and even accounting for income differences more than 2/3rds of the money is outside the USA.


The world is welcome to compete with American tech, and has been trying very hard at it for quite a long time.


Hmm. I'm not sure the Chinese government would agree with your first statement. They seem to be getting significant US opposition to their technology advancement.

Yes, I'm aware of some of the issues surrounding IP copying. I won't call it theft because the word is contentious and there are those in this forum who don't think you can steal ideas.

I'm also aware of the US' own history of gathering IP to bootstrap its industrial efforts. I guess that "but you did it first" isn't a good defence in law (not really sure of that, after all IANAL) but it makes the US case weaker in my eyes.


> They seem to be getting significant US opposition to their technology advancement.

Do you think that's related to China allowing zero domestic tech competition, but demanding that their companies be allowed to compete in the rest of the world?


America hasn't been trying that hard to make a power grab on shared tech like encryption, so it's possible that the rest of the world will have a fighting chance.


And America is welcome to put up a fair fight instead of banning competitors.


If a fair fight is one where everybody plays by the same rules, then America fights as fair as you can get.

The main issue with something like this though isn’t really about the rest of the world catching up to the US technologically. It’s a question of whether service providers would choose not operating in the US over following their rules. But given the size of the US market, you’d find a lot of service providers choosing to follow the rules.

Which is incidentally the same way EU nations enforce their laws on companies like Google. And how countries like France, Germany and UK would enforce the anti-encryption laws they’ve been trying for years to get on the books.


Most countries capable of doing that already have similar or even stricter laws. In a few decades there will be no country in the world that doesn't do something like this. It's over; the privacy war is already lost.


Have a nice day, DOJ.


I remember someone from here posting about how the user contacted Feinstein and got the most boilerplate response possible. They honestly do not care about the citizens at this point.


If you want them to care, stop by their offices and have a really rational talk with their staffers and the representative. If you can explain things with stories (for example, where a backdoor was used to crack ___, resulting in very personal damage to ____), you will change minds. Get 20-30 tech leaders to go meet with Feinstein and discuss how this will affect them. If she won't budge at that point, then maybe y'all need to vote for someone else. Encryption is easy to explain: if the good guys can use the backdoor, the bad guys will use the backdoor, too. The cops don't get keys to your car, safe, or home for the exact same reason.


You're assuming good faith in elected representatives and their staff.

I wonder if anyone in the last decade has _ever_ walked into their rep's office, talked to them, and changed their vote on a key issue. I really doubt it.


I have. Multiple times, but not every time. Reps get close to zero direct interaction with constituents. It's usually filtered by a group with an agenda (petitions, campaigns). When you go as a citizen and talk (not scream or debate) about an issue that really matters, you will get their ear. You may not get the vote you want, but you will be taken seriously, and the staff and rep will discuss what you are bringing to them. If you have 2-3 people approach your rep with the same problem, it will have the same effect as a petition with thousands of signatures.


Perusing /r/politics, all I see is astroturfing, and it garners tens of thousands of upvotes.

The age of true democratic participation is over. Politicians have no need to care about the ordinary constituent to win elections.

They just need to win over the elite, and know how to deliver their scripts.


Actually, reality is counterintuitive. Being in Congress is a terrible job where you do not even have time to read half of the bills you are going to vote on this week. So you count on others to help you. Mostly this means staffers, interns and lobbyists, because those are the people who show up and help. If you as a citizen want to make change, then all you have to do is show up in person and have a friendly conversation with the rep, or even just with their staffers. Come with easy-to-understand stories, and be able to rationally talk about the other side. E.g. I realize that law enforcement has a strong story about encryption, but there is a reason we don't require home, car and safe buyers to give law enforcement a master key. Yes, some bad cops might steal, but what is worrisome is that the bad guys might get the keys. That's exactly what they are asking for here: a master key.


For more information about how to interact and influence your government (it is, after all, made up of us, at least in the US -- and in other places too, but I'm American so I focus on that), check out Take Back Your Government[0] by Robert Heinlein.

[0] https://en.wikipedia.org/wiki/Take_Back_Your_Government


If you're so scared of your government that your activity might get you locked away for life, and it doesn't involve murdering someone, then you need better government, not better security, because those kinds of governments will just say you murdered someone without any proof and lock you up or kill you anyway if they don't like you.


Better government and better security are not mutually exclusive. You can work towards both simultaneously, and claiming that one negates the other is a logical fallacy.


Government exists solely for the purpose of negating the security and life of bad actors who want to take away your rights. That is its function, via the social contract.

That can only be broken in one of two ways. Either people don't care enough or they grow so distrustful of it that they don't participate in it or keep it in check.

When things like that happen you get elected officials decided by less than 40% of the country, anti-maskers, and asymmetric warfare in the streets to combat "the man". It's untenable.


And actually, the latest xkcd says it better than me:

https://xkcd.com/2368/


I see heavy security, with zero recourse for anyone to break it, plus anonymity, as the single biggest problem with the Internet, not the lack of it.

We should all have to log on with our identification through a blanket "know your customer" law like the ones used to combat money laundering in banks, and at the least our country of origin should be displayed. Other than that... You want encryption? Fine, but if your ID links you to 4chan offshoots planning domestic terror attacks, I should be able to report your address to the authorities, and there should be social repercussions for bad behavior. And I need to know whether you're Russian or from the US, and that you're human, before I engage in US politics with you.

Music got worse since Napster, Sean Parker was the guy who got Zuckerberg funded, Google is evil, and John Perry Barlow is dead.

Wake up.


I respectfully disagree with most of this. The last part was a bit all over the place though, not sure where exactly you were going.

Others who know history much better than me are far better equipped to debate against your position. If you really want a challenge, I would invite you to reply to the top comment in this thread by Jon_Lowtek to help open a discussion on the deeper merits of your position. For me personally, your arguments aren't very persuasive.


I'm saying that the giant corporations and the giant governments we used to fight against died or became completely ineffective at their jobs, and the only thing that happened was that new giant corporations took their place, far more unethical than what they replaced... funded and led by the same people who were once the underdogs!

I am aware of history enough to know that the Internet was designed by a bunch of people who talked on ham radio and wanted a secure place to sell drugs to each other without being caught by the fuzz. They hated Ma Bell. That's why it's designed the way it is. Then later came the RIAA and the rest. I know all of that, and if anything it has hardened my position on this, not weakened it. We don't live in the 1960s or the 1990s anymore, and what's worse is that everything people said during that time, about how society would turn out if we went "full retard" into "fight the man" hipsterism and the subsequent deregulation of everything, actually happened. Nobody laughs at Al Gore or Tipper these days, or Lars Ulrich for that matter.

Do you like musicians making a penny on a song, leaving no room for art, just lowest-common-denominator crap; the rise of the marxist/fascist camps that are hurling what was once a promising country getting over its growing pains straight into 1920s/30s Germany at lightning speed; the totally unfiltered and unmoderated filth that is 4chan, etc.? Or do you just ignore it because "some day blah blah blah a dictator might take control", even though we are the closest we have ever been to a dictatorship because of faceless social manipulation techniques where we have no clue who or where anyone we talk to is, whether they're paid, etc.?

Not everyone is an engineer out looking for how things work. It's the same error the founding fathers made with the Enlightenment: thinking everyone would become smarter if you just opened everything up. It's complete baloney; people will just use the power vacuum to seize power for themselves because they make the most noise. Nobody cared about Aaron Swartz except the hacker community. Snowden sold out to the Russians from the beginning. And you're sitting there still in the year 2000, when all this stuff was new and the only ones who spoke were nerds and we all agreed on basically everything.

The point is that everything that this theory set out to solve, became worse. More nationalistic, more poor musicians with even bigger giant platforms controlling their art, more corrupt, more propagandized, etc. Not less. Everyone knows it. The other point is that instead of tearing stuff down and engaging in subversive activities for the same ends, maybe it's time to focus on making the systems you have better. Not turn it into some weird mix of The Matrix, Philip K. Dick, and William Gibson. They're just as bad of an instruction manual as 1984 was to be honest.


I know of no government ever that has been so benevolent that its citizens didn't benefit from privacy and security.


This implies that said governments are bad. Or are perceived as such.

You know who benefits from VPNs, privacy, and security?

The proud boys. Russia. China. Steve Bannon. And their bots, that's who. They are a threat to civil society at present, and they're using that very privacy and security to organize to destroy the thing you're trying to "protect" by drinking Barlow's kool-aid.


I prefer the idea of internet governance via a third party instead of the state. Flip it on its head: instead of the government spying on everyone, we're in control of the surveillance democratically, through consensus. But there needs to be governance of some kind.


Those people also benefit from cash, opaque walls, and freedom of speech. That bad people benefit from something is hardly a damning case against it.


What I don't get: yes, you can force Facebook to implement backdoors, but how would you do that with an open-source project like Matrix, or even fully open-source Android?


Many options:

- block access ala GFW, ensuring that most people will have difficulty accessing it or using it

- block access to any data you cannot decrypt or from an endpoint you cannot backdoor

- go after creators and ensure some kind of backdoor is inherent to the project

- shut down projects by exerting pressure on developers

- run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)

- do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms


>- block access ala GFW, ensuring that most people will have difficulty accessing it or using it

Depending on the sophistication of that solution, you could imagine some forms of tunnelling (IP-over-X) being effective against it. Then of course, due to the complexity, this workaround will only be used by a tiny fraction of users.

>- block access to any data you cannot decrypt or from an endpoint you cannot backdoor

Steganography would be a solution to this: you can decrypt the cat pictures I'm exchanging with friends, but you may not be able to notice that those images carry hidden content (which may itself be encrypted).
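The cat-picture idea can be sketched concretely. Below is a toy least-significant-bit (LSB) embedding over a raw grayscale pixel array; it only illustrates the principle, and all the names are invented. Real steganography tools work on compressed image formats and are far more careful about statistical detectability.

```python
# Toy LSB steganography sketch: a "picture" is just a flat list of 8-bit
# pixel values here. Hiding a message changes each used pixel by at most 1,
# which is visually (though not always statistically) undetectable.

def embed(pixels, message: bytes):
    """Hide the message bits in the least-significant bit of each pixel."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for message")
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # clear the LSB, set it to a message bit
    return out

def extract(pixels, length: int) -> bytes:
    """Recover `length` bytes from the pixel LSBs."""
    data = bytearray()
    for b in range(length):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

cover = [128] * 1024              # a flat grey "image"
stego = embed(cover, b"hi")
assert extract(stego, 2) == b"hi"
```

Anyone inspecting the picture sees plausible pixel values; only someone who knows to read the LSBs recovers the payload.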

>- do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms

this seems to be a guaranteed-to-succeed solution. Probably much better than

>- run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)

since there is always a risk that this may backfire and encourage resistance in some groups


>go after creators and ensure some kind of backdoor is inherent to the project

>shut down projects by exerting pressure on developers

The developer would probably just go to the press with that; no need to damage your own privacy-centric project. Sure, the NSA can say they never did it, so the developers can ignore it, OR they openly say it was them, and then the project can simply move to another country.

>run some kind of propaganda campaign on the evils of using unsanctioned software (supporting terrorism etc.)

>do nothing, knowing that most users will avoid using anything that isn't one of the major web platforms

Those are probably the only viable approaches, but on the other hand it's pure marketing for the product, like banning Telegram in Russia.


What is the developer going to the press with?


Publication


what are they going to publish, and why?


>shut down projects by exerting pressure on developers

Because I don't like pressure from secret services... do you?

OK with one exception...Xenia Onatopp


Ok, but how is the press involved in all this? You think the press protects people against pressure from secret services?


It's only important to break things like Matrix if you want to catch dedicated criminals. If you're just trying to sway elections and blackmail people, then Facebook (WhatsApp), Gmail, etc. more than cover what you need.

Plus, if you really, really want to know what's happening there, call the NSA. They'll send a few chaps to join the Matrix dev community. Or they'll use other tools to access the machines in question. And you can do that to the 0.01% of people who actually switch after mainstream tools become insecure, in a way you can't when there are billions of people using encryption by default.


>Plus if you really really want to know what's happening there, call the NSA

Not sure if they give me any information as a Swiss citizen :-)

>They'll send a few chaps to join the matrix dev community

That's probably more on the James Bond side of reality

>Or they'll use other tools to access the machines in question

Yes, that's what they probably will do, but that is targeted surveillance, not a backdoor/broken encryption mandated by law.


We know from Snowden that the NSA actively and covertly submitted code to open-source projects, including Android/Linux:

https://www.androidauthority.com/nsa-android-code-239118/

They're also involved in the Coreboot alternative BIOS project:

https://www.tomshardware.com/news/nsa-contributes-low-level-...

One of the key revelations from Snowden was that all that stuff that seemed far fetched or paranoid to us, wasn't just happening, it was routine and much more advanced stuff was also going on.

It's also much easier to target Matrix etc when you don't have to target all of Facebook because you have a backdoor there. Reducing the NSAs workload by 99% makes the other 1% much easier because they have 100 times the resources to spend on it.

You should also consider, as a Swiss citizen, that US law is increasingly world law. If you sit in Switzerland and contribute to a project banned by US law, they may well have you extradited to spend the rest of your life in a US prison. They're doing that to Assange right now. They've done it to others as well. Locking up foreigners is quite in style in the US at the moment... I haven't read the bill, so I don't know what the criminal sanctions are for failing to provide backdoors...

Best of luck


So to hide a backdoor in open-source software, the best approach is to tell the world that you (the NSA) are actively working on the project? Like SELinux? Do you think they are stupid? They work on those projects to make their own infrastructure more secure; don't you think everyone watches their commits closely?

>You should also consider as a swiss citizen that US law is increasingly world law

That's bullshit; it was probably half true in the Cold War era, but certainly not today.

> They're doing that to Assange right now.

That's England, you know those little dogs of yours.


Per the links above, they didn't tell anyone, Snowden outed them.

In the Cold War it was true of the Western world, and the US's use of it was limited because the US wanted to look gentle and reasonable. Today it's true for most of the planet, and the US wants to look tough.

If you don't like the Assange case (I agree the Brits are lap dogs, but that case actually got kicked off in Sweden and was handled under EU law much more aggressively than in the UK), try the FIFA case. The US ordered Switzerland to arrest and extradite foreign nationals for trial, many of whom had never set foot in the US, and Switzerland obeyed...

https://www.washingtonpost.com/news/the-fix/wp/2015/05/27/ho...

I don't like any of this. But it's time to abandon our innocence. If you're working on any meaningful project in encryption, the NSA is at least aware of you. The US is happy to use covert methods to undermine you if they think it's worth it (and the bar is low). If you're important enough (or a DA wants to expense some flights) they'll use overt methods. Proceed at your own risk, and don't assume that America banning encryption doesn't make it illegal for you just because you're a Swiss citizen in Switzerland.


>The US ordered Switzerland to arrest and extradite foreign nationals,

No problem with that, from the article:

>With wire fraud, one needs a wire that originates in the US

Sometimes we even ask for it...that's normal international business:

https://www.admin.ch/gov/en/start/documentation/media-releas...

>Proceed at your own risk and don't assume that America banning encryption doesn't make it illegal for you just because you're a Swiss citizen in Switzerland

Yeah, I'll stop here...

BTW from your first article about android:

>So, if it’s not looking to plant backdoors, what’s the NSA’s business with Android? Ironically, the agency has been working to make Android more secure.

AND

>It is just as preposterous to think that the best way to gain access to any operating system is to publicly announce that you are contributing to the OS, and make the tainted code accessible to anyone with an interest in it.

So it was NOT Snowden, but the NSA itself.

Second Article about Coreboot:

>Myers published a paper about STM last year on how NSA’s STM implementation could work. All Coreboot code, including all the STM contributions from the NSA, are open source, so anyone could verify that there is no backdoor in there -- in theory.


You enforce it on the main providers. For Android, force Google to supply the backdoor (In Google Play Services), for Matrix force the hosted Element.io instance to provide it.

Even if the project is open source and development is distributed, there is often a major entity behind it driving it on which the requirements can be enforced.


>For Android, force Google to supply the backdoor (In Google Play Services),

Again, I was talking about open-source Android, not the Googlified closed version.

>for Matrix force the hosted Element.io instance to provide it

How? If they are not US citizens?


The most interesting and important parts of the smartphone stack are closed source.

The parts that handle making calls, connecting to cell towers, etc.

They are called "Radio" firmware.


My question was about open-source software, not closed-source corporate BS.


Maybe force isn't required because they have some secret P=NP proof.


Yes, all the Gandalfs work there... right.


systemd


Politicians can be Luddites too? They see the world moving forward, and as it escapes the reach of their small grabby hands, they enact the "EARN IT" act as an attempt to return to the times when their little grabby minds were able to watch and control others.


I have been wondering if there is a way to satisfy law enforcement without breaking encryption or adding backdoors. An idea: what if platforms allowed law enforcement (with a warrant) to conduct rainbow table attacks against encrypted content of a specific user? In other words, what if platforms allowed law enforcement to determine whether a specific known object (e.g. a known photo or video) was sent / stored by a user rather than decrypting all sent or stored objects?

This would allow law enforcement to track the spread of a known piece of content while avoiding breaking encryption. Perhaps it could be a compromise.
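One hedged sketch of what such a scheme could look like (every key and name here is invented for illustration, not a real design): the provider keeps a deterministic keyed tag of each stored object, and a warrant lets law enforcement test a specific known file for membership without decrypting anything else.

```python
import hmac, hashlib

# Hypothetical warrant-gated known-content matching. The provider stores an
# HMAC tag of each object under a provider-held key; given a warrant and a
# known file, it checks membership without decrypting any other content.

PROVIDER_KEY = b"provider-secret"  # hypothetical provider-side key

def tag(content: bytes) -> bytes:
    """Deterministic keyed tag of an object's bytes."""
    return hmac.new(PROVIDER_KEY, content, hashlib.sha256).digest()

# The provider indexes tags of a user's stored objects alongside ciphertext.
stored_tags = {tag(obj) for obj in [b"vacation.jpg bytes", b"memo.txt bytes"]}

def warrant_check(known_object: bytes) -> bool:
    """With a warrant, test whether this specific known object was stored."""
    return tag(known_object) in stored_tags

assert warrant_check(b"memo.txt bytes") is True
assert warrant_check(b"unrelated bytes") is False
```

Note that deterministic tags are exactly the property that semantically secure encryption is designed to eliminate, so this is a real weakening of the encryption, not a free compromise.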


>I have been wondering if there is a way to satisfy law enforcement without breaking encryption or adding backdoors

I'd say "how about having law enforcement do, you know, police work to catch the bad guys?"

Police can already get warrants for just about anything -- as long as they can convince a judge they have probable cause[0] -- without too much of a hassle already.

Giving them keys to unlock everything is the wrong way to go about it.

Get enough evidence to convince a judge (not that hard) and you can get a warrant.

However, that doesn't mean anyone, even criminals, should be forced to make it easy for them.

Law enforcement obviously has way too much time on their hands, with the amount of lobbying they do to increase their ability to chip away at civil liberties and privacy.

Crazy thought: maybe they should use those resources to do real police work instead.

[0] https://www.law.cornell.edu/wex/probable_cause


There is no situation in which the spread of "content" should be a crime. Any encryption that allows LE to track content is broken.


Good points.

Remembering back to the PRISM disclosures, metadata alone is enough to build a surveillance apparatus. So I guess even without decryption of all objects, confirmation of the existence of known objects could still be enough to conduct mass surveillance or enable other kinds of abuse.


>to determine whether a specific known object (e.g. a known photo or video) was sent / stored by a user rather than decrypting all sent or stored objects?

It is standard practice to make it impossible to detect identical plaintexts. What you are describing would count as a backdoor, so you might as well make the backdoor explicit and save yourself the brute forcing (rainbow tables?).
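The point about identical plaintexts can be shown with a toy randomized cipher (a teaching sketch only, not a real cipher; real systems use AES-GCM or similar). Because a fresh random nonce is mixed into the keystream, the same message encrypts differently every time, so a known-object lookup over ciphertexts finds nothing.

```python
import os, hashlib

# Toy randomized stream cipher: keystream = SHA-256(key || nonce || counter).
# The random nonce means identical plaintexts produce unlinkable ciphertexts.

KEY = b"shared-secret"

def encrypt(plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(KEY + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    ct = bytes(p ^ s for p, s in zip(plaintext, stream))
    return nonce + ct  # nonce travels with the ciphertext

c1 = encrypt(b"same cat picture")
c2 = encrypt(b"same cat picture")
assert c1 != c2  # identical plaintexts, different ciphertexts
```

Matching a known object against such ciphertexts would require either the key or a deliberately deterministic (i.e. backdoored) construction.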


Maybe this is possible instead with homomorphic encryption.


Well I emailed 'em again. Don't think there's much hope when your reps are McCaul, Cornyn and fucking Ted Cruz. I hope this is Lindsey Graham's last piece of legislation though.


As someone represented by Feinstein: there is not much hope here either.


The question is: what is really secure?

Running Telegram over multiple VPNs?

Accessing Gmail over multiple VPNs so Google doesn't get to know where you're 'really' logging on from?

Making your own VPN network over a combination of AWS, G-Cloud, Azure, DO and Aliyun to 'hide' your actual location?

People's thoughts?



Rotate through various VPNs as your initial portal, and throw tor in there as well.

Share your VPN with friends and family to provide some 'noise' (although that may be worth little overall - maybe VPN through friends and family home connections as well).


It really depends on what you want or what your personal situation is. I'll give my thoughts on a few possible adversaries.

- 1. An individual attempting to perform an MITM attack on you. The classic free wifi adversary you've probably heard about. There's little risk of this individual using the sites you visit against you so you only care that they can't manipulate your usage of said sites: Use HTTPS and you'll be fine.

- 2. Your ISP. You don't want them to see where any of your traffic is going because you don't trust them: Use a VPN. Shift the trust to either a VPN provider or a cloud-hosting provider by running your own VPN.

- 3. Your Government. Let's assume they can see all of the traffic within the country and you don't want them to associate your traffic with you: This is the step where it becomes challenging; you want to blend in, not just add more security steps. Ideally you want your traffic to leave the government's jurisdiction and, if needed, reenter looking like normal traffic from other countries. Tor is a good option here: there's a reasonable amount of traffic on the Tor network to hide in, and your traffic is almost guaranteed to leave your country at some point. Alternatively, choose a VPN provider that resides legally outside of your country and choose a server that resides physically outside of your country. Both options will move your traffic outside of the jurisdiction of your government, so this should be sufficient within the confines of the current example.

- - What about a self-hosted vpn in a region outside my country? If you ever connect to a server inside your country the full path of your traffic will be able to be seen by your government.

- - What about multiple self-hosted VPNs outside my country? This is an improvement on the previous issue, but it's unlikely to prevent your traffic from being correlated to you on timing alone.

- 4. God's Eye. Your adversary can see all internet traffic everywhere on Earth: Good luck. Maybe use Tor over a popular VPN service to increase the difficulty of correlating your traffic to you? Hope the Nym mixnet becomes popular?

Some additional considerations:

- What if I don't trust a VPN provider? You probably want to hide your traffic in their traffic, so pick a VPN provider that requires no user info to sign up and lets you pay with cryptocurrency or cash. I know Mullvad fits this requirement; there are probably others as well. Self-host a VPN and connect to the VPN service through your own VPN: now neither the VPN service nor the cloud provider has a full view of your traffic (WireGuard makes multihop VPNs easy). You could do the same by using 2 VPN providers.


I can only repeat myself: if you are US-based, privacy is a literal oxymoron for you. Not saying that's not the case anywhere else, but it's especially bad for the US and its biggest allies (Saudi Arabia and Israel).

Again, go ahead, give me all the downvotes.


>but it's especially bad for US and it's biggest allies (Saudi Arabia and Israel).

I'd say that the UK, Canada, Germany, France and the rest of NATO[0] are the US' biggest allies.

Followed by countries like Japan, South Korea, Australia and even Mexico that are much more important as allies to the US than either of those places.

Saudi Arabia and Israel are just incidental players for the US. Saudi Arabia for the oil (The Saudi state oil company is now called Saudi Aramco, but used to be called Arabian-American Oil Company[1]), the military bases, and the arms sales. Israel mostly for the arms sales and to placate many (but certainly not all) Jewish folk in the US.

[0] https://en.wikipedia.org/wiki/NATO

[1] https://en.wikipedia.org/wiki/Saudi_Aramco


I don't get the sudden press on this bill. It was introduced in March, and it was added to the Senate calendar in July. There was one hearing back in March, yes, but there isn't even a companion bill in the House at this point. Don't get me wrong, it's bad legislation, but is there some imminent action planned?

https://www.congress.gov/bill/116th-congress/senate-bill/339...


Keep trying until it passes. Use distractions from elsewhere to pass it like thieves in the night.


Let's be real: beyond this being bullshit, it's double-bullshit. Setting aside that it's a morally corrupt idea that wouldn't help, it wouldn't even work. You can't get rid of encryption. Code is easier to spread than either drugs or guns, and neither of those is as widely useful to criminals. How could anybody who's not completely incompetent think that this would affect more than just the average consumer? Who votes for these clowns?


Here is the contact form for the Senator who is Chairman of the Committee on Commerce, Science, and Transportation. I'm pretty sure that's the committee who would be involved with this bill.

https://www.wicker.senate.gov/public/index.cfm/contact


It already passed through the Senate Judiciary Committee unanimously


i didn't know that. what does that mean for the fate of this bill? doesn't that just mean that the senate must consider it? does the commerce committee have anything to do with it? since they oversee commerce and basically the internet (at least from what little i know...)


>i didn't know that. what does that mean for the fate of this bill?

IIUC, for this particular bill[0], the next step in the Senate is to debate it on the floor and then vote on it.

The companion House bill[1] has been introduced, but has not gone through committee or been voted upon.

The Senate would need to pass the bill. Then, separately, the House would need to pass their bill.

Then, the House and Senate would reconcile their two bills, and the resulting compromise bill would once again need to be voted upon and passed by each chamber.

Assuming that happens, the President would need to sign it before it becomes law.

Most Americans (I hope) could tell you that. The links below detail the various actions taken by each chamber, as well as the sponsors and text of the bills.

[0] https://www.congress.gov/bill/116th-congress/senate-bill/339...

[1] https://www.congress.gov/bill/116th-congress/house-bill/8454


I know all that, but I was wondering what role the Senate Judiciary Committee plays in that process.


>I know all that, but I was wondering what role the Senate Judiciary Committee plays in that process.

A good question.

IIUC, none at all.

Once the full House and Senate have passed their bills, a conference committee[0] will be convened to create a single bill to be voted upon by both the House and the Senate.

One would expect that members of the relevant Senate and House committees would be part of the conference committee, but that's not necessary, and leadership generally chooses the members of a conference committee.

Once the conference report is complete, the bill then goes directly to the floor of each house for debate and a vote.

The specific committee (in this case the Senate Judiciary Committee) that approved each house's bill is not involved.

Of course the members of that committee get to vote on the bill from the conference committee just like every other member of that house.

[0] https://en.wikipedia.org/wiki/United_States_congressional_co...


I actually kind of hope it passes just because it will stimulate the development of decentralized encrypted communication services VERY fast. What good is this law if it only motivates people to make their communications even more secure?


I guess Zoom won't have end-to-end encryption after all... if this gets implemented, I wonder how it will affect non-US services. Perhaps PaaS (Privacy-as-a-Service) will be an EU export product?


Does this apply to individuals or people hosting their own services or services for their friends?

Also, can someone come to me with a subpoena and a gag order for hosting email for my friends?


The EARN IT bill is already unenforceable. Open-source crypto already exists and is available everywhere. So yeah, good luck stopping people from using it.


This means anyone implementing and using effective crypto will be flagged for closer monitoring. A jury is made up of twelve people who aren't smart enough to get out of jury duty. It won't be hard for a prosecutor to convince them that going out of your way to use encryption is proof of possessing child pornography. Or terrorist stuff. Or drug stuff. Or whatever else people are terrified of at the time.


>A jury is made up of twelve people who aren't smart enough to get out of jury duty.

You're assuming that every citizen is either ignorant or disdainful of their responsibilities as citizens.

That's sad. I've never tried to get out of jury duty, because I see that as part of my responsibility to my community.

That said, in the half-dozen or so times I've been called, I have yet to be selected for a jury.

But I'll do the same next time, because it's the right thing to do.

I'd add that if I were involved in a trial, I certainly wouldn't want someone who would prefer to shirk their responsibilities as a citizen on my jury, that's for sure.


Then start sending random bytes to everyone. Random bytes are indistinguishable from encrypted data. If everyone is in possession of what appears to be encrypted data, then it's no longer reasonable cause for suspicion.
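A quick sketch of why that works: ciphertext from a decent cipher and raw random bytes are both statistically flat. Below, byte-level Shannon entropy is compared for `os.urandom` output and a one-time-pad "ciphertext" (the pad standing in for real encryption); both sit at the 8-bits-per-byte maximum.

```python
import os, math
from collections import Counter

# Compare the byte-frequency entropy of pure random bytes vs a one-time-pad
# encryption of a highly redundant message; no simple statistical test
# separates the two.

def entropy(data: bytes) -> float:
    """Shannon entropy of the byte distribution, in bits per byte (max 8)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random_blob = os.urandom(1 << 16)
pad = os.urandom(1 << 16)
ciphertext = bytes(p ^ k for p, k in zip(b"A" * (1 << 16), pad))  # one-time pad

assert entropy(random_blob) > 7.9
assert entropy(ciphertext) > 7.9
```

A law against "unexplainable random bytes" would therefore have to criminalize data that is, by construction, indistinguishable from noise.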


Unless receiving/storing unexplainable random bytes becomes illegal (like "forgetting" your password).


Right that's why you send random bytes to other people. You're forcing everyone else to receive or store random bytes. Thus rendering any actual enforcement of a law forbidding possession of encrypted data (or random bytes) nigh-impossible.

If someone gets charged for possession of random bytes, send the prosecutor and judge a bunch of random bytes and see if they're still intent on moving forward with charges.


But how do you force other people to accept your illegal random bytes? If people actually risked legal penalties for accepting them, I imagine that spam filters would get much stricter about deleting attachments from unknown senders, for example.


As a non-American, how would they be able to enforce something like this? Is it related to the HTTP specification?

What stops me from encrypting things?


When you look at the things the left and right agree on you can see who really runs the show.


We ought to support the bill. However, NO digital communication is secure. Start putting all important information in NON-digital form.


The easiest way to build traction around this effort is to write some stories suggesting Trump is trying to take away encryption (even though spying has bi-partisan support).

The Trump angle will pick up mass media attention and the 'orange man bad' crowd will activate to make sure this doesn't get far.

There may even be debates on encryption and people may actually talk about encryption, privacy and policy.


Wouldn't this work both ways?

"The Deep State wants to read your Facebook messages!", etc.


I was going to call my Representative, and express my disapproval to my Senators for being on the sponsors list, when I actually read the act. I really don't see what the EFF is talking about.

It seems like a very straightforward bill that makes things much better by explicitly removing any liability from companies for either having end to end encryption or for not creating backdoors.

Here's my breakdown of the text of the bill:

---

Create a commission of 16 people, half appointed by each party. Can only make recommendations that 14 of the 16 approve of.

4 shall have current experience in investigating online child sexual exploitation crimes, of whom, 2 shall have such experience in a law enforcement capacity; and 2 shall have such experience in a prosecutorial capacity;

4 shall be survivors of online child sexual exploitation, or have current experience in providing services for victims of online child sexual exploitation in a non-governmental capacity;

2 shall have current experience in matters related to consumer protection, civil liberties, civil rights, or privacy; and 2 shall have current experience in computer science or software engineering related to matters of cryptography, data security, or artificial intelligence in a non-governmental capacity; and

4 shall be individuals who each currently work for an interactive computer service that is unrelated to each other interactive computer service represented under this subparagraph, representing diverse types of businesses and areas of professional expertise, of whom 2 shall have current experience in addressing online child sexual exploitation and promoting child safety at an interactive computer service with not less than 30,000,000 monthly users in the United States; and 2 shall have current experience in addressing online child sexual exploitation and promoting child safety at an interactive computer service with less than 10,000,000 monthly users in the United States.

---

the Commission shall develop and submit to the Attorney General recommended best practices that providers of interactive computer services may choose to engage in to prevent, reduce, and respond to the online sexual exploitation of children, including the enticement, grooming, sex trafficking, and sexual abuse of children and the proliferation of online child sexual abuse material.

(A) preventing, identifying, disrupting, and reporting online child sexual exploitation;

(B) coordinating with non-profit organizations and other providers of interactive computer services to preserve, remove from view, and report online child sexual exploitation;

(C) retaining child sexual exploitation content and related user identification and location data;

(D) receiving and triaging reports of online child sexual exploitation by users of interactive computer services, including self-reporting;

(E) implementing a standard rating and categorization system to identify the type and severity of child sexual abuse material;

(F) training and supporting content moderators who review child sexual exploitation content for the purposes of preventing and disrupting online child sexual exploitation;

(G) preparing and issuing transparency reports, including disclosures in terms of service, relating to identifying, categorizing, and reporting online child sexual exploitation and efforts to prevent and disrupt online child sexual exploitation;

(H) coordinating with voluntary initiatives offered among and to providers of interactive computer services relating to identifying, categorizing, and reporting online child sexual exploitation;

(I) employing age rating and age gating systems to reduce online child sexual exploitation;

(J) offering parental control products that enable customers to limit the types of websites, social media platforms, and internet content that are accessible to children; and

(K) contractual and operational practices to ensure third parties, contractors, and affiliates comply with the best practices.

----

Amends Sec 230e so that child sexual exploitation law gets its own section, on the list of things that section 230 does not apply to, joining, among other things, all federal criminal law, all IP law, sex trafficking, and privacy laws. The text to be inserted looks similar to the current sex trafficking text?

Explicitly say that the entire list of exceptions to 230 does not create liability, either federal or state IF you are end to end encrypting and cannot decrypt.

Not creating backdoors does not create liability.

----

Go through all federal laws and change "Child Pornography" to "Child Sexual Abuse Material".

---

So overall I don't see the issue. Seems like a good law.


Thank you for the summary.

I guess the gotcha is in "preventing, identifying, disrupting, and reporting X", which seems impossible to do when communication is encrypted end to end.


Unpopular view, but unrecoverable encryption deployed at web scale is a cancer on society. While unrecoverable encryption provides additional privacy to law abiding citizens, it normalizes low level criminal activity at scale.

There has to be a better trade-off here that minimizes the risks of 3rd party access, and gives law enforcement and the intelligence agencies the tools they need to do the best possible job.

If I had to choose, I’d rather my consumer endpoints be hardened but have vetted and protected exceptional access mechanisms on the encryption.

In practice, this bill is likely to lead to cut corners by big tech, who won’t be legally mandated to actually build increasingly responsible encryption recovery mechanisms for LEO. This will enable big tech to say, “I told you so”, because they were simply doing the minimum amount that was required of them legally.


Would a bill really eliminate the concept altogether? It might make their business more expensive in certain parts but there are multi-billion dollar criminal empires that invest in things like their own submarine tech. Do you think e2e encryption isn't something they'd develop in-house & resell to each other?

At best it might help catch some street level crime but any serious organized crime (which arguably is a bigger problem because it's organized) will have the tools available anyway. Consider it this way. There's already a black market for security exploits with exploits frequently costing far more than they might actually otherwise be worth (there's a limited time utility before the exploit is patched). How much do you think e2e encryption would cost & do you think there's not going to be buyers & sellers for this? Especially since, unlike exploits, this is an infinitely distributable solution. I can sell to as many buyers as I want without risking my revenue stream.

On the technical side we've observed what happens with this stuff. We'd be one Snowden-style leak away from all websites instantly becoming vulnerable. Do you not think that might be valuable to adversaries of the USA?


>criminal empires

True, you can’t stop math but you can try to police it. You can regulate consumer access. Doing so means one less “gone dark” area, which makes LE job easier.

>security exploits

To your point about low level criminals: Now that the cat is out of the bag, yes, surveillance worked way better when people didn’t know about it. Yes, more sophisticated criminals will try to employ their own encryption. If I were in LE or the IC, I’d still rather not deal with the oceans of data produced by essentially unbreakable encryption via big tech.

Will address point about uncapped value to an encryption exploit below.

>Snowden-style leak

Yes, which is why it is imperative to continually improve and audit such systems, including maybe removing such single points of failure as you noted, both from an insider threat perspective as well as from exploit discovery processes.

It would be helpful to consider how to build recoverable encryption in a way that minimizes the risks of the existence of the exceptional access mechanism, from all angles: technical, social, etc.


> Yes, which is why it is imperative to continually improve and audit such systems, including maybe removing such single points of failure as you noted, both from an insider threat perspective as well as from exploit discovery processes.

Can you join me on a journey to build this hypothetical world to figure out how this addresses the Snowden leak?

Let's imagine a world where every single server had a registered backdoor key. This key isn't used for access directly. No, we're smart. It's instead used to sign one-time-use, timestamped keys that give you access. We assume all these servers are also somehow always running the latest version of the software implementing the backdoor, to address any exploits that may have been discovered.

We control access to this carefully so that you can only request a code, & this is validated by all kinds of bureaucratic controls that are never violated for expediency & where no mistakes are ever made. Also the system handing out codes itself doesn't even have the keys. It has a temporary key that can't generate valid signatures past its expiration. To regenerate, we go into a fortified secure vault that is air-gapped. This air-gapped system is used to generate a new key, burning it onto a CD-ROM. So your admin has to, on a monthly basis, go into the vault to generate some secret that can be used to continue backdoor access.
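A rough sketch of that hypothetical scheme (all names are made up, and HMAC stands in for the asymmetric signatures a real deployment would need — which means the verifier here holds the same secret as the signer, itself part of the problem being described):

```python
import hashlib
import hmac
import os
import time

def sign(key: bytes, msg: bytes) -> bytes:
    # HMAC stands in for a real asymmetric signature; note that this means
    # the verifier must hold the same secret as the signer.
    return hmac.new(key, msg, hashlib.sha256).digest()

# Vault ceremony: the air-gapped root key blesses a short-lived intermediate key.
ROOT_KEY = os.urandom(32)  # in theory, never leaves the vault
intermediate_key = os.urandom(32)
intermediate_expiry = int(time.time()) + 30 * 24 * 3600  # roughly one month
cert = sign(ROOT_KEY, intermediate_key + intermediate_expiry.to_bytes(8, "big"))

def issue_access_token(server_id: str) -> dict:
    # The code-handing-out system mints a one-time, timestamped token.
    payload = f"{server_id}|{int(time.time())}".encode()
    return {
        "payload": payload,
        "sig": sign(intermediate_key, payload),
        "int_key": intermediate_key,  # a real scheme would ship a public key here
        "int_expiry": intermediate_expiry,
        "cert": cert,
    }

def server_accepts(token: dict, server_id: str, max_age: int = 300) -> bool:
    # Each server checks the chain (root -> intermediate -> token) plus freshness.
    chain = token["int_key"] + token["int_expiry"].to_bytes(8, "big")
    if not hmac.compare_digest(sign(ROOT_KEY, chain), token["cert"]):
        return False
    if not hmac.compare_digest(sign(token["int_key"], token["payload"]), token["sig"]):
        return False
    sid, ts = token["payload"].decode().rsplit("|", 1)
    return sid == server_id and 0 <= int(time.time()) - int(ts) <= max_age
```

The point of the thought experiment survives the sketch: whatever walks out of the vault holding ROOT_KEY (here, any process that can read it) can mint valid tokens for every server.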

Now imagine the admin who goes into the vault every month to burn that CD-ROM is Snowden. You've now stolen the root keys for every machine out there.

Let's also remember a few things that are elided for this hypothetical world we've built.

1. I may have gotten some details wrong here, but this is really close to how OS updates are handled by Google & Apple. This is treated as one of the most secure ways to do software deployment at scale (we're not talking about one-off carefully controlled & vetted backdoors, which are a wholly different problem).

2. Software deployment is hard. There's no world in which you will instantly deploy a security fix to your backdoor code. Some machines don't have good uptimes & others are mostly invisible to the internet. Mobile operating systems are different, as Google & Apple dictate the HW design, & even there Google has struggled to pull vendors along to do the good security things. Are you proposing we standardize on Apple hardware for everything?

3. If you have the ability to deploy code to any random machine, that deployment mechanism is a target in and of itself. Since every US machine has to implement it in this hypothetical world, this is an attractive exploit. It's easier to secure, but now the value of compromising it has increased exponentially. We haven't heard of any exploits of this, but given the value already (& exponentially more if we're talking about every single system in the US), we're looking at threat actors with ridiculously deep bank accounts & access to technical expertise.

4. Timestamps are hard. You're talking about every single machine in the world. There's plenty running the wrong time. So someone changing the clock breaks your ability to backdoor (unless you ignore timestamps, but then the keys you're generating are reusable, on that website at least).

5. Key rotation & management is insanely hard. You're talking about every machine in the US. Even every server. Mistakes will happen at this scale, so your backdoor either won't work (best case) or you'll have unintended compromises (or likely both).

6. Complexity & security are diametrically opposed. The more complexity you add, the less secure you are. Modern machines are already ridiculously complex.

7. Everyone outside of the US (including US companies that have servers abroad) will not implement the backdoors, but may implement the backdoors that other nation states force them to adopt. Sure, it's great if you're the US forcing your way to gain advantage over other countries, but how do you keep these systems segmented so that a backdoor from another country doesn't give you access to the US? Moreover, let's say the US implements an impenetrable system. Do you think other countries will care to do the same? Does the US share our tech with them at the risk of making it even easier to find flaws? And how do we manage distribution of such software when there's a flaw?

No amount of advice to "invent better math" solves the fact that this isn't a technical problem. No amount of "build things better" solves the fact that software engineering is hard & we have 0 examples, even in "big tech" which invests billions here annually, of building truly secure systems while actively trying to prevent any backdoor/exploit. Above all else, you're proposing a single point of failure for the entire US economy. You can use this to conduct industrial espionage at an even larger & easier scale than happens today, or to take down critical infrastructure in a time of conflict.

Is there something I missed in my analysis? What part can we "do better" on that doesn't result in exposing a significant vulnerability?


Yes, removing such SPOFs should be a design requirement.

Yes, it’s a difficult problem with social implications, and not simply technical challenges, as you noted.

Yes, Snowden shouldn’t have so easily been able to steal so much data. Apparently the IC has installed numerous checks and balances to help prevent another such insider threat.


I think that the key point that should be addressed by those against this bill and these laws in general is: How to conduct legitimate law enforcement business and police investigations at a time when most communications have moved online?

The reason the Attorney General and the DOJ want some sort of access to communications is not to undermine free speech and to take over the world, it is because they know that without legal provisions technology can make it impossible to access communication during legitimate police investigations.

How do we solve this problem?


>How do we solve this problem?

We don't, we accept that encryption is part of the modern world and learn to live with it. Because there's nothing else you can do about it.

See encryption is just math, and you can't really outlaw or limit how math is used.

If we have bad actors who want to encrypt their communication, they absolutely can with or without this bill.

Even if Whatsapp/Telegram/Whatever has to provide the US government with a backdoor to decrypt all messages, anyone can make their own communication platform and simply not give the government a back door. Implementing secure encryption isn't difficult and it's very easy to research how to accomplish it.

Grab a few devs and they can create a simple encrypted messaging app in a few days.

You don't even need to distribute it through official channels. Android allows you to sideload apps from anywhere and you can jailbreak iPhones to install apps from anywhere. So our bad actors can create secure encrypted communication platforms and distribute them without anyone ever knowing about it.

How will this bill prevent that? How will it prevent a few random developers from whipping up their own apps? How will it make it impossible for anyone, anywhere at any time to implement encryption into any app or platform?


Just like law enforcement has always done: it uses the tools at its disposal. There are many mechanisms that require elevation and precedence already, like a subpoena. Furthermore, PGP is already in the wild; this bill will just force liability onto the platforms and make them open up.

A fundamental requirement of a free society is encrypted communication; always has been. I'm amazed, given how the police state has grown since 9/11, that there are 'poor police' arguments at all. Government always grows in scope. The secret FISA courts that were intended only for terrorism ended up being 50% domestic drug cases.


> anyone can make their own communication platform

and then the users of that platform would simply stand out in ISP logs making it actually easier to spot them. If this platform was a dedicated tool developed by/for a bad actor, then everyone working with/for that actor would be easily found.

Given that, it seems that steganography (combined with encryption) could be a solution, with a "battle" between steganographic methods and the algorithms to detect them.


>and then the users of that platform would simply stand out in ISP logs making it actually easier to spot them.

Yeah no.

Encrypted data would still be flowing all over the place; if our bad actors use VPNs to hide their traffic then it would become impossible for ISPs to see what they're doing or using.

In addition, even if you can pinpoint who's using encrypted communications, unless you can prove they're actually engaged in some criminal practice, it won't do you much good. With EARN-IT the responsibility is on the encryption providers, so those two random devs who made the app. You can't tell what the users were talking about since communication is encrypted, you can't really prosecute any of the users for anything besides maybe using those apps if it becomes completely illegal or you can prove that the app is only used by criminals and no one else.

Now you can potentially go after the devs, assuming of course you can figure out who made the app, and assuming these people are in a place where US laws apply. The global nature of the Internet makes things very difficult. If a Swedish team develops an encrypted communication app and distributes it on their website, are they still required to comply with US laws? If they prevent US citizens from downloading the app with geoblocking but people get around it with VPNs, are they still required to comply with US laws?


>if our bad actors use VPNs to hide their traffic then it would become impossible for ISPs to see what they're doing or using

you just transferred the problem from the ISP level to the VPN-operator level. While you could argue that using multiple VPNs from different countries could make this somewhat harder, the problem still exists. Especially if you consider metrics other than IP, for example specific packet sizes or timing patterns (e.g. instead of looking for users connecting to a given IP, the adversary would look for users sending 640-byte packets every 300 seconds).
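As a toy illustration of that kind of metadata fingerprinting (the thresholds and sample data are invented for the example, not tuned against real traffic):

```python
from statistics import pstdev

def looks_like_beacon(events, size_tol=8, jitter_tol=5.0):
    """Flag a flow whose packets have near-constant size and near-constant spacing.

    events: list of (timestamp_seconds, packet_size_bytes) for one src/dst pair.
    The tolerances are illustrative, not tuned against real traffic.
    """
    if len(events) < 4:
        return False
    times = sorted(t for t, _ in events)
    sizes = [s for _, s in events]
    gaps = [b - a for a, b in zip(times, times[1:])]
    # Low variance in both size and inter-packet gap suggests an automated beacon.
    return pstdev(sizes) <= size_tol and pstdev(gaps) <= jitter_tol

# A client phoning home with ~640-byte packets every ~300 s stands out...
beacon = [(i * 300 + 0.5 * (i % 2), 640) for i in range(10)]
# ...while a bursty, mixed-size browsing session does not.
browsing = [(0, 1500), (0.2, 60), (7, 900), (31, 1500), (90, 52), (91, 1400)]
```

Real traffic-analysis systems are far more sophisticated, but the underlying idea is the same: regularity in the metadata betrays the flow even when the payload is opaque.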

While encryption makes it impossible to know the contents of messages (and thus to use the contents as evidence), the ability to uncover the members/employees/cooperators of the bad actor would make it easier to investigate them and/or use other means of targeted surveillance to obtain evidence. This would also make it easier to infiltrate the bad actor, since one of the uncovered users could then be coerced into cooperation.

(All of the above assumes that the app/platform is used only by members of the "bad actor" and no one outside that organization is using the app. It is completely different if there are other users, perhaps even with bad-actor users being a minority.)

With the developers outside the jurisdiction, the problem is that while they might or might not be required to comply with the law, they can still be coerced/manipulated/otherwise encouraged into providing a "patch" (backdoor) to the application.

I believe that a much better solution would be to simply use any popular platform as a transport layer, with independent end-to-end encryption. Possibly with some steganography as well. The simplest example would be users exchanging memes/cat pictures - this will not stand out in any ISP/VPN traffic analysis. It will also not stand out (that much) in content analysis by any entity that can decrypt/access the plain content. The images being exchanged could then contain embedded (and end-to-end encrypted) content. While this is still far from perfect - you could imagine detection of repetitive images being sent, content/timing patterns, or actual analysis of attachments for steganography - all of those still require significantly more resources to work at massive scale.
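To make the "cat pictures with embedded content" idea concrete, here is a toy least-significant-bit embedding (the names and the flat-buffer "image" are inventions for the example; a real tool would operate on an actual image format, spread bits pseudo-randomly, and still be subject to the steganalysis caveats above):

```python
def embed(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide a payload (assumed already encrypted) in the low bit of each byte.

    A 16-bit length header is embedded first so the receiver knows where to stop.
    This toy treats the cover image as a flat byte buffer.
    """
    data = len(payload).to_bytes(2, "big") + payload
    bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the least significant bit
    return out

def extract(pixels: bytearray) -> bytes:
    length = int.from_bytes(_read_bytes(pixels, 0, 2), "big")
    return _read_bytes(pixels, 2, length)

def _read_bytes(pixels, byte_offset, count):
    # Reassemble `count` bytes from consecutive low bits, MSB first.
    out = bytearray()
    for b in range(count):
        value = 0
        for i in range(8):
            value = (value << 1) | (pixels[(byte_offset + b) * 8 + i] & 1)
        out.append(value)
    return bytes(out)
```

Since only the low bit of each byte changes, the cover image looks essentially identical to a casual observer, which is exactly why detection requires the statistical analysis mentioned above.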

An alternative would be to use a custom platform but have as many "external" (in the sense of not working with/for the bad actor) users as possible.


I mean a bad actor can easily use stolen/free wireless with a randomized mac on a machine that’s used for nothing else and not access any “usual” services while doing it.

This is more about ordinary people maintaining privacy in their normal daily activities, in ways that aren’t too inconvenient to use 24/7.

If a bad actor has the knowhow to build a custom platform they sure have the ability to access the internet in a way where they can’t be found by IP.

Governments still like to push anti-privacy laws because they help catch non-technical criminals who don’t put in a serious effort to hide. This is why they hate “built in” privacy protections in consumer software and demand ways around it, because they help protect even technically illiterate criminals.

What I'm trying to say is, the important question is how much do we want to erase privacy for 99% of people who use normal consumer software in order to help police catch the ~1% or whatever the percent of criminals is that also use normal consumer software, and just happen to also be criminals. The 0.01% of people that are criminals and have the resources and knowhow to actively try to avoid detection by building their own systems are not going to be caught in trivial ways (like tracking their IP to their apartment, vpn or no vpn, or tracking them through correlation from using their personal social media account from the same connection they perform illegal activity from) anyway so they don't matter.


But if the app is in the gray area (e.g. in addition to bad actors, it's also used by a niche set of privacy enthusiasts) it enables plausible deniability.


Money laundering is just math too.


Yeah, so? Most acts of "money laundering" consist of nothing more or less than the basic principles of privacy applied to financial transactions. The fact that the government has somehow managed to normalize intrusive mass surveillance in the domain of finances does not justify extending the surveillance to other areas.


So this statement: “See encryption is just math, and you can't really outlaw or limit how math is used.”

Is false.


A sufficiently authoritarian regime can outlaw pretty much anything they want, if they're willing to be heavy-handed about the enforcement and make a mockery of the justice system in the process—both of which can be observed in the anti-money-laundering regulations. Enforcement won't be 100% effective, of course, and the collateral damage would be enormous. It won't have nearly as much effect on the actual "bad guys" as it will on ordinary civilians. However, nothing prevents them from passing bad laws banning encryption. Which is exactly why such disastrous policies need to be strongly opposed.


Exactly, that’s why statements like “you can’t outlaw math” are counterproductive and wrong.

You can outlaw math, and the result will be really bad.


> How to conduct legitimate law enforcement business and police investigations at a time when most communications have moved online?

The old-fashioned way: physical surveillance for the cases where they strongly suspect a crime. It's even easier these days with the new surveillance capability.


> they know that without legal provisions technology can make it impossible to access communication during legitimate police investigations

> How do we solve this problem?

It's called a search warrant.


Well, no. That's the issue!

If you only have peer-to-peer encryption with communications still passing through servers unencrypted then, sure, you can get a warrant to force disclosure.

With end-to-end encryption, however, you can show up with a warrant all you want; it makes no difference, because they physically cannot hand you clear-text communications.

Same for encryption at rest. The strength of the encryption algorithms we have today makes it near impossible to recover the clear-text data.


A very real problem with "backdoors" and wiretapping in the digital age is that, in the past, wiretapping was for future communications and between two specific individuals.

That is not how E2E encryption breaking will be used.

For E2E encryption breaking, they won't ask for "future conversations between A and B for the next 2 weeks" they will say "give me all historical communication from this person between all of their followers for the last 5 years".

For the first time in the history of the world, we are in a time when not just your future communications but your past ones - the ones that you think are private between two people in a modern-world way (that is, over the internet) - are not private except when E2E. Worse, by all accounts to a normal person, they are private. I mean, we certainly act like the messages we send family over iMessage or whatever are private, and we assume the same level of privacy we'd be afforded for a phone call, except that's not present; and with laws like this, it will be even less present.

And not only is the former bad, but the latter is much, much worse.


SMS texts and call logs are already accessible, for example. Letters kept have always been accessible.

The fact that they may be accessed by law enforcement agencies in specific circumstances does not mean that they are not private.

If you are seeking absolute privacy for all time, i.e. absolute secrecy, the only option now and since writing was invented is to leave no trace: either communicate face to face discreetly or hand deliver letters and destroy them after reading.

What has changed recently is that we are effectively communicating in writing for virtually everything (even a voice call is effectively "in writing", as it's data stored on a medium) and that the cost of storing those writings has dropped to zero. In addition, and that's the issue, encryption has reached a stage where what I suggested (to hand deliver letters and to destroy them after reading) has become 'easy' and cheap for all communications if you want it, which creates big problems for law enforcement and security agencies.

I think this is a legitimate issue. The level of the debate, for example in this thread, is low and not helpful because, at least in some tech circles, people refuse to acknowledge real-world issues and there is an extreme and utopian view that anything less than absolute secrecy is absolutely unacceptable.


> Letters kept have always been accessible.

The physical letters, yes—but if those letters were written in a code or private language it is well established in precedent that one cannot be compelled to translate them into plain language for the prosecution. Forget all the false skeuomorphic analogies about "locks" and "keys"; encryption is not a safe you put your message into, it's a set of private codes.

A warrant authorizes law enforcement to seize the physical evidence (letters, hard drives, memory chips, etc.). Making sense of the content afterward is entirely their problem, and that remains true even when making sense of the content is likely to be beyond their capabilities.

The particular mechanism being exploited in this law (revoking section 230 protections for companies that do not implement whatever "best practices" are promulgated by this new committee) is particularly bad because section 230 should never have been required in the first place.

It should go without saying that a service provider is in no way liable for illegal content uploaded by users without the service provider's knowledge. Putting aside the fact that the idea of "illegal content" is itself nonsense in any country that purports to recognize freedom of speech, the occasional removal of unwanted content that is specifically brought to a moderator's attention does not imply that the service provider actively controls everything that is published on the platform. Logically, if a fully unmoderated site is not to be held liable for content uploaded by users then a partially moderated one likewise should not be held liable for content which has simply not yet been brought to the moderator's attention.

The court erred in lumping partially moderated sites in with ones requiring full prior review and approval before posting. Section 230 was passed to address this miscarriage of justice, and as such revoking or weakening its protections, or holding them hostage as this EARN IT act would, is itself unjust.


> With end to end encryption, however you can show up with a warrant all you want it makes no difference because they physically cannot hand you clear-text communications.

Well, no. That's the issue!

They shouldn't be executing a search warrant behind your back on some third party that happens to be storing your data. They should execute the search warrant on the person who owns the data they're investigating. If they refuse to give up the data, they're breaking the law.

> Same for encryption at rest. The strength of the encryption algorithms we have today makes it near impossible to recover the clear-text data.

Same for encryption at rest. We already have laws for that. Get a search warrant. If they refuse to give up the data you have a warrant for, they broke the law. For example, if you have something in a safe, the cops won't just come into your house and crack your safe. They'll get a warrant first; then, in theory, you'll open it for them, or they'll have to crack it if you break the law and don't open it.


> If they refuse to give the data you have a warrant for they broke the law.

Warrants aren't issued for data, they're issued for property. A warrant authorizes law enforcement to search for and/or seize evidence (i.e. physical property) without regard for the owner's property rights. The owner doesn't have to give them anything or aid the search in any way beyond simply not interfering. Standing back and leaving them to break into the safe on their own does not violate any law. You might open it anyway just to show goodwill and avoid damage to the safe, but there is no obligation to do so, and unnecessarily demonstrating that you have the ability to open the safe may, in certain situations, amount to testifying against yourself.


A search warrant for the endpoints.


And I will refuse to cooperate because I know what my "endpoint" contains, because of course my device is also encrypted and/or will be destroyed if I think I'm about to get caught.

Your script kiddie might break at the suggestion that refusing to cooperate means 2 to 5 years in jail but organised crime, or people who are risking a much harsher penalty, won't.

These are the real, down to Earth issues, not utopian discussions about individual liberties.


Because criminals will follow the encryption laws? If they're not going to comply with a search warrant, why would they stop using encryption? The logic makes no sense. The only people this law will affect are law-abiding citizens.

I'm not willing to give up my privacy just because other people are committing crimes and the cops can't figure out how to do their job. I'm in the US, and the 4th Amendment says I don't have to.


Giving the government the option to show up at your door with some random garbage data and asking you to make an impossible proof of innocence or go to jail has some very obvious problems.


How was it solved before modern communication tech made it possible to automatically record, transcribe and search people's private communication?


The key problem is not automation, it's access.

It's simple to read a letter. It's not difficult to eavesdrop on a copper phone wire. But it's impossible to intercept communications encrypted end-to-end, or encrypted at rest, considering the strength of the algorithms freely available today.


The key problem is that massive intrusion into the privacy of correspondence, posts and telecommunications is a telltale sign of totalitarianism. A democracy by nature not only has no need to access everyone's correspondence; it is strictly forbidden from doing so.

I find it horrifying how many Americans lack a basic understanding of what they are facing here. The Stasi reserved for itself the right to read every letter, and before them it was the Gestapo who argued that, for the protection of national security ("Schutz von Volk und Staat"), all communication must be accessible to the secret police.

The American people have no idea what devil they are summoning.


> The key problem is not automation, it's access.

I agree. There is a problem if they have open access to everyone's data.

> But it's impossible to intercept communications encrypted end-to-end, or encrypted at rest, considering the strength of the algorithms freely available today.

What about people talking to each other in private? That could be considered as "problematic" as end-to-end encryption. Should their phones be recording them so the government can snoop later? And for encryption at rest, what if someone has a safe that self-destructs if someone unauthorized tries to open it? Should that be illegal?


It's difficult to read every single letter. It's difficult to eavesdrop on every single copper wire. It's easy to store every bit of internet communication. Almost nobody here has a problem with warranted access, but this isn't about warranted access, this is about blanket recording of every user's interaction for no good reason.


We are currently in the middle of prosecuting a whistleblower who made clear that intelligence agencies maliciously abused this kind of access, without any indication that this access provided a boon for security - aside from perhaps letting some people interested in control sleep at night.


It isn't a problem - it is a solution. The answer to the question is "go after the crimes in the real world, you disingenuous lazy assholes!" If they said that they couldn't stop mobsters because they whispered to each other and talked in euphemisms, they would rightfully be thrown out on their asses as worthless incompetents.


We don’t because it isn’t a problem. It really is that simple.

The cops can do their jobs in meatspace.


They used to be able to record phone calls with a warrant.

They used to be able to read letters with a warrant, IIUC.

With E2E encryption now widely available, no sane criminal will use any non-encrypted channel.

So, a lot of methods law enforcement used to find very helpful effectively no longer exist.

The steelman principle says that this is what you need to argue isn't a problem.

Disclaimer: I lean towards the EFF's position on this one, but I can see there are some debatable issues here.


Perhaps what we need to be doing is not finding new and creative ways for law enforcement to read people's correspondence, but finding new and creative ways to eliminate the underlying factors that cause people to turn to crime in the first place.


But if the government stopped the factors that lead to crime, then how could it justify its massive spending on a surveillance and selective enforcement regime that it can use against its political enemies (e.g. voters in the wrong demographics)?


I don't think the cops ever needed widespread surveillance capability.

They can park a van across the street from my house or office. They don't need access to the telco infrastructure to do their jobs.

The lack of foresight in previous generations is not an excuse to perpetuate their mistakes.


I'm not talking about widespread surveillance.

Decades ago, if they had evidence you were involved in crime, they could apply for a warrant to tap your phone, search your house, or read your mail.

That's focused surveillance, not mass, and it's under the oversight of a judge.

In a world of end-to-end encryption, they can't realistically find useful evidence by doing those things any more.

Parking their van outside doesn't help as much, either - you can conspire to commit crimes quite easily without ever leaving your house or having confederates come there, thanks to encrypted video chats.


I don't think they ever needed the ability to tap phones or read mail. Capabilities that were repeatedly abused, just like internet surveillance.

Where will I commit this crime? Somewhere in meatspace or it doesn't matter. Cops can work backward from there.


I'm pretty confident you can commit crimes that matter without physical presence.

Most people don't approve of making millions from insider trading, for example.

Or of cryptolocking beloved photo collections and holding them ransom for bitcoin.


You don't need to tap phones to catch and prosecute insider trading.

You can't stop cryptolocking by tapping a phone or by banning encryption.


> So, a lot of methods law enforcement used to find very helpful effectively no longer exist.

Exactly, this is a big practical problem and what these laws try to address (in a good or bad way...).

IMHO, this is a legitimate worry and opponents should indeed come up with practical alternative solutions. I think the EFF and others do not take a pragmatic approach but rather camp on ideological positions that will get them nowhere.



