How will this work with hotspot captive portals? I have a dwindling list of sites I can use to check whether my internet connection is down or just interrupted. And Apple/Microsoft don't always register that the wifi is behind a captive portal, so either I wait or I open a browser to an unencrypted site and renew the session.
The only reason those captive portals don't work is that the operators decided to use a huge hack of a solution: MitM'ing unencrypted HTTP traffic and/or messing with DNS, instead of working to ensure a standardised solution (such as RFC 7710) was available before rolling out this sort of technology. The current approach really is a joke.
To be honest, I'm glad it's now backfiring with more sites using HTTPS (whose connections they can't hijack). Maybe this will nudge the operators into doing things properly, whether that's using WPA Enterprise for username/password auth or using the DHCP option to tell new clients they need to visit an internal URL before using the internet (no connection hijacking required).
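For what it's worth, the standardised option really is simple: RFC 7710 assigns DHCPv4 option code 160 (Captive-Portal), whose value is just the URI of the portal page. A minimal sketch of encoding it in Python; the portal URL is a made-up example:

```python
# A minimal sketch of the RFC 7710 mechanism: the DHCP server hands new
# clients option 160 (Captive-Portal) carrying a plain URI to visit, so no
# HTTP or DNS hijacking is needed. The portal URL is a hypothetical example.

CAPTIVE_PORTAL = 160  # DHCPv4 option code assigned by RFC 7710

def captive_portal_option(portal_url: str) -> bytes:
    """Encode the option as DHCP type-length-value bytes."""
    value = portal_url.encode("ascii")
    assert len(value) <= 255, "a DHCP option value is at most 255 bytes"
    return bytes([CAPTIVE_PORTAL, len(value)]) + value

print(captive_portal_option("https://portal.example.net/login").hex())
```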
I think it would be interesting to measure how well sites maintain their HTTPS, too. I occasionally see certificate warnings from improperly configured sites (corporate and even government ones!). It may be that anyone who set up HTTPS once last month is going to have problems in 3 or 6 months, not having set things up properly for the long term.
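The expiry half of that is easy enough to measure. A short sketch, standard library only, with a hypothetical list of hosts to monitor:

```python
# A small sketch for monitoring certificate health: report how many days
# remain before each site's certificate expires. Standard library only;
# the host list is hypothetical. An invalid or already-expired certificate
# makes the handshake itself fail with ssl.SSLCertVerificationError.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(hostname: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

for host in ("example.com", "example.org"):
    print(host, days_until_expiry(host))
```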
In this “modern” world, it seems past time for governments to start supplying digital IDs. Why, for instance, can’t every single person have a government-backed certificate of identity once they have done all the usual things it takes to obtain a driver’s license or passport? And if you can count on a passport being valid for 10 years, why can’t you count on a government CA (say) continuing to validate your certificate for 10 years?
> In this “modern” world, it seems past time for governments to start supplying digital IDs.
Good gracious, just stop. While this may seem like a step in the right direction and offer a great amount of convenience, one can only imagine how it would be abused by authoritarian powers.
Not just that, but the government has leaked millions of database records on everything from top-secret clearance holders to medical files to you name it. Their security posture is insulting.
Because then your ability to stay private and secure is tied to the government?
If we're already talking about breaking the current TLS system, DNS-based security (browsers include the DNS root keys; DNS zones host the sites' keys) makes much more sense.
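Something close to this already has a standardised form: DANE (RFC 6698), where the zone publishes a TLSA record binding the site's certificate or public key to its name, protected by DNSSEC. A quick sketch of looking one up, using the third-party dnspython package and a hypothetical hostname:

```python
# A sketch of the DNS-based model described above, which exists today as
# DANE (RFC 6698): the zone publishes a TLSA record binding the site's TLS
# certificate or key to its name, and DNSSEC signs the answer. Uses the
# third-party dnspython package; the hostname is a hypothetical example.
import dns.resolver

def fetch_tlsa(hostname: str, port: int = 443) -> None:
    # TLSA records live at _<port>._tcp.<hostname>
    for rr in dns.resolver.resolve(f"_{port}._tcp.{hostname}", "TLSA"):
        # usage/selector/mtype say how to match; cert is the pinned
        # certificate or public-key digest.
        print(rr.usage, rr.selector, rr.mtype, rr.cert.hex())

fetch_tlsa("example.com")  # works only if the zone actually publishes TLSA
```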
Am I the only one not all-in on moving everything to HTTPS? It's great for sites that handle personal information and sensitive data that you wouldn't want snooped. But for a lot of sites (like phrasegenerator.com, one of mine) I'm not sure it makes sense: if someone in a coffee shop finds out which random-phrase-generator categories you've visited, it's of pretty much no consequence. It's just an extra configuration hassle on the server and a CPU burden on both ends. Admittedly minor, but why add a layer of complexity when it's not needed?
I don't know. I use Let's Encrypt for the sites where it matters, and I'm all for it with search engines, email clients, anything with a password, etc. But as an engineer who likes to keep things simple, this seems like another layer of complexity borne out of one of those dogmatic 'best practice' rules that makes modern software slower, buggier and harder to maintain than it needs to be, despite our amazing modern hardware.
HTTPS keeps your ISP, your wifi hotspot, a dodgy router, or anything else from injecting ads and tracking information into your pages.
HTTPS ensures that the server and information the user is requesting actually come from you (and not a shady middleman who might give out bad information or just annoy the user with a slightly broken setup).
HTTPS ensures that content isn't being blocked by a bad government actor based on its content.
HTTPS keeps bad actors from injecting malware into your JavaScript, your images, even downloaded executables (there is "one click" software out there to inject malware into any .exe download it can find on the network, in real time).
HTTPS helps protect against "dragnet surveillance". That doesn't just include bad governments, but also an ISP which might build a profile on you based on your browsing habits.
HTTPS treats information as "secure by default". You won't always know what is and isn't "private" to each person. "phrasegenerator.com" might just be a fun game to you, but to someone else it might generate a phrase that could get someone fired, or worse, based on the content of that phrase.
And using HTTPS everywhere means it's harder to distinguish "secure" information from "insecure" information. Breaking one HTTPS connection one time, for one person and one server, isn't "easy", but it's not impossible. Breaking HTTPS for all connections, for everyone, every time, for every server, becomes practically impossible.
The overhead is next to nothing, the complexity mostly abstracted away, and with stuff like Let's Encrypt the "maintenance factor" is quickly getting reduced to "fire and forget".
There's no excuse not to use HTTPS any more, in my opinion.
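And the server-side half really is small. A minimal sketch of the usual "secure by default" setup in Python, standard library only; the certificate paths follow Let's Encrypt's usual layout and "example.com" is a placeholder:

```python
# A minimal sketch of "secure by default": plain HTTP only ever answers with
# a redirect to HTTPS, and the HTTPS side sends an HSTS header so browsers
# remember to skip the insecure hop next time. Certificate paths follow the
# usual Let's Encrypt layout; "example.com" is a placeholder.
import ssl
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Redirect(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "example.com").split(":")[0]
        self.send_response(301)
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

class Secure(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # HSTS is only honored when it arrives over HTTPS
        self.send_header("Strict-Transport-Security", "max-age=31536000")
        self.end_headers()
        self.wfile.write(b"hello over TLS\n")

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("/etc/letsencrypt/live/example.com/fullchain.pem",
                    "/etc/letsencrypt/live/example.com/privkey.pem")

https = HTTPServer(("", 443), Secure)
https.socket = ctx.wrap_socket(https.socket, server_side=True)
threading.Thread(target=HTTPServer(("", 80), Redirect).serve_forever,
                 daemon=True).start()
https.serve_forever()
```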
>HTTPS keeps your ISP, your wifi hotspot, a dodgy router, or anything else from injecting ads and tracking information into your pages
I suspect this is the main reason Google is so gung-ho about "HTTPS everywhere". Thanks to the popularity of GA, AdSense, DoubleClick, and Google's local caches at every ISP, they already have near-global tracking. Closing the ISP MITM hole ensures that nobody else does.
Of course, "HTTPS everywhere" is good for other reasons, so I'm not complaining...
I agree, though I'm worried about the security it actually offers and about which protocols and ciphers are still considered secure.
The next PCI DSS will require most services to use (and not honor anything lower than) TLSv1.2; I'm not sure what to do when (not a matter of if, but when) that is considered no longer secure. TLSv1.3 isn't ready yet, from what I understand.
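At least enforcing that kind of floor is usually a one-line knob. A sketch with Python's standard ssl module (3.7+); the hostname is hypothetical:

```python
# A sketch of enforcing the kind of floor PCI DSS asks for: refuse anything
# older than TLS 1.2. Python 3.7+ standard library; hypothetical hostname.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSLv3 and TLS 1.0/1.1

with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())  # e.g. 'TLSv1.2'
```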
Now that I write this, I wonder: are there, by the way, server/client solutions that utilize PGP? E.g. the client sends requests encrypted to the server's public PGP key, and the server sends the response back encrypted to the client's public key?
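I don't know of a mainstream one, but the scheme you describe is easy to sketch with the third-party python-gnupg package; the key IDs here are hypothetical, and both public keys are assumed to already be in the local keyrings:

```python
# A sketch of the scheme described above: the client encrypts its request to
# the server's public PGP key, and the server encrypts its response to the
# client's. Third-party python-gnupg package; the key IDs are hypothetical
# and both public keys are assumed to already be imported.
import gnupg

gpg = gnupg.GPG()

# Client side: encrypt the request body to the server's key.
request = gpg.encrypt("GET /phrases?category=animals", "server@example.net")
assert request.ok, request.status

# Server side: decrypt (needs the server's private key), handle the request,
# then encrypt the response to the client's key.
plaintext = gpg.decrypt(str(request))
response = gpg.encrypt(f"handled: {plaintext.data.decode()}",
                       "client@example.net")
print(str(response))  # ASCII-armoured blob that could travel over plain HTTP
```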
I understand your reasoning, but I respectfully disagree. The reason I think it's important that ALL sites use HTTPS is that by encrypting everything, "important" data becomes indistinguishable from "unimportant" data. This helps the security of the herd by making it harder for an attacker to identify which packets to collect and expend their compute resources on. In addition, anything we can do to normalize the use of encryption, to the point where basic crypto knowledge becomes mainstream, is a win for everyone's security.
The fight isn't about a hacker stealing your passwords. The fight is about every single entity on the Internet being hostile towards user privacy with a strong monetary and power incentive to use any and every inch you give them to take a mile. So let's not give them any inches. Encrypt everything using the strongest cipher suites available and the best methods of identity verification we have. It may not be enough, but it's better than giving up and letting the baddies win without a fight.
And to clarify, before the inevitable reply points out that HTTPS isn't sufficient by itself to protect user privacy: yes, I understand that. But it's an important basic building block for doing so.
So should we all browse through a Tor relay when we're looking at cat pictures? Maybe a new protocol can be invented that allows websites to force us to use one? I tend to dislike these dogmatic rules - you have to be aware of the particulars of your situation and design a solution that fits that. For a lot of sites HTTPS seems to me like overkill. If the environment changes where ISPs are routinely injecting ads etc. into your content streams maybe always-HTTPS will make sense, but the cases of that at this point in time seem relatively rare, not worth wrapping the entire internet in an unneeded software layer.
Actually, yes. Most Tor users would be grateful if you used Tor to browse cat pics, as having more people use it strengthens the entire network.
If you've ever been on hotel wifi that injected content into every unencrypted page load, you might think this was less about dogma. Such things are cheap and easy to do now, and people will do them more and more where they think there is an advantage. Unless they can't.
Then there is the whole MUTANT BROTH program, which maintains a database of billions of intercepted cookies, and can be correlated with other traffic to identify particular engineers behind a corporate firewall for targeting and exploitation, as was done with Belgacom [1]. To thwart such an attack requires encrypting any request that contains a cookie (or, conversely, never accepting a cookie from an unencrypted site).
The problem is that it's really hard to make judgments about what's "security sensitive" for someone else.
> But for a lot of sites (like phrasegenerator.com, one of mine) I'm not sure it makes sense
An attacker could replace your site with anything: ads, viruses, malware, a payment prompt. HTTPS also lets you add HTTP/2 support, which can lead to faster page loads.
Given that there are a lot of low-effort ways to add HTTPS (e.g. Netlify, Cloudflare, Heroku), the question should really be: why stay on plain HTTP?
For me, the possibility of faster page loads with HTTP/2 is the main selling point, plus the usual TLS advantages of guaranteeing the integrity of page content, etc.
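If you want to verify that a given site actually negotiated HTTP/2, the third-party httpx package (installed with the http2 extra) reports it directly; the URL here is a hypothetical example:

```python
# A quick sketch for checking whether a site negotiated HTTP/2. Uses the
# third-party httpx package (install as `httpx[http2]`); hypothetical URL.
import httpx

with httpx.Client(http2=True) as client:
    r = client.get("https://example.com/")
    print(r.http_version)  # 'HTTP/2' if negotiated, otherwise 'HTTP/1.1'
```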
HTTPS would at least prevent shady WiFi providers from messing with your pages before they reach the user, e.g. injecting ads (or even replacing your own ads, if you run them).
I think this comment about the extra hassle is worth thinking about in the context of an open internet.
If you're a generic human being who wants to do something like create a little blog or share some artwork online, any little extra barrier or complexity is a big pain in the butt that discourages you from doing it. I think this trend towards HTTPS is a factor in subtly encouraging people to just sign up for one of the many walled gardens out there (Facebook/Tumblr/Twitter/whatever) because it's a lot less work, and that's a bad thing for a free and open web in the long run.
I agree with avoiding a VPS when you can. I think e.g. DigitalOcean + Let's Encrypt is just way too low-level to be toying with when there are lots of hosting services that let you work at a higher level of abstraction, where they handle all the SSL details for you (e.g. GitHub Pages, Netlify, Heroku).
GitHub Pages doesn't handle SSL for custom domains. The usual workaround is to throw Cloudflare in front of it, but the Cloudflare->GitHub link is still out in the open, and unencrypted.