Why we use HTTPS for every .gov website we make (gsa.gov)
233 points by konklone on Nov 14, 2014 | 82 comments



And they do HTTPS right (at least on 18f.gsa.gov) - it's pretty rare to see an HTTPS site get an "A" grade from Qualys' SSL server test. No weak ciphers supported, no support for SSL v2/v3 (only newer TLS), most ciphers support PFS, etc: https://www.ssllabs.com/ssltest/analyze.html?d=18f.gsa.gov

Edit: not all their sites are doing HTTPS right :-/ 18F developed https://www.notalone.gov and this site has quite a few security issues: https://www.ssllabs.com/ssltest/analyze.html?d=notalone.gov&...
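
(For reference, the kind of nginx settings behind a grade like that; a hedged sketch with a hypothetical example.gov, not 18F's actual config:)

    # Sketch of the TLS settings that earn an "A" (illustrative only)
    server {
        listen 443 ssl;
        server_name example.gov;                      # hypothetical

        ssl_certificate     /etc/ssl/example.gov.crt; # hypothetical paths
        ssl_certificate_key /etc/ssl/example.gov.key;

        # TLS only; no SSLv2 or SSLv3
        ssl_protocols TLSv1 TLSv1.1 TLSv1.2;

        # Prefer ECDHE suites so most connections get forward secrecy;
        # exclude null, MD5, and RC4 ciphers
        ssl_ciphers "ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-SHA256:!aNULL:!MD5:!RC4";
        ssl_prefer_server_ciphers on;
    }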


Thanks! But actually, it pains me that we're still showing a SHA-1 cert.

Because bureaucracy, we're still waiting on a replacement SHA-2 cert, which once installed will bring our score up to an A+ like this:

https://www.ssllabs.com/ssltest/analyze.html?d=staging.18f.u...

We're changing up processes internally that will speed things up. The subject of a future post!

EDIT: And yeah, we're aware of the notalone.gov issues. There's some legacy infrastructure there, but we'll try to get it fixed up. Our future sites will be stronger.


Do you have to maintain backward compatibility with Windows XP pre-SP3 for gov sites? That would require SHA-1 certs and SSLv3.


I don't know about 18F's perspective, but most of the places I've heard about upgraded off of XP rather than deal with it being unsupported (waivers aren't a given, they require work, and paid support hits already-tight budgets). Staying on pre-SP3 would be even harder to justify, and that's one of the areas where the increased security panic is actually useful.


Every site is going to have a different sort of userbase, so it's tough to make sweeping proclamations about support. I don't think we'll have trouble using SHA-2 in our deployments.


Yeah, I'm sure there are exceptions lurking around – the world is too big not to be – but I'd also be surprised if those lingering remnants were in widespread, internet-facing use.


Yeah, MS does offer custom support for older service packs too, but it has been years since XP SP2 left support.


I know this is slightly off topic, but I've been wanting to ask this for a long time. Can you make https://www.annualcreditreport.com into a .gov HTTPS A+ grade secure site? It's so easy for people to get scammed by fake sites, it'd be great to say, "go to creditreport.gov - it's the real one"


They are not a government website, but I agree with you. They should get an official URL. The .com makes it an easier target for scams.


You can just write a letter to the three companies (which are non-government too) and they will send you your report. That is essentially what these websites do on your behalf; they just automate the process...


Annualcreditreport.com doesn't do it on your behalf. It guides you through the steps with each CRA so that you can get the free annual report the law gives you. It's the only "official" site other than each agency's own. There are hundreds of others that often play on users' confusion and ignorance. This is why I agree that AnnualCreditReport should be a .gov site, to help it cut through the noise.


Write a letter? What am I, a farmer?


No. Just a hipster.


Credit reports are scams. The entire premise, that you have to pay a company to come up with a metric of how trustworthy you are, is in and of itself a scam.


> These properties are useful for all of our applications, all of the time — not just when passwords or personal information are involved.

Bingo. HTTPS only is a much better way to operate web properties these days -- the only downside is a fairly marginal investment of money and time.

What's interesting is to see the headaches people experience with mixed content warnings/errors before they move to the Light Side of the Force (I've been at places where this is a substantial undertaking/roadblock).


Yes! Every single one of these issues is, on its own, enough reason to go HTTPS-only. I finally did it a couple of months ago, and it wasn't that hard to learn, and now that I know, it's just part of my setup process.

Another reason to add to the list: not having to repeatedly ask the user for permission to access advanced HTML5 APIs. Stuff like streaming media, GPS location, background notifications, and speech recognition. If you're making an HTML5 SPA, HTTPS is essential.


That isn't the only downside. The other downside is dramatically increased first-byte latency due to the handshake.


> That isn't the only downside. The other downside is dramatically increased first-byte latency due to the handshake.

Latency does increase, but it doesn't have to be dramatic. Ilya Grigorik has a great piece on how to get the handshake time down:

https://www.igvita.com/2013/12/16/optimizing-nginx-tls-time-...

We turn on as much of that stuff as we can for our servers, and I'm pretty happy with it.
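
(Concretely, the knobs that piece covers look roughly like this in nginx; a sketch, not our literal config:)

    # Cache TLS sessions so returning clients skip the full handshake
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;

    # Staple OCSP responses so browsers skip a separate round trip to the CA
    ssl_stapling on;
    ssl_stapling_verify on;
    resolver 8.8.8.8;      # needed for the OCSP fetch; pick your own resolver

    # Smaller TLS records reduce time-to-first-byte for the initial response
    ssl_buffer_size 4k;

    # Reuse connections so the handshake cost is amortized over many requests
    keepalive_timeout 75;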


A lot of that can be mitigated with SPDY/HTML2 support, since the channel can be multiplexed for additional content requests.


I'm pretty sure you meant SPDY/HTTP2.


THANK YOU for publishing your configs! I only host a little unimportant blog site, but I want to do my part to be current. As someone keenly interested in this stuff but lacking the time to research it myself, a well-commented Nginx config is enormously helpful.


Very glad to help. :D For those reading this, the best URL is https://github.com/18F/tls-standards/blob/master/configurati..., which is heavily based on the nginx config I maintain personally at https://gist.github.com/konklone/6532544


I'm your counterpart at another agency. I'm glad to see other agencies are not doing FIPS on their websites (which would mean RHEL with mod_nss only). I'm a bit confused, though: last I looked, FedRAMP still required it. Have the mandates changed?


18Fer here. Before I answer in greater detail, why do you think FIPS requires RHEL with mod_nss only? I don't see why an OpenSSL in FIPS mode wouldn't fit the bill too.


Regardless of your detailed answer, FIPS crypto requirements are a topic of some amusement in professional cryptographic and security circles, and anything you do to push back on them will be a help basically to humanity.


Nuke it from orbit. It's the only way to be sure. :)


I am in 100% agreement with you, FIPS is bonkers.


https://access.redhat.com/solutions/95213

Dated May 28, 2014

If you don't have an account:

"So at this moment we cannot say whether mod_ssl is going to be a valid crypto module in FIPS mode under RHEL-6 although this is the intent."

That may have changed, and it contradicts other sources on redhat.com. There are a lot more KB articles on FIPS than there were when I last really dug into it, over a year ago.

Edit: yes, it looks like it was mod_nss only until the release of RHEL 5.9 last January. RHEL 6 was ongoing, but elsewhere in the knowledge base they now claim mod_ssl will work.

You can't even use FIPS in Ubuntu/Debian at all: https://bugs.launchpad.net/ubuntu/+source/openssl/+bug/95001

FIPS is just one area where there seems to be a lot of contradictory information in federal IT. After doing the FedRAMP dance and reading things to the letter, we stopped working toward it and partnered with one of the vendors that got it first. Their remote access was plain-text VNC with an 8-character password max. I would say I was surprised that the paperwork matters more than real security, but I wasn't.


So your post makes no sense. OpenSSL provides the FIPS portion directly. You can just download and compile it according to the instructions, and you are now FIPS compliant, just awaiting a certification. You can do this yourself; you don't need Red Hat or Debian to do it for you.
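
(For anyone who hasn't seen it: once you have a FIPS-capable OpenSSL, "FIPS mode" is literally one library call. A minimal C sketch, assuming an OpenSSL 1.0.x built with the FIPS Object Module; the program is illustrative, but FIPS_mode_set()/FIPS_mode() are the real APIs:)

    /* fips_demo.c -- minimal sketch: flip on FIPS mode at runtime.
       Assumes a FIPS-capable OpenSSL 1.0.x with the FIPS Object Module.
       Compile: cc fips_demo.c -lcrypto */
    #include <stdio.h>
    #include <openssl/crypto.h>
    #include <openssl/err.h>

    int main(void) {
        /* Fails (returns 0) unless OpenSSL was built FIPS-capable */
        if (!FIPS_mode_set(1)) {
            ERR_load_crypto_strings();
            fprintf(stderr, "FIPS mode failed: %s\n",
                    ERR_error_string(ERR_get_error(), NULL));
            return 1;
        }
        /* From here on, non-approved algorithms (e.g. MD5) are disabled */
        printf("FIPS mode: %d\n", FIPS_mode());
        return 0;
    }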

This is one of the problems with government, and hopefully something that will change. All that gets done is piecing together bits of what outside vendors have put together, and even the piecing together is normally done by contractors.


So you think recompiling OpenSSL from scratch, deviating from the upstream vendor's supported binaries and creating dependency problems with every update, just to support a mostly smoke-and-mirrors standard, is a good idea? I don't really think that's a best practice in commercial or government IT.


Exactly what the American people have come to expect from the government: unless it's been gift-wrapped by a contractor, they lack any ability to do anything technical.

You make an RPM and you deploy it like you would any other package. Yes, it is a best practice; in fact, the people at Red Hat do the _exact_ same thing. The difference is that they have the technical capability to make those kinds of changes, as do most people in the commercial IT sector. The government is the one place where they call it IT when it's really just glorified procurement.

However, that's not even the problem; as you stated, it's supported just fine, and has been for almost 9 years. The bigger issue is that there was a perception it wasn't, and instead of working out what the reality was, people just did nothing.


> Exactly what the American people have come to expect from the government: unless it's been gift-wrapped by a contractor, they lack any ability to do anything technical.

Exactly the expectations that 18F would like to change.


"just awaiting a certification."

You say that as if the certification part itself is remotely quick, predictable, or easy.


There are a couple of issues that rarely get mentioned:

- For very high traffic websites, HTTPS costs a lot of money. .gov sites are not very high traffic.

- When you have that much traffic, you also have a bunch of old WinXP and similar clients. These don't work fast over HTTPS, and don't work at all without SSLv3 and the like.

So while HTTPS with safe settings works in most cases, it doesn't work in all cases.


A few sources claim that TLS is almost free today:

https://www.imperialviolet.org/2010/06/25/overclocking-ssl.h...

https://istlsfastyet.com/

http://blog.codinghorror.com/should-all-web-traffic-be-encry...

Although you may argue that if you are a CDN, encryption can be a significant portion of your costs.

Also, Jeff Atwood writes the following:

> Of course, there's no reason to encrypt traffic for anonymous, not-logged-in users, and Twitter doesn't. You get a plain old HTTP connection until you log in, at which point they automatically switch to HTTPS encryption. Makes sense.

Today we know that this is no longer a valid point. Tor users can be deanonymized by injecting traffic into plain HTTP connections; upgrading to HTTPS seems to fix that. So we really should use HTTPS by default and HTTP only in very well articulated cases.


No, SSL/TLS is not expensive. I've been running a moderately high traffic website in EC2 for a few weeks. It peaks at ~250k requests per minute per c3.8xlarge instance using only 40% CPU. 100% of the traffic is over SSL/TLS, and the response size is less than 100 bytes, so most of the overhead is in handshaking (which is much more expensive than just sending data). With a larger response size, the CPU utilization would be even lower.


Get a CDN and you'll see the price. If you don't need a CDN, or your own multiple international PoPs, then you're not really all that high traffic.


> For very high traffic websites, HTTPS costs a lot of money. .gov sites are not very high traffic.

.gov sites can be very high traffic. Think weather, social security, health care, immigration, visas...

> - When you have that much traffic, you also have a bunch of old WinXP and similar clients. These don't work fast over HTTPS, and don't work at all without SSLv3 and the like.

SSLv3 only kills IE6, which is doable. And it's okay if very old clients work slowly with HTTPS -- those clients have far more problems than just slow HTTPS.


Very high traffic is google.com and similar sites. .gov sites are not the tiniest sites, but they're dwarfed by anyone getting a million hits per hour 24/7 (and yes, Google is obviously way beyond even that).

Killing IE6 and some others (Java clients, etc.) is not always doable, and that's the point. It's doable for many, but not all.


Try to go to https://www.irs.gov/

Probably not the best way, but earlier this year I spent about an hour on the phone with the IRS's CC telling them that irs.gov should be behind HTTPS with proper certs. They said that all the pages with sensitive information are behind HTTPS; simple documents and forms are not that important, I guess. As you can see, it's still an issue.


gsa.gov seems to be HTTP-only as well.


You claim the second most important reason for using HTTPS is my privacy, yet you share all pages I visit with Google analytics.

As long as 90 percent of all websites keep using Analytics, the authorities only have to go to one place with their warrant to get more or less your complete surfing history. And by adding SSL to your site, you make tools like Privoxy useless.

I think it's strange no one has pointed this out in the SSL hysteria going on right now.


> You claim the second most important reason for using HTTPS is my privacy, yet you share all pages I visit with Google analytics.

I think that's a totally fair point. Our use of Google Analytics isn't changing any time soon, but we do plan to add a third-party disclosure page that makes it clear what third parties have some window into our visitors' browsing:

https://github.com/18F/18f.gsa.gov/issues/293

This includes otherwise invisible things, like our host, Amazon, and (until we implement OCSP stapling, which is happening soon) our CA during revocation checking (in some browsers).

FWIW, we do turn on the Google Analytics anonymization flag, which instructs Google to chop off the last octet of the IP address before they write the data into their database. Of course, that depends on trusting Google to keep their promise, but it's something.


You have a valid point. But I guess using a JavaScript blocker like NoScript will solve this problem?


I really don't think .gov cares all that much.

Over a year ago, I came across a vulnerable .gov site that was first reported in a news.com article from 2005! Sadly, it's still vulnerable today. The author of the original piece states they contacted the Department of Labor back in '05, who responded that they were 'working to address the issue.' Before I wrote the blog post, I tried getting some attention too, but never got a response. It's now been 9 years, and the site is still vulnerable.

http://jarmoc.com/blog/2013/10/14/open-redirect-in-gov-for-e...

.gov sites are littered with trivial vulns that never seem to get addressed when reported. HTTPS is great and all, but it's far from the only thing that needs to be done in .gov.


(18Fer here)

That's one of the things we're very much focused on fixing. While we can't go in and change every federal government website out there, we can work to ensure that the security of the platforms we are working on is as tight as possible. 18F is working with a number of agencies (see https://18f.gsa.gov/dashboard/ for the full list), and our hope is that we can be a force multiplier for security best practices throughout government.

Furthermore, we take responsible disclosure very seriously, and welcome any feedback through our email (18f@gsa.gov). We should probably take this up a notch and have a dedicated security inbox that goes directly to our core security team.


I didn't mean to imply that 18F is responsible for everything in the .gov namespace, and I have no reason to doubt that you guys take security seriously.

The problem, from my perspective, is that NO ONE is responsible for everything in the .gov namespace. Trying to sort out an appropriate security contact, especially for relatively minor vulns like this (though there's likely also reflected cross-site scripting on benefits.gov), is a nightmare.

Another example involves some work I did while at a previous employer: https://web.archive.org/web/20131114050720/http://www.secure...

I definitely think you're right that a dedicated security inbox would be helpful. Even better would be a dedicated .gov-wide security inbox. I don't know how difficult it would be to establish something like that, but it would give security minded folks who care about improving things an outlet for sharing these sorts of details, and hopefully help get things fixed.


> The problem, from my perspective, is that NO ONE is responsible for everything in the .gov namespace.

Actually, the Department of Homeland Security is responsible for the security of .gov (at least in theory). I don't think they have unilateral directive authority to enforce that, however, which is a big problem. But maybe reporting the issue to DHS (whatever their cyber security division is) can rattle something loose.


US-CERT handles information security incidents and reports within DHS: https://www.us-cert.gov/report


One of the big things you have to remember is that the government is both enormous and inconsistent (imagine an F500 with a ton of acquisitions and you're on the right track). I'd put pretty good odds that there's someone who cares at most of those .gov sites, and that they've been trying to get their management to authorize a fix for many years.

In many cases, they've been prevented from hiring technical staff for a decade or longer, so it's not uncommon to find projects which are entirely on life support, subject to a contract which would require expensive (in both time and money) amendments, etc. The person responsible likely knows about the problem but has zero ability to either fix it directly or get someone else to do so.

The other thing you tend to find is poorly considered policies with serious unintended consequences – e.g. for alleged performance/security reasons, all traffic is required to go through an agency-wide load balancer, but they didn't buy the SSL module / it's already at capacity, and nobody is willing to authorize an exception. (Bonus points if the old single point of failure is scheduled to be replaced by a shiny new SPOF and no changes are allowed until that overdue project ships.)


A bit off topic here, but I just wanted to say that it's really cool to see some government employees on HN, and replying and adding value too.

I sometimes fall into the perspective that HN is mostly a community of startup and tech sector people, so it's good to be reminded that there's a wider community here.


As a former gov employee, I know for a fact that a lot of gov sites are left vulnerable and poorly maintained for many reasons.

i.e https://www.ssllabs.com/ssltest/analyze.html?d=elis.uscis.dh...

I worked with a lot of out-of-touch people who, instead of learning something new, would rather just do what they know and keep the status quo. What made it harder was that a lot of these people were smart and knew how to play the game to keep the work favorable to what they already knew, for various (usually contractual) reasons. People were very dug in and protective of their piece of the pie. It made collaboration difficult, because everyone was seemingly an expert when in actuality they were clueless and googling like everyone else.

I wish you good luck, 18F. You're gonna need it.


Pretty amazed that 18F exists, and impressed -- optimistic that projects like this are going to bring government IT forward. There are some things government does relatively well; IT has never really been one of them. Maybe that can change.


It's pretty sad; the government actually used to be the innovator in IT. The Navy is still doing personnel management (albeit behind a ton of Web-based wrappers) on systems written by government programmers back in the '70s, and the government used to drive a lot of other advancement in computing and IT.

But the government is inescapably bad at IT today and seemingly has been since the World Wide Web was first launched.

I hope that can change too, but I don't see that happening as long as the divergence between civilian sector pay and government benefits remains so crazy. This has sucked much (though not all) of the best technical talent (and worse, tech-savvy managers) out of the public sector. Without a critical mass, all of your really skilled geeks in government find it almost impossible to advocate for the right answer, and the policy makers and supervisors are usually not able to tell the difference between the right answer and other, non-feasible, courses of action proposed by the unskilled geeks in government.

To top that off, when the government can't build things using civil servants alone they have to fall back to contractors, but our contracting processes are so insane that it doesn't surprise me at all that our government contracting officers find it impossible to hire and oversee the best companies to deliver and maintain a working project. There have been successes but IMHO when that has happened, it's because the contracting officer has blindly stumbled into a decent contractor.

I'm excited for 18F because a lot of the problem comes down to educating our public sector middle management into what's possible and what's not possible. By demonstrating by example and by explanation of what's already possible in current rules, 18F can boost the stock of the buried geeks hiding out in many of our government agencies.


> By demonstrating by example and by explanation of what's already possible in current rules, 18F can boost the stock of the buried geeks hiding out in many of our government agencies.

You nailed it.


For anyone "hearing" about 18F for the first time in this thread, they are pretty legit. It's not just another typical gov agency with a little lean startup lingo and a cute site. That was my first guess when I first read about them, but I was wrong. A quick cruise through their github shows that. However, like someone mentioned, they will need more good luck than a leprechaun with a rabbit foot keychain if they are going to bring true innovation to the federal government. There are so many points of failure within the process of working with other agencies. I'm hoping for the best!


Do you have to encourage your colleagues to use it? I've enabled HTTPS on all our sites, yet when links are distributed they're the usual HTTP.

I partially blame apps like Word and Gmail. When you type www into a Word document, for example, it converts it to a link without the https. I've set some of our servers to automatically redirect to HTTPS, but when you're working with a few dozen web servers it gets cumbersome.


> Do you have to encourage your colleagues to use it? I've enabled HTTPS on all our sites, yet when links are distributed they're the usual HTTP.

Yeah, sometimes. But that's also why we enable HSTS on our stuff, so that it stops mattering. We also just pushed the first .gov domains into the Chrome HSTS preload list a week and a half ago:

https://chromium.googlesource.com/chromium/src/+/af1870543d1...
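
(For anyone replicating this: the server side of HSTS is tiny. A hedged nginx sketch with a hypothetical example.gov:)

    # Send all plain-HTTP traffic to HTTPS
    server {
        listen 80;
        server_name example.gov;                     # hypothetical
        return 301 https://$host$request_uri;
    }

    # Inside the HTTPS server block: tell browsers to use HTTPS
    # for the next year, even when a link or the user says http://
    add_header Strict-Transport-Security "max-age=31536000";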


Also, take a look at the US Digital Services Playbook. Pretty impressive, right? The US Govt is stepping its game up! I wish the administration would do a better job of touting these big steps in the right direction. Can't have it all, I guess.

http://playbook.cio.gov/


GSA websites don't just require HTTPS - they require each individual who accesses the websites to get a digital certificate from ACES/IdenTrust.

You have to submit a paper form, two forms of photo ID, references...

To top it all off they snail-mail you the access code to then go online and download your digital certificate. Very archaic.


I believe that if you set up your infrastructure correctly on your end, you can still create a personal cert for your own use from Kerberos, valid for a week, for all GSA services. The paper IdenTrust route was more of a backup method for those who hadn't.


Are you referring to developers within the GSA, or to internal users? The public certainly doesn't need to go get a client certificate to access these sites.


Let's say you want to sell products to the Veterans Affairs department through the GSA. To update your product catalog on their site, you need a digital certificate.


Oh, ok. Yeah, I can just imagine the bureaucratic nightmare that would involve.



Is it safe to use a SPDY implementation at this point? I've seen critical flaws in the past...


The largest security issue with SPDY was CRIME, which was specifically caused by a gzip-compress-then-encrypt problem with the HTTP headers (which can contain attacker controlled info).

The switch to HPACK for header compression resolved this. SPDY/HTTP2 has no currently known security flaws.


Exactly - and that's why we have SPDY header compression disabled in our nginx config:

https://github.com/18F/tls-standards/blob/master/configurati...

When SPDY 4 or HTTP/2 makes its way into an nginx module, we'll upgrade and turn header compression back on.
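
(In nginx terms, that's a one-line knob in the SPDY module, roughly:)

    server {
        listen 443 ssl spdy;
        spdy_headers_comp 0;   # keep SPDY header compression off (CRIME-class risk)
    }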


Er, so I take this back? Ilya Grigorik weighed in and told me I wasn't distinguishing between request and response headers:

https://github.com/18F/tls-standards/issues/24


How do you balance this with the network defense guys who want to break and inspect everything?


You had to waste a lot of IP addresses in the past because of Windows XP and IE8. :-(


Sadly, we still do. Requiring SNI remains a tough sell. That's one of the biggest limiters on our HTTPS setup's ability to save costs and be a better steward of IPv4 space.
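
(Context for anyone following along: SNI is the TLS extension that lets one IP serve many certificates. The server side is trivial; it's the old clients that can't send the extension. A hypothetical nginx sketch:)

    # Two certificates on one IP; works for any SNI-capable client.
    # Clients that don't send SNI (IE on XP, old Java) get the default
    # cert for the address and then throw a name-mismatch warning.
    server {
        listen 443 ssl;
        server_name alpha.example.gov;               # hypothetical
        ssl_certificate     /etc/ssl/alpha.crt;
        ssl_certificate_key /etc/ssl/alpha.key;
    }

    server {
        listen 443 ssl;
        server_name beta.example.gov;                # hypothetical
        ssl_certificate     /etc/ssl/beta.crt;
        ssl_certificate_key /etc/ssl/beta.key;
    }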


Why is that? Do you still have IE8 users?


Yes. Not just inside government, either.


"HTTPS has never been faster" the same can be said of marathon runners, but millions of stubborn people are still using their car for 40ish kilometer distances.

Or maybe the sentence is a disaster...


A mid-range motorcycle is cheaper than a car and could make that trip in 9 minutes or less, if security isn't a concern.


Considering that lately there's been a pretty strong shift to using HTTPS everywhere, I don't see how using it on .gov sites needs an explanation :)


> Considering that lately there's been a pretty strong shift to using HTTPS everywhere, I don't see how using it on .gov sites needs an explanation :)

If only the .gov space were shifting at the same rate! That sense of inevitability and momentum you sense in the private sector is not so widely present inside the US government. Posts like this have two audiences: outside folks like you, and my colleagues around the rest of the .gov world.

For reference, 18F got the first .gov domains added to the Chrome HSTS preload list less than 2 weeks ago:

https://chromium.googlesource.com/chromium/src/+/af1870543d1...


Well, you're moving faster than all of the major banks in America. None of the top 5 or so banks I checked bother with HSTS.


You'd think that would've happened years ago, after all the "protecting our infrastructure" and "cyberthreats" rhetoric from certain parts of the government.


Sadly, it's not rhetoric.

Also sadly, yes you would think some progress would have started years ago.


> Citizens expect government websites to be secure, trustworthy, and reliable.

Yes, just like those involved with government. Ha!



