Promiscuous cookies and their impending death via the SameSite policy (troyhunt.com)
223 points by tomwas54 on Jan 4, 2020 | 78 comments



Just a nit on this point -

"Quick note on Microsoft's implementation: their first shot at it was buggy and caused the "None" policy to omit the SameSite cookie attribute altogether"

This isn't a bug, just unfortunate naming and a lack of prescient engineers. When it was written, None was not a valid enum value (and frankly still isn't) and the default was Lax. To remove Lax, you set it to None in dotnet, which produced the standard behavior at the time: emit nothing.
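
For illustration, roughly what that old mapping amounted to (a TypeScript sketch of the behaviour described above, not the actual framework code):

    // Sketch of the old behaviour: "None" meant "emit no SameSite attribute at all",
    // because the spec had no None token when this was written.
    type SameSiteMode = 'None' | 'Lax' | 'Strict';

    function sameSiteAttribute(mode: SameSiteMode): string {
      if (mode === 'None') return '';          // old meaning: opt out of the attribute
      return `; SameSite=${mode}`;
    }

    // sameSiteAttribute('None') -> ''               => Set-Cookie: id=abc
    // sameSiteAttribute('Lax')  -> '; SameSite=Lax' => Set-Cookie: id=abc; SameSite=Lax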

If every framework implemented every random extension proposed to every standard, we'd be in a very messy world with half-baked and sometimes contradictory standards implemented. Special casing because it's Chrome doing the breaking change to the standard/Internet is not a precedent I want to see.


>and frankly still isn't

It looks to me like it was added on April 9, 2019:

https://github.com/httpwg/http-extensions/commit/fa624b1358b...


Note that this is the draft, and 6265bis-03 expired in October. While it's presumed that this proposal will be accepted and become the new standard, a single commit does not make the standard.


Chrome seems to have made one change recently -- maybe related to this? -- that made it so YouTube videos embedded in other sites no longer have you logged into YouTube.

This means that if you have YouTube Premium, where you pay $10/mo so you don't have to see ads on YouTube videos, you still see them when the video is on someone else's site. Oddly, it still works as it used to in Brave.

Since this is sort of cross domain cookie related (YouTube uses an iFrame typically when embedded, whether embedded by pasting html or by using the JavaScript API), does anyone know why this happens? If you ask YouTube Premium support, their canned answer is "we don't guarantee no ads except on the YouTube site." But they can't explain why it works with no ads in non-Chrome Browsers.


Is that cross-domain cookie related? I thought that, by virtue of being in the sandbox that is the iframe, it has access to the cookies associated with its domain(s).

It sounds like an excuse to serve you ads, honestly.


It would break other features (like the "Watch Later" button) too.

According to Occam's Razor, this is very likely to be a bug rather than a conspiracy against YouTube Premium customers.


I agree it is probably a bug, partly since other browsers aren't affected.


Worked for me just now (tested a YouTube video on Reddit in Chrome and watch later had my account name on hover + no ads).

You sure you don't just have third-party cookies blocked? Could be a staged rollout of something, but third-party cookies are usually what causes it for me, since I whitelist sites for those.


Are you using stable Chrome or Beta/Canary? Have you modified any of your Chrome flags?


Regular stable Chrome. And the issue is the same on my mac and my chromebook.


I work for a company whose endpoints end up being embedded in iframes in external systems, and this change is currently causing no small deal of heartburn for several folks here.

I really don't like that the Powers That Be have decided to make this change, for many reasons:

1. Changing the default behavior of a thing on the 'net that's been around for so long. Like the article says, I can't wait to see the various and sundry things that are all broken as a result.

2. To set SameSite=None and get back to the old behavior (more on this in a second), you need to do... user-agent sniffing, because some browsers (as the sibling comment by swang says) will choke on None and fall back to Strict. Great, so sometimes set SameSite=None, sometimes don't set it or else things will break (see the sketch below). eye roll

3. As the article says, SameSite=None is only allowed on Secure cookies, which means you actually can't get back to the old behavior. Now, my company has been telling people for years to stop using HTTP (and customers have to contact support to even get HTTP support enabled). However, there are a few enterprise-y holdouts. In several cases, we've had to go be the bearers of bad news. Admittedly, there is an undercurrent of glee in finally forcing them to stop being bad stewards of data (assuming they don't just enterprise-policy it away), but still, from a business perspective it's very frustrating.

So, in sum, it'll break stuff that's not updated (my guess: a lot of stuff), setting SameSite=None requires a user-agent-sniffing hack, and even setting SameSite=None is not a complete solution if you're using HTTP for some reason.
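
For anyone in the same boat, the sniffing ends up looking something like this (a minimal sketch; the patterns are illustrative and nowhere near the full incompatible-clients list Chromium publishes):

    // Decide whether it's safe to send SameSite=None to this client.
    // Patterns are illustrative only - see Chromium's "incompatible clients" page.
    function supportsSameSiteNone(ua: string): boolean {
      // iOS 12 / macOS 10.14 Safari treat the unknown None value as Strict.
      if (/\(iP.+; CPU .*OS 12[_\d]*.*\) AppleWebKit\//.test(ua)) return false;
      if (/\(Macintosh;.*Mac OS X 10_14[_\d]*.*\) AppleWebKit\/.*Version\/.*Safari/.test(ua)) return false;
      // Chrome/Chromium 51-66 rejected cookies carrying the (then unknown) None value.
      const m = ua.match(/Chrom(?:e|ium)\/(\d+)/);
      if (m && +m[1] >= 51 && +m[1] <= 66) return false;
      return true;
    }

    // Only append "; SameSite=None; Secure" when supportsSameSiteNone(userAgent)
    // is true; otherwise omit the SameSite attribute entirely.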

And for what, exactly? This would've been nice in 1995, but it's a bit late now. Though, I guess maybe twenty years from now we can rip out (or stop writing) some anti-CSRF code or something.


You don't fix security problems with the Same Origin Policy because you're trying to free up some anti-CSRF code to make developer lives easier. You do it because mistakes with that anti-CSRF code result in vulnerabilities, which harm users, who are an externality both to developers and standards authors. And those mistakes happen all the time.

The SameSite change we're talking about decisively mitigates most CSRF vulnerabilities. Once widely deployed, it probably kills the bug class, turning it into another bug bounty eye-roller like ClickJacking, rather than what it is now: a bug that is routinely exploitable on significant sites. It is more than worth it; it's one of the smartest things the browser vendors have done in a while.


You're not wrong, the universe with SameSite available is safer than the ones without it. The migration is just frustratingly painful for those of us stuck between slow framework updates, unsupported browsers, and customers who are slow to upgrade our software. And for those of us who have csrf protections, it's effort spent just to maintain the status quo ante.


I like making the internet less dangerous to handle, so to speak, but there are always trade-offs, and I'm not sure that changing the default brings enough benefit to warrant all the pain.

To me, it seems like it would've been better to use this energy to push the community into opting into SameSite={Lax,Strict} by default (make it a 'best practices' thing). Get it added to automated security tooling, make the browser console print messages for cookies missing SameSite, etc.

Admittedly, it is much harder to reach all the web devs in the world, and so some sites may not opt into SameSite and would be bitten by CSRF, but that is in line with e.g. X-Frame-Options / Content-Security-Policy. It's not ideal, but it preserves backwards compatibility, which is a thing I value very, very highly.
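
As a rough illustration of the "automated tooling" idea a couple of paragraphs up, a check like this (hypothetical, minimal) is all it would take to flag Set-Cookie headers with no SameSite attribute:

    // Hypothetical lint-style check: flag Set-Cookie header values missing SameSite.
    function warnOnMissingSameSite(setCookieHeaders: string[]): string[] {
      return setCookieHeaders
        .filter((header) => !/;\s*SameSite=/i.test(header))
        .map((header) => `cookie "${header.split('=')[0]}" has no SameSite attribute`);
    }

    // warnOnMissingSameSite(['session=abc; Secure; HttpOnly'])
    //   -> ['cookie "session" has no SameSite attribute']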


> turning it into another bug bounty eye-roller like ClickJacking

When did clickjacking get mitigated by default by browsers? As far as I know it’s still up to websites to prevent framing explicitly.


I'm not saying that CJ has been mitigated by default the way CSRF is poised to be, but rather that it's very rarely exploitable, which is soon to be the case for CSRF as well.


20 years is too pessimistic. It was nearly accurate exactly once, when IE6 remained dominant for nearly a decade and a half. As far as anyone can tell, there will probably not be another case where a single outdated version of a browser holds a significant part of the marketshare. Most folks are now using browsers that update automatically, be it Edge, Safari, Chrome, Firefox or even numerous niche browsers.

There of course remain some demographics where outdated browsers live on. Like I'm sure the last version of Chrome that works on XP has some marketshare in China.

The worst case I can really think of is older Android, but let's at least keep in mind that Android has only been in people's hands for a decade, and practically speaking most developers are not very concerned with supporting Android Browser from really old Android releases. IIRC, Cloudflare's free tier already doesn't work with Android Browser from 2.2 because at that point it didn't support SNI. Not to mention, Android Browser marketshare is quite tiny at this point, as I believe Chrome or Samsung Browser is usually the default browser on modern Android handsets.

I would guess that as long as this move goes off without a hitch and isn't reverted, it will likely be the safe reality in under 5 years. My guess is not particularly special, but I mean, the internet is already unusable on really old browsers. For better or worse, the world we live in doesn't have much tolerance for older software versions. Sometimes this is for pretty good reasons (such as improving cryptography standards).

I will admit I have a strong aversion to breaking browser support for anyone pretty much, but the security of 95% of users is worth forcing 5% of users to update their damn software. It's unfortunate that the situation can't be a bit more backwards compatible, but I think that's just how things go with certain hard problems. There's never not going to be some compromise, and it doesn't seem likely any time in the future will be particularly better than right now.


I agree on the dislike for the change, although it's not a big hassle for us at work.

The bigger headache for me is updating side-projects: I get little enough time to work on them and when I do have time I want to be doing interesting stuff, not jumping through bureaucratic hoops.

It's just no longer the case that you can put anything on the web, apart from basic static content, and have any hope it'll still "just work" in 5 years' time, because a bunch of do-gooders and busybodies at Google, Mozilla, Apple and the like are constantly fettling behaviour that's been standard for years, with sometimes debatable justification.

This is one issue. Others I've had to deal with in the past 2-3 years that spring immediately to mind: blocked access to audio context, blocked access to device orientation[1], and (lack of) reliable full-screen support on iOS.

It gets tedious.

[1] This one caught me completely off guard: I'd had no idea any change was being made here until some time after it had happened.


Backwards compatibility at all costs is a main driver of insecurity.

This change is a good example where the security benefits clearly outweigh the downsides.


We're talking about changing a default behaviour that will affect literally every website that uses cookies. That's a trade-off where I'm not sure the benefits do clearly outweigh the downsides.


No, that's not true. The new default behavior doesn't change the way cookies on ordinary websites work at all; that's why the change was viable. It affects third-party cookies.


Does it affect all iframed websites (e.g., game sites iframed by an aggregator like Kongregate)?


If they use an actual iframe, most likely not. But if they're making Cross-origin requests from JS, probably.


The article discusses iframe behaviour


Adding annoyance to this change is that Safari has a bug where, if you send an invalid value in SameSite ("lax" being considered an invalid value), Safari will default to Strict (rather than None). Thus there is another check developers have to handle, which is to user-agent sniff for a Safari/WebKit browser, then explicitly send "None" or not send the SameSite value of "Lax".

I'm pretty sure the WebKit team is aware of it, but I don't recall a timetable for a release that addresses the issue, so as of Nov 2019 (when I last looked this up) Safari still had this issue.


Hey, there are some important mistakes in this warning:

1. The value that is invalid for older browsers (including older versions of Chrome!) is None, not Lax. It is always (as far as anyone knows) safe to explicitly set SameSite=Lax in all browsers, assuming your site is ready for that.

2. The latest Safari (v13) has changed their behaviour to match the latest spec.

See this article for details on detecting/dealing with it: https://www.chromium.org/updates/same-site/incompatible-clie...

TL;DR: old (but not too old) Chrome responds by rejecting the cookie entirely (which Google says was a valid interpretation of the spec at the time of those versions) and old (but not too old) Safari responds by interpreting the None value as Strict (I think there is some debate on whether the spec allowed this back then, but at this point it doesn't matter/I don't care).


The latest Safari may have changed the behavior, but the bug is tied to the OS version, and Apple has said they won't backport the new behavior to iOS 12 or macOS 10.14. So people who can't or won't upgrade the OS (for example, users of old iPhones) will not get the fix. So user-agent sniffing will probably be necessary for years.

> See this article for details on detecting/dealing with it: https://www.chromium.org/updates/same-site/incompatible-clie....

Yes, they recommend using a few dozen lines of user-agent sniffing code. Despite the fact that user-agent sniffing is generally considered bad practice.


> So user-agent sniffing will probably be necessary for years.

Only if you set None (either to opt-out or to do a None/Strict pair). Setting Lax doesn't require sniffing.

But yeah, that seems like a safe bet.


Here is the bug that I'm referring to (I think the link you posted to is the same bug, but it doesn't mention the additional issue): https://bugs.webkit.org/show_bug.cgi?id=198181

According to that bug report, the issue was fixed in Safari 13 and also iOS 13 like you said, but caniuse says there is still an issue... I'm on 12 right now and can't update to check atm. https://caniuse.com/#feat=same-site-cookie-attribute


Yep, that's the issue. I think I see the confusion now (I stand by my original comment).

SameSite=Lax was never an invalid value, so it was never mishandled by browsers (very old browsers gracefully degrade to treating it like None, which is as good as possible). In the original spec there were Lax, Strict, and unspecified (i.e. the Set-Cookie header didn't have a SameSite attribute, the default behaviour) but, critically, no None.

Browsers developed around that time that treated unexpected values as equivalent to unspecified/what we now call None (e.g. Firefox) turned out to have picked a more forwards-compatible approach. Browsers like Safari and Chrome took stricter action for unexpected values (the idea here is a vague "secure by default" feeling) but it's awkward now that the default is changing from what is (now) called None, to Lax.

In that issue, consider the title "Cookies with SameSite=None or SameSite=invalid treated as Strict" redundant: None was an invalid value according to Safari at that time, which wasn't wrong.

SameSite=Lax is 100% safe to set (assuming your site is ready for that). You only need to browser sniff if you're considering setting SameSite=None.
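
To make the cases concrete, a small Node-style sketch of the three header forms in play (cookie names are placeholders):

    import { createServer } from 'http';

    createServer((req, res) => {
      res.setHeader('Set-Cookie', [
        // No attribute: the old default; new Chrome will treat this as Lax.
        'legacy=1; Secure; HttpOnly',
        // Lax has always been a valid value and is safe to set in any browser.
        'session=2; SameSite=Lax; Secure; HttpOnly',
        // The new explicit opt-out; requires Secure, and is the value that the
        // older Safari/Chrome versions discussed here mishandle.
        'embed=3; SameSite=None; Secure',
      ]);
      res.end('ok');
    }).listen(8080);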


I think OP was saying that Safari treats "lax" as invalid, while "Lax" is valid. Note the casing.


Interesting theory. I don't think it's true, though. The original spec mandated case-insensitive comparisons and it looks like WebKit has always been doing that to me.


100% of the warnings I see are ad cookies from Google. We get it, Google, you no longer need cookies to track us.


That might be because those cookies have been stored for a while. I get quite different results when I check one of my side projects in a normal browser window versus incognito. The latter results in far fewer warnings, and the majority are no longer Google properties (ironically a couple are, but it's much better).


This won't stop cookie-based tracking though, they can just set "SameSite=None" with their cookies


Wouldn't that make it trivial to block them all?


You mean block all SameSite=None cookies? They have legitimate uses too.

Consider that SameSite=Strict even breaks cross-origin links (<a> tags): if a 3rd party site links to you and a user clicks that link, the GET will be sent without cookies.

To get value out of Strict for typical sites, the new pattern is to have two cookies: one that is SameSite=None and allows you to do GET/HEAD/etc. requests ("read-only operations", assuming you are following those parts of the spec) and one that is SameSite=Strict and allows you to do POST/etc. ("write operations").

If https://evil.com adds a link to your site (an <a> tag), you can allow deep linking by only checking for the None cookie. The strict cookie won't be sent for <a> tags. But for POSTs/form submissions, and any page/resource you don't want to allow deep linking for, you would check for both cookies.

I've seen this pattern referred to as "reader and writer cookie pairs".
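
For what it's worth, a minimal Express-style sketch of that pattern (cookie names, routes and the framework choice are all just for illustration):

    import express from 'express';

    const app = express();

    // On login, issue the pair.
    app.post('/login', (req, res) => {
      res.append('Set-Cookie', 'reader=TOKEN; SameSite=None; Secure; HttpOnly; Path=/');
      res.append('Set-Cookie', 'writer=TOKEN; SameSite=Strict; Secure; HttpOnly; Path=/');
      res.send('logged in');
    });

    // Read-only page: the reader cookie is enough, so deep links keep working.
    app.get('/document/:id', (req, res) => {
      if (!req.headers.cookie?.includes('reader=')) return res.sendStatus(401);
      res.send('the document');
    });

    // Write operation: require the Strict (writer) cookie as well.
    app.post('/document/:id/delete', (req, res) => {
      const cookies = req.headers.cookie ?? '';
      if (!cookies.includes('reader=') || !cookies.includes('writer=')) return res.sendStatus(403);
      res.send('deleted');
    });

    app.listen(8080);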

---

This really is specifically aimed at killing CSRF attacks. It's not about tracking either way (it's orthogonal to that).


Why None instead of Lax? The use cases you mentioned for the None cookie seem like they would still work with a Lax cookie.


Ah, good point. So it depends on your site. Some sites need to do things like serve embeddable content or be an OAuth identity provider, etc., and SameSite=None is required in those cases. Sorry for not being more clear about that.


In Firefox at least, it's been possible to block third-party cookies for... as long as I can remember?

And it's a sensible security move - it protects you against XSRF with very low usability cost.


I like the idea behind the change, but it's the first time I've heard about it (and I assume I'm not alone), and it's being deployed in less than a month?! Seriously, Google?


It was first announced last May: https://blog.chromium.org/2019/05/improving-privacy-and-secu..., which is nine months before deployment. There have been warnings in the developer tools for a while about the case where behaviour is changing.


I don't mean to be dismissive, but it has been long announced, discussed, and noisily in the console.

My honest question would be how you've missed it (because I'm assuming your missing it means others would also reasonably miss this), but I have no idea how you could know the answer to that.


> and noisily in the console.

This is true but not necessarily helpful because what the console is often noisily complaining about is cookies from Google properties and the like.

For many cases where I'm not making use of those cookies myself that's simply irrelevant... noise: what I need to understand are changes required for my own cookies, on which the console has remained silent. Also, not something I'm going to be paying attention to if I'm debugging something unrelated.

(I'm not saying the console is a bad place to show these warnings - far from it - but there are plenty of reasons people might not spot them.)

I found out about the changes a while ago through HN but even that was months after the announcement was made. I don't closely follow announcements from Google simply because the vast majority of them aren't relevant to me. That being the case it's quite easy to miss things, or find out about them further down the line via another source.


I understand the complaint about noise, but the message is fairly explicit as to what is changing and what you need to do and where to go for more info:

"A cookie associated with a resource at http://google.com/ was set with `SameSite=None` but without `Secure`. A future release of Chrome will only deliver cookies marked `SameSite=None` if they are also marked `Secure`. You can review cookies in developer tools under Application>Storage>Cookies and see more details at https://www.chromestatus.com/feature/5633521622188032."

A quick google (ha!) check shows articles from plenty of development and security blogs (i.e. not from Google directly) going back to May (though a LOT seem to be from the last few months - not sure if that's because chatter picked up or because Google is giving me more recent results, and I'm too lazy to experiment - I definitely heard about it from multiple sources before the original impact date in Oct).

Focusing on the point - obviously this is a change Google should give "enough" notice for (both time-wise and breadth-wise). What would you recommend they do differently from what they have done? At the end of the day, I'm not really sure what they could do that they didn't do - indeed, since they delayed the original release, there's a real risk of people ceasing to pay attention if you delay too much.

I'm asking out of curiosity, not accusation.


No stress, and I do get it, but it's bound to happen that people don't find out.

My issue with the console warning is that, as explicit as it is, unless you know at least some of the background it's not immediately obvious why it's relevant to me as a developer of mysite.com.


I'd assume plenty of people will completely miss any news of it.

A lot of sites have no developer actively working on them. Even if the developer exists, a lot of them will happily ignore warnings. I've certainly been guilty of that.


As others have pointed out, it's been around a while, although the only reason I knew is that I'd spotted a previous headline on HN some time after the initial announcement.

Regardless, the relevant information isn't so much the announcements, as understanding SameSite and the changes you might need to make to keep your site(s) running. You can find that at:

https://web.dev/samesite-cookies-explained/


Pretty sure it's existed for at least a year or two.


They mention changing your password with a POST request, but at least the ones I have seen require the old password to be included in the request too.

Nevertheless, this is also a problem of web apps in general. In many cases there are better protocols and better programs anyways.

In the case of cookies, there could be user settings: if the user defines a cookie manually, they can choose whether it is sent with cross-site requests or not, and if the server sends the cookie to you, then by default it won't be sent with cross-site requests. Cookies would always be sent for <a> links outside of frames, though, unless the user configures otherwise (such as disallowing them when there are query strings, for example).

Another thing I thought of is a "Web-Option" request header. This is similar to cookies but cannot be set by the HTTP response nor by document scripts; the only way to set it is for the user to set it themselves. The response can include a "Web-Option-Schema" header, which is a link to a file specifying what options are valid; the user can use this or can specify their own options, which might or might not conform to the schema. (This is not meant for authentication. For authentication, use basic/digest auth instead.)
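
If it helps, a hypothetical sketch of how a server might consume that proposal ("Web-Option" and "Web-Option-Schema" are the header names proposed above; no browser actually sends them today):

    import { createServer } from 'http';

    createServer((req, res) => {
      // Advertise which options are meaningful for this site (hypothetical header).
      res.setHeader('Web-Option-Schema', 'https://example.com/options-schema.json');

      // Read options the user configured themselves - by design, never settable
      // via Set-Cookie or by page scripts (again, hypothetical).
      const options = String(req.headers['web-option'] ?? '');
      const dark = /(^|;)\s*theme=dark\b/.test(options);

      res.end(dark ? 'dark page' : 'light page');
    }).listen(8080);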


Another great article from Troy. I personally believe that Google is intentionally flagging their ad network to ensure a large majority of the 'web' and subsequent developers are made aware of the impending changes. However, the console does not make it clear that we're talking about a change that is being implemented in less than a month.


I wasn’t really following this whole SameSite thing but between Safari and Chrome and various versions it looks like they made the problem worse.

The idea is great. Basically, browser vendors finally realized that most websites don't need cookies for cross-site requests, so they switched from opt-out (via CSRF-busting techniques) to opt-in.

Except isn’t following cross-site links basically a GET request initiated by a different referer? So now will the strict mode not have me logged in when someone follows a link to some site that set it? Is that why the default is LAX? And under Lax, what about html form posts to top-level documents? That should go without cookies, right?


I agree that making SameSite=Lax the default is the right thing to do. However, I think chrome is moving too quickly. Making SameSite=Lax the default while a substantial amount of users use browsers that don't support SameSite=None seems like a mistake.


So these cookies may die but it’s a perpetual arms race. Browser fingerprinting will (or has) replace(d) cookies for tracking purposes.


Browser fingerprinting is a hack, and exploits clear loopholes in browser privacy models.

I wouldn't rely on it because it's committing to an ongoing arms race against the browsers. One that I expect them to win.


> I wouldn't rely on it because it's committing to an ongoing arms race against the browsers. One that I expect them to win.

Don't be so sure about this. The world's most popular browser is developed by the world's largest advertising company. I'm not saying Google is intentionally sabotaging Chrome, but I doubt they're putting significant resources into anti-ad technologies.


Well, in the end it's their competitors that are hurt most when they close loopholes without warning. All Chrome needs to do is hamstring ad blockers (which they just did) and add a fingerprint that only Google can use (like tying your Google account to the browser for no reason...).


> Browser fingerprinting is a hack, and exploits clear loopholes in browser privacy models.

> I wouldn't rely on it because it's committing to an ongoing arms race against the browsers.

It doesn't seem to me that browsers are trying to win at all. For example, one of the greatest discriminators - the font list - has been known about since people first started talking about browser fingerprinting.

The fix would be pretty easy too: in incognito mode (or when toggled by the user), only support two fonts - one serif and one sans-serif that ship with the browser on all platforms.

I don't think any of the browsers want to do that.

There are a number of other longstanding fingerprinting issues that are similarly easy to fix.
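
For context, font-list fingerprinting usually boils down to something like this (a simplified browser-side sketch):

    // A font counts as "installed" if text rendered with it measures differently
    // from the generic fallback alone.
    function detectFonts(candidates: string[]): string[] {
      const ctx = document.createElement('canvas').getContext('2d')!;
      const sample = 'mmmmmmmmmmlli';

      ctx.font = '72px monospace';
      const baseline = ctx.measureText(sample).width;

      return candidates.filter((font) => {
        ctx.font = `72px "${font}", monospace`;
        return ctx.measureText(sample).width !== baseline;
      });
    }

    // detectFonts(['Calibri', 'Helvetica Neue', 'Ubuntu']) -> a fairly distinctive list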


Last I checked, Safari in fact restricts the fonts web pages can see/use to ones that ship by default with MacOS. So you can't fingerprint a Safari user via fonts any further than "Safari user".

So yes, browsers, at least some of them, are in fact trying to win here.


You'd need a standardized font rendering engine to defeat fingerprinting via canvas.


"Same canvas image looks the same on every browser" seems like a desirable state of affairs to me?


I think the problem is that canvas can be GPU-accelerated, and GPUs don't have an exact standard for how each pixel will look.
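
(For reference, the canvas vector being discussed is roughly: render something, export the pixels, hash the result - tiny GPU/driver/font differences change the hash. A simplified sketch:)

    async function canvasFingerprint(): Promise<string> {
      const canvas = document.createElement('canvas');
      const ctx = canvas.getContext('2d')!;
      ctx.textBaseline = 'top';
      ctx.font = '16px Arial';
      ctx.fillStyle = '#069';
      ctx.fillText('fingerprint me \u{1F36A}', 2, 2);

      // Hash the exported image; the digest differs subtly across machines.
      const bytes = new TextEncoder().encode(canvas.toDataURL());
      const digest = await crypto.subtle.digest('SHA-256', bytes);
      return Array.from(new Uint8Array(digest), (b) => b.toString(16).padStart(2, '0')).join('');
    }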


> You'd need a standardized font rendering engine to defeat fingerprinting via canvas.

That's fair.

But that really only gives the attacker the OS (and perhaps the GPU vendor?). Not ideal for sure, but not that many bits of info, especially if you are in the majority (windows / intel)


> One that I expect them to win.

Sure, the basic things like "which fonts do you have installed" are easy to make consistent, but there are thousands of other ways to fingerprint a browser, many of which would have serious performance impacts if fixed. For example, MacBook Airs can only run at full CPU speed for about a second before slowing down. Just make a 2-second JavaScript busy loop and watch for the slowdown. Are you going to slow all users down all the time just so these MacBook users can't be identified?
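
A crude sketch of that timing signal (illustrative only; the durations are made up):

    // Run a busy loop and record how much work each time slice gets done;
    // machines that thermally throttle show a characteristic drop-off.
    function throttleProfile(totalMs = 2000, sliceMs = 100): number[] {
      const iterationsPerSlice: number[] = [];
      const end = performance.now() + totalMs;
      while (performance.now() < end) {
        const sliceEnd = performance.now() + sliceMs;
        let n = 0;
        while (performance.now() < sliceEnd) n++;   // busy loop
        iterationsPerSlice.push(n);
      }
      return iterationsPerSlice;
    }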


Doesn't Google have to leave a Chrome loophole for themselves that is less conspicuous than a specific exception for DoubleClick/Google?


The article (and Google Chrome's change) is mostly about auth cookies, not tracking ones.


What is the technical difference between an “auth” cookie and a “tracking” cookie?


None; it's a use-case difference, not a technical one. But SameSite is designed to tackle CSRF (a problem with using cookies for auth). It won't prevent user tracking.


Sure it can. Samesite cookies will prevent e.g. Google Analytics from identifying me between domains, since any samesite cookies they set for the domain from which they’re serving their script/pixel won’t be sent. (Presumably tracking prevention will eventually start to block cookies with samesite disabled).


Browsers have offered a "block third party cookies" setting for decades.

I'm honestly surprised none of the major browsers block third-party cookies by default; it's much simpler XSRF protection than this as it doesn't rely on site developers updating and setting the new flag right.

Of course, two sites that seriously want to collaborate on user tracking (or login) can always forward the user’s whole browser window there and back, with URL parameters to synchronise first party cookies.


> as it doesn't rely on site developers updating and setting the new flag right.

Chrome is enabling this flag by default. Websites can opt out, but if they do nothing they are opted in.

Blocking third-party cookies doesn't really stop CSRF attacks. At most it makes the attack a bit more noticeable, as it prevents some of the quieter methods of pulling off the attack - since, as far as I understand, if you submit a cross-domain POST form, that's still a first-party cookie.
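
For illustration, the quiet version of the attack being described is roughly this (a sketch with a made-up target URL). The cookie that rides along with the POST is a first-party cookie for the target site, so blocking third-party cookies doesn't help - a SameSite=Lax default does:

    // Classic CSRF shape: an attacker page auto-submits a cross-site POST, and the
    // victim's browser attaches the victim's cookies for the target site
    // (when those cookies carry no SameSite restriction).
    function launchCsrf(): void {
      const form = document.createElement('form');
      form.method = 'POST';
      form.action = 'https://bank.example/transfer';   // made-up target

      const amount = document.createElement('input');
      amount.name = 'amount';
      amount.value = '1000';
      form.appendChild(amount);

      document.body.appendChild(form);
      form.submit();   // with a SameSite=Lax default, the session cookie stays home
    }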


> (Presumably tracking prevention will eventually start to block cookies with samesite disabled).

This means the privacy advantage doesn't really come now, it instead comes at some hypothetical point in the future.


It makes it harder though.


Does it?

Websites can just opt out if they don't like SameSite cookies. Even if they couldn't, it's trivial for the website operator to work around if they want (and website operators are almost always in on user tracking).


For authentication, there are also HTTP basic and digest authentication. However, I don't know of any web browser that provides functions for the user to manage this authentication. (It would also make it easier for the user to configure cross-site authentication.)


Not sure how this is relevant, but IE had document.execCommand('ClearAuthenticationCache'); for HTTP or TLS auth. Don't think other browsers have anything.


Yes, I've heard of that, although that is done by the document script; what I was asking about is a command for the user to enter instead.


HTTP basic/digest auth doesn't last beyond the session, so just close the browser window?

Probably putting a different username in the URL would work too (in non-IE browsers).



