Hacker News
Let's build a Chrome extension that steals as much data as possible (mattfrisbie.substack.com)
849 points by exadeci on Feb 21, 2023 | 270 comments



> Chrome scrolls the permission warning message container, so more than half of the warning messages don’t even show up. I’d bet most users wouldn’t think twice about installing an extension that appears to ask for just 5 permissions.

An egregious and nearly unbelievable oversight on Google's part. :-\

As a developer, it's unimaginable to me not to test the extreme high and low input counts to ensure things look and operate as expected, especially for a security-sensitive UI element.

The chain of humans who've been responsible for developing and testing Chrome Extension functionality and security has been asleep at the wheel this whole time, for something like 15 years.

There are so many risk-reduction controls in place; tons of red tape and umpteen security and privacy reviews required to ship even minor features or updates, yet here we are.

How many hands have been in the pot and not noticed/raised/resolved what amounts to a pretty obvious security vulnerability? And if this kind of issue can fly undetected for so long, what can organizations with drastically less resources than $GOOG do to ensure adequate velocity while not leaving the proverbial barn doors open?

The author deserves the highest tier of bug bounty reward for bringing this to light. What's that? It wasn't submitted through the proper channels to be eligible? Right.

<insert relevant Dildbort cartoon>


> The chain of humans who've been responsible for developing and testing Chrome Extension functionality and security has been asleep at the wheel this whole time, for something like 15 years.

As the first in this chain of humans, I can tell you that (a) we obviously considered this in the first version of extensions and did not allow permissions "below" the fold, (b) Chrome's extension model dramatically improved on the previous state of the art which was Firefox's "every extension can do everything, extensions can't be uninstalled completely, and there's no review" [1], and (c) the install dialog is just one part in a bigger system which includes the review process.

I encourage the author to try to get this onto the store and get meaningful usage; then we can complain about how well the entire system works end to end. Examining the install dialog alone is missing the point. I'm not even certain that an extension that requests more than 5 permissions would be approved in the first place.

I also encourage readers to remember that generally speaking, you all _want_ extensions. When Chrome didn't have them, they were the top feature request in the bug tracker. Real security is hard. If you don't solve user needs, users solve them themselves with solutions that are even worse (i.e., native code). Managing the browser extension system is a thankless, painful job of delicately balancing incentives. Extensions need to work well enough that developers don't reach for more powerful and dangerous tools, but have enough controls that the majority of malware can be controlled. It sucks. Trust me, you really don't want this job. Please spare a bit of empathy for the "chain of humans" that have had it.

[1] https://static.googleusercontent.com/media/research.google.c...


> (a) we obviously considered this in the first version of extensions and did not allow permissions "below" the fold

Irrelevant.

> (b) Chrome's extension model dramatically improved on the previous state of the art which was Firefox's "every extension can do everything, extensions can't be uninstalled completely, and there's no review"

Irrelevant.

> (c) the install dialog is just one part in a bigger system which includes the review process.

"Our MFA system is broken and will accept wrong input, but that doesn't matter because without the correct password you won't get in."

Would that fly?

> I also encourage readers to remember that generally speaking, you all _want_ extensions. When Chrome didn't have them, they were the top feature request in the bug tracker.

What are you trying to say? That because users want it then overlooking security issues is excusable?

> If you don't solve user needs, users solve them themselves with solutions that are even worse (ie native code).

That falls under the realm of 'their fault.' This problem does not.

> Please spare a bit of empathy for the "chain of humans" that have had it.

OK, I feel bad for the poor Google worker with the thankless job and a six figure salary. Now fix it.


This is a needlessly combative reply. Your interlocutor is not sitting at the interrogation table.


I admit it is a bit combative, but his response took absolutely zero responsibility and expressed zero concern for what is in my mind a huge and inexcusable security oversight. Countering someone acting as an apologist for a de facto monopolist corporation is also needed, or else we end up with complacency and rot.


Frankly, your tone makes you sound like an entitled armchair expert, whereas the paragraphs you have labeled as "irrelevant" sound extremely relevant. Why don't you use a few words to explain why you think those paragraphs are irrelevant?

Even though he didn't take responsibility I walk away convinced that he's doing a good enough job.


> Frankly your tone made yourself sound like an entitled armchair expert

Well, the OP's tone was flippant, and I thought it was dismissive. Who cares how thankless the job is -- it is your job. Stop blaming the users for wanting things, fix the problem, and stop making excuses and acting put out.

> whereas the paragraphs you have labeled as "irrelevant" sound extremely relevant. Why don't you use a few words to explain why you think that paragraph is irrelevant?

It is irrelevant because the issue exists now. Who cares what they accounted for 8 years ago, and who cares what their competitor does?

P1: My lock opens with any key. It should only open with my key.

P2: But the competitor's lock opens without a key!

P1: I don't care? Fix it.

> Even though he didn't take responsibility I walk away convinced that he's doing a good enough job.

You should raise your standards.


To clarify:

1. I agree the bug should be fixed.

2. I don't work on Chrome anymore and have not for 8 years. I should have made that more clear.

3. I was responding to the claim: 'the entire chain of people working on this has been asleep at the wheel'. OP was assuming this bug has existed forever. I know for a fact it hasn't, because I remember caring about getting this right and implementing custom UI for it. I also listed some other things we improved over status quo at the time, which refute this claim.

4. Based on my knowledge of how this all used to work, it's not as big an issue as the original article and several people in this thread are making it seem, because the review process is really the primary safety check in the system. It needs to be, because, as many have noted, most users don't read the dialog. Extensions are required to have a single purpose, and extraneous permissions aren't allowed. Even if an extension with a large number of permissions were submitted, extensions (and updates) are reviewed with both automated and manual processes, and it would be difficult to get a malicious extension through the process or to get large numbers of users on it.

I still agree the bug should be fixed. We cared a lot about getting this UI right when we originally implemented it, and it's unfortunate that it regressed.


Because there’s nothing to be concerned about. Author irresponsibly incited a bunch of FUD. End of story.


>> (a) we obviously considered this in the first version of extensions and did not allow permissions "below" the fold

> Irrelevant

No, it is directly relevant. He is responding to someone who said that this problem with the permissions UI goes all the way back to everyone who was involved from the very beginning.

He is one of those people who were involved at the beginning and is pointing out that they did not have that problem back then.


Perhaps I'm missing something, or things have changed in the last couple of years, but when I last took a look there were a few main issues with the extension security model whose fixes would have been dead simple to implement.

My understanding at the time (I realize I could be mistaken about any of these):

* Users have no idea what code they're installing. Extensions aren't required to be open source, where the community can audit them for malicious behaviour. Even if an extension claimed to be open source, there's no verification system to ensure the code actually being executed is the code displayed on their github.

* Automatic updates. Maybe this isn't the case now, but at one point I remember extensions were updating themselves automatically. Developers of popular extensions are frequently contacted to add an "analytics" dependency from some shady company in return for a nice payout. Users don't have the ability to opt out of these kinds of updates when some of their extension devs inevitably cave in to the pressure. In my mind, these kinds of updates should only be shipped with the user's full consent and understanding that they are being asked to install updates which have no practical utility to them.

* Code obfuscation. I don't see why every extension shouldn't be shipped fully unobfuscated, at least as an option. Perhaps minified bundles could also be shipped as a way of supporting users on low-spec devices which really need those extensions, but then again, if disk space is so short for those users maybe they shouldn't be installing extensions in the first place

* Better observability of interactions between user-requested websites and extension background pages. If I look in my network panel I want to see communication with installed extensions happening


> Users have no idea what code they're installing. Extensions aren't required to be open source, where the community can audit them for malicious behaviour. Even if an extension claimed to be open source, there's no verification system to ensure the code actually being executed is the code displayed on their github.

I love this idea fwiw. Browser vendors should totally do this. Requiring OSS seems like a fair tradeoff for the power that extensions wield.

> Automatic updates.

In my day, extensions could auto-update but not change permissions. Also all updates had to be reviewed. The bribery issue is serious and sad, but killing updates seems like the wrong tradeoff. Ability to fix and deploy bugs is important for extensions just like any other software.

> Code obfuscation.

Obfuscated code is already disallowed: https://blog.chromium.org/2018/10/trustworthy-chrome-extensi...

> Better observability of interactions

It is already possible to know which extensions are active on a page: https://i.imgur.com/73lmozH.png. This is a better level of observability than what you describe because once an extension has access to a page, there is no way to prevent it from exfiltrating data from that page. The communication with background page is not relevant. There are myriad ways to exfiltrate just given access to normal DOM, unfortunately.


Chrome needs a "view source" equivalent for extensions.

It should be just as easy to inspect extensions as it is web pages, including all the network requests they have made.


"view source" doesn't really even work for regular pages now, since the DOM is often very different from what is in the HTML page the user receives


Ideally before downloading the extension. However, with automatic updates, this is nearly useless. Malicious code can be injected at any time in the future, long after the user audits the code.


They pretty much do... In the web inspector, you can filter for network requests by an extension.


> Requiring OSS seems like a fair tradeoff for the power that extensions wield

To be precise, I wasn't advocating requiring OSS, just source-availability. Extensions should have at a minimum a visible repository displaying the source code, where users and auditors can publicly comment or leave issues (ideally, which the maintainers can't remove). Chrome should verify that the extension code matches the code in the repo (without an additional build/compile step - the repo should reflect the exact code being shipped in the extension). This way prospective users can inspect the code of an extension without installing the extension first and mucking about in dev tools.

> Ability to fix and deploy bugs is important for extensions just like any other software.

Ability to fix bugs, sure. Ability to deploy bugs (without user consent or knowledge) is an antipattern ;P Third-party browser code should never be shipped to the user without their consent and full transparency into what is being shipped. Users who don't intend to inspect the updates can opt-in to automatic updates, sure.

> Obfuscated code is already disallowed

This is great! Though they distinguish between minification and obfuscation, which is bizarre. Intentionally obfuscated code is hard to detect. I was suggesting they also disallow any form of minification (or at least require extensions to distribute with minified and unminified options, and have minified bundles verified)

> It is already possible to know which extensions are active on a page: https://i.imgur.com/73lmozH.png. This is a better level of observability than what you describe because once an extension has access to a page, there is no way to prevent it from exfiltrating data from that page. The communication with background page is not relevant. There are myriad ways to exfiltrate just given access to normal DOM, unfortunately.

This is true, but even background scripts which don't have access to the DOM can communicate with the content script through sendMessage.

A user looking at the network panel on the page might not see a network request sending their password to a strange server if the message is being sent to the background script and the request is being made from there. All communication in and out of the sandboxed extension environments should be logged and inspectable from any page where the extension contexts are being communicated with.
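The scenario above can be simulated outside the browser. This is a minimal sketch in Node with `chrome.runtime.sendMessage`/`onMessage` replaced by a stub (the real API exists only inside Chrome); it illustrates why the exfiltrating request never shows up in the page's network panel: the content script only passes a message, and any fetch would happen over in the background context.

```javascript
// Minimal stub of the messaging API, for illustration only -- not the real chrome.* API.
const listeners = [];
const chrome = {
  runtime: {
    sendMessage: (msg) => listeners.forEach((fn) => fn(msg)),
    onMessage: { addListener: (fn) => listeners.push(fn) },
  },
};

// background.js: receives the message; a real malicious extension would fetch()
// from here, outside the page's devtools network panel.
chrome.runtime.onMessage.addListener((msg) => {
  console.log(`background would POST ${JSON.stringify(msg)} to a remote server`);
});

// content.js: runs in the page context, reads the DOM, and forwards what it found.
chrome.runtime.sendMessage({ password: "hunter2" });
```

The page itself only ever sees a `sendMessage` call, which is why logging all traffic in and out of the extension contexts would add real observability.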


> Chrome should verify that the extension code matches the code in the repo (without an additional build/compile step - the repo should reflect the exact code being shipped in the extension).

To ensure that the code matches, Chrome servers could download the source code and build the extension themselves. This is what F-Droid does. For each version of the extension, they could also archive the source code they used to build it. Even if the repository gets rewritten or taken down later, the archive remains.


Funny story, the original version of the web store had a button you could press to see the source of an extension.

> Ability to deploy bugs is an antipattern

Disagree then. One doesn't come without the other.

> even background scripts which don't have access to the DOM can communicate with the content script through sendMessage

If by "content script" you mean the web page script, then this is true, but in that case the web page is collaborating with the extension. Whatever the web page is sending to the extension it could just as easily send anywhere on the internet.



> If by "content script" you mean the web page script, then this is true, but in that case the web page is collaborating with the extension. Whatever the web page is sending to the extension it could just as easily send anywhere on the internet.

Content script would be part of the extension that runs in the context of the webpage, and has access to the DOM

Yes, I agree this could be sent anywhere, but when sent to the background script with the sendMessage API, you don't get visibility into that with devtools out of the box


OP could have worded the criticism as an attack on the organization and not aimed it at the individuals who make up the organization...

But anyhow, you work on the system and can't point out if this is an issue or not in the end-to-end sense? Like, why say OP is "missing the point" if you are not sure if they are missing the point?

TBF you do say "I'm not even certain that an extension that requests more than 5 permissions would be approved in the first place." Which if true, would make this pretty low priority.

(On the Empathy note, I just want to add: stating "Trust me you really don't want this job." In this context to random members of a forum is a bit of an empathy-lacking thing to do)


I have not worked on the system for something like 8 years, so I cannot say what the current situation is.


I’m really curious what the arguments were against showing “…and 58 other permissions - see full list here”. Seems like an easy fix.

As for the review process, I don’t think it’s good enough. I don’t use Chrome as a daily driver, but I’m often seeing out-of-store extensions. Is there an external process that would capture and mark/flag such 3rd-party extensions?


Chrome makes it pretty difficult to sideload extensions, so I'm not sure what you're seeing.


Well, I’m usually browsing for development tools and such, and some of them are advertised as out-of-store downloads (which would be preferable as I prefer Chromium). You did prompt me to check how hard it is though, so allow me to get back to you in a few ;)

Edit:

I didn’t go through the manual process, but based on this (rather recent) Adblock guide [1] it’s not very hard.

Of course one could argue that “developer mode” is deterrent enough, but - for contrast - in order to get a state-mandated e-signature to work I had to install some sketchy 3rd-party system extensions on macOS. On Windows it almost felt like malware (as some of the security settings had to be disabled in order for the Java installers to run).

My point is - sure, Chrome might have a secure golden path, but people are lured into skipping a lot of it for freebies/weird requirements, and thus the permission dialog should be more informative.

[1]: https://helpcenter.getadblock.com/hc/en-us/articles/97385388...


I guess it depends on what you mean by "difficult".

Chrome is unfortunately limited here by the security of the OS. No popular desktop OSes have application isolation: all apps have the same permissions. Any app can write to any other apps' storage.

This means that if Chrome makes sideloading too difficult, developers will just tell users to run their native code which will hack into Chrome, making even understanding what extensions users have or uninstalling them impossible. Sideloading on desktop OSes has to be hard enough to discourage most users but easy enough that developers like adblock don't start looking for an even bigger hammer.

This is what I meant by delicate balance of incentives.

More info here: https://news.ycombinator.com/item?id=4954915


I absolutely understand and I agree with you. I suppose you have much more data and probably research. But on the other hand, I believe Google also has a practically infinite number of developers.

Thus, it should be easy to classify and make that specific dialog more visible - especially when side-loading.

I think it definitely is a hard job to reach consensus on all fronts.


macOS clearly has file system isolation though. Sandboxed apps can’t write/read anything outside without an explicit permission.


For some programs sandboxing can work perfectly fine, but for others it can result in anything between noticeable UX degradation and a major PITA, because there are enough workflows out there that can't really be sensibly made to work with an "every file access must go through an official 'File open'/'File save' dialogue (or something comparable, like drag-and-drop, or launching a certain program with a certain file)" model.


Sure but it's still possible to install non-sandboxed software. What Chrome would need is protection from _other_ software, not limits on itself.

It's true that as MacOS continues to discourage non-sandboxed software, Chrome can make sideloading more cumbersome to match.


Your response explains SO much about why this is a dumpster fire. Rather than a professional "these are some valid concerns, we may have made some serious mistakes and this deserves a credible response", here's how your defensive commentary was received by at least one person:

"Public criticism (however valid) of something I worked on hurts my feelings. Just so you know, we already thought through ALL these possible "problems" (and more) and we chose the "one true way". So your criticisms are just, waves hands, whatever.

Plus, these "issues" could never happen anyway (just try it lol!), so your criticisms are basically imaginary. Also and anyway this is YOUR fault (not mine, no!) because YOU wanted extensions. Also, also: the other guys were worse. Your fault, shame on you, other guys worse, I'm a good person.

And finally, this is hard work, no-one ever said thank you to me."


"I also encourage readers to remember that generally speaking, you all _want_ extensions."

Exception here. Never asked for them. Never wanted them. Do not and cannot use them.^1 Seems moot anyway as I doubt anyone ever asked for Chrome itself.

When an extension collects data under the radar it's "malware" but when Google does this, it isn't malware. Funny.

There's really no greater "danger" to www users than Google because it's nearly impossible to stop Google from doing what it wants. The company is embedded in every aspect of using the web. Malware authors might be a threat but they hold no such omnipotent control. Google is the largest threat.

1. Occasional Chrome user via Guest mode that does not permit Extensions.


If you’re not using Chrome, then obviously you’re not the target demographic for Chrome extensions.


How do you feel that this contributes to the discussion?


> An egregious and nearly unbelievable oversight on Google's part. :-\

I agree it's egregious, but it's quite easy to believe.

It's surely just using a standard modal and passing a string. The thing is, this is on a Mac, which has scroll bars that are invisible until you scroll. It's easy to imagine testing was done on other OSes where the scroll bars are obvious and the bottom line might be only partially hidden, which makes it even clearer. And/or that testers never caught it on a Mac because they themselves never realized there were more.

I would hope somebody sees this now and prioritizes a Chromium bug for it. Because on a Mac at least, this is pretty serious.

(And I'm well aware this is a good example of a negative side effect of Apple's choice to make scroll bars visible by default only while scrolling.)


> I would hope somebody sees this now and prioritizes a Chromium bug for it.

Rather, one would hope that Apple sees it and realizes that their short-sighted, bone-headed, pea-brained idea to eliminate scroll bars should be rolled back. Of course, I'm not holding my breath. Yet another example of their crusade to prioritize form over function, exemplifying why I find their products infuriating to deal with.


> chrome extensions have horrible flaws and browsers are fundamentally broken

> apple sucks and is the reason for all evil because scroll bars

we don't have scroll bars on mobile as in days of yore. maybe browsers need to finish playing catch-up to the threat and interaction models. having a VM on your machine with access to everything you do, without a sandbox, is pretty bad


Hiding scrollbars has been obviously bad UX since day one and this has only become more obvious over time. It doesn't matter whether you're doing it in a web browser or not, whataboutism isn't an appropriate response


the fact that the browser extension can have access to shared memory is the root problem though x)


> Rather

The air travel industry uses the Swiss cheese security model.

EVERYONE does what they can to improve security. The equivalent would be both Google and Apple making improvements.

Both of the problems might be leveraged in a future attack. Fix along the whole chain of events, not just one link in it - defense in depth.


No surprise that the person overreacting about an Apple design decision ends their comment with “yeah and just don’t like them anyway!”

Critiques of Apple from people that have this sort of wide-ranging vitriolic hate for Apple are a dime a dozen, and don’t make for any sort of interesting or enjoyable conversation.


Having less visual clutter -is- function to me. I really don’t miss permanent scroll bars and hope they don’t bring them back.

Most Macs ship with a trackpad, which means I can’t remember when I last deliberately gripped a scroll bar. They are just a waste of space most of the time, even as an affordance/reminder that scrolling is possible.


I hardly use the scrollbar either (even when using a mouse), but the scrollbar is an important visual cue for what portion of a scrollable page is currently being shown, no matter what the input method is. Apple could just have turned the interactive scrollbar into a much slimmer non-interactive hint and all would be fine, but no, they had to go "form over function" again :/


I see your point, but I still think this is a matter of preference and priorities.

I stand by the original argument that for most people, a minor twitch of their fingers on the trackpad reveals this information if they want it. I very rarely do want this though, and on average I prefer that it's not shown by default, or until I move.

This discussion was prompted by a UI fail in presenting relevant security information. Relying on permanent scroll bars would still be a UX fail, even if it were the default on Macs.


Wouldn't it be cool if apple had some kind of menu called like "display settings" or something, and in that menu there was like a checkbox labeled "use invisible scrollbars", and when the checkbox was checked, scrollbars would be invisible, then when it was unchecked, and I know this'll sound crazy, the scollbars would be visible, and people could just make it look the way they like?


Exactly. So much is broken because of misunderstood UX design.

Today I was again reminded that many years ago some UX designer thought it would be a great idea to remove back/forward buttons from the context menu in Firefox if I accidentally select some text on a page I visit.

No one was asked and when someone filed a bug it was ignored because ux designers had already decided.

Result:

- a few times every month back/forward buttons are missing

- the look and feel of the context menu changes for no good reason


If I understand you correctly, then I asked for this. Text is often selected by accident, so clicking the top item in the menu would trigger Back instead of Open in new tab, as Back sits in the same place in the context menu. The opposite of what I want, and infuriating to use.

https://superuser.com/questions/1074338/disable-back-in-chro...


i'm reminded of this every day. I hate it so much.


It was reported a decade ago.

But we are only "users", even if each of us converted 10 or more IE6 users, bothered IT departments to allow Firefox, pushed web sites to write for web standards and not IE, etc.

When they ask for money, we are "valued community members". When we have a question we are just annoying "users" it seems.

And the worst part: if you donate to Mozilla it doesn't go to Firefox. It goes to some other project.

Because Firefox is a profit center for Mozilla and they are milking it dry year after year, and our donations come on top of that.

Still I use Firefox. It is still better for my purposes (large hierarchies of related pages that live from hours to weeks).

And it is not like using Google Chrome would improve the situation.

But lately (maybe the last twelve months?) I have started to use LibreWolf too. I use it as my research browser while using Firefox for all logged in work. It feels good, like using Firefox back in the days.

And if I can work against both Mozilla and Google simultaneously, maybe I should cut Firefox completely :-]


Can’t even tell if this is satire, but yes, there happens to be such a setting, System Preferences -> Appearance -> Show scroll bars, with options “automatically based on mouse or trackpad”, “when scrolling”, and “always”. It’s been there since forever. Of course that’s not going to assuage the wrath of non-Apple users like kibwen. Don’t use it if you don’t like it, ffs. As for me, I’m happy to do without bars everywhere.


> nearly unbelievable

I don't get why people say something like this, especially on HN where lots of people are SDEs themselves.

Every single feature is hand-crafted by one or a few real people, and is usually reviewed by only a handful of people. It only makes sense that it sometimes contains a serious oversight.


It's true. However, the orgs in question have billions of dollars. They can afford more than a handful of people to address these things.

Eight person startup? Sure. But not these folk who "only hire the best"


But the more people you hire, the more bugs there are to address!


What you are proposing is adding bureaucracy and red tape. The government has billions of dollars and can afford many people doing various checks and compliance work; do you want to work for the government?


> The thing is, this is on a Mac that has scroll bars that are invisible until you scroll.

I try to live with the defaults as it makes switching machines so much less painful; however, turning scroll bars on is an absolute requirement for sane usage. At least you don’t have to do it in Terminal anymore.

Edit: I can’t find any evidence that the preference could only be toggled via Terminal in some macOS versions. I thought that around version 10.6 this was the case, but maybe I’m wrong?


Fortunately, you can override Apple's choice:

Settings > Appearance > Scroll bar behavior > Show scroll bars > Always

If I were in charge of fixing this bug for Chromium, I might start by prioritizing which permissions are the most nefarious and listing them first, perhaps in red. Then ensure the entire dialog expands vertically to fit as much content as possible.
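As a hedged sketch of that prioritization idea: the severity ranking and permission strings below are invented for illustration, not Chrome's actual warning texts or ordering.

```javascript
// Hypothetical severity ranking: lower number = more dangerous, shown first.
const SEVERITY = new Map([
  ["Read and change all your data on all websites", 0],
  ["Read your browsing history", 1],
  ["Manage your downloads", 2],
  ["Display notifications", 3],
]);

// Sort warnings most-dangerous-first; unknown strings sort after all known ones.
function orderWarnings(perms) {
  return [...perms].sort(
    (a, b) => (SEVERITY.get(a) ?? SEVERITY.size) - (SEVERITY.get(b) ?? SEVERITY.size)
  );
}

console.log(orderWarnings([
  "Display notifications",
  "Read and change all your data on all websites",
]));
```

Even without fixing the scrolling, ordering like this would ensure the most dangerous warnings are the ones that fit above the fold.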


I wondered about this. I would sincerely hope that the permissions list is already priority-sorted somehow; otherwise, it’d be trivial to just list five innocuous permissions first and then put the dangerous ones later. Even with scrollbars, most users would just see the first few and likely not bother to scroll down at all. Note that in the author’s post, one of the dangerous permissions (read and modify data for all websites) is already listed in the box.


Have users view and approve each permission individually. First permission is listed, buttons for "deny" and "approve", user clicks "approve", next permission is listed, etc. Clicking stuff is a lot of work for users and some of those permissions sound pretty intimidating, so I suspect either extensions would start getting by using fewer permissions or users would start getting by using fewer extensions.


Either of those outcomes would be a win for security and privacy.

Users _should_ understand what an extension having access to this stuff implies, and if that means they only use "official" extensions or stop using them that's fine.

Also "clicking stuff is a lot of work for users" made me laugh a little. How many damn cookie consent buttons do users have to click every day? An extension should be orders of magnitude more "work" to allow than that.


> on a Mac that has scroll bars that are invisible until you scroll

this has got to be one of the worst UX decisions of the past 10 years, come at me. You can just hear the meeting discussion:

A: make the scrollbars invisible until you scroll, that nets us 4% more width!

B: but it reduces discoverability by 50%, and it's also an attack vector...

C: ship it!


I dislike the invisible scroll bars, but I don’t think they’re about width, I think they’re about visual clutter. Apple likes simplicity (yes yes you can find examples where they failed) and this seems to be in keeping with simple design.

Still a bad idea though.


I love this feature, scroll bars are mostly useless, the position indicator should be decoupled and doesn't need to have extra width (can be a mark on the window border), and the security issues are solved by a summary at the top of the list, not a generic scrollbar


> The thing is, this is on a Mac that has scroll bars that are invisible until you scroll.

Undoubtedly the stupidest UI decision made in the last 20 years.


>this is on a Mac that has scroll bars that are invisible until you scroll.

What modern UI even has visible scroll bars by default?

And assuming it's even visible (either by default or user-configured after the fact if that's even possible), what modern UI even has scroll bars wider than 1px?


> What modern UI even has visible scroll bars by default?

Pretty much all of them except Mac.

Can speak personally for KDE, Sway, GNOME, and Windows 10.

And when they are invisible, they usually show up on mouse motion/window interaction, so it's still not as egregious.


Xfce4.18 (GTK) has them. Has for 15+ years.

Shout out to Xfce for being so boring/consistent for more than a decade.


I'm so grateful for Xfce, it's been my desktop for maybe 15 years. Such a godsend in a world where every other UI gets more bloated and less usable over time.


> Pretty much all of them except Mac.

iOS is way more prevalent and has hidden scroll bars unless scrolling. Thanks Apple.


I'm using Firefox 110, Gnome, Wayland, Pop!_OS 22.04. I only have a scrollbar when I scroll or mouse-over the scroll bar.



> if this kind of issue can fly undetected for so long, what can organizations with drastically less resources than $GOOG do to ensure adequate velocity while not leaving the proverbial barn doors open?

If $GOOG can't do it with practically infinite resources then I'm of the opinion that nobody can. Computing is broken.


It's not a question of having enough resources. $GOOG can't do it because anyone who isn't building a promo packet by launching new services is on the way out.


I don't think Google's incentives are properly aligned with users for them to prioritize creation of a user-first extension package repository.

APIs which would make it easy for extensions to exfiltrate user information might be the same APIs they would use to do the same. APIs that allow the user to enhance their web experience (think uBlock origin) are being kneecapped by manifest v3


I agree - nobody can.

You can't make a computing environment allowing anyone to write and distribute code that has the ability to do all the things users want it to do, without having a good chunk of bad actors trying to trick/persuade users into granting malicious code enough permissions to hurt the user.

You have to either severely limit what code can do (the iOS model), or allow users to shoot themselves in the foot (the Windows "download a random .exe and run it" model). Chrome falls in between.


Are we certain that they are motivated to do so? Google doesn't strike me as a company committed to user privacy.


There is no company on earth with a larger privacy organization than Google.

Edit: @dogecoinbase: What you propose is a shallow and dismissive analysis of the oversimplified variety. Please put in a little more effort before derailing an otherwise thoughtful conversation.


@metadat -- can you provide the source for that?

I'm genuinely curious how much Google spends on their privacy org, and esp. how that compares with the other big tech companies.


Privacy is bundled under Trust & Security. I don't have precise numbers or estimates, but basically every Google product area has a team of Technical Privacy Engineers, TPMs, TPgMs, and VPs. They are the arbiters who can block a production release if privacy and security issues are discovered and not remedied.

No other company I can think of has invested in such a rigorous Privacy review and support structure in an attempt to reduce risk. Other BigCorps do take it seriously, but get by with much less investment. Despite this, FB et al. are keen to poach Google Privacy employees, because there are very few Privacy Engineers in existence and Big-G pioneered modern-day corporate privacy efforts. Google is a huge target for hacking, public criticism, and even loss of human life stemming from product decisions, especially because the products are so ubiquitous and widespread (browser, mobile OS, search engine, Gmail, ad network, etc.). See subjects such as Differential Privacy.

Source: I have a few friends who've worked in the G-Privacy org.


Differential privacy didn’t come out of Google.

> They are the arbiters who can block a production release if privacy and security issues are discovered and not remedied.

Every company I’ve worked at outside of tiny startups has had this level of gating by the security team.

> No other company I can think of has invested in such a rigorous Privacy review and support structure in an attempt to reduce risk.

Which companies have you worked at in the last 8 or so years? It sounds like you’ve just watched the industry mature a bit in PII from the perspective of the inside of Google.


> No other company I can think of has invested in such a rigorous Privacy review and support structure in an attempt to reduce risk.

In recent court cases Google employees admitted they have no idea where user data is stored (specifically location data), which systems have access to it, and how to fully turn tracking off.

80-90% of Google's revenue comes from online ads. There's a huge conflict of interest between Google's business model and whatever "arbiters" pretend they want to block.

And of course the number of privacy things that Google pioneered is minuscule to non-existent. Google has been dragged into caring about privacy against its will, kicking and screaming, by government actions like GDPR and CCPA.

Facebook poaches Google's privacy people because Facebook is the only one of the mega corps that is worse than Google, and it wants to continue its practices as much as Google does.


> In recent court cases Google employees admitted they have no idea where user data is stored (specifically location data), which systems have access to it, and how to fully turn tracking off.

Really? Do you happen to have a source for that?


https://www.theverge.com/2020/8/26/21403202/google-engineers...

Edit: a better article https://www.businessinsider.com/unredacted-google-lawsuit-do...

Short quote:

--- start quote ---

Jack Menzel, a former vice president overseeing Google Maps, admitted during a deposition that the only way Google wouldn't be able to figure out a user's home and work locations is if that person intentionally threw Google off the trail by setting their home and work addresses as some other random locations.

Jen Chai, a Google senior product manager in charge of location services, didn't know how the company's complex web of privacy settings interacted with each other, according to the documents.

--- end quote ---


Thanks. But neither of those sources matches your initial description.

The first is not anyone "admitting" anything in a "court case". Nor does it discuss "where user data is stored" or "what systems have access to it". It is quotes from an email discussion on some article, about the behavior of a UI toggle, with no indication that these are people working on that system who would be expected to know where data is stored but don't.

In the second link you've at least got a deposition, but how is either of those paraphrases relevant to your claim about "not knowing where user data is stored" or "what systems have access to it"?


Unfortunately I cannot find the exact article describing this now.

There were a few over the years. If I can find one, I will update you with the link.

Meanwhile, Google has settled for $85 million in this case: https://eu.usatoday.com/story/money/2022/10/05/google-arizon... A whopping 0.6% of their profit for 2022


The lesson, then, appears to be that size of privacy organization does not have a correlated effect on effectiveness.


Yes, but what is the focus of Google's privacy organisation? Is it protecting user privacy, or is it designing and implementing ways to circumvent user privacy controls and regulations? From the evidence, I'd say the latter.


Extensions in general are a massive security issue and it’s impossible to fix without crippling them beyond uselessness. Or maybe having approved extensions only.


Sounds like the Apple app store.


Apps are a different thing, since even if you sideload apps, one app can't insert itself into another app to scrape your data, while extensions are almost like kernel extensions that can do just about anything.


The naivety of innocence is rather blissful, isn't it?


Care to provide an example of an iOS app which can read the data from other apps when it shouldn't?


1. Android and iOS sandbox applications. But if I grant permission, a mobile app can read files from my photos, documents, or (on Android) SD card folders. I can even ship a mobile Safari extension on iOS.

2. Desktop platforms do not universally sandbox applications (though they are trying). You can install a desktop app that steals all the data in your home directory, including your entire browsing history, with no permission dialog whatsoever.

3. That aside, browsers sandbox extensions just like mobile applications. One extension cannot access another extension's data.

4. Furthermore, by default, a browser extension can only access content from its own origin. It is in fact sandboxed from the rest of the sites you visit.

5. If the user grants permission, a browser extension may access other sites.

So in short, browser extensions are in fact sandboxed.

And your idea of mobile apps accessing data is entirely dependent on the qualifier "when they shouldn't"; arguably, if given permission, they should be able to, so it's a moot point.
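To make points 4 and 5 concrete: an extension's reach is declared in its manifest. A minimal Manifest V3 sketch (the name and host pattern are placeholders); anything outside `host_permissions` is off-limits unless the extension asks for a broad pattern like `<all_urls>`, which is what triggers the "read and change all your data on websites" warning:

```json
{
  "manifest_version": 3,
  "name": "Example extension",
  "version": "1.0",
  "permissions": ["storage"],
  "host_permissions": ["https://example.com/*"]
}
```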


The shared data on iOS and Android isn't all that important. Sure, you might not want a random app to read your photos, but it's not getting access to your bank session token. And these days you can grant apps access to only specific photos.

The vast majority of extensions require the ability to read and modify the DOM on any website to do anything. This is so much worse than the average app's permissions.


An extension can read another origin’s secure cookies? That’s news to me.


I don't think it goes that far but a while ago the Twitter app was somehow polling for registered request handlers and discovering what other apps were installed. Not their data, but their presence, to profile their own users.


But the idea of the app isolation is just that — apps should not be able to touch each other.

There are of course escape hatches, because sometimes you want apps to interact.


Extensions cannot touch each other. They also can't touch other websites without an escape hatch, because sometimes you want your password manager to modify the DOM and fill in your password for you.


extensions have to touch other websites to be useful


Not all of them. Momentum is one of my favorites and it just gives me a new start page.


This introduces an especially silly attack vector: if you expect that asking for a specific permission might alarm users, and if you can push it below the fold, just ask for more innocuous or plausible permissions than you need!

Besides the oversight of hiding some permission requests, this highlights that the order they’re presented matters too. Even if it weren’t scrollable with ~invisible indication of that, people stop reading at some point. If N lines (I’m gonna guess ~5 for most people) seem totally innocuous, the rest are probably effectively invisible.


In general, for the most part, it looks like the permissions shown higher in the list are the more dangerous ones. In particular, "read and change all your data on websites" is by far the most dangerous permission, and it always appears first or second. ("Access the page debugger backend", which is above it, sounds technical and opaque to most users, but the permission that it gates also triggers the "read and change all your data" warning.) I vaguely suspect that this is by design.


There are several below the fold that I’d consider more dangerous than “show notifications”, but I suppose that one is enough of a nuisance to most people that it gets sorted up. Either way, having this kind of design is an opportunity for it to be gamed. Probably a better design would be more annoying over time (prompting for permissions as needed), but I’m the oddball who’d prefer that annoyance over this kind of approve-and-forget scheme.


Every ad blocker asks for "read and change all your data on websites" permission, same as page scrapers/readability tools - so depending on the disguise I think it won't be hard to get this permission.


> just ask for more innocuous or plausible permissions than you need!

Looking at the list I think they're sorting by ~alarmingness, so it seems hard to push an alarming one below the fold with innocuous ones?


> The chain of humans who've been responsible for developing and testing Chrome Extension functionality and security has been asleep at the wheel this whole time, for something like 15 years.

There should be liability for negligence.

If you were told about a security hole and, months later, you have neither fixed it nor informed your users, you should pay statutory damages.

And if you lied about your app (claiming encryption where there is none), you should be liable too, even if the app is free.


Well, you can get a full refund.

I see where you are coming from, but increasing liability for free software does not feel like a good idea to me at all. There's basically no way you could extract money from Google past its army of lawyers, but any small open-source project or even medium-sized company will be extremely wary of releasing anything.

I'm not saying you should never go there (GDPR does, and it's a net improvement), but it's extremely easy to massively overshoot.


It may be free software but it's also a commercial enterprise.

Furthermore, Chromium is free software, Chrome isn't. Perhaps Google should be liable for distributing Chrome in such a sorry state, but not for what lands in Chromium.


I don't really disagree, but to play devil's advocate a little, if I was giving away free knives and someone cut themselves, would I be to blame in any way?


If they were known to be unsafe, such as heavily rusted and the handle was loose, then yes, you could be liable.


If software companies sold knives, this is how it would work:

You market them as iButter knives. They are actually carving knives. Half of the users are 14 years old.

You give the knives away for free, but the knives steal 1% of any food they cut.

Sometimes the butter knife needs a software update in the middle of cooking.

If you sharpen the knives, you void the warranty. The company says you should buy a new one regularly.

They come with a bug: when an iButter knife is used on cheese, it spontaneously transforms into a chainsaw.

The company says it's not a real problem because using the iButter knife on anything else is against the EULA.


The knives would knowingly be made of lead, but neither owners nor buyers would be informed of that by the company, which is itself heavily invested in lead.


Of course, by design the knives would also need to take a bit of the user's blood.


Are you marketing it as a safer alternative? Do you have a near monopoly on knives?


I don't recall Chrome being marketed as a safer alternative. I suppose the original sandboxed-tab design is considered safer, but not against the way this Chrome extension is dangerous.

And why does it being a monopoly matter in this context? Not to mention that it isn't a monopoly.


I said "near monopoly". Companies with an outsized share of the market may be subject to extra scrutiny.


> Well, you can get a full refund

Suppose the developers of smart locks make an error, and all smart locks unlock on Fridays. When users ask the company, it lies and claims its locks are flawless and users are to blame.

Millions of houses are robbed, people lose their life possessions, and home robbers kill some grandma.

Should the Grandma's family get just a $100 refund (price of the lock)?

> There's basically no way you could extract money protected by Googles lawyers army

That's defeatist. If that's true, then this whole discussion is pointless.

If there are 'nobles' that don't answer to justice, we live in feudalism. Freedom and capitalism are dead.


Please don't make up alarmist analogies to try to support your point. You introduce unnecessary points of confusion over 1) whether your scenario even fits and 2) whether the outcomes even make sense.

Stick to the actual situation wherever possible.


Alarmist? In Britain we have sent ~800 innocent people to prison because a programming error said they stole money. The software development company testified in court that their software was great, despite many inconsistencies being pointed out.

https://en.m.wikipedia.org/wiki/British_Post_Office_scandal

I think it should be obvious that a poorly designed product can do much more damage than it costs, both through stolen data and by causing legal action

As software intrudes into physical world, the potential for damage will keep growing.


You should read any one of Mark Russinovich's amusing novels on viruses destroying the world. You might be convinced we should stop using any software.


Then use this real example, not hypothetical dead grandmothers.


[flagged]


Ah yes, how could I forget: 'companies shall not answer for their actions or crimes', Adam Smith, Wealth of Nations, Page 6.

Capitalism means a system of justice, democracy, and markets.

'The guy with the most money wins' is not capitalism; it's serf mentality.


Even if they were to somehow show everything up-front, it would be user-hostile design. No one wants to read and consider every possible permission prior to granting it to the thing they want to use.

Even if there were 20 items in the list, people would hit "OK"; it's like accepting terms and conditions. Most people won't read them; they're trained to hit "Agree".

The whole model needs an overhaul. Extensions should be required to ask whenever they need the specific resource, rather than asking up-front. And each gated resource should get its own prompt with its own informative design. Similar to permissions on mobile
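Chrome already has a building block for this: permissions declared under `optional_permissions` in the manifest can be requested at the moment of use with `chrome.permissions.request`. A hedged sketch; the stub object only exists so the code runs outside a browser, and in a real extension `chrome` is supplied by the platform:

```javascript
// On-demand ("just in time") permission prompting with the chrome.permissions
// API. The stub below makes this sketch self-contained; a real extension would
// also list "history" under "optional_permissions" in its manifest.
const chrome = globalThis.chrome ?? {
  permissions: {
    request: async () => true, // stub: pretend the user clicked "Allow"
  },
};

async function searchHistoryIfAllowed() {
  // Ask only when the feature is actually used, not up front at install time.
  const granted = await chrome.permissions.request({ permissions: ["history"] });
  if (!granted) {
    return null; // user said no; the rest of the extension keeps working
  }
  // ...call chrome.history.search(...) here in a real extension...
  return "granted";
}
```

The gap is that extensions aren't required to use this pattern; most just demand everything at install time.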


That would make the password manager plugin I use extremely annoying to use. I want that plugin to read every page I visit so it can detect username and password fields. The same goes for many other extensions, like ad blockers. They should block all ads they can find, without constantly needing to annoy users with permission prompts.


> An egregious and nearly unbelievable oversight on Google's part.

This describes Google's entire approach to browser extensions. It's so cosmically and hilariously bad, and because extensions grant post-decryption access, renders basically every other security and privacy effort they've ever done with web standards completely pointless.

All that garbage about encrypting everything in transit is entirely irrelevant when the endpoint is a masterpiece of bad security design.


But isn't the scrolling problematic here only because macOS by default hides the scrollbars when there is no traditional mouse connected?


I experienced this problem once on Windows, when I installed a free VPN and noticed that the list of permissions lacked some obvious points.


> The author deserves the highest tier of bug bounty reward for bringing this to light. What's that? It wasn't submitted through the proper channels to be eligible? Right.

Almost the entire point of bug bounty programs is to encourage researchers to submit vulnerabilities using a proper channel and adhering to a proper procedure.


And what’s even worse, in that screenshot there’s no visible scrollbar to show there is even any additional data below the fold, or any indication that it’s a scrollable control in the first place. I can’t tell looking at it that there’s any possibility of further permissions anywhere.


But it's not an issue of the quantity of resources; it's about their quality. And it's a bit puzzling that you find these pervasive issues hard to imagine.


There is always a balance or tradeoff between convenience and security. If you maximize security, you will have a piece of software nobody wants to use unless there is a gun pointed at their head. Not that security is unimportant, but please also keep user experience in mind when you design security.


What's that saying? Ah, found it:

"Security at the expense of usability comes at the expense of security."

If you maximize security, so that there is poor convenience/usability, people will find usability workarounds which compromise security.


> nearly unbelievable oversight on Google's part

You're giving Google way too much credit here. Do you not think they use some of these techniques here? Or that they're an endpoint for other people's usage of these techniques?

> security has been asleep at the wheel this whole time, for something like 15 years

All is as intended.


A completely foreseeable negative externality of the ubiquitous GDPR cookie permission popup is training users to just click on any old shit.


GDPR does not mandate the popup (which in most cases is even illegal under GDPR).


Wait until you see what’s possible with executables!

I like this project, but I also worry that eventually we’re going to lose access to extensions entirely because people will take away the wrong message.

Safeguards are good, but at a certain point I want my devices to trust that I know what I’m doing.


> Wait until you see what’s possible with executables!

The most important thing is what you tell the user:

Windows says "We don't know where Trojan.exe came from, it could be a virus, are you sure you want to run it?"

Chrome says: "You downloaded Trojan.exe from our store, we manage it and check it for viruses. It only asks for harmless permission, install it!"

One is warning you, the other is entrapment.


Windows has an app store now too.

> It only asks for harmless permission, install it!

Not true. It lists all the permissions being requested. Sure the scrollbar issue is real and should be an easy fix.

I don't understand why people are so confused about permissions. If the user grants your extension permission to read your browsing history so it can provide value to them, why is that a problem? It's not. The problem is if the user grants a malicious extension the same permission because the extension is fraudulent. The author said this extension would never pass chrome store review, so it seems that the user would never be in this position in the first place and your example doesn't really match reality.


>I don't understand why people are so confused about permissions.

I don't understand why people (read: devs) still assume permissions are read and understood.

The vast majority of people simply do not read or understand permissions and just instantly hit the OK button. Even Linus from LinusTechTips doesn't read permissions, and he's a tech guru, unlike most people.


So hopefully the Chrome store review process is the nanny they need.

The only thing that has permission to read and modify website data on my browsers is my password manager. I trust it. Without that permission it could not operate and provide me immense value. Yes, I would be hosed if it got pwned. It's a calculated risk on my part.

One cannot live in a world where we have the benefits of browser extensions and also disallow all behavior that could ever possibly be used for bad. It's an inconsistent worldview.


> hopefully the Chrome store review process is the nanny they need.

This is very disrespectful.

When you supply users with exhaustive documentation of your software (how it works, what it does), then you can mock them.


What are you talking about? I'm not mocking anybody. We have app store reviews because it's a known fact that not all users read or care about permissions. My point is simply that those safeguards exist for those users and they seem to be working rather well.


A nanny is for kids. Saying someone needs a nanny can be pejorative.


People (usually) understand the words that pop up when an extension/app asks for permissions. But they don't necessarily understand the implications of allowing those permissions. And most have little grasp of how to judge an author's trustworthiness, other than the star rating on the store listing... which is driven by other users who mostly also don't have a clue.

It is not unusual to see stories of malicious applications or extensions with hundreds of thousands of happy users and good reviews.


> I don't understand why people are so confused about permissions.

Ask the not-tech-savvy.


This article is not an example of a Chrome extension being downloaded from the store. Nor is it a case of the extension appearing to only ask for harmless permissions ("read and change data on all websites" is above the fold).


I'm not sure how you somehow drew the conclusion that Windows' warning is better than Chrome's.

Windows says "Now I know I say this for literally every program you download, but I'll say it again. It might be harmful because all programs can do anything at all. Good luck!".

Chrome says: "This might be harmful and it can do these things: X, Y, Z."

Apart from the stupid scrollbar issue that's clearly better.

The only problem is that they've adopted the permission model we've known is broken for like a decade: ask for permissions up front, grant all or deny all.

Could they not have spoken to the Android team, who spent something like a decade fixing that mistake and moving to fine-grained, on-demand permission prompts?


> It only asks for harmless permission, install it!

I see it asking for two very scary permissions, two somewhat scary permissions, and one annoying one?


I think the problem is that you don't actually know what the permission is for. It isn't in relation to anything. It's a blanket permission. But almost everything asks for some kind of blanket permission when you install it. Your only option is to say yes. For mobile apps, things have got better and a lot of things ask for their blanket permission later on in the piece. In many cases, it's still not in relation to anything. When they are in relation to something, you're usually not granting permission to do the thing you want to do, you're granting permission to do anything like what you want to do.

The user isn't giving informed consent, because even if, like you, they are informed about what the worst case is for granting this permission, they have no reasonable knowledge about whether the worst case is likely, even knowing that they've been asked to give permission. The questions are simply too generic for informed consent to be possible. (Mobile apps are getting better here as well, but there's still a long way to go.)

So the permissions aren't actually about managing your risks; they're about managing Google's risks.


How would you like this to work?


The problem is that not a single person is capable of making informed decisions on extensions. One day they could be good and reputable; the next day they've been sold to a malware company.

Maybe the middle ground is all extensions have a legal entity in your country so you can sue them for spyware.


What?

The same can happen with any piece of software in the world. Why single out extensions?


We sandbox apps to prevent them from reading each others data. It's impossible to sandbox web extensions and have them retain basically any of the functionality needed.

As well as almost all regular software is backed by some large company with legal presence to hold responsible. The same can not be said for most extensions.


> We sandbox apps to prevent them from reading each others data.

This is also true of web extensions. I suspect you've never developed one. You can't read another extension's data. It's also not true on desktop platforms. The user is still the security domain in desktop computing.

> As well as almost all regular software is backed by some large company with legal presence to hold responsible. The same can not [sic] be said for most extensions.

Is this true? All the browser extensions I use are published by a real legal entity that can be sued if they are negligent. What corner of the web are you on?


There are at least hundreds of extensions published by legal corporate entities; the AdGuard extension, for example, is pretty surely from a legally registered company.


Not any piece of software: popular open-source software, which gets reviewed and rebuilt by many independent people (distro maintainers), has a much lower chance of being hijacked this way.


Oh yes it can. Project owners have sold out before, sometimes without telling anybody. The threat vector is the same: something you trust gets sold to someone else, and it abuses previously acquired trust. Open source doesn't actually fix this specific trust issue. And BTW, your browser extension's source is available to peruse locally on your machine after you install it. Surely you did that, right?


Being a popular open-source project is not a guarantee, it merely lowers the chances and complicates the attack.

Regarding extension code: indeed, I do read the code of extensions that require elevated permissions, if those extensions are not otherwise vetted. This is why I avoid installing excessively complicated extensions, unless they ask for only minor permissions. Having the list of tabs is no big deal; accessing data in your tabs, even for a particular site, triggers scrutiny.


> Being a popular open-source project is not a guarantee, it merely lowers the chances and complicates the attack.

It also makes it easier to deal with it after the fact. You can fork an open source project the minute it's detected that it's doing something it shouldn't. When closed source software goes bad you can't pick right up from the last known good version and move on, you have to find a product that entirely replaces what you had and hope that it does everything you need at least as well which isn't always likely since you were presumably using the other software because it was better than existing alternatives.


Sounds like we agree then that extensions are not any different from other apps in this respect and that you should always review the source code if you are installing software that needs to be given great power.


> next day they have been sold to a malware company.

Or pwned


That already happened with Firefox for Android. The old version allowed extensions, the current version only allows something like 10 specific ones.

There will always be a forked browser or an independent browser that supports/allows extensions as long as the web uses HTTP.


It takes two clicks to install an extension. There's no differentiating people who know or don't know what they're doing with two clicks.


Is this a roundabout way of saying, just don't use Google products and services? (non-rhetorical q)


Oh, I am quite certain most corporations and hackers did exercise this curiosity for all of 2022, at everyone's expense. Shout out to Google for the ongoing cover-up. Can't lobby this one away.


Now try actually distributing it.

My guess is this wouldn't even get close to getting through the review process for the Chrome Webstore. From our experience with Streak, this would def get picked up in review.

Seeing other comments in the thread pointing to this article as a reason why MV3 is bad, I think, misses the point. Personally I think MV3 is a step in the right direction (even though it negatively affects us!). But it's only one piece of making extensions more secure, the others being manual review, policy adjustments, and automated scanning. Even though the APIs allow for all sorts of functionality, that doesn't mean you'll be able to get through the rest of the checks.


My experience so far with publishing to the extension store has been that they examine both the code shipped as well as the scopes used. I've had apps rejected due to overly broad scopes and it was obvious based on the responses that the reviewer was pretty competent at JS.


He mentions this in the article, the heading is "Publishing to the Chrome Web Store"

"This extension would be laughed out of the review queue."


The real trick isn’t publishing a new app, it’s purchasing an existing app and pushing an update with malicious code. The latter review process is more lax


No it's not. From our experience at least.


Just buy an already published popular extension and submit an update


Updates still go through review.


This is a spicy essay for sure but what is the author's actual point? If the user grants you permission to do all these things, then you have permission to do all these things. If you can't be trusted and abuse that permission then you are not ethical. If you aren't ethical someone will find out and your extension will be removed in the worst case and simply not approved in the common case. The author even admits as much saying this thing would never pass Google's review process in a million years. Sounds like there's no real risk here and we're mostly just enjoying the show...

I do agree about the permission UI box. Surely that's a completely simple fix on Google's part to force the user to scroll through the permissions box before accepting.
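
A sketch of that fix, with hypothetical element lookup left to the caller so the gating logic stands alone: keep the accept button disabled until the permission list has actually been scrolled to the bottom.

```javascript
// Hypothetical consent dialog: the accept button stays disabled until the
// permission list has been scrolled to the bottom. Short lists that fit
// without scrolling pass immediately.
function wireScrollGate(listEl, acceptBtn) {
  const update = () => {
    // A couple of px of slack covers fractional scroll positions.
    const atBottom =
      listEl.scrollTop + listEl.clientHeight >= listEl.scrollHeight - 2;
    if (atBottom) acceptBtn.disabled = false;
  };
  listEl.addEventListener('scroll', update);
  update(); // a list that needs no scrolling needs no gate
}
```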


Is that solving any real problem? Will any single person actually be protected by that annoyance? The permissions already appear roughly sorted by invasiveness. Is the sixth one really going to be the one that your install decision hinges on? I mean, once you have "Read and change all your data on websites" it's game over anyway if the extension is truly malicious.


I think perhaps we need to agree on the real problems that exist.

It's not like users are getting their data stolen left and right out there and man these extension trojans are winning the battle against good extensions and we just can't shake 'em. All your base... are belong to us.

It seems the status quo is that extension fraud, while entirely possible as this article demonstrates, is not actually a problem. If you don't trust a piece of software to access your webpage content then don't grant it the permission to do so. I think the onus is on alarmist pieces like this to bring the data supporting the existence of a problem to be alarmed about in the first place.

I mean, raise your hand if you've been pwned by a malicious Chrome extension recently...


It's very easy for users to not understand what they're granting when apps ask for a bunch of permissions. One great example is Grammarly, which is a keylogger that helps users with grammar. I don't think Grammarly has bad intentions, but still, millions of users are giving it access to everything they type.


How else would it correct their grammar? Users want it doing that. It’s not malware.


Okay, maybe "keylogger" was incorrect, but Grammarly has the potential to expose users to all types of security risks they might not realize are possible.

Here's a past example: https://bugs.chromium.org/p/project-zero/issues/detail?id=15...

I have nothing against Grammarly; it's a legitimate company and they seem to respond well to security vulnerabilities, but it still bothers me because a user doesn't think through the potential ramifications of using software that records every keystroke.


It's a tutorial to build a Chrome extension. Like, a project template. He links source code, found a privacy issue in Chrome, and the Grinch-plundering Whoville character made me laugh. That's already asking a lot for a Substack post.


> force the user to scroll through the permissions box before accepting.

Something else used to do that. Java, maybe? Whatever it was had regular enough updates that I _habitually_ drag the scroll bar directly or simply hit the end key to this day when I get to EULAs and other long modal popups.


Look, I hate MV3 as much as the next guy. I've even wasted part of my life porting a large extension to it, so I might hate it MORE than the next guy. But I don't draw any security conclusions from this article.

For every permission in your manifest you need to provide the chrome web store reviewer with a written justification for why your extension needs that permission. Even the ones that don't prompt the user. And they definitely read it, and your code.

Shipping malicious extensions is almost entirely a social engineering problem and not a technical one.


> If we’re expecting the page DOM to change often (for example, with SPAs), we certainly don’t want to miss out on any valuable data. Just set a MutationObserver to watch the entire page, and reapply listeners as needed.

The code below this text is highly inefficient and may tip off the user through page-interactivity slowdown alone. A more efficient implementation could read input using the 'input' event[1]. For example, here[2] is how you would use the input event to detect changes to any fields in a page.

1. https://developer.mozilla.org/en-US/docs/Web/API/HTMLElement...

2. https://gist.github.com/eligrey/615fcc9fa9edbfb5153478109b5b...
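
For reference, a minimal sketch of that approach (the collector callback is a stand-in): since 'input' events propagate up the tree, a single capture-phase listener on the document root sees every field, including ones an SPA adds later, with no MutationObserver or per-element listeners at all.

```javascript
// One listener on the root instead of per-element listeners: 'input' fires
// for text fields, textareas, and selects, and it propagates, so fields
// added later by an SPA are covered automatically.
function watchAllInputs(root, collect) {
  root.addEventListener(
    'input',
    (event) => {
      const el = event.target;
      if (el && 'value' in el) collect(el.name || el.id || '(unnamed)', el.value);
    },
    true // capture phase, so page code can't hide events via stopPropagation()
  );
}
// In a page: watchAllInputs(document, (name, value) => { /* ... */ });
```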


Fair point, multiple people have circled this snippet as problematic. I'll confess, I didn't spend much time testing this for performance.

My idea was that separate inputs should have separate debounced handlers, but it's likely you could do away with that and just listen for input events globally with no adverse effect on data collection.


"Let's build a Chrome extension that taps what google is already stealing."

But "it isn't stealing if you clicked something somewhere sometime, so calling it 'stealing' is wrong" will be the PR response, because people are being paid to not understand that stealing is wrong.


Author here! I'm tickled to see that this whimsical cautionary tale is so resonant.


Do you have any concrete recommendations beyond the obvious "fix the glaring permissions UI scroll box issue"? Are you advocating for browsers to remove these permissions altogether because you feel they're too dangerous? Are you lobbying for a shift towards use-site triggered permission requests like Safari does? It seems you understand the tale is whimsical but I fear some people view it much more seriously and want to start an extension witch hunt. It would be nice to see a concrete call to action so it would be more clear what your proposed solution is instead of just inciting a bunch of pitchforks with no clear goal.


I'd like a fork of Chrome which removes all (or at least most of) the "features" mentioned - a browser that renders well but just doesn't support these masses of insecure features.

If you want to give 3rd parties access to all that stuff, you can run chrome. But I don't - I want the bare minimum that will run normal websites. I know that will break some pages, I'll accept that. (And that would give me a smaller & faster browser.)


If you don't install any addons, you'll be fine. You'll have to do without uBlock Origin and other ad blockers, though. Consider Brave if you still want those and want to stick to Chromium.

In Firefox, you can go to about:config and set the default permission to "deny" for a lot of stuff (notifications, clipboard, etc.). You can also disable WebGL and other features like those.


I'd feel much more secure, from extensions and websites alike, if those permissions just didn't exist. I can only turn off what I know about, not the new things quietly added with each update.


I don't understand this sentiment. Websites can't use these permissions so it's not applicable at all.


> uBlock Origin and other ad blockers

Router-based ad blockers work well; the Flint by GL.iNet comes with a nice UI, adblock, and a VPN built in.

Some people complain about its Chinese origin, but at least then I know only one government is spying on me - my provider supplies a router with a Linux kernel older than this house. There could be an entire ensemble of trojans partying in there.


Pi-Hole or the adblock package for OpenWRT are probably better examples. GL.iNet routers are already natively supported by OpenWRT (their firmware is just a custom fork), so one can flash them.


They come with OpenWRT already, just with a much nicer UI.

You can still access the LuCI OpenWRT UI if you need advanced features.

If you need a new router, they're a decent choice: better-than-average performance and no need to flash anything.


Yes, theirs is a custom fork. I only mentioned flashing to vanilla OpenWRT in relation to your comment about spying concerns (and since some, including myself, aren't that comfortable with the remote cloud service built into their firmware).


Not with encrypted DNS (DoH and DoT).


Your router can proxy encrypted DNS (if you have some decent firmware) or you can set up your own DNS server. There are also things like nextdns.io which can do all the work that a pihole does but works outside your home network.


I think the point is that some applications will use DoH/DoT/a custom protocol to bypass DNS-based blocklists. It's trivial to run your own DoH/DoT/custom server if you just hardcode the IP into your application.

You can still block those by doing IP-level blocks for known ad domains, but that becomes a problem if one of those domains is run from a shared cloud host (i.e. Cloudflare etc.), because you will also block legitimate domains.

Most in-app ads and tracking will still use HTTPS so if you use SNI sniffing + certificate validation (to prevent domain fronting) you can still do network level blocks, but that's quite resource intensive, especially at modern internet speeds.


And not once encrypted SNI has proliferated.



The features mentioned in the article are available to extensions, and ungoogled-chromium supports these just as well.


Sure, but you don’t need to install them.


> I want the bare minimum that will run normal websites. I know that will break some pages, I'll accept that

If you are willing to accept quite a few pages breaking, Lynx is probably the smallest and fastest browser:

https://lynx.invisible-island.net/


What else would you remove? If it's just extensions, can you just not install them?


Maintainer of a Chrome extension with 10,000+ installs here. Chrome doesn't willy-nilly approve your extension. They even take down extensions that ask for permissions you don't legitimately use. The article doesn't say for how long the OP was able to keep his extension on the Chrome store without it being reviewed or taken down.


It does: « Publishing to the Chrome Web Store − I’m kidding, of course. This extension would be laughed out of the review queue. »


What's up with the tons of fresh accounts (all created 3 days ago) posting plagiarized snippets in the comments? Various snippets from news articles, Quora, etc.

Sample of accounts: ChillNilly, LadyXaga, NerdAlerts, SuperDud, QueenBean, Moonshining, LetFree, FoxyFox22, TurkeyTurtle, LovableLily, BeingBean, CandyRandy, AdorableLama, WiseWolfie, WoozyWarrior, PenguinPeace, SunnyHorsey, SunnyMaylor, WiseSnail, ZappyHippo, FriendlyFlame, PudgyPanda, FriendlyFlame


It's very bizarre, and possibly why it got kicked off the front page


I have a Chrome extension with about 1000 DAU right now [link below]. I'm getting messages offering to buy the whole thing, but the buyer always fails to answer why they want to buy it. They're also open to buying any extension whatsoever. I suspect it's to open up the permission model and start stealing users' data.

link: https://github.com/prakhar897/workaround-gpt


One of the issues here is that the browser prompts the user for all the permissions at install time. Both Android and iOS have moved away from that. Perhaps it is time for browsers to move away from it as well.


Extensions are one thing, but I'd also welcome granular permissions to various JavaScript capabilities for every website. I don't like when some websites capture native browser hotkeys (CTRL+F), disable my right mouse button, change scrolling behavior or perform asynchronous HTTP requests. The only solution I found to protect against these practices is disabling JavaScript completely for given site, but more often than not it prevents the page from rendering altogether.
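
Short of disabling JavaScript entirely, a partial workaround is a user script run at document start; this is only a sketch of the idea and only defeats handlers the page attaches lower in the event path than the window.

```javascript
// Capture-phase listeners on window run before the page's own handlers on
// inner elements, so stopping propagation there keeps the page from
// swallowing right-clicks and Ctrl+F. Native browser behavior still runs
// because we never call preventDefault().
function installGuards(win) {
  win.addEventListener('contextmenu', (e) => e.stopPropagation(), true);
  win.addEventListener(
    'keydown',
    (e) => {
      if (e.ctrlKey && e.key.toLowerCase() === 'f') e.stopPropagation();
    },
    true
  );
}
// In a user script: installGuards(window);
```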


You need extensions to stop this behavior from websites. ;-)


At this point I want to be able to control what HTML and CSS can do!


And this is why I am hesitant to install any and all Chrome extensions.

Well done!


At the office we maintain a policy which restricts any extension installs but ones explicitly vetted by IT. I would go so far as to suggest any company that doesn't do this is negligently irresponsible with computer security at this point.


> Without looking, can you name more than half of the extensions you have installed right now?

Sure.

uBlock Origin, Multi-containers, Temporary Containers and cookies.txt on Firefox, which I only use for specific purposes. History and all data is wiped frequently.

None on Chromium, which I always use in incognito mode. I use this daily, but don't need even uBlock on it, since I run a DNS ad blocker on my network.

And none on my main browser, Luakit, since it doesn't support extensions. :) Technically, I have some user scripts, which I've all reviewed or written myself.

Browser extensions are the number one security and privacy risk for all users, more than any OS exploits. The fact they've historically been handled so poorly, and these issues exist even today, should be terrifying.

Great article and extension! <3


> Just set a MutationObserver to watch the entire page, and reapply listeners as needed.

I did not know such thing is possible. I want to make an extension which undeletes some chat messages in typical chats (usually that happens because of moderation)
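
A rough sketch of the idea, assuming a generic chat container (a real site would need site-specific selectors, and probably filtering so only moderation deletions get restored):

```javascript
// Watch the chat container; when the site removes a message node, re-attach
// a clone in its place. Re-inserting only produces an addedNodes mutation,
// so the observer doesn't loop on its own work.
function keepDeletedMessages(container, ObserverImpl) {
  const observer = new ObserverImpl((mutations) => {
    for (const m of mutations) {
      for (const removed of m.removedNodes) {
        m.target.appendChild(removed.cloneNode(true));
      }
    }
  });
  observer.observe(container, { childList: true, subtree: true });
  return observer;
}
// In a page: keepDeletedMessages(document.querySelector('#chat'), MutationObserver);
```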


>Identify and eject storage devices

I mean, why?


It’s all because of Chromebooks. Google has had to implement basically every capability as a JS API so Chromebooks can do real work.


You should have been around before Chromebooks, when any extension could do whatever it wanted without any permissions at all. Your understanding of history is missing some key pieces. Over time, Google has generally locked these APIs down, not opened them up.


Browser JS definitely never had the ability to unmount storage volumes before Chromebooks existed.


Oh but your Java applet surely could.


And nothing would prevent an extension from loading an applet


Extensions could, and this action could be triggered from browser JS.


Because browser makers and web app devs want to be able to do everything desktop software can, but inside of a browser.

In theory it’s kind of neat.



In theory that's basically like letting anyone on the internet run arbitrary code on your devices, which is a terrible idea. In practice it's like letting anyone on the internet run arbitrary code with a few guardrails that catch the worst and most obvious abuses, while it takes control away from the user and allows highly invasive forms of tracking that are very hard to prevent.


In theory, there is no difference between theory and practice; but in practice, there is.


Chrome has a USB interface; I'm guessing identification and ejection is required functionality for dual-mode USB devices that may present as storage devices for driver installation.


100% this is how people are getting their social media accounts hacked for scams, crypto stolen, etc.

Stronger passwords are useless when the session is stolen, when the actual data is read and sent off.


Being able to lift cookies from every website you are logged into sounds crazy and amazing. Anyone have any clue what people are using it for? Maybe roll your own session sync?
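
Roll-your-own session sync is exactly the kind of legitimate use: with the "cookies" permission plus host permissions, an extension can snapshot cookies and hand them to a sync transport of your own (left out here). A minimal sketch against the real chrome.cookies API shape, with the API object injected so the logic is testable outside a browser:

```javascript
// Snapshot the cookies for one domain; MV3's chrome.cookies.getAll
// returns a Promise when called without a callback.
async function snapshotCookies(cookiesApi, domain) {
  const cookies = await cookiesApi.getAll({ domain });
  return cookies.map(({ name, value, path, secure }) => ({ name, value, path, secure }));
}
// In an extension: snapshotCookies(chrome.cookies, 'example.com');
```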


The author notes that this sort of extension would be laughed out of the review queue... but there are plugin authors who get plenty of users by putting up a website and making the plugin available directly from their site.

For example, the author of FB Purity hasn't explained to anyone why his plugin is not available via Firefox's extension store, only via his page. Presumably, he didn't meet some requirements they had...but he won't say what they were...


And this is why I welcome “control over” what I can do with my device by my OS vendor. Even though it’s mostly trivial to bypass, it still serves as a good gut check in the rare case where I might skip over my own scruples. 99% of the time if I even get this far I back out because I’ve realized by that point I was being foolishly trusting.


Given all the extensions in the store that are at some point updated with a trojan to sell your internet connection to shady people, the "review queue" is some kind of mythical beast that doesn't in practice do or achieve anything.

It would be trivial for Google to find all the extensions using that kind of crap, but they don't care.


Don't all Firefox extensions have to be signed by Mozilla in order to be installable (in non-developer Firefox editions at least) these days? Even if they're publishing it on their own site, it should have gone through the review.


Yes. It's mostly automated review, usually taking a matter of minutes, though I guess Mozilla reserves the right to do manual checks if they find something suspicious.


Which is exactly why these permissions exist. If you don't take permissions that allow you to do horrible things, you are rubber-stamped and can go on your way. If you want to do more involved things, you're escalated.


My Firefox add-on has "<all_urls>" permission and gets rubber-stamped.


> Who maintains them? Is it the same entity that maintained it when you first installed? Are you sure?

Oh yeah, got bitten hard by that one myself a couple years back; it took Google days to respond to the extension buyer uploading a malware'd version. The worst problem is that extensions auto-update silently, so as a user you don't even have a chance to spot anything in time.


I don't understand why Chrome even does up-front permissions.

iOS got this right from the start: ask on the first attempted access of the gated resource, allow the user to grant the permission once or on an ongoing basis, respect the choice. Don't allow permission prompt spam.

Even Android recently moved to this model from up-front permissions, so Google is aware of it.
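
Chrome does have a halfway mechanism here: permissions listed under "optional_permissions" in the manifest can be requested at the moment of use (from a user gesture) instead of at install time. A sketch, with the permissions API injected for testability:

```javascript
// Ask for the "downloads" permission only when the feature is first used.
// chrome.permissions.request must be triggered by a user gesture and
// resolves to false if the user declines.
async function ensureDownloadsPermission(permissionsApi) {
  if (await permissionsApi.contains({ permissions: ['downloads'] })) return true;
  return permissionsApi.request({ permissions: ['downloads'] });
}
// In an extension: ensureDownloadsPermission(chrome.permissions);
```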


Is anyone aware of a Chrome extension (or other spyware) that uses the macOS system clipboard to steal WhatsApp data?

I recently had an incident where WhatsApp Web was open in a tab in a background browser window, different from the one I was actively using. I received and replied to a message on my phone. So imagine my surprise when I went to paste what I had previously copied from a web app in one Chrome tab into a text field in another, both in the active window, only to find that what was pasted was the second-to-last message I had sent in WhatsApp on my phone.

I have since deleted my Chrome profile at a system level, and the only extension currently installed is a well known password manager, but it bothers me to think what could have caused this aberrant behaviour, and whether there's something still installed on my system that's stealing data.


Probably off-topic. Has anyone done a security review of ublock origin chrome extension?


You really know how to write an article which is technical and at the same time fun to read.


High praise indeed! Thank you.


The probable intent of the author of this article is to let devs know about his book: "Building Browser Extensions". Ordered mine just now.


This is an excellent accidental rebuttal to the entire Manifest v3 project. The stated reason for the new major version and breaking changes is officially:

> Manifest V3 represents one of the most significant shifts in the extensions platform since it launched a decade ago. Manifest V3 extensions enjoy enhancements in security, privacy, and performance...

https://developer.chrome.com/docs/extensions/mv3/intro/

Web developers (see uBlock Origin for one) have been complaining that Manifest v3 breaks Chrome extensions for no discernible benefit, and that Manifest v3 exists entirely to protect Google's ad business. This article gives fresh evidence to support that assertion and showcases Google's deception: as seen in this blog post, extensions can request literally every permission, and the user permission warning actively hides the permissions below the content fold. These changes haven't been made for security's sake.

I wish the entire Manifest v3 was scrapped, but that likely won't happen. I'll settle for people assuming Google is lying by default.


It's good that it doesn't pretend to be a rebuttal, because it'd be a bad one.

I'm pretty sure the point of making a declarative content blocking API for adblockers is not to block all possible ways of writing a malware extension. It is just to make the most popular category of extensions safe by design. Once that has been done, it's then much easier to improve the situation with the remaining niche use cases.

What would those improvements look like? It could be finding other common ways of dangerous permissions being used by legit extensions, and extracting these patterns out as explicit and safe capabilities. It could be changing the messaging to make it easier for users to understand how dangerous the requested permission is (which they can't reasonably do while those dangerous permissions are still used by adblockers!). Or it could be a stricter review process for any extensions needing such permissions.

This extension that the author themselves think would never pass review doesn't really rebut that in any way.


The ad blockers this affects are popular because they do more than apply declarative blocklists. It may make them safe by design, but it also makes them something completely different and less capable than what they are today. There's some room to be suspicious about that.


That's totally fair. Nobody except the people who proposed / approved the project know what the main motive was. It could be trying to hinder adblocking, could be security, it could be performance, or it could be that somebody just wanted to copy Apple.

If we as outsiders try to reason about that decision, it makes sense to pick the strongest version of those motives, not just strawmen. There's a good security argument to be made, and a silly security argument. If you pick the latter one to argue against, of course it will look like a bad excuse, leaving the more venal explanations as the only possibilities.


How is preventing extensions from blocking requests making them safe by design? You can still use the API to record every network request and send it to a server.
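
MV3 removed the blocking form of webRequest, but the observational form survives, so an extension with the "webRequest" permission and broad host permissions can still see every URL. A sketch, with the API object injected for testability:

```javascript
// Log every outgoing request the browser makes. This is read-only in MV3:
// the listener can observe but no longer block or modify the request.
function logAllRequests(webRequestApi, log) {
  webRequestApi.onBeforeRequest.addListener(
    (details) => log(`${details.method} ${details.url}`),
    { urls: ['<all_urls>'] }
  );
}
// In an extension: logAllRequests(chrome.webRequest, console.log);
```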


Chrome extensions that contain malware aren't written and submitted to the Chrome store hoping to sneak past review. Malware authors _purchase_ the intellectual property of fully functioning, useful extensions, and update them to contain their extra malware payload. I'm not sure where you got the idea that a review would be involved here at all.


AFAIK updates go through a review process as well.

Reducing the attack surface has similar benefits for this case: there will be fewer extensions with dangerous permissions around for bad actors to buy, and the reviews for the remaining legit uses of those dangerous permissions can be stricter.


The counterpoint might be that being declarative it's easier to do static analysis to find malicious extensions. But I'm not sure how much I buy that argument, and it's no excuse to disable v2 extensions entirely.


It just removes the OnBeforeRequest() way of injecting javascript. There are other directly supported ways to inject javascript, some of them easier than OnBeforeRequest(). The counterpoint would have to be something like removing that was just the first step, and that they plan on closing all the doors. But, if you close all the doors, really all the most popular extensions are hobbled.

Edit: Removing just OnBeforeRequest() JS injection does sort of uniquely harm heuristic ad blocking and things like Tampermonkey. It's not hard to feel like that was probably the real goal.


It's funny. They crippled extension usefulness in the name of "security", yet you can still make something like this that will steal every piece of your data and masquerade as your tabs while performing malicious behavior.

Very secure, indeed! But at least those pesky adblocks are defeated.


This is my main worry with Firefox as well.

How can I even be confident beyond reasonable doubt that the uBlock Origin extension I have installed won't suddenly start exfiltrating any passwords I enter on websites, for example.


>How can I even be confident beyond reasonable doubt that the uBlock Origin extension I have installed won't suddenly start exfiltrating any passwords I enter on websites, for example.

You can't, just like you can't be confident that any other piece of software on your computer won't start doing it. This problem isn't specific to extensions in any way and I don't understand why people act like it is. If the risks are too much for you, don't install them, just like you wouldn't install any other software you don't trust. Don't try to ruin it for other people who understand and are willing to take the risk.


Please try not to extrapolate my comments to conclusions like I'm (quote) trying to ruin it.

Keywords are "beyond reasonable doubt".

uBO is a Recommended Extension, and even has a badge that says it's only granted to extensions that meet their standards of security.

But do we know if Firefox manually reviews updates as well, for changes in their source code? Or do they only review them once (at the moment where they grant that badge)?

I can't find conclusive info on that front.

I would be happy to know that a few extensions get their updates manually reviewed.


Like all software: disable automatic updates, build from source, and vet patches when you pull them.

I can't be bothered doing that, but then I'm willing to take the maintainer at their word when they say their software doesn't just have a Free license, but also respects my freedoms. https://github.com/gorhill/uBlock/wiki/Can-you-trust-uBlock-...


You cannot. Not with any software you didn't write or very carefully review. It's always about trust.

In the case of your example, I think gorhill is a prime example of a trustworthy author. He has a very good track record, never betrayed the users, explains his thought process and behaved consistently in the users (my) interest in the past. uBlock Origin is the one extension I trust the most, even more than say the Multi Container extension from Mozilla.


My worry is not the author or the extension; it's someone being able to push a new release of the extension (e.g. signing keys compromised, some vulnerability somewhere, etc), and it being automatically approved by Firefox without manual review.


Reading this makes the Apple App Store walled garden not seem so bad after all.

If someone were to submit an extension with this manifest, would it even be reviewed, or would it need to be flagged first?


Cool. What’s the easiest way to push this data to a remote server?


If not alerting the user was a primary goal, they messed up by using a MutationObserver on the entire page. It would absolutely grind the browser to a halt, especially running across multiple tabs.


Great stuff! This is what I come here for.


Yea Chrome would be a lot more secure if we didn't let anyone view any data at all.


Just the title of this post alone should make it more than obvious that the article is not about preventing anyone from viewing data, but rather about granting anyone access to it in an exploitative way.

In response to the article itself: you might even be able to get such an extension on the Firefox Extensions store and just maybe get a "Recommended" status too. Refer to my comment in another post from a few days ago for other such current violations: https://news.ycombinator.com/item?id=34832280


chrome.tabs.captureVisibleTab()

Does anyone know what the actual legitimate use case for this API is? It seems very dangerous to allow extensions access to it.
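
The usual legitimate users are screenshot and annotation tools (and full-page capture extensions, which scroll and stitch). A minimal sketch of the call, with the API object injected for testability:

```javascript
// Capture the visible area of the focused tab as a PNG data URL. Requires
// a host permission for the page (or activeTab granted by a user gesture).
async function screenshotVisibleTab(tabsApi) {
  // Resolves to a string like "data:image/png;base64,...".
  return tabsApi.captureVisibleTab({ format: 'png' });
}
// In an extension: screenshotVisibleTab(chrome.tabs);
```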


[flagged]


Nice, a bot account trying to farm karma.


[flagged]


Cool, two bot accounts farming karma


[flagged]


Worth noting the "Netflix Party" in question is not the extension now called Teleparty (previously Netflix Party), which is an order of magnitude more popular than the compromised equivalent named here


[flagged]


Wild! There's 3 "bot" accounts posting stolen snippets



