
>Before a project starts using a new crate, members usually perform a thorough audit to measure it against their standards for security, correctness, testing, and more.

Do they? I mean really? Let's lay aside the fact that it's almost impossible to eyeball security. I just cannot imagine that Google works so differently from every company I've ever worked at that they actually carefully check the stuff they use. Every company I've worked at has had a process for using external code. Some have been stricter than others; none has meaningfully required engineers to make a judgement on the security of code. All of them boil down to speed-running a pointless process.

And that leaves aside the obvious question: I want to use a crate, I check it 'works' for what I need. Some middle manager type has mandated that I now have to add it to the crate audit (FYI, this is the point at which I dropped importing the library and just wrote it myself), so I add it to the crate audit. Some other poor sap comes along and uses it because I audited it, but he's working on VR goggles and I was working on in-vitro fertilization of cats, and he's using a whole set of functions that I didn't even realise were there. When his VR goggles fertilize his beta testers' eyes with cat sperm due to a buffer overflow, which of us gets fired?




Some useful context here:

https://chromium.googlesource.com/chromiumos/third_party/rus...

Seems there are 3-4 folks who helped build this and spent a lot of time doing initial audits; they outsource crypto algorithm audits to specialists.


Before the layoffs I worked on a security checks team (“ISE Hardening”) at Google. Google requires, for almost all projects, that code be physically imported into the SCS; when this code touches anything at all, extremely stringent security checks run at build time.

These checks often don’t attempt to detect actual exploit paths; instead they flag usage of APIs that may lead to vulnerabilities. They can only be disabled per file or per symbol, and per check, by a member of the security team, via an allowlist change that has to be in the same commit.

This is not perfect but is by far the most stringent third party policy I’ve seen or worked with. The cost of bringing 3p code into the fold is high.

The flipside of this is that Google tech ends up with an insular and conservative outlook. I’d describe the Google stack as ‘retro-futuristic’. It is still extremely mature and effective.


Like many here I haven't seen the Google sausage being made, but I've had many Googler coworkers and friends over the years. I've learned that they may really be in another universe (e.g. put every single line of code over all space and time in the same SCCS, oh and write a new kind of build system while you're at it because otherwise that...doesn't work). So possibly they just don't use external dependencies, and the small number they do use really are "properly" audited?

But meanwhile in the regular universe, yes it happens the way you say.


Google uses a fair number of external dependencies. But Google imposes a fairly heavy cost to add a new dependency. You (and usually your team) have to commit to updating the dependency in the future (only one version of a dependency is allowed at any given repo snapshot) and fixing bugs. Often it is easier just to write the code yourself than to pull in a trivial dependency (nobody is using left-pad!).

Adding a dependency also generates a change list (because dependencies are vendored), and so the normal code review guidelines apply. Both the person adding the dependency and the reviewer should read through the code to make sure that the code is in a good state to be submitted, like any other code (excluding style violations). Small bugs can be fixed with follow up CLs. If the author/reviewer doesn’t understand e.g. the security implications of adding the dependency, they should not submit the CL.


I've talked to many Googlers over the years, and your summary is consistent with what I've heard before, so I don't think you're lying. But this is still the most insane dependency management scheme I've ever heard of. Is Google truly so far up their own ass that they make it harder to pull in a third party library than write the code in-house? Why is Google so allergic to using a package manager like every other software project in open source?

You depend on any modern JS library like Babel or Webpack and it pulls in a dependency tree consisting of hundreds of packages. I cannot fathom that the expected and approved workflow is for someone to check in their node_modules directory and be expected to security-audit every single line, and "own" that source code for the entirety of Google. Sounds absolutely insane.

Not to mention needing to hand-audit that every transitive dependency of Babel and Webpack works with every other module in the repository, because of the one-version policy that exists for some "good" reason.


> But this is still the most insane dependency management scheme I've ever heard of. Is Google truly so far up their own ass that they make it harder to pull in a third party library than write the code in-house? Why is Google so allergic to using a package manager like every other software project in open source?

In the context of working in a highly sensitive business environment, I think the typical defaults of most package managers are way more insane than the practices being described (vendoring, auditing, etc.). I think Google is just being upfront about the costs of dependencies, which are often hidden by package managers. At the end of the day it's just code written by other people, and using that code blindly has huge risks.

I think this is pretty context specific though. Do I care if my hobby project goes down for a day because a dependency auto-updated and broke something? Not really.


> Is Google truly so far up their own ass that they make it harder to pull in a third party library than write the code in-house?

From the descriptions in this thread, pulling in a third-party library is still far easier than writing the code in-house for them.

At least, it sounds to me like, for adding the kind of example you gave, their process for adding the dependency is on the order of person-weeks or, in the worst case, months, while writing the code themselves would be on the order of person-years or decades.


I think it is interesting how both possible stories get criticized.

Option 1. Google has minor but uninteresting restrictions on pulling into //third_party: "well these audits are obviously useless because nobody reviews the code that closely."

Option 2. Google has very strong restrictions on pulling into //third_party: "this is so far up its own ass and completely unproductive."


> All of them boil down to speed-running a pointless process.

There's a pretty large gap between auditing every line of code and doing nothing. Google does a good job managing external dependencies within their monorepo. There's dedicated tooling, infrastructure, and processes for this.


Starting over a decade ago, I instituted auditing packages used from a Cargo-like network package manager, in an important system that handled sensitive data.

I set up the environment to disable the normal package repo access. Every third-party package we wanted to use had to be imported into a mirror in our code repo and audited. (The mirror also preserved multiple versions at once, like the package manager did.) New versions were also audited.
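
In today's Cargo terms, the rough equivalent is running "cargo vendor", checking the output in, and adding a source replacement to .cargo/config.toml so builds can only see the checked-in mirror (the directory name below is just an example):

    # .cargo/config.toml, checked into the repo alongside the vendored sources
    [source.crates-io]
    replace-with = "vendored-sources"   # never reach out to the network registry

    [source.vendored-sources]
    directory = "third_party/vendor"    # the audited, checked-in mirror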

One effect of this was that I immediately incurred a cost when adding a new dependency on some random third party, which hinted at the risk. For example, if a small package had pulled in a dozen other dependencies, which I also would've had to maintain and audit, I would've gone "oh, heck, no!" and considered some other way.

At a later company, where people had been writing code pulling in on the order of a hundred packages from PyPI (and not tracking dep versions), yet it would have to run in production with very, very sensitive customer data... that was interesting. Fortunately, by then, software supply chain attacks were a thing, so at least I had something to point to, to show that my concern wasn't purely theoretical but a real, active threat.

Now that I have to use Python, JavaScript, and Rust, the cavalier attitudes towards pulling in whatever package some Stack Overflow answer used (and whatever indirect dependencies that package adds) are a source of concern and disappointment. Such are current incentives in many companies. But it's nice to know that some successful companies, like Google, take security and reliability very seriously.


Yes, some people review literally every line. Cargo-crev has a field for thoroughness. Many reviews are just "LGTM", but some reviewers really take time to check for bugs and have flagged dodgy code.
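
A crev review is a signed proof recording who reviewed which exact package version and how carefully. Trimmed down (and from memory, so treat the exact field names as approximate), the interesting part looks something like:

    package:
      name: some-crate          # placeholder name
      version: 1.2.3
    review:
      thoroughness: high        # none / low / medium / high
      understanding: medium
      rating: positive          # negative / neutral / positive / strong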


> When his VR goggles fertilize his beta testers' eyes with cat sperm due to a buffer overflow, which of us gets fired?

The PM gets promoted for encouraging fast experimentation!


At least sometimes: https://cloud.google.com/assured-open-source-software

Only ~1,000 packages, but it certainly seems they do that for a subset.


> When his VR goggles fertilize his beta testers' eyes with cat sperm due to a buffer overflow

Ahh, classic undefined behavior.


Well just today I found unsoundness in a crate I was auditing. It turned out that the crate had since removed the entire module of functionality in question so I couldn't submit a bug, but it led me to take steps to remove use of the crate entirely.
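
For anyone wondering what "unsoundness" means in practice: a safe API that can trigger undefined behaviour. A made-up illustration of the kind of thing an audit catches:

    // Hypothetical, not the actual crate in question: this function is not
    // marked `unsafe`, yet it lets safe callers produce a &str that may not
    // be valid UTF-8, which is undefined behaviour.
    pub fn as_str(bytes: &[u8]) -> &str {
        unsafe { std::str::from_utf8_unchecked(bytes) }
    }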


Don't forget that you need to do this not only for the crate you depend on, but the whole dependency subtree that comes with it as well.
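
cargo tree makes the size of that subtree painfully visible ("some-crate" below is a placeholder):

    # show the full subtree a single dependency pulls in
    cargo tree -p some-crate

    # rough count of unique crates across the whole project's tree
    cargo tree --prefix none | sort -u | wc -l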



