Super-secret Google builds servers in the dark (arstechnica.com)
91 points by llambda on March 15, 2012 | 49 comments



What bizarro world have we woken up in where Facebook is leading the charge for server openness?

"At Google we believe that open systems win. They lead to more innovation, value, and freedom of choice for consumers, and a vibrant, profitable, and competitive ecosystem for businesses...

We need to lay out our definition of open in clear terms that we can all understand and support...

There are two components to our definition of open: open technology and open information. Open technology includes open source, meaning we release and actively support code that helps grow the Internet, and open standards, meaning we adhere to accepted standards and, if none exist, work to create standards that improve the entire Internet (and not just benefit Google). Open information means that when we have information about users we use it to provide something that is valuable to them, we are transparent about what information we have about them, and we give them ultimate control over their information."

http://googleblog.blogspot.com/2009/12/meaning-of-open.html#...


People (generally) open up things that aren't fundamental to their core. Google opens a lot of software, but very little about the actual core that runs everything, because those instant results are a key differentiator.

FB opens their hardware because the hardware is really quite meaningless to their core. The thing that matters to them is the social graph, which they guard jealously.

If you can make your competitor's core a commodity, you've gone a long way towards reducing their effectiveness as your competitor.


> If you can make your competitor's core a commodity, you've gone a long way towards reducing their effectiveness as your competitor.

I've been considering this idea a lot and I think it explains the motivations of a surprising number of technical advancements. The masses get better free/open stuff as new competitors come along and enlist the community to help undermine the incumbents.


This is pure speculation, but I wonder if Google has photoelectric sensors that use light levels to send alerts to their administrators.

I bet it's a clever way of handling physical security. If anybody is accessing the server, they'll want a flashlight. The light sensor detects the light and alerts an admin. If it isn't a scheduled access, then danger, Will Robinson!
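A minimal sketch of what that could look like, purely hypothetical (the lux threshold, schedule store, and alerting hook are all made up for illustration):

    # Hypothetical sketch: flag light detected in a cage outside an approved
    # maintenance window. The sensor feed and schedule are stand-ins, not any
    # real system.
    from datetime import datetime, timezone

    LUX_THRESHOLD = 5.0  # anything above near-darkness counts as "lights on"

    def is_scheduled(cage_id, now, schedule):
        """True if `now` falls inside an approved access window for this cage."""
        return any(start <= now <= end for start, end in schedule.get(cage_id, []))

    def check_cage(cage_id, lux_reading, schedule):
        now = datetime.now(timezone.utc)
        if lux_reading > LUX_THRESHOLD and not is_scheduled(cage_id, now, schedule):
            alert_admin(f"Unscheduled light in cage {cage_id} at {now:%H:%M}Z")

    def alert_admin(message):
        # In practice this would page someone; printing stands in for that here.
        print("ALERT:", message)

    check_cage("cage-42", 120.0, {})  # no approved window -> fires an alert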


Interesting idea!

Or similarly, perhaps they want anyone lingering/lighting their areas to be extra-obvious on security footage.

Without explanations like these, it's hard to see what security gradient darkness provides against people who have leisurely physical access. Spies whose access isn't being closely observed can bring their own lighting, cameras, and even infrared equipment.


I bet it's a clever way of handling physical security. If anybody is accessing the server, they'll want a flashlight.

http://www.youtube.com/watch?v=xyjn8aybPBs </humor>


Lights also generate a fair amount of heat and use unnecessary energy -- both things you don't want in a data center. Was that part of the calculus?


They don't generate anywhere near the heat nor use anywhere near the amount of energy the servers do. I imagine this wasn't a big part of the equation. Google is incredibly secretive about datacenters and servers. Speaking as a former engineer who worked on web search, all information about datacenter hardware was very, very confidential, way more than plenty of other confidential info. I hadn't heard about the lights out policy before, but it doesn't surprise me.


Well, given the pics of the DC that Google released, I do not think they take physical security that seriously.

The fences were low, and there was only a single fence. They did not cut back the woods next to the fence, and they did not take the simple step of extending the nearby lake to make a proper moat.


Your competition is only going to go so far. "I got turned around in the datacenter, oopsie" is something that happens. It's not likely that Amazon is going to send someone over a fence, regardless of how low it is. That's crossing the line.

vaguely related: http://old.post-gazette.com/pg/06188/704045-28.stm


Amazon may not be sending someone in. But there could be some "paparazzi" out there who would like to get a photo of this for their own fame. Then it would be visible to Amazon, Facebook, et al.


But that ignores standard practice at even low-security places. For example, the fences at BT Labs (the UK's Bell Labs) were about 50% higher and curved inwards (to keep intruders inside).

Though one visiting US telco guy said of BT Labs, "Fuck me, it's a prison."


Google Data Center Security video from 2011: http://www.youtube.com/watch?v=1SCZzgfdTBo

It's probably not going to stop heavily armed invaders, but nobody's going to sneak onto the data center floor.


It seems like a popular view here that Google is just trying to save twenty bucks on its electric bill, and I hate to (be the fourth guy in this thread to) break it to them, but this has zero to do with electric bills.

Related: Santa not real.

{not to single nostromo out, it's a legitimate question but it seemed out of place as top comment}


That was my first thought as well, but thinking further, LED lights could be used, and the heat and energy they generate would be so low that it shouldn't be part of the equation.


Comparing lumens per watt, T5 fluorescent lights are still more energy-efficient than most LED lights.


I suspect there is another dimension to this. Not critical, but I think it is there: the desire to stand out, to be different. People like rituals and complicated processes. You can be just a regular run-of-the-mill engineer, OR you can be a Google engineer wearing a miner's hat, imagining you are opening up HAL to examine or fix.

In a way, look at the military. It is very ritualized: for example, there are changing-of-the-guard ceremonies, and specific culture and practices that are passed down. Any large organization will have those. It sort of helps with group cohesion and assigns a level of importance to even mundane things like swapping out a hard drive.


As someone who has been in that facility I chortled.

I actually thought that after Facebook's Open Compute work they would loosen up a bit at Google; guess not.

As for the camera question: it can generally be pretty informative if you can take a picture of 'all' of someone's infrastructure to get a handle on what their costs are, but I doubt that seeing a bunch of SuperMicro, HP, Dell, or BrandX faceplates in one installation out of hundreds reveals too much info.


Google: It's important X does not see our super secret servers, how best may we attain that, team?

Team: We suggest we co-locate with X and then turn the lights off - that way they can't see our super secret servers.

Google: Awesome idea!

Some time later:

Co-Facility: Here come the gogglers with their head lamps, quick turn the lights off.

Co-Facility: They are gone, don't they look great with their LED lights, make a cool movie huh?

Co-Facility: Crikey, here come the X's, quick turn the lights on, oh man it's such a shame they don't have LED head lamps, they would look so cool if they did.


I'm very surprised that they keep secret, custom gear in a semi-public shared data center. What exactly prevents non-Googlers from wearing their own headlamps or using a camera flash?


"I'm very suprised that they keep some secret, custom gear in a semi-public shared data center. "

Google wants speed above all else, which in some cases is best delivered by shared centers like Equinix's. That sometimes finds itself at odds with Google's desire for secrecy around its own proprietary advances.

"What exactly prevents no-googlers from wearing their own headlight or using a flash in camera?"

Presumably rules. Ok, so people can break the rules. The amount of information they'll get from looking at a server with a flashlight isn't very much though.

Turning the lights off is like putting piranhas in the moat outside your castle. The castle is the enclosure; the moat is the contract and rules you have with the people running the site. The piranhas aren't really what's protecting you, but they close off some attack vectors.


I'm just saying that it's a very silly protection. It's not like piranhas; it's more like having a moat full of herrings. For me, it's the kind of protection that says "hey! there is something interesting here" but doesn't really prevent anyone from peeking. I understand why they use shared centers, and that they actually have some kind of custom equipment that has to be used there for our benefit. I think that putting it in a normal-looking rack would be a far better protection.


It's actually better than piranhas; it's more like frogs that make a loud noise when someone tries to swim across the moat. If Google's area is not supposed to have lights, and lights go on there, that's suspicious. Google could have light sensors on or near the servers, along with surveillance cams, and hell, let's throw in a bullhorn for good measure...

"Step away from the server Mr. Ballmer!"

"Drat, foiled again! Meddling teenagers!"

Bullhorn aside, if you have 100K servers that might actually be an efficient way to monitor security for them.


A lot of co-lo facilities frown upon people taking cameras/phones into their server rooms.


They do, but with the camera and video equipment available today, unless they search people before they enter facilities and confiscate devices, it's more of a gentleperson's agreement that people will not take surreptitious photos of others' equipment.


Cameras, certainly, but wouldn't flashlights be pretty standard?


I walk into such facilities carrying a backpack full of assorted gear and two camera-equipped smartphones, and nobody ever says a word. They know there's no practical way to enforce the rules.


I imagine that if the DC didn't have stringent enough security policies to keep non-Googlers out, Google wouldn't be using it.


OK, so... let's think about this. If Googlers can buy flashlights, so can anyone. What's more likely is that certain datacenters have limited heat removal ability, and room lights are just a waste of heat.

It's fun to imagine what happens at Google, but you have to temper your wild speculation with some thoughts about the real world.


I'm still confused what would motivate a company to be so secretive about its hardware designs or infrastructure size.

I understand that having more efficient infrastructure is a competitive advantage, but it seems as though just seeing their machines wouldn't provide enough insight to duplicate them. "Hey, Google's servers have fans on the back and the front" or "Hey, Google's servers are 2U, not 1U" don't seem like observations that would provide a competitor any insight. Is designing these types of machines not as straightforward and standardized as it seems?

Similarly, say a competitor saw their racks and noticed SSDs instead of spinning disks. What information does that really provide? It's reasonable to assume that each company tries to minimize cost and maximize performance, and that for similar situations they would make similar decisions. Is it for strategic value -- say, knowing where on the price-performance scale Google is choosing to be -- or is it to calculate their operating costs so as to better price their own products?

I'm not even sure why quantity of infrastructure is closely guarded. What advantage would a competitor gain knowing that Google has, say, 2M servers instead of 1M?


Well, pre-IPO, seeing the resources Google sank into its data centers would let competitors estimate search ad revenue, which is a figure they wanted to hide.


This is just a relic of the pre-IPO days when Google kept their infrastructure secret because they didn't want anyone being able to guess how big search was.

Beyond that, Google does a lot of things to run their data centers cheaply - things like keeping a higher ambient temperature (at a cost of higher hardware failure rate), the physical arrangement of servers, etc.


This is all very speculative. I'd say they just want to save power.


They are asking the facilities to turn off their overhead lighting as well -- which presumably is not billed.

Moreover, since most would be fluorescent, the amount of energy wasted would be minimal compared to equipment.


> They are asking the facilities to turn off their overhead lighting as well -- which presumably is not billed.

I assume Google owns their own data centres, and even if not, it's highly likely they would negotiate price on points like this with the owner.

[EDIT: okay, so it's been pointed out that I didn't read the article properly (the facilities are not Google-owned). The second point still stands, however; I am sure that Google can negotiate price on a total-energy package.]

> Moreover, since most would be fluorescent, the amount of energy wasted would be minimal compared to equipment.

I don't know how you can make that statement. There is probably a pretty low frequency of physical access to each server, so it seems obvious to me that a couple of rechargeable LED lamps roaming around a huge space is going to be hugely more energy-saving than constantly drenching the whole place with fluoro lights (which also need regular replacement and add to the heat load).


> I assume Google owns their own data centres

The article specifically referenced Google's cages in Equinix data centers.


>I assume Google owns their own data centres, and even if not, it's highly likely they would negotiate price on points like this with the owner.

The article is specifically talking about space Google rents in other data centers. It would appear they are more sensitive to losing competitive advantage than to a few dollars in energy savings.


If there is one fluorescent light strip (~20 W) over every two rack cabinets (~10 kW combined), that's only 0.2% of the power consumption going to lighting. Saving this is far from being "hugely more energy-saving".


Comparing it as a percentage of what the servers use is spurious. An energy saving is an energy saving: servers need to be on and lights don't. 20 W times hundreds of racks (plus extra for heat extraction) is still far more than a couple of 5 W headlamps, so why spend money on this if you don't need to?


By some estimates, Google has the equivalent of 25,000 racks of hardware, maybe at most a quarter of them hosted in the shared facilities where they turn the lights off; say 6,000 racks of hardware. Continuing my calculation, that would be 3,000 ~20 W fluorescent light strips. Running 24/7, they would consume about 530 MWh/year. This is only $53k/year at the average utility rate of $0.10/kWh.

If you continue to argue that Google, a company making $30B+ of revenue/year, turned off the lights to save a meager $53k/year, you are making a fool of yourself. They would save more by merely firing this secretary who smokes cigarettes in the bathroom during lunch breaks.

Google is all about efficiency, down to giving technicians scooters to quickly move through large data centers, and choosing the optimal placement of the velcro straps holding hard disks to the chassis for ease of service. I bet they would rather leave the lights on than fumble with headlamps or flashlights...

Clearly the article is right, Google did it for privacy reasons.
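Spelling out the arithmetic (every input here is one of the estimates above, not a measured figure):

    # Back-of-the-envelope check of the lighting estimate above.
    STRIP_W = 20          # one fluorescent strip, watts
    RACKS = 6000          # racks in shared facilities (estimate above)
    STRIPS = RACKS // 2   # one strip per two racks
    HOURS_PER_YEAR = 8760
    RATE = 0.10           # $/kWh, average utility rate

    lighting_kw = STRIPS * STRIP_W / 1000             # 60 kW
    energy_mwh = lighting_kw * HOURS_PER_YEAR / 1000  # ~526 MWh/year
    cost = energy_mwh * 1000 * RATE                   # ~$52,560/year

    print(f"{energy_mwh:.0f} MWh/year, ${cost:,.0f}/year")
    # -> 526 MWh/year, $52,560/year, i.e. roughly the $53k figure above

    # Per-rack share: a 20 W strip over a ~10 kW pair of racks is
    print(f"{STRIP_W / 10_000:.1%} of rack power")    # -> 0.2% of rack power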


Your calculation is wrong. Even taking all of your assumptions, 530 MWh * $0.10 per kWh is $53,000 (not $53), no meagre sum.

It would be very nice if you would admit that it is not so 'clear' that I am wrong now.


$53k is $53000. k means kilo (thousand).


You did not take into account the energy to create those headlamps, and to replace worn batteries. I estimate each headlamp uses as much energy as a 15-watt lamp burning continuously for a year.

Depending on how much they use them (they probably last around 500 hours before being worn out), they may come out ahead or behind, but it's not a slam dunk.

> plus extra for heat extraction

Extracting the heat takes about 1/3 as much energy as was emitted, if that helps your calculations.
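Folding that 1/3 into the lighting estimate from upthread (a rough sketch; every number is one of the thread's guesses, not a measured figure):

    # Add heat-removal overhead to the upthread lighting estimate.
    LIGHTING_MWH = 530      # fluorescent strips running 24/7 (upthread estimate)
    COOLING_FACTOR = 1 / 3  # energy to extract the heat, per unit of heat emitted
    RATE = 100              # $/MWh, i.e. $0.10/kWh

    total_mwh = LIGHTING_MWH * (1 + COOLING_FACTOR)
    print(f"~{total_mwh:.0f} MWh/year, ~${total_mwh * RATE:,.0f}/year")
    # -> ~707 MWh/year, ~$70,667/year -- bigger, but still small change at Google's scale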


> You did not take into account the energy to...

I also didn't include the energy (or cost) to create thousands of fluoro tubes, nor the wages paid to constantly replace the ones that burn out... which is going to be far more than a couple of headlamps you can buy from the $1 store and some rechargeable batteries.

> Depending on how much they use them (they probably last around 500 hours before being worn out) they may come out ahead or behind, but it's not a slam dunk.

I agree that 'how much you use them' is important. None of us really know how often they are accessed, but to me even for a very heavily accessed facility it does feel like a 'slam dunk'. If you feel the opposite maybe we just disagree here.

I'm not sure what wears out after 500 hours, by the way: batteries? LED lighting? Neither is going to cost an arm and a leg.


Save power by working in a helmet, to save 30 minutes of a light bulb's output, when you host equipment worth millions eating up a gazillion kilowatts?


At Google's scale, micro-optimizations add up to millions of dollars.


Perhaps Google is just screwing with everyone? Making your techs wear helmets is a bit rough, but they seem like a company that likes to maintain a certain air of mystery.


Another crazy idea: the servers use some sort of proprietary interconnect which works better without interference from typical fluorescent lighting.


What does it say about the barriers to entry for colocation that Google still has to host in someone else's building?



