WSJ made to believe Google abandons strict net neutrality (wsj.com)
39 points by ckinnan on Dec 15, 2008 | 30 comments



The meat is buried at the end of the article:

Google's proposed arrangement with network providers, internally called OpenEdge, would place Google servers directly within the network of the service providers, according to documents reviewed by the Journal. The setup would accelerate Google's service for users.

Akamai has been doing this for years. Is it a violation of network neutrality? The routers don't treat Google's packets any differently (as far as we know).



No, it is not, but the anti-net-neutrality lobby wants to present it as "see, no one really believes in net neutrality."


I'll bite - this strikes me as worse than Akamai, because anyone who is a customer of Akamai can get onto those servers. Only Google will have access to these servers - I'd almost prefer that even the Akamai arrangement were against the rules.

I know, I know - this suggestion seems crazy. But maintaining dumb pipes is critical to maintaining the separation between content and carriage. We've already seen the cable companies get into fights over bundles of channels - and putting servers inside the networks would essentially turn websites into the new cable channels.


this strikes me as worse than Akamai

In what way is Akamai bad?

And if Akamai isn't bad (as I don't see any way to say they are), then how would Google doing this for their own use rather than paying Akamai rather astounding fees be "worse"? I'm failing to see how this is anything other than intelligent and efficient use of bandwidth. Bringing the content closer to users is good for consumers. It doesn't cost the consumer anything, it doesn't cost the telcos anything (it actually saves them money), and it allows Google to provide better service.

The only argument I could see against it would be, when does it end? Does Microsoft have to build a CDN like this to keep trying to compete with Google? Yahoo, I believe, already has a very favorable agreement with Akamai, so they don't. But everybody else that has a very large web presence might want the same privileges. But, since it saves telcos bandwidth, and they're the only ones that would need to accommodate these new local servers, it's probably perfectly reasonable to let them decide who they want in their data centers.


You read my argument exactly right: "when does it end?" If this becomes standard practice, you'll have a select number of entrenched sites inside the network. That then gives the telco provider more power than it has as a mere dumb pipe, replicating the carriage issue I raised with my cable precedent. There's no reason to believe that the telcos will take everyone willing to pay a certain amount; these deals will all be negotiated, bundles will be introduced, and you'll kill the promise of a free and unfettered internet where we all have equal footing.

I'd be OK with licensed CDNs, but not with content and/or web application providers. Again, this is me with my antennae at attention, looking at a worst possible case scenario, but I play the red pieces. (Yes, I coined that phrase, which is why you haven't heard of it. Yes, I'm mixing chess and war games metaphors. Yes, damn straight I think it's a cool phrase.)


I'd be OK with licensed CDNs

Who "licenses" the CDNs? That sounds scarier to me than adding more caching to an already heavily cached infrastructure.

You do realize this is all just a matter of degree, right? There are probably a half dozen caches involved in every request you make. Certainly DNS has multiple caches along the path to getting your data, your browser caches local copies of lots of data, and your ISP probably has a cache as well. A CDN is just a specialized form of cache that can be a little bit smarter than each of those caches.
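
To make that concrete, here's a toy sketch of the lookup chain (the layer names and cached contents are made up for illustration; the point is just that a CDN is one more stop before the origin):

    # Hypothetical cache layers, checked in order; the origin is only hit on a full miss.
    cache_layers = [
        ("browser cache", {"logo.png": b"cached bytes"}),
        ("ISP cache",     {"index.html": b"cached bytes"}),
        ("CDN edge node", {"video.mp4": b"cached bytes"}),
    ]

    def fetch(resource):
        for name, cache in cache_layers:
            if resource in cache:
                return name, cache[resource]      # served from some cache along the path
        return "origin server", b"fresh bytes"    # full miss: go all the way back

    print(fetch("video.mp4"))    # ('CDN edge node', ...)
    print(fetch("search?q=hn"))  # ('origin server', ...)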

these will all be negotiated, bundles will be introduced, and you'll kill the promise of a free and unfettered internet where we all have equal footing

I think this assumes a difference in performance expectations that isn't reflected in reality. The difference between data coming from Dallas, or San Jose, or Chicago, or coming directly from your ISP data center is measured in the tens of milliseconds. I don't think this could realistically be expected to result in "packages". Who would sign up for a "Google/MSN/Yahoo/eBay" Internet that merely promised "30 ms faster!"?
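
Back-of-envelope, using rough distances and the usual two-thirds-of-c figure for light in fiber (these numbers are guesses, not measurements, and they ignore routing hops and queuing):

    # Round-trip propagation delay over fiber.
    KM_PER_MS = 200  # light in fiber covers roughly 200 km per millisecond

    def rtt_ms(distance_km):
        return 2 * distance_km / KM_PER_MS

    print(rtt_ms(2500))  # content served from ~2500 km away: ~25 ms round trip
    print(rtt_ms(50))    # server inside the ISP's own data center: ~0.5 ms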


I agree with Lessig that there shouldn't be a problem as long as access to fast pipes is merely a matter of money. But I fear this is a slippery slope. It wouldn't take long until different types of content were priced differently (porn) or until network providers became subject to blackmail by fanatics asking for censorship.


Historically, the most likely censor is the net neutrality enforcer itself, the FCC.


[deleted]


Couldn't anyone come at least close to the same advantage just by signing up with Akamai?

Besides, this isn't about making other sites slower so that yours seems faster; it's about building additional physical infrastructure so that your site actually is faster while not degrading anyone else's performance in any way.


That's like saying ESPN's marketing budget gives them an unfair advantage over my fantasy sports startup. Cash, and the ability to exploit it, is certainly an advantage, but if it weren't, the market wouldn't function.

As long as I can still get reasonable speed connections (like what I do now) I can compete, even if they're a little faster. I just have to make a better product. Now if they're slowing my service down to unusability, that's an unfair advantage.


How much is a "little" faster? Is that guaranteed? It seems like letting Wal-Mart control the toll booth on the turnpike. It's a dangerous first step.


It seems like a closer analogy would be allowing Wal-Mart to build their own private roads to Wal-Mart stores.


You may be right... until we see more details about how G intends to set their access up, we may not be able to get the analogy quite right.


If they're simply doing what Akamai is doing, then they're not degrading my service, which is fine by me. If a user visiting my site gets the same quality they'd get from any other site that isn't on a major CDN, that's fine. I can still win.


It's the slippery slope, or the "next step," that investors will want G to take that concerns me. I stumbled upon this since my first post above: http://savetheinternet.com/=faq --a pretty good summary-- and it is certainly something to keep our eyes on. Something the recent economic turn of events has convinced me of is that without oversight, investors can pressure companies, even benevolent companies, to do evil. It's our job as citizens to put safeguards in place to deny them the opportunity to do so.


I've not seen a good, concise, intuitive definition of 'network neutrality'.

Under the abstract definition "you should not be able to make special contractual arrangements with telecom companies so your services are faster than your competitors", yes, this is a violation of 'neutrality'.

If the only special contractual arrangements prohibited are those implemented in router software, then no, installation of edge-service-accelerator machines is 'neutral'.

But it looks like a loophole to me. In both cases, company G pays telecom T a certain amount $X, and thus gains a competitive advantage in reaching people downstream of T.

Whether that competitive advantage is sold from T to G via router software configuration (S) or physical-space cooperation (P) should be irrelevant.

In fact, method 'S' might be more economically and environmentally efficient. There's no mixing of operations/facilities. Each party has incentives to maximally scale what they do best -- G big energy-efficient datacenters, T big divisible pipes -- and lesser physical space/power demands are placed at network bottlenecks.


One of the weaknesses in the pro-net-neutrality arguments is that they prescribe what the network should be like in too much detail. Net Neutrality advocates don't allow for the possibility that significant and unanticipated improvement could come in the future from innovations in network technology.

When internet access is provided "like a utility" (one of their catch phrases), your network providers will be every bit as innovative as your water and sewer company.


Have we gotten any significant and unanticipated improvement in the Internet in the last 5 years?

The value of, say, a neutral 100Mbps symmetric FTTH utility is large. What is the value of the future breakthroughs of the unregulated Internet? How do we measure and compare these?


OTOH, if the last mile is congested then router prioritization would really benefit Google, but with edge caching their packets still have to fight it out with all the others.


What an amazing press hit for the telcos. They just completely pwned the WSJ. So badly, though, that there's a chance the paper will turn against them once they realize how they've been used.


The way I see this, Google wants to put servers in the edge ISPs, aka the telcos, in an effort to save everyone money including those telcos. Google becomes a paying customer, the ISP in question uses a lot less Internet bandwidth and a lot more internal bandwidth, for which Google will pay. This is probably a power play by the ISPs to try to get the highest price from Google that they can.

I doubt it's a latency play; if you were already colo'd at the right places, you're talking about sub-millisecond benefits here. This is about money and lowering the load on the Internet's backbone, which translates into time and money to upgrade it, which leads us back to money.

So I agree, the telcos used the WSJ here. But not to block Google, to command a higher price.


Akamai and Highwinds have been doing the same thing for quite some time.

Edit: Oops, wmf noted the same thing below.


I think that was the first time I saw you write the word "pwned."

Hopefully the WSJ fixes this article (they consider edge caching a violation of network neutrality; clearly it isn't, because it doesn't slow any other site down).


"... Lawrence Lessig, an Internet law professor at Stanford University and an influential proponent of network neutrality, recently shifted gears by saying at a conference that content providers should be able to pay for faster service ..."

The WSJ also did a job on Lessig, who replied with "The made-up dramas of the Wall Street Journal" ~ http://lessig.org/blog/2008/12/the_madeup_dramas_of_the_wall...


Apparently Lessig is less than happy about their reporting too - http://lessig.org/blog/2008/12/the_madeup_dramas_of_the_wall...


Really, it sounds more as if Google wants cohosting than a "fast lane" or other special treatment.


I was just finishing my post on the topic of GOOG and net neutrality when I refreshed HN and saw this as the top link. When taken in the context of how GOOG arbitrates access to businesses via AdWords, it becomes even more nefarious.

http://www.manyniches.com/entrepreneurs/google-net-neutralit...


I for one knew the WSJ had become a PR dumpster when they published the favorable article on Cutco knives. I haven't had respect for it since; I guess the buyout was fatal.

link: http://online.wsj.com/article/SB121789140861111649.html


Now the next argument is going to be what exactly is "Caching" and when exactly does "caching" intrude on "net neutrality"?



