A big reason our wireless infrastructure sucks (and won't improve a whole lot) is that we devote a huge amount of the best spectrum space to broadcast radio and television. Signals sent on TV- and radio-band frequencies travel farther and are less vulnerable to interference than cell phone signals, LTE, or just about anything else except for some channels reserved for military communication. We could send a whole lot of data over the air more robustly than we do right now, but that spectrum is being used to support obsolete technology.
The FCC auctioned off or gave away a lot of prime spectrum to radio and TV providers back when it wasn't nearly as precious as it is today, and they adopted old, inefficient formats that need an enormous range of frequencies to transmit very little data. Do you know anyone who has used broadcast television over the past ten years? I don't, and I don't know many people who do either. Yet we still use a whole lot of really good VHF space for wasteful analog communication. Meanwhile, some of the most important things, like Wi-Fi, are stuck with some of the crappiest spectrum around, because everything else has been allocated already.
Things are improving, albeit slowly. The switch from analog to digital broadcast TV is opening up a whole lot of space, much of which is being used for WiMAX. But radio uses up even more space than TV did, and there are no plans on the horizon to change it at all.
actually, i know several people who still use both broadcast TV and radio.
a major issue that many people seem to miss is that all communication necessarily self-selects for people who can receive it. today, in what many people consider to be the "internet-age", those of us who use the internet tend to forget about those who do not (or simply miss them, if we do remember). and the people who don't tend to be those who are either disabled (because it is difficult for them to make use of the tech), or those who can't afford it.
while i'll definitely agree that we need to take another look at how we're allocating the spectrum, let's not forget as we do so that there are people who depend heavily on tech that is only obsolete for us personally, not for the world as a whole.
I have no problem with broadcast TV, although ubiquitous fast internet service could make that unnecessary. What I have a problem with is the use of really good spectrum space for analog signals inefficiently. TV now is a lot better than it used to be, but radio still uses up a huge band of frequencies, and it could use a lot less if we switched to digital.
There's a whole community on Reddit called Cord Cutters that uses Internet streams and broadcast TV to replace cable television:
http://www.reddit.com/r/cordcutters/
I dumped cable 2 years ago due to the high cost, and relish my broadcast television.
Tradeable broadcast permits would be an improvement, allowing companies with valuable new technologies to purchase spectrum rights from companies whose products had become valued less in the marketplace. Imagine what the country would look like if you had to apply to a regulator every time you wanted to buy or sell a piece of land.
The problem is the labels themselves which have been rendered meaningless. At a minimum, the US carriers should just cite the theoretical maximum rate. But maybe Consumers Union or someone of that ilk should create an objective, real world rating system that would reflect upload, download and latency speeds.
This is largely the result of cynical MBAs running companies. Can you imagine Google trying to water down terms to this degree -- much less just inventing buzzwords for "new technologies" out of whole cloth and cynically marketing them?
The big issue is that the US is so big. They talk about the new 4g in nordic countries. That would be like Verizon rolling out screaming fast 4g only in California, and leaving the rest of the country on 3g. That just wouldn't work.
Building a 4g network across the entire US is a big deal.
> That would be like Verizon rolling out screaming fast 4g only in California, and leaving the rest of the country on 3g. That just wouldn't work.
Well if that were the case, shouldn't they have rolled out 4G at least in California & New York? Ironically, those are the places with the worst cell service.
Actually, California & New York have 2.2 times the population[0] while having 2.1 times the GDP[1], so I really don't see how that's all that relevant.
Or how about we look at their relative land areas[2] (not counting Greenland) - California & New York only have 41% as much area as the Nordic Countries.
The term "big" doesn't really mean anything. The US has a higher population density. Per person served, it would be cheaper to cover the US than the Nordic countries.
This makes Sweden, Norway and Finland roughly comparable in density to Colorado, Maine, and Oregon, which we tend to think of as relatively spacious states. By contrast, each is about the size of California, which has 5-6x the population density.
While that is true, European cities tend to be more compact, with more 'empty space' between them. When I moved to the US, the concept of 'urban sprawl' was completely new to me.
This should increase the meaningful population density, as there is little need to have high bandwidth in uninhabited areas.
How exactly is it a big deal? You managed to build a GSM network across the entire US. Then you managed to build an EDGE network across the entire US. Then you managed to build a 3G network across the entire US. Why can't you now all of the sudden build an LTE/"real 4G" network across the entire US? This suite of excuses is the same song of complaint that the american operators return to every year. Oddly, not a single operator in any European country has made a squeak about any of the mobile network generations. Ever.
I live in one of the largest cities in the USA. AT&T's 3G sucks here. If I go 15 miles west of here, it's suddenly way faster. This is entirely because AT&T doesn't invest in backhaul infrastructure. It's got nothing to do with the density of the country, and everything to do with the lousy infrastructure investments of the wireless carriers.
Wireless in this country sucks because we don't demand better. If it's socialist to want my government to go after these bloodsucking bastards and make them provide better service for the exorbitant fees they charge, I'm content to wear the red armband everywhere.
Actually, European carriers are often required to let other providers use their network infrastructure at wholesale rates. Hence, if they are expensive, they will get competition from within their own network.
Contrast this with the network monopolies in the US where every (both!) carrier can just make up prices and there is nothing any customer can do about it. This is most certainly why prices are so high in the US and why carriers see little incentive to upgrade their infrastructure.
Does anyone here know, given current frequency allocations, what the theoretical maximum connection speed is? That is, allowing for improvements in hardware, how fast is really fast on a cell phone?
It's been way too long since my signals class, and I'm not even going to try since I know I'd get it wrong (a whole lot of good that education did me, huh). But I do know that it's proportional to the bandwidth available.
Careful reading of these [1] equations may yield you an answer. Flashbacks to college are an unintended side effect.
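To make the "proportional to bandwidth" point concrete, here's a minimal sketch of the Shannon–Hartley limit those equations boil down to. The 20 MHz channel width and 20 dB SNR below are illustrative assumptions, not any specific carrier's allocation:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: C = B * log2(1 + S/N),
    the theoretical maximum error-free data rate for a channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 20 MHz channel at 20 dB signal-to-noise ratio.
snr = 10 ** (20 / 10)                      # 20 dB -> linear ratio of 100
cap = shannon_capacity_bps(20e6, snr)
print(f"{cap / 1e6:.1f} Mbit/s")           # ~133 Mbit/s
```

Doubling the bandwidth doubles the cap, while doubling the SNR only adds about one extra bit per symbol, which is why the spectrum allocation itself matters so much.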
I think there are a few extra layers to this problem that aren't frequently brought up.
1) People believe they need cell/data service and would never cancel it regardless of how bad/slow the service is. How many San Franciscans or New Yorkers canceled their iPhone service? The answer: not enough. Will we as a country continue to pay $30/month for slow speeds and spotty coverage? Hell yes we will.
2) I don't think most people would notice faster network speeds if they came. They would notice better coverage, but would most of us be able to tell 50 Mbps from 5 Mbps? I have a feeling the speed at which my phone loads websites right now is limited by the CPU, not the network. And with 5 GB caps (if we're lucky), we can't use our networks for anything more intensive like video streaming. What would be the point of a superfast network if it takes 20 minutes of downloading to use up my monthly allotment?
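The back-of-the-envelope math on that last point is easy to check. Using the 5 GB cap and the two speeds from the comment above (decimal units assumed for the cap):

```python
# How long does a monthly data cap last at a given sustained rate?
def seconds_to_burn_cap(cap_gb: float, rate_mbps: float) -> float:
    cap_bits = cap_gb * 8e9               # GB -> bits (decimal units)
    return cap_bits / (rate_mbps * 1e6)   # Mbit/s -> bit/s

for rate in (5, 50):
    mins = seconds_to_burn_cap(5, rate) / 60
    print(f"{rate} Mbit/s: 5 GB cap gone in ~{mins:.0f} minutes")
```

At 50 Mbit/s a 5 GB cap is gone in roughly 13 minutes of sustained downloading, so the "20 minutes" figure is, if anything, generous.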
4G is not what it's advertised to be. My favorite is the marketing scheme behind 4G LTE (Long Term Evolution), which is the same as saying "ooh, we're thinking about 4G, but it won't actually be here for a while."
The best point is that the speed doesn't matter so much: since downloads are capped, your sustained throughput over the month can work out to almost nothing. I don't think Sprint is this way at the moment, however.
I usually get about 2 Mbps on my Verizon 3G, which is faster than my home ADSL connection. I've thought of my wireless as very fast, though perhaps not from a global perspective.
2 Mbps is great, I remember having ADSL. However, with cable internet I've maxed out at 14.5 Mbps on a 16 Mbps advertised connection, although I'm actually testing at peak usage time from a wireless connection, so in ideal conditions I might actually get what they're advertising (until I hit their bitcap, which will be spot on 16 Mbps).
I wonder if a strong public sector has something to do with the fact that European and Asian countries always (supposedly) have so much better networking infrastructure.
Actual competition between providers is probably the major reason. It's amazing how consumer choice can motivate corporations to invest in infrastructure so they can gain marketshare.
China has a huge public sector, a ton of money, and deplorable networking infrastructure. I think it has to do with the size of the country more than anything. Russia's probably not any better than the US either.
I have an iPhone 3G, a 3Gs and a Tmobile MyTouch 4G.
Both iPhones are a complete joke. To the point of rage WRT ATT service.
The tMobile, while not having 'H' coverage everywhere in SF, is LIGHTNING fast by comparison.
There are several other factors, aside from network, that play into this. The OS and hardware on the iPhones are just slow and old by comparison. Even on my home Wi-Fi network, pages are slow to render in the browser.
(Even simply launching the SMS app on the 3G iPhone takes as much as 10-ish seconds to display text)
The MYT4G also has no progress indicator on the sending of SMS - where the iPhone does -- and it does the classic "Load to 90% quickly, then pause forever, if sending at all"
Finally, I have had the iPhone since launch day, and have gone through 9 separate handsets in that time. On every single one of them, the 3G and signal bar indicators have been an utter lie. When it claims to be online, it has, a large percentage of the time I have been a user, been unable to "activate cellular data network" for various reasons, or given me "call canceled" so many times I am fed up.
The "call canceled" sound is heard at least 3 times a day minimum.
In closing, if I could have apple and AT&T reps in the room - I would like to kick them in the balls.