It's a rant, so I'm not going to critique this post too much, but I'd like to call this out:
[...] Intel is desperately trying to figure out what to
do to combat the phones and tablets that are eating them
alive from the ankles up. It is pretty obvious that the
company both doesn’t understand what the problem is and
is actively shutting out all voices that explain it
to them.
I don't think this is true. Intel certainly understands the market and where it's headed. However, they are committed to x86/64. What Intel is doing, in my view, is taking a series of huge but calculated risks. They seem to be betting that:
- Laptops will stick around and have Intel Inside for quite a while. The market may be boring, but it will be there for years. Corporate America helps.
- Servers won't be switching to ARM any time soon (I'd argue this is the riskiest bet).
- The desktop and enthusiast/gamer PC market will be around for a while, and also won't be switching to ARM any time soon.
So all of these "shoo-ins" buy them time, and I believe they think that in time they can pull off the biggest risk of all:
- Intel is betting that the biggest differentiating factor is and will be performance per watt. They are willing to gamble that they will eventually eclipse ARM cores in this area. In their view, if they have an x86/64 core that trounces competing ARM architectures in ppw then phone, tablet, and set top manufacturers won't have a problem putting those chips in their devices.
Granted, I'm not saying I think Intel is 100% correct or that they'll succeed with their long term bets; I just don't think they are as clueless as this rant makes them out to be.
No doubt about it, though, UltraBooks DO suck.
EDIT: I'm going to revise my statement on UltraBooks. Not all of them suck. In particular, the Lenovo Yoga is fantastic.
What they are missing is that it's not performance per watt, but performance per watt per dollar.
Intel may be able to build lower power/faster chips but their foundries aren't cheap, neither are the thousands of engineers Intel puts on each SoC, and technology scaling is only getting more and more expensive.
It's a classic case of the innovator's dilemma. Intel are optimized to sell very fast chips that cost hundreds of dollars, but the performance of fast-ish chips that cost a few dollars has almost caught up to them, and Intel simply can't compete without a complete restructuring. A further complicating factor is the fact that ARM is a weird many-headed chimera [1] that Intel simply can't kill the way they did with x86 competitors like AMD.
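To make the metric concrete, here is a minimal sketch (TypeScript, with purely hypothetical chip numbers used only for illustration) of how performance per watt and performance per watt per dollar can rank two parts very differently:

```typescript
// Hypothetical numbers for illustration only; not real benchmark or price data.
interface Chip {
  name: string;
  perf: number;    // arbitrary benchmark score
  watts: number;   // typical power draw
  dollars: number; // unit price
}

const chips: Chip[] = [
  { name: "big x86 SoC (hypothetical)", perf: 100, watts: 10, dollars: 200 },
  { name: "small ARM SoC (hypothetical)", perf: 40, watts: 4, dollars: 20 },
];

for (const c of chips) {
  const perfPerWatt = c.perf / c.watts;
  const perfPerWattPerDollar = perfPerWatt / c.dollars;
  console.log(
    `${c.name}: perf/W = ${perfPerWatt.toFixed(2)}, ` +
      `perf/W/$ = ${perfPerWattPerDollar.toFixed(4)}`
  );
}
// With these made-up numbers both chips tie on perf/W (10 vs 10),
// but the cheap part wins by 10x once price is divided in.
```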
Not too sure what you are referring to by "Ultrabooks" (something ARM specific?), but I have an "Ultrabook" and I absolutely love it. It's my work machine, my gaming machine, and my web surfing machine. It may even become my Steam Machine.
It's light, powerful, and compact. I can throw it in my backpack and forget it's even there. What's not to like about Ultrabooks?
(For anyone wondering, it's a Dell XPS 13 running Ubuntu)
My Thinkpad X300 running Linux is about as perfect a PC as you can get[1], except for the missing SD card slot. Then Apple came out with the Air and Lenovo just lapped it up. I'm sure the X1 Carbon is nice, but it weighs the same as my X300, has a chiclet keyboard and costs about $1500 more at current prices.
The X300 was the Ultrabook before the word Ultrabook even existed :)
Not saying Intel is ignorant of these facts, but the Microsoft/Intel hegemony is shrinking. Not dying, shrinking.
1. Apple and Google built an OS. Intel isn't willing to go after software. Are they smart to limit themselves? Microsoft is at least pretending to make hardware – alienating all the OEMs in the process – and Intel doesn't have what it takes to make software?
2. Cheap laptops have traditionally been Intel's enterprise play. Dell/HP/Lenovo are still selling 1366x768. Is Intel serious about their integrated graphics, while their customers are still being outfitted with 1366x768?
3. The server market is not a safe place to hide. Talk of how Intel is safe on servers ignores the consumer market and Intel is obviously not going to ignore the consumer.
So are we down to a single supplier for all our computers (Apple)? What will you do when Apple screws up? But more to the point, Intel had better watch out before Apple just ditches them entirely.
Just to add to the discussion, Intel has a few chips in Tizen. I don't think it will ever succeed in any significant way, but it seems they're at least testing the waters.
Their customer service sucks. When I bought my last Thinkpad, I had to buy it 3 times over the course of two weeks before they finally built it, and then wait a month to have it shipped once the order actually went through. It was a solid machine (except for the speakers and screen), but I'll never buy another device from them. Instead, I'm typing this up on a fruit company device.
I've not had any problems myself buying new machines for my company, but personally I tend to buy 2-3 year old ThinkPads off eBay. They are less than 20% of the retail cost and are always in perfect condition.
I'm typing this on a T400: 2.4GHz Core 2 Duo, Radeon HD, 8GB RAM, 1440x900 screen, 9-cell unit with a 3G card built in and a Windows 7 x64 Pro license sticker, purchased in absolutely perfect unused condition for £145 (!). The battery had done 11 charge cycles, to give you an idea. Chucked a Samsung 840 Pro in it for £102 and it's a perfect machine.
I have 2 spare ones (T61's) lying around as well so I always have a spare handy.
This puts me in a better position than a new Lenovo or fruit purchaser.
I'm a firm believer of letting someone else pay for the immediate depreciation in value! :)
Uhm, I searched on eBay and found it for 199€ (+49€ for 8GB and +29€ for UMTS, so a bit more). The seller seems to have many of them; unfortunately it's in Germany (I'm in Europe too, but I want a QWERTY keyboard).
I had the exact opposite experience. When I chatted with a service rep, not only did I get help with the configuration, but she was friendly, sent me a personalized email with the details, helped me with the purchase process, and even gave me a discount because I didn't want Windows (I prefer Linux).
And to be honest, most of the help was unsolicited, but it was friendly and helpful so I accepted it. My experience buying a ThinkPad was overwhelmingly positive, and I also love the machine (I ponied up for the 1080p screen, which is fantastic).
Yeah, what exactly are we comparing this against? Because if you know of something that has a nice screen, a long battery life, an excellent keyboard, and the ability to run decades of software aimed at helping me develop and get work done, then please let me know.
However, if you're saying that ultrabooks suck at consuming content compared to tablets, all I have to say is "well duh."
On a tablet, with touch. Development will be fun!
Oh, also no rooting allowed, no multi-window, and multitasking that freezes background tasks, just to make things more fun.
Another fun fact: the OS for tablets and phones is compiled on x86, every time, all the time. Compiling on ARM for ARM is just way slower.
> No doubt about it, though, UltraBooks DO suck.
> EDIT: I'm going to revise my statement on UltraBooks. Not all of them suck. In particular, the Lenovo Yoga is fantastic.
I suspect neither you nor the OP have been paying attention lately? Neither had I until I started looking. I got a Samsung Series 9 earlier this year and I'm very happy with it. I think many people would love it, but they don't know it exists. Same goes for a few of the others.
Any bet that the PC (i.e. the laptop) is going away always needs to ask on which platform the comments on the website it's posted to are being written.
I'm willing to bet some here are typing on a tablet, but I'm sitting here in the kitchen with my laptop.
Ugh. Change tack[1], which is a sailing reference[2]. As for the actual content, I feel this analysis lacks nuance. Mobile is booming, of course, but the PC is not dead, nor will it be dead five years from now. There are a hundred use cases for which a desktop or laptop is the only practical solution. Fantasize all you want about businesses abandoning real machines for iPads; reality begs to differ.
"Change tack" is like "would of" -- people haven't made the connection between the written and the spoken word. Then you add almost-homonyms and it gets worse. I am often amused to find people "flaunting the law".
> During this time Windows 8 came out and PC sales dropped 15% in the first full quarter after launch.
I don't think it's all Windows 8's fault. The average desktop PC is just too powerful.
I've been using Visual Studio 2010/2012/2013 with an i3 and an SSD for years now and I rarely run against any sort of performance bottleneck.
To compare what sort of performance requirements I have: in the project that I work on I have a solution with 28 projects that takes about 50 seconds to build from a clean build. Visual Studio takes care of incrementally building the projects during normal development, so usually I'm looking at ~5 seconds to build then launch the debugger.
I have absolutely no need to upgrade. No need = no sale.
I'm using Windows 8 as my operating system. It takes one step forward and one step backwards. I'm looking forward to Windows 8.1 but there's nothing so seriously wrong with Windows 8 that I need 8.1.
When I'm sitting in front of my PC and using Visual Studio, I'm not thinking "I wish this was actually a docked tablet". I have an iPad for mobility.
PC sales are probably undergoing a bit of a course correction as people who are satisfied with tablets buy tablets instead of PCs. But I suspect PCs will be around for a long time to come and, until that day, there's nothing for them to "[come] back" from.
Same here. 4 years ago I bought 2 desktops, one for work and one for home. I quit contracting and got employed, so both sit in my office at home now. I still use them, develop on them, do everything else on them and they are fine. I won't be buying a new desktop for at least another year, if then. And I CERTAINLY won't be 'upgrading' to Win8.
Yeah, I've been telling myself I'd get a new laptop when power consumption for modern chips reaches that of the ultra-low voltage pentium 4 in my current laptop (Asus UL30A with about 10 hours of actual battery life).
But each time I look at the offerings, I conclude that my $700 PC from 2010 is perhaps not superior, but is certainly adequate enough that I'm not interested in forking out $1k+ to get something with a comparable battery lifetime.
Instead I start to look at the Chromebook with Ubuntu on it as a much less expensive option with the primary thing I want (battery life) combined with an almost-real computer that does basically everything my older laptop can do.
But even then, sunk cost and all. It's not enough to meet the performance of my three year old laptop, they have to exceed it. And it's amazingly still not there yet.
I don't know what exactly Intel has been optimizing, but it sure isn't anything that matters to me.
I would really reconsider your upgrading decision. It's easy to get past the start screen-ness or spend 5 minutes setting up a 3rd party taskbar and making sure opening files (e.g. jpg, pdf) default to the "non-Metro" applications. After that, you'll have an OS that is functionally identical to Windows 7 with better boot times (non-fresh upgrade install from 7 to 8 HALVED my boot time), an even more stable kernel (I've literally NEVER had a BSOD since upgrading from 7), and better performance.
The FUD surrounding Win8 is kind of crazy. You want to know the easiest, fastest way to launch an application? Hit Windows key, type application name. Same on Win7, same on Win8...hell, it's the same on Ubuntu.
Through the jump list of the application you opened that document in recently? Obviously this requires you to know both what you opened it in and have it pinned to your taskbar, but I really like that feature for reopening files I recently closed on a per app basis.
Ah, jump lists. I used Win7 since it came out and never found those. I only discovered them since I've been on Win8, through some random mention and a video. They aren't intuitive or very well known. People at work now see me use them and every time are surprised and ask me how I did that.
> The FUD surrounding Win8 is kind of crazy. You want to know the easiest, fastest way to launch an application? Hit Windows key, type application name. Same on Win7, same on Win8...hell, it's the same on Ubuntu.
It's actually, IME, noticeably slower than mousing in most cases on Windows 7.
Could do all that (and fork over $100s), OR I could sit still and have most of that. That's my point; it's not attractive (enough) for the money, AND it's damn annoying I have to refit so much just to get back to work.
Add to that, the unmitigated gall of assuming I wanted any of that worthless stuff, and there's the emotional part.
Reminds me of a coffee cup I bought at Wal-Mart for work. It had a transparent plastic sticker on it that would not come off, without leaving a gummy stain. Didn't come off in the wash. Think about it: I buy something NEW and shiny, and the first thing I have to do is clean shit off of it? Maybe buy some other product to get the marketing crap to come off? Of course I respond emotionally, just as if I got a new car and found someone had dumped in the back seat.
Actually, slight correction: you can use spotlight preferences to set the order in which various matches are displayed. Every now and then this is useful.
Spotlight et al have a problem - I build multiple branches of a project. The executables are all the same name, different paths. Mac doesn't distinguish them for me - I have to browse to the one I want every time. Which is a click-fest.
Oh well... This rant makes little sense and has a lot of anger in it. The phrase "The PC is over and PC sucks" appears several times with little explanation other than citing the growth of other markets. The truth is there is no replacement for the PC, and there doesn't seem to be a serious one coming any time soon.
People can't make movies, edit images properly, use a compiler, debug, use a nontrivial spreadsheet, etc. on phones or tablets. Until that changes, the desktop PC won't die. They might not be as popular as before, nor have the same upgrade cycle as before, and they might have lost relevance as a growing market, but they are far from dead.
Well, that's a very popular position taken recently in the media. "the PC is dying... Tablets will replace everything we know... look, the sales are going up, it means it's a zero sum game and PC share will go down to ZERO!".
As you mentioned the future is fragmentation, certainly not a monolithic tablet-only future. There are still too many incentives to keep using PCs for many, many usages.
Also, for me this is all very familiar. As a PC gamer, for years, maybe a decade, I have been hearing about how PC gaming is dying or dead. All the while I've been playing the best games, which are usually not released on consoles, with the best graphics and performance plus the controls and setup that I want.
Yeah, and I've been playing in Full HD for a while now, while all the consoles out there rely on crappy upscaling to fake Full HD. It's been a while since I last took my Xbox 360 or PS3 out. And I'm far from being convinced by the PS4 and Xbox One, even after playing with them in the last week.
I don't agree with the "people can't create" thing with tablets.
I do a podcast entirely on my iPad, I also record music on it - and in both cases significantly prefer it to doing the same on my computer.
I also sketch stuff (although a decent digitiser would help) and my daughter records and edits films on it as well (she says it's too fiddly editing video on the Mac). I write numerous blog posts and I've written one essay on it (using an external keyboard). I even used it for coding (well, not really, just as an SSH client to a Linux box, but mosh over the inbuilt 3G made it extremely convenient for doing non-UI work).
I've not had to do any spreadsheet stuff on it, and I can see why that would be a weakness. Nor have I had to do any Photoshop-level image editing (I have done simple image editing).
But for most "creation" tasks I find the iPad to be competent, and in some cases (especially audio editing) to be superior to a "computer".
Of course, none of the stuff is "professional" level creation - but that still fits perfectly with Steve Jobs' cars vs trucks analogy - most people don't need that level of control.
Well, true all that. Except, my non-technical parents won't be buying any more PCs. They can do everything they want on their ipad. And I'm pretty sure they don't want media editors, compilers or complex spreadsheet editors either.
Raspberry Pi, Arduino and their board-level Maker-centric ilk. It's a viable flea-market ecosystem now that everybody already has at least a computer of some sort, so the learning machine doesn't have to be that do-all hub-of-activity machine, it just has to connect to it.
Having GPIO pins to play with also helps with the learning. The Boca Raton model of computing device was a bit stifling for homebrew hardware extension because of having to accommodate the de facto standards of the IBM PC/AT model (RS232 serial, Centronics parallel, CGA/EGA/VGA display, DIN5/PS2 kybd/mouse) to get a hackable connection going.
> The phrase "The PC is over and PC sucks" appears several times with little explanation other than citing the growth of other markets.
There are some non-market explanations in the article and I could personally think of many more.
The PC could be way better than it is in technical terms and mostly (but not only) on the software side. The main problem is that once Windows reached the de facto monopoly, it had little incentive to innovate and instead had reasons to stay backwards compatible.
My own thoughts on the negligence and indolence of the PC industry are full of rage. But this guy makes my rants seem a little tame. I love it!
I believe I have an unpopular opinion about desktop PCs. The conventional thinking is that desktop computing is boring because a modern PC does everything it is intended to do just fine. That may be true, but the problem is that the industry is not interested in establishing new usage patterns—new things the PC should do.
At the end of last year, I started a series of rants about how modern technology sucks [1] with particular emphasis on the frustrating stagnation of desktop computing and the bothersome way every new portable computing device wants to be a center of attention.
I was pleasantly surprised that the author of the linked article hits the target squarely when he lists off what PCs need. The first item: better displays. He may be speaking more about laptops (and they are deserving of the shame), but allow me to rant a bit about my preferred computing medium—desktops.
The stagnation of desktop displays is, and has been for a decade, the crucial failure of desktop computing. Display stagnation is the limitation that allows all other limitations to be tolerated. It is the barrier that leads the overwhelming majority of users (and even pundits!) who tolerate mediocrity to declare everything else—from processors, to memory and GPUs—as "good enough." I absolutely seethe when I hear any technology declared good enough (at least without a very compelling argument).
Desktop displays, and by extension, desktop computing is so far from good enough that it should be self-evident to anyone who observes users interacting with tablets or mobile phones(!) while seated at a desktop PC. Everything that is wrong with modern computing can be summarized in that single all too common scene:
1. Desktop displays are not pleasant to look at. They are too small. They are too dark. They are too low-fidelity. And they often have annoying bezels down the middle of your view because we routinely compensate for their mediocrity by using more of them, side-by-side.
2. The performance of desktop computers is neglected because "how hard is it to run a browser and Microsoft Office?" This leads to lethargy in updating desktop PCs, both by IT and by users ("I don't want the hassle"). In 2013, I suspect many corporate PCs in fact feel slower than a modern tablet or even mobile phone.
3. Desktop operating systems are actively attempting to move away from (or at least marginalize) their strong suits of personal applications and input devices tailored for precision and all-day usage.
4. Desktop computers--and more accurately personal home networks--have lost their role as the central computing hub for individuals by a misguided means of gaining application omnipresence: what I call "the plain cloud." This is because no one in the desktop industry (Microsoft most notably) is working to make personal networks appreciably manageable by laypeople.
5. Mobile phones and tablets are often free of IT shackles and therefore enjoy more R&D (more money to be made).
Desktop displays stopped moving forward in capability in 2001, and in large part regressed (as the article points out) since then. Had they continued to move forward--had the living room's poisonous moniker of "HD" spared computer monitors its wrath--I believe we would have breathtaking desktop displays by now. In that alternate universe, my desktop is equipped with a 50+" display with at least 12,000 horizontal pixels.
Desktop computing needs to leverage immersion (without nausea; VR goggles need not apply, yet). Large form-factor super-high-definition displays would bring all manner of new technology needs with them:
1. Gesture controls.
2. Ultra high-bandwidth wired networking (win for wired network folks) to move super high definition files.
3. Ultra high-capacity storage.
4. Extremely fast processors and GPUs to deal with a much greater visual pipeline.
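For a rough sense of why items 2 through 4 follow from such a display, here is a back-of-the-envelope sketch (TypeScript; the 12,000-pixel width comes from the comment above, while the 16:9 ratio, 24-bit colour, and 60 Hz refresh are my own assumptions):

```typescript
// Back-of-the-envelope bandwidth for an uncompressed video feed to a
// hypothetical 12,000-pixel-wide desktop display.
const width = 12000;                        // horizontal pixels (from the comment above)
const height = Math.round(width * 9 / 16);  // assume 16:9 => 6750 lines
const bytesPerPixel = 3;                    // assume 24-bit colour
const refreshHz = 60;                       // assume 60 Hz

const bytesPerFrame = width * height * bytesPerPixel;
const bytesPerSecond = bytesPerFrame * refreshHz;
const gigabitsPerSecond = (bytesPerSecond * 8) / 1e9;

console.log(`~${(bytesPerFrame / 1e6).toFixed(0)} MB per frame`);
console.log(`~${gigabitsPerSecond.toFixed(0)} Gbit/s uncompressed`);
// Roughly 243 MB per frame and ~117 Gbit/s -- far beyond 10 GbE, which is why
// such a display drags faster links, storage, and GPUs along with it.
```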
Such a computing environment is a trojan horse for today's tablets: it turns tablets into subservient devices as seen in science fiction films such as Avatar. The tablet is just a view on your application, allowing you to take your work away from the main work space briefly until you return. I say trojan horse, but that's not quite right because I actually want this subservient kind of tablet very much. I do not want a tablet that is a first-class computing device in its own right (even less do I want a phone to be a first-class computing device). I only want one first-class computing device in my life, running singular instances of applications for me and me only, and I want all my devices to be subservient to that singular application host.
For the time being, that should be the desktop PC. In the long haul, it could be any application host (a local compute server, a compute server I lease from someone else, or maybe even a portable device as envisioned by Ubuntu's phone). But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views.
One thing desktop displays desperately need is a way for each pixel to become brighter than its surrounding pixels at will. A lot brighter. Like, 50x brighter.
Dynamic brightness range is a necessary step for writing a 3D renderer that makes you feel like you're looking out a window. 256 levels of brightness aren't nearly enough.
We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824. It doesn't need to be that fine-grained. What we need is the ability to overdrive the brightness of specific pixels. That way sunlight filtering through tree leaves will actually give the impression of sunlight filtering through tree leaves.
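A minimal sketch of the idea, assuming a renderer that works in linear light: keep scene values as unbounded floats, let "overdriven" pixels exceed 1.0, and only clamp (or tone-map) at the very end for displays that can't go brighter. The pixel values below are made up for illustration.

```typescript
// Sketch only: linear-light HDR pixel values, where 1.0 is "normal white"
// and anything above it asks the display to overdrive that pixel.
type HdrPixel = { r: number; g: number; b: number }; // linear, unbounded floats

// Sunlight through leaves: most of the frame is dim, a few pixels are ~50x brighter.
const shadedLeaf: HdrPixel = { r: 0.05, g: 0.2, b: 0.05 };
const sunGlint: HdrPixel = { r: 50.0, g: 48.0, b: 40.0 };

// Today's 8-bit pipeline: everything is clamped into 256 steps,
// so the 50x glint and plain white become the same pixel value.
function to8Bit(p: HdrPixel): [number, number, number] {
  const q = (v: number) => Math.round(Math.min(1, Math.max(0, v)) * 255);
  return [q(p.r), q(p.g), q(p.b)];
}

// A display that could overdrive would instead accept the raw linear value
// (or a tone-mapped version of it) and make that pixel physically brighter.
console.log(to8Bit(sunGlint));   // [255, 255, 255] -- indistinguishable from white
console.log(to8Bit(shadedLeaf)); // [13, 51, 13]
```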
Tangentially related, it reminds me that OLED has utterly failed to become a thing on the desktop, and it breaks my heart. It was a decade ago when I read that OLED was the next hot thing and it would bring unprecedented contrast and brightness to displays.
Today, I like OLED mobile phone displays.
But my Dell U3014s are disappointing crystal-over-backlight garbage. Not only that but expensive crystal-over-backlight garbage.
OLED will come. It's just now finding its way into TVs, and still costs a fortune... When you think about it, a 5" OLED screen comes in a $700 phone, and most PCs cost about that. Not many people would be willing to pay $2,000 for a laptop with an OLED screen. Some people maybe, I know I would, but most wouldn't.
I think in 2-3 years you'll see Samsung Chromebooks and Ultrabooks with OLED screens...
It's not just the number of brightness levels. On an LCD display, if you brighten the backlight by a factor of 50, a black pixel will be about as white as a white pixel with the backlight at a brightness factor of 1. One of the most frustrating things about LCD displays is their inability to completely transmit or completely occlude light. This can be seen most obviously by trying to watch a movie in a dark room.
"We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824."
Actually, what is needed is an anatomically defined range - the "true brightness" - like there is for the 24-bit "photographic" (tetra-chromatic) range of colors. This would be a limit defined by the anatomical limits of the human eye. There is a limit of absolute darkness (relative to the eye) and there is also a limit on the brightness of light that the human eye can safely be exposed to. Within this defined range there is a limit to the granularity that the human eye can distinguish. I am not aware if such a definition exists, but I am aware that a display respecting "true brightness" is impossible (for the mere fact that the light reflected from our faces sheds upon the dark portions of the display, and those dark portions cannot perfectly absorb that external supplement of light).
It's funny how your opinion contrasts against mine.
For what I've seen, displays have become a commodity and that's a good thing. I can go buy any kind of display, choose a size and I'm probably fitted with more pixels than I ever need, the panels are neat, flicker-free and flat, and the best part of it is that they cost next to nothing. You buy a laptop and you choose the size based on how much hardware you want to carry around — not because you need this huge screen because all laptop screens are "good enough" as you mentioned: I haven't had a laptop with less than 1440x900 for... a little less than a decade. And the resolution has never been inadequate for browsing, coding, drawing, writing and watching movies which is what I mostly do.
This is completely the opposite of what we had in the 90's, when a 15" CRT was the baseline, you were always a bit short of resolution, and you never had the money to buy that huge one-cubic-meter-in-size display that could do your 1280x960 at 60Hz or something, and for which you probably had to upgrade your graphics card and probably your PC too. That totally, totally sucked. One could live with the basic resolution and screen size, but I remember the agony of something better always being almost within reach. These days screens are something you don't think about twice. Everything is, again, dreadedly good enough, and if you need something professional you can get that too, and it probably won't cost you the price of a small car.
In the last 10 years or so I haven't considered once whether I should soon upgrade to a "better" display or to a laptop with a "better" screen. That's bliss, IMHO.
Get a phone with a higher resolution screen, and maybe your opinion will change. My phone is 4.7" with a 1280x720 display. I now prefer to use my phone for a lot of reading over my 1920x1080 17" laptop display because the text is so much crisper that the laptop screen is annoying. Even more so the 23" screen I'm writing this on, which seems extremely fuzzy and pixelated.
My next phone will likely be a 5" phone with 1920x1080 given current prices/specs for full HD display phones - at that point I expect my laptop and desktops will annoy me even more. Current relatively low end tablets are now starting to get substantially higher resolutions than that.
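A quick sketch of the pixel-density gap being described, using the sizes and resolutions from this comment (TypeScript; the 23" monitor is assumed to be 1920x1080 as well, since its resolution isn't stated):

```typescript
// Pixels per inch from resolution and diagonal size.
function ppi(widthPx: number, heightPx: number, diagonalInches: number): number {
  const diagonalPx = Math.hypot(widthPx, heightPx);
  return diagonalPx / diagonalInches;
}

console.log(ppi(1280, 720, 4.7).toFixed(0)); // ~312 PPI phone
console.log(ppi(1920, 1080, 17).toFixed(0)); // ~130 PPI laptop
console.log(ppi(1920, 1080, 23).toFixed(0)); // ~96 PPI desktop (assuming 1080p)
// Roughly a 2-3x density gap, which is why the bigger screens start to look
// fuzzy once your eyes are used to the phone.
```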
I actually have such a 720p screen on my phone. It's hard to compare because I look at the phone at a much closer distance than the laptop screen. I usually read using my laptop because the phone can't fit as much text on the small screen anyway, regardless of the number of pixels in it.
I can only say that I don't see the pixels on my laptop screen either and text looks "good enough" (as in, not pixelated and I can see serifs which also look natural) and that the phone would be quite unusable if viewed from the same distance as I look at my laptop screen.
This is a really good point. When I've heard others cite their satisfaction level with desktop displays, I ask them to go through an experiment.
I'll put aside the size portion of the debate since I can't really fathom how anyone could argue against a desktop display that can fill their entire field of view, assuming such a display were available and economically priced. I think those who argue for small displays on the desktop enjoy being contrarians, and they bring up matters of taste and style (such as "I don't want something so large on my desk.").
But in terms of clarity, the argument posed by those satisfied by the status quo usually is composed of these points:
1. Users sit at a distance of 2 to 3 feet from a desktop display.
2. Therefore, high-DPI is not meaningful because the human eye cannot perceive additional clarity at that distance.
3. High-spec IPS LCD screens are good enough.
So the experiment is simple enough:
1. Find an iPhone 5+, Galaxy Nexus+, or Lumia 920+. Something with a high-DPI wide contrast-ratio display. Open up a web site or document or whatever.
2. Do the same on your PC.
3. Hold the phone up side by side at the same distance. Zoom the phone's text to match the text size seen on the desktop display.
4. Behold how the phone's display is considerably more readable. For most combinations of phone versus desktop display, the phone's display will be crisper, have better contrast-ratio, and better color accuracy.
My Lumia 920 (not even an OLED, just a nice high-DPI IPS display) utterly shames my Dell U3014 IPS desktop LCDs.
Reading text rendered with high-DPI, high-quality displays at 2 to 3 feet is an absolute delight. Not only that, you can lean to see greater detail. Yes, 2 to 3 feet is the typical distance, but sometimes I like to get closer to my desktop display to work with fine details.
I would pay a steep premium (but obviously I wouldn't break the bank) for a desktop display that matched the clarity of my phone's display. If I could just tug at the edges of my mobile phone and make it magically grow to fit my desktop, I would be so happy.
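For anyone who wants to test claim 2 above with numbers instead of eyeballs, here is a rough angular-resolution sketch (TypeScript). The ~60 pixels-per-degree figure commonly quoted for 20/20 acuity and the example viewing distances are my assumptions; the U3014 figures come from its 2560x1600, 30-inch spec and the Lumia 920's from its 1280x768, 4.5-inch spec.

```typescript
// Rough angular resolution: pixels per degree (PPD) of visual angle.
// ~60 PPD is often quoted as the limit of 20/20 vision (assumption, not a spec).
function ppi(w: number, h: number, diagIn: number): number {
  return Math.hypot(w, h) / diagIn;
}
function ppd(ppiValue: number, distanceInches: number): number {
  // One degree of visual angle covers about d * tan(1 degree) inches at distance d.
  return ppiValue * distanceInches * Math.tan(Math.PI / 180);
}

// Dell U3014: 2560x1600 at 30", viewed at ~24" (2 feet).
console.log(ppd(ppi(2560, 1600, 30), 24).toFixed(0)); // ~42 PPD
// Lumia 920: 1280x768 at 4.5", held at ~12".
console.log(ppd(ppi(1280, 768, 4.5), 12).toFixed(0)); // ~69 PPD
// The desktop panel sits well under the ~60 PPD threshold even at 2 feet,
// so "you can't see the difference at that distance" doesn't really hold.
```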
You're just wasting battery life on having so many pixels you can never even see on such small screens. Do you feel happy being manipulated by phone manufacturers' marketing?
I have a 22'' Full HD screen on my desktop and I never complain about it. Maybe you just have a crappy LCD one.
The point is that it does not make that much of a difference in visual quality, but it does impact your GPU performance and your power consumption significantly, so the benefit/cost ratio is very, very low.
They haven't stagnated, they've regressed. 1920×1200 Dells were 15". 24" desktop displays that used to be 1920×1200 are now almost all "full HD", which cuts 120px off the bottom and doesn't add a thing. IBM had >4K displays in 2001 (the T220 was 3840×2400; 4K is 3840×2160. And the T220 was a 22" monitor too, not a 40" TV); try to find one even as good today.
I had one of IBM's P70s, bought second hand in 2001. It was amazing - it did 1600x1200 at a 60Hz refresh and it was able to push gaming resolutions (1024x768) at a 120Hz refresh. I still cannot find as smooth a gaming experience.
It's really hard to find a decent 15-inch laptop with a high-resolution screen - unless it's targeted at gamers.
For some reason, a few years ago the PC industry decided that 720p is good enough, as if people only watch movies on their laptops. Trying to squeeze a full-featured IDE into 720p is like looking at the screen through a keyhole.
You raise an interesting issue. But is there enough money in large displays?
Let's look at use cases, and focus on the average person (assuming that the gamer market is too small to justify opening a top manufacturing line for large displays).
Use case 1: movies. Even assuming there's value in 4K resolution (which is not certain), you still need to align and improve so many industries to make it work. Really hard.
Use case 2: games. Most games plain folk play are casual games, and playing Angry Birds at high resolution maybe doesn't justify spending that much money. And even if there are ideas for such games, it's still a chicken-and-egg problem.
Maybe the right strategy is to sell 10" retina displays, used at a short distance, to let people experience quality cheaply, build the relevant industries, and then offer large retina displays.
I believe you missed the target here. The argument against 4K gaming is more about the fact that GPUs have to draw this increased number of pixels; this is why the current-generation consoles only render at 720p.
Gaming also hasn't peaked in any way at the current level of resolution, as only the very top end of cards handle AA, AF, etc. in games. GTA 5 shows the need for every ounce of processing power by the fact that it looks horrible on both consoles.
As far as movies are concerned the jump to 4K has already begun, but I do believe you're right and there needs to be an alignment between the industries. Blu-ray still competing with DVDs shows that; hopefully the new console generation will do enough to drive a serious shift in physical media.
Man do I agree with you. I think the disconnect is that consumer tablet/mobile devices lack the ability to be views for our desktops. PCs are powerful enough to be the locus of consumer computing; we just need a good architecture to make them so.
Not to mention: when it comes to real productivity, you can't beat the desktop. It isn't the profitable sector for manufacturers, but it's still the productivity toolset.
We just need to make them exciting to the overall ecosystem.
This is one way WebRTC will be really useful (since, IMO, consumer IT is mostly Web or mobile apps). With signalling services in place, we'll be able to run hosts from desktops without registering domain names or establishing fixed IPs. The user-centric, desktop-hosted systems can grow out of that.
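As a rough sketch of what that could look like (browser TypeScript; `sendToSignallingService` and `onSignallingMessage` stand in for whatever signalling channel you run and are hypothetical, not a real API):

```typescript
// Sketch: a desktop "host" offers a WebRTC data channel to a remote view,
// using only a lightweight signalling service instead of a fixed IP or domain.
declare function sendToSignallingService(msg: unknown): void;        // hypothetical
declare function onSignallingMessage(cb: (msg: any) => void): void;  // hypothetical

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});
const channel = pc.createDataChannel("app-view");

channel.onopen = () => channel.send("hello from the desktop host");

// Trickle ICE candidates out through the signalling service.
pc.onicecandidate = (ev) => {
  if (ev.candidate) sendToSignallingService({ candidate: ev.candidate });
};

// Apply the remote answer / candidates that come back the same way.
onSignallingMessage(async (msg) => {
  if (msg.answer) await pc.setRemoteDescription(msg.answer);
  if (msg.candidate) await pc.addIceCandidate(msg.candidate);
});

// Kick things off by publishing an offer.
pc.createOffer()
  .then((offer) => pc.setLocalDescription(offer).then(() => offer))
  .then((offer) => sendToSignallingService({ offer }));
```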
> Desktop displays stopped moving forward in capability in 2001
Are you kidding? In 2001, it was pricy to get a 15" 1280x1024 LCD monitor. Are we in the same state of affairs today? Would you be willing to wager that the quality, viewing angle, etc of that LCD monitor compares favourably to today's models?
I am speaking of the state-of-the-art of 2001 versus today. IBM's 2001-era T220 was pushing 3,840 horizontal pixels for something around $7,000. A massive investment to be sure. But had computer monitor R&D continued unhindered by the taint of "HD", T220-class monitors would have eventually become mainstream and the state-of-the-art would have marched forward still. I would have bought a "T220" in, say, 2002 or 2003 at $2,000. But the advance of technology seemed to halt and that never became an option for me.
Yes, prices have (mostly) come down. Yes, we have IPS versus TN. Thank goodness. But in terms of the top-tier specification of displays—resolution—we've stagnated and regressed. As others in this thread have pointed out, laptop resolutions in 2001 were higher than they were in ~2011. I don't own Apple products, but I give them credit for ending the tyranny of HD. Thank you, Apple!
Also, prices are not uniformly better, or at least they were not until very recently when some Korean manufacturers decided to shake up the monitor cartel. The consumer high-end in particular had been stuck with 2560x1600 30" monitors at ~$1,100 for about seven years.
The price has moved down, but the tech hasn't moved up, at least not a whole lot. I bought a Sony 20" flat Trinitron in 2002 from a dotbomb property sale. It was capable of 1600x1200 and up to 120Hz. Traditionally, the high-end displays come down in price and the volume sales pay for R&D on the next high-end displays. Today this is not the case; look at the displays going into $300 tablets compared to desktop displays. Tablet resolutions are rocketing while laptops are static at 768, 900, or 1440 (if you pay for it). The GPUs are more than capable of 4K now, yet how many 4K displays are available?
Of course the problem here is the software. It's not that it doesn't exist, it's that tools to use and create such a system easily don't exist.
We can make applications which do what you describe, but it is a hell of a lot of work. It would require a custom affinity system (which desktop is the tablet attached to, how do I authenticate it), a custom server-client system (how do I talk to the desktop regardless of how I am connected, including NAT traversal, etc.), a custom user interaction system (how do I deal with two users using the same program at once?), etc., for every device. Of course the key problem is that tablets, phones, etc. use a myriad of different technologies.
So the solution today, the high-intensity one, is that the application developers spend hundreds of thousands of man-hours writing tens of different versions of each piece of code. This is how Netflix, Google, Facebook, and others do it. These applications have integration across every device: TVs, smartphones, desktops, tablets, smart boards, cars, etc. This is not cheap, this is not easy, and this is not very useful.
The better solution (one that I am working on personally, albeit slowly) is to build tools that allow us to write one large piece of multifaceted application code against a myriad of idealized DSLs (one for rendering, one for communicating, one for user identity, etc.), which can then be auto-magically (note the magic) ported to every device. This is what Unity does, as a domain-specific example.
The poor man's solution is of course HTML, but that doesn't work well with current desktops due to NAT.
I agree that software is a principal constraint. As a programmer myself, I find implementing what I would like to see as well beyond my capability, even if I invested significant time. As you point out, it would require a holistic ecosystem-wide approach. This is why I'd like to see a software titan like Microsoft attempt something along these lines. They have the resources to unify all the devices in a person's life into a single computing experience.
I very much appreciate what you're attempting to do and I think it will measurably improve matters. However, where I personally differ is on the key point I made above: I don't want applications to run on multiple devices. I want applications to be singular instances available everywhere. If I begin typing an e-mail at my desktop, I want to view my e-mail application on my tablet or phone and see the exact in-progress e-mail—to the letter—available and interactive in real time. As I type on any device viewing my e-mail application, the letters appear on all other views instantly. Presence information (for the purposes of notification sounds and the like) could follow whichever device I interacted with most recently.
A premise of MVC that we have in large part forgotten is that views can be plural and diverse.
As you point out, countless developer hours are used porting applications to myriad devices. I'd rather conserve that effort. Have the computational guts be on a high-performance, high-connectivity, well-known (e.g., x86) application host. Then only the views need to be made plural, in a manner similar to (but obviously more comprehensive than) responsive web design. More comprehensive because some devices will be touch-enabled, some will be large, some will be small, some with a keyboard, others without, etc.
All that said, I do like what you're talking about and building. I look forward to seeing that project come to light!
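A toy sketch of that "one application instance, many live views" idea, using Node and the `ws` WebSocket package (the draft e-mail mirrors the example above; everything here is illustrative, not a proposed protocol):

```typescript
// Sketch: one application instance holds the state (a draft e-mail);
// every connected device is just a view that echoes keystrokes in real time.
import { WebSocketServer, WebSocket } from "ws";

let draft = "";                              // the single source of truth
const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (view: WebSocket) => {
  view.send(JSON.stringify({ type: "sync", draft })); // late-joining view catches up

  view.on("message", (data) => {
    const msg = JSON.parse(data.toString());
    if (msg.type === "append") {
      draft += msg.text;                     // mutate the one instance...
      for (const other of wss.clients) {     // ...and push it to every view
        if (other.readyState === WebSocket.OPEN) {
          other.send(JSON.stringify({ type: "sync", draft }));
        }
      }
    }
  });
});
```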
Oh I completely agree with what you describe. (I use exactly the MVC pattern you describe when I work on contracts to pay the bills.)
Personally, however, I maintain that even at the level you describe, the tools just don't exist to reduce the time significantly unless you give up native application feel, i.e. streaming the view and just sending the input back. The web is a great start to such a system, but it still has a lot of hurdles to cross if it wants to be that platform (including performant (i.e. native) 3D rendering, NAT traversal for user-run applications (just switch to IPv6 already!), saving data client-side, threading!, different input and display methods like you mentioned, etc.).
I also have other, fundamental, problems (more like pet peeves) with the way the web is designed to work. But my main motivator is the massive problems I have with programming languages. I should probably start a blog...
Sitting behind a triple monitor setup with large displays I don't think display tech is what is holding us up, neither are gesture control or higher bandwidth.
It's simply 'good enough' for just about anything that I'd want to do with a PC (and then some). The problems - if you can call them problems, I'd prefer to call them challenges - are to reduce the need for all these interfaces.
The best computer would work like Siri does, only it would be really intelligent. That sort of quantum leap would transcend any mere improvement in hardware. All this eye candy and visual stuff does not allow me to work any more productively than I could 20 years ago with just a 15" green phosphor CRT. Displays are not the problem.
I think you have to realize that nobody wants to make "desktop computers" because the margins you have to make in order for that to work aren't available at the moment.
What I have seen is that it started out being computers (look at the Altair for god's sake, with its switch panel!) and then it became a computer and some 'office applications', then 'office applications' and you could develop on it, and now 'applications.' The best way to develop for a tablet or Win/RT system is on a workstation with some development tools.
What is interesting to me is that the "PC" overtook the "Workstation" (Which was very much a dedicated development device) and killed it. Now as the "PC" market moves to more turnkey solutions, nothing has yet backfilled the void being left behind.
I see that many folks believe that the workstation of the future runs a virtual instance on the other side of a network connection; your "terminal" is a couple of specialized applications running on your application-running device. I can easily see Google taking the Pixel and making it work sort of like 'spaces' used to work on the Mac, except when you zoom into your 'Workstation' space with your xterm windows, your debugger, and your documentation browser, it's really hosted by some RDP or VNC-like protocol to a service back in the cloud somewhere. It isn't a diskless workstation; it's a terminal with a really rich serial protocol that runs at 10 megabaud.
You claim "But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views."
And I suspect it will, if the price of doing that is better than the price of doing it "in the cloud" (or remoted to the network).
> Now as the "PC" market moves to more turnkey solutions, nothing has yet backfilled the void being left behind.
I don't understand. I can buy a very powerful PC at very low cost from any nearby mall. I can build one custom from components I buy at NewEgg or Amazon. I can get a Mac with its great fit and finish. I can choose Windows, Linux or OS X.
What void? Nothing went away. Workstations are better and cheaper than ever. Go buy one.
I would like a better screen, I'm not interested in the Metro interface, and the new MacPro has no internal slots. There's not much excitement happening in desktops/workstations, sure. But they didn't go away.
> I think you have to realize that nobody wants to make "desktop computers" because the margins you have to make in order for that to work aren't available at the moment.
Let's regress further. Why are margins so slim in the desktop computing market? Well, for starters, aside from Apple, there are no non-windows desktop vendors worth talking about. And Apple doesn't license their OS.
So the Windows tax has basically crippled the PC market innovation-wise. And Microsoft has done little to promote progression in PC standards in the past 10 years.
Microsoft and Apple have given up on the desktop market (Apple because they discovered they could print money by making smartphones, and Microsoft because they're a monopoly that gets their money, innovation or not).
I can imagine there being a large market for large, slightly curved or similar displays running on ultra-fast processors that seamlessly create environments like this http://blog.100percentgeek.net/wp-content/uploads/2012/08/de... without looking aesthetically ugly.
A sort of super-iMac, if you will.
They would probably be hugely popular among the IT/sysadmin/monitoring crowd too.
I definitely agree that desktops can have a huge win in terms of immersion. In addition to the ideal desktop screen being very high-res and very large, I think they would also use Parallax Barrier technology to show 3D imagery without requiring you to wear shutter glasses.
Perhaps once again Apple could be the one shaking it up. They already lead the charge with retina displays on the laptops, and we can hope it will come to desktop as well.
If the new Mac Pro is any indication, they might try to do innovative things on the graphics side. As they are in a position to force a boost in the graphics performance of all their hardware, they might be the only ones in a position to do so.
On the control side, I found the Magic Trackpad to be really different and a lot more usable for basic tasks and gestures (I'd use a mouse for gaming and a pen for drawing, but everything else is easier on the trackpad).
Too bad no company seems to be really advancing on the personal networking side. Google would have the hardware and software knowledge to pull it off, but as long as advertisement is their core business it wouldn't make any sense. For now a Synology-like NAS with pluggable apps seems to be the least cumbersome solution for having devices talk together.
Thanks! I have come to accept my ideas as a little crazy—perhaps if I were charitable with myself, I'd say unorthodox—so I appreciate your kind words. My blog is: http://tiamat.tsotech.com/
I don't get this article. It seems to be nothing but hearsay and opinion with no references to facts.
Also, the author's point just seems nonsensical to me. I have a big-ass powerful PC and I love it! I use it for gaming, game development, 3D modelling, and music making. I built it myself from components and set up the OS and software exactly the way I want, with Win7 and Crunchbang (Linux). No other device would be able to do exactly what I want like this.
Maybe most people don't want that, but most people have never been technically minded; if they're happier with tablets then that's fine.
Just want to validate this... I feel the same way. PCs are faster than they've ever been and cheaper than they've ever been. I just bought a custom smoking fast computer for my parents in an awesome tiny mini-ITX case with a fast SSD drive and an awesome i5 Ivy processor (I think 3570k?). Including monitor it was < $1000. I priced out even cheaper PCs with cheaper, larger cases for $300. I also have a big-ass powerful PC and I love it.
I'm on the complete opposite boat - I think the laptop experience generally sucks still. Battery life only recently got improved to a great point in the past few years, but performance is still generally lacking.
Meanwhile a great desktop lasts longer than ever, is cheaper than ever, and does everything extremely fast. I have a 5-year-old desktop that outperforms a lot of laptops out there, including my new Macbook Air & my work laptop (not even a month old), and that desktop pales compared to my half-year-old desktop (which cost maybe $200 more than the cheapest 13" Macbook Air).
I think part of the shift in the market is due to the great state desktops have become as long lasting devices (& thus declining sales), and some of the improvements on more mobile devices - I'm highly skeptical of any call that the desktop is going away anytime soon though, because the mobile experience is still seriously lacking in the sweet spot of performance, battery life, weight, and price.
What PC innovation? PCs are the same trash-can-sized devices they were two decades ago. They just get a little faster every couple of years.
Innovation will be driven by mobile (both phones and tablets). It requires a lot more innovation to build these smaller devices that operate all day on a battery.
That's only true because mobile started from scratch a few years ago. You just see innovation because this market (for "smartphones and tablets") barely existed 6 years ago. Obviously this will tone down very fast as performance stops increasing significantly. You can see that the latest ARM chips are only marginally more powerful than the previous ones in terms of operations per watt.
So, they will, also, get only "a little faster every couple of years." Do not kid yourself.
> PCs are the same trash-can-sized devices they were two decades ago
Do you actually visit your local PC stores regularly?
Because while it was once not unusual to see full-sized towers on display, these days it's rare to even see a standard ATX case. The typical form factor even for desktops has slimmed down massively, and a large number of the devices are now "all in one" models with the PC and monitor in one.
That's actually why I agree with you about mobile - the performance _most_ people need is low enough that the form factor has been steadily going towards mobile, and yet sales are still anaemic while mobile device sales have sky-rocketed. With the rise of Android "TV sticks", coupled with increasing support for mirroring/streaming the display from your mobile devices, and/or HDMI/MHL ports and bluetooth keyboards, the desktop is going to be redundant for most people within a few years.
The irony is that the desktop trend towards small devices, bluetooth keyboards and "all in one" models is accelerating the demise of the desktop more than it is propping it up, by making people used to wireless keyboards and not having a big box PC.
We are getting 4K displays this year. Granted, phones, then tablets, then laptops got high res displays first; but dammit, I really want my 31" 4K monitor. If the PC space starts to innovate again, or at least catch up to mobile, people might start loving PCs again and gasp...buy new ones.
And that's all they need to be. It's software that needs innovation, but that requires critical-thinking people, and people having power over processes and business. So before we can even let people do that, we first have to tell them how to think, and to keep secrets and fight each other, work against each other, build walls and checks for themselves, and deliver their value to stockholders while not doing the same types of land grabs via software and regulation that Microsoft & co. is empowered to do.
I think people are really underestimating where things like perceptual computing are going. Something Intel is also invested in.
But maybe people should start looking at which jobs require using a computer to get essential work done vs. not needing one and therefore not using it. People who really think that entire generations are not going to need computers to do work are seriously mistaken, especially in BRIC/developing countries. I don't think the question really is whether PCs are dying; the question should be what the hell can I do with a ~$1000 machine other than look at cat pictures. We can thank Microsoft mostly for that. Seriously, I think people really underestimate how turned off the entire industry is by Windows 8, especially when they need to upgrade and the only reasonable choice is Apple.
Remember, Apple is the only company that is actually increasing laptop sales (MBA models). Clearly there is a market; it's just not being served by the current parties.
I just recently purchased an Ultrabook; it was on sale and it was a great deal. It had what I wanted: super thin, long battery life, and most importantly a high-res display.
But just the other day I was annoyed with the fan running while I wasn't doing anything intensive. So I checked my power settings and was surprised that the Min processor speed on battery was set to 100% and the cooling policy set to active. So I changed it. Then a few minutes later I checked it again and it was back to 100%/active. I finally started doing some Googling and found out the f'ing track pad driver was the cause of those power settings getting reset! Several driver updates and a BIOS update and the machine is where it should have been to start with.
I can only imagine how many people out there have this exact machine, which is chock full of Intel's best frequency stepping and power management technology, and it's all completely disabled.
Wait, what? I don't feel the Apple experience is a lot better than the PC one, and I say that as someone who fully bought into the dream (I used a MBP as my main machine for 2 years, and an iMac as a desktop for that same two years). After those two years were up, I totally went back to PC, built my own desktop, bought a nice Lenovo laptop and called it a day.
However, I guess I really don't understand why everyone's getting so uptight about this. What we're seeing is not everyone in the world suddenly making a change to mobile for no discernible reason. It's not as though suddenly the winds changed and now the PC is mysteriously dying.
No, what happened is what always happens: someone made a better tool for the mass market to do what they want to do, which is to consume content and be entertained. Magic wireless Netflix screens that fit in their pockets are just better at that than PC's are and there's nothing wrong with that. It's evolution, and even as a PC enthusiast and power user, I'm more than glad to see people better able to do what they enjoy.
That's not to say I'll be giving up my hand-picked keyboard, large screen, and heavy tower, but I'm not too worried. There's always going to be money in serving the needs of people who get work done on computers, even if it's not quite as much money as it used to be.
> I don't feel the Apple experience is a lot better than the PC one
That comes across like you do think it is better, if maybe not by a lot. But then you write the following:
> I totally went back to PC, built my own desktop, bought a nice Lenovo laptop and called it a day.
I don’t understand. You hated your Macs so much that you bought 2 new computers? If it was just OS X that you hated, you could’ve simply kept the Macs and run a different OS. So what was it about the hardware that you disliked so much?
Well, after two years it was just time to upgrade and I decided not to renew with OS X. I've gone back to a Windows/Linux workflow and I really enjoy it.
I'm still hoping that moving the consumers to consumer-oriented devices opens up possibilities to make the unfriendly platform (the desktop PC) more poweruser and developer-friendly.
Total Apple fanboy rant. The latest Ultrabooks are superior to the Macbook Air IMO. They are faster, with better battery life, and cost less. I prefer Windows 8 and 8.1 over Mac OS X ML and over iOS 7. I will never buy another iPad or iPhone (I have an iPad 3 and an iPhone 5 atm) as I prefer the flexibility of my Ultrabook, and Android phones have leapfrogged iPhones in almost all aspects.
The article is a giant pile of stupid. Umm...I think my Air is a nice laptop. It's packed full of Intel stuff. Why the anti-Intel rant? He's dumb or a liar. Intel doesn't care whether Apple/Intel wins, or Microsoft/Intel wins, or ??/Intel wins.
Intel should care if Apple wins. Apple is in the best position to change architectures. Again. Ubuntu is probably in good shape. But Linux by nature can't really switch primary architecture in an afternoon quite the way Apple did.
I'm not saying Linux can't be compiled for ARM. I'm saying it's easy for Apple to just declare that all new hardware that runs OS X is now ARM. It's not possible (or remotely desirable) for Linus to one day say, that's it, ARM only.
What do users want and ask for vocally? Screens that aren’t garbage quality, resolutions that are not worse than mainstream laptops from 2007, SSD instead of error prone and driver dependent ‘hybrid’ garbage, an OS that isn’t grating to the user, decent Wi-Fi, good build quality, and a decent price.
Do they really? IMO most consumers couldn't care less about any of that; it's a tech-savvy minority that wants higher-quality screens and SSDs. That's exactly why we're seeing zero innovation in the PC monitor space: the market doesn't really care. It cares about price above all, which is what leads to the popularity of low-res screens and slow HDDs in the first place.
This 'the PC is dead' nonsense will come full circle eventually. Phones and tablets are PCs; we just haven't yet got to the point where we can satisfactorily dock them with a full desktop accessory set.
I personally see a scenario where everyone has a nice big LCD screen, full sized QWERTY, and probably still a mouse, in their study at home but carry their 'beige box' in their pocket. Just 5-10 years out imho. Unfortunately I think Windows is still positioned best to make this happen.
> Unfortunately I think Windows is still positioned best to make this happen.
Have you looked at the Ubuntu phone OS? It's designed to be the same software, hardware, and UX from phone to desktop. I don't necessarily like it, but it shows promise of doing exactly what you say Windows is best positioned to do.
Yep, and I'd put them at #2. I'm pragmatic; I don't necessarily think the best solution will triumph. One thing Ubuntu phone is doing wrong is dismissing Android. Dual booting is absurd: it doesn't promote sharing app data or a seamless user experience.
> Unfortunately I think Windows is still positioned best to make this happen.
As a user who has been very impressed by the combination of http://elementaryos.org/ and the fact that Valve/Steam has recommitted to Linux, I think Windows is on the downturn and Linux is on the upswing.
Let's hope so. As long as Valve keeps the desktop side of Steam and the Big Picture mode separate from each other we would be fine.
I really hope they won't make Microsoft's mistake and start trashing the desktop UI of Steam, giving those users the Big Picture Mode UI to use instead. Because, you know, "Everything must look and work identically, no matter how different the use cases may be."
Microsoft could have avoided all of this by just offering a small option during install: "Where do you want to install Win8? Tablet or Desktop?" and, according to that selection, installing the Metro UI or leaving it out. But their notion that everything from phones to tablets to desktops just has to have the uni-colored tiles everywhere certainly isn't doing the trick.
This is also the direction I think computing is taking. In the case of Apple, you might still have the Mac Pro, and perhaps MacBooks, but the iMac and Mac Mini will be replaced by iPad, which can connect to a monitor, keyboard, and mouse. On the iPad itself you will have the iOS experience, and in desktop mode you will have the current OS X experience. This is the inevitable outcome of the convergence between the two.
This article sucks. Why? Because it's not semi-accurate; it's totally inaccurate, everywhere.
And writing "sucks" every paragraph doesn't make it right.
A 10% speed improvement a year isn't nothing. The small battery improvement this year? We went from 6 hours of battery life to 13; it's "only" double, so it sucks! Heck, that's better than my smartphone with the screen on.
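To put that 10%-a-year figure in perspective, here's a minimal compounding sketch (the 10% rate is the only number taken from above; the rest is plain arithmetic):

    # Compound a 10% per-generation speed improvement.
    yearly_gain = 0.10
    for generations in (1, 3, 5, 10):
        speedup = (1 + yearly_gain) ** generations
        print(f"{generations} generation(s): {speedup:.2f}x faster")
    # 1 -> 1.10x, 3 -> 1.33x, 5 -> 1.61x, 10 -> 2.59x

Five "boring" generations still add up to a chip roughly 60% faster.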
The rest is on Windows, which is an OS, not the OS. (And it isn't even a _bad_ OS, despite the hate for Microsoft.)
What's actually happening is that the PC market is basically saturated with machines that pretty much do whatever anybody asks of them.
The market has pretty much plateaued. Pretty much everybody has a PC at home and work. Most households already have multiple computers. Heck, I know entirely non-technical powerwasher/gutter cleaner guys who have 2 or 3 computers. In fact, I don't know a single person older than 10 years old who doesn't have at least one Personal Computer of some kind.
Any commodity off-the-shelf PC will pretty much do whatever you ask of it (at least for most consumers). I used to replace my computer every year or two just so I could run modern software. I haven't felt compelled to do so for the last 6 years and even then I'm 50/50 on doing it. The rMBP my work issued to me is fantastic for virtualization, but unbelievable overkill for everything else I do (mostly email, word and web).
There's just not much of a reason to buy more machines outside of regular replacement rates due to failure and total obsolescence and new humans buying them as they get old enough.
It's not that PCs are going away; it's that the constant growth in the market has plateaued.
Everybody was hoping China, India, and Africa would explode as 3/5ths of the world's humanity moved into the middle class and needed computers, but the growth has been far slower than hoped, and these first-time computer buyers won't be constantly upgrading like previous markets did -- the market characteristics are such that it won't be a simple repeat of the '80s, '90s, and early 2000s.
Smartphones and tablets are an entirely new segment and still growing. That's why they're exciting: those markets are still building out and upgrading. But there are signs that those segments are flattening out as well.
Tablets and phones are awesome, but they're definitely not a replacement for a general-purpose PC. Even my mother and father, who're quite the luddites, regularly need capabilities that don't work well on a tablet -- like doing taxes. Even if those things were magically fixed and working awesomely tomorrow, they'd still want a bigger screen than a tablet affords.
PCs aren't going anywhere; it's just that the market has to shift from growing to sustaining (growing is infinitely more expensive, meaning loads more money sloshing around in the secondary markets). This is fundamentally the problem that both Intel and Microsoft are dealing with. Apple escaped it largely because they created new segments to grow into.
Heck, the one new market segment that PC makers did manage to get into, netbooks, they managed to screw up so badly that the entire segment was dead within just a few years. (If you think about where netbooks needed to go as a segment, the Surface Pro would probably be a reasonable outcome, except that market is totally hosed now and Microsoft has to rebuild it.)
Also it becomes even better by imagining it in Bane's voice.
Netbooks didn't get screwed up, it's just that what most people want from a cheap device is primarily content consumption which is done better by tablets.
The upgrade cycle with phones and tablets might be starting to slow as well.
Oh they did. Every new generation of netbooks was actually worse than the previous one (I don't even know how that was possible, but it happened).
Hard disk space converged on 320GB, RAM on 1/2GB, displays seemingly never went above 1024x768, and battery life plateaued at ~5 hours.
I adored my first netbook. My second was blah. By the third I was starting to think everybody in the PC market was certifiably insane. What on earth made them think they should release computers that get worse year on year???
A netbook that gets better every year for $250-350 will have a razor-thin profit margin. If it has to have Microsoft Windows Special Edition on it, 10-30% of the price tag goes to MS. Joe Average wouldn't buy a computer that doesn't run whatever brand name he recognizes, so Windows is a requirement.
A tablet isn't a "computer", so it doesn't need to run Windows in order to sell, it needs to run Facebook and Angry Birds. It can sell for $250-350 and have a decent margin, and then there are so many of them being made that the parts cost drops and the margin gets better.
A Google Nexus 7 now has everything except a keyboard in terms of netbook hardware, but it gives up no margin to MS and has a market big enough for economies of scale.
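Purely to illustrate the margin math: the $250-350 price band and the 10-30% Windows share come from the comments above; everything else here is hypothetical, so treat it as a sketch rather than real BOM data.

    # Hypothetical sketch of the Windows license cut on a cheap netbook.
    # Price band and license share are from the thread; no real BOM data here.
    for price in (250, 350):
        for ms_share in (0.10, 0.30):
            license_cut = price * ms_share
            print(f"${price} netbook, {ms_share:.0%} to Microsoft: "
                  f"${license_cut:.0f} gone before parts or assembly")
    # A tablet in the same price band keeps that $25-105 because it ships without Windows.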
Meanwhile, whatever Intel is pushing as an Ultrabook goes for $1000-1400, because it starts as a high-end desktop replacement and then gets features shaved away. Except nobody with any sense buys them, the usual bulk purchasers of laptops that don't make sense (Fortune 500 companies) aren't re-buying machines as often, and when they do, sometimes it's fruit machines, in offices that would never have bought those five years ago.
(Why? Because of you, dear HN reader. The alpha suits know that the alpha techs demand the best hardware, and they can see the logos glowing on the back of your screen better than you can.)
>A netbook that gets better every year for $250-350 will have a razor-thin profit margin. If it has to have Microsoft Windows Special Edition on it, 10-30% of the price tag goes to MS.
And if you count all of the payments they get for preinstalling crap, I think it cancels out most if not all of the Windows license.
Or they could just install linux. Some did (not enough though).
>Joe Average wouldn't buy a computer that doesn't run whatever brand name he recognizes, so Windows is a requirement.
The entire netbook industry was kicked off by the Linux brand name. So that makes little sense.
>A tablet isn't a "computer", so it doesn't need to run Windows in order to sell
Android and iOS are as much brands as Windows or Linux are.
>A Google Nexus 7 now has everything except a keyboard in terms of netbook hardware, but it gives up no margin to MS and has a market big enough for economies of scale.
Netbooks also had a market big enough for economies of scale.
Netbooks were seen as a threat by both hardware manufacturers, due to their wafer-thin margins, and software manufacturers, since they initially came out preloaded with Linux. The Wintel cartel quite deliberately killed them off.
Man, it takes some serious cojones (and brain damage) to have the audacity to sell a worse thing year after year, despite a better-performing, if immobile, alternative existing.
For those who didn't make it to IDF, it felt dead. There was very little attendance in most sessions and the expo floor was also pretty much empty. They actually moved food into demo areas so it looked like there was buzz. I'm pretty sure the "outside of Intel" attendee count was remarkably low.
PCs are dead, but Intel will be fine. Bay Trail will be the beginning of the end for ARM as Intel brings its massive lead in fab technology to bear on the mobile market.
Bay Trail has the vast expense of cutting-edge foundries and the entire expensive might of Intel R&D behind it, and it just got beaten by a CPU produced with far less engineering and R&D expense on a last-generation foundry.
Let's wait for a fuller set of independent benchmarks. There is a lot of speculation on RWT, and my conclusion is that nobody really knows what's causing the difference. Also note that a lot of these benchmarks use JS, which puts the quality of the JS engine into the equation.
That is very true. Also, Bay Trail still won a lot of benchmarks on said page.
I can't shake the feeling, though, that while Bay Trail is competitive and will be very interesting for many purposes (I'm looking forward to building a small home server around the new Atoms), it needs to truly trounce ARM to initiate an industry change.
The fact that tablet devices and phones have entered the market, reducing the need to do everything on a PC, doesn't spell the end of PCs. It just means they aren't the only go-to computer anymore, which is a good thing for everyone.
A rock-solid PC in the home, connected to a nice big monitor and other useful peripherals, is a good thing to have. Be it a compact PC, laptop, or desktop, Windows or something else.
"Post PC" is a stupid agenda-driven term. We live in a "post horse and cart" world, but the PC has no inherent limitations preventing it from evolving. If you bother to look, there's currently more enclosures, cases, and interesting "desktop" configuration variety for PCs than ever before, cheaper than ever before.
Well, I am currently shopping for some used ThinkPad T60s and T61s because I cannot stand the shite keyboards (and also, not infrequently, displays) that have taken over current designs.
This doesn't really speak to market trends, I guess, but making your products physically unpleasant to use probably isn't helping your cause.
I should delete this comment as a pointless rant... but, I'm shopping for 7 year old laptops, dammit. I want to type quickly and pain-free, and also have some vertical context without eyestrain.
I don't think PCs are dying. I think computing consumption is increasing so desktop productivity looks like it is declining.
I think when a dock for tablets or phones finally happens for consumers, they'll just get it. Desktop mode is not intended for using your fat fingers on a touch screen; "Metro" mode is for that. Windows 8 is all about getting off the bus in consumption/tablet mode, docking at your desk, and having your dual monitors, keyboard, and mouse light up as you go into productivity mode.
I don't really buy his arguments. I think the Surface 2 is a good example of where the PC and Windows 8 are headed. For most people, such a tablet, with the option to use it as a desktop PC through a docking station, is all the computing they need. The Surface 2 seems to do this job very well, and with Haswell it finally has decent performance and battery life.
In 5-10 years, I am pretty sure that real desktop PCs will be for professionals only, while most consumers will be using some mobile tablet/laptop hybrid.
"Then comes the hardware, you know the part Intel does. It sucks too. Why? Because for the last 5 or so generations it doesn’t actually do anything noticeably better for the user. Sure the CPU performance goes up 10% or so every generation, battery life gets better at a slightly faster pace, and graphics improving extra-linearly but that is irrelevant if you aren’t benchmarking."
In a sane world, this would be a feature, not a bug. PCs are now mature enough that you can buy a decent machine and expect that it will not be hopelessly outdated in two years. This is a good thing.
The problem is that hardware and software manufacturers have a mutually beneficial relationship whereby new software just won't function without that extra 10% hardware capacity you get from a new computer. Even if it's a word processor or a not-terribly-impressive game. (Remember "DirectX 10 requires the power of Vista", which required a much faster computer than XP?)
And the other problem is that doofuses like the article writer have been so thoroughly gulled by the planned-obsolescence treadmill that they actually think that's how it's supposed to be, and throw tantrums if this year's hardware isn't at least 10 times shinier and more sparkly than last year's.
> In a sane world, this would be a feature, not a bug.
Being able to get use out of older hardware is a feature. New hardware not providing any new possibilities is a bug, and that's where we've been for the last 5 years at least.
I think the core point of the article is valid. It's ridiculous that my $230 tablet has a better resolution than nearly every "high end" laptop. And apparently nobody has any ideas for using our tremendously powerful multicore CPUs and near-teraflop GPUs other than rendering increasingly bloated websites.
My point is, there's only so much benefit that one person can get from exponentially increasing processing power. Web browsers and word processors gain new bells and whistles, but the basic functionality is the same as it ever was. You can run a bajillion windows at once, watch Netflix at HD resolution, and not slow down. Games are near-photorealistic. What more do we want a computer to do, exactly?
You mention resolution--once the pixels are smaller than the naked eye can distinguish at typical viewing distance, everything else is polish and marketing. Yes, I know, retina displays are indefinably "crisper", or something. But you can't functionally cram any more useful information on to them, because the user won't be able to make it out.
We're seeing features like this because we've reached the point of diminishing returns on desktops. There aren't going to be any more massive game-changing jumps in raw processing power. In the 90s, we went from Wolfenstein 3D to Doom in a year and a half, and from there to Quake in another two and a half years--all of them gigantic, envelope-pushing leaps in gaming technology. In this century, we've gone from Half-Life 2 in 2004 to...what? Crysis 2? The latest Call of Duty? When was the last time a PC game got the same kind of uproar over graphics that Doom did? The degree of change that used to come once a year is now coming every ten. That's not because everything got boring, it's because we're getting asymptotically close to perfection, at least from a practical home-user standpoint. That's bad if you're in the business of convincing people they need new computers every year, but it's decidedly good if you're a user.
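To put rough numbers on the "pixels smaller than the naked eye can distinguish" point above, here is a sketch of the common 1-arcminute visual-acuity rule of thumb; the rule and the viewing distances are assumptions I'm adding, not the commenter's.

    import math

    def ppi_limit(viewing_distance_in, acuity_arcmin=1.0):
        """PPI above which pixels blur together for an eye that resolves
        acuity_arcmin arcminutes at the given viewing distance (inches)."""
        pixel_pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60))
        return 1.0 / pixel_pitch_in

    for distance in (12, 18, 24):  # rough phone, laptop, and desktop distances
        print(f'{distance}" away: ~{ppi_limit(distance):.0f} PPI')
    # ~286 PPI at 12", ~191 PPI at 18", ~143 PPI at 24"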
Have you ever used a retina display on a laptop? Every reviewer I've seen comment on it has said that they cannot go back, and not because of indefinable crispness. Also, some people do run their screens with 2-4x as much info crammed onto them; that would be the main reason I would use one.
The 1920x1280 Nook HD+ is now available for $149. I checked NewEgg and only 7 laptops met that resolution, and 5 of them are from Apple - the cheapest is $1399.
The problem here is Microsoft, not Intel. They never figured out a way to make sure that things looked consistently good on a high resolution display. When the OEMs all decided to piggyback on the TV industry and declared 1920x1080 was "high end", that nailed the coffin.
It appears to be a stream-of-consciousness rant (as all good rants are), so it is not quite as structured as one would like, but the tl;dr seems to be:
"Things in the PC space suck, and absolutely no one seems to be interested in making them suck less. Instead, they keep putting different shades of lipstick on the same pig and calling it 'innovation'."
Isn't this the same for most industries, and for pretty much everything? Genuine innovation takes lots of effort building things that people don't know they want yet, even IF they are in fact monetizable. And a LOT of things are not monetizable, or face serious difficulties gaining marketshare in environments with deeply entrenched players and expectations.
The history of computing has much more to do with "innovation" in monetization schemes rather than computing itself. Great things get built only when someone discovers some hackish way to get people to purchase it. The monetization itself usually makes the product/technology worse than it would be without the monetization (e.g. DRM, copyright, ads, preinstalled spyware).
I still run on a Core 2 Duo from January 2009 as my main machine (a laptop) and it is still plenty fast enough for me. I just don't find the latest generation CPUs to offer me anything that I don't have except doing things "a little bit quicker" but that isn't all that noticeable to me. I suspect this machine will die (in a way that makes fixing it financially unviable) before I out grow it. Visual Studio 2012 starts in ~1 second (warm startup) and performs great. The only issue I have is that Hyper-V (as part of Win8) requires SLAT which my CPU does not support which is annoying but I just use VMware Workstation instead.
For me there has not been enough improvement in CPU features since the Core 2 range was introduced for me to need to upgrade. Sure a Haswell would be nice from a battery life POV but as I am plugged in 99.9% of the time that isn't an issue to me.
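On the SLAT point above: Windows 8 and later report it in systeminfo's "Hyper-V Requirements" section, so a quick check is possible. A minimal sketch, assuming you run it on the Windows box in question:

    import subprocess

    # `systeminfo` on Windows 8+ prints a "Hyper-V Requirements" section that
    # includes a "Second Level Address Translation" line (Yes/No).
    output = subprocess.run(["systeminfo"], capture_output=True, text=True).stdout
    for line in output.splitlines():
        if "Second Level Address Translation" in line:
            print(line.strip())  # e.g. "Second Level Address Translation: No"

(Sysinternals Coreinfo will show the same capability in more detail.)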
I don't feel like the title matches the content of this piece. I thought it was going to primarily be about how users are shifting away from PCs (which it did touch on).
But really it was just an angry rant about chips not getting better, even though Intel makes the best chip on the market.
I just bought an overpriced Ultrabook with Windows 8 (Microsoft tax). The Windows 8 experience is indeed terrible for a touch screen; it seems like someone made an amateur touch-screen mod for Windows 7. The product isn't ready. I can feel the Steve Ballmer signature in this product.
I just wanted a PC because I thought it would be easier to install a traditional Linux on it, but then came UEFI. Oh, the humanity. UEFI is the most disgraceful scam the industry has ever pulled. How could they be so wicked?
I don't want to live in a world where the only good option is a monopoly of Apple machines and software, but the PC industry is not even trying.
1. It's not sucking if it's meeting 95 out of 100 of your requirements. My desktop can do everything my iPod can do, albeit differently. There is a very select group of tasks, e.g. reading eBooks or use as a drawing surface, at which the tablet excels but which desktops don't share. They are two different products, with one offering a miniaturized and inferior version of the other.
2. Users see laptops running desktop-relevant applications fast enough: games, spreadsheets, and programs like Adobe CS and MATLAB. The rest don't count, as they aren't motivation to buy a desktop.
PCs are not coming back in the sense that they won't see growth like they used to, but at the same time they're not going away. To be fair, most people don't actually need a PC. My wife uses a Nexus 7 as her main computing device, and loves it (she prefers it to a PC).
As an aside, Chrome OS devices are gaining a lot of traction... probably because they're more than adequate for most people's needs as-is, and developers can always switch on development mode for a full set of Linux-y features...
I don't understand why people don't like Ultrabooks. They like MacBook Airs, so what's wrong about creating slim and lightweight notebooks in general?
That some vendors produce crap is irrelevant. But there are nice offerings out there.
I've seen a Samsung in a shop recently that looked and felt great. They only screwed it up by offering a max of 4GB RAM, and their lineup is so confusing that I don't even know which model I would have to look at online. But that is not a general issue with "Ultrabooks".
My problem with Ultrabooks is what Intel is doing with the brand. At first, Ultrabooks were supposed to be laptops with good battery life, a high-resolution IPS screen, and an SSD. They also had to be thin, which I think is meaningless, but whatever.
However, it seems that this disappeared somewhere, and now a laptop with a 1366x768 15" TN screen and rotational storage (with a measly 16 GB "SSD cache" that does pretty much nothing to make it faster) qualifies, as long as it has Windows 8 and a touchscreen.
The brand lost its meaning; it used to stand for "Apple quality from someone other than Apple" (so we could buy a nice computer without supporting the walled-garden approach).
I was never under the impression that I could just buy any notebook that has an Ultrabook sticker and get the equivalent to an MBA. So I guess I don't really care about the brand - I care about getting a lightweight, powerful notebook.
I do, but I still don't like their approach on iOS, don't like the idea of them doing the same thing with OS X (which they might or might not do), and would rather give my money to some different company.
This is rather hard, though; we'll see how the Haswell Zenbooks fare. Or maybe the XPS 13s.
So many developers use Macs now that I would be surprised if Apple went this direction. I'm sure they want to go this direction--I'm just not sure they feel like they can without angering developers.
To be fair, all OS companies (Apple, Google, and Microsoft) want to go this direction. It would probably be enough to push me back onto Linux.
Yes, the PC seems to be stuck. I bought a desktop PC in 2005 and replaced it this year; I love the built-in graphics and more powerful processor, but anything extra is useless crap I don't want.
It's amazing how hard it is to find a screen with a decent pixel density and how expensive these average screens are.
I'm actually considering buying iPad screens off eBay and creating a wall of them to get what I need. How deplorable is that?
The PC needs to become the main machine of the house and to do that it needs to step up its game.
Argh, PCs never died. I am reading this on a PC in an office full of PCs. I can't develop code for other people's PCs on a tablet - I couldn't develop code for a tablet on a tablet, it would be horrible. I need a PC.
I assume this discussion is already more about the transition from desktop to laptop, than about the PC to Mac etc.
Have you looked at Windows on 27" Thunderbolt display?
The first time I saw that display with Mac OS X rendering on it, I thought: "Wow, that looks awesome. The glare from the glass is bothersome, so there's that, but placed right in the room it's totally fine, and well worth it for that size and clarity."
Then I went to the specs and saw that the PPI of this monitor was only 6 points above my own (1920x1080 at 21.5": 102 ppi vs. 108 ppi). Could 6 points really be such a differentiator?
Of course, it turned out it wasn't. I immediately tried Windows on the same Mac machine and there it was: everything looked almost as bad as on my own PC.
So, I asked the guy who owns the machine (he is a graphic designer) why there is such a difference, and he said: "Microsoft simply doesn't care enough about detail. Check the default icon sizes in the OS. OS X's icons on a regular-density (non-Retina) display are 4x bigger (512x512 vs. 256x256 on Windows), and on Retina this grows to 16x more pixels (1024x1024)."
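For reference, those PPI figures fall straight out of resolution and diagonal size; a quick sketch using the specs quoted above (the 27" Thunderbolt Display is 2560x1440):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch from resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(f"{ppi(1920, 1080, 21.5):.1f}")  # ~102.5, the commenter's monitor
    print(f"{ppi(2560, 1440, 27.0):.1f}")  # ~108.8, the 27" Thunderbolt Display
    # And the icon point: 512**2 / 256**2 == 4x the pixels; 1024**2 / 256**2 == 16x.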
So factor #1, IMHO, is that people ditch desktop PCs because of Microsoft's lack of flexibility and innovation. They decided: "If we have a monopoly over desktop computing, why should we care what our users say? They still buy it."
Factor #2 - Microsoft still has the numbers!!! They have locked the enterprise tight, and still don't care about "personal" user experience.
So we have this giant corporation that gets enough money and has no incentive to innovate and we hope it will bring us the great singularity of software and hardware we have all been waiting for...
And the last factor I will point out is relevant mostly to developers: Mac hardware runs it all. So having both a Windows and a Mac machine doesn't really make sense, in terms of both expense and simplicity of work.
Apple made the, arguably, "evil-corporate" decision to be totally in control of the hardware running its OS. So you, developers, have the option to buy:
- iMac, that sits tight on your desk + Macbook (probably Air) for your work on the go;
- Mini + Thunderbolt + Macbook (probably Air);
- Pro + Thunderbolt + Macbook (definitely Air after what you paid for the previous 2);
- Macbook Pro.
Well, as I see it, most developers are headed for the MacBook Pro, eventually adding a Thunderbolt Display.
For developers who need to support the Mac ecosystem, this is not even reason #3; it's #1. I had to make this transition and I am glad I did, because Mac OS X turned out to be no less than great, and although smaller in size, my 15.4" Retina display got me into the 21st century. :)
This is a good example of how Microsoft fails. They have actually fixed this problem, but their fear of changing anything, because of the enormous community backlash, means that this functionality is not the default. They are doing more with Windows 8 on tablets, but the desktop version in desktop mode is like Windows 7, which carries its legacy from Windows Vista, XP, etc., which is why the icons are tiny and don't adjust to DPI without you changing your settings, AFAIK.
My problem with Macs (I use one as my main dev platform at the moment) is that the unpolished features are extremely weak. As a technical person doing development work on a Mac, it's a constant headache vs. Windows: VS is ugly, but it's much /much/ MUCH!! more functional than Xcode (it has, what, a 15-year head start though?), even in areas where Apple traditionally excels. E.g. the tab interface in Xcode requires more, and less intuitive, user actions than the equivalent feature in Visual Studio, and it doesn't allow side-by-side docking of tabs (AFAIK).
The other thing about Apple and displays is that they cheat a little and use a very good color profile which is vaguely related to sRGB at best. Try calibrating an Apple monitor: it's basically impossible in my experience, even if you use their 'sRGB' setting. On the other hand, my well-calibrated HP monitor does look like garbage by comparison... This is also a problem for development, because most people not using a Mac or iPhone do not see what you see when you work on one...
Coming back? Where did they go? When did they arrive? I don't think there was ever a time when the PC enjoyed significant market saturation. If anything, the PC "bubble" is deflating back down to normal levels.
It may be popular and "obvious" to accept that Intel and Microsoft own the PC market and decide where it goes, but the reality is that the market makes the demands and Intel either meets them or doesn't. This was clear when AMD pushed 64-bit first and Intel adopted it. It is also illustrated by the fact that PC sales have declined along with the stagnation of Moore's Law. That last point seems counter-intuitive, but it shows that Intel can't force a market if it doesn't deliver.
Both CPUs and GPUs have been "as fast as they're going to be" for some time now. For some reason, next-gen GPUs are joining the camp where Moore's Law is more a threat to economics and security than a fruit of civilization: 10% year-over-year speed improvements, but doubled-up security and management overhead, added coupling, compiler-only APIs, and more silicon dedicated to hypervisors and management that should instead go to the programmer and his compiler.
IT has become a completely dysfunctional market at the macro scale. The demand side doesn't know what it wants or how to shop for it, and the supply side is too scared to deliver anything new.
At the micro level, those who know what they want are still taking it one step at a time, on their own feet, to their own drummer, but the mess that is the macro market is destroying knowledge and value like a wildfire. The few programmers who know what the Internet should look like, instead of one built to be profitable for device manufacturers, can't keep up with the mess big software is making of the collective wisdom of netizens and of the Internet's infrastructure, both physical and social.