AMD's master plan to topple Intel - Back to the top on a radical GPU (theregister.co.uk)
71 points by ableal on July 8, 2011 | 42 comments



Strangely, AMD has almost always had the better technical talent and products in terms of architectural design, but it always stumbled on Intel's business practices (see the Computer History Museum panel) and its own manufacturing weaknesses.

For the better part, they made a better x86 for its time: the 386, 486, the K5 (core), 3DNow! (all the SIMD Intel lacked, while the MMX patch in response was a joke), 64-bit (which Intel was years late to), etc. come to mind. I hope it will be different this time.


Intel wasn't just late to the game on 64-bit; EM64T == AMD64. The cross-licensing that Intel had been bemoaning for years finally worked in their favor.


Really? On what basis do you say AMD has always had better talent? The 386 and 486 were introduced by Intel, not AMD. Not to mention Core, USB, and other history found at http://en.wikipedia.org/wiki/Intel


I've been waiting for them to make more official announcements to this effect, and more details - there's been more than enough info to imply this is their goal for quite a while now. And I feel I must say: WANT. If this succeeds, it'll be a game-changer in almost every way.


I'd love to be able to go back to the diverse ecosystem we had in the 80's, but I don't see how an architecture that deviates significantly from the x86 norm could gain traction in this Windows world.


I don't think this is meant to replace the x86 ISA, but rather SIMD units like MMX/SSE and traditional GPU shaders. The article was a bit muddled -- I'm waiting for the Anandtech version.

Successful adoption will depend on good tooling. Auto-vectorization is really hard in compilers, mostly because current languages aren't really built for it. But if AMD were to introduce something purpose-built (eg, with a Matlab-like syntax), then that might compel devs who aren't assembly or GPU gurus to develop against this.
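
A rough illustration of the difference (purely hypothetical, nothing AMD has announced; Haskell's Data.Vector is standing in here for the kind of purpose-built array language meant above): the same computation written as an index loop the compiler has to analyze before it can vectorize, and as a whole-array one-liner where the parallelism is stated outright.

    import qualified Data.Vector.Unboxed as V

    -- Index-loop style: before vectorizing this, a compiler has to prove
    -- the iterations are independent (no aliasing, no carried dependences).
    saxpyLoop :: Float -> V.Vector Float -> V.Vector Float -> V.Vector Float
    saxpyLoop a x y = V.generate (V.length x) (\i -> a * (x V.! i) + (y V.! i))

    -- Whole-array style: the combinator itself declares the parallelism,
    -- so mapping it onto SIMD lanes or GPU threads is a code-generation
    -- decision rather than a program-analysis problem.
    saxpy :: Float -> V.Vector Float -> V.Vector Float -> V.Vector Float
    saxpy a = V.zipWith (\xi yi -> a * xi + yi)

    main :: IO ()
    main = do
      let x = V.fromList [1, 2, 3] :: V.Vector Float
          y = V.fromList [10, 20, 30]
      print (saxpyLoop 2 x y)  -- 2*x + y elementwise: 12, 24, 36
      print (saxpy 2 x y)      -- same result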


Here's the AnandTech writeup (and as you expected, it's way better than the Register one): http://www.anandtech.com/show/4455/amds-graphics-core-next-p...


Fellow programmers make fun of me for liking Matlab. I've tried to explain it to them many times (how it doesn't need iterators, how most things are one-liners, how it's 10-100 times less code all the way down), but I've come to the conclusion that they just feel threatened, because it sidesteps most of the problems they've learned to solve with C. It's like how some auto mechanics don't think much of electric cars. I know that hundreds of cores is the future, but nobody knows how to use inherently parallel languages like Matlab, so all I can do right now is wait for the future to arrive.


> But if AMD were to introduce something purpose-built (eg, with a Matlab-like syntax), then that might compel devs who aren't assembly or GPU gurus to develop against this.

Does anyone know if it's possible to program GPUs today without using a new, special purpose programming language? That would be the killer app for GPUs.

I see your point about most languages not admitting auto-vectorization. But couldn't you take a compilable functional language like Gambit Scheme, Clojure, or Haskell and emit GPU code for constructs like map and fold on numeric vector types?


> Does anyone know if it's possible to program GPUs today without using a new, special purpose programming language? That would be the killer app for GPUs.

Haskell has an embedded domain specific language for it. (It's probably a monad.)
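
Presumably that's something like the Accelerate library (Data.Array.Accelerate): array combinators build an AST that a backend compiles, and the accelerate-cuda backend runs it on NVIDIA GPUs. A minimal sketch, assuming the accelerate package and its reference interpreter backend:

    import Data.Array.Accelerate (Z(..), (:.)(..))
    import qualified Data.Array.Accelerate as A
    import qualified Data.Array.Accelerate.Interpreter as I
    -- (swap I for the CUDA backend, Data.Array.Accelerate.CUDA, to run on a GPU)

    -- Dot product from ordinary zipWith/fold combinators. 'Acc' terms are an
    -- AST, not host code; the backend decides how to fuse and execute them.
    dotp :: A.Vector Float -> A.Vector Float -> A.Acc (A.Scalar Float)
    dotp xs ys = A.fold (+) 0 (A.zipWith (*) (A.use xs) (A.use ys))

    main :: IO ()
    main = do
      let n  = 1000
          xs = A.fromList (Z :. n) [1 ..]       :: A.Vector Float
          ys = A.fromList (Z :. n) (repeat 0.5) :: A.Vector Float
      print (I.run (dotp xs ys))

Whether that counts as "not a new language" is debatable: the combinators look like ordinary Haskell, but you're restricted to what the EDSL can express.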



The R language (http://www.r-project.org) is inherently well suited to auto-vectorization. There are already a couple of packages for GPU-accelerated processing, though they're currently heavily skewed towards CUDA.

http://cran.r-project.org/web/views/HighPerformanceComputing... (scroll down to "Parallel Computing: GPUs")


Maybe something like Google's new Renderscript API could be used to expose this in a friendly way?


I do: cloud computing and mobile.

5 years from now, it won't be a windows world. Consumers will be spending most of their time using iOS, Android, and Windows Mobile devices. On the server, a large % of apps will be written on a layer above the OS (GAE, Azure, Heroku, DjangoZoom, ...).


I'd love to see that happen, but we have been predicting the demise of Windows since the last century, and it is still going strong. Inertia is a powerful thing.


I certainly wouldn't go so far as to bet that Windows will go away, but I think that previous predictions about Windows' demise were predicated on it being replaced on people's PCs with something else.

That never happened (and I agree it may never happen), but I think the growing widespread adoption of non-PC computing devices (phones, tablets, TVs, etc.) definitely has a fighting chance. Within ten years, I'd bet that the vast majority of people will be accessing the internet primarily from something besides a PC.


Everyone was thinking, years ago, that no one could displace IE as the dominant browser - that Microsoft could and would pull every dirty trick in the book to kneecap the competition and out-compete it with its mass of developers.

Now look - Firefox is sitting comfy, Chrome is a rising star, and on mobile, WebKit browsers own the whole place.

Windows can and will fall. If Microsoft is smart, they'll be the ones to put it to rest on their own terms (like Apple's mock funeral for OS 9).


The difference is that with a web browser, you can switch from IE to a superior browser and all your favorite web sites will still work (apps built for IE6 compatibility notwithstanding). This isn't true of switching from x86 Windows to some-other-OS on some-other-architecture -- your existing special-purpose business apps won't work. You have to switch everything out.

That's not to say that Windows can't get displaced, but that it's a lot different from displacing IE. You can't just switch from Windows and have nearly everything working the same from day 1.


Most applications aren't speed critical, so on some-other-OS on some-other-architecture you'd probably get away with virtualization / emulation.


> 5 years from now, it won't be a windows world. Consumers will be spending most of their time using iOS, Android, and Windows Mobile devices.

That sounds about right, except what is Windows Mobile doing there? It's a pretty tight race for third place, and I wouldn't bet on Microsoft there.


Who would you bet on? The only other players are RIM/Blackberry and WebOS and I am not too sure how well they will hold up.


Yeah, I put them on the list because I do think they'll slide into the third slot. Their deal with Nokia gives them good reach, especially in emerging markets. Also, they have a huge war chest of funds and the patience to lose piles of money in order to get market share in an important market (Xbox, Bing, ...).


RIM has much more marketshare in developed markets, and it can still turn things around.

WebOS has a technology advantage over Microsoft's offering, but a bad starting point. Still, I would not rule HP out.

But to be honest, I am not sure third place will be a significant thing. The mobile market is looking to repeat history with how things turned out on the desktop. Do we care much about third place there? Not really.


I work in mobile software, and I don't know anyone who thinks "how things turned out on the desktop" is a good guide for the future. See also, http://www.asymco.com/2011/07/06/the-post-pc-era-will-be-a-m...


This can't possibly be true. It may be true for home users and hobbyists, but people running business applications need screen real estate and a keyboard on which you can touch type. That's the Achilles heel of mobile, and it's not going away.


The idea here though is that mobile devices will eventually become versatile enough to be used in these environments. My phone has an HDMI output and bluetooth capability, what, other than current usage patterns, is keeping Android/iOS from allowing you to plug in a monitor and hook up a keyboard to a phone? Hell, they are already doing it (albeit poorly): http://www.motorola.com/Consumers/US-EN/Consumer-Product-and...

The point is, I already carry around a phone in my pocket, and eventually, when I get into the office, I will hook it into my monitors/keyboard/mouse, and work away. The PC cannot keep up, and will eventually be replaced by this paradigm.


I'm skeptical. I don't see more than a tiny fraction of corporations allowing their employees to connect to the corporate network with a personal device. And they're not going to give you something that easy to lose if it's a key to their data. Too many privacy and security headaches.

Then there's the question of computing power. Recent mobiles look powerful, but that's because you're comparing them to older mobiles and not to desktops/laptops. The MIPS you can squeeze into a mobile will always be constrained by heat, and you're just not going to get anything like the same snappy response you would from a dedicated machine.


I wouldn't be surprised if more users were on devices running Linux than devices running Windows nowadays.


If you trust StatCounter's stats, mobile is gaining on desktops, but just barely: http://gs.statcounter.com/#mobile_vs_desktop-ww-monthly-2010...

(Also according to StatCounter, on the desktop, Windows has a 92% share. Make of that what you will.)


"but I don't see how an architecture that deviates significantly from the x86 norm could gain traction in this Windows world."

One possible way is that Linux quickly adapts to support it and gains a significant, un-ignorable performance advantage on the server side, forcing MS to step up and quickly support it as well. Basically, similar to what happened with the transition to 64-bit, but much more so.


With AMD's track record of open-source support for its GPUs? Their lack of support in the past (to the point of not even properly documenting their hardware) is not very encouraging.


AMD properly documents their GPU hardware now, and has for at least the last three years.

The real issue at hand is that the open source drivers are incredibly horrid. It's hard to write drivers, surprise!


They're catching up right now, though; AMD is putting a significant amount of effort into the open-source display stack.


AMD only released docs on the 2D side of things. 3D is still undocumented, AFAIK.


3D is now documented.


Consider this: probably the majority of "enterprise" developers today are working in Java. And they would rather run their code in a JVM on a PC than on a dedicated JavaStation/SunRay/whatever. That's the kind of entrenchment you would have to overcome.


OTOH, a lot of enterprise Java code is probably tied to Windows in some fashion. And Linux runs best on x86.


My guess is as bad as yours, but it seems to me like they intend to make the change mostly invisible to the software layer.


How about Bulldozer sometime soon? I've been putting off my upgrades to see how it compares to Sandy Bridge, but that Q2 2011 launch didn't exactly happen.


Same here. Bulldozer got pushed out again; some folks are saying the silicon was underperforming by more than anticipated, so they have to keep it in the lab and rev stuff NOW that they had planned for the 2012 refresh.

Then we have Ivy Bridge at the end of this year, promising just under 4 GHz with 8-core desktop chips from Intel.

I expect AMD to stay behind until 2013. They are rushing right now and not hitting their marks, BUT their tech and teams are strong. As that wobble evens out and this new platform (their first redesign in 7 or 8 years) settles down, they can shrink the die, point forward, and speed up.

I expect 2013-2015 to be a goddamn bloodbath between the big chip makers at the top (Intel, AMD) and Samsung, Qualcomm, Nvidia, and Apple shooting up from the bottom with ever faster and lower-power multicore RISC chips.

I imagine that by 2015 the landscape will be a hodgepodge of every kind of tech out there, and then the acquisitions and shakeout will take us to 2020, where things get more homogeneous again.


The architecture reminds me of the PS3's Cell processor, but not as insane.


Many interesting designs have appeared over the years and then faded. If they don't get distribution, nothing will happen.

As consumers often say today: "I don't want a tablet, I want an iPad."



