Apple ditches NVIDIA, goes with ATI for their desktop lineup (icrontic.com)
38 points by primesuspect on July 28, 2010 | 42 comments



" Apple has been going in the direction of ATI’s OpenCL acceleration architecture, the competitor to NVIDIA’s CUDA."

OpenCL is not ATI's technology. When I spot an inaccuracy like that in an article, I stop reading it, since I suddenly have no reason to believe the rest of it isn't just as inaccurate.


That one caught me too. Even with zero knowledge about CUDA/OpenCL, I actually googled OpenCL to find out the fact of the matter.


That seems completely illogical, since you cannot and will not verify most of the information you read, and information does not come packaged in chunks of truthfulness. Why does one incorrect statement invalidate the whole article? When you find that your dog has a flaw, you don't throw away the whole dog now, do you?


When it comes to news sources, it's best to 'throw out the whole dog' when they can't get the very basis of their usefulness correct - reporting facts.


This is kind of like a deliberate decision to miss the whole point based on a mistake in an irrelevant detail. Too personal to argue about, though.


No, it's based on the number of pieces of information you're able to verify from the article, and how many of those turn out to be false. Also, this particular piece of information is not irrelevant; it's at the core of the subject at hand.

If you're only able to verify a small number of the "facts" presented in the article, and a big proportion of those turn out to be false, the probability that the entire article is bull goes from "unknown" to "very high", relative to your personal knowledge.

I could be wrong, of course, but there are enough sources of information on the internet that I can afford not to read the ones I think are bogus.

Even worse, the mistake I pointed out is a very basic one. It's almost impossible to conclude, from the information you find on the web, that OpenCL is ATI's equivalent to NVIDIA's proprietary CUDA technology. The mistake serves the article well, though, by pitting the two technologies against each other and making the whole thing more sensational than it needs to be. So this isn't a simple "irrelevant" mistake like you said. It's bad journalism, plain and simple.


Pieces of information from the same source are not independent with regard to reliability. To establish the reliability of a source, you take a sample of its claims and verify them. If you can spot misinformation even without investigating, that indicates a high probability that other information from the same source is incorrect too.


A source either has a culture of trying to provide precise facts or it doesn't. That is orthogonal to the source occasionally making errors or being unintentionally ambiguous or misleading; everyone does that, accidentally. I'm all for dismissing a bad news source, but the complaint in the original comment provides no grounds for it. You can't condemn Ruby as a "slow language" because a program somebody once wrote in it runs slowly. I'm just a bit tired of extremely pedantic complaints. I understand when there's a lack of writing skill, horrible ambiguity, or outright bullshit, but here I see a mistake irrelevant to the main point of the article. It's not a big deal really, whatever floats your boat, but these harsh complaints of "I'm never gonna read it again! It LIED! Conspiracy to boost hype!" are getting annoying.


"ditching" is a strong word. Apple has always been rotating between GPU vendor.


I'm not totally sure Apple has ditched NVIDIA for good; there are reports of Fermi-based graphics cards working. Apparently the GTX 460, GTX 470 and GTX 480 all work.

http://www.electronista.com/articles/10/07/03/imac.and.mac.p...

I have also heard other reports of users verifying they work, for both real Macs and hackintoshes.


" Those who pay attention to details probably have noticed that since late 2009, Apple has been going in the direction of ATI’s OpenCL acceleration architecture, the competitor to NVIDIA’s CUDA."

It isn't really "ATI's OpenCL"; NVIDIA supports OpenCL as well. According to NVIDIA:

"NVIDIA has chaired the industry working group that defines the OpenCL standard since its inception and shipped the world’s first conformant GPU implementation for both Windows and Linux in June 2009."

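To make the vendor-neutrality concrete, here is a minimal host-side sketch (my own illustration, not from the article): the same two OpenCL calls enumerate whatever platforms are installed, NVIDIA and AMD/ATI alike.

    /* Minimal OpenCL platform enumeration: vendor-neutral by design. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    int main(void) {
        cl_platform_id ids[8];
        cl_uint n = 0;
        clGetPlatformIDs(8, ids, &n);  /* fills ids with up to 8 installed platforms */
        for (cl_uint i = 0; i < n && i < 8; ++i) {
            char vendor[256];
            clGetPlatformInfo(ids[i], CL_PLATFORM_VENDOR,
                              sizeof vendor, vendor, NULL);
            /* e.g. "NVIDIA Corporation" or "Advanced Micro Devices, Inc." */
            printf("platform %u: %s\n", (unsigned)i, vendor);
        }
        return 0;
    }

Apple's own OpenCL framework in Snow Leopard sits on top of whichever GPU is present, NVIDIA or ATI, which is exactly why calling it "ATI's OpenCL" is misleading.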

Well, I didn't have a good experience with NVIDIA, even though it was a great brand for me back when I used a PC many years ago. The NVIDIA 8800 in my Mac Pro was a miserable failure, and neither NVIDIA nor Apple would RMA it, even though I had multiple complaints on record about spotty behavior before it completely failed.

I read lots of complaints on the Apple forums, and not just about this particular card.


I'm not sure Apple will actually stop using NVIDIA at all, but that's great news. Great news for Hackintoshes.


Am I the only one to notice the irony of an Intel CPU coupled to an AMD GPU?

edit: the downvotes show I am


I wonder if this signals a possible switch to AMD CPUs in the future. Maybe AMD offered them a great deal on graphics boards to entice them to try their CPUs. Of course, this is pure speculation.


We'll get more OpenCL & OpenGL focus on ATI cards now :)


Does anything actually use OpenCL yet?


A sad day indeed, at least for me. I inevitably end up running some flavor of Linux on almost all my computers sooner or later. ATI's support for Linux, compared to nVidia's, has been piss-poor to say the least (certainly in my experience). I have always recommended and bought nVidia cards for this one reason. In fact, we recently went with nVidia's Quadro FX 4800 cards for 5 new workstations (requiring stereo vision on Linux/macOS). Want to guess the biggest factor in why no one on the team even dared consider ATI? The (almost always) nightmarish experiences with ATI's low-end 'consumer cards' on their personal Linux machines led to a lack of faith in any ATI product.

Please note, as far as Linux support goes, I am talking about the official ATI closed-source drivers.


The open ATI drivers really are improving. From my own experience: my home desktop with an nVidia 8800GTS can run for about a week before the display starts getting crazy amounts of corruption and KDE grinds to a halt, while my work machine, with an AMD HD4650 card, runs smoothly and never has any issues. Support for KMS is also a plus. I don't really do much 3D on either machine, though, so I haven't noticed nVidia's generally superior OpenGL support for a while. Both machines have dual 1920x1080 monitors, and the home machine is a lot beefier than the work one.


If you have only tried the closed source ATI drivers (fglrx), you might find it well worth your time to give the open source xf86-video-ati drivers a shot: http://cgit.freedesktop.org/xorg/driver/xf86-video-ati/

As long as you don't need 3D acceleration, they are the best choice. In distributions like Ubuntu, they are built into the default install, so your display will be working perfectly even on the live CD. Also, their multi-monitor support is fantastic.
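(If you're not sure which driver X actually picked, grepping /var/log/Xorg.0.log for "radeon" or "fglrx" should tell you.)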


As long as you don't need 3D acceleration, they are the best choice.

That is a pretty big thing to be missing.


This makes it pretty pointless to have discrete ATI graphics under Linux. I have an integrated ATI graphics card in my desktop and a discrete ATI graphics card in my laptop. They both end up with the same use cases: they can handle basic desktop functionality (which is a huge step up from fglrx) but not any serious 3D.


Sorry to rain on your parade, but for recent cards I thought http://cgit.freedesktop.org/xorg/driver/xf86-video-radeonhd/ was the preferred driver?


No, I have 2 recent cards, both running Radeon. From what I understand, the 2 projects have mostly merged, and in the future, Radeon is the driver to be used for all cards.


Unfortunately, this is exactly what I came in here to say. This truly is a sad day for anyone that wants to use anything but OSX on their Mac.

(Written on a Macbook Pro running Ubuntu 10.04)


As I said above, the reality of current ATI drivers is much better than the fiasco that existed four years ago. On Ubuntu it just works.


Unfortunately, this is not my experience. Just this summer I installed Ubuntu 10.04 desktop on a computer with a relatively new ATI card (it was top of the line a couple of years ago, I believe). In general use it's fine, but there are plenty of really annoying issues. Unfortunately for us, since we were trying to use it as a media center, the biggest were with full-screen video, which would freeze for 5 to 10 seconds at a time every minute or two. This is a known issue for the entire series of cards, and it's a driver bug, but it hasn't been fixed in the 1+ year the ticket has been open. Then there are smaller issues, such as a 5-second delay when maximizing a window if you're using Compiz. Not a deal killer, but still annoying, and there were a few similar bugs like this. After this incredibly sour experience with ATI, I'm not sure I'll recommend them again for a long time. And I haven't even gotten into how long I had to fight with X just to get dual monitors working.


At some point in the past I used the Xorg server from this PPA (https://launchpad.net/~ubuntu-x-swat/+archive/xserver-no-bac...) when I was using Compiz and things were slow on an ATI card. I don't know if that PPA has been updated for Lucid, but you could give it a try. If you're using it as an HTPC, can't you just disable Compiz? In most cases wouldn't you want to boot directly into MythTV/XBMC anyway? Also, can you say which card you're using? If it's an X1xxx series or earlier, then you're beyond the support window (the closed-source driver is only for newer cards) and you might be better off with the radeonhd driver. I think I should do a write-up on how to set up ATI drivers on Linux the right way; it takes less than 5 minutes to get a dual-head display with auto-detection going these days. HINT: aticonfig is your friend (rough sketch below).
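From memory, so double-check the man page: something like "aticonfig --initial=dual-head --screen-layout=right" (swap right for left/above/below as needed), followed by an X restart, gets you most of the way. The exact flags vary a bit between Catalyst releases.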


If the article is to be believed, the switch to ATI only applies to iMacs, not MacBooks...


I think you're in for a surprise if you try the new drivers! On Ubuntu the install is seamless, and even otherwise the installer just works; ATI also has a pretty simple interface for doing colour balancing across multiple displays.


Let me disagree with that. The open drivers don't support 3D, and the closed drivers make X crash at least once a day on my setup and similar ones. I have a laptop that is just one year old. All in all, I always got a better Linux experience with NVidia.


The open drivers do support 3D. You won't get the same performance as the closed drivers, but you will get accelerated 3D plenty good enough for desktop effects and simple 3d games.


The open drivers support 3D IF you're using R600 or below, which is a pretty big if. Once the new Gallium3D driver is merged into Mesa 8.0, we'll talk.


Are you using some funky Compiz plugins? What distro are you using? When did you last update your drivers? What laptop are you using, and what card does it have?


I think ATI is going in the right direction with their Linux drivers: the newest version supports kernel 2.6.34 and Xorg 1.8 out of the box, whereas with past versions hacks were necessary.


nVidia has closed source drivers too, no? I think your best bet is the Intel stuff. If I recall correctly, they produce actual open source drivers that end up integrated where they need to be, rather than as some binary blob you have to download.

Of course, I don't think the Intel ones are high end, but all I need is browsing and emacs, really.


There have been problems with the latest Intel chips on Linux, and even on well-supported chips I've had trouble with weird resolutions, multi-monitor setups, etc. I'm currently using a motherboard based on the AMD 785G chipset, which comes with a Radeon 4200; the open-source drivers for it actually work, including enough 3D support for desktop compositing. It's been more stable than the proprietary driver for my old GeForce 8800 (no crashes or corruption so far).

That said, Radeon 5xxx series support is in its infancy.


For best results on Linux, usually the latest anything is best avoided, while issues are worked out. It depends though, and things are improving a lot.


True, although Intel has been very good about this in the past, and to some extent still is: there are often drivers in the kernel and in X.org for IGPs that haven't even been released yet.


I've also run into regressions on older hardware that no one cares to troubleshoot because not enough people are on that setup (e.g. the sound card on a 667 MHz PowerBook).


As a counter-example, I have always had much better luck with fglrx than with the closed nVidia drivers.


As a CUDA developer I'm pissed off by this decision. Last time I checked, OpenCL wasn't ready (much slower than CUDA), so I don't see how one could use the new Macs for serious GPU development :(

Also, Adobe's Premiere uses CUDA. Maybe Apple wants to make their own product (Final Cut) look better on Macs?



