The Retina War is upon us. (wells.ee)
77 points by wells-riley on July 16, 2012 | 91 comments



In my opinion, Photoshop is an absolutely inappropriate tool in this new era. It was never really meant to design user interfaces. Designers should finally stop thinking in terms of pixels, as pixels don't really make any sense if your application will be shown on screens with more than one kind of pixel density.

There should be no problem using vector graphics on high resolution screens, and they will look much better on lower resolution screens than a downscaled image. Not only will vector graphics make supporting various screen densities easier, they may also (significantly) decrease the size of assets. Currently, iOS developers bundle two images for the same purpose (1x and 2x). It could have been one vector image instead.

Please stop using tools which were made to edit images and start using ones which are made to design interfaces.



They're not a panacea but they are an enormous upgrade from your 1x raster.

I liken this situation a lot to adapting a site to multiple devices with media queries. Does it help? Oh hell yes. Would a unique mobile site be better? Probably, if you had the time and resources to commit to it.

Of course custom raster art will look better, but I don't think it's realistic for most teams. If you didn't handcraft the pixels of your favicon, you're not a candidate. Vector graphics solve much of the problem and will continue to, even as new densities come out.


Illustrator has enough numerical errors all around to justify designing in Photoshop if you're a perfectionist.


What is an example of software made to design interfaces?


Name some really good software for designing interfaces.


HTML doesn't support vector images well. Yeah, you have SVG in modern browsers, but which one is easier?

    <img src="logo.png">
or

    <embed src="http://upload.wikimedia.org/wikipedia/en/4/4a/Commons-logo.svg" type="image/svg+xml" />

    (fails in IE8 and below, and in IE9 quirks mode)


You can do <img src="logo.svg"> in IE9 and every other browser. In fact, you can do an inline <svg> image inside of your HTML in all those browsers as well. The <embed> is long gone.
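For example, both forms work in IE9 and up (a minimal sketch; "logo.svg" and the shape are placeholders):

    <img src="logo.svg" alt="logo">

    <svg width="100" height="100" viewBox="0 0 100 100">
      <!-- inline SVG: no extra request, scales crisply at any density -->
      <circle cx="50" cy="50" r="40" fill="#333"/>
    </svg>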


CSS does support vector images. They're just as easy to use as their raster pals.
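For example (a sketch; the filename and dimensions are made up):

    .logo {
        background-image: url(logo.svg);
        background-size: 100px 50px;    /* stays crisp at any pixel density */
    }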


The solution is what it always was. Design in units like ems, don't override the user defaults, and let the page scale however the end user wants. This not only accommodates the retina/non-retina split, it lets users with bad eyesight increase the font size and still be able to read things.
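A sketch of what that looks like in CSS (the values are illustrative):

    body    { font-size: 100%; }    /* respect the user's default */
    h1      { font-size: 2em; }     /* scales with the user's setting */
    .column { width: 40em; }        /* the layout scales along with the text */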

Images? That's a bit harder, but it's not the end of the world. Vector images are useful here, but for higher detail raster images, designing at double size and scaling down for ordinary screens gets you a long way. For smaller icons, it might pay to have two versions, and I don't see a way around that for a while. Gnome has handled this for a long time by having specific icon sizes that it tries to snap to, as well as vector fallbacks for when none of the sizes are appropriate.

The important thing is to give up on pixel perfection. Just let the user choose the size, and don't mess with their defaults.


I really like what you're saying.

I don't think designers have to give up on "pixel perfection" - well, they don't have to give everything up.

We already have an enormous gamut of displays for any website. Forcing iOS users to a broken "mobile website" is never going to satisfy the designers who care about being pixel perfect, so they were grappling with these issues already.

And I can't say it enough: we're finally getting past 1920x1080 screens again. CRTs had higher resolutions a decade ago.

Some of us _do_ have good enough eyesight, and some of us _do_ work with videos that big (and the associated images) - yes, there's the Apple 30" Cinema Display, but I seek real competition between manufacturers.


I agree, designing in pixels has always been the wrong way to go about it. It is only needed when there is a restriction in the number of pixels available to work with (such as displays that have large pixels).

The beauty of high pixel density screens is that we can literally output our vectors/high resolution rasters at the output size and not have to worry about massaging individual pixels for the best clarity. The advent of high pixel density screens allows us to think exclusively in terms of the end size on screen, instead of being bogged down with pixel dimensions. This high density makes design easier, not harder. Also, the concept of different pixel densities is not new; screens have always had different pixel densities.

The reliable thing about 'retina' screens is that we can think of the pixel problem as 'solved' and just prepare artwork to pass the retina test, instead of trying to match it perfectly for every higher pixel density out there. (A level of accuracy that won't be easily seen by the user.) The same thing is done in print every day: 300 dpi, 600 dpi, 800 dpi, it doesn't matter; past a certain point the end user isn't going to casually notice the extra detail.

Designers who have been working in print will see the analogy to the various print device resolutions, each measured in lines per inch. Again, the approach is to think in terms of the final dimensions and not get bogged down with individual device resolutions.

I understand that designing for 1x can be problematic, but it's the same workflow as designing for any foreign pixel ratio/pixel density (such as the non-square pixels used in certain types of film). It's just another step in the workflow, and testing often on a device is a useful way for the designer to get the hang of it.


QUESTION: What makes vector graphics based UI's so hard?

I am not a UI designer, so I am ignorant. Please educate me.

Several other posters have mentioned the option of moving away from raster graphics, and indeed this was my own first thought. I remember using IRIX back in the glory days of SGI. For those who have no idea what I'm blathering on about, IRIX was a professional workstation OS that had an entirely vector-based UI. Yes, almost two decades ago an OS existed that was immune to the retina problem. Pause for a moment and let that sink in.

IRIX was actually one of the first GUI-based OS's, and SGI's use of vector-based UI elements probably arose from specific needs SGI faced. I can only speculate on what they were, but one thing they had plenty of was raw power. SGI workstations were very nice in their day! However, the average smartphone built these days probably has more power than all but the last SGI workstations, and perhaps even them too. Processing power is therefore likely not the issue. In fact, given that most desktop OS's already render your desktop in a 3D environment (OSX, Win7, and Linux (depending on what you're running) all do this), vector UI elements may actually be less intensive to draw than raster elements in these situations!

Wild speculation: Is it that UI building tools and API's make working with vector graphical elements painful, or are UI designers just ignorant after almost a decade without a major vector-based OS in common use?


There's really nothing hard about using vectors; they'll just be blurry because they're probably not pixel-fitted. Also, complex icons are much smaller and faster to render in raster format.


One other facet of vector graphics: they don't always look that great at all scales.

I seem to recall a graphic designer's blog that compared a vector icon to raster images hand-tweaked at the same resolutions. The differences were staggering. I don't think vector graphics are a panacea anymore. I'll see if I can find the link; it was rather eye opening.


Relevant links in this comment: http://news.ycombinator.com/item?id=4252640


One of the big things I noticed in the first link there was that an SVG icon scaled to different sizes didn't look as good as a hand-optimised set of raster icons, largely due to issues that could conceivably be addressed. E.g. when the SVG icon was scaled, the lines defining the shape also scaled up in thickness, so that they appeared far too chunky in the largest version. What if line width were kept constant as the SVG was scaled?

While that link makes me appreciate that SVG specifically might be hard to work with, and that simply scaling everything proportionally might not make for good icons across all sizes, I remain unconvinced that vector icons are hopeless. Perhaps SVGs have the ability to do what I describe above, or perhaps we need a more powerful vector graphics format to work with.
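For what it's worth, SVG does have a hook for exactly this: the vector-effect attribute from SVG Tiny 1.2 keeps stroke width constant while the shape scales, assuming the renderer supports it. A sketch:

    <svg viewBox="0 0 100 100">
      <!-- the outline stays 2px on screen however the svg is scaled -->
      <rect x="10" y="10" width="80" height="80" fill="none"
            stroke="#333" stroke-width="2"
            vector-effect="non-scaling-stroke"/>
    </svg>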


Of course, for Retina displays, those small vector-based graphics will look great. For everyone else, they'll be blurry.

So maybe that will simply add to the "tough for non-Retina users" momentum...


It's great that these 2x res displays are becoming more accessible, but the state of web standards is dangerously behind in terms of support for them. Right now, the only way to serve images optimally for the various screens is to have the server attempt to assess the client's display resolution using either JavaScript or CSS media queries, and then either send different HTML to different clients (with different content linked in img tags) or add exception handling for the CSS images.
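The CSS media query flavour of that hack looks something like this (a sketch; the class name and filenames are hypothetical):

    .logo { background-image: url(logo.png); }

    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 2dppx) {
        .logo {
            background-image: url(logo@2x.png);
            background-size: 100px 50px;    /* draw at the 1x dimensions */
        }
    }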

This is completely broken, as a client that wishes to access various assets might be denied them by the server, which thinks it knows best what's good for the client. This system is also bound to break as clients with unexpected capabilities emerge. Instead, the server should be dumb, making all the assets available, and the client should be making the decision of which assets to request.

Apple is the obvious source of expected progress on this front, since they control the retina hardware, OS and browser rendering engine. Others are able to influence this as well. Until someone fixes this broken mess we have now, however, retina users will not be getting widespread support. We can't expect big website operators to put up with the current horrible hacks people are using to support retina displays.


Safari in iOS 6 and Mountain Lion allegedly has support for -webkit-image-set to specify 1x and 2x images in your CSS.

    background: -webkit-image-set( url(...) 1x, url(...) 2x ) ... ;
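Filled in with hypothetical filenames:

    .logo {
        background-image: -webkit-image-set( url(logo.png)    1x,
                                             url(logo@2x.png) 2x );
    }

The browser requests only the variant that matches the display.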


That's 1) just for CSS, 2) incompatible with the border-image and list-style-image properties, and 3) horribly inefficient, with ugly syntax and a bunch of duplicated effort for developers. Since the filenames for 1x and 2x variants are typically identical except for a few extra characters used site-wide for all 2x versions, there should be a way for developers to specify only those unique characters once and avoid the extra work.


This is hilarious. Pro tip: not all your users are graphic designers. It may be the case that the author, and everyone he knows, has new "retina" products from Apple. That's wonderful. But it can't be taken to represent a broader userbase.

It seems to me that the web's mainstream properties are built for workers in large corporations to fiddle with when they should be working. Most of these people have 1280x1024, 96 DPI screens attached to Windows XP or Windows 7 PCs.

I expect that to be the case for some years to come.


Speak again in 3 years.

Today, I am a geek (not really a designer) and 2 out of 3 of my devices have retina displays already (iPhone and iPad). Within much less than 3 years, all my devices will. Within 3 years, there will be no Apple device being sold that doesn't have a Retina display. Will other vendors follow suit? If they don't, they'll be left in the dust, so I imagine they will have to, if they can.

It's usually a good idea to skate where the puck is going, rather than where it is now. In 3 years, the puck will be retina. My Macbook Air is not Retina, so I can't be arsed to design for Retina... yet. Once my Mac has a Retina display, I will design for Retina, and let the lower-res experience be inferior. By then, most of the people who care about such things will have Retina displays anyway.


> Will other vendors follow suit? If they don't, they'll be left in the dust, so I imagine they will have to, if they can.

Said the designer. People were still buying SD TVs years after HD ones came out. I don't disagree that it is good to skate where the puck is going, but it's still too early to worry about it now. A lot of the web design we do right now won't even still be used in three years time.

> Once my Mac has a Retina display, I will design for Retina, and let the lower-res experience be inferior.

Terrible, terrible idea. It was only a few days ago that someone posted how ignoring Windows cost them dearly. Don't ever design for the machine you're using, design for your customer. This applies to screen resolution, processing power, everything.


Just so you know, a whole bunch of laptops have been coming with "retina" displays for years now.

They'd call it a "Full HD" laptop so it doesn't sound as shiny, but it's still a 15 or 17 inch screen with a 1920x1080 resolution.


> Just so you know, a whole bunch of laptops have been coming with "retina" displays for years now. They'd call it a "Full HD" laptop so it doesn't sound as shiny, but it's still a 15 or 17 inch screen with a 1920x1080 resolution.

That is different, though. 1080p on a 15" display is 146 PPI; a "Retina" display (2880x1800) is 226 PPI. That's a significant difference.
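For anyone checking the arithmetic, PPI is the diagonal pixel count divided by the diagonal size (both panels taken as a nominal 15 inches here):

    sqrt(1920^2 + 1080^2) / 15 ≈ 2203 / 15 ≈ 147 PPI
    sqrt(2880^2 + 1800^2) / 15 ≈ 3396 / 15 ≈ 226 PPI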


To put it into better perspective, put the resolutions into google:

(1920 * 1080) / (2880 * 1800) = 0.4

Your 1080p 15" display contains just 40% of the pixels of a single Retina 15" display. They even showed this in the demo. They edited a 1080p video in Final Cut at full resolution and there was plenty of room left for UI elements.

Apple is definitely the first to put a display of this density into a consumer product, or maybe any product.


No, definitely not. IBM sold a 22" 3840x2400 (205ppi) LCD monitor over a decade ago.

http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

Panasonic has also launched a 20" 3840x2160 IPS LCD display.


But people haven't been running those in 2x mode, so they don't require special design.


Indeed, my last 2 dell laptops were 1920x1200 with 15" screens.


Legacy Windows APIs don't support scalable interfaces. Do you think the Fortune 500 and Global 1000 will phase out every legacy 32-bit application and replace every 1280x1024 display between now and 2015?

I don't.

P.S. Does Windows even have a pixel doubler for legacy apps on "retina"-type displays?


1280x1024? Do you have any data for that? 1366x768 would strike me as the much more likely resolution. The "normal" users I know are split into two groups - those with newish cheap 15" laptops, which are all 1366x768, or those with ancient desktops at 1024x768, or worse, 800x600 because they stumbled on the resolution setting and used it as a way to make things bigger.


Don't say this is the user's fault, though. People with weak eyesight NEED to make things bigger, and because of tons of legacy software, even Windows 7 can't really help you with scaling. Scaling the resolution is still the only really viable option for these people.

In fact, Apple's way of scaling by internally doubling the resolution on retina displays is the most useful way of scaling I've seen so far. It gives you an -almost- crisp image for 3 different scaled settings and one perfectly crisp image for the retina resolution.


The article raises the specter of a "1x-tax", along the lines of the "IE7 tax". The difference, of course, is that it's easy and free for people to upgrade their browsers, whereas upgrading the screen is not. As inconvenient as it is for the designers, it would be incredibly bad PR to say, in essence, "Only rich computer users should be using our site".


That should really not be a problem. Websites can detect and load the appropriate images for 2x or 1x displays; same for native apps. You design for 2x and downsample for 1x.


They can, but that's a huge additional bandwidth/size cost for both web and native apps.


Native apps, maybe, since you'd probably have to include both versions in your download. But surely it's a bandwidth saving for web apps? You just let the browser download the right size image. Now your low res users get low res icons, while high res users get high res icons. Less bandwidth than just sending everyone the high res icons, or both sets.
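A sketch of that client-side selection in JavaScript (the filenames are hypothetical; devicePixelRatio is the relevant browser property):

    // The server stays dumb; the client requests the asset it can use.
    var ratio = window.devicePixelRatio || 1;
    var img = new Image();
    img.src = ratio >= 2 ? "icon@2x.png" : "icon.png";
    document.body.appendChild(img);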


This change will not be as dramatic as you predict. "Retina Display" is a "very high resolution display". It is not magical. It is not a fundamental technological shift. It is only Apple fanboyism that makes it seem that way.

There have been "very high resolution displays" at the front edge of PC tech for decades, and the Internet has scaled slowly but surely to meet our current middle ground (probably approaching 1080p on desktops and something slightly lower on laptops this year). Most of the people here have been using "very high resolution displays" during most of the development of the web, designers in particular. PCs will, of course, reach the resolution of Apple's Retina Display; they'll have to in order to stay competitive. But, it'll happen gradually. Most people don't choose computers based on display resolution. I bought a new laptop a couple months ago, and went to Fry's (after buying online failed three times, and needing to get something quickly)...I bought the only 15" laptop they had with a 1080p display. Literally, it was the display model and they didn't have any others in stock. Consumers buy cheap or buy a brand name. But, educated consumers will keep pushing things forward, slowly but surely.

So, yes, in three years, we'll mostly all be looking at "Retina Displays", but we'll just call them "displays". And we will have evolved the web slowly in that direction, just like we've been doing for decades.

Proof:

http://web.archive.org/web/19970404064352/http://www.apple.c...

Note that on a modern display, these graphics are tiny, and the site itself only takes up about a 5th of the page (if that much). We've "gone Retina" maybe three or four times since 1997, we just didn't have Apple telling us that doubling display resolution was an epochal shift in computing technology and a legion of fans to carry forth the message.

Final point: The shift from CRT to LCD was much more dramatic than this shift, and we all made it through. Scaremongering is pointless.


Like someone else says downthread (http://news.ycombinator.com/item?id=4253475):

(1920 * 1080) / (2880 * 1800) = 0.4

Your 1080p 15" display contains just 40% of the pixels of a single Retina 15" display. They even showed this in the demo. They edited a 1080p video in Final Cut at full resolution and there was plenty of room left for UI elements.

---

Retina stuff isn't supposed to be remarkable just because the resolution is higher. It's remarkable because there are drastically more pixels in the same size screen.

The reason why folks are talking about this as if it's different is because cramming that many more pixels into the same amount of space requires some technology for it not to get all screwed up. It's not a modest jump from, say, 1024 to 1280, the progression you describe. It's from 1920 to 2880. That's big. You might not like it because it came from Apple and Apple people like it, but it does matter.


The first display I browsed the web on was 640x400 (an Amiga) at 14" (but it was a 5:4 display, so it was bigger in surface area than a 14" 16:9 or 1.85:1 widescreen display, closer to 15" or 16" today). I'm currently browsing the web on a 15" 1080P display.

(640 * 400) / (1920 * 1080) = 0.123

As I mentioned, we have "gone retina" a few times since the beginning of the web.

And, it's not that I don't like it. Of course I like it. I just explained that I went out of my way to obtain a 1080P laptop. I once went to great lengths (custom cable, huge and complicated custom X configuration file) to hook up a massive 75 lb workstation monitor to my PC because I wanted a really high resolution display. It's just that I think it is ridiculous to freak out about something that follows the natural progression of technology that we've been following for decades.


> Note that on a modern display, these graphics are tiny, and the site itself only takes up about a 5th of the page (if that much). We've "gone Retina" maybe three or four times since 1997, we just didn't have Apple telling us that doubling display resolution was an epochal shift in computing technology and a legion of fans to carry forth the message.

Display pixel density (some call it resolution, a term that is a bit overloaded these days) has been hovering at 100 PPI for a long time. There has been almost zero development in that area. Even in the old CRT days, 19" screens were used at 1600x1200 (roughly 100 PPI), and almost all desktop monitors since have been <100 PPI (or close to that). And now Apple is doubling the pixel density to >200 PPI. I'd call that a fundamental technological shift.

> Final point: The shift from CRT to LCD was much more dramatic than this shift, and we all made it through. Scaremongering is pointless.

Did you live through an alternative history? There was almost no change in the output picture in the CRT->LCD transition, and especially very little change that required the attention of software developers.


My first display for browsing the web was 640x400 at 14". My current display is 1920x1080 at 15" widescreen. That's a more than 8-fold increase in pixel count, and dramatically larger than the difference between current displays and "Retina". Yes, it's a big step in one go...the strong brand of "HD" and "1080P", and the cost of increasing density, seems to have caused the market to pause at that setting for a while. But, this is the natural progression of computer displays toward higher resolution. It is not a miracle.

"Did you live an alternative history? There was almost no change in the outputted picture in CRT->LCD transition. And especially very little change that required attention of software developers."

My guess is you never worked on games, video, or graphics software, during the switch from CRT to LCD. The way these display types behave is quite different. Colors are different. The speed at which pixels change state is (or was) vastly different. If you cared about how your software looked, and it tickled any of these differences, you tested on both and you tried to find a happy middle ground.

The change from 5:4 to 16:9 was also somewhat serious for developers.

I'm not saying a doubling of display resolution isn't awesome (it is!). Just that it is not a "sky is falling!" situation, and it's overly dramatic to act like it is. We've seen all of this before; or at least, those of us who've been around for a little while have seen all this before. Those who are too young to remember it should probably look to the past for guidance rather than acting like this is an unprecedented historical moment requiring heroic efforts to overcome.


you're confusing resolution with dpi.


You're exhibiting a severe lack of historical context.

My first display for browsing the web was 14" at 640x400. I currently browse on a 15" 1080P display.


> 2x scaled from 1x - Bicubic
>
> (Nearest Neighbor produced identical results)

Sorry, then whatever software you're using to scale it is broken. The image pictured is obviously nearest neighbor; rescale with bicubic and it will look smooth but blurry.


Just to clarify, Retina is an Apple brand for a screen with a significantly increased resolution and high pixel density.

It's definitely a brand name, so let's not confuse that. Maybe just refer to the actual pixels per inch or, even easier, the screen size/type and resolution. MacBook Pro 2012: 2880×1800

http://en.wikipedia.org/wiki/Retina_display


This resonates with something I was thinking too. Basically, everyone who has had a web experience hooked up to their 1080p TV has been screwed for a long time. An '800px' layout has sucked for at least 5 years now, on both 1920x1200 monitors and the aforementioned HD setup. There is no 'Retina' war; there is only web design stuck between the same rock and the same hard place. I feel for them, but your toolbox has to account for that.


I think the generic term would be something like "HiDPI display".


I think that Retina resolution displays will dominate the field in much less time than ten years: since every mobile device maker has to match Apple, I'd bet that by next year every new phone and tablet will feature a higher resolution screen. Maybe that won't be 256 PPI, but it will be at least 150. Then you can bet that within two years you'll see the same thing happen on laptops — and desktops will have to follow within five years.

Truth be told, it's sort of shocking to think that we've been using 72/96 DPI for so long. I would have expected this shift to have happened ten years ago.


Yeah, 72/96 is shocking, but look:

We're only just now reaching the point where I can count on my friends having 64-bit CPUs.

Intel's first 64-bit mobile processor was released in mid-2006. That is only six years ago. The manufacturing transition itself was pretty quick - only a year or two until all their currently-manufactured CPUs were 64-bit [AFAIK], if we ignore Atom. Many of my friends have 6-7 year old laptops because they still work and that's all they need.

64-bit was chiefly engineering work: the CPUs didn't cost more to make because of Moore's law [I believe]. I hope that Retina screens are similarly not-more-expensive when made in large quantities. But what if they are a bit more expensive? We'll be cursed with budget 72/96 screens for maybe five more years on top of the length of time people keep their devices!

And it really will be a curse to us personally, as web designers. The Web uses images, and everyone uses the Web.

[1] https://en.wikipedia.org/wiki/X86_64#Intel_64_implementation...

[Disclaimer: I am writing a science game that uses lots of 64-bit integers, so it runs noticeably faster when compiled for x86_64 than for x86.]


I agree... I think many of us have been saying that for years, too: http://www.codinghorror.com/blog/2007/06/where-are-the-high-...

Despite the fact that many people primarily associate "Retina" with the iPhone (because that is where it was introduced first), I think this is the area in which it has actually had the least impact. Other phones had high DPIs before the iPhone 4, and the difference between the best Android displays and the iPhone 4 display was not overly dramatic.

OTOH, pushing laptops and tablets out of low-DPI land is a truly remarkable achievement. Other platforms are much farther behind in this area, IMO, and at least on laptops have a much longer path to catching up.


Was there any factory capable of producing 15" panels with 220+ pixels per inch ten years ago?



The T220s were very nice monitors, but they just tipped over the 200dpi mark, not the 220+ one.


So basically 15 ppi, then. Your comment reads as someone trying to exaggerate the difference: the T220 was 205 ppi, and some of Apple's Retina devices are 221 ppi, a grand difference of merely 7% - well, seven per cent and eleven years.


I'm looking forward to the day when vector defined graphics are the norm and raster graphics are only used for things like pictures and games. There's really not much reason for UI and text to be defined by raster based graphics aside from the ease of creation, in my opinion. Using scalable graphics for layout removes the problems of resolution and DPI.


"but it won’t be long before Retina Cinema Displays replace their outdated siblings"

I was honestly surprised by each retina release from Apple, starting with the iPhone 4. After the initial awe had passed, the iPhone's display was easier to wrap your head around, given its relatively small screen. Now we have 15" MacBook Pros with retina. Apple released a professional production machine so its retina ecosystem can thrive, in aid of the experience on its consumer mobile devices. This is exciting for us power users and geeks for sure.

A retina Thunderbolt Display (twenty-seven inch) seems like quite a technological hurdle. How much will you pay? $2000 for the screen? They could release such a product, again for professionals, to serve the professionals creating the content that their other devices consume. You will pay for it, though.

Does the average kitchen need retina iMacs, and will the market speak?


> Does the average kitchen need retina iMacs?

Does the average kitchen need a 21" screen, 2.5GHz quad-core CPU, 4GB of RAM and a 500GB hard drive? (That's the base iMac.)

Does the average garage need a 4x4 SUV with cruise control, climate control, 6-disc CD, sat nav, heated seats, flip down DVD screens, etc. etc. ?

Of course not.

The developed world stopped caring long ago about what average people "need", and has long since focused on what people "want". Or at least what advertising can convince us we "want", before we even know we do.


Sadly, I think retina iMacs are further away than we'd like to hope. Eizo has a 36 inch 4K monitor for $35,000¹. There's a 31 inch 4K ViewSonic monitor for an undisclosed price². Even that's only 150 dpi. If Apple wants to make a pixel-doubled 27 inch iMac for under 10K, I'd be surprised if they could do it by 2013.

But then again, I thought the retina MacBook Pro was impossible. So perhaps I'm not the best judge of these things.

1. http://www.geeky-gadgets.com/eizo-showcasing-its-35000-4k-lc... 2. http://www.engadget.com/photos/viewsonic-vp3280-led-31-5-inc...



But the Retina MBP is already past that resolution on a 15" screen (2880x1800). If "Retina" means 2x the standard resolution, we'd need 24" monitors at 3840x2400 and 27" monitors at 5120x2880. I wonder if even Thunderbolt could drive that.


> I wonder if even Thunderbolt could drive that.

It can't.

http://www.marco.org/2012/06/22/predicting-mac-desktops

"" - If a 27” Retina Display is a “2X” version of the current panel, that’s a 5120x2880 panel — running that at 60 Hz requires more bandwidth (over 21 Gbps for 24-bit color) than Thunderbolt offers today (up to two 10 Gbps channels).""


These screens are insanely expensive, and wouldn't make much sense. Nobody is looking at 27" monitors from such a small distance to distinguish between retina and simply good resolution.

Plus, I don't think there are affordable video cards to drive resolutions like 5120x2880.


Well, you make a good point. For comparison, my 17" MBP is at 133 ppi (1920x1200), while the base 15" is 110 ppi, and so the Retina 15" is 220 ppi. If you wanted the 17" screen to be around 220 ppi, you could go with 3072x1920, which would be noticeably easier to make and to drive than 3840x2400. Given that full-size screens are usually viewed from farther away than laptop screens, I think it's fair to offer this as a "Retina" resolution for a 24" 8:5 screen.

Similarly, the 27" 16:9 screen could come in at maybe 3584x2016.


This is a ridiculous post by some random guy who is pro-Apple.

Do you know how long it took for 1024x768 to mostly vanish? And you're talking about retina in 3 years. Lulz.

The world is huge, my friend, and it includes a major portion of people who don't own (or even want) Macs, many of them still on legacy hardware. By many, I mean MANY. In three years, 1920x1080 might become fairly standard, but to think of retina as a standard is definitely overkill. I'm surprised this is even posted on HN.

Oh wait... it's the same guy who wrote it...


I enjoy the high pixel density displays on small devices like iPhone 4. However the iPad 3's display doesn't really interest me. I just don't hold the device that close to my eyes.

Snappy, responsive UIs are the most important thing to me. I'll trade the animations, rounded-corners, gradients, drop shadows and bouncy effects for a more craigslist-like UI. Something that's simple, basic and functional.


I've been using the new iPad for a few months now and it always bothers me to go back to an iPad 2 for reading. I can just see the pixels on the iPad 2 now; I didn't notice them before, but now they bother me. Blast Apple for raising my standards!


Same for me when going back to a 2nd gen iPod Touch. What really bugs me about the older iOS devices is the UI lag. It's not a huge deal if you use the device here and there, but when you use it constantly the lag becomes annoying. Hence my general desire to trade UI polish for decreased response time.

Google studied the impact of very small delays on users' search behavior. IIRC, reducing the search results loading time by 10 ms or so had a noticeable effect. I wonder if something similar could be applied to smartphone UIs? Might be an opportunity for a smartphone OS to substantially differentiate itself from iOS and Android.


Does anyone else find the 1x and 2x conventions confusing? I never realized it till I got a bit confused reading the article.

Also: I'm a bit surprised at the reverence Dustin Curtis is given for the pixel fitting article.


It's not just confusing, it's misleading. The marketing copy is being confused for technical terminology.

At 2880 pixels wide, the new MacBook Pro resolution is about 1.5 times as many pixels across the screen as a typical high resolution display at 1900 pixels.


> At 2880 pixels wide, the new MacBook Pro resolution is about 1.5 times as many pixels across the screen as a typical high resolution display at 1900 pixels.

A typical high resolution display of what physical size? Using the resolution alone is meaningless; it's the DPI that's important in this case.

Also, the 2x is not really marketing copy. It's the number of physical pixels that make up each geometric "point" in an Apple retina display. Ideally, applications and operating systems would be truly resolution-independent and this ratio could be any decimal number. In such a world, you could choose how to use your screen's resolution - for more space or for smoother rendering of on-screen elements.

Apple's solution is a hack that uses an integer ratio to make backwards compatibility easier. With a 2x ratio you can blow up a point from a non-retina app to exactly 4 pixels, avoiding some hairy rendering issues.
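Concretely, with the integer ratio (the 44 pt figure is iOS's recommended minimum touch-target size):

    1 pt -> 1x1 px on a 1x display
    1 pt -> 2x2 px on a 2x display
    e.g. a 44x44 pt button renders as 88x88 px: 2x linear, 4x the pixel count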


Resolution is the only thing that matters. It tells you how much information you can cram into a screen. If the pixels are too large, move back. If they are too small, move closer. If your screen is glued to the keyboard, detach it, like on the Asus Transformer.


The point of these displays is to (eventually) get away from slavery to the pixel. Just because a printer can render, say, 4800 dpi, doesn't mean you want to render your body copy in a 1pt text (which it can do with better letter-shape fidelity than any of my old dot-matrix impact printers); you still print at 9 or 10 points and enjoy sharper and better-defined type. Moving to high-linear-resolution screens does the same thing—you don't cram more stuff into the screen, you cram a better version of the same stuff into the screen.


Decorative visual information is still information, and how much you can fit on a screen depends on its resolution, not its DPI. To enjoy smoother visuals, you just have to move a bit farther away from a big screen.


Isn't that ratio 4x then? Or am I not understanding retina displays correctly?


2x linear, 4x area. I could swear we had this discussion yesterday.


The default mode is 1440x900 doubled, hence all the talk about 2x.


Apple's standard 15" has always been 1440x900. Their hi-rez 15" has been 1680x1050.


What the cynic in me finds interesting in Retina displays is how people will tend to hit their data caps more quickly on their mobile plan merely because they are downloading more pixels.


"...hit their data caps more quickly on their mobile plan merely because they are downloading more pixels."

Surely the caps will increase in size as a consequence of improved mobile data technology and to keep up with the need, just like the speeds of broadband connections have since they were introduced?


It should also help at getting those caps increased. Cell phone providers that lack capacity in a major city in 2012 should be lambasted for not expecting usage to increase. This is only the harbinger of things to come; we need to put pressure on the providers to drop the (technically unnecessary) data caps. They don't fix the congestion problem and only exacerbate it.

Also, it's about damn time we have high DPI displays; we've been stuck at 72/96 dpi for far too long on regular computers. I welcome this change and all it brings.


Would be nice if only the supported artwork assets were downloaded to a particular device. This could help w/ the data caps issue. However, there are corner cases, like universal iOS apps.


Easy solution: Use an external monitor to verify things still look good at 1x.


I welcome the Retina Revolution (and look forward to its progression beyond the Apple world), despite the messy consequences. Three things plague my eyes and psyche:

  1) 1080p as a least common denominator
  2) composite analog video feeds
  3) low bitrate MPEG/JPEG/iDCT compression artefacts
Let the reader understand.


A nitpick, but this author is absolutely using bicubic wrong. The only case where bicubic filtering appears identical to nearest neighbor is if the entire image has constant color.

The author probably forgot to turn off color indexing or something trivial.


Ironically... when I visited the link on my iPhone, the text appeared pixel doubled. If I go back, it appears correct. No idea why. :)


I really think the @2x hack is absolutely revolting.

sigh...


He's designing a website in Photoshop and complaining 800px is too small. Is that serious?


These svbtle blogs render unfocused on iPad 1 in portrait orientation - how annoying.


Isn't this something an asset pipeline should be handling?


This guy's problem is that he thinks pixel-fitting is an appropriate technique at any resolution: it's not.




