Does the Retina display eliminate the need for anti-aliasing?

Jake · Dec 26, 2010 · Viewed 7.5k times

With the iPhone 4, the Retina display's resolution is so high that most people cannot distinguish the pixels from one another (supposedly). If this is the case, do apps that support the Retina display still need anti-aliasing to make fonts and images smooth, or is this no longer necessary?


Edit: I'm interested in more detailed information. Started a bounty.

Answer

Fattie · Jan 18, 2011

There's no question at all - you still need to do the antialiasing mathematics, because of the complexity of curves, second-order curves, intersecting curves, and the different types of joins.
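
For instance, here's a minimal sketch -- using today's UIKit/Core Graphics API, which postdates this answer, with an illustrative function name of my own -- that strokes the same quadratic curve twice, once with antialiasing switched off and once on, so you can compare the edges yourself on a Retina screen:

```swift
import UIKit

// Render the same second-order (quadratic) curve twice: pass 1 aliased,
// pass 2 antialiased, shifted down so the two strokes can be compared.
func renderCurveComparison(size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { ctx in
        let cg = ctx.cgContext

        let path = UIBezierPath()
        path.move(to: CGPoint(x: 10, y: size.height - 40))
        path.addQuadCurve(to: CGPoint(x: size.width - 10, y: 20),
                          controlPoint: CGPoint(x: size.width * 0.2, y: 10))
        path.lineWidth = 1
        UIColor.black.setStroke()

        // Pass 1: antialiasing off -- stepping appears wherever the slope changes.
        cg.setShouldAntialias(false)
        path.stroke()

        // Pass 2: antialiasing on, drawn 30 points lower for comparison.
        cg.translateBy(x: 0, y: 30)
        cg.setShouldAntialias(true)
        path.stroke()
    }
}
```

Even at 326 ppi, the first pass shows stepping along the shallow parts of the curve; the second doesn't.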

(Note too that, in the two years since this question appeared, Retina displays have become ubiquitous, and antialiasing is in fact done everywhere on every Retina display.)

Sure, straight lines (perhaps at 45 degrees) may conceivably test just as well in A/B tests without antialiasing. But just look at a shallower line, or at a curve whose slope keeps changing.

And wait - there's a knock-down argument here...

Don't forget that you can display typography really, really small on a Retina display!

One could say that you need antialiasing whenever letters are less than (let's say) 50 pixels high. Thus if you had a crappy 10 dot-per-inch display, but the letters were 80 feet high (960 inches, so 9,600 pixels high), you would NOT need antialiasing. We've just "proved" you don't need antialiasing on a 10 ppi display.

Conversely, let's say Steve's next display has 1000 pixels per inch. You would STILL need antialiasing for very small type -- and any very small detail -- that is 50 pixels or less!
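
To make that arithmetic concrete, here's a back-of-the-envelope sketch (the 50-pixel cutoff is just the illustrative rule of thumb from above, and needsAntialiasing is a made-up helper, not any real API): pixel height is simply physical height in inches times pixels per inch.

```swift
// Is a feature small enough, in pixels, that it still wants antialiasing?
func needsAntialiasing(heightInInches: Double, pixelsPerInch: Double,
                       threshold: Double = 50) -> Bool {
    let pixelHeight = heightInInches * pixelsPerInch
    return pixelHeight < threshold
}

// 80-foot letters (960 inches) on a crappy 10 ppi display: 9,600 px tall.
print(needsAntialiasing(heightInInches: 960, pixelsPerInch: 10))    // false
// 0.03-inch type on a hypothetical 1000 ppi display: only 30 px tall.
print(needsAntialiasing(heightInInches: 0.03, pixelsPerInch: 1000)) // true
```

No matter how high the ppi goes, there is always some physical size of detail that lands under the threshold.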

Furthermore: don't forget that the detail in type - which is a vector image - is effectively infinite!

You might be saying: OK, the "body" of a Baskerville "M" looks fine with no antialiasing on a Retina display. Well, what about the curves of the serifs? What about the chipping on the ends of the serifs? And so on down the line.

Another way to look at it: OK, on your typical Mac display you don't need antialiasing on flat lines, or maybe 45-degree lines. Further, on a Retina display you can perhaps get away with no antialiasing on 22.5-degree lines, and even 12.25-degree lines.

But so what? If you add antialiasing on a Retina display, you can successfully draw ridiculously shallow lines -- much shallower than on, for example, a pre-Retina MacBook display.
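
Here's a rough sketch of the geometry (my own illustration, not something from the original answer): an aliased line of slope m, with 0 < m < 1, rasterizes into horizontal runs about 1/m pixels long, so the width of each stair step in pixels doesn't depend on the display's ppi at all -- only its physical size shrinks.

```swift
import Foundation

// Width of one "stair step" of an aliased shallow line, in pixels and inches.
func stairStepWidth(slope: Double, pixelsPerInch: Double) -> (pixels: Double, inches: Double) {
    let stepPixels = 1.0 / abs(slope)   // run length of one horizontal step, in pixels
    return (stepPixels, stepPixels / pixelsPerInch)
}

// A 1-degree line on the iPhone 4's 326 ppi screen:
let step = stairStepWidth(slope: tan(1.0 * Double.pi / 180), pixelsPerInch: 326)
print(step.pixels, step.inches)   // ≈ 57 px per step, ≈ 0.18 inches -- plainly visible
```

Antialiasing spreads each step's transition across neighboring rows of pixels, which is why shallow lines still read as continuous with it.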

Once again, as in the previous example, say the next iPhone has one zillion pixels per inch. Adding antialiasing will still let you draw EVEN SHALLOWER good-looking lines -- by definition it will always make things look better, because it will always improve detail.

Note that the "eye resolution" business from the magazine articles is total and complete nonsense.

Even on, say, a 50 dpi display, you're only seeing a fuzzy amalgam created by the mathematics of the pixel display strategy.

If you don't believe this, look at this writing right now on your Mac and count the pixels in the letter "r". Of course, it's inconceivable that you could do that! You could maybe "resolve" pixels on a 10 dpi display. What matters is the mathematics of the fuzz created by the display strategy.

Antialiasing always creates "better fuzz," as it were. If you have more pixels to begin with, antialiasing just gives you even better fuzz. Again, simply consider even smaller features, and of course you'd want to antialias them too.
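
For the curious, the "fuzz" here is just coverage-based antialiasing: each pixel's gray level is the fraction of its area covered by the shape. A minimal sketch, estimated by supersampling -- the half-plane edge and the function names are purely illustrative, not any real API:

```swift
// Fraction of the unit-square pixel (pixelX, pixelY) covered by `shape`,
// estimated with an evenly spaced grid of sample points.
func coverage(of shape: (Double, Double) -> Bool,
              pixelX: Int, pixelY: Int, samplesPerAxis: Int = 4) -> Double {
    var hits = 0
    for i in 0..<samplesPerAxis {
        for j in 0..<samplesPerAxis {
            let x = Double(pixelX) + (Double(i) + 0.5) / Double(samplesPerAxis)
            let y = Double(pixelY) + (Double(j) + 0.5) / Double(samplesPerAxis)
            if shape(x, y) { hits += 1 }
        }
    }
    return Double(hits) / Double(samplesPerAxis * samplesPerAxis)
}

// A shallow edge: everything under the line y = 0.1 * x. Pixels the edge
// passes through get a fractional value instead of a hard 0-or-1 step.
let underEdge: (Double, Double) -> Bool = { x, y in y <= 0.1 * x }
print(coverage(of: underEdge, pixelX: 5, pixelY: 0))   // 0.5 -- the "fuzz"
```

On a denser display the same coverage math runs over smaller pixels, so the transition band gets physically narrower: finer fuzz, never no fuzz.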

That seems to be the state of affairs!