With the iPhone 4, the Retina display's resolution is so high that most people cannot distinguish the pixels from one another (supposedly). If this is the case, do apps that support the Retina display still need anti-aliasing to make fonts and images smooth, or is this no longer necessary?
Edit: I'm interested in more detailed information. Started a bounty.
There's no question at all: you still need antialiasing, because of the complexity of curves, second-order curves, intersecting curves, and the different types of joins.
(Note too that, in the two years since this question appeared, Retina displays have become ubiquitous and, indeed, antialiasing is in fact done everywhere on every Retina display.)
Sure, straight lines (perhaps at 45 degrees) may conceivably test just as well without antialiasing in A/B tests. But just look at a shallower line, or a line whose slope keeps changing.
And wait - there's a knock-down argument here:
Don't forget that you can display typography really, really small on a Retina display!
One could say that you need antialiasing whenever letters are less than (let's say) 50 pixels high. Thus if you had a crappy 10 dots-per-inch display, but the letters were 80 feet high (roughly 9,600 pixels high), you would NOT need antialiasing. We've just proved you "don't need" antialiasing on a 10 ppi display.
Conversely, let's say Steve's next display has 1000 pixels per inch. You would STILL need antialiasing for very small type -- and any very small detail -- that is 50 pixels or less!
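To put rough numbers on that 50-pixel rule of thumb, here is a small back-of-the-envelope sketch in Swift; the 50-pixel threshold and the display densities are illustrative assumptions, not measurements:

    import Foundation

    // Rule of thumb from above: features under ~50 pixels tall still benefit
    // from antialiasing. At a given pixel density, how large (in typographic
    // points, 1/72 inch) can a letter be and still fall under that threshold?
    let thresholdPixels = 50.0

    let displays: [(name: String, ppi: Double)] = [
        ("crappy 10 ppi display", 10),
        ("pre-Retina Mac (~110 ppi)", 110),
        ("iPhone 4 Retina (326 ppi)", 326),
        ("hypothetical 1000 ppi panel", 1000),
    ]

    for d in displays {
        // Physical height of a 50-pixel feature, expressed in points.
        let points = thresholdPixels / d.ppi * 72.0
        print("\(d.name): anything up to ~\(String(format: "%.1f", points)) pt tall is under \(Int(thresholdPixels)) px")
    }

Even at 1000 ppi, anything up to roughly 3.6 points tall (fine print, or the small details of larger glyphs) still falls under the 50-pixel line.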
Furthermore: don't forget that the detail in type ... which is a vector image ... is infinite!
You might be saying, oh, the "body" of a Baskerville "M" looks fine with no antialiasing on a Retina display. Well, what about the curves of the serifs? What about the chipping on the ends of the serifs? And so on down the line.
Another way to look at it: OK, on your typical Mac display, you don't need antialiasing on flat lines, or maybe 45-degree lines. Further, on a Retina display you can get away with no antialiasing on maybe 22.5-degree lines, and even 12.25-degree lines.
But so what? If you add antialiasing on a Retina display, you can successfully draw ridiculously shallow lines, much shallower than on, for example, a pre-Retina MacBook display.
Once again, as in the previous example, say the next iPhone has one zillion pixels per inch. Still, adding antialiasing will let you have EVEN SHALLOWER good-looking lines -- by definition, yes, it will always make it look better because it will always improve detail.
Note that the "eye resolution" business from the magazine articles is total and complete nonsense.
Even on, say, a 50 dpi display, you're only seeing a fuzzy amalgam created by the mathematics of the pixel display strategy.
If you don't believe this is so, look at this writing right now on your Mac, and count the pixels in the letter "r". Of course, it's inconceivable that you could do that! You could maybe "resolve" pixels on a 10 dpi display. What matters is the mathematics of the fuzz created by the display strategy.
Antialiasing always creates "better fuzz," as it were. If you have more pixels to begin with, antialiasing gives you even better fuzz again. Again, simply consider even smaller features, and of course you'd want to antialias them.
That seems to be the state of affairs!
The resolution at which the eye/brain will detect a discontinuity or stair edge is higher than the resolution at which it can resolve individual pixels. The Retina display appears to be high enough for the latter.
But throw in image animation, hand motion, vehicle vibration, imperfect eyesight, display reflections, and so on, and you may have to experiment to determine whether the former makes any difference in your particular application.
I did some quick tests with an OpenGL application on a friend's iPhone 4. Without multisampling there were still stairs and other artifacts in the output; with multisampling they were gone.
That's not really surprising, as you can still build hard edges with a lot of pixels, so just putting more pixels into a device won't solve the problem (however, it clearly can help reduce the need for multisampling).
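For what it's worth, here is a minimal sketch of how multisampling can be toggled in a GLKit-based OpenGL ES app; this is not the actual test code from above, and the class name is made up, but drawableMultisample is the standard GLKView setting:

    import GLKit

    // Illustrative only: configure a GLKView with 4x MSAA.
    // Switching the value to .multisampleNone brings the stair-step artifacts back.
    final class AAExperimentViewController: GLKViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            guard let glkView = view as? GLKView,
                  let context = EAGLContext(api: .openGLES2) else { return }
            glkView.context = context
            glkView.drawableMultisample = .multisample4X  // vs. .multisampleNone
        }
    }

(On current systems GLKit and OpenGL ES are deprecated in favor of Metal, where the equivalent setting is MTKView's sampleCount, but the comparison works the same way.)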
Build a test app with two images side by side, one antialiased and the other not. Let users pick the one they think looks better on their Retina display, and draw your conclusions from the results. If a clear majority of participants pick the antialiased image, then you certainly have a significant difference; otherwise it is safe to assume the difference doesn't matter to the people who use the app.
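As a sketch of how the two test images could be produced, you can render the same shallow line twice and toggle only Core Graphics antialiasing; the names and sizes are illustrative, and this uses the newer UIGraphicsImageRenderer API (at the time of the question you would have used UIGraphicsBeginImageContextWithOptions instead):

    import UIKit

    // Render a deliberately shallow line, with or without antialiasing.
    func makeTestImage(antialiased: Bool) -> UIImage {
        let size = CGSize(width: 200, height: 200)
        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { ctx in
            let cg = ctx.cgContext
            cg.setShouldAntialias(antialiased)   // the only difference between the two images
            cg.setStrokeColor(UIColor.black.cgColor)
            cg.setLineWidth(1)
            cg.move(to: CGPoint(x: 0, y: 100))
            cg.addLine(to: CGPoint(x: 200, y: 110))
            cg.strokePath()
        }
    }

    let smooth = makeTestImage(antialiased: true)
    let jagged = makeTestImage(antialiased: false)
    // Show `smooth` and `jagged` in two UIImageViews and let testers pick.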
Here's an article that suggests you need a resolution of 477 DPI to eliminate the ability to see pixels, higher than the 326 DPI of the iPhone 4 Retina display. Be sure also to follow the rebuttal link in the article. http://www.wired.com/gadgetlab/2010/06/iphone-4-retina/
I also remember reading an argument some time ago that anti-aliasing works better at higher resolutions up to a certain point; unfortunately I can't come up with a reference.
Edit: I still can't find the original reference I was thinking of, but John Gruber has compared the 326 DPI screen of the iPhone 4 to the 220 DPI of the Retina MacBook Pro and found the MacBook superior because of the text anti-aliasing. Look about halfway down in the article: http://daringfireball.net/2012/08/pixel_perfect
At some point the number of DPI is high enough that a "pixelated" line at such a high resolution will still look smooth. I'm not sure if Retina would be it. For applications like gaming, if you had a screen with 300 DPI or more, you would not need anti-aliasing for geometry (though things like textures and sprites would still need it, since the textures are stretched or shrunk when you approach objects in a 3D world or look at them from different angles).
Here's a great article on the subject: http://gamintafiles.wordpress.com/2012/03/12/when-anti-aliasing-is-no-longer-needed/
Yes, you still need it. If you really want to take advantage of the higher PPI, you will use antialiasing. The point of it is to provide the "bleed" that is necessary to make the image look its best in its analog form. The only reason the magical 300 PPI or DPI number makes a difference in print is that the dots bleed together somewhat. When you're dealing with the hard edges of an LCD pixel, you have to use antialiasing or you're still dealing with the digital attempt to communicate in the analog.
Since we're dealing with light-emitting pixels instead of light-reflecting pixels, the need is even higher, since the contrast of the hard boundaries on the screen is even more noticeable. Reflected light blends and bleeds together better than the same light intensity emitted directly from the source.
Antialiasing will be needed until we have high-resolution, organic, non-grid-based displays, preferably reflective in nature.
Nice question!
When I think of anti-aliasing, I think of a technique that was invented to compensate for pixels that were too big. Image details are spread to surrounding pixels because they are cut off on the pixel edge prematurely. Since you cannot see individual pixels on the retina display (from a certain distance anyway) I think anti-aliasing becomes irrelevant by definition.