iPhone 4: the end of pixels?
9th Jun 2010 | 09:20
Is Apple's claim of no visible pixels technically justified?
Apple has dubbed the iPhone 4's new higher-resolution screen "Retina Display". So what exactly does that mean, and are Steve Jobs' claims of "smooth and continuous graphics" – in other words, no visible pixels – technically justified?
It can't be denied that there's a certain amount of hype in Apple's presentation of the feature.
The term "retina display", for a start, has been borrowed from a slightly different context: conventionally, it refers to screenless display technologies that project images directly onto the back of the eye. Bumping up the number of pixels in an LCD panel is not really in this class of innovation.
The benefits have also been slightly exaggerated, though perhaps only for the legitimate purpose of clear illustration.
The typographic comparison used in the WWDC keynote and on the Apple website shows a "before" image with pixels about nine times larger than those in the "after" image: that is, each original block becomes a 3x3 grid of nine blocks. In reality, the new screen packs in only four times as many dots.
ABOVE: Apple's simulated comparison of iPhone 3GS vs iPhone 4 screen resolutions (left) vs TechRadar's. (Anti-aliasing mileage may vary.)
Only? That's still a heck of a lot of dots. The iPhone 4's pixel density comes out at 326ppi (pixels per inch), meaning each square inch of the display comprises a grid of 326x326 pixels. Anyone familiar with designing for print will spot that this exceeds the magic number of 300dpi, the standard resolution used for printed images.
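To put that density in physical terms, here's a quick back-of-envelope sketch in Python using the article's 326ppi figure – the pixel pitch (centre-to-centre spacing) and the total dot count per square inch:

```python
# Pixel pitch implied by a given pixel density.
# The 326ppi figure is the iPhone 4 value quoted in the article.
PPI = 326                      # pixels per linear inch
MM_PER_INCH = 25.4

pitch_mm = MM_PER_INCH / PPI   # width of one pixel in millimetres
pixels_per_sq_inch = PPI ** 2  # a square inch holds a 326 x 326 grid

print(f"pixel pitch: {pitch_mm:.4f} mm")                  # 0.0779 mm
print(f"pixels per square inch: {pixels_per_sq_inch:,}")  # 106,276
```

At under eight hundredths of a millimetre, each pixel is narrower than a human hair.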
Except that 300dpi isn't actually a magic number.
Since printing uses dots of four inks (cyan, magenta, yellow and black – CMYK) to produce each blob of a given colour on the page, there's quite a complex relationship between the resolution of the press and the effective resolution of the printed image.
Colour fidelity and tonal range are traded off against detail by adjusting the line screen used in halftoning continuous tone images such as photos.
Typical line screens are 133 or 150lpi (lines per inch) for magazines, somewhat less for newspapers.
This means what you're seeing in print has an effective resolution of no more than 150ppi, although because it's halftoned rather than pixellated it doesn't suffer the tell-tale blockiness of low-resolution digital images.
The reason designers output pictures at 300dpi is simply to avoid any loss of quality due to offsets between the image pixels and the halftone grid. It doesn't mean they're getting the full 300dpi on the page.
This makes 326ppi sound even better. But is it really, as Apple claims, "pixel density so high, your eye is unable to distinguish individual pixels"?
We thought that seemed like a testable claim, so we asked an expert in display technology to help us check it.
Formerly at Sun and now presiding over wonderfully named startup Photon Wall, Michael F Deering has spent years working on the physics of digital displays and human vision, and has established a sound basis for calculating just how much detail is needed to satisfy the eye.
We've summarised the main factors in Deering's calculations below, and it's fascinating stuff.
For those who prefer to cut to the chase, however, the smallest element most human eyes can resolve at 25cm – our estimated typical viewing distance for a 3.5in screen – is about 1.5 arcminutes, while the iPhone 4's pixels are comfortably smaller at one arcminute. In short: believe the hype.
There's just one wrinkle that makes us reserve judgement, until we see it for ourselves, on whether the new screen really will be, as Apple puts it, "something that looks to your eye like you're holding a printed page in your hand".
Remember our line screen/ppi figures? They only apply to colour images, and what humans like to look at most of all isn't pictures – it's words.
In print, solid black (100% K) text isn't halftoned, but uses the full resolution of the platemaking system, up to 2400dpi. Even allowing for dot gain (ink spread), that enables a degree of crispness that even the best digital displays struggle to match.
What about the iPad?
Any iPad user will tell you, for example, that text as small as that in the average magazine is almost illegible.
Larger point sizes have been adopted by all the major titles that have designed their iPad editions from scratch, while magazines repurposed directly from the printed pages often leave the reader zooming to cope with body text. Yet the iPad's display resolution works out at 132ppi – essentially the same as a magazine line screen.
So we may need more pixels than we think. At 326ppi, the iPhone 4's linear resolution is roughly 2.5 times the iPad's – around six times as many dots in every square inch – a substantial improvement, and quite possibly enough to finally bring print-quality text to the screen. We're holding our breath.
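The distinction between linear resolution and dots per square inch is easy to muddle, so here's the arithmetic spelled out, using the 132ppi and 326ppi figures from the article:

```python
# iPad vs iPhone 4 pixel density (figures as quoted in the article).
ipad_ppi = 132
iphone4_ppi = 326

linear_ratio = iphone4_ppi / ipad_ppi  # finer in each direction
area_ratio = linear_ratio ** 2         # more dots per square inch

print(f"linear resolution: {linear_ratio:.2f}x")   # 2.47x
print(f"dots per square inch: {area_ratio:.2f}x")  # 6.10x
```

So the iPhone 4 squeezes about six iPad pixels into the footprint of one.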
Finally, there's "just one more thing" that will contribute to the iPhone 4's perceived display quality. As one tech industry pundit put it:
"The iPhone 4 retina display is going to look friggin' AWESOME when the glass is covered in fingerprint grease, right?"
A fair point: with the same oleophobic coating as the iPhone 3GS and iPad, the iPhone 4 will be less susceptible to smearing than some touchscreens but by no means immune.
Still, you can't have everything, and on balance we'd rather see paw-prints than pixels. They're a lot easier to wipe off.
How many pixels is enough?
Visual perception expert Michael F Deering points out that the number of pixels per inch tells us nothing by itself.
Display quality is hugely affected by how far away the screen is from your eye. After all, an HD movie looks just as good on a cinema screen a storey and a half high as it does on your living room LCD TV, even though it has no more pixels.
Deering has an HDTV projector at home with a resolution that works out at less than 16ppi, but from two to three metres away it looks fine. Similarly, text on your 100ppi iMac screen looks as clear as on your 163ppi iPhone 3GS – because it's further away.
So instead of looking at the device in isolation and measuring how far apart the pixels are, Deering's formula is based on field of view (FOV).
"The 'resolution' of a human eye is measured by the smallest FOV that person can resolve." A real world example would be distinguishing the bars of a capital E in an eye test; in scientific studies, a visual sine wave pattern is used, with half a cycle representing the smallest FOV.
There are substantial differences between individuals, so we'll need to generalise. For someone with 20/10 vision ("probably Chuck Yeager", suggests Deering – this level of visual acuity is rare) the FOV is half a minute of arc, or 1/120 of a degree, while 'normal' 20/20 vision is only half as precise at one minute of arc.
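Those acuity figures scale linearly with the Snellen denominator, so the conversion can be sketched in a couple of lines (a simplification based on the article's figures – 20/20 corresponding to one arcminute – not a clinical model):

```python
def smallest_fov_arcmin(snellen_denominator: float) -> float:
    """Smallest resolvable feature, in minutes of arc, for 20/x vision.

    Assumes the article's calibration: 20/20 vision resolves one
    arcminute, and acuity scales linearly with the denominator.
    """
    return snellen_denominator / 20.0

print(smallest_fov_arcmin(20))  # 1.0 arcmin: 'normal' 20/20 vision
print(smallest_fov_arcmin(10))  # 0.5 arcmin: rare 20/10 vision
```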
Notice that, because we're now looking outward from the viewer rather than measuring pixels across a screen, we need to think in terms of angles, not lengths.
Other factors affecting sight
In practice, less than ideal lighting further compromises accuracy, and then there's the question of how wide your eye is at the time.
We can vary our pupils between about 2 and 6mm; narrower is better, but "a 2mm pupil is starting to bring in considerable distortions from quantum mechanical effects", notes Deering. There's much more on all this, if you care to explore, in an expanded version of a paper presented by Deering at the 2005 SIGGRAPH conference.
Also to be taken into account is the fact that we don't see equally well across the whole retina. "In the fovea – a small area at the centre – are just a few dozen cones [photoreceptor cells that can distinguish colour] of the smallest size.
One degree away from the centre, the cones have twice the area. Further away they have twice the width (four times the area), and large gaps between them for the rods [non-colour-sensitive photoreceptors].
After a few more degrees, individual cones don't report directly to the brain, but groups of cones, vastly lowering the resolution again. When all this is taken into account, the five million cones in one eye report only half a million 'spots'."
Clearly, establishing a typical FOV is no simple matter, but to cut a long story short it comes out at about 1.5 minutes of arc.
Crucially, the FOV changes with the distance from eye to object, and, as is obvious if you mentally picture it, the angle gets smaller the further away the object is – so as a starting point we need to establish not just the size and resolution of the display but how far it is from the viewer. We agreed with Deering that people tend to hold their iPhones about 25cm from their face.
The FOV is calculated as the inverse sine of the width of a single pixel (the distance between its left and right edges) divided by the distance from the observer's eye to the pixel.
In the case of an iPhone 4 at 25cm, it works out at just about exactly one minute of arc. As we've seen, that's comfortably finer than the vast majority of people can resolve, and thus the new screen does indeed offer "pixel density so high, your eye is unable to distinguish individual pixels". QED.
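The calculation above can be reproduced in a few lines of Python, using the article's figures of 326ppi and a 25cm viewing distance (for angles this small, the inverse sine and inverse tangent agree to many decimal places):

```python
import math

# Angular size (FOV) of one iPhone 4 pixel at a 25cm viewing distance.
PPI = 326                        # pixels per inch, per the article
pixel_width_mm = 25.4 / PPI      # ~0.078 mm per pixel
viewing_distance_mm = 250        # 25 cm, the assumed typical distance

# Inverse sine of pixel width over viewing distance, as in the formula above.
fov_rad = math.asin(pixel_width_mm / viewing_distance_mm)
fov_arcmin = math.degrees(fov_rad) * 60

print(f"pixel FOV: {fov_arcmin:.2f} arcminutes")  # 1.07
```

Just over one arcminute – comfortably below the roughly 1.5 arcminutes a typical eye can resolve.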