The Reality Distortion Field is back

by Kenny Hemphill on June 29, 2010

Steve Jobs returned to form at his WWDC keynote, with his Reality Distortion Field at full strength again for the launch of the iPhone 4.

Steve Jobs’ WWDC keynote speech last month was notable not just for the announcement of the iPhone 4, exciting though that was. And, although some of the stats he reeled off regarding download sales and the sales of the iPad were dizzying, it wasn’t those that brought a smile to my face, either.

What had me beaming like a keynote front row fanboy was the return of something that has gone missing in recent speeches: the Steve Jobs Reality Distortion Field, or SJRDF as we geeks like to refer to it.

Perhaps it has been absent because Apple has been too busy releasing kit that regularly outperforms even the most outlandish sales forecasts, or because once you’ve seen an iPad demo, there’s really no point in hyperbole. Or maybe it’s just because Jobs couldn’t be bothered. Whatever the reason, there has been a noticeable decline in RDF levels in recent keynote announcements.

That all changed on 7 June, and how. The needle on MacUser’s RDF detector was stuck in the red zone for almost two full hours as Jobs described just how amazing Apple thinks the new iPhone 4 really is.

The first candidate for our detector was the new iPhone’s display. It’s exactly the same size as the one in the previous three iPhones, but now has a resolution of 960 x 640 pixels, giving it a pixel density of 326ppi. It also uses an in-plane switching (IPS) panel, something that, until the past couple of years, was only seen on large, high-end displays made by the likes of Eizo and aimed at professionals for whom accurate colour rendition is critical. One of the features of IPS displays is that they can display true 24-bit colour, rather than approximating it with dithering as cheaper non-IPS LCD panels do.
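For what it’s worth, the arithmetic behind that headline figure is simple enough. Here’s a minimal sketch, taking the nominal 3.5in diagonal at face value:

\[ \text{ppi} = \frac{\sqrt{960^2 + 640^2}}{3.5\,\text{in}} \approx \frac{1154}{3.5} \approx 330 \]

Apple’s quoted 326ppi implies an effective diagonal closer to 3.54in, so treat the nominal screen size as a rounded figure.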

It was, however, that pixel density that sent our SJRDF detector into overdrive. According to Jobs, 300ppi is the maximum pixel density that can be perceived by the human retina, and the iPhone 4’s 326ppi ‘comfortably exceeds that’. To prove his point he spent several uncomfortable minutes showing comparisons of text and images on an iPhone 3GS and iPhone 4 – using a special projector because, hey, normal projectors just can’t display that level of awesomeness. He willed the audience to see the differences, to appreciate the reduction in fuzziness on the new screen, but there was a distinct lack of whoops, unlike during the rest of the demo. Jobs even referred to apps that contain high-resolution artwork for the new display as ‘retina apps’ – surely worthy of an immediate entry into the top 10 SJRDFisms.

It took me no longer than two minutes on Google to establish that the actual limit of the human retina is far less easily pinned down, and depends not just on the distance of the screen from the eyes, as Jobs said, but on the angle each pixel subtends at that distance. Who’d have thought it? There is no single limit that can usefully be expressed in pixels per inch. That’s because, as Raymond Soneira, president of DisplayMate Technologies and the holder of a PhD in theoretical physics, told wired.com, the eye has ‘an angular resolution of 50 cycles per degree’. To compare that angular resolution with pixels on a screen, you have to convert it using an assumed viewing distance – and one reasonable set of assumptions gives a result of 426ppi.
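To make that conversion concrete, here is a rough sketch. The factor of two pixels per cycle (the Nyquist limit) is standard; the viewing distance is our assumption, since the figure quoted above doesn’t state one:

\[ 50\,\tfrac{\text{cycles}}{\text{degree}} \times 2\,\tfrac{\text{pixels}}{\text{cycle}} = 100\,\tfrac{\text{pixels}}{\text{degree}}, \qquad \text{ppi} = \frac{100}{2d\tan(0.5^\circ)} \approx \frac{100}{0.01745\,d} \]

With d in inches, a viewing distance of about 13.4in gives roughly the 426ppi figure above, while a closer 12in pushes the requirement to around 477ppi. Either way, the ‘limit’ moves with the distance you assume – which is rather the point.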

It wasn’t just in his description of the display where Jobs ramped up the RDF. The new five-megapixel camera got the full treatment, too. Jobs explained that one of the technologies that helps the camera produce great pictures is the ‘backside illuminated sensor’. The what? It turns out the backside of the sensor isn’t illuminated by anything other than the light falling on the subject of the photo. The term simply means that the light coming from the lens falls on the opposite side of the sensor from the chip’s circuitry (the ‘backside’), so the circuitry doesn’t occupy space on the important bit. That leaves a greater light-sensitive area, increases the sensitivity of the chip as a whole and directly improves low-light performance.

Then there’s that LED flash. Sure, any flash is better than no flash, and LED is preferable for a number of reasons, not least its low power consumption. But a flash mounted above the lens will do the same thing, LED or not: light the subject by shining a harsh, small light directly at it, blowing out highlights and producing deep shadows.

A far better target for the SJRDF would have been the pixel pitch of the sensor, which is the same as that used in Sony’s most recent compact cameras and is a much better indicator of image quality than either resolution or the type of flash. Perhaps Jobs had had enough of pixel density by that point, or perhaps the SJRDF is a little rusty.

That would hardly be surprising, given its lack of use in recent years. Now that it’s back up and running, though, I can’t wait to see how it’s employed for the announcement of new iPods in September.

