If you’re a regular reader of this column, you’ll have noticed that I tend to be a defender of 3D.
That’s not because I particularly love 3D. I enjoy it when it’s good, find it annoying when it’s bad — just like acting and every other part of a performance. It just became clear to me long ago that 3D is the “killer app” for digital cinema. It follows that since Hollywood is committed to d-cinema, 3D is here to stay, barring some intervening crisis like a health scare. I just end up spending a lot of time pointing out that the arguments against it have been illogical, counterfactual or both.
On the other hand, I have been something of a skeptic about 4K, the super-high-def digital cinema format that puts four times as many pixels on a movie screen, simply because there didn’t seem to be a clear case for it. It looks better, but not good enough to be a premium format like Imax or 3D. James Cameron, whose credentials as a tech pioneer are impeccable, has ridiculed the format. 4K has seemed like a solution in search of a problem.
But as exhibs have begun installing 4K projectors, it’s become clear that 4K is coming for real, and earlier this week, Sony entertainment technology prexy Chris Cookson finally made a case for 4K that I can follow.
“4K is important because theaters have changed,” explained Cookson. The issue is how close viewers sit to the screen. Before stadium seating, theaters were “long, skinny boxes,” and most viewers sat at least three screen-heights back. From that distance, someone with 20/20 vision can’t resolve any more detail than about 2K puts on the screen.
That’s part of the logic behind the HDTV standards, says Cookson. “High-def television was a reasonable simulation of the image that you were able to perceive on the screen in a movie theater from 30 years ago. Whether there was greater resolution on the screen didn’t matter; that’s all you could see.”
“If you go into an old classic theater and sit in the back row, which is seven screen-heights away from the screen,” he says, “you shouldn’t be able to tell whether you’re watching 4K, 2K or standard def, because your eye becomes the limiting factor.”
But in a stadium-seating theater, the back wall is about three screen-heights from the screen. Most viewers sit much closer, within 1½ screen-heights from the screen. “That’s where you need 4K,” says Cookson. “The modern stadium-seating theater puts you too close to too large a screen for 2K to be adequate.”
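Cookson’s thresholds follow from simple viewing-angle arithmetic: 20/20 vision resolves roughly one arcminute, so the number of pixels a viewer can distinguish is just the screen’s angular width divided by that acuity. A minimal sketch in Python (the function name, the 1.85:1 aspect ratio, and the one-arcminute figure are illustrative assumptions, not anything from Cookson):

```python
import math

def resolvable_pixels(screen_heights, aspect=1.85, acuity_arcmin=1.0):
    """Rough count of horizontal pixels a viewer can resolve.

    screen_heights -- viewing distance, expressed in screen heights
    aspect         -- screen width divided by height (1.85:1 assumed)
    acuity_arcmin  -- smallest resolvable angle; ~1 arcminute for 20/20
    """
    # Angular width of the screen as seen from the viewer's seat.
    width_rad = 2 * math.atan(aspect / (2 * screen_heights))
    width_arcmin = math.degrees(width_rad) * 60
    return width_arcmin / acuity_arcmin

# Back row of a classic theater, three screen-heights out:
print(round(resolvable_pixels(3)))    # roughly 2,100 pixels across
# Stadium seating, 1.5 screen-heights out:
print(round(resolvable_pixels(1.5)))  # roughly 3,800 pixels across
```

At three screen-heights the eye tops out near 2K’s 2,048-pixel width; at 1½ screen-heights it approaches 4K’s 4,096, which is exactly the gap Cookson describes.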
This poses a conundrum for technologists. If most viewers simply can’t resolve more than about 4K’s worth of pixels in today’s movie theaters, then putting still more pixels on the screen isn’t going to improve the viewing experience significantly.
Yet movies are under pressure to continue to improve as the home theaters get bigger, sharper screens and truer sound. So what to do?
Higher frame rates are one answer, and in fact some TV networks already broadcast their HD feeds at 60 frames per second, far beyond the movies’ 24 fps. Yet Cookson notes that audiences sometimes find true 60 frame-per-second images “too live.”
“There’s what some people call ‘the proscenium effect,’” he says, “artifacts that separate what you’re seeing from reality in ways that help with willing suspension of disbelief.” There is a rising debate on this issue among filmmakers and technologists.
In fact, one of the more cogent arguments against 3D is that it interferes with this very effect, adding too much reality to a filmed image. That may be why 3D works so well with animation; animated images gain impact from the added realism without becoming “too real.”
So where’s the next improvement in the viewing experience coming from? More on that in future columns.
Want to comment or suggest a column topic? Email firstname.lastname@example.org