Monday, August 28, 2017

Why 4K in the home is dumb, according to cinematographer Steve Yedlin...

Stu Maschwitz, of ProLost fame, long ago wrote a very informative blog post about why having a 4K-resolution screen in your home is stupid. The full post is available here: https://prolost.com/blog/2013/1/22/4k-in-the-home.html?rq=4k%20in%20the%20home

Recently, I have been communicating with cinematographer Steve Yedlin (DP for Star Wars: The Last Jedi, Looper, Brick, etc.) on Twitter about all things camera-related. Steve is known in the industry for being very technically minded, and he has conducted numerous technical camera and post-production tests. His latest test deals with resolution, comparing formats from IMAX film down to the Arri Alexa, and I think you could benefit from watching it. That presentation can be found here: http://yedlin.net/ResDemo/

Anyway, I was thinking more about spatial resolution, and while researching it I found an article that Mr. Yedlin authored on that very subject. I have pasted some of the text here (with a couple of quick illustrative sketches of my own after the quote), but the rest of his article can be found here: http://www.yedlin.net/BigK_2014.html

"A recent NY Times article (http://ow.ly/Ct58Y) troubled me when it stated with brazen confidence that “There is no doubt that with the right video playing, 4k simply looks better than an HD TV.”  Even in a top newspaper known for journalistic rigor (not just a tech review site), marketing deception has been given preference over rigorous science.
The truth is that given a fair comparison with controlled variables, 4k resolution is nearly indistinguishable (or perhaps completely indistinguishable) from HD TV resolution in normal viewing conditions, and the TV manufacturers are in large part selling snake oil to consumers.
The unchallenged fact of image science is that when you’re watching TV in a real-world situation (meaning, say, ten feet away on your couch, not 6 inches away with a magnifying glass) there’s a ceiling to the amount of resolution the human eye can perceive.  That ceiling was surpassed when we moved from SD to HD.  When you move beyond that ceiling, increased resolution does not translate into better perceived quality or more sharpness. (For tech details on this, start here: http://ow.ly/CBZme).
A fair and controlled comparison between HD and 4k shows almost no difference for normal TV viewing conditions, and in fact “4k” content often comes from 2k source footage, and almost always has artificial sharpening added to it in mastering — that’s fake sharpening that is (technically speaking) a degradation, but adds perceptual sharpness. The sharpening would look the same if applied to an HD image instead of a 4k image.  
Why would 4k content distributors REDUCE quality on the very format that they're advertising as having more quality!? It's because they know that their claims are incompatible with reality: that true 4k resolution doesn’t actually look appreciably different to a viewer than HD resolution. So marketing demands that something be done to alter the image and make a striking visual difference to sell the gimmick, even if the procedure undermines the very claim being proffered: increased resolution or "quality."
So, going to the electronics store and comparing HD and 4k TVs is not an even-handed test, because 4k-mastered content and 4k upscaling have gimmicks built in to create a fake wow factor that is neither real resolving power nor the image that the author (director, cinematographer, etc.) intended. I know from firsthand experience that this is the case with 4k-mastered content, and I would speculate (but don’t know for certain) that it’s also the case with the TVs themselves: that they add sharpening and other fakery when upscaling HD images to 4k, or perhaps even when just displaying 4k images.
I know this is a nerdy thing to write a long post about, but it’s frustrating to see TV manufacturers duping consumers into "upgrading" their technology by putting a bigger number in front of the letter K and representing it as the “quality” of an image, when in fact the only practical difference from HD in many or all cases is that it's been degraded with artificial edge contrast (which, if applied to an HD image, would make it look just as “sharp” as their 4k content.)
High pixel-density screens are useful for computer monitors and smartphones, because those are screens that you view at a very close distance while concentrating on only a small detail of the screen: you read fine text, you look at a photo embedded in a webpage, etc. But high pixel-density screens don't have practical application as big-screen TVs that are living-room centerpieces, especially while there are other (actually visible!) attributes of contemporary home theaters that could stand to be improved, including compression artifacts, which are much, much larger than single pixels (though that's a topic for a different time).
I hope this information can contribute to more fair and balanced analysis in the media, and that the public dialog will question artificially simplistic representations of technical data by manufacturers."
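
To put rough numbers on the perceptual "ceiling" Yedlin describes, here is a quick back-of-the-envelope sketch. The screen size, couch distance, and the ~60 pixels-per-degree figure for 20/20 acuity are my own assumptions for illustration, not numbers from his article.

```python
import math

# Back-of-the-envelope check of the viewing-distance argument.
# Assumptions (mine, not Yedlin's): a 65-inch 16:9 TV, a 10-foot couch
# distance, and ~60 pixels per degree (about 1 arcminute per pixel) as the
# commonly cited limit of 20/20 visual acuity.

def pixels_per_degree(horizontal_pixels, diagonal_in, distance_in, aspect=(16, 9)):
    """Angular pixel density the screen presents to a viewer at this distance."""
    aw, ah = aspect
    screen_width_in = diagonal_in * aw / math.hypot(aw, ah)
    pixel_pitch_in = screen_width_in / horizontal_pixels
    # Angle subtended by a single pixel, in degrees.
    pixel_angle_deg = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_in)))
    return 1.0 / pixel_angle_deg

DIAGONAL_IN = 65          # hypothetical screen size
DISTANCE_IN = 10 * 12     # ten feet, in inches
ACUITY_PPD = 60           # approximate 20/20 acuity limit

for label, width in [("HD (1920x1080)", 1920), ("4K (3840x2160)", 3840)]:
    ppd = pixels_per_degree(width, DIAGONAL_IN, DISTANCE_IN)
    verdict = "above" if ppd >= ACUITY_PPD else "below"
    print(f"{label}: {ppd:5.0f} pixels/degree ({verdict} the ~{ACUITY_PPD} ppd limit)")
```

Under those assumptions, plain HD already lands above the acuity limit at ten feet, which is Yedlin's point that the ceiling was surpassed when we moved from SD to HD.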
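
And here is a rough illustration of the kind of artificial edge contrast he's talking about, using an unsharp mask in Pillow. The file names and filter settings below are placeholders of my own; the point is simply that the same sharpening applied directly to an HD frame produces the same perceptual "pop" as a sharpened 4k master or upscale.

```python
from PIL import Image, ImageFilter

# Hypothetical file name -- substitute your own frame.
hd = Image.open("frame_hd_1920x1080.png")

# Naive upscale to UHD, then the kind of artificial edge contrast
# (unsharp mask) that mastering or a TV's "enhancement" mode might add.
uhd_sharpened = (
    hd.resize((3840, 2160), Image.BICUBIC)
      .filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
)

# The very same sharpening applied directly to the HD frame.
hd_sharpened = hd.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))

uhd_sharpened.save("uhd_upscaled_and_sharpened.png")
hd_sharpened.save("hd_sharpened.png")
```

View the two outputs at the same screen size from across the room and, per Yedlin's argument, the extra pixels aren't what you're reacting to; the edge contrast is.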
