If there’s one industry on Earth that thrives on finding the newest, hottest technologies and jumping in head-first, it’s the video production industry. Photography has been able to rest on its laurels, rarely needing large jumps, because the technology itself doesn’t always improve the result: you can still shoot a film camera from 20 years ago and produce results that hold up today (albeit with a more rigorous workflow). With video, it’s different.
Think back to films from the 1980s: their special effects are fairly easy to pick out these days. Whether the scene features monsters, talking heads driving in a car, or space travel, the technology that shot and edited those scenes was limited. Follow the progress and you’ll see runs of similar films, until a movie comes along that ups the ante, typically because of a newly utilized technology.
While no one can argue CG has drastically altered storytelling in film (Star Wars, anyone?), the means by which we shoot and capture movies has also changed our possibilities. No matter how good your CG department, Gollum wouldn’t have looked so real if The Lord of the Rings had been shot on the same cameras as Back to the Future. Filmmaker J.J. Abrams uses anamorphic lenses in his Star Trek reboots to mimic that stylized lens-flare look, but still captures the images on very high-quality cameras to ensure a beautiful HD picture. We’re using newer tech with the ability to mimic legacy film styles, while always retaining the crystal-clear integrity of today’s technology.
Enter 4K, or UHD. As a brief explanation, 4K is available in two main types so far: 3840×2160 (UHD, sometimes called Quad Full HD) and 4096×2160, or true 4K. For anyone not living under a rock since RED first hit the scene, this means frames 2x as tall and 2x as wide as a standard HD picture (1920×1080). It’s essentially to video what shooting RAW is to shooting JPEG on a DSLR.
“So, there’s a bigger picture, big deal!”
This doesn’t just mean the picture itself is larger; it means you’re capturing roughly 4x the information: 4x the color samples, 4x the detail, and so on. As Adam Wilt explains in his article on DVInfo.net, quoted here:
“Prior to NAB, I was under the impression that 4K would happen about twice as fast as the HD transition did. Why so fast? We already have the cameras, we already have the displays… everything between them, by comparison, is just software.
In Ye Olden Days, every part of the production, storage, postproduction, and transmission chain was built around analog hardware following well-defined standards: 3.58 MHz subcarrier, 13.5 MHz digital sampling; format-specific tape decks, NTSC II encoding and OTA transmission. Moving to HD required replacing all of that with something new.
Now? Sensors and displays are hardware, but the stuff in the middle is a string of ones and zeroes. There aren’t hardware vision mixers any more, just T-handles driving encoders that tell DSPs what proportion of channel A to composite with channel B. A hard drive doesn’t care if it’s storing 720p, 1080i, 1080p or 2160p, or whether the images refresh at 23.98 Hz, 50 Hz, or 59.94 Hz. You can wrap anything in a broadcast transport stream; it’s just bits.
But having seen NAB, I’m now betting on an even faster transition. Almost every bit (pun intended) of that in-between stuff was available in a 4K version somewhere at the show… All of these are available now.” – Adam Wilt, DVInfo.net
It’s all just information to our capture devices, and if you can work with more information, why wouldn’t you?
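The "roughly 4x the information" claim is easy to verify with simple arithmetic on the resolutions discussed above; here’s a quick sketch (the format names are just labels for illustration):

```python
# Pixel counts per frame for the formats discussed above.
formats = {
    "HD (1080p)": (1920, 1080),
    "UHD":        (3840, 2160),
    "True 4K":    (4096, 2160),
}

hd_pixels = 1920 * 1080  # 2,073,600 pixels per HD frame

for name, (w, h) in formats.items():
    pixels = w * h
    # UHD works out to exactly 4x HD; true 4K is slightly more.
    print(f"{name}: {w}x{h} = {pixels:,} pixels "
          f"({pixels / hd_pixels:.2f}x HD)")
```

UHD is exactly four HD frames’ worth of pixels (2x the width times 2x the height), while true 4K’s wider frame comes out to about 4.27x.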
“But I’ve already got everything I need for shooting in HD, why change to 4K now?”
I wouldn’t recommend selling all your trusty HD cameras and recording devices to upgrade immediately, and there are pieces of the puzzle out there now that mean you don’t have to. Blackmagic Design introduced their Production Switcher 4K, which I have already written about, which lets you switch either HD or 4K, and it’s available now. On top of packing some great features into a rack-mount switcher at $2,000 (as of this writing), they recognize that the transition to 4K might be slower for some people, especially since consumer-ready 4K monitors and televisions are fairly cost-prohibitive at the moment.
But it’s coming. Nothing I can show you in an online article will truly capture the difference between a 4K monitor fed a proper 4K signal and an HD monitor fed an HD signal. The best comparison I can make, having seen it first-hand, is that the jump from HD to 4K is like the jump from an SD television to an LED HD TV. There’s an intangible beauty and brightness to it that simply must be experienced in person.
Besides, as content creators, keeping up with newer technologies and producing the most beautiful video possible isn’t only our desire; it’s our job, our mandate from those who hire us or pay our salaries. Unlike 3D, which has been sparsely used and even more rarely used well, 4K is a technology most of our infrastructures are already able to utilize to some degree. And with some production-side 4K monitors already priced below $2,000 (as of this writing), it’s really becoming a lot more available than you might think. Couple that with Blackmagic Design’s 4K Production Camera and 6G-SDI cables, and we’ve got a familiar setup, too.
So maybe we won’t all have UHD players and TVs in the next year, but it would certainly benefit you to feel out how you might eventually make the jump yourself. You can always downconvert footage to a previous standard (HD, SD), but it’s a lot tougher going the other way around.
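That "downconvert" step is already a one-liner in common tools. As a minimal sketch using ffmpeg (the filenames are hypothetical, and this assumes a UHD source file and a working ffmpeg install):

```shell
# Downscale a UHD (3840x2160) source to HD (1920x1080).
# Going this direction preserves quality well; upscaling
# HD to 4K can't recover detail that was never captured.
ffmpeg -i source_uhd.mp4 \
       -vf scale=1920:1080 \
       -c:v libx264 -crf 18 \
       -c:a copy \
       delivery_hd.mp4
```

Shooting 4K now means your masters can feed HD deliverables today and 4K deliverables later, which is the practical upside of the asymmetry above.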