4K? Maybe So, Maybe Not.

These frames illustrate two ways of using UHD/4K originals in production: the “scaled” one retains the framing of the original for converting an entire show to alternate products (above); the “cropped” one retains the resolution of the original static UHD shot for use in a lower-resolution product (top).

Unless you’ve been in a cave or under a rock for the past year, you’re probably thinking that you need to replace your camera with one that shoots and records UltraHD or 4K. Some of us might really need to, but, as usual, the decision isn’t cut and dried. The answer is the one you’ll always get from a technical advisor: “It depends.”

It depends on what you shoot, how you shoot it, how you edit it and, most importantly, where it will end up. It also depends on how you’re currently shooting, who decides if you can keep shooting that way and how much you make for your work. For example, you may have to upgrade just to stay competitive in your market. The question is, how much is it worth to you and to your client?

There are several benefits to shooting at approximately four times HD resolution, but they come with added costs. Whether the trade-off makes sense for you will probably be determined by the quality requirements of the primary product, the available edit system(s), the skills of the shooter and the editor, and the expected life span of the original footage.

Myth: It’s time for everyone to move to 4K

Changing your format may not be an all-or-nothing decision; it might make sense to upgrade your camera to improve the quality of the raw footage, but you might not want to upgrade every step in your workflow.

Shooting at four times HD resolution may improve the performance of a “full HD” recording because the oversampling of a “true” UHD camera system can improve the “depth of modulation,” or perceived sharpness, of an image down-converted to HD. An improvement isn’t guaranteed, however, because delivered resolution depends on many factors: the skill of the operator, the quality of the lens elements, the performance of the camera sensor and processing electronics, the video compression encoder, the recording data rate, the video decompression decoder, the quality of any transmission links in the path from the camera to the display and, of course, the display itself.

The real-world performance gain of 4K over HD will also differ among consumer, prosumer and professional equipment. Whether the difference matters depends on what you’re doing with the footage and to whom you’re selling it.

An old adage in the video business states: “The only important resolution measurement is the marketing resolution.” That means you can’t sell an improvement if buyers don’t think they can see a difference. That observation also can be applied to data rates, the number of reproducible colors and any other measure of video system performance.

If your output product is so far below UHD that all the improvements are lost, what’s the point of using a fancy UHD camera? On the other hand, some improvements survive many levels of processing and still look better than cheap capture.

The important questions to ask relate to the trade-offs among the processing elements and to the best places to spend your data budget. Nearly every stage in the video chain between the lens and the viewer is digital now, and each stage requires decisions about how many pixels, or storage bytes, or transmission bits, or processing operations per second are really visible to the viewer at the end of the chain.

You can increase the number of megapixels and not increase the perceived sharpness, but you might reduce distracting artifacts. You also might make more improvements by using a better compression codec. Or, you might be better off capturing more levels of brightness in the original image and carrying that through to postproduction, so the editor can treat your video more like a film negative. And, of course, the size of the screen and the viewing distance will determine whether the viewer can see any of that resolution (or perceived sharpness) that you fought so hard to capture, record, edit and transmit.

The bottom line: will the viewer see a difference and pay extra for it, given that the improvement carries an incremental cost? Even if there’s no marketable difference in the primary product, an upgrade may still make sense if it increases the value of the archival originals for aftermarket use in subsequent products.

Obviously, a thorough review and analysis of the production process, deliverables and operational issues, plus testing in appropriate circumstances, may be necessary to determine the value of any potential upgrade in the camera, in the workflow chain or in the delivery process.

In the next few columns, we’ll try to deal with several of these questions in more detail and, hopefully, give you a bit more insight into whether you should take that fork in the road when you come to it.

C.R. Caillouet is a technical producer and video engineer who has worked in TV production, from preproduction through field acquisition to postproduction and presentation, as well as for NASA, Sony and Panasonic. He’s currently Technical Director of the Jackson Hole Wildlife Film Festival and Science Media Symposium.