4K vs. UHD: What’s the difference?

Now that 4K is becoming a bit more mainstream, with both HDTVs and computer monitors approaching reasonable prices, let’s look at two terms that have become increasingly conflated with one another: 4K and UHD, or Ultra HD.

TV makers, broadcasters, and tech blogs are using them interchangeably, but they didn’t start as the same thing, and technically still aren’t. From a viewer standpoint, there isn’t a huge difference, and the short answer is that 4K is sticking, and UHD isn’t. But there’s a little more to the story.

4K vs. UHD

The simplest way of defining the difference between 4K and UHD is this: 4K is a professional production and cinema standard, while UHD is a consumer display and broadcast standard. To discover how they became so confused, let’s look at the history of the two terms.

The term “4K” originally derives from the Digital Cinema Initiatives (DCI), a consortium of motion picture studios that standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160, which has exactly four times the pixel count of the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG 2000, can have a bitrate of up to 250Mbps, and uses 12-bit 4:4:4 color. (See: How digital technology is reinventing cinema.)
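
If you want to check the “four times 2K” arithmetic yourself, here’s a minimal Python sketch. Nothing in it is specific to any product; the resolution constants are simply the DCI figures quoted above.

```python
# Compare the total pixel counts of a DCI 2K frame and a DCI 4K frame.
DCI_2K = (2048, 1080)   # DCI 2K: width x height in pixels
DCI_4K = (4096, 2160)   # DCI 4K: width x height in pixels

pixels_2k = DCI_2K[0] * DCI_2K[1]   # 2,211,840 pixels
pixels_4k = DCI_4K[0] * DCI_4K[1]   # 8,847,360 pixels

print(f"DCI 2K: {pixels_2k:,} pixels")
print(f"DCI 4K: {pixels_4k:,} pixels")
print(f"Ratio:  {pixels_4k / pixels_2k:.0f}x")   # exactly 4x
```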

Ultra High Definition, or UHD for short, is the next step up from what’s called full HD, the official name for the 1,920-by-1,080 display resolution. UHD quadruples that pixel count, to 3,840 by 2,160. It’s not the same as the 4K standard described above, and yet almost every TV or monitor you see advertised as 4K is actually UHD. Sure, there are some panels out there that are 4,096 by 2,160, which works out to an aspect ratio of roughly 1.9:1. But the vast majority are 3,840 by 2,160, for a 1.78:1 (16:9) aspect ratio.
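
To see where the 1.78:1 and 1.9:1 figures come from, here’s a small Python sketch that prints pixel counts and aspect ratios for the three resolutions mentioned above; the numbers are taken straight from the paragraph, not from any particular display.

```python
from math import gcd

# Pixel counts and aspect ratios for full HD, consumer UHD, and DCI 4K.
resolutions = {
    "Full HD": (1920, 1080),
    "UHD":     (3840, 2160),
    "DCI 4K":  (4096, 2160),
}

for name, (w, h) in resolutions.items():
    d = gcd(w, h)
    print(f"{name}: {w}x{h} = {w * h:,} pixels, "
          f"aspect ratio {w // d}:{h // d} (~{w / h:.2f}:1)")

# UHD has exactly four times the pixels of full HD (double in each dimension),
# while DCI 4K is slightly wider: ~1.90:1 instead of 16:9 (~1.78:1).
print("UHD / Full HD pixel ratio:", (3840 * 2160) / (1920 * 1080))   # 4.0
```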

Why not 2160p?

Now, it’s not as if TV manufacturers aren’t aware of the difference between 4K and UHD. But presumably for marketing reasons, they seem to be sticking with 4K. To avoid clashing with the DCI’s actual 4K standard, some TV makers use the phrase “4K UHD,” though others simply use “4K.”

To make matters more confusing, UHD is actually split in two — there’s 3,840 by 2,160, and then there’s a big step up, to 7,680 by 4,320, which is also called UHD. It’s reasonable to refer to these two UHD variants as 4K UHD and 8K UHD — but, to be more precise, the 8K UHD spec should probably be renamed QUHD (Quad Ultra HD). (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)
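
A quick back-of-the-envelope calculation, again using only the resolutions quoted above, shows why “Quad” fits: 8K UHD has exactly four times the pixels of 4K UHD (and sixteen times full HD).

```python
# Pixel-count ratios between full HD, 4K UHD, and 8K UHD.
full_hd = 1920 * 1080   #  2,073,600 pixels
uhd_4k  = 3840 * 2160   #  8,294,400 pixels
uhd_8k  = 7680 * 4320   # 33,177,600 pixels

print("8K UHD / 4K UHD: ", uhd_8k / uhd_4k)    # 4.0
print("8K UHD / Full HD:", uhd_8k / full_hd)   # 16.0
```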

The real solution would have been to abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast standards have always been named for the number of horizontal lines in the image, with the letters “i” and “p” indicating interlaced scanning, which draws every other line on each pass, and progressive scan, which draws them all: 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p, and so on.

Now that there are 4K TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon use of 4K in favor of UHD and 2160p. In all honesty, though, it’s too late. That said, the more important problem isn’t really the name; it’s where in the heck we can all get some real 4K content to watch. So far, it’s appearing in dribs and drabs on services like Netflix, Amazon Instant Video, and some proprietary hardware and software products from Sony. That’s not yet enough for 4K to really take off.