
Whatever happened to 4K? The rise of “Ultra HD” TV

Companies seem averse to calling their latest TVs 4K.

A Samsung TV that's not only 4K, but also curved and OLED. We dare to dream.
Casey Johnston

Television resolutions have become a moving target over the last ten years: every time a consumer decides to jump in with both feet and buy the latest model, better screens seem to appear on the shelves within weeks. TVs took decades to go from standard to high-definition resolution, yet only a few years later, “Ultra HD” is the new high-water mark.

At CES 2012, a few companies showed “4K” displays, with four times the pixel count of a full HD display: 2160 lines of resolution (usually 3840x2160) versus 1080 lines (usually 1920x1080).
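If you want to sanity-check that “four times” figure, the arithmetic takes only a couple of lines of Python, using the typical consumer resolutions quoted above:

# Pixel counts for the usual consumer resolutions
full_hd = 1920 * 1080      # 2,073,600 pixels
ultra_hd_4k = 3840 * 2160  # 8,294,400 pixels
print(ultra_hd_4k / full_hd)  # 4.0, i.e. exactly four times as many pixels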

This year, those same companies seem to have transitioned to showing “Ultra HD” displays instead of “4K” ones. Where did 4K go? Why are we back to describing displays in terms of HD?

Ultra HD, like vanilla HD, is a term defined by the International Telecommunication Union (ITU), a body that has existed since 1865 and, as an agency of the United Nations, allocates global radio spectrum. One of the ITU’s sectors sets standards in areas like networking, signaling protocols, and telecommunications (which includes television resolutions).


The terms “Ultra HD” and 4K have co-existed for some time. The first Ultra HD prototype was developed by NHK Science and Technical Research Laboratories in Japan (the same lab that developed HD) back in 2003, and the lab had to create a special camera to capture sufficiently detailed footage for it. But just as the term “HD” before it technically covers both 720p- and 1080p-resolution screens, “Ultra HD” describes two resolutions: 4K, or 2160p, as well as 8K, or 4320p, which is visually detailed enough to compare to IMAX.

It’s hard to say definitively why nearly all of the major TV manufacturers (LG, Samsung, Sony) switched from calling their TVs 4K to calling them Ultra HD, but we have some guesses. As 4K makes the transition from “wildly expensive product of the distant future” to “regular consumer purchase,” the term may be a bit hard to parse for the average customer.

For instance, if one is asked whether he’s in the market for an HDTV or a 4K TV, 4K sounds like overkill—he’ll just go with HD, thanks for asking. A customer who is asked to decide between an HDTV and an Ultra HDTV is made to feel like there’s a cohesive, continuous transition between the two resolutions. One is simply a natural evolution of the other.

We hate to see a level of informational detail lost to marketing, especially since we already know what it’s like to deal with the ambiguity of the term “HD” and whether it refers to 720p or 1080p. Some manufacturers refer to 1080p as “full HD,” but its usage is still not consistent or abundantly clear. And when 8K displays roll around, here’s hoping no one burdens consumers with the term “full Ultra HD.”

4K, or Ultra HD, or whatever you want to call it, still faces the same problems that plagued HD when it first hit shelves: first, there’s little to no content for such displays (and that content, when it does become more prevalent, will be expensive to create and distribute due to the sheer amount of information packed into 4K video). Second, the displays remain wildly expensive: LG’s 84-inch 4K monster is priced at $19,999. Since we can’t yet afford such finery, we’ve stocked up all we can on mental and digital images of these displays at CES while we await the price war.
