On photography

8K is on its way – but do we really need it?

Just as we’re getting used to the idea of watching (and shooting) 4K video, now manufacturers are trying to convince us that 4K resolution just isn’t enough. It turns out that 4K is only Ultra High Definition (UHD), and pretty soon we are going to need Full Ultra High Definition, or 8K. So is 8K really the next big thing, or are we being sold a dud? Time to investigate.

The high resolution “arms race”

Around 2010, we first started to see a new breed of LED screens appear on the market. Apple launched its “retina” displays and marketed them as having a resolution to match the resolving power of the human eye. Other manufacturers introduced similar technology as they each tried to outpace one another. By 2012, 4K TVs were starting to appear. This new Ultra High Definition (UHD) standard offered a staggering four times the already outstanding resolution of Full HD (1080p) TVs. Now videographers everywhere are rushing to upgrade their camera equipment to record 4K, whether to benefit from the new image quality or simply for fear of being left behind. That also means upgrading to ultra-fast broadband, buying bigger disk drives and faster memory cards, and investing in a more powerful computer to handle it all.

Now 8K, or “Full” UHD, is just around the corner. Electronics giant Sharp has announced a new “prosumer” camera that will shoot 8K, and several 8K TVs are already on the market, albeit with ludicrous price tags. 8K offers four times the resolution of 4K for the ultimate in clarity and realism. Sound familiar? It should, because that’s exactly how 4K was sold.

Reality check: a critical look at 4K

So has 4K really brought the benefit it promised? The truth is, if you’re watching an average-sized TV from the comfort of a sofa, the visible difference between 4K and 1080p is likely to be minimal, even non-existent. That’s because 4K was invented for the cinema (or home cinema) experience. For an average TV, 1080p already provides plenty of resolution, and the extra detail of 4K is mostly lost on the human eye. That’s not to say 4K doesn’t have its place: for a big-screen experience, or for close work on a 32-inch computer monitor, the difference in picture quality between 1080p and 4K can be dramatic. That’s exactly what 4K was designed for. In most other situations, though, it is simply overkill.

So where does 8K fit in? If 4K only brought a benefit in certain situations, and no benefit at all in others, will anybody benefit from 8K? Is it just a cynical attempt by manufacturers to get us to upgrade our TV sets and camera equipment yet again? Or will it usher in a new age of stunning image quality? To answer that question properly, we have to delve a little deeper into the world of pixel density and visual acuity.

Why does resolution matter? Here are the facts

The aim of these incredibly high-resolution screens is to achieve the most realistic picture possible, with a pixel density that reaches the limits of human perception. This limit is called visual acuity: the point at which the human eye can no longer distinguish between two separate points. According to Michael F. Deering of Sun Microsystems, typical visual acuity is 0.47 arc minutes, or 0.0078 degrees. Our practical “visual resolution” is coarser than this, at around 1 arc minute, or 0.0167 degrees.

Armed with the above knowledge, plus a dash of trigonometry, we can arrive at a simple formula relating the maximum visible screen resolution to both the viewing distance and the screen size. (The height of a 16:9 widescreen is roughly 0.49 times its diagonal, and each row of pixels should subtend no more than 1 arc minute at the viewing distance.) For a standard 16:9 widescreen, we arrive at:

Max screen resolution (rows of pixels) = 140 × screen diagonal (inches) / viewing distance (feet)
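
If you’re curious where that constant of 140 comes from, here is a rough Python sketch (my own illustration, not anyone’s official tool). It simply assumes a 16:9 screen and a visual resolution of 1 arc minute:

import math

# Where does the constant of ~140 come from?
# Assumes a 16:9 screen and a visual resolution of 1 arc minute.
ACUITY_DEG = 1 / 60                        # 1 arc minute, in degrees
HEIGHT_FRACTION = 9 / math.hypot(16, 9)    # screen height as a fraction of the diagonal (~0.49)

def max_visible_rows(diagonal_inches, distance_feet):
    # Smallest pixel the eye can resolve at this distance, then how many
    # such pixels fit into the screen height.
    pixel_pitch = (distance_feet * 12) * math.tan(math.radians(ACUITY_DEG))
    return diagonal_inches * HEIGHT_FRACTION / pixel_pitch

# Rows per inch of diagonal, per foot of viewing distance: roughly 140.
print(round(max_visible_rows(1, 1)))   # -> 140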

Screen resolution is quoted as the number of rows of pixels, e.g. 1080 for Full HD. Here’s a list of the most common resolutions:

Standard                    Pixels per row    Rows of pixels
Standard Definition (SD)    720               480
Standard HD                 1280              720
Full HD                     1920              1080
Quad HD                     2560              1440
Ultra HD (4K)               3840              2160
Full UHD (8K)               7680              4320

So now let’s apply this to some real-world examples. Below is a table of common screen sizes with their recommended viewing distances. For TV, I’ve used the viewing distance recommended by home cinema experts THX to give an optimal viewing angle of 36 degrees. Computer viewing distances are consistent with recommendations by the College of Optometrists. The results may surprise you:

Screen size                                 Ideal screen resolution (and standard required)
15 inch laptop viewed at 18 inches          1400 (Quad HD)
24 inch desktop monitor viewed at 2 feet    1680 (4K)
32 inch desktop monitor viewed at 2 feet    2240 (4K)
32 inch TV viewed at 3.6 feet               1244 (Quad HD)
43 inch TV viewed at 4.8 feet               1254 (Quad HD)
65 inch TV viewed at 7.3 feet               1247 (Quad HD)
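
If you’d like to check the TV rows of the table yourself, here is a quick Python sketch. It simply combines the formula above with a 36-degree viewing angle; the function names are my own, and the distances are rounded to one decimal place as in the table:

import math

def thx_distance_feet(diagonal_inches, angle_deg=36):
    # Viewing distance that gives the stated horizontal viewing angle on a 16:9 screen.
    width = diagonal_inches * 16 / math.hypot(16, 9)
    distance_inches = (width / 2) / math.tan(math.radians(angle_deg / 2))
    return round(distance_inches / 12, 1)

def ideal_rows(diagonal_inches, distance_feet):
    # The formula from earlier in the article.
    return 140 * diagonal_inches / distance_feet

for size in (32, 43, 65):
    d = thx_distance_feet(size)
    print(f"{size} inch TV viewed at {d} feet -> {ideal_rows(size, d):.0f} rows")
# -> 1244, 1254 and 1247 rows: all within Quad HD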

All of these examples suggest that the ideal resolution falls somewhere between Quad HD (for TV viewing and laptop use) and 4K (solely for desktop viewing on a large computer monitor). Interestingly, this shows that 4K is already overkill for TV: to gain the full benefit of 4K you would have to sit much closer than the distance recommended by THX. I’m willing to bet that most people actually sit further back than the THX recommendation, in which case Full HD is already enough resolution. Check it yourself:

Screen size (inches)    Minimum viewing distance for Full HD (feet)    Minimum viewing distance for 4K (feet)
32                      4.1                                            2.1
39                      5.1                                            2.5
43                      5.6                                            2.8
55                      7.1                                            3.6
65                      8.4                                            4.2
82                      10.6                                           5.3
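
These distances just come from rearranging the same formula. A minimal Python sketch, again with names of my own choosing:

def min_distance_feet(diagonal_inches, rows):
    # Distance at which this resolution exactly matches visual acuity: sit any
    # closer and you would benefit from more pixels, any further and the extra
    # detail is wasted.
    return 140 * diagonal_inches / rows

for size in (32, 39, 43, 55, 65, 82):
    full_hd = min_distance_feet(size, 1080)
    four_k = min_distance_feet(size, 2160)
    print(f"{size} inch: Full HD {full_hd:.1f} ft, 4K {four_k:.1f} ft")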

Where does that leave 8K?

If 4K is already more than enough resolution in nearly all viewing situations, where does 8K fit in? The short answer is that it doesn’t. 8K is limited to niche applications where a large screen is viewed from very close, increasing the field of view far beyond what we normally expect: basically, anything with a viewing angle over 60 degrees. These are immersive environments such as wrap-around screens, planetariums and virtual reality. Sure, you can always sit really close to a large flat screen to get a more “immersive” experience, but what you gain in immersion you will lose to a distorted viewpoint. Flat screens were simply not designed to be viewed like this, and neither was the content we watch on them.
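
You can sanity-check that 60-degree figure with the same formula. This little sketch is my own back-of-the-envelope working (the screen size cancels out, so only the viewing angle matters), and it shows that even at a 60-degree viewing angle you only just pass what 4K can deliver:

import math

def rows_for_viewing_angle(angle_deg):
    # Rows of pixels needed to match visual acuity at a given horizontal viewing
    # angle on a 16:9 screen; the screen size cancels out of the calculation.
    half_width_over_diag = 8 / math.hypot(16, 9)
    distance_feet_per_inch = half_width_over_diag / math.tan(math.radians(angle_deg / 2)) / 12
    return 140 / distance_feet_per_inch

print(round(rows_for_viewing_angle(36)))   # ~1250 rows: Quad HD territory
print(round(rows_for_viewing_angle(60)))   # ~2230 rows: only just beyond 4K's 2160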

The verdict

After close inspection, it’s safe to say that there really is very little need for 8K in the mainstream market. We have already reached the very real limits of human perception with 4K. It was designed that way. So if you are thinking about going out and buying an 8K TV next year to impress your friends, or an 8K camera to make “better quality” home videos, then there really are better ways to waste your money.

About the author

Paul Maguire is a professional photographer with a scientific background. He has a bachelor’s degree in Physics from the University of Bath and a master’s degree in Exploration Geophysics from Imperial College London.