October 29, 2025, 4:28 pm | Read time: 5 minutes
For decades, the trend in televisions has been toward ever-higher resolutions. Today, 4K is the de facto standard, and manufacturers are even trying to push 8K onto consumers. However, it has long been known that higher resolutions only offer advantages in very specific cases. A new study from Cambridge now confirms this.
Our eyes are indeed capable of perceiving very high resolutions, but not in the way TV manufacturers would have users believe. Researchers from the University of Cambridge and Meta have found that the so-called “retinal resolution”, the maximum image sharpness the human eye can perceive, exceeds the previously accepted standard. That is the theory. In practice, it can only be exploited in certain scenarios, and watching TV is not one of them, according to the study.
The Retina Myth
In 2010, Apple introduced the Retina display with the iPhone 4, a marketing term for a pixel density at which the eye can no longer distinguish individual pixels. According to Steve Jobs, that point is reached at around 58 pixels per degree (PPD) of visual field, assuming full 20/20 vision. This figure has since become a quasi-standard in the industry.
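For the curious, the figure can be checked with simple geometry: PPD is just the pixel density multiplied by the stretch of screen that one degree of visual angle covers at a given distance. A minimal sketch in Python (the 10-inch viewing distance is our assumption for the iPhone-era claim, not a figure from the study):

```python
import math

def pixels_per_degree(ppi: float, distance_inches: float) -> float:
    """PPD for a display with `ppi` pixels per inch viewed from
    `distance_inches` away."""
    # One degree of visual angle spans 2 * d * tan(0.5°) inches on the screen.
    inches_per_degree = 2 * distance_inches * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# iPhone 4: 326 PPI, held at roughly 10 inches -> about 57 PPD
print(round(pixels_per_degree(326, 10)))
```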
However, the new study by Maliha Ashraf, Alexandre Chapiro, and Rafał K. Mantiuk comes to a different conclusion. On average, participants could still resolve detail at 94 PPD, and some managed up to 120 PPD. The actual visual acuity of humans is therefore about 50 percent above the previous standard.
According to the researchers, our eyes resolve different kinds of contrast with different sharpness. For black-and-white contrasts they manage 94 PPD, for red-green contrasts 89 PPD, and for yellow-violet patterns perception drops to just 53 PPD. This aligns with vision science, which holds that the eye resolves luminance detail best and that, of the two color channels, red-green is the sharper one.
8K? Only Useful Up Close
For the study, the researchers mounted a monitor on a motorized stand that could change the distance to the observer with millimeter precision. At each distance, participants were shown two types of images in random order: either vertical lines (one pixel wide) in black/white, red/green, or yellow/violet, or a plain gray block. They then had to indicate which image contained the lines.

This way, the researchers could pinpoint the “resolution limit”: the point at which the resolution is so high, and the lines so thin, that they can no longer be seen with the naked eye. At a viewing distance of three display heights, even a 4K display already exceeded what participants could typically perceive. The International Telecommunication Union (ITU) recommends a viewing distance of 0.8 to 3.2 display heights for 8K displays; the study, however, concludes that the eye can only discern individual pixels when sitting closer than 1.3 display heights.
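The display-height figures follow from the same geometry. A rough sketch using a small-angle approximation for the pixels per degree at the screen center; the ~94 PPD limit is the study’s average, the rest is our arithmetic:

```python
import math

def central_ppd(vertical_pixels: int, distance_in_heights: float) -> float:
    """Approximate PPD at the screen center for a viewing distance given
    in multiples of the display height (small-angle approximation)."""
    # One pixel subtends roughly (pixel_pitch / distance) radians.
    return math.radians(1) * distance_in_heights * vertical_pixels

print(round(central_ppd(2160, 3.0)))  # 4K at 3 display heights   -> ~113 PPD
print(round(central_ppd(4320, 1.3)))  # 8K at 1.3 display heights -> ~98 PPD
# Both values sit above the study's ~94 PPD average limit; only closer
# than roughly 1.3 display heights do 8K pixels become distinguishable.
```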
In other words: if you own an 8K TV with a 65-inch diagonal, you would need to sit less than a meter away to notice any difference from 4K. At a normal viewing distance, the display already exceeds the eye’s resolution limit. For consumers, the researchers have published an online calculator that shows the optimal viewing distance for any resolution and display size.
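The one-meter figure for a 65-inch panel can be reproduced the same way. A sketch assuming a 16:9 aspect ratio and the study’s ~94 PPD average limit (the function name and threshold default are ours, not taken from the calculator):

```python
import math

def critical_distance_m(diagonal_inches: float, vertical_pixels: int,
                        limit_ppd: float = 94.0) -> float:
    """Distance in meters below which a 16:9 display drops under
    `limit_ppd` and its pixels become distinguishable (small-angle approx.)."""
    height_m = diagonal_inches * 0.0254 * 9 / math.hypot(16, 9)
    distance_in_heights = limit_ppd / (math.radians(1) * vertical_pixels)
    return distance_in_heights * height_m

# 65-inch 8K TV: pixels only resolvable when sitting closer than about 1 m.
print(round(critical_distance_m(65, 4320), 2))  # ~1.01
```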
The Cambridge study makes it clear that our eyes have their limits, but not where marketing departments draw them. 4K is more than sufficient for almost all everyday situations. 8K is only worthwhile at extremely short viewing distances or for specific applications such as professional image editing, simulations, or VR.
Important Insights for Mixed Reality and Streaming
For research and industry, the results are a real milestone. They provide new reference values for the development of displays, rendering technologies, and video codecs.
The insights are especially crucial for virtual and augmented reality headsets: with so-called “foveated rendering”, only the area the eye is currently fixating is rendered at full sharpness. The new data allows these methods to be aligned more precisely with human vision, saving computing power and bandwidth.
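As a rough illustration of the principle (not the method of any particular headset): a renderer can scale its target resolution with angular distance from the gaze point, following a standard acuity falloff model. The falloff constant below is a common textbook value, not a number from the study:

```python
def shading_rate(eccentricity_deg: float, peak_ppd: float = 94.0) -> float:
    """Illustrative foveated-rendering heuristic: fraction of full resolution
    worth rendering at a given angular distance from the gaze point."""
    E2 = 2.3  # eccentricity (degrees) at which acuity halves; model constant
    required_ppd = peak_ppd * E2 / (E2 + eccentricity_deg)
    return required_ppd / peak_ppd

for ecc in (0, 5, 20):
    print(f"{ecc:>2}° from gaze: render at {shading_rate(ecc):.0%} resolution")
```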
The results are also relevant for video compression in streaming. The finding that, among the color channels, our eyes distinguish red-green contrasts far better than yellow-violet ones offers room for improvement: the red-green channel could be reproduced more faithfully in the future, while patterns we resolve less well could be compressed more aggressively.
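A toy sketch of the idea (purely illustrative; the opponent-channel split below is a crude approximation, and no real codec works exactly like this):

```python
import numpy as np

def subsample(channel: np.ndarray, factor: int) -> np.ndarray:
    """Keep every `factor`-th sample in both directions (toy downsampling)."""
    return channel[::factor, ::factor]

def opponent_split(rgb: np.ndarray):
    """Split an RGB image into rough opponent channels and subsample each
    according to how sharply the eye resolves it (94 / 89 / 53 PPD)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    luma = (r + g + b) / 3            # resolved best: keep full resolution
    red_green = r - g                 # ~89 PPD: subsample lightly
    yellow_violet = (r + g) / 2 - b   # ~53 PPD: subsample hardest
    return luma, subsample(red_green, 2), subsample(yellow_violet, 4)
```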
The Law of Diminishing Returns
“That manufacturers were heading into a dead end by declaring ever-higher resolutions the future of the TV set should have been clear to many. The joint study by Cambridge and Meta researchers, however, now clearly shows that 4K is absolutely sufficient for consumers.
In my opinion, there are many other areas where displays can still improve. I am thinking in particular of brightness. More and more manufacturers are turning up the nits. At this year’s CES, Hisense introduced the first TV with an incredible 10,000 nits. That is the kind of progress that creates a realistic viewing experience, because our world is colorful and bright. When the sun shines on me from the TV with a familiar warmth, I don’t care what resolution I see it in.”