- cross-posted to:
- [email protected]
A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
OP, please update the post to reflect the current article title. It may have changed since you posted.
I think the real problem is that anything less than 4k looks like shit on a 4k tv
“No duh” -Most humans, since ever
This finding is becoming less important by the year. It’s been quite a while since you could easily buy an HD TV - they’re all 4K, even the small ones.
The study doesn’t actually claim that. The actual article title is “Study Boldly Claims 4K And 8K TVs Aren’t Much Better Than HD To Your Eyes, But Is It True?” As with all articles that ask a question, the answer is either NO or it’s complicated.
It says that we can distinguish up to 94 pixels per degree or about 1080p on a 50" screen at 10 feet away.
This means (in pixels per degree):

- 27" monitor, 18" away: 1080p = 29, 4K = 58, 8K = 116
- 40" TV 8 feet away / 50" TV 10 feet away: 1080p = 93
- 70" TV 8 feet away: 1080p = 54, 4K = 109, 8K = 218
- 90" TV 10 feet away: 1080p = 53, 4K = 106, 8K = 212

Conclusion: 1080p is good for small TVs relatively far away. 4K makes sense for a reasonably large or close TV. Up to 8K makes sense for monitors.
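If anyone wants to sanity-check these figures or plug in their own setup, here's a rough Python sketch of the same pixels-per-degree arithmetic. It assumes a flat 16:9 panel viewed head-on; the function name and structure are just for illustration, not anything from the paper:

```python
import math

def ppd(diagonal_in, distance_in, horizontal_px, aspect=(16, 9)):
    """Pixels per degree: horizontal pixel count divided by the
    horizontal field of view the screen covers at this viewing distance."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # physical panel width in inches
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return horizontal_px / fov_deg

# Reproduces the numbers above (rounded):
print(round(ppd(27, 18, 1920)))   # ~29  -> 27" monitor, 18" away, 1080p
print(round(ppd(50, 120, 1920)))  # ~93  -> 50" TV, 10 ft away, 1080p
print(round(ppd(70, 96, 3840)))   # ~109 -> 70" TV, 8 ft away, 4K
print(round(ppd(90, 120, 7680)))  # ~212 -> 90" TV, 10 ft away, 8K
```

Anything that comes out under ~94 means the panel is the bottleneck; anything over it means your eyes are.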
The article updated its title. The original title is retained in the slug.
The article title is basically a lie intended to generate clicks, written by pretentious people far stupider than the people who did the actual research, which is why the non-morons who did the research called it “Resolution limit of the eye — how many pixels can we see?”
Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish.
That seems in line with common knowledge? Say you want to keep your viewing angle at ~40° for a home cinema; at 2.5 m of distance, that means your TV needs a horizontal width of ~180 cm, which corresponds to roughly an 80" diagonal, give or take a few inches depending on the aspect ratio.
For a more conservative 30° viewing angle at the same distance, you’d need roughly a 60" TV. So, 4K is perceivable at that distance regardless, and 8K is a waste of everyone’s time and money.
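For anyone who wants to play with that geometry, here's a small sketch of the viewing-angle-to-screen-size conversion (assuming a flat 16:9 panel; the helper name is made up for illustration):

```python
import math

def screen_for_viewing_angle(angle_deg, distance_m, aspect=(16, 9)):
    """Width (cm) and diagonal (inches) of a flat screen that fills a given
    horizontal viewing angle at a given distance."""
    w, h = aspect
    width_m = 2 * distance_m * math.tan(math.radians(angle_deg / 2))
    diag_in = width_m * math.hypot(w, h) / w / 0.0254
    return width_m * 100, diag_in

print(screen_for_viewing_angle(40, 2.5))  # ~182 cm wide, ~82" diagonal
print(screen_for_viewing_angle(30, 2.5))  # ~134 cm wide, ~60" diagonal
```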
Please note that at 18-24" with a 27" screen, 4K does not max out what the eye can see, according to this very study. E.g. all the assholes who told you that 4K monitors are a waste are confirmed blind assholes.
They are a waste of time, since the things with enough fidelity to matter run like shit on them without a large investment. It’s just a money sink with little reward.
Are you talking about 8K or 4K? Not only can you game in 4K with a cheap card depending on the game, the desktop and everything else just looks nicer too.
Either. 1440p is about the line I draw before the extra fidelity is not worth the performance hit.
Your own budget is by definition your business, but you can run some stuff in 4K on my desktop I bought in 2020 for $700. “Not worth it TO ME” requires no defense, but it is pretty silly to say it’s a money sink with no reward when we are talking about PC gaming, you know, where you game on a 24-32" screen a foot or two from your face. The study clearly says it’s not.
I have at one point in time made my living in hardware, and I would not advise running at 4K or higher without good reason. You being able to run at 4K does not in any way change the terrible value proposition of losing frames and latency for fidelity. I would not recommend that anyone not wanting to go absolutely silly run a 4K or 8K monitor. Run a multiscreen setup at a lower resolution like a normal person. Don’t make your own preferences or sunk costs your position on tech in general.
Personal anecdote: moving from 1080p to 2k on my computer monitor is very noticeable for games.
Going down from a 24" 2048x1152 to a 27" 1920x1080 was an extremely noticeable change. Good god I loved that monitor, things looked so crisp on it.
Even 4K is noticeable for monitors (but probably not much beyond that), but this is referring to TVs that you’re watching from across the couch.
Isn’t 2k and 1080P basically the same thing?
2k is about double of 1080p and 4k is double of 2k
4k is called 4k because the horizontal resolution is around 4000, so you’d think 1080p, with its 1920-pixel-wide lines, would be 2k. It’s fucked that it isn’t.
1080p is 2k, the commenters above are just wrong.
It’s all just marketing speak at this point.
1920x1080 vs 2560x1440
Not crazy higher but a noticeable increase
Yeah. They went from counting rows of pixels to counting columns. A 16:9 widescreen 1080 display is 1920×1080, and most manufacturers are happy to call 1920 “2K”.
Ah yes, my 1920x1080 monitor with a resolution of 2560x1440
I think age makes a big difference, too. I’m over 50 and I’ve never been able to really tell between 720p and 1080i and 1080p, much less higher resolutions. And I’m nearsighted.
Do you wear glasses?
ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.
They don’t need to; this study does it for them. 94 pixels per degree is the top end of what’s perceptible. On a 50" screen 10 feet away, 1080p = 93. Closer than 10 feet, or larger than 50", or some combination of both, and it’s better to have a higher resolution.
For millennials home ownership has crashed, but TVs are cheaper and cheaper. For the half of motherfuckers rocking their 70" TV that cost $600 in their shitty apartment, where they sit 8 feet from the TV, it’s pretty obvious 4K is better at 109 vs 54 PPD.
Also, although the article points out that there are other features that matter as much as resolution, these aren’t uncorrelated factors. 1080p TVs of any size in 2025 are normally bargain-basement garbage that suck on all fronts.
Resolution doesn’t matter as much as pixel density.
Right? “Yeah, there is a scientific study about it, but what if I didn’t read it and go by feelings? Then I will be right and don’t have to reexamine shit about my life, isn’t that convenient”
My 50" 4K TV was $250. That TV is now $200; nobody is flexing the resolution of their 4K TV, that’s just a regular cheap-ass TV now. When I got home and started using my new TV, right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It’s not people trying to defend their purchase, it’s people questioning the methodology of the study, because the difference between 1080p and 4K is stark unless your TV is small or you’re far away from it. If you play video games, it’s especially obvious.
Old people with bad eyesight watching their 50" 12 feet away in their big ass living room vs young people with good eyesight 5 feet away from their 65-70" playing a game might have inherently differing opinions.
12’ 50" FHD = 112 PPD
5’ 70" FHD = 36 PPD
The study basically says that FHD is about as good as you can get 10 feet away on a 50" screen, all other things being equal. That doesn’t seem that unreasonable.
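(Plugging those two cases into the same pixels-per-degree arithmetic as upthread, as a quick standalone check:)

```python
import math

def ppd(diag_in, dist_in, px):
    # Horizontal pixels over horizontal field of view; 16:9 panel assumed.
    width = diag_in * 16 / math.hypot(16, 9)
    return px / (2 * math.degrees(math.atan(width / 2 / dist_in)))

print(round(ppd(50, 144, 1920)))  # ~112 PPD: 50" FHD from 12 ft
print(round(ppd(70, 60, 1920)))   # ~36 PPD:  70" FHD from 5 ft
```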
I know I am a display tech nerd, but can people really not tell the difference? Even going from a 1440p to a 4K monitor was, to me, a very noticeable improvement in clarity. And there’s a huge difference in the way that games look on my living room TV in 1080p compared to 4K.
Anecdotally, at average viewing distances on my 55" TV I can’t really tell a difference. If I had an enormous TV maybe I would be able to tell. 1080 → 2160 is for sure not the leap that 720 → 1080 or 480 → 720 was in the average environment.
If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish. The scientists made it crystal clear: once your setup hits that threshold, any further increase in pixel count, like moving from 4K to an 8K model of the same size and distance, hits the law of diminishing returns because your eye simply can’t detect the added detail.
I commend them on their study of the human eye’s “pixels-per-degree” perceptual resolution limit, but there are some caveats to the article title and their findings.
First of all, nobody recommends a 44-inch TV for 2.5 metres. I watch from the same distance, and I think the minimum recommended 4K TV size for that distance was 55 inches.
Second, I’m not sure many QHD TVs are being offered; the market mostly offers 4K or 1080p TVs, and QHDs would be a small percentage.
And QHD is already a pretty noticeable quality jump over 1080p, as I’ve noticed on my gaming rig. So basically, if you do the jump from 1080p to 4K, and watch 4K-quality content from the right distance, most people are absolutely gonna notice that quality difference.
For 8K, I don’t know; you probably do get into diminishing returns there unless you have a wall-sized TV or watch it from very close.
But yeah, clickbaity titled article, mostly.
Really depends on the size of the screen, the viewing distance, and your age/eye condition. For most people 720 or 1080 is just fine. With 4K, you will get somewhat better detail on the fabric of clothes and in environments, but not a huge difference.
8k is gonna be a huge waste and will fail.
It does make a difference for reading text like subtitles or navigating game menus.
If my quick calculations are correct, a 70-inch screen at 1080p has a pixel size of about 0.7 mm give or take, where 4K would be about 0.1-0.2 mm.
0.1 mm is about the smallest thing a human could potentially see under very strict conditions. A pixel smaller than a millimeter will be invisible from a meter away. I really, really doubt it’s humanly possible to see the difference from the distances a person would be watching TV. The thing is, the newer 4K TVs are just built better: nicer colour contrast, more uniform lighting, clearer glass, and that might be the effect you’re seeing.
Basically, you are in a thread about a study which calculated for you what people ought to be able to see, and you insisted on redoing the calculation yourself, incorrectly. The study says people factually can distinguish up to 94 pixels per degree. A 70" screen a meter away is about 24 PPD. You yourself could have easily eyeballed two screens and come to the correct conclusion, but are instead asserting nonsense.
Did you notice that FHD TVs larger than 40" literally don’t exist in stores? If people literally couldn’t see more than 24 PPD, then at the more typical 10-foot viewing distance a 70" screen at 640x480 would be just as good as a 70" 1080p was at a meter away! For a 50" you could go down to 480x320! Still ~24 PPD.
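To put the millimetre framing and the pixels-per-degree framing side by side, here's a rough sketch using the same total-pixels-over-field-of-view approach as the other numbers in this thread (the specific screens are just examples):

```python
import math

def pitch_and_ppd(diagonal_in, horizontal_px, distance_mm, aspect=(16, 9)):
    """Return (pixel pitch in mm, pixels per degree) for a flat 16:9 panel
    viewed head-on at the given distance."""
    w, h = aspect
    width_mm = diagonal_in * 25.4 * w / math.hypot(w, h)
    pitch_mm = width_mm / horizontal_px
    fov_deg = 2 * math.degrees(math.atan(width_mm / 2 / distance_mm))
    return pitch_mm, horizontal_px / fov_deg

print(pitch_and_ppd(70, 1920, 1000))  # ~0.81 mm pitch, ~25 PPD at 1 m
print(pitch_and_ppd(70, 1920, 3048))  # ~0.81 mm pitch, ~67 PPD at 10 ft
print(pitch_and_ppd(70, 3840, 3048))  # ~0.40 mm pitch, ~135 PPD at 10 ft
```

All of those get judged against the same ~94 PPD ceiling from the study, so the absolute size of a pixel in millimetres only matters once you also account for the distance.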
Uh… Hol up. So if we can maybe see down to 0.2 mm and the 1080p screen has 0.7 mm pixels… That’s pretty much what I’m saying. 1080p is noticeably grainy.
The text in 4K looks crisper. I concur that I can’t count individual pixels, but reading game menus in 1080p feels rougher and makes me squint. Reading in 4K feels more like reading on print paper or a good e-reader.
This and yes, the build quality of newer screens also contributes.
This is literally the only truly important part after a certain threshold. I have a 34", 1440p monitor, and the text is noticeably better than on any 1080p screen. It’s entirely legible, and 4K would not provide a new benefit except maybe a lighter wallet. It’s also 100 Hz, which is again beyond the important threshold.
The only time I can see 4K being essentially necessary is for projectors, because those screens end up being massive. My friend has a huge 7-foot-something screen in the basement, so we noticed a difference, but that’s such an outlier it should really be a footnote, not a reason to choose 4K for anything under 5' (arbitrary-ish number).