Originally Posted by OverclockN'
Agreed. Simply inputting numbers into a pixel density calculator doesn't tell the whole story. It's not that simple.
In my own case, I have a 49" Sony 900F 4K TV and had a 24" 1080p IPS monitor. When I first got the TV (on the wall in front of my desk), I kept the 1080p monitor on my desk for a while. The monitor has a pixel density of 91.79 PPI and a dot pitch of 0.2767 mm. The 49" 4K TV has a pixel density of 89.91 PPI and a dot pitch of 0.2825 mm.
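For anyone who wants to check those figures, here's a minimal Python sketch of the standard formulas: PPI is the pixel count along the diagonal divided by the diagonal size in inches, and dot pitch is just the inverse converted to millimeters (25.4 mm per inch). It reproduces the numbers quoted above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def dot_pitch_mm(ppi_value):
    """Distance between pixel centers in millimeters (25.4 mm per inch)."""
    return 25.4 / ppi_value

monitor_ppi = ppi(1920, 1080, 24)  # 24" 1080p monitor
tv_ppi = ppi(3840, 2160, 49)       # 49" 4K TV

print(f"monitor: {monitor_ppi:.2f} PPI, {dot_pitch_mm(monitor_ppi):.4f} mm")
# -> monitor: 91.79 PPI, 0.2767 mm
print(f"TV:      {tv_ppi:.2f} PPI, {dot_pitch_mm(tv_ppi):.4f} mm")
# -> TV:      89.91 PPI, 0.2825 mm
```

So the raw density numbers really are within about 2% of each other, which is exactly why they don't explain the difference.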
Just looking at the raw numbers, you'd say they were virtually the same. But the difference in person was staggering. The monitor was so bad I couldn't stand to look at it anymore. I said the hell with dual screens completely and now just use the raw real estate of the 4K TV for everything. The monitor is collecting dust.
The TV is about 3 1/2 ft from me. The monitor was 18"-24", somewhere in there at best guess.
If I had a TV your size, I could easily sit back another foot or two and maintain the same results. We aren't sitting 8-10ft+ away from these things like people in a lot of living rooms, and struggling to see the difference. It's immediately and blatantly obvious, and you never go back.
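What actually resolves the "same PPI, staggering difference" puzzle is angular resolution: pixels per degree of visual angle, which depends on viewing distance as well as density. A quick Python sketch, using the PPI figures from above and my rough guesses at the distances described (42" to the TV, ~21" to the monitor; those distances are estimates, not measured values):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels subtended by one degree of visual angle at a given distance.

    One degree spans 2 * d * tan(0.5 deg) inches on the screen;
    multiply by PPI to get the pixel count.
    """
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

tv_ppd = pixels_per_degree(89.91, 42)       # 49" 4K TV at ~3.5 ft (assumed)
monitor_ppd = pixels_per_degree(91.79, 21)  # 24" monitor at ~21" (assumed)

print(f"TV:      {tv_ppd:.1f} px/deg")   # roughly 66 px/deg
print(f"monitor: {monitor_ppd:.1f} px/deg")  # roughly 34 px/deg
```

Sitting twice as far from a screen of nearly the same density means each pixel covers about half the visual angle, so the TV delivers roughly double the angular resolution. That's consistent with the monitor looking dramatically worse despite the near-identical PPI.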
Precisely. I never understood the pixel density argument when what you should really be looking at is resolution clarity. Yes, the former influences the latter, but it is not the only factor.
This couldn't be more "clear" (pun intended) in this demonstration:
Regardless of the screen size, the pixel density, or how far away or close you look at the HD side on the left, it will never approach the clarity of the 4K image.