An overly compressed 4K stream will look far worse than good-quality 1080p. We keep upping the resolution without adopting newer codecs or adjusting the bitrate.
I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I'm more confused.
The resolution (4K in this case) defines the number of pixels shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting that data into pixels.
Suppose you've recorded something in 1080p (the lower resolution). You could upscale it to 4K, but the upscaler has to make up the pixels that can't be computed from the data.
In summary, the TV in my living room might be more capable, but my streaming provider probably isn't sending enough data to really use it.
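To put rough numbers on that, here's a minimal sketch (the 15 Mbps and 30 fps figures are assumed typical streaming values, not anything from the article):

```python
# Back-of-the-envelope: bits of data available per pixel per frame at a
# fixed bitrate. 15 Mbps / 30 fps are assumptions, not the article's numbers.
BITRATE_BPS = 15_000_000
FPS = 30

def bits_per_pixel(width: int, height: int) -> float:
    """How many bits the encoder can spend on each pixel of each frame."""
    return BITRATE_BPS / (width * height * FPS)

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: {bits_per_pixel(w, h):.3f} bits/pixel/frame")
# 1080p: 0.241 bits/pixel/frame
# 4K: 0.060 bits/pixel/frame -- 4x the pixels, a quarter of the data each
```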
On codecs and bitrate? Roughly: the codec is how the video is compressed (e.g. H.264, HEVC, AV1; extensions like .avi and .mp4 are just containers), and bitrate is how much data is sent per second for the video. Codecs mostly track what changed between frames, so a video of a still image can be 4K with a really low bitrate, but if things are moving it'll get really blurry at a low bitrate, even in 4K.
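A toy illustration of that "only track what changed" idea (made-up four-pixel frames, nothing like a real encoder internally):

```python
# Toy frame-differencing: only pixels that changed between frames need
# to be stored. A cartoon of inter-frame compression, not a real codec.

def delta(prev: list[int], curr: list[int]) -> list[tuple[int, int]]:
    """Return (index, new_value) for every pixel that changed."""
    return [(i, c) for i, (p, c) in enumerate(zip(prev, curr)) if p != c]

frame = [10, 10, 10, 10]
print(delta(frame, [10, 10, 10, 10]))  # [] -- a still scene costs almost nothing
print(delta(frame, [10, 99, 99, 10]))  # [(1, 99), (2, 99)] -- motion costs data
```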
I think the real problem is that anything less than 4k looks like shit on a 4k tv
This finding is becoming less important by the year. It's been quite a while since you could easily buy an HD TV - they're all 4K, even the small ones.
And then all your old media looks like shit due to upscaling. Progress!
The study doesn't actually claim that. The actual title is "Study Boldly Claims 4K And 8K TVs Aren't Much Better Than HD To Your Eyes, But Is It True?" As with all articles that ask a question, the answer is either "no" or "it's complicated."
It says that we can distinguish up to 94 pixels per degree, or about 1080p on a 50" screen 10 feet away.
This means that on a 27" monitor 18" away 1080p: 29 4K: 58 8K: 116
A 40" TV 8 feet away/50" TV 10 feet away
1080p: 93
A 70" TV 8 feet away
1080p: 54 4K: 109 8K: 218
A 90" TV 10 feet away
1080p: 53 4K: 106 8K: 212
Conclusion: 1080p is good for small TVs that sit relatively far away, 4K makes sense for a reasonably large or close TV, and up to 8K makes sense for monitors.
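If you want to sanity-check those numbers, here's a minimal sketch of the pixels-per-degree arithmetic (assuming 16:9 screens; the ~94 PPD threshold is the study's figure):

```python
import math

def pixels_per_degree(diag_in: float, dist_in: float, h_pixels: int) -> float:
    """Horizontal pixel count divided by the horizontal field of view in degrees.

    Assumes a 16:9 screen; diag_in is the diagonal in inches.
    """
    width_in = diag_in * 16 / math.hypot(16, 9)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return h_pixels / fov_deg

print(round(pixels_per_degree(50, 120, 1920)))  # 93 -- 50" TV, 10 ft, 1080p
print(round(pixels_per_degree(27, 18, 3840)))   # 58 -- 27" monitor, 18", 4K
```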
The article updated its title. The original title is retained in the slug.
The article title is basically a lie intended to generate clicks, written by pretentious people far stupider than the ones who did the actual research, which is why the non-morons who did the research called it "Resolution limit of the eye — how many pixels can we see?"
You appeared to be complaining that OP's title didn't match the article title, and I was only pointing out the article's title has changed since OP posted.
My apologies if I misread.
OP, please update the post to reflect the current article title. It may have changed since you posted.
"No duh" -Most humans, since ever
Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish.
That seems in line with common knowledge? Say you want to keep your viewing angle at ~40° for a home cinema; at 2.5m of distance, that means your TV needs a horizontal width of ~180cm, which corresponds to a ~80" diagonal, give or take a few inches depending on the aspect ratio.
For a more conservative 30° viewing angle at the same distance, you'd need a ~60" TV. So 4K is perceivable at that distance regardless, and 8K is a waste of everyone's time and money.
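For reference, a quick sketch of the geometry (assuming 16:9 and measuring the viewing angle across the screen width):

```python
import math

def screen_for_angle(angle_deg: float, dist_m: float) -> tuple[float, float]:
    """Return (width in cm, 16:9 diagonal in inches) for a horizontal viewing angle."""
    width_cm = 2 * dist_m * math.tan(math.radians(angle_deg / 2)) * 100
    diag_in = width_cm / 2.54 * math.hypot(16, 9) / 16
    return width_cm, diag_in

print(screen_for_angle(40, 2.5))  # ~182 cm wide, ~82" diagonal
print(screen_for_angle(30, 2.5))  # ~134 cm wide, ~60" diagonal
```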
Please note at 18-24" with a 27" screen 4K does not max out what the eye can see according to this very study. EG all the assholes who told you that 4K monitors are a waste are confirmed blind assholes.
They are a waste of time since the things with enough fidelity to matter run like shit on them without a large investment. It's just a money sink with little reward.
Subjective, obviously.
Oh, there are more pixels, sure. But they're not worth the money, and most (and I mean most) applications want more frames, smoother movement, and less input lag over more pixels. The push for 4K gaming has gone nowhere, and it's been more than 10 years. You want to watch some 4K video? Sure, that's a use case, but then just get a TV, with its better brightness, slower refresh rates, and comparatively tiny price tag. I can't stop people from buying stupid crap, but I am judging them.
Personal anecdote: moving from 1080p to 2K for my computer monitor is very noticeable in games.
Going down from a 24" 2048x1152 to a 27" 1920x1080 was an extremely noticeable change. Good god, I loved that monitor; things looked so crisp on it.
Even 4K is noticeable for monitors (but probably not much beyond that), but this is referring to TVs that you're watching from across the couch.