
4K downscaled to 1080p - can it be better than HD?

vylaern
Hi,

There is an interesting video:
Why Does 4K Look Better on 1080p Monitors
https://www.youtube.com/watch?v=kIf9h2Gkm_U
which explains why 4K downscaled to HD can look better than native HD. At 3:37 the video says that when 4:2:0 4K is transformed into HD, we end up with 4:2:2 HD (not 4:2:0).

I tried downscaling 4K to HD with Kdenlive and comparing it with native HD, and the videos look the same to me - like in this video:
4K Compressed to 1080p VS 1080p: Does it make a difference?
https://www.youtube.com/watch?v=u06PFznS6PA

However, there are many examples on YouTube where 4K downscaled to 1080p REALLY does look better, so maybe I should use a different method when downscaling 4K to HD?

Did you know: 4K vs 1080p, chroma sub-sampling and why you should record in 4K even if your TV does not support it yet
http://www.phonearena.com/news/Did-you- ... et_id61878
redu
I think it's true that, in theory, 4:2:0 color-sampled 4K video can yield 4:2:2 HD if downconverted cleverly, yet I also doubt that you can see the difference with the naked eye. The same theory would have applied to downconverting 4:2:0 HD to 4:2:2 SD, but that was never made a point of before, or at least I have never heard of it. I believe there are two reasons for that.
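To put numbers on that theory: a 3840x2160 4:2:0 frame stores its two chroma planes at 1920x1080, which is exactly the luma grid of a full-HD frame. So when the luma is downscaled to 1920x1080, in principle none of the chroma resolution has to be thrown away; stored as 4:2:2, the chroma only gets halved horizontally (960x1080) instead of in both directions, which is where the "4:2:0 4K becomes 4:2:2 HD" idea comes from.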

1) 4:2:0 color sampling (just like the 4:1:1 of DVCPRO SD) is a way of reducing the bit rate before the encoding stage. The original RGB signal, which would correspond to 4:4:4 sampling, takes up a huge bandwidth for recording and transmission, so it is first converted into YUV (luminance plus two color-difference signals, U derived from B-Y and V from R-Y), and then U and V are sampled at half the resolution of Y (luminance), hence 4:2:2, or even less, like 4:2:0 or 4:1:1. But again, this bit-rate reduction is applied to the uncompressed video signal. It's a wise method, since you can throw away a huge amount of color data (up to 50% with 4:1:1 and 4:2:0) before encoding and your eyes will still tolerate the loss. The loss only becomes apparent when you start color grading, layering multiple video tracks with chroma-key effects and so on, not when you just watch the video on a display. So what we like to call uncompressed video may, if it is color-sampled 4:2:0 or 4:1:1, in fact already be reduced by up to 50% in this first stage even before any encoding. For that reason, footage downconverted from 4K 4:2:0 to HD 4:2:2 won't make any noticeable difference to your eyes compared with the same footage shot natively in HD 4:2:0 - even with uncompressed (not encoded) signals.
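Spelled out for an 8-bit 1920x1080 frame: 4:4:4 carries 3 x 2,073,600 = about 6.2 million samples per frame, 4:2:2 carries 2 x 2,073,600 = about 4.1 million (a third less), and 4:2:0 or 4:1:1 carry 1.5 x 2,073,600 = about 3.1 million, i.e. the 50% saving mentioned above, before any encoder has even touched the signal.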

2) The 4:2:0 signal then goes into the encoding stage and gets heavily compressed further, which makes everything worse, so I would always avoid unnecessary encoding stages. The workflow you mention involves two of them: 4K 4:2:0 shooting -> encoding -> decoding -> downconverting to HD 4:2:2 -> encoding -> decoding -> watching on a display, which to my knowledge is less preferable than HD 4:2:0 shooting -> encoding -> decoding -> watching on a display. So if HD video is your target in the first place, I would recommend shooting 4:2:0 HD directly (for example with an AVCHD camcorder) rather than shooting 4K 4:2:0 and then downconverting it to 4:2:2 HD, which involves two encoding stages.

3) But on the other hand, yes: if a 4:2:0 color-sampled 4K signal is the only source you have, it may be wiser to downconvert it to 4:2:2, provided you have the bandwidth / space for that.
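For anyone who wants to try that route, a minimal FFmpeg sketch of such a downconversion could look like this (file names and the CRF value are only placeholders, libx264 has to be built with 4:2:2 support, and note that many hardware players cannot decode 4:2:2 H.264):

ffmpeg -i clip_4k_420.mp4 -vf scale=1920:1080:flags=lanczos -pix_fmt yuv422p -c:v libx264 -crf 18 -c:a copy clip_1080_422.mp4

The scale filter resamples luma and chroma together, and -pix_fmt yuv422p keeps the extra chroma resolution that would be discarded again if the output stayed 4:2:0.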
vylaern
In this video, 4K downscaled from a mobile Samsung Galaxy looks better than HD from a Canon T4i DSLR (please watch in HD).
https://www.youtube.com/watch?v=55SWtb1Hm7w

And here is a very good example:
4K VS HD: Side By Side Comparisons (Part 3) (Using Same Camera)
https://www.youtube.com/watch?v=7tuMw7ThT1M
(even at 720p on YouTube, the 4K footage in this video looks better than the HD)

Galaxy NOTE 3 versus 5D Mark III - 4K in a cell phone!
https://www.youtube.com/watch?v=pLiehzLTxXk
CorrosiveTruths
What matters is results.

If you get a 1080p camera that produces the best results, use it; if it's a 4K one, and you want to make 1080p video by shrinking its output, and that looks better, do it (and it well might - resolution is important, but so are many other factors).

If you play a videogame in 4K and then downscale to 1080 and it looks better, then go for it (this is literally brute-force anti-aliasing).

But there's nothing magical here, and the reason we use 4:2:0 chroma subsampling is that you can't tell the difference. That's the point. Don't convince yourself you can - especially with YouTube video, because it's all 4:2:0 anyway.
TheDiveO
If you happen to be blessed with a 4K sensor that has more noise than a full-HD sensor then, well, you may want to preserve that noise. Especially when the 4K sensor, with some luck, has the same size as the full-HD sensor: smaller sensor pixels mean less charge per frame, more gain, and more pain in the form of noise. A decent denoiser can compensate for this to some degree. Then again, your optical system may not even be matched to the 4K sensor - but nothing a sharpener can't "correct" in consumer 4K cams. ;) All of this strongly depends on the specific scenes and conditions, though.

I've worked a lot with the 2.7K material from GoPro Hero cameras. I liked it because it gave me headroom to crop in post and adjust the framing the way I wanted. But otherwise 2.7K wasn't better in any way, especially given the "optical system" of the Hero series.
TheDiveO
Oh, did I mention that you need to have those Gold DRAMs? Gold DRAMs have much less cell noise than normal Office DRAMs. You can immediately tell from the tube picture; it's much more vivid!
CorrosiveTruths
Okay, so I caught up at home here, and tested it.

The only claim that really held up was: play 4K videos on your 1080p display and they will look better and have less colour bleed - and it worked (shockingly well, to the point that I might start uploading pixel-doubled versions of my videos).

The other claims were too dependent on content. For example, with the still shot I used, uploading a 4K video and viewing it at 1080p quality looked worse than uploading the 1080p version and viewing it at 1080p quality.
TheDiveO
CorrosiveTruths, which playback device did you use that can swallow 4K material? I'm asking because I've met a lot of smart TVs that weren't exactly smart when it came to footage in formats other than up to 1080p50.
TheDiveO
CorrosiveTruths wrote:The other claims were too dependent on content. For example, with the still shot I used, uploading a 4K video and viewing it at 1080p quality looked worse than uploading the 1080p version and viewing it at 1080p quality.


The differences you are seeing may be because TV manufacturers seem to focus more on video, while photos seem to matter less to them. I noticed that on my own smart TV, a Samsung 2013 model that does 100 frames/s interpolation with prediction. I have diving footage at 25 fps, and the frame interpolation does an incredibly good job of bringing it up to 100 fps. In addition, Samsung plays tricks to reduce the natural motion blur of 25 fps footage, to create the impression that it runs at 50 fps or more.

But for simple photos ... well, even a cheap digital frame does better...
CorrosiveTruths
TheDiveO wrote:CorrosiveTruths, which playback device did you use that can swallow 4K material? I'm asking because I've met a lot of smart TVs that weren't exactly smart when it came to footage in formats other than up to 1080p50.

This was done with a computer.
TheDiveO wrote:
CorrosiveTruths wrote:The other claims were too dependent on content. For example, with the still shot I used, uploading a 4K video and viewing it at 1080p quality looked worse than uploading the 1080p version and viewing it at 1080p quality.


The differences you are seeing may be because TV manufacturers seem to focus more on video, while photos seem to matter less to them. I noticed that on my own smart TV, a Samsung 2013 model that does 100 frames/s interpolation with prediction. I have diving footage at 25 fps, and the frame interpolation does an incredibly good job of bringing it up to 100 fps. In addition, Samsung plays tricks to reduce the natural motion blur of 25 fps footage, to create the impression that it runs at 50 fps or more.

But for simple photos ... well, even a cheap digital frame does better...

Nothing to do with the display or frame interpolation in this case; I was screenshotting frames from the YouTube video, so I'd know whether it was the video itself or not.

I'd have to redo it and document it properly, but essentially I created some images that would make subsampling obvious (a 1080p spiral with hard-edged orange / blue lines, resized without interpolation to 3840x2160) and then uploaded that as lossless video, along with a couple of other experiments.
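For reference, the "resize without interpolation and encode losslessly" part can be reproduced with something like this FFmpeg sketch (spiral_1080p.png is just a made-up name for the test image):

ffmpeg -loop 1 -t 10 -i spiral_1080p.png -vf scale=3840:2160:flags=neighbor -c:v libx264 -qp 0 -pix_fmt yuv444p spiral_4k_lossless.mp4

flags=neighbor selects nearest-neighbour scaling, so each source pixel simply becomes a 2x2 block, and -qp 0 makes the x264 encode itself lossless.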

You can see the results here if you like - > https://www.youtube.com/channel/UCW9bBK ... j2OzaYQ6gA

Like I say, the only real result of note was the 4K version of the video viewed on a 1080p display.

Now I just have to figure out how to double the image size without interpolation in ffmpeg or MLT - if that's even possible.
TheDiveO
Well, the different video players still do some image processing of their own - that's why I'm asking. For instance, there are quite noticeable differences between VLC and, say, Microsoft's video player, or Samsung's on Android, and so on.

There's some information about FFmpeg's scaler here: https://ffmpeg.org/ffmpeg-scaler.html ... I'm not sure whether this would be bit-exact, though.
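That page does list a "neighbor" flag, which simply duplicates pixels without interpolation, so something along these lines might do the doubling (only a sketch; file names are made up):

ffmpeg -i input_1080p.mp4 -vf scale=iw*2:ih*2:flags=neighbor -c:v libx264 -crf 18 -c:a copy output_2160p.mp4

With a pure 2x pixel doubling there is nothing to interpolate or round, so the scaling step itself should be exact; the re-encode afterwards of course is not, unless it is run losslessly.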
CorrosiveTruths
Thanks, but that was the first place I looked (the MLT documentation was the second). I'll probably just start a new thread and see what people come back with.
jonesmaria
1080p looks better in a direct comparison when not zoomed in - sharpness and colors look better. When zoomed in, 4K looks much better. But without a 4K monitor, 1080p is OK for me at the moment. It's like with 4K TVs: a cheap 4K set looks much worse than a good 1080p one. And really, at more than 2 metres' distance you can't see the difference in resolution, but you can see better colors and contrast.
sirobinson
When you have 4K footage and downscale it to 1080p (Full HD), the image is going to look better than it would at native 1080p. You'll find the picture is a lot sharper, the colors more vivid, and (depending on the properties of the image) you'll also see less noise. ^-^

https://www.videoproc.com/edit-4k-video ... lution.htm
https://en.wikipedia.org/wiki

