Registered Member
Hello all,
When I import an mp4 clip, drop it into the timeline, and render it out with a lossless format (the only ones I have tested), the last 5-7 frames seem to be held/duplicated. I am importing an HD slideshow clip (animation) by filename pattern, 300 frames with the frame number in the center of the image (another issue: the slideshow clip seems to loop at the end and put the first frame at the end, so I always have to trim it off). When I render this file out as FFV1, mpeg-2, mpeg-4, or h264, I overlay the frame number just to double-confirm (it starts at 0 whereas my frames start at 1). All the files play fine in VLC, although the mpg and h264 versions have trouble displaying the last several frames.

The problem of duplicate frames happens when I re-import the clips back into kdenlive. The mkv is the only one that works flawlessly, but it is massive and not good to work with. The mpg completely drops the last 2 frames, the mp4 duplicates the last several frames ending at 291, and the h264 mp4 starts duplicating at 292. This happens both in the timeline and in the clip monitor. I'm not sure where the problem is or how to resolve it, and it is becoming increasingly frustrating: I work with animation sequences where each frame is very important, so lost frames are unacceptable. Any help would be much appreciated. I can upload the files and project if need be. thanks
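One thing that might help narrow down whether the frames are missing from the rendered files themselves or only getting lost on re-import: ffprobe can count the decoded frames directly (the file name here is just a placeholder):

ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=noprint_wrappers=1 render.mkv

If that reports 300 for every rendered file, the problem would seem to be on the kdenlive/MLT import side rather than in the render itself.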
Registered Member
OK, just to add some more information after testing: The slideshow clip is made up of 8-bit tifs.
I also just noticed that the FFV1 mkv is brightening up the image. I compared it to the original by extracting a frame and comparing, and there is definitely a difference; maybe a gamma correction is happening during the render? I'm not sure, but it is definitely not desirable for lossless output.

EDIT: The brightening is not just the FFV1 mkv; it is happening for all rendered output formats.

EDIT2: The brightening is not in the file, it happens when the file is displayed through kdenlive. I tried ffmpeg/avconv in a terminal, imported the result into kdenlive, and had the same issues.
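For reference, the comparison I did was roughly along these lines: extract a frame from the rendered file and diff it against the source tif (file names are just examples, and ImageMagick's compare needs both images at the same resolution):

ffmpeg -i render.mkv -vframes 1 extracted.png
compare -metric AE frame0001.tif extracted.png diff.png

A non-zero AE count just means pixels differ; in my case the brightening was obvious by eye anyway.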
Registered Member
Hi, how were the 8-bit tifs created? Are they RGB?
Your brightness changes are probably just a YCbCr <-> RGB levels mapping problem, i.e. restricted range levels treated as full range, or some similar mix-up either in kdenlive or in the external preview / image extraction.
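If you want to test that theory outside kdenlive, the scale filter in ffmpeg/avconv lets you force how the levels are interpreted when converting back to RGB for a frame grab; something like this (file names assumed, untested on your build):

ffmpeg -i render.mkv -vframes 1 -vf scale=in_range=tv:out_range=pc -pix_fmt rgb24 check_limited.png
ffmpeg -i render.mkv -vframes 1 -vf scale=in_range=pc:out_range=pc -pix_fmt rgb24 check_full.png

Whichever of the two matches your original tif tells you how the levels in the file were actually encoded.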
Registered Member
hi yellow,
thanks for your reply. yes, they are RGB tifs, so that could explain things. I read about a possible mapping/color space conversion issue in my investigation, but assumed that if kdenlive could display them in the correct color space when I first drop them in the timeline, then something else was amiss... I really just wanted to render out proxy clips instead of using the slideshow clip of raw images, since it is very slow to play through them. not sure it is worth it at this point... thanks
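In case it's useful to anyone else, the workaround I was considering was to pre-encode the tif sequence to a small intermediate myself rather than rely on kdenlive's image proxies, roughly like this (untested as written; the frame rate, size and paths are just guesses for my project):

avconv -r 24 -i frames/frame_%04d.tif -s 960x540 -q:v 3 -c:v mpeg2video proxy.mpg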
Registered Member
OK, thanks for the tip, yellow. I just noticed that in the advanced settings of the clip properties there is an option to force the colorspace. I changed that to ITU-R 601 (I have no idea what that is) and the brightness problem goes away.
I'm now converting the image sequences into DNxHD files via avconv (rough command below) and re-importing them into kdenlive; that seems to work well. Something else I noticed: Real Time (drop frames) does not work too well with image slideshow sequences. If it drops a frame it seems to freeze the image in the monitor, but the timeline progresses; hitting pause will update the monitor to the current frame. thanks
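The avconv line I'm using is roughly this (paths, frame rate and bitrate are specific to my setup; DNxHD only accepts certain resolution/rate/bitrate combinations, so -b:v has to be one of the valid values, e.g. 36M/115M/175M for 1080p at 24 fps):

avconv -r 24 -i frames/frame_%04d.tif -c:v dnxhd -b:v 115M -pix_fmt yuv422p slideshow_dnxhd.mov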
Registered Member
The ITU 601 vs 709 question, with regard to conversion between RGB and YCbCr video, is about the luma coefficients (color matrix), which are part of the calculation in handling the RGB color values, including the brightness. So it's important that the same coefficients (color matrix) are used for the conversion from YCbCr to RGB as were used going from RGB to YCbCr.
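For reference, the two sets of luma coefficients are:

ITU 601: Y' = 0.299 R' + 0.587 G' + 0.114 B'
ITU 709: Y' = 0.2126 R' + 0.7152 G' + 0.0722 B'

Decoding with the wrong set doesn't change overall brightness much; it shifts how the three channels contribute to luma and chroma, which is why the error mostly shows up as the hue and contrast shifts described below.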
Getting the color matrix wrong is not as strong a difference as, say, assuming the wrong restricted vs full luma levels in the conversion, but when putting source and output side by side a difference can be seen, generally in contrast and in a discrepancy of orange <> pink, blue <> green. http://www.kdenlive.org/mantis/view.php?id=2808

There's currently a discrepancy between the Affine and Composite transitions, SDL preview and rendered output in kdenlive, so it can get a bit mixed up when trying to establish where the 'wrong' conversion happens and what to trust. http://www.kdenlive.org/mantis/view.php?id=2934

Also, when encoding RGB images to video, if the color matrix is not declared by the encoder, an ITU 709 color matrix will almost certainly be assumed for HD and an ITU 601 color matrix for SD in any conversion back to RGB, based purely on pixel count. Forcing the color space in the clip properties is basically you telling ffmpeg -> MLT -> kdenlive that it got the color matrix wrong (there's an example at the end of this post of how to check this outside kdenlive). This may not be technically correct if other parts of the chain are screwed up; you may see consistency simply because one screw-up reverses another, but is it actually as the camera captured it?

Just to add another layer of confusion, a tool like VLC with hardware accelerated decoding on, or MPlayer or Xine, will not necessarily respect the metadata regarding full or restricted levels, or the color matrix, giving 'wrong' output again.

You can create proxies for images, but it doesn't work so well because proxies for images don't appear to go below project resolution. So if you are, say, adding DSLR 4k photos to a 1080p project, the resolution will not drop below 1920x1080, which may still be too large for smooth playback. You can check your image proxy sizes in the proxy folder within your project folder on your hard drive if you wish.
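To see what the two interpretations look like outside kdenlive, the scale filter can be told which matrix to use when converting to RGB for a frame grab; comparing the two PNGs against the original source shows which matrix the file was really encoded with (a sketch, file name assumed):

ffmpeg -i render.mov -vframes 1 -vf scale=in_color_matrix=bt601 -pix_fmt rgb24 check_601.png
ffmpeg -i render.mov -vframes 1 -vf scale=in_color_matrix=bt709 -pix_fmt rgb24 check_709.png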
Registered Member
hi yellow,
sorry for the delay, had a project to finish. That is a lot to digest, but thanks for your detailed response. I am not familiar with a lot of the terminology and concepts of digital video and the conversions you describe, but I think I understand most of what you wrote. I used mostly 16mm back when I was into film and never had to deal with this conversion business in digital space.

I seem to be experiencing what you posted in 2808. I guess using ITU 601 is a hack and not necessarily the best solution, but it works? I am not using VLC to verify the color error; I have the original RGB image files rendered out of a compositing program, and I use a good viewer called mplay by SideFX Software to examine my images side by side with what kdenlive is displaying, as well as a png extracted from the project monitor.

Using avconv to go to DNxHD was a great solution. I really like this codec for editing and it preserves most of the detail, but the file sizes are rather large.

Is there a way to ensure that kdenlive will import and apply the "correct" color matrix for my RGB files, so that once encoded they look like the original files when the movie is viewed in VLC, etc? Is there something I can do to the RGB files or the converted DNxHD file (like attach metadata) to let kdenlive know how to process it correctly? thanks for your time.
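The closest thing I've found so far is ffmpeg's colour tagging options; something like this is what I had in mind, though it's untested and I don't know whether my avconv build supports these flags or whether the tags actually survive into the DNxHD file for kdenlive to read:

ffmpeg -r 24 -i frames/frame_%04d.tif -c:v dnxhd -b:v 115M -pix_fmt yuv422p -colorspace bt709 -color_primaries bt709 -color_trc bt709 slideshow_dnxhd.mov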
Registered Member
hi zephyr707, the basis for the choice of color matrix luma coefficients between ITU 601 & ITU 709 is generally governed by resolution.
So any YCbCr video up to but not including 720p (1280x720) would be classed as SD (Standard Definition), and 720p and above, disregarding Cinema resolutions and DCPs (Digital Cinema Packages), would be classed as HD (High Definition). When RGB source is encoded at SD resolutions, an ITU 601 color matrix would generally be specified; when encoding RGB source at HD resolutions, an ITU 709 color matrix would generally be used.

There are exceptions to this premise: not only camera manufacturers like Canon, Nikon and the Panasonic GH3, which even though they are HD use an ITU 601 color matrix and even more outdated coefficients, but also, of course, bad handling by some software and human intervention encoding SD sources with ITU 709 and vice versa. But unless you put the original source against extracted frames from the final encoder output, assuming those extracted frames have actually had the 'correct' color space conversion done, no one would probably notice, or they'd assume the camera produced **** skin tone, too orange say, unaware that the camera actually produced perfectly good skin tone and that the processing chain they used screwed up the conversion. Say, assuming ITU 709 for an ITU 601 encoded source: pinks go to orange, blues to green, and they set about applying a 3-way color adjustment to 'correct' something they screwed up without knowing. And with regard to the composite transition in kdenlive: greys to green-tinted greys.

But as long as whichever color matrix is used in the color space conversion (whether you choose it or your encoding app chooses on your behalf) when encoding the original RGB image source to YCbCr, like DNxHD (ITU 709), you just make sure that's how the video gets converted back to RGB for frame grabs and display.

Overriding the handling of the color matrix in kdenlive is not so much a 'hack' as a useful option to override what ffmpeg is telling MLT/kdenlive the source is assumed to be and therefore how to display it. In fact it's a useful option missing from numerous NLEs.

So you really just need to establish what is happening in the processing chain: what color matrix the encoder uses for your source, specify it if you can in the encoder parameters, decide if that choice is 'common' for that resolution, check what is being flagged in the video stream using a tool like mediainfo, and then check whether your trusted player / frame extraction process is actually doing the correct thing too.

The other factor which may have a bearing on brightness and contrast is the luma range used and assumed in the chain. Going from RGB to YCbCr should map 0 - 255 RGB to 16 - 235 (luma) & 16 - 240 (chroma) in YCbCr, and the conversion from YCbCr to RGB should be the reciprocal. Again, that is a chain you may need to establish is actually happening.

What resolution are your RGB tiff source files? When you encode them with avconv in a terminal, what does it tell you about that process (pixel formats, swscale etc.), and using a tool like mediainfo, what has been flagged in the stream? If you extract an image from the DNxHD in your viewer app, is it identical to your RGB source?
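To answer the 'what's been flagged' part yourself, mediainfo on the command line will print the colour related fields when they are present in the stream (they're simply absent if the encoder didn't write them); something like this, file name assumed:

mediainfo slideshow_dnxhd.mov | grep -i -E 'matri|primar|range'

And for the side by side check, a frame grab with the matrix forced one way or the other, as per the scale filter commands in my earlier post, is a reasonable way to take the player out of the equation.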