Registered Member
|
How do I enable OpenGL for Kdenlive? When I go to Settings/Configure Kdenlive and look at the video driver drop-down, there is no OpenGL. It is on my system, and I can use it with mplayer (-vo gl2). I have Ubuntu Lucid, 10.04, and did not build Kdenlive myself; I installed it from the repository. It says version 0.7.7.1-0ubuntu1.
|
Registered Member
|
So you want video playback via OpenGL?
Video playback is done using SDL, and I'm not sure whether SDL supports OpenGL as a backend. Could you please open a feature request in our bug tracker so it is easier to track? |
Registered Member
|
Why do you feel the need for OpenGL? We primarily use XVideo via SDL, and I believe some X servers like NVIDIA's emulate XVideo via OpenGL.
|
Registered Member
|
The main reason is performance. I have some 1080 60i video; it was AVC, and I converted it to DNxHD 1080 30p, but I still have performance problems. The main culprit is most likely color conversion, which is expensive. OpenGL can do that with a shader. XVideo can also do it with the overlay, though when I use XVideo I still have one CPU core pegged, decoding is not multi-threaded, and some frames get skipped. I'm surprised that DNxHD takes that much power to decode. My CPU is not that fast, but I still thought it would manage: an Intel Core 2 6600 @ 2.4 GHz. I can almost keep up with the AVC in mplayer, which is very expensive to decode.
Anyhow, you are probably correct: XVideo is most likely good enough, since it does scaling and color conversion in hardware. How do I get multi-threaded decoding working? I recompiled FFmpeg:

FFmpeg version SVN-r24744, Copyright (c) 2000-2010 the FFmpeg developers
  built on Aug 8 2010 20:56:17 with gcc 4.4.3
  configuration: --enable-gpl --enable-version3 --enable-nonfree --enabl ds --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libtheora --enable-libx264 --enable-libxvid --enable-x11grab
  libavutil    50.23.0 / 50.23.0
  libavcore     0.3.0 /  0.3.0
  libavcodec   52.84.3 / 52.84.3
  libavformat  52.78.1 / 52.78.1
  libavdevice   52.2.1 /  52.2.1
  libavfilter   1.31.0 /  1.31.0
  libswscale   0.11.0 /  0.11.0
  libpostproc  51.2.0 /  51.2.0

I have not been able to get MLT to compile, but I didn't think I needed that to thread FFmpeg. Do I need to build everything from source to get multi-threading, or just FFmpeg? |
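For what it's worth, with FFmpeg/MPlayer builds of that era, decode threading usually had to be requested explicitly at runtime; a rebuilt FFmpeg alone does not turn it on. A rough sketch, assuming a hypothetical clip named clip.mov (and noting, as mentioned below, that threading only helps for codecs whose decoders are actually threaded):

```shell
# Ask MPlayer's libavcodec decoders for 4 decode threads.
# This is a per-invocation option, not a build-time setting.
mplayer -lavdopts threads=4 clip.mov

# The ffmpeg CLI equivalent: -threads before -i applies to decoding,
# -threads after the input applies to encoding.
ffmpeg -threads 4 -i clip.mov -threads 4 out.avi
```

These commands need real media and a working install, so they are illustrative rather than something you can paste blindly.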
Registered Member
|
Where do VDPAU and libva fit in here? Can kdenlive-svn work with a libva- and VDPAU-enabled FFmpeg from git or svn?
I have the same problems with DSLR H.264/AVC. I've tried all manner of playback options, including XVideo variations, but I'm looking at upgrading from Ubuntu 10.04 Lucid Lynx to 10.10 Maverick Meerkat rather than building FFmpeg and the playback apps from svn with VDPAU, because that breaks other packages. Meerkat is available now and builds FFmpeg with VDPAU and libva enabled by default, and I believe VLC 1.1.2git, SMPlayer, and MPlayer are being built with VDPAU and VA enabled for playback in general. |
Registered Member
|
An OpenGL output alone will not address your concern. As you said, a GL shader can do colorspace conversion, but that is a separate step on top of output. Besides, a video editor (with its numerous filters, compositing, and encoding) cannot do just some of its work on the GPU without incurring the penalty of system<->video memory transfers; nearly everything has to be done on the GPU, as in the new Adobe Premiere Mercury Engine, and that takes a lot of work. We are just a few hobbyists. So, all of a sudden, something as seemingly simple as OpenGL output has turned into a mammoth project to reimplement everything on the GPU.
Now, regarding multi-threading: FFmpeg's DNxHD decoder is not multi-threaded (the encoder is), and the H.264 decoder's threading does not work with some sources, including AVCHD. Then, in addition to Kdenlive's main thread, MLT currently has only two main threads: processing and output. However, I do have a development branch, nearing merge into master, that adds full parallel processing to the image-processing pipeline: http://www.dennedy.org/index.php?option=content&task=view&id=106&Itemid=2 You should expect that parallelism in release form by the end of the year. CPU-core parallelism and vector-instruction optimization of existing routines is easier, and more broadly applicable across architectures, than trying to do everything on the GPU. (Props to the effort behind the Mercury Engine!) |
Registered Member
|
Re: VDPAU. Please search these forums; it has already been discussed plenty.
|
Registered Member
|
BTW, on OS X we do use OpenGL output, but that was mainly a means to work around SDL integration issues. It would be possible to do the same on other platforms, but I do not think it would be any better than XVideo. A number of MLT filters operate in YUV (Y'CbCr), and for simple playback with image "normalization" (conforming resolution, scan mode, and aspect) the pipeline stays in that colorspace, and XVideo provides very good YUV playback.
|