Registered Member
Hello!
So, from KDE 4.11 onward, the screen tearing prevention feature was introduced. It's nice and all, but I never really experienced tearing before KDE 4.11 came out, and now I have screen tearing when desktop effects are not turned on. It's very annoying and very noticeable. I'd leave desktop effects on all the time, but they cause various issues when watching some Flash videos in Chrome (usually at full screen), and even worse, when trying to use wine-silverlight and Pipelight to watch Netflix: the screen will flicker during both, until the X server finally crashes.

So is there a way to make it stop tearing without desktop effects on? Then I could just turn them off when I watch Netflix or Flash, or use the handy "disable desktop effects in full screened applications" feature, which I can't use now because there is tons of video tearing when I do. Alternatively, if anyone has a fix for the screen flickering issue, I'd love to hear it.

I am using Arch Linux with an Nvidia 8800 GTS card and the official nvidia driver package. My CPU is an AMD FX 6300 and I've got 8 GB of RAM. I also have dual monitors, though I don't think that matters much, as I've encountered this issue on my laptop, which has an Intel card. I'm using OpenGL 3.1 for my desktop effects, set to native instead of raster. I find this works best for me, but playing around with those settings hasn't helped with the aforementioned issues.

I should note that I didn't start having problems with the screen flicker until KDE 4.12 came out and I upgraded from the nvidia 337.12 drivers. When I first encountered the issue, I downgraded to the previous version of the drivers, but I can't do that now since it'd require me to go back to an old kernel and downgrade a whole bunch of stuff.

Any help would be greatly appreciated. Thanks!
> now I have screen tearing when desktop effects are not turned on
This is completely unrelated to KWin; it's probably due to a driver update. Without compositing, KWin has zero impact on the on-screen display (except for the titlebars). What's the output of
> I also have dual monitors, though I don't think that matters much

Please notice that virtually no consumer card can technically sync to more than one screen, i.e. the output is synced to display #1 XOR display #2 - never both. Usually it's the primary one.

> as I've encountered this issue on my laptop, which has an Intel card

That's even weirder, because it rules out KWin (no compositing) as well as the driver -> video player? (Which one do you use? And please don't say "flash" - Flash is just broken. Please check mplayer or VLC.)
Registered Member
Thanks for the reply!
You're probably right about it being a driver issue; I thought it might not be because it happened with the laptop/the Intel card, too. The laptop actually does have desktop effects, which I never turn off, and they generally work pretty well. I'm using Intel's open source video drivers. Anyway, the laptop doesn't matter too much; I don't really use it to watch Netflix, and that's where this issue is bothering me more than anything else. The output of that command is:
I use either VLC or SMPlayer; I've been using VLC more lately. I don't have any problems watching videos in either of those, they both work great. Like I said, it's mainly really bothersome when I want to watch Netflix, which I do a lot, and if I could get the tearing to stop without desktop effects, I'd be good. I only figured KDE changed some setting because I never had issues with tearing without desktop effects before KDE introduced tearing prevention. Thanks for your help!
First get rid of the framebuffer console:
https://wiki.archlinux.org/index.php/GR ... ramebuffer

See if the issue, and especially the other NVRM messages, remain. Is this btw. an "optimus" system (i.e. Intel + nvidia in one box)?

As for (sigh) Flash: do you use HW decoding? (Relevant is /etc/adobe/mms.cfg, see https://wiki.archlinux.org/index.php/Br ... ash_Player)
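For reference, hardware decoding for Flash is toggled in /etc/adobe/mms.cfg; the two settings below are the ones commonly recommended for VDPAU decoding on nvidia cards (OverrideGPUValidation is only needed when Flash's internal GPU blacklist rejects the driver — whether you need it is an assumption):

```
# /etc/adobe/mms.cfg
EnableLinuxHWVideoDecode=1
OverrideGPUValidation=true
```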
Registered Member
Ok, so enabling HW decoding has so far fixed my Flash issues, so thanks for that!
Unfortunately, disabling the framebuffer doesn't fix wine-silverlight or the video tearing. It did make my NVRM message look normal, though - at least until the screen flickered and the X server crashed after I tried to make Netflix go full screen. So now it provides this handy bit of error logging:
Googling the error code led me to this, so it appears to be part of an issue that's been around for a while. I'm actually wondering now if it'd be possible to downgrade back to the version that worked for me via the file provided by nvidia on their website instead of using the Arch package. And nope, my desktop (the specs for it are in my OP) just has a regular ol' Nvidia 8800 GTS and no other graphics cards.
Administrator
To downgrade package versions, please see https://wiki.archlinux.org/index.php/Do ... g_packages
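A sketch of the cache-based downgrade from that page, assuming the old packages are still in pacman's cache (the file names and version are examples based on the 337.12 release mentioned earlier in this thread):

```shell
# Reinstall the cached older packages (paths/versions are examples):
pacman -U /var/cache/pacman/pkg/nvidia-337.12-1-x86_64.pkg.tar.xz \
       /var/cache/pacman/pkg/nvidia-utils-337.12-1-x86_64.pkg.tar.xz

# Optionally pin them so -Syu doesn't pull the new version back in,
# by adding to /etc/pacman.conf:
#   IgnorePkg = nvidia nvidia-utils
```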
KDE Sysadmin
Registered Member
I have had two Nvidia cards and the same problem with both. My KDE main panel is configured to autohide, and when it appears, it shows flickering. It usually happens when my PC wakes up from hibernation or suspend.
Moreover, right after waking from those states, the screen is corrupted. If I disable compositing (Ctrl+Alt+F12 is the only way, because the screen is like a puzzle of triangles) the screen is fine and I can activate compositing again. This only happens when resuming from suspend or hibernate.
nvidia + STR bug: https://bugs.kde.org/show_bug.cgi?id=323686

Please ensure you're not using a framebuffer console. It might also be just a lack of reposting the framebuffer on wakeup (this can usually be configured in the BIOS/UEFI).
I take it this in general means "wine"? Do you get tearing in OpenGL games or only in Direct3D ones (where wine has to map D3D -> OpenGL)? Do you get tearing in Linux games (that support synced swapping) like e.g. ioquake3? (There's also a free set of weapons & maps, you don't need the original Quake 3.)

That's not normal. It could either be heat or a specific nvidia config. Can you post/paste "nvidia-settings -q all"? (That is MUCH text!) What is btw. the used yielding strategy? (env | grep __GL_YIELD) Try:
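A minimal sketch of inspecting and setting the nvidia yield strategy; the USLEEP value shown is the one the original poster later reports having tried, and the kwin invocation in the comment is the KDE4 one:

```shell
# Show the yield strategy currently exported; no output means the
# driver default is in effect:
env | grep __GL_YIELD || echo "not set"

# To set it for a single process only, a kwin restart would look like:
#   __GL_YIELD="USLEEP" kwin --replace &
# Verify that a child process inherits the value:
__GL_YIELD="USLEEP" sh -c 'echo "$__GL_YIELD"'
```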
Registered Member
I also experience this problem and have to leave compositing effects enabled while playing fullscreen video or video games. Why I need KWin's compositing effects enabled to avoid tearing, I will probably never know.
You probably need to activate overlay syncing in nvidia-settings, or rather sync to the "correct" device ("X Server XVideo Settings").
Also, the video player must not use plain X11 as its sink (use xv or vdpau instead), and in the case of an OpenGL sink it must support synced buffer swapping. The compositor merely syncs the unsynced output as a side effect.
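A sketch of forcing a synced sink, assuming mplayer as the player (the file name is a placeholder); the nvidia-settings attribute is the XVideo sync setting exposed in its GUI, and the value 0 (first display) is an assumption:

```shell
# Prefer driver-synced sinks over plain x11:
mplayer -vo vdpau video.mkv   # decoded and synced by the nvidia driver
mplayer -vo xv video.mkv      # XVideo overlay (synced only if the
                              # overlay syncs to the screen you watch on)
# mplayer -vo x11 video.mkv   # never synced - this one tears

# Point the overlay sync at the intended display (attribute name as
# shown by nvidia-settings; 0 is an assumed display id):
nvidia-settings -a XVideoSyncToDisplayID=0
```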
Registered Member
Ok, I'm back, hello!
After looking around, it seems the solution to this is supposedly to enable triple buffering in my nvidia settings. I've tried to do this three different ways (one of them just for KWin) and none of them work; I still have screen tearing when desktop effects/compositing are turned off. Here's what I've tried: I attempted this solution and it didn't work. I also generated an Xorg file using nvidia-xconfig and added this:
This also did nothing. Previously I had added the same code to the /etc/X11/xorg.conf.d/20-nvidia.conf file (it's still there) and it also didn't change anything. How can I find out if triple buffering is actually enabled?

Also luebking, my env | grep __GL_YIELD currently returns nothing, but before it contained export __GL_YIELD="USLEEP", which didn't do anything. I tried changing it to export KWIN_TRIPLE_BUFFER=1, as outlined in the Arch forum post I linked above and also the Arch wiki. I asked the Silverlight/Pipelight guys (on their IRC channel) about using those GL env variables while running the program, and they said to just run Firefox with them, as that's what Pipelight is using when I watch Netflix. So I did, and it didn't do anything, either.

That said, because you brought up playing games through Wine, I'm going to give that a try soon. I used to play World of Warcraft (using OpenGL, not DirectX) a couple years ago, and while it did work flawlessly, I experienced video tearing with it sometimes. I'll try it with Call of Duty 2 or something to see if I have trouble with Direct3D games. I'll try ioquake3, too.

Anyway, the output of the nvidia-settings command is indeed really long lol, so I mercifully put it in a pastebin - http://pastebin.com/DGQJK7hg I searched it and didn't find anything about triple buffering in it, but maybe there's something else useful in there. Thanks for all your help!
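For comparison, a typical Device section with the option set looks like this ("TripleBuffer" is the documented nvidia X driver option; the Identifier string is just an example):

```
Section "Device"
    Identifier "Nvidia Card"
    Driver     "nvidia"
    Option     "TripleBuffer" "True"
EndSection
```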
Triple buffering is only remotely related to video syncing (it prevents the sync from blocking the process), and the _only_ way to activate it is to pass it as an option to the X11 server.
To check for it, simply
should print sth. like
It has no impact on the actual driver behavior (but it does on KWin!) and should only be used if the heuristics fail. Also, it has zero impact if compositing is disabled.

Regarding "wine-silverlight or the video tearing", the only "fishy" thing I see in your settings is "Attribute 'XVideoSyncToDisplayID' (Tardis:0.0): 3." You apparently have 3 screens attached to the GPU, and the overlay video (NOT OpenGL - we're talking about video playback atm) is synced to the 3rd one. However, _everything_ else (refresh rate, color range, metamodes) only deals with the DFP-[0,1] displays, so if that 3rd display is the DVI-I-3, you sync to nowhere. It somehow seems the DVI-I-3 is added via the xinerama order, but not actually physically attached?
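A sketch of the check described above: the X server logs every device option it accepts, so grepping the log answers "is triple buffering actually enabled?". Since the real log isn't available here, the grep is demonstrated against a sample of the line the nvidia driver typically prints (the exact wording is an assumption and may vary by driver version):

```shell
# On a live system:
#   grep -i "triplebuffer" /var/log/Xorg.0.log
# Demonstrated against a sample log line:
sample='(**) NVIDIA(0): Option "TripleBuffer" "True"'
printf '%s\n' "$sample" | grep -i "triplebuffer"
```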
Registered Member
I just want to report in here that I solved the issue (not really) by downgrading back to version 337.12 of the nvidia drivers, so now I can watch Netflix again without the screen flicker and X server crash. I know this isn't a real solution, but I was getting annoyed and impatient with it. To get back to the old version, I compiled the PKGBUILDs against my current version of the kernel and it worked, after some nice folks in the Arch IRC channel told me I could do that.
In a couple weeks some friends of mine are giving me their slightly old computer, which has the same video card as my current desktop. I'm going to put Arch + KDE 4 (and other desktop environments) on it and do some testing related to this issue. I'm hoping this won't be an issue with Frameworks and Plasma 5. But thanks for all your help!
Once more: this has *no* relation to KDE - the facts that a) this happens especially WITHOUT compositing, and b) reverting the driver "fixed" it, clearly indicate that this is an issue in the driver. If you don't believe this, try logging into a failsafe session (that's where you get nothing but an xterm), start your browser there and try to play back the video.