KDE Developer
|
Well, it's obvious that we wouldn't break keyboard+mouse - why should we? It's about which features should also be reachable through touch gestures. |
Registered Member
|
One might have expected the same of the desktop version of Windows 8, but that expectation would have been bitterly disappointed; that's why I wanted to make sure.
What happens if an application implements a gesture which is also defined as a global gesture? Does the application's implementation of the gesture always "win" if it's performed over the application's window?
If it's done in the application, we'll have a rather inconsistent experience, as only those applications that really care about touch would implement it. I was recently trying to use my system with only touch and keyboard when the batteries of my mouse ran out and the charger didn't work. The thing I missed most during that time was indeed the right mouse button - mostly for context menus, but there may be other uses of the right mouse button that touchscreen users need. Therefore I'd vote for at least thinking through the possible implications of such an emulation a bit further before dismissing it. |
KDE Developer
|
Just like with global keyboard shortcuts: the global gesture wins. At a global level we cannot know whether a given touch sequence will be recognized by the application; there is no feedback from the application to the compositor. On a technical level the compositor sends all touch events to the application. The application can do with them what it wants, but must expect that the compositor says "whoops, I'll take this gesture, no further events for you now". That's of course something to consider when selecting the global gestures: everything bound to the screen edges or corners is a good candidate for a global gesture, as applications can never know whether they border the screen.
Similar to what I just elaborated above, the compositor lacks information about what the application does: we don't know whether a context menu would be triggered at that point, and turning the touch event into a right click means taking it away from the application. Otherwise we might end up in a situation where the compositor emulates a right click AND the application emulates a right click. In the end only the application can know what to do with the event, thus it needs to be in the application. |
Registered Member
|
Maybe it would be possible to have two groups of global touch gestures?
One group with things like screen-edge support, application switching, and so on, and the other with gestures that make non-touch applications more usable, e.g. right-click emulation. Applications could opt out of the second group, but not the first one? Since a lot of (most?) KDE applications do not have special touch support / UI, things like an emulated right click would help a lot. |
KDE Developer
|
An opt-out won't help. We might get KDE's applications to opt out, but we won't get e.g. GTK to implement our opt-out. And if we adjust the applications anyway, we can adjust them to be correct from the start. |
Registered Member
|
Windows 8 emulates a right click with a long static press on the touch screen, but only in Desktop mode, not in Modern UI mode. When you long-press, a square appears around your finger, and the right click is sent when you lift your finger. This makes it possible to use any existing Windows application without a mouse (aside from full-screen games). From an implementation point of view, I think the whole desktop is considered one Modern UI application, and anything on the desktop is contained in that application, which enables these separate behaviours. It seems difficult to reproduce in Plasma, but maybe you can take some inspiration from it.
Unfortunately, on Linux, a lot of desktop applications are unusable without right-clicking. Of course, if the long press is captured by KWin, it might break a few applications, but I think applications that are unusable without a right click are much more common than applications requiring a long press. I don't see how you can make all desktop and all touch applications work everywhere with only a touch screen. In my opinion, in Plasma Desktop you should make sure that all desktop applications are usable, and touch-centered applications can be considered second-class citizens. This means that right-clicking should be emulated in Plasma Desktop. Conversely, Plasma Active should leave the long press to the application and not care about the possibility to right-click. If an easy way to switch between Plasma Active and Plasma Desktop is provided, I think that could be a good balance. |
KDE Developer
|
ok, seems like I have to think about the right click. On a technical level we know whether the application can:
* accept mouse events
* accept touch events
If it binds touch, we should expect that the application handles it properly. If it doesn't, but accepts mouse events, we could emulate the right click. Furthermore, we could send the right click to all X11 windows. |
KDE Developer
|
One thing we should probably also look into, and which is a much more sensible thing to do on a notebook, is global touchpad gestures. These could be similar to the ones on the touchscreen, just at a smaller scale, and right next to your palm.
|
KDE Developer
|
I'm not sure whether we can recognize touchpad gestures. Because of that I would prefer to focus this thread just on touchscreen gestures. |
Registered Member
|
hey there,
I want to improve touchscreen support for KDE, too. I'm not yet a developer here, but that's only a matter of a couple of weeks. So far I've combined my thoughts with your ideas and came to the following summary:
1. We should concentrate only on touchscreens. Touchscreens aren't the future anymore, and KDE should be more up-to-date and catch up with other environments.
2. The user should have the possibility to configure gestures. We should give users a variety of gestures (tap, hold, swipe/drag, ...) with basic operations (e.g. emulating right clicks) and the possibility to alternatively bind commands to them (in a way like shortcuts). This would be the most configurable and thus by far the best solution, because we don't force anyone to use our configuration.
3. And this could be it: our configuration. We want to give users the possibility to fully customize their KDE, but to work out of the box we should indeed give them a default configuration. Summary as I mentioned before (I give default shortcuts from openSUSE/KDE as examples):
Now to the applications: whatever default configuration or bindings we use for global touch gestures, every default we set is a 'limitation' on applications. There is no point in trying to think of gestures that applications won't use: the developers of those applications can think of the same gestures we do. So our task is to choose the gestures they most probably won't need. I think 'hold' for 'right click' is the best choice: most applications will need context menus by default, it is the simplest way to do it, and if an application needs the 'mouse' held down, it is most likely to drag something (which is indeed another gesture). If an application nonetheless needs all gestures, we should give it the possibility to set some kind of flag; if the flag is set, the application gets control over all the gestures it needs. The problem here is that this would probably be far more work than expected at first. That's all I could think of for now. Time for a break. Regards, Tristan |
Registered Member
|
I like this style; in particular, I feel it would be intuitive using the following conventions:
Tap & Drag and Double Tap need a wait time for proper recognition; I have seen some discussion about this on the Wayland mailing list. Do you plan to use them for global gestures? |