Device Type/UX overview (for moving towards Convergence)

arucard
Registered Member
I created a page on the community site where I try to provide an overview of the user experiences on different device types, which I think should be useful when thinking about Convergence.
https://community.kde.org/Plasma/Convergence_Overview

The idea here is to list the device types that Plasma should work on (many of which are already available). For each device type you can then define the following:
  • What makes this type of device different from the others? (define the characteristics)
  • How will people (want to) use this type of device? (define the vision for the user experience, based on the benefits and drawbacks of its characteristics)
  • How can we make Plasma work in the way that people (want to) use this type of device? (define the behavior of the common components of Plasma in a way that's consistent with the vision)

This list should provide an overview of the device types that are (or will be) supported by Plasma. It lets you reason about the user experience in terms of the characteristics of the device type (so you can explain why something should behave a certain way on a specific type of device), and it should help you identify the points of commonality in Plasma that would need to be worked on. Granted, I think most of this is already being thought and talked about, but I thought it would be useful to have it all in one place so you can keep track of the big picture.

Of course, the page should be updated when discussions result in different, better or more detailed proposals. I hope that this page can help make discussions more effective: you can easily see what the current state is, compare improvements against it, and put them into perspective (e.g. by relating the results of a discussion about Plasma Mobile to the effects it would have on Plasma Media Center).

I think it would also be useful as a way to see the current state of the discussions surrounding user experiences and device types (especially considering the increased interest in Plasma Mobile). I hope that, if this page gets updated frequently, people can easily see how the KDE community thinks different types of devices should be used (and why), which should allow them to more easily join the discussion if they think differently or just want to improve it even more.

I hope this helps improve discussions about user experiences, device types, convergence, etc. and I hope that this will be a good way to provide people with information about the direction the KDE community wants to take with these things.
Heiko Tietze
Registered Member
Good idea. What I miss is the head-up or head-mounted display, which (sometimes) has a large screen but no extra input, except voice control, which might be relevant for other systems as well (by the way, is Simon dead?). Perhaps you could merge this into the TV stuff and rename that. Another device type would be wearables. You may treat a smartwatch as a very small kind of smartphone or as a tiny headless UI (TV-like). The question is whether a head-mounted display is like a smartphone or like a smartwatch. The borders become blurred ;)

PS: Thomas started with the mobile HIG. It can be found at https://techbase.kde.org/Projects/Usability/HIG
arucard
I agree that a HUD would be another type of device, but I think we should only include device types that we actually intend to, or at least want to, support in Plasma. I'm not sure a HUD would qualify there, but if you think it's possible, feel free to add it to the page. I would not merge it with the TV UX because I think you would use a TV differently from a HUD (or an HMD). As for Simon, I assume you mean the speech recognition system. The last thing I read about it was that Mario Fux took over maintainership (http://blogs.fsfe.org/mario/?p=373).

I think that smartwatches can be added as a device type, though. I would add them as a separate device type, not as "wearables", since I think you will likely want a different user experience for different types of wearables. This is where I think the characteristics of the device type come in handy. For every device type you need to define the relevant characteristics and exploit those characteristics to the best of your ability in your vision. If two device types have the same (relevant) characteristics, then we can treat them as the same. This should help "unblur" the borders.

To use your question as an example: a smartphone has a small screen with touch input at close range. A smartwatch has an even smaller screen, also with touch input at close range (it can also have buttons on the side for input, but let's ignore that for the sake of simplicity). The only difference here is the screen size. Since the UI from a smartphone would be unusable on a smartwatch, we should differentiate between them. We can call the smartwatch screen size "tiny" instead of "small" to clearly show that they are different. You can do the same for HMDs, though I think you would even need separate user experiences for different types of HMD (e.g. an Oculus Rift would probably be used differently than Google Glass).
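The comparison above can be sketched as a small data model. This is only an illustration of the idea (the class, field names, and characteristic labels are my own, not part of any Plasma API): two device types share a UX exactly when their relevant characteristics match.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceType:
    name: str
    screen_size: str   # e.g. "tiny", "small", "medium", "large"
    input_method: str  # e.g. "touch at close range", "pointer", "remote"

    def relevant_characteristics(self):
        # Two device types with identical relevant characteristics
        # can be treated as the same and share one user experience.
        return (self.screen_size, self.input_method)

smartphone = DeviceType("smartphone", "small", "touch at close range")
smartwatch = DeviceType("smartwatch", "tiny", "touch at close range")

# Only the screen size differs, but that difference alone is enough
# to require a separate UX definition.
same_ux = smartphone.relevant_characteristics() == smartwatch.relevant_characteristics()
print(same_ux)  # False
```

The point of making the comparison explicit like this is that every "should these two devices share a UX?" question reduces to checking the characteristics list.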

I did actually see the Mobile HIG before writing this page. I think it is very useful on its own, but I think it could benefit from the kind of "big picture" overview that I wanted to provide with this page. The way I see it, you could have a HIG for each device type, including Mobile, but you need to make sure that all these HIGs are "compatible" with each other. They would need to use the same common components, which are just arranged differently in each UX. This is the part that leads to Convergence. So if you create an application for smartphones according to the Mobile HIG, it can automatically be used on a desktop because it is just a matter of rearranging the content according to how you want it shown on a desktop. This should conform to the Desktop HIG and keep your application very user-friendly without any additional effort from the developers (I hope this is actually possible :P ).

I'll add a smartwatches UX section for now and try to fill it in as much as possible, but feel free to edit the page if you think you can improve it.
arucard
It seems I can't edit the page anymore even though I created it; I'd need to be an Administrator or Trusted user (and I'm neither). So if anyone else can edit the page, could you please add this section under the Smartphone UX section? I've filled it in, but you can just consider it a personal best guess.

Code:
=== Smartwatch UX ===
'''''Screen size''''': Tiny<br />
'''''Input method''''': Touchscreen from close range

'''''UX Vision''''': There is a very limited amount of space, not even enough to always show all of the Application Content and components still need to be large enough to be controlled by fingers (which are less precise than a mouse pointer). This means that the smartwatch can not be used for Application Content, but is mostly intended for things like notifications.

Example of how this can be done in a Plasma Smartwatch workspace
* '''Workspace''': N/A
** '''Application Launcher''': shown full-screen on the main screen, showing only compatible applications
** '''Application Shortcuts''': N/A
** '''Active Application Overview''': shown as separate screen aside from the main screen
** '''Background Application Overview''': shown as a separate screen aside from the main screen
** '''Application-Workspace Interaction''': only from the Active Application Overview (close only)
* '''Application''': N/A (except for compatible applications, shown full-screen)
** '''Application Menu''': N/A
** '''Application Content''': applications need to be made compatible explicitly since the screen isn't big enough to just show the Application Content.

It should be clear from this that applications are not really the focus for smartwatches. I've allowed for "compatible applications", which would probably have to be built specifically for smartwatches in order to be useful. So the main focus would be on receiving notifications and responding to them. Though if voice recognition is added as an input characteristic, this might need to change. Like I said, this is just my best guess; I'm sure there are people who can improve on this.
arucard
With the Phabricator login on the community wiki, I am now able to edit the page again. So I added the smartwatch section to it (https://community.kde.org/Plasma/Conver ... rtwatch_UX).

I was also curious about the Kirigami UI components. They seem to be the UI components aimed at complying with the Mobile HIG, so I think it would be useful to integrate them into this overview. You could give examples of which specific Kirigami UI components can be used to implement the different common components of each UX. This should then give a clear indication of how the mobile/touch-oriented UI components map to other types of components, e.g. the traditional non-touch-oriented UI.

I am not familiar with the Kirigami components that are available so I would appreciate any help with figuring out which components are available in Kirigami and where they fit in this overview.
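As a rough sketch of what such an overview entry could look like, here is a mapping from the common components used on the wiki page to candidate Kirigami components and their desktop counterparts. The pairings are only my illustration of the format, not an established or verified Kirigami mapping:

```python
# Hypothetical mapping from the overview's common components to the
# UI element that could implement them in a touch-oriented Kirigami
# UX versus a traditional desktop UX. The pairings are guesses meant
# to illustrate the format of such an overview entry.
COMPONENT_MAP = {
    "Application Menu": {
        "mobile": "Kirigami.GlobalDrawer",
        "desktop": "menu bar",
    },
    "Application Content": {
        "mobile": "Kirigami.ScrollablePage",
        "desktop": "main window area",
    },
    "Application-Workspace Interaction": {
        "mobile": "Kirigami.ContextDrawer",
        "desktop": "window titlebar buttons",
    },
}

for component, targets in COMPONENT_MAP.items():
    print(f"{component}: {targets['mobile']} <-> {targets['desktop']}")
```

An overview in this shape would make the convergence question concrete: for each common component, which element renders it on which device, and which pairs need automatic conversion rather than mere rescaling.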
arucard
While defining the goals for the common components that should be present on all devices, I think I have found a good way of defining what is needed for convergence. We basically need to ensure that each goal can be met on every device for which we wish to have convergence.

These are the goals that I think describe what an environment on any device needs to be able to do:
  • Let the user start an application
  • Let the user easily start their favorite applications (optional but widely used)
  • Let the user switch between active applications and manage (e.g. start/stop) them
  • Let the user manage (e.g. start/stop) background applications
  • Let the user customize how an application behaves within the workspace (e.g. in maximized window)
  • Let the user access an application's deeper functionality (e.g. application settings)
  • Let the user access an application's main content
For each goal, you have to define how to achieve it on each device. Knowing that the goals are the same on every device should allow you to define and implement them in a convergent way. I hope this can help in getting towards convergence.
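The goals above can be treated as a per-device checklist: convergence is reached when every goal has a defined implementation on every supported device. A minimal sketch of that bookkeeping (the device names and implementation strings are placeholders I made up, not content from the wiki page):

```python
# Convergence goals that a workspace on any device needs to meet.
GOALS = [
    "start an application",
    "start favorite applications",              # optional but widely used
    "switch between and manage active applications",
    "manage background applications",
    "customize application behavior in the workspace",
    "access an application's deeper functionality",
    "access an application's main content",
]

# Placeholder per-device implementations; any goal missing from a
# device's dict has not been defined for it yet.
IMPLEMENTATIONS = {
    "desktop": {
        "start an application": "application launcher in the panel",
    },
    "smartwatch": {
        "start an application": "full-screen launcher on the main screen",
        "start favorite applications": "N/A (not supported on this device)",
    },
}

def convergence_gaps(device):
    """Return the goals that still lack a definition on this device."""
    defined = IMPLEMENTATIONS.get(device, {})
    return [goal for goal in GOALS if goal not in defined]

print(len(convergence_gaps("smartwatch")))  # 5 goals still undefined
```

Even an explicit "N/A" counts as a definition: it records the deliberate decision that a device type does not support a goal, which is different from the goal simply not having been discussed yet.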
arucard
I've now added an image to the Desktop UX section. The image shows the components that are mentioned, as they can be seen in a Desktop environment with System Settings as a sample application. This should make it easier to understand what each component refers to.

I'm hoping that others can create similar screenshots with the components marked in the other environments, since I don't have access to all of them. I think it would be most beneficial to at least have an image of Plasma Mobile with the components marked. I don't have access to a running Plasma Mobile environment, but I'm looking around for existing screenshots that I could use. Showing these components on a Plasma Mobile screenshot would let you see which components might need to be changed completely when moving from a mobile-based UX to a classic desktop-based UX. For example, this might show a need for specific Kirigami UI elements to be automatically converted to corresponding desktop UI elements (as opposed to scaling or reorganizing the elements to fit the device), in order to provide convergence while still enabling a UI that is fully optimized for each device.
