This forum has been archived. All content is frozen. Please use KDE Discuss instead.

Suggested Minimum Testing

ocumo (Registered Member)

Suggested Minimum Testing

Fri Nov 11, 2016 5:12 pm
Coming from an interesting debate about recurrent issues/bugs, one of the main conclusions I draw is that the community has to find creative ways to increase the quantity, but more importantly the quality, of the feedback to developers, without making it more overwhelming for them than it already is. In this context, I would like to propose an idea.

The premise is that there is a complex catch-22 here: there are not enough (or not effective enough) bug reports to help developers with their job, but at the same time, the more bug reports they get, the harder it becomes to relate, group and organize them so that trends, patterns and thus priorities can be properly identified and acted on.

This is just an idea for the community to brainstorm about. It may be shattered to dust in the next couple of posts as the stupidest one ever, or simply generate some curiosity or interest; or, who knows, be celebrated as something possibly good. In any of these cases I will be happy anyway, just please be friendly.

So: What about a list of "Suggested Minimum Testing" ?

Let me explain: I am not trying to interfere with or dictate what unit testing methods the Krita developers use. I am talking about what users could do to help the developers, e.g. when a new alpha or beta version is made available to the public, and how this could be implemented in an organized, hopefully standardized and simple way. Plus, how to get additional help.

Making bug reporting too relaxed (i.e., letting people use the forum to report bugs in a more informal way) would not help anyone; it would make the devs' lives more miserable. Unfortunately, at the same time, formal bug reporting alone is not satisfying at all (sorry, it isn't, at least for the Krita project; please have a look at this thread as an example). So perhaps a third way could help. If I, as a (non-expert) user, had a suggested "script" (call it whatever you like) for how I could really help developers by testing betas, I would be very happy to do so. I can imagine that there are others who would be too.

Such a "procedure"/"script"/"method"/"list"/"survey" (from now on I will call it "thing", to avoid offending anyone with some of those words) would have the following benefits:

1. It would clearly identify which areas are considered key to every user, expert or not, and/or to the project. Currently, as a user, I would have my own idea of what those could be, but it would be best if the whole community (or someone on its behalf, perhaps the devs?) defined the criteria. One of the problems I see is that those of us who are NOT Krita experts are a bit intimidated by some very advanced (or plain fancy) features that we don't use and may not use for a long time. But for everyone who wants to test Krita, regardless of their degree of expertise, there should be a list of "things that must work no matter what" or that "must be tested" (and how) in every release.

2. The "thing" would not only suggest what to test and how, but also how to evaluate the item. I.e. a simple statement in human language of what could come from an Interface Design Specification but without the pedantic technical tone. Something like:
    test #12: Foo Tool accessibility: The Foo Tool should be easily accessible with either a stylus, the mouse or the keyboard, so that you have the subjective feeling that this action is reasonably natural and not disruptive when moving from the Tool Bar to the Foo Tool.
    Steps: While on the canvas, with the Brush tool selected, perform XYZ ...
    Evaluation: In your opinion, accessing the Foo Tool felt:
      1 - Quite clumsy and disruptive
      2 - A bit clumsy
      3 - Not clumsy, I can live with it
      4 - Really natural and fluent
(Or whatever... :) )

3. The testing would be standardized, yet friendly (and totally voluntary, of course!). This is extremely important.

4. Inexperienced users would learn many things they don't know already and, amazingly, be able to actually judge the way these things work, without any previous experience. That would be quite cool for me, and surely for many others. This would essentially be a win-win: I learn about the most relevant things, while you learn what my experience with them was.

5. There could (or could not) be room for suggestions on selected items, if that were considered valuable.

6. The input from the "thing" would hopefully be easier to read and classify than random posts in forums, chats or Facebook, or even a group of several random bug reports that may or may not be related. This helps to identify bugs and/or prioritize them based on frequency or scoring (e.g. the Foo Tool accessibility scored rather low, or the creation of shortcuts failed for many people; see the sketch after this list). You could then either open your own bug report, or just go ahead and start working directly on that item without waiting for someone to write a potentially incomplete or poorly explained bug report.

7. If the "thing" outcome would be considered useful enough, perhaps some of its conclusions could even be made public to encourage (and show appreciation to) users to keep doing so for future releases.

8. If the "thing" would be better developed as a web page e.g. in Krita's site but lack of man power would be a big obstacle, this could be an great opportunity for getting help from other kind of developers that would love to help but are not C++/Qt specialists. I am sure that among Krita's users base there are Web Developers, gurus in JavaScript/Ruby/DJango,... etc., that would have their first opportunity to actually write code for the Krita Project and help to put a nice tool available on Krita's site to help developers and the community to get Krita better and stronger. If, on the other hand, a separate tool/app/whatever would be prefered, there are other options, like certain survey sites that I have used in the past for some projects of mine, or who knows, maybe someone else will post here some great idea or proposal.

Now, there is already a lot of meat in the lines above to let the brainstorming move on from here. I have more, but I would really like to see what the impact of this idea is: whether it's really worthwhile or, on the contrary, someone comes and says "sorry, tried and failed before" or "wrong place for this discussion", this thread gets closed and we have to move on with our lives. I hope not! :)

Thanks for your comments and suggestions!
halla (KDE Developer)

Re: Suggested Minimum Testing

Fri Nov 11, 2016 5:26 pm
I think your idea is really good: whether it works in practice depends on whether there is a volunteer for maintaining the list of test cases. I've been doing software development for about twenty-five years now, and in the companies I've worked for, we always had a manual test plan -- a set of things the testers would go through and check off. (It's an important part, but for me, when I'm building a release for Windows (2...), OSX and Linux, all I do is a smoke test: start and create an image... And that's all I've got time for.)

We even tried this before, with some sort of poll system where every test was a poll, and every tester got an account and would go through the polls for a new beta. That foundered on lack of time to maintain the website that implemented the polling, but maybe it would be easier now.

It would still need someone to curate the tests; sometimes a test becomes obsolete and has to be removed, sometimes it has to be reworded because the UI changes.

If you want to take on that role, I'm sure we can figure out a way to get started!
ocumo (Registered Member)

Re: Suggested Minimum Testing

Fri Nov 11, 2016 6:02 pm
Thanks a lot for your very positive comments.
It may be a bit early for me to jump in and make a big commitment; note that I am not a Krita expert. But it's an honor to have such a suggestion coming from you, and I would love to be able to give something back to the project, even if it's small, so I also don't feel comfortable refusing to help.
I can feel the pain of people like you, who are heroes to me and to so many people in the community, like so many other giants who made and still make FOSS possible. To me, it's a civilizational thing, really. It's not about technology, or being a geek; it's more than that. And every day we see the news and realize how much this world desperately needs more civilizational things that make us proud of being human, because the opposite idea is getting stronger every second.
I'll certainly think about it and try to refine the idea; let's see what other reactions and ideas we get in the meantime.
TheraHedwig (KDE Developer)

Re: Suggested Minimum Testing

Sat Nov 12, 2016 2:00 pm
Well, we have this board: https://phabricator.kde.org/project/view/147/, which is an experimental board for users to figure out where they can help with testing and design. We've got a workflow where, when a feature is implemented, the task for it gets moved to this board. When a feature needs GUI design, it can also be moved to this board.

There's also https://phabricator.kde.org/project/view/148/, where the features eventually end up, with "make noise" marking the ones that can be advertised as stable.

https://phabricator.kde.org/T3541 explains the workflow of a new feature in full.
ocumo (Registered Member)

Re: Suggested Minimum Testing

Sat Nov 12, 2016 3:29 pm
Thank you TheraHedwig,
That's really interesting. Being new to all of that, I look forward to learning all I can about those tools.

