[Libreoffice-qa] test cases quality; was: Ubuntu/Canonical doing more manual testing for LibreOffice?
gautier.sophie at gmail.com
Fri Mar 2 10:39:02 PST 2012
On 02/03/2012 19:21, Petr Mladek wrote:
> Sophie Gautier wrote on Fri, 02. 03. 2012 at 18:02 +0100:
>> First, I don't take anything in your mail personally. I disagree with
>> you, but it's nothing personal :)
> I hope that we could learn from each other :-)
>> On 02/03/2012 17:26, Petr Mladek wrote:
>>> I appreciate that you want to teach people using Litmus. Though, I am
>>> afraid that you did not get my point.
>> I don't want to teach them to use Litmus; I want them to get
>> interested, have fun, and not feel harassed by the task.
> Sure. This is my target as well. The translation checks looked boring
> at first glance.
QA has great potential to get boring ;)
>> We are testing functionality and, at the same time, checking basic
>> i18n conversion (numbers, accented characters, dates, string
>> lengths...)
> Ok, so one example of the current test:
> --- cut ---
> Name: Translation check of creating a new database
> Steps to Perform:
> * Open a new database file (New → Database) and check [Create a
> database], then click on the Next button.
> * Check [Yes, I want the wizard to register the database] and
> [Open the database for editing] and click on Finish.
> * Enter a name for the database (using special characters in your
> language) in the dialog box and click OK.
> Expected Results:
> * The database wizard opens: all strings in the dialog box and
> window are correctly localized into your own language.
> --- cut ---
> Ok, it checks translation and functionality.
> Do we really need to check the functionality in all 100 localizations?
It's only checked in 5 or 6 languages, even fewer if you look at the
poll I ran on the l10n list.
> IMHO, if the database opens in English, it opens in all localizations.
> We do not need to force 100 people to spend time on this functional
> check.
> Do we need to check translation even when the strings were not changed
> between the releases?
Yes, because the number of strings in the database is really large and
you need more than two eyes to check the quality.
> => I strongly suggest to separate translation and functional checks. It
> is very ineffective to test them together.
You'd spare some resources, but most of the time tests are done by
people in their native language. Do you want to run them only in
English?
> Thanks to Rimas, we could mark test cases as language dependent and
> independent, so we have a great support for this separation.
Yes, but again, this won't change much about the translation of the
test cases; testers will still need to run them in their language.
>> Litmus should be an entry point to QA for the community at large,
>> i.e. no language barrier, no technical barrier, and a team behind it
>> to guide you further into more complex testing. Unfortunately, it's
>> not a tool adapted to our needs.
> I agree with you. I just say that many of the current test cases sound
> crazy as they are and might point people in the wrong direction.
Yes, this is why Litmus is not well adapted.
>> As said, I'm not speaking about translation. The contents of the test
>> may confuse you when it speaks about localization, but it's only a
>> secondary purpose of the test, a "*while you are here*, please check
>> that the dialog has the right special characters in your language".
> Yes, it is confusing because they mix the translation and functional
> tests. All I want to say is that it is not effective and we should not
> go this way.
OK, let's try without checking the translation; we can remove the
language-specific directions in the test.
>> No, it's not enough, because most of the time the team doing the
>> translation is only one person, so you can't remember where and when
>> the translation is longer than the original, and for some languages
>> it's always true.
> We could use some scripting here. Andras is interested in the
> translation stuff. I wonder if he has time and could help here.
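The sort of script Petr hints at could be quite small. As a sketch only
(assuming the translations are available as gettext .po files, as the
Pootle server exports them; the sample strings below are made up), this
flags translations much longer than the source string — the ones most
likely to overflow a dialog:

```python
# Hypothetical sketch: scan gettext-style msgid/msgstr pairs and flag
# translations that exceed `ratio` times the length of the original,
# since those are the likeliest to be cut off in a dialog.
import re

def long_translations(po_text, ratio=1.5):
    """Return (source, translation) pairs where the translation is
    longer than `ratio` times the source string."""
    pairs = re.findall(r'msgid "(.+?)"\nmsgstr "(.+?)"', po_text)
    return [(src, dst) for src, dst in pairs
            if src and len(dst) > ratio * len(src)]

# Made-up sample data for illustration.
sample = '''msgid "OK"
msgstr "Übernehmen und schließen"
msgid "Cancel"
msgstr "Abbrechen"
'''
for src, dst in long_translations(sample):
    print(f"{src!r} -> {dst!r}")   # flags 'OK' -> 'Übernehmen und schließen'
```

A real version would parse multi-line entries and plural forms too, but
even this rough length ratio would spare the single-person teams Sophie
mentions from re-reading every dialog.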
>>> You might say that we should check the quality of the translation,
>>> i.e. whether the translation makes sense in the context of the given
>>> dialog. Well,
>>> this is not mentioned in the current test case. Also, I am not sure if
>>> it is worth the effort. We do not change all strings in every release.
>>> So, we do not need to check all translations.
>> When you see the number of strings relative to the number of people
>> doing translation, having a proofreading of the dialogs during QA is
>> not a luxury ;) But I agree; as said, it's not the first aim of the
>> tests.
> Sure. On the other hand, checking 1000 dialogs because you changed only
> 20 of them is not a luxury either.
>>>> We had that in the past with the VCLTestTool.
>>> Hmm, how did VCLTestTool help here? Did it check that a string was
>>> localized? Did it check whether a translation was truncated or
>>> confusing?
>> It took a snapshot of each dialog, menu, submenu, etc. When you wanted
>> to reach a certain level of quality for your version, it was very
>> useful because you were sure that everything had been checked. I don't
>> say that you should run it on each version, but I did it on each major
>> OOo version.
> I am not sure if we are speaking about the same tool. I speak about the
> testtool that used the .bas scripts from the testautomation sources.
> What do you mean by snapshot?
Sorry, wrong name, it's a screenshot I was talking about.
> IMHO, it went through many dialogs and did many actions. AFAIK, it did
> not check translations. It did not check that a dialog was shown
> correctly. It only checked that it worked as expected.
> Unfortunately, it was a pain to maintain and a pain to analyze the
> results.
> Have you found any real bug by this tool?
> IMHO, it gave you a false feeling that it checked something. It was
> running for several days and printed many errors. I always spent a few
> days analyzing them. Most of them were random errors caused by
> asynchronous operations or bugs in the test tool (testtool was not
> updated for the new functionality). I found only very few bugs and
> spent many days/weeks with it.
Once you're used to them and don't change your environment, you can find
bugs; I was able to find several. But I don't want this tool back, I was
just talking about the ability to check our translations by taking
screenshots of the dialogs.
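Sophie's screenshot idea could be revived without the old tool. As a
minimal sketch, assuming each build can dump one screenshot per dialog
into a directory (the file layout and names here are hypothetical), a
script could list only the dialogs whose image changed and therefore
need a fresh translation review:

```python
# Hypothetical sketch: compare two directories of dialog screenshots
# (one per build) and report which dialogs changed, so translators
# only re-review those instead of all 1000 dialogs.
import hashlib
from pathlib import Path

def changed_dialogs(old_dir, new_dir):
    """Names of screenshots in new_dir that are new or differ from
    the same-named file in old_dir."""
    digest = lambda p: hashlib.sha256(p.read_bytes()).hexdigest()
    old = {p.name: digest(p) for p in Path(old_dir).glob("*.png")}
    return sorted(p.name for p in Path(new_dir).glob("*.png")
                  if old.get(p.name) != digest(p))
```

This directly addresses Petr's objection above: you only proofread the
20 changed dialogs, not all 1000.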
>> Because each team has to adapt the test to its language, the English
>> base doesn't mention every specificity.
> Yes, but only a small part of the functionality is language dependent.
Yes, this is why I didn't see it as an issue to check the language at
the same time the test is run.
>> Yes, you're right. But keep in mind that to teach people in their
>> spare time, you need them to enjoy it. It needs to be step-by-step
>> learning, growing interest as well as knowledge at the same time. And
>> don't forget the fun too. There should be very simple test cases and
>> more complex ones, simple sample documents and much more complex ones,
>> defined test periods to create a dynamic in the group, with visible
>> results and visible recognition, etc.
> Yup. I like the test case
> "Translation check when creating a table in a database". It makes
> perfect sense as a functional test. I only do not understand why it
> focuses so much on translation.
>>> IMHO, it would be easier to start with functional tests rather than
>>> entering hundreds of the "same" translation tests.
>> Yes, of course :) but I think you get what I meant. You may see
>> localization of a test as repeating the test, when it's offering
>> somebody a way to come in with no other burden than the joy of
>> participating in their language and with their basic skills. Accepting
>> some time wasted on less effective tests while allowing more people to
>> participate is what is behind that tool.
> I think that we are on the same page. After all, the new test cases are
> not that bad. My only problem is that they mix functional and
> translation checks.
OK, let's see what it brings if we remove this part from the test. But
I'm afraid you're speaking of English tests only, with no localization
of the test cases.
> Have a nice weekend,
Thanks, you too!
Founding member of The Document Foundation