<div dir="ltr">Hey Winfried,<br><br><br><div class="gmail_extra"><br><div class="gmail_quote">On Sat, Apr 16, 2016 at 5:29 PM, Winfried Donkers <span dir="ltr"><<a href="mailto:winfried.libreoffice@gmail.com" target="_blank">winfried.libreoffice@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Hi Markus,<span class=""><br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
@Winfried: You are writing most of the spreadsheet functions right now. Would that be something that would help you add tests for all new functions? You could add your test cases directly in the spreadsheet and use it for manual checking as well as later for an automated test.<br>
</blockquote></span>
I would like to add unit tests like sc/qa/unit/data/xlsx/functions-excel-2010.xlsx, which is tested in sc/qa/unit/subsequent_export-test.cxx (test principle conceived by Kohei), i.e. no csv files but an xlsx document saved by Excel. The document contains boolean verifications in such a way that only one cell (here Sheet1.E2) needs to be checked in the unit test for all functions in the document. Only if that cell contains a false value do the individual verifications for each function need to be traversed to provide a message about which function(s) went wrong.<br>
With the xlsx, both the Excel import of the function and the function itself are tested.<br>
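In outline, and with purely illustrative names (the helper calls and the cell layout below are assumptions modelled on the existing functions-excel-2010 test, with function names in column A and the per-function verification cells in column E), the C++ side of such a check boils down to:<br>
<br>
// Sketch only: cell positions and the row range are placeholders and must
// match the layout of the actual test document.
static void checkFunctionsDocument(ScDocument& rDoc)
{
    // Sheet1.E2 aggregates all per-function verification cells (TRUE == 1.0).
    if (rDoc.GetValue(ScAddress(4, 1, 0)) != 1.0)
    {
        // Only on failure: walk the individual verification cells to report
        // which function(s) went wrong.
        OUString aFailed("failing functions:");
        for (SCROW nRow = 3; nRow <= 40; ++nRow) // hypothetical verification rows 4..41
        {
            if (rDoc.GetValue(ScAddress(4, nRow, 0)) != 1.0) // column E
            {
                aFailed += " ";
                aFailed += rDoc.GetString(ScAddress(0, nRow, 0)); // function name in column A
            }
        }
        CPPUNIT_FAIL(OUStringToOString(aFailed, RTL_TEXTENCODING_UTF8).getStr());
    }
}
<br>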
As csv files were at that time to be used for testing the special cases and various use cases of each function, and Raal seemed to pick up that part, I never went further than the above-mentioned xlsx document.<br>
<br>
Of course the principle as used in the above test file can perfectly well be used for more elaborate unit tests, removing the need for csv files. It is also possible for non-developers to create these test files, as long as the 'verification cells' are in a fixed row or column. The only developer action needed would be to include the file, like in sc/qa/unit/subsequent_export-test.cxx for the Excel 2010/2013 functions.<br>
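That developer glue is only a few lines; roughly, and with a hypothetical test name (the helpers are the ones the existing file already uses, so the exact signatures may differ):<br>
<br>
// In the test class declaration:
void testFunctionsExcel2016();            // hypothetical new test method
// ...and inside the CPPUNIT_TEST_SUITE(...) block:
CPPUNIT_TEST(testFunctionsExcel2016);

// In the implementation:
void ScExportTest::testFunctionsExcel2016()
{
    ScDocShellRef xDocSh = loadDoc("functions-excel-2016.", FORMAT_XLSX);
    CPPUNIT_ASSERT_MESSAGE("Failed to load functions-excel-2016.xlsx", xDocSh.Is());

    ScDocument& rDoc = xDocSh->GetDocument();
    rDoc.CalcAll();

    checkFunctionsDocument(rDoc); // the verification-cell check sketched above

    xDocSh->DoClose();
}
<br>
If export coverage is wanted as well, a save-and-reload step could be added before the checks, in the same way the existing export test handles the Excel 2010 functions.<br>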
<br>
For the Excel 2016 functions, on which I have been working since last autumn, the FORECAST_ETS functions present a problem for me as far as testing is concerned. The results of these functions depend on the algorithms (and thresholds) used to find optimum values for the coefficients, and will always differ between Excel and Calc. Even within Calc I can imagine an improvement of the algorithm leading to different results. It wouldn't seem right to have to modify the unit test as well in that case. As for the other functions, they are still being developed/evaluated, and unit tests are on my to-do list, as is providing a basis for the Calc help.<br></blockquote><div><br><br></div><div>Well, the advantage would be that you could write the document while you implement the function. You can add all the test cases that you would normally use for manual checking into that document, and at the end, when everything works, just include it in the automated testing.<br> <br></div>
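<div>For results like the FORECAST_ETS ones, which legitimately differ between implementations and may change when the algorithm is refined, one possible compromise (just a sketch with placeholder values, not reference results) is to assert on the C++ side within a tolerance instead of requiring an exact match:<br><br>
// Accept any result within a tolerance so a refined algorithm does not
// invalidate the test; rDoc as in the sketch above, expected value and
// delta here are placeholders.
double fEts = rDoc.GetValue(ScAddress(1, 1, 0)); // B2, hypothetical FORECAST_ETS result cell
CPPUNIT_ASSERT_DOUBLES_EQUAL(123.45, fEts, 123.45 * 0.05); // 5% relative tolerance
<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">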
<br>
One (personal) remark: deciding which use cases, which values, etc. to use for unit tests is a bit of a vague area for me. Arguments can be of different types (e.g. string, double, single/double external/internal reference, matrix, single value or range), can have almost unlimited different values, and can be optional. Testing all use cases would fill a lot of cells and doesn't seem practical to me. But for deciding which use cases to test and which to skip, clear guidelines would be a great help (for me at least).<br></blockquote><div><br></div><div>Ideally all special cases would be tested. It does not matter how many cases there are, and of course we don't need them all at once, but in an ideal world you have a test case for each possible branch in the code. Basically, while implementing a new function I would add at least tests for all the special cases that I encounter during the implementation, e.g. handling of optional parameters, special handling for some values, ... Normally you need to write such a test anyway for your manual checking, as you need to manually test your change. The difference from our present workflow would just be that instead of doing this in a temporary document that you throw away at some point, you collect all these cases and include them in the test framework.<br><br> <br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
Does all this match your ideas?<span class="HOEnZb"><font color="#888888"><br></font></span></blockquote><div><br><br></div><div>Yes, I think we basically agree. I'll see to it that I work on a template that can be used and that makes it easy to write such tests.<br> <br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span class="HOEnZb"><font color="#888888">
<br>
Winfried<br>
<br>
</font></span></blockquote></div><br></div></div>