Wednesday, 4 June 2008

Why XFAIL?

A couple of weeks ago I implemented a feature called --XFAIL-- in the PHP test runner (run-tests.php). The idea was not mine – it was Pierre's – and I admit I had a few reservations when he first suggested it, but I convinced myself that it might be useful. There was also some helpful discussion on the PHP QA list.



In this post I'll explain what I have done and give a couple of instances in which I think it might be of some use.



In the following example I have added an XFAIL section to a test called cos_basic1.phpt in ext/standard/tests/math:



--XFAIL--
Expected to fail because I've messed with expected output to make it fail

I have also messed with the expected output to ensure that the test really does fail.
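Putting the pieces together, a minimal .phpt test carrying an XFAIL section might look something like this. This is a sketch only – the real cos_basic1.phpt covers more cases, and the expected value below is deliberately wrong so that the test fails:

--TEST--
Test return type and value for expected input cos()
--XFAIL--
Expected to fail because I've messed with expected output to make it fail
--FILE--
<?php
// cos(0) actually returns float(1); the expected output below is wrong on purpose
var_dump(cos(0));
?>
--EXPECT--
float(99)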

When I use run-tests.php to execute all the tests in the math directory the final section of the report looks like this:



=====================================================================
Number of tests :  110               109
Tests skipped   :    1 (  0.9%) --------
Tests warned    :    0 (  0.0%) (  0.0%)
Tests failed    :    0 (  0.0%) (  0.0%)
Expected fail   :    1 (  0.9%) (  0.9%)
Tests passed    :  108 ( 98.2%) ( 99.1%)
---------------------------------------------------------------------
Time taken      :    1 seconds
=====================================================================

=====================================================================
EXPECTED FAILED TEST SUMMARY
---------------------------------------------------------------------
Test return type and value for expected input cos() [math/cos_basic1.phpt]
=====================================================================

The test cos_basic1.phpt fails and the usual .out, .exp etc. files are generated; the only difference is in the way the failure is reported. There is a new line in the summary data (Expected fail:) and a new section called EXPECTED FAILED TEST SUMMARY.


The intention of XFAIL is to help people working on developing PHP. Consider first the situation where you (as a PHP implementer) are working through a set of failing tests. You do some analysis on one test but can't fix the implementation until something else is fixed; however, you don't want to lose the analysis, and it might be some time before you can get back to the failing test. In this case I think it's reasonable to add an XFAIL section with a brief description of the analysis. This takes the test out of the list of reported failures, making it easier to see what is really on your priority list, while still leaving it as a failing test.


The second place that I can see XFAIL being useful is when a group of people are working on the same development project. Essentially, one person on the project finds a missing feature or capability, but it isn't something they can add immediately – or maybe another person has agreed to implement it. A really good way to document the omission is to write a test case that is expected to fail but will pass once the feature is implemented. This assumes there is general agreement that the implementation is a good idea and needs to be done at some stage.


Both of these situations have more to do with what is useful for a developer than a tester, so XFAIL is probably not a feature that I'll use much myself. One person also raised the possibility that the functionality is already covered by the SKIPIF section. I don't think it is; the distinction is simply that something in a SKIPIF section is never expected to work – like some of the file system tests on Windows. I also can't think of a good reason for XFAILing tests ever to appear in released code; in contrast, we often use SKIPIF sections in released levels of PHP.
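To illustrate that distinction, the usual SKIPIF idiom marks a test that can never work in the current environment – for example, skipping under Windows. This is a sketch of the common pattern, not taken from any particular test:

--SKIPIF--
<?php
if (substr(PHP_OS, 0, 3) == 'WIN') {
    die('skip not valid for Windows');
}
?>

An XFAIL section, by contrast, marks a test that should work but currently doesn't.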


The XFAIL feature is only implemented in PHP 5.3 and PHP 6. It's documented in the usual place.

Friday, 30 May 2008

Zoom Splatter

Several people have asked me why this blog is called zoomsplatter; someone even suggested that it might be a reference to what I do on the motorbike. Thankfully it isn't – or not since I was 19.

It's actually a reference to PROFS, IBM's office system, which we all used when I first joined IBM. It was amazingly good and had one invaluable feature that I have not seen in anything since – the "unsend" function. This was an absolute godsend to those of us with poor impulse control.

The PROFS spell checker used to offer a few alternatives if it didn't like what you were typing – and frankly it was not very keen on either of my names. Zoom Splatter was what it thought I should be called – and who am I to argue?

Wednesday, 21 May 2008

How can we tell if we're making a difference?

It's quite hard to tell how well the PHP TestFest is succeeding in the aim of improving the PHP test coverage. Next time we do this we should take a snapshot of the coverage before we start - hindsight is a wonderful thing.

Here is what I did to try and find out:

From the CVS change log for May 2008, 101 PHP test files were added or changed in PHP 5.3 under the ext directory.

Looking at these, I can identify 49 that I know come from TestFest activity. Thirty are new mcrypt tests – we have David Soria Parra to thank for these, having a one-man TestFest there :-)

Of the others there are 17 new DOM tests – for which we thank the London PHP group – and 2 new reflection tests that came from the Dutch PHP TestFest. Given that there are currently about 5000 tests in PHP, if we only count the tests committed that's about a 1% increase.


So where are the rest of the tests? There are still 73 tests in testfest.php.net waiting to be reviewed and committed – we really need someone who understands reflection to review and commit reflection tests. I notice that there is a new cURL test today as well – that's brave.


What about coverage? I believe that the mcrypt coverage leapt from less than 30% up to over 75% thanks to David's efforts! The DOM coverage will go up by about 3.5% by the time all the London tests are in, and there is plenty more to do in that extension. As for reflection – I can't tell, but I'd love to know. In case anyone wants to check later, it's at 75% today without the new tests.



In case anyone is looking for places to help, check this for some interesting analysis of what is and is not tested. Oh yes, there's plenty more to do :-)

Sunday, 11 May 2008

My wildest dreams get wilder every day

The Flatlanders: not a prolific band, but in terms of country music – absolute perfection. What made me think of them? Partly the Dutch – literally the "flat landers", the land of windmills, clogs, tulips and now, PHP tests.

On May 10th, led by Sebastian Bergmann and Stefan Koopmanschap, a team of 10 dedicated PHP programmers wrote 37 tests for the Reflection extension and had a lot of fun doing it.


The Dutch beat the UK[*] team, led by Scott MacVicar, Steve Seear, Ant Phillips, Josie Messa and YT[**], into an honourable second place. With 18 skilled PHP programmers we managed 26 tests for the DOM extension in 3 hours.

Why? It's a PHP TestFest. This is the wild dream – that PHP should have a complete set of test cases for the implementation.

With the extraordinary power of open source and the good will of PHP programmers around the world, this particular wild dream is fast becoming a reality!


Zoe Slattery


[*] I say UK, but have to admit help from several people with the sort of names and accents that romantic novelists give their heroes.

[**] I really mean Yours Truly. Not the Y.T. of Snow Crash – that would be a wild dream. In Snow Crash terms I'm more like Y.T.'s mom, but with Poor Impulse Control.