<chapter id="testing">
  <title>Writing Conformance tests</title>

  <sect1 id="testing-intro">
    <title>Introduction</title>
    <para>
      The Windows API follows no standard; it is itself a de facto standard,
      and deviations from that standard, even small ones, often cause
      applications to crash or misbehave in some way.
    </para>
    <para>
      The question becomes, "How do we ensure compliance with that standard?"
      The answer is, "By using the API documentation available to us and
      backing that up with conformance tests." Furthermore, a conformance
      test suite is the most accurate (if not necessarily the most complete)
      form of API documentation and can be used to supplement the Windows
      API documentation.
    </para>
    <para>
      Writing a conformance test suite for more than 10000 APIs is no small
      undertaking. Fortunately it can prove very useful to the development
      of Wine long before it is complete.
      <itemizedlist>
        <listitem>
          <para>
            The conformance test suite must run on Windows. This is
            necessary to provide a reasonable way to verify its accuracy.
            Furthermore the tests must pass successfully on all Windows
            platforms (tests not relevant to a given platform should be
            skipped).
          </para>
          <para>
            A consequence of this is that the test suite will provide a
            great way to detect variations in the API between different
            Windows versions. For instance, this can provide insights
            into the differences between the often undocumented Win9x and
            NT Windows families.
          </para>
          <para>
            However, one must remember that the goal of Wine is to run
            Windows applications on Linux, not to be a clone of any specific
            Windows version. So such variations must only be tested for when
            relevant to that goal.
          </para>
        </listitem>
        <listitem>
          <para>
            Writing conformance tests is also an easy way to discover
            bugs in Wine. Of course, before fixing the bugs discovered in
            this way, one must first make sure that the new tests do pass
            successfully on at least one Windows 9x and one Windows NT
            version.
          </para>
          <para>
            Bugs discovered this way should also be easier to fix. Unlike
            some mysterious application crashes, when a conformance test
            fails, the expected behavior and the APIs being tested are known,
            thus greatly simplifying the diagnosis.
          </para>
        </listitem>
        <listitem>
          <para>
            Conformance tests also help detect regressions. Simply running
            the test suite regularly in Wine turns it into a great tool for
            catching them. When a test fails, one immediately knows what the
            expected behavior was and which APIs are involved. Thus
            regressions caught this way should be detected earlier, because
            it is easy to run all the tests on a regular basis, and should be
            easier to fix because of the reduced diagnosis work.
          </para>
        </listitem>
        <listitem>
          <para>
            Tests written in advance of the Wine development (possibly even
            by non-Wine developers) can also simplify the work of the
            future implementer by making it easier to check the
            correctness of the new code.
          </para>
        </listitem>
        <listitem>
          <para>
            Conformance tests will also come in handy when testing Wine on
            new (or not as widely used) architectures such as FreeBSD,
            Solaris x86 or even non-x86 systems. Even when the port does
            not involve any significant change in the thread management,
            exception handling or other low-level aspects of Wine, new
            architectures can expose subtle bugs that can be hard to
            diagnose when debugging regular (complex) applications.
          </para>
        </listitem>
      </itemizedlist>
    </para>
  </sect1>

  <sect1 id="testing-what">
    <title>What to test for?</title>
    <para>
      The first thing to test for is the documented behavior of APIs
      such as CreateFile. For instance one can create a file using a
      long pathname, check that the behavior is correct when the file
      already exists, try to open the file using the corresponding short
      pathname, convert the filename to Unicode and try to open it using
      CreateFileW, and all other things which are documented and that
      applications rely on.
    </para>
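    <para>
      For instance, here is a minimal sketch of what such checks could look
      like (the file name is arbitrary and error handling is kept to a
      strict minimum):
      <screen>
    HANDLE hfile;

    /* creating a new file should succeed */
    hfile = CreateFileA( "winetest_example.tmp", GENERIC_WRITE, 0, NULL,
                         CREATE_NEW, FILE_ATTRIBUTE_NORMAL, 0 );
    ok( hfile != INVALID_HANDLE_VALUE, "could not create the file, error=%ld\n", GetLastError() );
    CloseHandle( hfile );

    /* CREATE_NEW is documented to fail if the file already exists */
    SetLastError( 0xdeadbeef );
    hfile = CreateFileA( "winetest_example.tmp", GENERIC_WRITE, 0, NULL,
                         CREATE_NEW, FILE_ATTRIBUTE_NORMAL, 0 );
    ok( hfile == INVALID_HANDLE_VALUE &amp;&amp; GetLastError() == ERROR_FILE_EXISTS,
        "got %p error=%ld\n", hfile, GetLastError() );

    ok( DeleteFileA( "winetest_example.tmp" ), "could not delete the file, error=%ld\n", GetLastError() );
      </screen>
    </para>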
    <para>
      While the testing framework is not specifically geared towards this
      type of test, it is also possible to test the behavior of Windows
      messages. To do so, create a window, preferably a hidden one so that
      it does not steal the focus when running the tests, and send messages
      to that window or to controls in that window. Then, in the message
      procedure, check that you receive the expected messages and with the
      correct parameters.
    </para>
    <para>
      For instance you could create an edit control and use WM_SETTEXT to
      set its contents, possibly check length restrictions, and verify the
      results using WM_GETTEXT. Similarly one could create a listbox and
      check the effect of LB_DELETESTRING on the list's number of items,
      selected items list, highlighted item, etc. For concrete examples,
      see <filename>dlls/user/tests/win.c</> and the related tests.
    </para>
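    <para>
      As a sketch, the edit control checks mentioned above could be written
      along the following lines (the window styles and sizes are arbitrary):
      <screen>
    char buffer[32];
    HWND hwnd;
    LRESULT len;

    /* a hidden top-level edit control is enough for this check */
    hwnd = CreateWindowA( "EDIT", NULL, WS_POPUP, 0, 0, 100, 20, NULL, NULL, NULL, NULL );
    ok( hwnd != NULL, "could not create the edit control, error=%ld\n", GetLastError() );

    SendMessageA( hwnd, WM_SETTEXT, 0, (LPARAM)"foobar" );
    len = SendMessageA( hwnd, WM_GETTEXT, sizeof(buffer), (LPARAM)buffer );
    ok( len == 6, "WM_GETTEXT returned %ld\n", len );
    ok( strcmp( buffer, "foobar" ) == 0, "got '%s' instead of 'foobar'\n", buffer );

    DestroyWindow( hwnd );
      </screen>
    </para>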
    <para>
      However, undocumented behavior should not be tested for unless an
      application is known to rely on it, in which case the test should
      mention that application, or unless applications can strongly be
      expected to rely on it, as is typically the case for APIs that
      return the required buffer size when the buffer pointer is NULL.
    </para>
  </sect1>

  <sect1 id="testing-wine">
    <title>Running the tests in Wine</title>
    <para>
      The simplest way to run the tests in Wine is to type 'make test' in
      the Wine sources top-level directory. This will run all the Wine
      conformance tests.
    </para>
    <para>
      The tests for a specific Wine library are located in a 'tests'
      directory in that library's directory. Each test is contained in a
      file (e.g. <filename>dlls/kernel/tests/thread.c</>). Each
      file itself contains many checks concerning one or more related APIs.
    </para>
    <para>
      So to run all the tests related to a given Wine library, go to the
      corresponding 'tests' directory and type 'make test'. This will
      compile the tests, run them, and create an '<replaceable>xxx</>.ok'
      file for each test that passes successfully. And if you only want to
      run the tests contained in the <filename>thread.c</> file of the
      kernel library, you would do:
      <screen>
<prompt>$ </>cd dlls/kernel/tests
<prompt>$ </>make thread.ok
      </screen>
    </para>
    <para>
      Note that if the test has already been run and is up to date (i.e. if
      neither the kernel library nor the <filename>thread.c</> file has
      changed since the <filename>thread.ok</> file was created), then make
      will say so. To force the test to be re-run, delete the
      <filename>thread.ok</> file, and run the make command again.
    </para>
    <para>
      You can also run tests manually using a command similar to the
      following:
      <screen>
<prompt>$ </>../../../tools/runtest -q -M kernel32.dll -p kernel32_test.exe.so thread.c
<prompt>$ </>../../../tools/runtest -P wine -p kernel32_test.exe.so thread.c
thread.c: 86 tests executed, 5 marked as todo, 0 failures.
      </screen>
      The '-P wine' option defines the platform that is currently being
      tested and is used in conjunction with the 'todo' statements (see
      below). Remove the '-q' option if you want the testing framework
      to report statistics about the number of successful and failed tests.
      Run <command>runtest -h</> for more details.
    </para>
  </sect1>

  <sect1 id="cross-compiling-tests">
    <title>Cross-compiling the tests with MinGW</title>
    <sect2>
      <title>Setup of the MinGW cross-compiling environment</title>
      <para>
        Here are some instructions to set up MinGW on different Linux
        distributions and *BSD.
      </para>
      <sect3>
        <title>Debian GNU/Linux</title>
        <para>
          On Debian do <command>apt-get install mingw32</>.
        </para>
        <para>
          The standard MinGW libraries will probably be incomplete, causing
          'undefined symbol' errors. So get the latest
          <ulink url="http://mirzam.it.vu.nl/mingw/">mingw-w32api RPM</>
          and use <command>alien</> to either convert it to a .tar.gz file
          from which to extract just the relevant files, or to convert it
          to a Debian package that you will install.
        </para>
      </sect3>
      <sect3>
        <title>Red Hat Linux and other RPM-based systems</title>
        <para>
          This includes Fedora Core, Red Hat Enterprise Linux, Mandrake and
          most probably SuSE Linux too. The list isn't exhaustive;
          the following steps should work on any RPM-based system.
        </para>
        <para>
          Download and install the latest rpm's from the
          <ulink url="http://mirzam.it.vu.nl/mingw/">MinGW RPM packages</> page.
          Alternatively you can follow the instructions on that page and
          build your own packages from the source rpm's listed there as well.
        </para>
      </sect3>
      <sect3>
        <title>*BSD</title>
        <para>
          The *BSD systems have in their ports collection a port for the
          MinGW cross-compiling environment. Please see the documentation
          of your system about how to build and install a port.
        </para>
      </sect3>
    </sect2>
    <sect2>
      <title>Compiling the tests</title>
      <para>
        Once the cross-compiling environment is set up, generating the
        Windows executables is easy using the Wine build system.
      </para>
      <para>
        If you had already run <command>configure</>, then delete
        <filename>config.cache</> and re-run <command>configure</>.
        You can then run <command>make crosstest</>. To sum up:
        <screen>
<prompt>$ </><userinput>rm config.cache</>
<prompt>$ </><userinput>./configure</>
<prompt>$ </><userinput>make crosstest</>
        </screen>
      </para>
    </sect2>
  </sect1>

  <sect1 id="testing-windows">
    <title>Building and running the tests on Windows</title>
    <sect2>
      <title>Using pre-compiled binaries</title>
      <para>
        The simplest solution is to download the
        <ulink url="http://www.astro.gla.ac.uk/users/paulm/WRT/CrossBuilt/winetest-latest.exe">latest
        version of winetest</>. This executable contains all the Wine
        conformance tests, runs them and reports the results.
      </para>
      <para>
        You can also get the older versions from
        <ulink url="http://www.astro.gla.ac.uk/users/paulm/WRT/CrossBuilt/">Paul
        Millar's website</>.
      </para>
    </sect2>
    <sect2>
      <title>With Visual C++</title>
      <itemizedlist>
        <listitem><para>
          If you are using Visual Studio 6, make sure you have the
          "processor pack" from
          <ulink url="http://msdn.microsoft.com/vstudio/downloads/tools/ppack/default.aspx">http://msdn.microsoft.com/vstudio/downloads/tools/ppack/default.aspx</>.
          The processor pack fixes <emphasis>"error C2520: conversion from
          unsigned __int64 to double not implemented, use signed __int64"</>.
          However note that the "processor pack" is incompatible with
          Visual Studio 6.0 Standard Edition, and with the Visual Studio 6
          Service Pack 6. If you are using Visual Studio 7 or greater you
          do not need the processor pack. In either case it is recommended
          to install the most recent compatible Visual Studio
          <ulink url="http://msdn.microsoft.com/vstudio/downloads/updates/sp/">service pack</>.
        </para></listitem>
        <listitem><para>
          Get the Wine sources.
        </para></listitem>

        <listitem><para>
          Run msvcmaker to generate Visual C++ project files for the tests.
          'msvcmaker' is a perl script so you may be able to run it on
          Windows.
          <screen>
<prompt>$ </>./tools/winapi/msvcmaker --no-wine
          </screen>
        </para></listitem>
        <listitem><para>
          If the previous steps were done on your Linux development
          machine, make the Wine sources accessible to the Windows machine
          on which you are going to compile them. Typically you would do
          this using Samba, but copying them over would work too.
        </para></listitem>

        <listitem><para>
          On the Windows machine, open the <filename>winetest.dsw</>
          workspace. This will load each test's project. For each test there
          are two configurations: one compiles the test with the Wine
          headers, and the other uses the Microsoft headers.
        </para></listitem>
        <listitem><para>
          If you choose the "Win32 MSVC Headers" configuration, note that
          most of the tests will not compile with the regular Visual Studio
          headers. So to use this configuration, download and install a recent
          <ulink url="http://www.microsoft.com/msdownload/platformsdk/sdkupdate/">Platform SDK</>
          as well as the latest <ulink url="http://msdn.microsoft.com/library/default.asp?url=/downloads/list/directx.asp">DirectX SDK</>.
          Then, <ulink url="http://msdn.microsoft.com/library/default.asp?url=/library/EN-US/sdkintro/sdkintro/installing_the_platform_sdk_with_visual_studio.asp">configure Visual Studio</>
          to use these SDKs' headers and libraries. Alternatively you could go
          to the <menuchoice><guimenu>Project</> <guimenu>Settings...</></>
          menu and modify the settings appropriately, but you would then
          have to redo this whenever you rerun msvcmaker.
        </para></listitem>
        <listitem><para>
          Open the <menuchoice><guimenu>Build</> <guimenu>Batch
          build...</></> menu and select the tests and build configurations
          you want to build. Then click on <guibutton>Build</>.
        </para></listitem>

        <listitem><para>
          To run a specific test from Visual C++, go to
          <menuchoice><guimenu>Project</> <guimenu>Settings...</></>. There
          select that test's project and build configuration and go to the
          <guilabel>Debug</> tab. There type the name of the specific test
          to run (e.g. 'thread') in the <guilabel>Program arguments</>
          field. Validate your change by clicking on <guibutton>Ok</> and
          start the test by clicking the red exclamation mark (or hitting
          'F5' or any other usual method).
        </para></listitem>

        <listitem><para>
          You can also run the tests from the command line. You will find
          them in either <filename>Output\Win32_Wine_Headers</> or
          <filename>Output\Win32_MSVC_Headers</> depending on the build
          method. So to run the kernel 'path' tests you would do:
          <screen>
<prompt>C:\></>cd dlls\kernel\tests\Output\Win32_MSVC_Headers
<prompt>C:\wine\dlls\kernel\tests\Output\Win32_MSVC_Headers></> kernel32_test path
          </screen>
        </para></listitem>
      </itemizedlist>
    </sect2>
    <sect2>
      <title>With MinGW</title>
      <para>
        Wine's build system already has support for building tests with a MinGW
        cross-compiler. See the section above called 'Setup of the MinGW
        cross-compiling environment' for instructions on how to set things up.
        When you have a MinGW environment installed all you need to do is rerun
        configure and it should detect the MinGW compiler and tools. Then run
        'make crosstest' to start building the tests.
      </para>
    </sect2>
  </sect1>

  <sect1 id="testing-test">
    <title>Inside a test</title>
    <para>
      When writing new checks you can either modify an existing test file or
      add a new one. If your tests are related to the tests performed by an
      existing file, then add them to that file. Otherwise create a new .c
      file in the tests directory and add that file to the
      <varname>CTESTS</> variable in <filename>Makefile.in</>.
    </para>
    <para>
      A new test file will look something like the following:
      <screen>
#include <wine/test.h>
#include <winbase.h>

/* Maybe auxiliary functions and definitions here */

START_TEST(paths)
{
    /* Write your checks there or put them in functions you will call from
     * there
     */
}
      </screen>
    </para>
    <para>
      The test's entry point is the START_TEST section. This is where
      execution will start. You can put all your tests in that section but
      it may be better to split related checks into functions you will call
      from the START_TEST section. The parameter to START_TEST must match
      the name of the C file. So in the above example the C file would be
      called <filename>paths.c</>.
    </para>
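    <para>
      For instance, once the checks are split into per-area functions, the
      skeleton above could be organized along the following lines (the
      function names are of course only illustrative):
      <screen>
static void test_long_paths(void)
{
    /* checks related to long pathnames go here */
}

static void test_short_paths(void)
{
    /* checks related to short pathnames go here */
}

START_TEST(paths)
{
    test_long_paths();
    test_short_paths();
}
      </screen>
    </para>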
    <para>
      Tests should start by including the <filename>wine/test.h</> header.
      This header will provide you access to all the testing framework
      functions. You can then include the Windows headers you need, but make
      sure not to include any Unix or Wine specific headers: tests must
      compile on Windows.
    </para>
    <para>
      You can use <function>trace</> to print informational messages. Note
      that these messages will only be printed if 'runtest -v' is being used.
      <screen>
    trace("testing GlobalAddAtomA\n");
    trace("foo=%d\n",foo);
      </screen>
    </para>
    <para>
      Then just call functions and use <function>ok</> to make sure that
      they behaved as expected:
      <screen>
    ATOM atom = GlobalAddAtomA( "foobar" );
    ok( GlobalFindAtomA( "foobar" ) == atom, "could not find atom foobar\n" );
    ok( GlobalFindAtomA( "FOOBAR" ) == atom, "could not find atom FOOBAR\n" );
      </screen>
      The first parameter of <function>ok</> is an expression which must
      evaluate to true if the test was successful. The next parameter is a
      printf-compatible format string which is displayed in case the test
      failed, and the following optional parameters depend on the format
      string.
    </para>
  </sect1>

  <sect1 id="testing-error-messages">
    <title>Writing good error messages</title>
    <para>
      The message that is printed when a test fails is
      <emphasis>extremely</> important.
    </para>
    <para>
      Someone will take your test, run it on a Windows platform that
      you don't have access to, and discover that it fails. They will then
      post an email with the output of the test, and in particular your
      error message. Someone, maybe you, will then have to figure out from
      this error message why the test failed.
    </para>
    <para>
      If the error message contains all the relevant information, that will
      be easy. If not, then it will require modifying the test, finding
      someone to compile it on Windows, sending the modified version to the
      original tester and waiting for his reply. In other words, it will
      be long and painful.
    </para>
    <para>
      So how do you write a good error message? Let's start with an example
      of a bad error message:
      <screen>
    ok(GetThreadPriorityBoost(curthread,&disabled)!=0,
       "GetThreadPriorityBoost Failed\n");
      </screen>
      This will yield:
      <screen>
thread.c:123: Test failed: GetThreadPriorityBoost Failed
      </screen>
    </para>
    <para>
      Did you notice how the error message provides no information about
      why the test failed? We already know from the line number exactly
      which test failed. In fact the error message gives strictly no
      information that cannot already be obtained by reading the code. In
      other words it provides no more information than an empty string!
    </para>
    <para>
      Let's look at how to rewrite it:
      <screen>
    BOOL rc;
...
    rc=GetThreadPriorityBoost(curthread,&disabled);
    ok(rc!=0 && disabled==0,"rc=%d error=%ld disabled=%d\n",
       rc,GetLastError(),disabled);
      </screen>
      This will yield:
      <screen>
thread.c:123: Test failed: rc=0 error=120 disabled=0
      </screen>
    </para>
    <para>
      When receiving such a message, one would check the source, see that
      it's a call to GetThreadPriorityBoost, and that the test failed not
      because the API returned the wrong value, but because it returned an
      error code. Furthermore we see that GetLastError() returned 120 which
      winerror.h defines as ERROR_CALL_NOT_IMPLEMENTED. So the source of
      the problem is obvious: this Windows platform (here Windows 98) does
      not support this API and thus the test must be modified to detect
      such a condition and skip the check.
    </para>
    <para>
      So a good error message should provide all the information which
      cannot be obtained by reading the source, typically the function
      return value, error codes, and any function output parameter. Even if
      more information is needed to fully understand a problem,
      systematically providing the above is easy and will help cut down the
      number of iterations required to get to a resolution.
    </para>
    <para>
      It may also be a good idea to dump items that may be hard to retrieve
      from the source, like the expected value in a test if it is the
      result of an earlier computation, or comes from a large array of test
      values (e.g. index 112 of _pTestStrA in vartest.c). In that respect,
      for some tests you may want to define a macro such as the following:
      <screen>
#define eq(received, expected, label, type) \
        ok((received) == (expected), "%s: got " type " instead of " type "\n", (label),(received),(expected))

...

    eq( b, curr_val, "SPI_{GET,SET}BEEP", "%d" );
      </screen>
    </para>
  </sect1>

  <sect1 id="testing-platforms">
    <title>Handling platform issues</title>
    <para>
      Some checks may be written before they pass successfully in Wine.
      Without some mechanism, such checks would potentially generate
      hundreds of known failures for months each time the tests are being run.
      This would make it hard to detect new failures caused by a regression,
      or to detect that a patch fixed a long-standing issue.
    </para>
    <para>
      Thus the Wine testing framework has the concept of platforms, and
      groups of checks can be declared as expected to fail on some of them.
      In the most common case, one would declare a group of tests as
      expected to fail in Wine. To do so, use the following construct:
      <screen>
    todo_wine {
        SetLastError( 0xdeadbeef );
        ok( GlobalAddAtomA(0) == 0 && GetLastError() == 0xdeadbeef, "failed to add atom 0\n" );
    }
      </screen>
      On Windows the above check would be performed normally, but on Wine it
      would be expected to fail, and not cause the failure of the whole
      test. However, if that check were to succeed in Wine, it would
      cause the test to fail, thus making it easy to detect when something
      has changed that fixes a bug. Also note that todo checks are accounted
      for separately from regular checks so that the testing statistics remain
      meaningful. Finally, note that todo sections can be nested so that if
      a test only fails on the cygwin and reactos platforms, one would
      write:
      <screen>
    todo("cygwin") {
        todo("reactos") {
            ...
        }
    }
      </screen>
      <!-- FIXME: Would we really have platforms such as reactos, cygwin, freebsd & co? -->
      But specific platforms should not be nested inside a todo_wine section
      since that would be redundant.
    </para>
    <para>
      When writing tests you will also encounter differences between Windows
      9x and Windows NT platforms. Such differences should be treated
      differently from the platform issues mentioned above. In particular
      you should remember that the goal of Wine is not to be a clone of any
      specific Windows version but to run Windows applications on Unix.
    </para>
    <para>
      So, if an API returns a different error code on Windows 9x and
      Windows NT, your check should just verify that Wine returns one or
      the other:
      <screen>
    ok ( GetLastError() == WIN9X_ERROR || GetLastError() == NT_ERROR, ...);
      </screen>
    </para>
    <para>
      If an API is only present on some Windows platforms, then use
      LoadLibrary and GetProcAddress to check whether it is implemented
      before invoking it. Remember, tests must run on all Windows platforms.
      Similarly, conformance tests should not try to correlate the Windows
      version returned by GetVersion with whether given APIs are
      implemented or not. Again, the goal of Wine is to run Windows
      applications (which do not do such checks), and not to be a clone of a
      specific Windows version.
    </para>
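    <para>
      Here is a sketch of how such a check could look, using
      GetDiskFreeSpaceExA (absent from the earliest Windows 95 releases)
      purely as an example:
      <screen>
    BOOL (WINAPI *pGetDiskFreeSpaceExA)(LPCSTR, PULARGE_INTEGER, PULARGE_INTEGER, PULARGE_INTEGER);

    pGetDiskFreeSpaceExA = (void *)GetProcAddress( GetModuleHandleA("kernel32.dll"),
                                                   "GetDiskFreeSpaceExA" );
    if (!pGetDiskFreeSpaceExA)
    {
        /* the API is not available on this platform, skip the related checks */
        trace("GetDiskFreeSpaceExA is not available\n");
        return;
    }
    /* checks calling pGetDiskFreeSpaceExA(...) go here */
      </screen>
    </para>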
    <!--para>
    FIXME: What about checks that cause the process to crash due to a bug?
    </para-->
  </sect1>

  <!-- FIXME: Strategies for testing threads, testing network stuff,
       file handling... -->

</chapter>

<!-- Keep this comment at the end of the file
Local variables:
mode: sgml
sgml-parent-document:("wine-devel.sgml" "set" "book" "part" "chapter" "")
End:
-->