WineD3D Benchmarking Part 2: Phoronix Test Suite

After Part 1, I waited to run any benchmarks until it looked like Wine 1.6 was actually close to release. I wanted to run them all at once, so that the results weren’t skewed by system updates between runs. (This is another problem I have with looking at Stefan’s historic results: I don’t know which improvements are due to Wine development, and which came from video driver/kernel/library updates.)

In its own $WINEPREFIX, I installed the Phoronix Test Suite (PTS) and PHP for Windows. Unfortunately, some hacking was required to get PTS to output the system information I wanted, as the Windows side isn’t fully developed (there’s a lot of code like: if(phodevi::is_windows()) { // TODO: Windows support }). I didn’t keep track of everything I ended up changing, but I’d imagine much of it is now irrelevant, as a newer release of PTS (4.6) is out which I have yet to try.

After some experimentation, I’ve determined that I want to run as many tests as possible at Full HD (1920×1080) with maximum settings. I want to really stress my GPU, to see what Wine can pull out of it. Many of Stefan’s benchmarks (as well as the PTS defaults) are configured for a lower, more common resolution, although many of them are also written to be configurable. My options were somewhat limited, as I was running Windows tests, and I wanted a fair spectrum of DirectX 9 as well as OpenGL graphics. The tests I settled on are:

  • pts/lightsmark-1.2.0 (OpenGL)
  • pts/unigine-heaven-1.2.0 (OpenGL and DirectX 9)
  • pts/xonotic-1.2.0 (OpenGL)
  • stefandoesinger/3dmark2001-1.0.0 (DirectX 8.1)
  • stefandoesinger/clear_d3d-1.0.0 (D3D)
  • stefandoesinger/drawprim_d3d-1.0.0 (D3D)
  • stefandoesinger/dynbuffer_d3d-1.0.0 (D3D)
  • stefandoesinger/streamsrc_d3d-1.0.0 (D3D)
  • stefandoesinger/halflife2-1.0.3 (DirectX 8.1 and 9.5)
  • stefandoesinger/tmnations-1.0.3 (DirectX 9)
  • local/furmark (OpenGL)
  • local/shadermark (DirectX Pixel Shaders 2.0 and 3.0)

You’ll notice I have a couple of “local” tests: I added the Shadermark test because Wine work seems to be progressing well on Pixel Shader 2.0, but not so much on 3.0 (yet). This benchmark is also written to specifically use the HLSL compiler, which has seen development in the Wine 1.5 series. I added Furmark because it’s a common Windows GPU stress-testing utility.

A few of these tests require external utilities that are assumed to be installed: specifically AutoHotKey and the Windows versions of GNU coreutils and grep. AHK is required for automated interaction with many of the Windows dialogs, since win32 applications tend not to have much in the way of command-line parameters. Coreutils and grep are needed for some of the output parsing: the scripts provided by Stefan relied on cmd.exe behaviour that wasn’t implemented in earlier versions of Wine’s cmd.exe (I forget the specific version); in particular, the “for /f” syntax in batch files didn’t work as expected. Using the extra tools was my workaround, as in the example below.
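
For instance, instead of extracting a score with “for /f”, a test profile’s batch script can pipe through the GNU tools. This is only a hypothetical fragment (file names and result formats vary per test), but the same line works under both Wine’s cmd.exe and a Unix shell:

    grep "Average FPS" result.txt | cut -d: -f2 > fps.txt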

Other notes:

  • The 3DMark2001 AHK script needed to change at wine-1.5.9, as the RandR support developed: various dialog boxes would display the resolutions in different orders. In versions of Wine before 1.5.9, 3DMark2001 doesn’t accept 1920×1080 as a valid resolution.
  • The TMNations AHK script needed to change at wine-1.5.20: before this, Wine reports the video card as a generic fallback. After this, Wine reports the correct video card (GTX 660), but TMNations is not familiar with the card, and throws an error dialog (which is easily bypassed).
  • Because of these two things, it might be worth remembering that:
    • Before 1.5.9, the resolution may be incorrect
    • Before 1.5.20, video card detection may be incorrect (or before whatever version your card’s PCI ID and model were added to the Wine codebase)

To set up the automation, the first thing I did was create a script simply to run PTS in the correct environment. Among other things, it passes the environment variables needed to auto-login to Steam and TMNations, disables Wine debug output (for speed), and sets $LINUX_VERSION and $WINE_VERSION so that PTS can extract them later, since from within Wine I couldn’t find an easy way to get at this information. This allowed me to run PTS by hand by simply calling the script, passing along any options that PTS takes. I very cleverly named this script /home/orion/bin/pts-win.
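
In rough outline it looks something like this. To be clear, this is a reconstruction for illustration, not the actual script: the prefix path, the login variable names, and the PTS entry point inside the prefix are all assumptions.

    #!/bin/bash
    # pts-win: run the Windows PTS inside its own prefix (sketch)
    export WINEPREFIX="$HOME/.wine-pts"        # assumed prefix location
    export WINEDEBUG=-all                      # no debug output, for speed
    export STEAM_USER=me STEAM_PASS=xxxx       # placeholder auto-login vars
    # Stash version info where my patched PTS can pick it up later,
    # since it is awkward to discover from inside Wine:
    export LINUX_VERSION="$(uname -r)"
    export WINE_VERSION="$(wine --version)"
    # Entry point path is a guess; PTS ships a Windows .bat launcher
    exec wine cmd /c 'C:\phoronix-test-suite\phoronix-test-suite.bat' "$@"

With that in place, calling pts-win with any PTS subcommand behaves like calling phoronix-test-suite natively.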

Then I created a local test suite. This is an XML file that lists all the tests to run, and what arguments/parameters to run them with. Now I can run all the tests I listed above with pts-win bench local/winebench.
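
Abridged, the suite definition looks roughly like the sketch below, with one Execute block per test/argument combination. Local suites live under ~/.phoronix-test-suite/test-suites/local/, though the schema details here are from memory rather than copied from my file:

    <?xml version="1.0"?>
    <PhoronixTestSuite>
      <SuiteInformation>
        <Title>winebench</Title>
        <Version>1.0.0</Version>
        <TestType>Graphics</TestType>
        <Description>WineD3D benchmarks at 1920x1080, max settings</Description>
      </SuiteInformation>
      <Execute>
        <Test>pts/unigine-heaven-1.2.0</Test>
        <Arguments>... resolution/renderer options ...</Arguments>
      </Execute>
      <Execute>
        <Test>stefandoesinger/halflife2-1.0.3</Test>
      </Execute>
      <!-- ...and so on for the rest of the list above... -->
    </PhoronixTestSuite>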

Once I had that working, I set up a wrapper script around it to auto-configure a specific run for the currently-installed version of Wine. It sets the environment variables that PTS needs in order to run unattended (TEST_RESULTS_NAME, TEST_RESULTS_DESCRIPTION, TEST_RESULTS_IDENTIFIER), as well as an optional list of tests to skip (which makes debugging the script that much faster/easier). Then it renames the test results (for some reason, they all come out named after my graphics card, even though I set all the correct variables; I couldn’t find where in the PTS code to fix this). Continuing with my highly sophisticated naming convention, this script is called /home/orion/bin/bench-wine. I’ve decided I’ll wait until I’ve processed the full run before uploading any results to OpenBenchmarking.org.
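
A sketch of the idea (the skip-list handling and the final rename are simplified, and the real result paths depend on where PTS stores results inside the prefix):

    #!/bin/bash
    # bench-wine: configure and launch one full run for the installed Wine
    WINE_VERSION="$(wine --version | sed 's/^wine-//')"
    export TEST_RESULTS_NAME="winebench-$WINE_VERSION"
    export TEST_RESULTS_IDENTIFIER="wine-$WINE_VERSION"
    export TEST_RESULTS_DESCRIPTION="local/winebench under wine-$WINE_VERSION"
    export SKIP_TESTS="$*"   # optional tests to skip while debugging; how the
                             # suite consumes this is up to the run scripts
    pts-win bench local/winebench
    # ...then rename the result directory, which PTS stubbornly names
    # after the graphics card, to $TEST_RESULTS_NAME (path elided here).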

Good, now I can bench the current Wine version. I need to wrap that in another script that changes the Wine version. For no good reason other than it’s what I typed, let’s call this one /home/orion/bin/wine-bench-wrapper. We’ll use dpkg --compare-versions to do the version checks and copy in the files specific to running the benchmark under different Wine versions (see the AHK notes above). I’ll add a specific exception for myself to /etc/sudoers so that my script can apt-get install various versions of Wine non-interactively (I’ve already added exceptions for my auto-build script, so that it could use mount, umount, and chroot to make the i386 packages). Have the script loop over the versions I’m testing (using brace expansion, the list is: 1.3.{35..37} 1.4 1.5.{0..31} 1.6-rc{1..5}) and … ready… GO!
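
The skeleton looks something like this. The package/version strings and AHK file locations are placeholders, and apt-get’s downgrade handling is glossed over:

    #!/bin/bash
    # wine-bench-wrapper: install each Wine version in turn, then benchmark it
    AHK_DIR="$HOME/bench-files/active"   # placeholder location
    for v in 1.3.{35..37} 1.4 1.5.{0..31} 1.6-rc{1..5}; do
        # /etc/sudoers has a NOPASSWD exception for apt-get, so this
        # runs unattended
        sudo apt-get install -y "wine=$v" || continue
        # Copy in the AHK scripts appropriate to this version (see the
        # notes above)
        if dpkg --compare-versions "$v" lt 1.5.9; then
            cp "$HOME/bench-files/3dmark2001-lowres.ahk" "$AHK_DIR/3dmark2001.ahk"
        else
            cp "$HOME/bench-files/3dmark2001-1080p.ahk" "$AHK_DIR/3dmark2001.ahk"
        fi
        bench-wine
    done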

As a safety net, I have a killscript/watchdog running in another terminal: it checks for known binaries (e.g. cmd.exe, Steam.exe, hl2.exe, Heaven.exe, winedebug), checks their age, and kills them if they’re older than 7 minutes. No individual test takes more than 5 minutes (worst case), and most take no longer than 2 minutes.
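
A minimal sketch of such a watchdog, assuming procps’s etimes field (the real script may differ):

    #!/bin/bash
    # watchdog: kill benchmark processes that appear to have hung
    while sleep 60; do
        for name in cmd.exe Steam.exe hl2.exe Heaven.exe; do
            for pid in $(pgrep -f "$name"); do
                age=$(ps -o etimes= -p "$pid")   # seconds since process start
                if [ "$age" -gt 420 ]; then      # 7 minutes
                    echo "killing stale $name (pid $pid, ${age}s old)"
                    kill -9 "$pid"
                fi
            done
        done
    done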

A full run of local/winebench-1.0.0 takes about 90 minutes. I’m attempting to run through 41 versions of wine, so at a minimum (should nothing stall or break), the run will take about 62 hours. That’s 2.5 days. If all goes well, I’ll start the run tonight.


One Comment

  • Aaron Haviland says:

    Missed the 1.6 release by a day. I had expected they’d release on Friday, as they had been during the development cycle, but they surprised me!

    Tests are running well and should be done sometime tomorrow night; if all is well, I’ll upload them to OpenBenchmarking on Saturday.