Original Link: https://www.anandtech.com/show/1509




Introduction

Doom3 was a turning point for a lot of us, as it marked an important milestone in next-generation game engines. We have been keeping a very close eye on id's Linux adventure, and at the core of id's Linux development is Timothee Besset, the Linux port maintainer.

"I'm getting surprisingly good performance compared to the Windows version."

Timothee Besset, Linuxgames.com [1]

This sounded like a wonderful opportunity to put Doom3 through its paces, so we crafted this entire analysis around Timothee's expectations.

Our goals for this analysis are twofold. First, we want to take the newest working video cards that we can find and test their performance on Linux using Doom3. This is partly a continuation of last week's GPU roundup, as the Doom3 engine will ultimately become the next cornerstone for Linux first-person shooter games. This includes exhaustive image quality (IQ) testing. Second, we wish to run a comparative analysis of how Doom3 performs and looks on Linux versus Windows.




Setting it Up

We are required to run 24-bit color for this game, so the default 16-bit desktop had to be reconfigured for 24-bit color using SaX2. We also double-checked to make sure that we were running the NVIDIA driver, as opposed to the NV driver; AA and AF settings can only be modified through the nvidia-settings utility when the NVIDIA driver is in use. Below, you can see a quick screenshot of the nvidia-settings utility.






As in our previous roundup, the 6800 and 6800 Ultra video cards are not recognized natively by the NVIDIA driver. We had to select the 5950 Ultra manually in SaX2, run the "switch2nvidia" utility provided with our drivers, and then reboot the system. We are not entirely sure why the reboot is necessary, but Doom3 would not initialize without it.
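For readers following along at home, a quick way to confirm that the accelerated NVIDIA driver is actually providing OpenGL (rather than the unaccelerated NV driver with Mesa software rendering) is to check the vendor string that glxinfo reports. The short Python sketch below is our own illustration rather than part of the official setup; it simply shells out to glxinfo and looks for "NVIDIA" in the output.

```python
# Minimal sketch: confirm that the accelerated NVIDIA driver is providing OpenGL
# by inspecting glxinfo output. Not part of id's or NVIDIA's tooling.
import subprocess

def opengl_strings():
    """Return the OpenGL vendor and renderer strings reported by glxinfo."""
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    info = {}
    for line in out.splitlines():
        if line.startswith("OpenGL vendor string:"):
            info["vendor"] = line.split(":", 1)[1].strip()
        elif line.startswith("OpenGL renderer string:"):
            info["renderer"] = line.split(":", 1)[1].strip()
    return info

if __name__ == "__main__":
    info = opengl_strings()
    print(info)
    if "NVIDIA" in info.get("vendor", ""):
        print("Accelerated NVIDIA driver is active.")
    else:
        print("NVIDIA driver not active; Doom3 would fall back to software GL.")
```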

Just like our large GPU roundup last week, we attempted to run Doom3 on ATI and NVIDIA mid-range/high-end cards from the last 18 months. Unfortunately, we were greeted with an ominous warning on Doom3's Linux port webpage.
"Currently, the game will not run correctly on ATI cards using the fglrx driver. However, the ATI developers are working on new driver releases, and eventually the game will be supported." [2]
We haven't heard much from the ATI team yet, but hopefully their upcoming Linux announcement will rectify this.




The Test

Below, you can see our test rig configuration. The ATI cards have been removed, since Doom3 does not run on them under our Linux configuration.

 Performance Test Configuration
Processor: AMD Athlon 64 3800+ (130nm, 2.4GHz, 512KB L2 Cache)
RAM: 2 x 512MB Mushkin PC-3200 CL2 (400MHz)
Motherboard: MSI K8T Neo2 (Socket 939)
Memory Timings: Default
Hard Drive: Seagate 7200.7 120GB SATA
Video Cards: GeForce 6800 Ultra 256MB
             GeForce 6800 128MB
             GeForce FX 5950 Ultra 256MB
             GeForce FX 5900 Ultra 128MB
             GeForce FX 5700 Ultra 128MB
             GeForce FX 5600XT 128MB
Operating Systems: SuSE 9.1 Professional (kernel 2.6.8-14-default)
                   Windows XP SP2
Drivers: NVIDIA 1.0-6111 (Linux)
         Detonator 61.77 (Windows)

Our testing procedure is very simple. We run our timedemos on each video card while the AnandTech FrameGetter utility captures data, and we rely on in-game benchmarks for some of our tests as well. We post the average frames-per-second score calculated by the utility. Remember, FrameGetter samples the frame rate every second, but it also tells us how long the demo ran and how many frames were rendered. This overall average is posted for most benchmarks, but where we want to illustrate important differences, we also show the second-by-second FPS.
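To make the distinction between the two figures concrete, here is a minimal sketch of the arithmetic. The log format is hypothetical (FrameGetter's actual output may differ); we simply assume one frame count per elapsed second of the demo. The overall average we post is total frames divided by total time, while the per-second numbers are what we use when we want to show how a run behaved over time.

```python
# Minimal sketch of how the two FPS figures relate. The input format is
# hypothetical: one integer per one-second interval of the timedemo, giving
# the number of frames rendered in that second.
def fps_summary(per_second_frames):
    total_frames = sum(per_second_frames)
    total_seconds = len(per_second_frames)
    overall_avg = total_frames / total_seconds   # the figure posted in the graphs
    per_second_fps = list(per_second_frames)     # the second-by-second trace
    return overall_avg, per_second_fps

# Example: a hypothetical 5-second demo
print(fps_summary([48, 52, 61, 40, 55]))  # -> (51.2, [48, 52, 61, 40, 55])
```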

For Doom3, we do not run the "timedemo" command, only the "playdemo" command. Timedemo changes the speed of the playback, which is not what we are interested in, since it skews our results in FrameGetter. We also appended a "1" after the demo name to enable pre-caching.

Much to our delight, version 0.1.0 of our FrameGetter utility, which we released last week, works correctly with Doom3. For Windows, we are still using FRAPS to record our timedemo information, although we are actively working on porting the AnandTech FrameGetter to Windows.

All of our benchmarks are run three times and the highest scores are taken; as a general trend, the highest score is usually the second or third pass of the timedemo. Why don't we report median values and standard deviations instead? For one, IO bottlenecks tend to occur due to the hard drive and memory, even though they "theoretically" should behave the same every time we run the program. Memory hogs like Doom3 and UT2004, which also load a lot of data off the hard drive, are notorious for behaving strangely on the first few passes, even though we are using the pre-caching option.




Doom3 Low Resolution

Below, you can see how our cards performed at the lower resolution on both Windows and Linux. Texture Sharpening was disabled in all tests. Since we could not run these tests at 16X AF on Linux for all cards, we turned the settings down to 8X AF on Windows as well. (We talk more about the 16X issue in the IQ section of this analysis.) Both platforms are using 4X 9-Tap Gaussian anti-aliasing. All tests were taken with default in-game settings (other than resolution).

Linux 1024x768 No AF No AA

Linux 1024x768 4X AA 8X AF

Windows 1024x768 No AA No AF

Windows 1024x768 4X AA 8X AF

Notes From the Lab

Feel free to check here to compare against our Windows results from the Doom3 launch. Derek used a slightly faster machine with slightly different testing parameters, but it should give you a general idea of where our results scale on Windows.

Unfortunately, it's fairly ridiculous to think that you will run the game with any of these settings on a 5600XT.

At this point, we are left looking for another interpretation of Timothee's statement. Performance is fairly consistently 15% to 25% worse on the Linux platform. Performance looks OK, but we certainly can't expect our Linux platform to outperform Windows in this portion of the analysis.




Doom3 High Resolution Averages

Below, you can see how our cards performed at the higher resolution on both Windows and Linux. Texture Sharpening was disabled in all tests. We didn't even test the 5600XT here; it was a waste of time at 1024x768, and it would be an even larger waste of time here.

Linux 1280x1024 No AF No AA

Linux 1280x1024 4X AA 8X AF

Windows 1280x1024 No AF No AA

Windows 1280x1024 4X AA 8X AF

Notes From the Lab

Again, Linux could not keep up with Windows here. The gap narrows slightly at the higher resolution, but it seems the debate was over before it began.




Full Screen Anti-Aliasing

An interesting and helpful feature of the AnandTech FrameGetter is that it always records the first frame of a timedemo, so long as the timedemo takes more than 2 seconds to load. This makes sense, since the screen before the timedemo is almost always a static "loading" screen; nothing new is output from the frame buffer. In any case, this provides us with an excellent opportunity to do some very neat IQ testing.

Curiously, our NVIDIA drivers have a slider for 16X AA, which is generally unsupported outside of the Quadro cards for NVIDIA on Windows. Attempting to run Doom3 at 16X AA resulted in less than 10 FPS during the demo1 timedemo. In all, we can enable 2X Bilinear, 2X Quincunx, 4X Bilinear, 4X 9-tap Gaussian, 8X, or 16X AA. A simple analysis of the image below under the various AA settings follows.

The screenshot that we are using for analysis can be seen below at 16X AA. Feel free to download our AA raw data files here.






Now, we look at a smaller piece of the puzzle for each image.

AA Setting Image
(mouse-over for No AA)
Difference Map
(click to enlarge)
No AA
2X AA Bilinear
2X AA Quincunx
4X AA Bilinear
4X AA 9-tap Gaussian
8X AA
16X AA
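For anyone who grabs the raw data files above and wants to reproduce our comparisons, a difference map is nothing more exotic than the per-pixel absolute difference between two screenshots. The sketch below uses the Pillow imaging library with hypothetical file names and an arbitrary crop region; it is not the exact tool we used, just an illustration of the technique under the assumption that both screenshots share the same resolution.

```python
# Minimal sketch of producing a difference map: subtract two screenshots
# pixel by pixel and save the absolute difference. File names and the crop
# box are placeholders, not our actual data.
from PIL import Image, ImageChops

base = Image.open("noaa.png").convert("RGB")     # reference shot (No AA)
test = Image.open("16x_aa.png").convert("RGB")   # shot taken with 16X AA

diff = ImageChops.difference(base, test)         # per-pixel |base - test|
diff.save("diff_noaa_vs_16x.png")

# Optionally crop a smaller region of interest, as in the table above.
# The box is (left, upper, right, lower) in pixels; values here are arbitrary.
crop = diff.crop((400, 300, 656, 492))
crop.save("diff_crop.png")
```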

16X AA is clearly working on our Linux machine, giving our Linux users an advantage over our Windows users. Performance is abysmal, but that is not something we are overly concerned about right now, since abysmal performance and Doom3 tend to go hand in hand. The graph below demonstrates how AA affected performance in our demo1 timedemo on the 6800 at 1280x1024.

Linux AA Scaling - GeForce 6800

It should be noted that when setting AA higher than 4X on our GeForce 6800 cards, the screen would occasionally corrupt into a static, snowy image and then freeze the entire machine.

It seems that our sweet spot for FSAA in Doom3 is the 4X 9-Tap Gaussian mode. 16X AA looks amazing; there is a clear visual difference. However, the performance hit is extremely noticeable. Below, we have provided a difference map of 4X 9-Tap and 16X. The shadow appears considerably better sampled and looks less artificial.



Although the difference between the two screenshots is definitely visible, 4X Gaussian does a fairly good job of cleaning up the jagged edges around the sides of the machine. The real difference occurs right in the center, where the 16X image blends the hard edges into a more fluid-looking object.




Texture Sharpening

Texture Sharpening is something that bugs us a little. Enabling Texture Sharpening should only increase Anisotropic Filtering one increment higher than its current setting; if AF is already set to 8X, it does nothing. In our examples, we were surprised to find that Texture Sharpening does not even do that. In the two screenshots below, you will see the default image on the left and the image with Texture Sharpening enabled on the right.


There is absolutely no difference between these two images. You can check out the difference map that we made of the images below.



Of course, we expect Texture Sharpening to look exactly like 2X AF. Below, you can see the image with no AF and with 2X AF, as well as the difference map.




It's hard to see, but clicking on the enlarged difference image shows the 2X AF effect very clearly. It should not really matter much anyway, since you can set the AF slider to whatever level you want; under Linux, however, checking the Texture Sharpening box does nothing. There is no change in average FPS when enabling or disabling Texture Sharpening.




Anisotropic Filtering: GeForce FX 5950 Ultra

Remember, we claimed that AF was not working for NVIDIA in our previous analysis. Using the NVIDIA driver (as opposed to the NV driver), AF does work, but not quite the way you would expect on the GeForce FX 5950 Ultra that we used. Below, you can see comparisons of our image at 2X, 4X, and 8X AF, as well as the difference maps.

 AF Setting  Image
(click to enlarge)
 Difference Map
(click to enlarge)
No AF -
2X AF
4X AF
8X AF

You'll notice that we didn't include any mouse-over comparisons for the above images. Quite frankly, there is not much difference from image to image, and we should see significant differences between 2X and 8X. This caused us much alarm. We also noted that NVIDIA's Linux drivers do not support 16X AF for this card (while the Windows drivers do). Something seemed amiss, so we repeated the above procedure with a GeForce 6800 Ultra.




Anisotropic Filtering: GeForce 6800 Ultra

The NVIDIA 6xxx series uses a different algorithm for Anisotropic Filtering, and AF on the 6xxx cards appears to be working fine. There is a clear difference between each setting.

AF Setting Image
(click to enlarge)
Difference Map
(click to enlarge)
No AF -
2X AF
4X AF
8X AF
16X AF

Below, you can see the image quality difference between the image with no AF, the image with 2X AF, and the image with 16X AF.



No AF. Hold your mouse over the image to see 16X AF.


There is a very clear difference here between the various AF levels, unlike on the 5xxx series cards. Below is a chart indicating the frames per second for each AF setting on the GeForce 6800 at 1280x1024.

Linux AF Scaling - GeForce 6800

Performance drops as much as 25% moving from 1X (No) AF to 16X AF. There does not appear to be a definitive sweet spot, since the graph scales very linearly.




Final Thoughts

The majority of our analysis for this first look at Doom3 on Linux revolved around image quality. NVIDIA's addition of 16X and 8X anti-aliasing to the 1.0-6111 driver was a welcome feature. The practicality of running 16X or 8X AA during gameplay is about nil (as shown on the previous pages), but any Linux user should appreciate having a feature over Windows users. Hopefully, the infrequent crashing issue at higher AA levels will be fixed in the next driver revision.

The practicality of NVIDIA's Texture Sharpening is already somewhat contested; it is only supposed to bump the AF slider one notch (and without telling the user). On Linux, enabling Texture Sharpening seemed to provide no performance or image quality benefit at all.

Finally, to answer the question of which OS runs Doom3 faster: unfortunately, there really was not the contest that we had anticipated. Yes, Doom3 for Linux stays competitive with Doom3 for Windows, but in several instances it falls more than 25% behind on NVIDIA graphics cards. If and when ATI cards start working with Doom3, we are likely to see even larger differences.

