The Test

Below is our test rig configuration. The ATI cards have been removed, since they do not run under our Linux configuration.

Performance Test Configuration

Processor: AMD Athlon 64 3800+ (130nm, 2.4GHz, 512KB L2 cache)
RAM: 2 x 512MB Mushkin PC-3200 CL2 (400MHz)
Motherboard: MSI K8T Neo2 (Socket 939)
Memory Timings: Default
Hard Drive: Seagate 7200.7 120GB SATA
Video Cards: GeForce 6800 Ultra 256MB
             GeForce 6800 128MB
             GeForce FX 5950 Ultra 256MB
             GeForce FX 5900 Ultra 128MB
             GeForce FX 5700 Ultra 128MB
             GeForce FX 5600XT 128MB
Operating Systems: SuSE 9.1 Professional (kernel 2.6.8-14-default)
                   Windows XP SP2
Drivers: NVIDIA 1.0-6111 (Linux)
         Detonator 61.77 (Windows)

Our testing procedure is very simple. We take our various video cards and run the respective timedemos while recording with our AnandTech FrameGetter utility; we rely on in-game benchmarks for some of our tests as well. We post the average frames-per-second score calculated by the utility. Remember, FrameGetter samples the frame rate every second, but it also records how long the demo ran and how many frames it rendered, which is where the average comes from. The average is posted for most benchmarks, but where we want to illustrate important differences, we also show the per-second FPS data.
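
To make that concrete, here is a minimal sketch in Python (with made-up sample data; an illustration of the arithmetic only, not FrameGetter's actual source) of how the overall average falls out of the total frame count and demo length that FrameGetter records:

    # Hypothetical per-second frame counts logged during a demo run.
    samples = [62, 58, 71, 40, 66]  # frames rendered in each one-second interval

    total_frames = sum(samples)   # how many frames the demo took
    total_seconds = len(samples)  # how long the demo ran (one sample per second)

    # The average score we post: total frames divided by total demo time.
    average_fps = total_frames / total_seconds
    print(f"{total_frames} frames over {total_seconds}s -> {average_fps:.1f} fps")

    # The per-second data we show when highlighting important differences.
    for second, fps in enumerate(samples, start=1):
        print(f"t={second}s: {fps} fps")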

For Doom3, we do not run the "timedemo" command, only the "playdemo" command. Timedemo plays the demo back as fast as the system can render it rather than at normal speed - that's not what we are interested in, since it skews our results in FrameGetter. We also appended a "-" after the demo name to enable pre-caching.

Much to our delight, version 0.1.0 of our FrameGetter utility, which we released last week, works correctly with Doom3. Under Windows, we are still using FRAPS to record our timedemo information, although we are actively working on porting AnandTech FrameGetter to Windows.

All of our benchmarks are run three times, and the highest score obtained is the one we report - as a general trend, the highest score usually comes on the second or third pass of the timedemo. Why don't we report median values and standard deviations instead? Because early passes are rarely representative: I/O bottlenecks from the hard drive and memory crop up even though runs "theoretically" should behave the same every time. Memory hogs like Doom3 and UT2004, which also load a lot of data off the hard drive, are notorious for behaving strangely on the first few passes, even though we are using the pre-caching option.
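
For the curious, the selection boils down to something like the following sketch (Python, with made-up scores rather than our actual tooling); the median and standard deviation we chose not to report are shown alongside for comparison:

    import statistics

    # Hypothetical fps scores from three passes of the same timedemo; the
    # first pass is often depressed by disk I/O and cache warm-up effects.
    passes = [52.3, 57.1, 57.4]

    print("reported (best):", max(passes))
    print("median:", statistics.median(passes))
    print("std deviation:", round(statistics.stdev(passes), 2))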

Comments

  • Roots - Tuesday, February 15, 2005 - link

    I predict that for a "normal" WinXP user, the spyware/adware/viruses will bog the performance down enough to make Linux look like the undisputed king of modern gaming. :D
  • lapierrem - Monday, December 13, 2004 - link

    Just wondering why you are doing this from X at all?
    I know pretty much any app that can be run from the command line takes a bit of a hit when run under X.
    I have no idea and maybe I'm missing something, but is it not possible to run from the command line, or is it a driver issue?
    SuSE is not a distro I would ever use; I had it installed, but I didn't really like it.
    I wouldn't mind seeing this done - best-case Linux scenario vs. best-case Windows
    on the same hardware.
  • aaime - Monday, October 25, 2004 - link

    Is there any chance to see a 2D performance comparison? Yes, you read that properly: 2D.
    I ask because the 2D acceleration of some proprietary drivers is ridiculously low, and people should know about it _before_ buying a graphics card.

    For an example of such a comparison, see:
    http://homepage.hispeed.ch/rscheidegger/atilinux_o...
  • mikebabcock - Tuesday, October 19, 2004 - link

    From a different angle than some people suggested, it would be very helpful if you'd use the *same scale* on all graphs on a page. That way we can compare the FPS numbers between tests, instead of having to compute them mentally.

    One graph ends at 80fps, another at 150fps... when the bars end at the same points, how do you easily notice that one test is so much faster than the other?

    This is a standard graphing concept -- use the same scale for all charts for comparisons.
  • yokem55 - Tuesday, October 19, 2004 - link

    Another thing affecting this review is that Doom3 provides its own libstdc++.so and libgcc.so libraries; replacing them with symlinks to the system's own built-in libs results in a decent performance boost. This is probably a bit more noticeable on a distro like Gentoo, which has optimized these libs more specifically for the architecture. The kernel being used can also have a big impact: the vanilla 2.6 kernel has a rather (IMHO) ugly scheduler that allows programs like Doom3 to get run over a bit more than they should, and the ck patchset improves this a good amount. As for how well gcc has optimized the Doom3 code, an interesting comparison would be the Windows version of Doom3 running under Wine. Granted, the overhead of Wine might affect some things, but you would be able to figure out better where exactly the performance losses come into play.
  • nyda - Tuesday, October 19, 2004 - link

    You compared Windows vs Linux performance on a 6800 with the 5950 driver for Linux (only)?

    As far as I'm concerned, that makes this benchmark pretty much useless. The latest driver, as of release of this article supports the 6800 perfectly fine here. Since it doesn't seem to be the case on your system, did you ever consider that there might be other issues with the system?

    Furthermore, the whole point is pretty much moot seeing that this test was performed on "kernel 2.6.8-14-default". A Linux system is NOT meant to be used with a default kernel. You can use one to boot from a live CD, but that's about it. Default kernels are supposed to give you something to start with, not something to live with. Even SuSE ships with automated tools to generate a kernel for your system.

    If you are serious about comparing Linux vs Windows, use a distribution or system that supports your hardware 100% and don't stick with a non-optimized kernel.

    If you were really serious, you would also talk about the various possible optimizations on an open source system (compiling code for your specific system *will* make it *a lot* faster), mention high performance distributions like Gentoo and use an optimized glibc2 which already gives you another 9-11% performance over the one provided with doom3 (which seems to be compiled without any optimization flags for whatever reason).

    You left a lot of Linux potential uncovered and wonder why it turns out to be slower. Hopefully the next article will be more representative. Sadly, this one is just pretty pictures without any meaning. :/

    My own results are similar to poster #18's: on an AMD64 3200+, FX6800, Gentoo (-O3 -march=athlon-xp), it's 25% faster than on WinXP - about 30-35% after replacing the provided crappy glibc with a symlink to the one my Gentoo system normally uses.
  • nongamer1 - Tuesday, October 19, 2004 - link

    I have to second #28.

    This review may have been written by someone who has spent decades in Windows and applied all the latest tweaks and tuning tools to it, yet on the Linux side things probably don't look that bright yet.

    Unfortunately, the article doesn't mention whether it was the 32bit or 64bit SuSE distro that was tested.
    64bit has been reported to make a difference of up to 30% or 40% CPU-performance-wise *sometimes*, so in exchange for much worse Linux graphics driver optimization, why not bring the much worse 64bit CPU "optimization" of Windows into the game? ;-)
    (64bit Linux drivers for Nvidia have been available for a few weeks/months, right?)

    As has been said by #28, there are also enormous speed differences between various distros, e.g. Yoper or Gentoo should be blindingly fast compared to a rather slow "standard" Red Hat, and I'd guess that SuSE isn't totally on the fast side either.

    I'd be willing to bet that with the fastest Linux distro and the fastest currently available kernel (e.g. a Con Kolivas kernel), Linux could actually beat Windows hands down, especially with a rather "slight" 25% performance difference as it stands now, despite less optimized graphics drivers...
    And then let them enable SSE2 in the Linux binary and use gcc 3.4.x for compilation and let them have a well-optimized Nvidia driver etc., and the speed advantage should be very obvious.

    Main point: a bog-standard off-the-shelf distro isn't necessarily what you'd want to use for very fast gaming.
    (but OTOH you perhaps didn't tune Windows in a special way for this review either, so the comparison of currently available "standard" components should have been fair after all, I guess...)
  • solidliq - Monday, October 18, 2004 - link

    Perhaps you guys should learn a little more about Linux before you continue with these comparisons. Don't get me wrong, I think it's _great_ that you're doing Linux comparisons, but a little more knowledge would be helpful.

    First, you're using the 2.6.8 kernel for this test, which is known to be buggy. The major problem: the scheduler. How does this affect game and graphics performance? Greatly. This is why I'm using the 2.6.7 kernel, compiled from source from www.kernel.org.

    Secondly, since the port has been under development for a while, it was probably done with a 2.4.x series kernel. That series of kernel would have been my first choice for testing.

    Third, why have a full desktop environment (KDE) loaded up to play a fullscreen game? You _want_ overhead introduced into the testing?

    Fourth, you should always test multiple distros. As is, this is _not_ a Windows versus Linux comparison; it is a SUSE versus Windows comparison. At a minimum, try SUSE, Mandrake, Fedora, Gentoo, and possibly Slackware (my distro of choice). Each one has its own quirks. Slackware is truly Linux, because the vendor does not modify any of the source; however, it is much more difficult to configure correctly. Gentoo is a great distribution because it is designed to be compiled specifically for the machine it is installed on. Using the straight source packages rather than vendor-modified packages also goes a long way towards creating a fair test.

    Finally, learn to do the configuration from a shell, rather than from a GUI. This is the way it is done by the vast majority of the Linux community. It will allow you to see exactly what's going on, rather than scratching your heads over questions such as, "We're not sure why the reboot was required...". This is Linux, not Windows. If you know what you're doing, you should never have to scratch your heads over these types of questions. If you had done it from a shell, a reboot should not have been necessary. Most likely, the reboot was required because you use a graphical login and the X server had to be restarted, _not_ the whole operating system.

    Otherwise, thanks for finally taking Linux seriously. Just please do the comparisons right. As I'm sure it took some time to learn the ins and outs of Windows, it will also take some time to learn the ins and outs of Linux, as it is a very different operating system. Hiring a Linux guru would give you a good head start.
  • sprockkets - Saturday, October 16, 2004 - link

    I took someone's advice or suggestion here and used Cedega to play Starcraft; it works much better and faster on Linux, though bnet has issues...
