The Test

Below, you can see our test rig configuration. The ATI cards have been removed, since they do not run under our Linux configuration.

Performance Test Configuration

Processor(s): AMD Athlon 64 3800+ (130nm, 2.4GHz, 512KB L2 cache)
RAM: 2 x 512MB Mushkin PC-3200 CL2 (400MHz)
Motherboard: MSI K8T Neo2 (Socket 939)
Memory Timings: Default
Hard Drive(s): Seagate 7200.7 120GB SATA
Video Card(s): GeForce 6800 Ultra 256MB
               GeForce 6800 128MB
               GeForce FX 5950 Ultra 256MB
               GeForce FX 5900 Ultra 128MB
               GeForce FX 5700 Ultra 128MB
               GeForce FX 5600 XT 128MB
Operating System(s): SuSE 9.1 Professional (kernel 2.6.8-14-default)
                     Windows XP SP2
Drivers: NVIDIA 1.0-6111 (Linux)
         NVIDIA Detonator 61.77 (Windows)

Our testing procedure is simple. We take each of our video cards and run the respective timedemos while our AnandTech FrameGetter utility records frame data; we also rely on in-game benchmarks for some of our tests. We post the average frames-per-second score calculated by the utility. Remember, FrameGetter samples the frame rate every second, but it also records how long the demo ran and how many frames were rendered, which is what the overall average is calculated from. This average is posted for most benchmarks, but where we want to illustrate important differences, we also show the second-by-second frame rates.
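As a minimal sketch (not the actual FrameGetter source), the two figures we work with can be derived from per-frame timestamps as shown below; the function and variable names are ours, purely for illustration.

```python
# Illustrative sketch: derive per-second frame counts (for detail graphs)
# and the overall average (total frames / total demo time) from a list of
# frame-completion timestamps. This is not the FrameGetter source code.

def summarize(frame_timestamps):
    """frame_timestamps: seconds at which each frame completed, ascending."""
    total_frames = len(frame_timestamps)
    demo_length = frame_timestamps[-1] - frame_timestamps[0]

    # Frames completed during each whole second of the demo (per-second FPS).
    per_second = {}
    for t in frame_timestamps:
        per_second[int(t)] = per_second.get(int(t), 0) + 1

    average_fps = total_frames / demo_length
    return average_fps, per_second

if __name__ == "__main__":
    stamps = [i * 0.016 for i in range(600)]  # hypothetical ~60fps, ~10s run
    avg, per_sec = summarize(stamps)
    print(f"average: {avg:.1f} fps over {len(per_sec)} one-second buckets")
```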

For Doom3, we do not run the "timedemo" command, only the "playdemo" command. Timedemo plays the demo back as fast as the card can render it rather than at the recorded speed - that's not what we are interested in, since it skews our per-second results in FrameGetter. We also appended a "-" after the demo name to enable pre-caching.
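For illustration only (this is not our actual harness), Doom3 accepts console commands on the command line with a "+" prefix, so a playdemo run can be launched unattended. The binary name and demo name below are assumptions on our part.

```python
# Illustrative only: launch Doom3 and play back a recorded demo without
# typing into the in-game console. Assumes a "doom3" binary on the PATH and
# a recorded demo named "demo1"; adjust both to match your setup.
import subprocess

def run_playdemo(demo="demo1"):
    # "+set"/"+playdemo" arguments are handed to the game console at startup.
    cmd = ["doom3", "+set", "r_fullscreen", "1", "+playdemo", demo]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_playdemo()
```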

Much to our delight, version 0.1.0 of the FrameGetter utility we released last week works correctly with Doom3. On Windows, we are still using FRAPS to record our timedemo information, although we are actively working on porting AnandTech FrameGetter to Windows.

All of our benchmarks are run three times, and the highest score obtained is the one we report; as a general trend, the highest score usually comes from the second or third pass of the demo. Why don't we report median values and standard deviations? For one, I/O bottlenecks from the hard drive and memory still crop up, even though the runs should "theoretically" behave the same every time. Memory hogs like Doom3 and UT2004, which also load a lot of data off the hard drive, are notorious for behaving strangely on the first few passes, even with the pre-caching option enabled.
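The pass-selection logic boils down to something like the sketch below; run_benchmark here is a placeholder for whatever launches the demo and returns its average FPS, not one of our tools.

```python
# Simplified sketch of the pass-selection logic: run the benchmark a few
# times and keep the best average. run_benchmark() is a placeholder.
import random

def best_of(run_benchmark, passes=3):
    scores = [run_benchmark() for _ in range(passes)]
    # The first pass is often depressed by disk and memory warm-up,
    # so the best score usually comes from a later pass.
    return max(scores), scores

if __name__ == "__main__":
    fake_run = lambda: 55 + random.random() * 10  # stand-in benchmark
    best, all_scores = best_of(fake_run)
    print("passes:", [round(s, 1) for s in all_scores], "reported:", round(best, 1))
```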

Comments

  • Guspaz - Thursday, October 14, 2004 - link

    I'm sorry, I snapped. The tone of my post was uncalled for.

    I still believe, however, that further investigation is in order.
  • LittleKing - Thursday, October 14, 2004 - link

    I've said it before and I'll say it again: the rollover images don't work with Firefox. It's a shame Anand won't support it, I guess.

    LK
  • KristopherKubicki - Wednesday, October 13, 2004 - link

    Guspaz: We get different frame rates with 8X and 16X. Even if they produce the same image, I suspect they use different algorithms. I don't know why that would seem unprofessional?

    Kristopher
  • walmartshopper - Wednesday, October 13, 2004 - link

    #20... If it's not doing 16x, how do you explain the performance drop? Just because the images are exactly the same does not mean it's not doing the work. There's a limit to antialiasing, meaning there's a limit to how smooth an edge can be. I think in many cases 8x hits that limit, and 16x is just overkill. But just because there's no difference in this particular screenshot doesn't mean there's never a difference. I'm sure it makes some kind of difference in most cases, even if only a few pixels. Just imagine if 32xAA existed... compare 16x to 32x and I doubt you would ever be able to find any difference. But that doesn't mean the card isn't doing extra work. With 8x and 16x you are starting to get into that territory, where anything above 8x makes little or no visible difference.

    Sorry, but that post gave me a good laugh.
  • sprockkets - Wednesday, October 13, 2004 - link

    Keep in mind that SuSE uses XFree86 in 9.1 and will go to X.org in the next release - or so the FTP server seems to hint.
  • walmartshopper - Wednesday, October 13, 2004 - link

    I'm getting noticeably better performance on Linux over XP. I run 1600x1200 high quality with 4xAA on a 6800GT.

    Linux:
    Slackware 10
    kernel: custom 2.6.8.1
    X: xorg 6.8
    NV driver: 6106
    Desktop: KDE 3.3 (4 desktops, 4 webcam monitors, amsn, gaim, desktop newsfeeds, and a kicker loaded with apps all running while I play)

    xp:
    fresh copy with nothing more than a few games installed
    NV driver: 66.72

    At the same settings, the game feels noticeably smoother on Linux. Thanks to ReiserFS, the loading time is also much faster. Sorry for the lack of benchmarks, but I got rid of the Windows install after my first time playing on Linux. I had problems with the 6111 driver crashing, but 6106 works flawlessly. (Although I can switch drivers in less than a minute without even rebooting.) I can't wait until nvclock supports overclocking on the 6800 series.

    I'm a little disappointed that all the testing was done on SuSE. The beauty of Linux is being able to customize and optimize just about anything. I realize that SuSE is a distro the average Joe is likely to use, but I think you should also include some scores from a simple, fast, optimized distro/kernel such as Slack or Gentoo to show what Linux is really capable of.

    3500+ @ 2.44ghz
    1gb ddr @ 444mhz 2.5-3-3-7
    k8n neo2 platinum
  • Guspaz - Wednesday, October 13, 2004 - link

    I did a difference on the 8x and 16x AA images.

    You're wrong. Your review cards are NOT doing 16xAA. They're doing 8xAA.

    Considering how the 8x and 16x images look identical, that should have been your first hint; once you run a diff and find they ARE identical, I'd expect you to know better.

    Seriously, make a correction, this looks unprofessional.
  • Saist - Wednesday, October 13, 2004 - link

    #15 - My bad for making such generalizations...

    but the fact that you can't even use an ATI card to play D3 doesn't bode well for Linux gaming...

    ****

    Keep in mind, ATi is several months, if not a year or more behind Nvidia in supporting Linux. Give ATi time, and things like this will probably become as obsolete as Win95.
  • Olias - Wednesday, October 13, 2004 - link

    At 800x600 high quality, I get 32fps in XP and 45fps in Linux. Linux is 13fps (40%) faster.

    AMD Athlon XP 3200+
    nForce2 chipset
    5900XT Video Card
    Memory: 2x512MB PC-3200 CL2.5 (400MHz)
    Distro: Gentoo Linux
    Kernel: 2.6.7-r14
    X: xorg-x11-6.7.0-r2
  • ath50 - Wednesday, October 13, 2004 - link

    You list "Timothee Besset, Linuxgamers.com" on the first page as the source of the quote, but it's actually Linuxgames.com; linuxgamers.com is one of those search pages with popups that tries to reset your homepage.

    Just a little thing... I went off to linuxgamers.com first, hehe.
