Doom 3 Performance

Not surprisingly, the NVIDIA cards maintain their place as the performance leaders in Doom 3. We ran this test in High Quality mode, which enables 8x anisotropic filtering; Medium or Low Quality would give more playable results on the cheaper ultra-low-end cards. This test posts the lowest framerates of all the tests that we looked at today; fortunately, Doom 3 also looks better than most other games when compared at the lowest resolutions.
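
For reference, a broadly similar run can be reproduced with Doom 3's built-in timedemo. This is only a sketch rather than the exact demo and configuration behind our numbers, and it assumes the standard retail console cvars. Launching the game with "+set com_allowConsole 1" enables the console, from which (or from an autoexec.cfg in the base folder) the following can be run:

    seta image_anisotropy "8"   // force 8x anisotropic filtering, as in High Quality
    seta com_showFPS "1"        // on-screen framerate counter
    timedemo demo1              // built-in timedemo; reports an average framerate when it finishes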

[Graph: Doom 3 Performance]


At 800x600, there is a 14% difference between the two fastest TurboCache cards that we tested. The cheapest NVIDIA and the ATI X300 HyperMemory cards perform equally well here.
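
As a point of reference, the percentage gap quoted here is simply the relative frame-rate difference; the figures below are hypothetical and only illustrate the calculation, not our measured scores:

    relative difference = (faster - slower) / slower
    e.g. 41 fps vs. 36 fps: (41 - 36) / 36 ≈ 0.14, or about 14%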



X300 HyperMemory performance scales less with resolution than that of the NVIDIA cards, but even at the lowest resolution, we would recommend turning off some of the extras and dropping to a lower quality mode.

Comments

  • A554SS1N - Tuesday, May 17, 2005 - link

    Hello Derek, are you planning to test the AGP 6200 version which uses the NV44A core? It's virtually the same core as the PCI-E TurboCache cards, except that it is a native AGP solution. They are all 64-bit as far as NVIDIA has told me, and although all board makers only provide passive cooling, that's all it needs. I've only seen one brief review on another website so far (of the Inno3D one) that didn't really have enough information. As reviews on AnandTech are very thorough, it would be good to test and compare this card. The core is clocked at 350MHz like the TurboCache cards, and the card has 128MB of 500MHz effective memory.
  • Zoomer - Monday, May 16, 2005 - link

    http://www.anandtech.com/video/showdoc.aspx?i=2413

    Typo?

    " ...the bear minimum in graphics cards supports DX9 level graphics...
    "

    Shouldn't bear be bare?
  • pxc - Friday, May 13, 2005 - link

    The X300 HM results wouldn't be so disappointing if ATI had not been saying that HM cards would be faster than TC cards.

    Derek: 128MB HM doesn't make much of a difference. I had an XPRESS 200M laptop with 128MB dedicated memory plus HyperMemory, and performance was horrible, much much slower than an X300 SE. I wouldn't be surprised if turning off HM made the card faster.
  • PrinceGaz - Friday, May 13, 2005 - link

    #23 Jarred- "Despite the fact that the system was high-end, the performance of the cards is the limiting factor. A 3.0 GHz or 3000+ CPU with the same amount of RAM would likely post similar scores. However, 512MB of RAM would have something of an impact on performance. Anyway, $100 gets you 1GB of RAM these days, so we're more or less through with testing anything with 512MB of RAM or less."

    Whilst I agree about 1GB, and it is not unreasonable for even budget systems to have that amount, $100 certainly won't buy you 1GB of OCZ PC3200 with 2-2-2 timings. Given that the performance of these cards is very dependent on system memory, and a budget system is likely to use budget memory, it is important to have benchmarks done with 2.5-3-3 timings or even 3-3-3.

    I support AT's use of a very fast CPU, etc., in all their usual graphics card reviews, and a very fast GPU for all their CPU reviews, so that performance is dependent on the component being tested rather than the rest of the system. However, with the TurboCache and HyperMemory cards, the system memory effectively becomes an important part of the card, so it is important to test with the sort of budget memory these cards would be used with.

    By using premium memory, AT has probably reduced the importance of the quantity of onboard memory and skewed the relative results of the tested cards. The 64MB TC may have performed significantly better in comparison with the others, had the test system been outfitted with 1GB of budget memory at 3-3-3 timings.
  • kmmatney - Thursday, May 12, 2005 - link

    #28, you can get decent integrated graphics with the ATI XPRESS 200 chipset - however the only motherboard at NewEgg is $102, which is a bit high. I once saw them for around $80.

    http://www.newegg.com/Product/Product.asp?Item=N82...
  • bupkus - Thursday, May 12, 2005 - link

    "...perhaps upcoming integrated graphics solutions from ATI and NVIDIA will be as compelling as these parts show value products can be."

    I have a need for cheap but functional graphical PCs for smaller children, so I'm still waiting for a replacement for the NVIDIA nForce2 IGP. What's the holdup?
  • RadeonGuy - Thursday, May 12, 2005 - link

    Moral of the story is they both suck
  • DerekWilson - Thursday, May 12, 2005 - link

    Our comparison tests will continue to be done on higher-end hardware. Our environments are designed to test the performance of graphics hardware, not to determine the actual performance an end user can expect.

    We run these games on a fresh install of Windows XP SP2. Audio hardware is not installed, and the service is disabled in Windows. Nothing that isn't necessary to get the graphics to the screen is enabled.

    Most people have lots of background programs, services, and audio running. With the way Intel and AMD are approaching dual core, multitasking is poised to become even more pervasive. This will inevitably widen the gap between our tests and real-world performance when we are looking at graphics cards. Our CPU performance tests will continue to grow and include more and more multitasking as we are able to come up with the tests.

    Of course, #25 has a very valid point -- it may well be worth it to run one test on a price-targeted system in a more "typical use" environment for reference purposes. Budget systems cluttered with loads of apps running in the background are no place to build a hardware comparison, but that doesn't mean they are entirely without use.
  • jediknight - Thursday, May 12, 2005 - link

    #17
    It would be useful, however, to include benchmarks with all budget hardware - to see if you really can get acceptable (30fps+) gaming performance with these cards in realistic settings.
  • Wellsoul2 - Thursday, May 12, 2005 - link

    Thanks Derek.

    Seems crazy that there's no DVI since CRTs are going away fast. Even my old budget 9200SE card had DVI.

    Just wondering... I would think with 17-inch LCDs around $175 and 19-inch around $300 (with DVI), that this is the future. What do you all think?
