Quake 4 Performance

This is the only OpenGL title on our list, which is much of the reason we keep using this benchmark. Based on id Software's Doom 3 engine, Quake 4 creates an intense atmosphere through excellent lighting and shadows. We test in Ultra Mode, which enables uncompressed textures and normal maps and thus puts a heavy load on the graphics card's memory subsystem. Our test is a timedemo based on a recording of the first minute of play in the game, with the 1.3 patch applied and SMP support enabled.

[Quake 4 performance graph]

It seems that enabling Ultra Mode in Quake 4 has a huge impact on the 8800 GTS 320MB. Uncompressed data eats up memory very quickly, and this causes problems for the memory-limited card. This time, the score of the 8800 GTS 320MB matches up with the two other lower-memory parts we tested, but it is difficult to tell whether this is a Quake 4 issue or a hardware/driver issue.

[Quake 4 performance graph, 4xAA]
Interestingly, performance on the 8800 GTS 320MB doesn't fall as much moving from no AA to 4xAA as it has in other games. In spite of this, most of the cards are still faster than the new GTS with 4xAA enabled. Resolution scaling with 4xAA is similar across most of the cards. Unfortunately, we were unable to test AA at 2560x1600, as we had some stability issues with Quake 4 at that resolution with Ultra Mode enabled.

55 Comments

  • Marlin1975 - Monday, February 12, 2007 - link

What's up with all the super high resolutions? Most people are running 19-inch LCDs at 1280x1024. How about some comparisons at that resolution?
  • poohbear - Tuesday, February 13, 2007 - link

Have to agree here, most people game at 12x10, especially people looking at this price segment. Once I saw the graphs with only 16x12 and up, I didn't even think the stuff applies to me. CPU influence isn't really a factor except at 1024x768 and below; I've seen plenty of graphs that demonstrate that with Oblivion. A faster CPU didn't show any difference until you tested at 1024x768, and 12x10 didn't show much of a difference between an AMD 3500+ and an FX-60 at that resolution (maybe 5-10fps). Please try to include at least 12x10 for those of us gaming at that rez. :) Thanks for a good review nonetheless. :)
  • DigitalFreak - Monday, February 12, 2007 - link

    WHY didn't you test at 640x480? Waaahhh
  • DerekWilson - Monday, February 12, 2007 - link

Performance at 1280x1024 is pretty easily extrapolated in most cases. It's also CPU limited in more than one of these games.

    The reason we tested at 1600x1200 and up is because that's where you start to see real differences. Yes, there are games that are taxing on cards at 12x10, but both Oblivion and R6:Vegas show no difference in performance in any of our tests.

    12x10 with huge levels of AA could be interesting in some of these cases, but we've also only had the card since late last week. Even though we'd love to test absolutely everything, if we don't narrow down tests we would never get reviews up.
  • aka1nas - Monday, February 12, 2007 - link

I completely understand your position, Derek. However, as this is a mid-range card, wouldn't it make sense not to assume that everyone looking at it will be using a monitor capable of 16x12 or higher? Realistically, people who are willing to drop that much on a display would probably be looking at the GTX (or two of them) rather than this card. The lower widescreen resolutions are pretty reasonable to show now, as those are starting to become more common and affordable, but 16x12 and 19x12 capable displays are still pretty expensive.
  • JarredWalton - Monday, February 12, 2007 - link

1680x1050 displays are about $300, and I think they represent the best fit for a $300 (or less) GPU. 1680x1050 performance is also going to be very close (within 10%) to 1600x1200 results. For some reason, quite a few games run a bit slower in widescreen modes, so the net result is that 1680x1050 is usually within 3-5% of 1600x1200 performance. Lower than that, and I think people are going to be looking at midrange CPUs costing $200 or so, and at that point the CPU is definitely going to limit performance unless you want to crank up AA.
  • maevinj - Monday, February 12, 2007 - link

Exactly. I want to know how this card compares to a 6800 GT as well. I run 1280x1024 on a 19" monitor with a 6800 GT, and I need to know if it's worth spending 300 bucks to upgrade or just waiting.
  • DerekWilson - Monday, February 12, 2007 - link

    The 8800 GTS is much more powerful than the 6800 GT ... but at 12x10, you'll probably be so CPU limited that you won't get as much benefit out of the card as you would like.

This is especially true if you're also running a much slower processor than our X6800. The card's performance will be very limited.

    If DX10 and all the 8 series features are what you want, you're best off waiting. There aren't any games or apps that take any real advantage of these features yet, and NVIDIA will be coming out with DX10 parts suitable for slower systems or people on more of a budget.
  • aka1nas - Monday, February 12, 2007 - link

It would be nice to actually have quantifiable proof of that, though. 1280x1024 is going to be the most common gamer resolution for at least another year or two, until the larger panels come down in price a bit more. I, for one, would like to know if I should bother upgrading to an 8800 GTS from my X1900 XT 512MB, but it's already getting hard to find direct comparisons.
  • Souka - Monday, February 12, 2007 - link

It's faster than a 6800 GT... what else do you want to know?

    :P
