Jedi Knight: Jedi Academy

Released in 2003, Jedi Academy represents the pinnacle of what the Quake 3 engine could offer. With massive levels, dynamic glow, and lightsabers aplenty, it's one of the most punishing Quake 3 engine games ever made, and a good representation of the vast number of games built on this engine in the early 2000's. As our only OpenGL title in this roundup, it's also our gauge for whether ATI's OpenGL performance changed at all over the 3-year period. That said, even with ATI's traditionally poor OpenGL performance, we still had to increase our testing resolution to 1600x1200 in order to put a sizable dent into our test setup; otherwise, we would continuously hit the 100fps frame rate cap.

Jedi Academy

Jedi Academy HQ

As the Quake 3 engine was already 3+ years old at the time of the earliest drivers, it should come as no surprise that there is not much variation to speak of here, either with or without AA/AF. Even so, ATI still managed to work in one significant performance improvement between the Catalyst 3.00 and 3.04 driver sets, with a 10% frame rate increase. The numbers are a bit more mixed with AA/AF enabled, but even here, the peak performance difference is a very noticeable 14%.

Looking at the screen captures, however, we see a very interesting story that the benchmarks do not show, and it's not all performance-related.



Catalyst 3.04 versus 3.00 (mouse over to see 3.00)

The performance improvements that we saw between the 3.00 and 3.04 drivers appear to have been completely free, as there is no difference between the two images. Comparing the 3.06 and 3.09 drivers, however...



Catalyst 3.09 versus 3.06 (mouse over to see 3.06)

Unlike the earlier comparison, there is a very noticeable IQ difference between the 3.06 and 3.09 drivers, but looking at our charts, there is no such difference in performance. This is a prime example of how drivers aren't just about performance improvements, as the IQ difference is the result of ATI fixing a bug with dynamic glow. In drivers prior to 3.09, the JA team had to use a hack to work around a bug in ATI's drivers, causing the inferior image quality seen above. These hacks are not used with drivers 3.09 and later, and as we can see, ATI was able to fix the bug without a performance hit. There was no further change to IQ after the 3.09 drivers.

Overall, however, Jedi Academy shows that other than early improvements and a bug fix, there was little change in performance in this game with the 9700 Pro.


58 Comments


  • timmiser - Wednesday, December 14, 2005 - link

That is why they are so good. It shows that the early drivers are already well optimized and that there is not much improvement over the months/years from driver release to driver release.

Nvidia, on the other hand, will have a driver release (typically around the launch of a competing ATI card) that all of a sudden shows a 25% gain or some ungodly number like that. This shows us that either A) Nvidia didn't do a very good job of optimizing their drivers prior to that big speed increase, or B) held the card back some via the driver so that they could raise the speed upon any threat (new release) by ATI.

    Either way, it reflects poorly on Nvidia.
  • DerekWilson - Monday, December 12, 2005 - link

    lots of people have requested more modern games.

    our goal at the outset was to go back as far as possible with the drivers and select a reasonable set of games to test. most modern games don't run on older drivers, so we didn't consider them.

    for future articles of this nature, we will be including a couple modern games (at the very least, half-life 2 and doom 3). we will handle the driver compatibility issue by starting with the oldest driver that supports the game.

    very new games like FEAR won't be useful because they've only got a driver revision or two under their belt. Battlefield 2 is only about 6 months old and isn't really a suitable candidate either as we can't get a very good look at anything. My opinion is that we need to look at least a year back for our game selection.

    thanks for the feedback. we're listening, and the next article in the series will definitely incorporate some of the suggestions you guys are making.
  • Cygni - Tuesday, December 13, 2005 - link

I can't believe people missed this point. I thought it was pretty obvious in the text of the article. Throwing teh gaem of teh futar at a video card running drivers from 1997 is going to have obvious consequences. That doesn't give you any way to measure driver performance increases over time, whatsoever.
  • nserra - Monday, December 12, 2005 - link

    I agree.

    But I think the best candidate would be the R200 (8500) for testing,
    since everyone said it was a good card (hardware) with bad drivers (software).

So a good retro test is how the R200 would stand up with recent drivers vs. the Nvidia GeForce 3/4 with the older games.
The whole idea is to see if the 8500 could keep up with the GeForce 3/4 if it had good drivers.

Summarizing:
    2002/2003 games | radeon8500 card | 2002/2003 driver
    2002/2003 games | geforce3/4 card | 2002/2003 driver

    2002/2003 games | radeon8500 card | 2005 driver
    2002/2003 games | geforce3/4 card | 2005 driver

    2004/2005 games | radeon8500 card | 2005 driver
    2004/2005 games | geforce3/4 card | 2005 driver
  • JarredWalton - Monday, December 12, 2005 - link

    The problem is that the R200 is no longer acceptable for even moderate gaming. If you have a 9700 Pro, you can still get reasonable performance on almost any modern game. Yes, you'll need to drop to medium and sometimes even low quality settings, but a 9700 Pro is still three or four times (or more) as fast as the best current IGP.

I'm not working on these articles, but personally I have little to no interest in cards that are more than 3 generations old. It might be interesting to study from an academic perspective, but for real-world use there's not much point. If enough people disagree with this, though, I'm sure Ryan could write such an article. :)
  • Hardtarget - Monday, December 12, 2005 - link

Neat article idea, but I would definitely have thrown in one more game, a modern one. Probably Half-Life 2; see how it does on the older hardware in general, and see what sort of driver revisions do for it. Would have been pretty interesting.
  • Avalon - Monday, December 12, 2005 - link

    I think Far Cry, HL2, and Doom 3 ought to be tested. I remember running those games on my 9700 pro. Far Cry and D3 ran just fine at 10x7, and HL2 ran great at 12x9. I'm pretty sure quite a few people were using these cards before upgrading, in these games.
  • WileCoyote - Monday, December 12, 2005 - link

    My conclusion after seeing the numbers: ATI prefers directing money/man-power/time/resources towards synthetic benchmarks rather than improving game performance. I consider THAT cheating.
  • Questar - Monday, December 12, 2005 - link

    Explain the image quality increases then.

Or do you consider Nvidia lowering image quality from generation to generation an improvement?
  • Jedi2155 - Monday, December 12, 2005 - link

    Explain the Halo benchmark then?
