High End GPU Performance

While a few titles based on the Unreal Engine 3 have already made their way onto the scene, the detail and atmosphere of Unreal Tournament 3 really show off what developers can do with this engine if they take the time to understand it. Between Epic's own titles and third-party licensees like BioShock, Unreal Engine 3 has made a solid statement that it is an excellent choice for leading edge graphics.

For this test, we are looking at performance scaling on high end video cards ($300+) across multiple resolutions and on multiple maps. We will absolutely be revisiting this game with midrange hardware and multi-GPU configurations. In this analysis, we focus on the performance of the Suspense Capture the Flag map flyby. This is the most graphically intense flyby we have, and the other two maps we tested exhibited similar relative performance between cards.

With our high end hardware, we've highlighted the 1920x1200 results, as this is the resolution most likely to be paired with one of these parts.

Unreal Tournament 3 Demo Performance

The NVIDIA GeForce 8800 GTX and Ultra both outperform the AMD Radeon HD 2900 XT, which is to be expected: the 2900 XT costs much less. But the performance gap here is not huge, and the 2900 XT gets major points for that. It handily outperforms its direct competition, the 8800 GTS (both 640MB and 320MB perform nearly identically). Not surprisingly, the X1950 XTX bests the 7900 GTX, and both of these parts perform worse than the 8800 GTS cards.

If we look at the scaling graph for Suspense, we can see that the GTS cards remain above 40fps even at 2560x1600. This is quite impressive, especially for the 320MB GTS, but we do have to keep in mind that this is a flyby in a demo version of the game, and we may see performance changes between now and the final release.

Also intriguing is the fact that the high end NVIDIA hardware seems to become CPU limited below 1600x1200. As a result, AMD's Radeon HD 2900 XT actually outperforms the 8800 Ultra at 1280x1024. The 8800 Ultra does scale very well with resolution, while the 7900 GTX drops off quickly and underperforms throughout the test.
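This kind of CPU limitation can be inferred directly from scaling data: if frame rates barely move as the resolution (and thus pixel count) climbs, the GPU isn't the bottleneck. Here is a minimal sketch of that check; the FPS figures below are hypothetical stand-ins shaped like the 8800 Ultra's behavior, not our measured results:

```python
# Classify whether a card looks CPU- or GPU-limited at each resolution step.
# If FPS stays nearly flat as the resolution rises, the CPU (or engine) is
# the bottleneck; a steady decline indicates a GPU limit.

def bottleneck(fps_by_resolution, flat_threshold=0.05):
    """fps_by_resolution: list of (width, height, fps), lowest res first."""
    results = []
    for (w1, h1, f1), (w2, h2, f2) in zip(fps_by_resolution, fps_by_resolution[1:]):
        drop = (f1 - f2) / f1          # fractional FPS lost at the higher res
        limited = "CPU" if drop < flat_threshold else "GPU"
        results.append(((w2, h2), limited))
    return results

# Hypothetical numbers: nearly flat up to 1600x1200, then falling off.
ultra = [(1280, 1024, 120.0), (1600, 1200, 118.0), (1920, 1200, 95.0), (2560, 1600, 62.0)]
for res, kind in bottleneck(ultra):
    print(res, kind)
```

With these made-up numbers, the step up to 1600x1200 costs under 2% of the frame rate (CPU limited), while the steps beyond it cost 20% or more (GPU limited), which is the pattern described above.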

While the rest of this data is very similar to what we've already presented, we did go to the trouble of running the numbers. In order to present a complete picture of what we've seen on the less demanding levels, here is the rest of our data:

Unreal Tournament 3 Demo Performance

Unreal Tournament 3 Demo Performance

Final Words

We're just getting started with our UT3 performance analysis, but there are already some interesting conclusions to draw. Quite possibly the biggest takeaway from this comparison is the dramatic improvement in multi-threaded game development over the past couple of years. Just two years ago, none of our game benchmarks were multi-threaded; today, the latest and greatest from Epic shows huge gains moving from one core to two, and promising improvements moving to four.

Quad-core gaming is still years away from being relevant (much less a requirement), but the industry has come a tremendous distance in an honestly very short period of time. We're more likely to see multi-threaded games these days than 64-bit versions of those titles, thanks largely to the multi-core architectures of the Xbox 360 and PlayStation 3. Like it or not, much of PC game development is being driven by consoles; the sales numbers are simply higher on that side of the fence (even though the games themselves look better on this side).

On the GPU side, NVIDIA of course does quite well with the 8800 lineup, but the real surprise is how competitive AMD is with the Radeon HD 2900 XT. There may be some salvage yet for that architecture, if AMD could only bring out a midrange part that actually offered compelling performance...


72 Comments


  • kmmatney - Wednesday, October 17, 2007 - link

    The benchmarks show that AMD CPUs are not performing as well as they should here. This will hopefully be fixed in the future.

    You sound like someone who has an AMD processor and is bitter...
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    These are the same people who said there is a big difference using Netburst CPUs and K8s. Right, if a Netburst CPU coupled with a 7800GTX got 60 FPS when a K8 got 90 FPS, it was a huge difference to them but now it doesn't seem like it.
  • hubajube - Wednesday, October 17, 2007 - link

    quote:

    The benchmarks show that AMD CPUs are not performing as well as they should here.
    And how should they be performing in your opinion? 100 fps is not good enough for you? How about 500 fps? Is that better?

    quote:

    You sound like someone who has an AMD processor and is bitter...
    I'm definitely not bitter, just realistic. The difference between 90 and 180 fps is totally irrelevant. An Intel E2140 gets over 90fps. Hell, a Sempron with a decent video card could play this game extremely well.

    Benchmarks are great in that you can use them to judge how your system will perform with a game, but they're not the be-all end-all of performance, nor is a CPU that does 100 fps a pile of shit because it doesn't do 105 fps.
  • JarredWalton - Wednesday, October 17, 2007 - link

    The point is that at 1920x1200 we're at a completely GPU-limited resolution (as shown by the fact that the difference between E6550 and X6850 is only 1%). AMD still runs 9% slower, so it seems that architecture, cache, etc. mean that even at GPU-limited resolutions AMD is slower than we would expect. Is it unplayable? No, but we're looking at the top-end AMD CPU (6400+), and in CPU-limited scenarios it's still 10% slower than an E6550.

    It seems to me that we're in a similar situation to what we saw at the end of the NetBurst era: higher clock speeds really aren't bringing much in the way of performance improvements. AMD needs a lot more than just CPU tweaks to close the gap, which is why we're all waiting to see how Phenom compares.
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    That 9% was using 1920x1200. Majority of PC users use a much lower resolution than that. At 1024x768, it's much much higher.

    Think again moron.
  • KAZANI - Wednesday, October 17, 2007 - link

    And most people don't care about framerates higher than their monitor's refresh rate. Both processors were well above 100 frames in 1024*768.
  • hubajube - Wednesday, October 17, 2007 - link

    No moron, 1024x768 on an 8800GTX is NOT what "most PC users" are going to be using. The video cards that "most PC users" will be using were not tested in this benchmark. YOU need to actually THINK next time.
  • clairvoyant129 - Wednesday, October 17, 2007 - link

    Where did I say majority of PC users with an 8800GTX use 1024x768? What's your idea of testing CPUs? Benchmark them by using GPU limited resolutions? What a joke. You people never complained when Anand compared Netburst CPUs to K8s at 1024x768 or lower resolutions.

    Don't get your panties twisted AMD fanny.
  • IKeelU - Wednesday, October 17, 2007 - link

    Ummm...how do you launch the flybys used in this analysis?
  • customcoms - Wednesday, October 17, 2007 - link

    You mention that you cranked the resolution to 1920x1200, but the charts still say 1024x768...the results look like those at 1920x1200 though, so I'm guessing it's a typo. GPU bound CPU Comparison charts here: http://anandtech.com/video/showdoc.aspx?i=3127&...
