How does CPU Speed Impact Graphics Performance?

For the most part, we ultimately make our purchasing decisions based on price. If we have $200 to spend on a processor, it doesn't matter how fast an Extreme Edition runs our apps and games unless it sells for $200. The price point determines what our options are, and then we look at the best performer at that price point to make our final decision. But when upgrading an existing system, it's sometimes difficult to know which components are worth replacing - especially the CPU.

If you have a 2.4GHz Pentium 4, is it worth upgrading to a 3.4GHz P4 in order to get greater performance in Doom 3? Or is your 2.4GHz P4 paired up just fine with an NVIDIA 6800GT? This next set of graphs is designed to help you see the type of CPU scaling you can expect out of a high-end card like the GeForce 6800 Ultra, or a slower card like the Radeon 9800 Pro.

The graphs below plot frame rate vs. clock speed, taken using a Pentium 4 C and varying the clock speed on a single platform. Once again, it's the shape of the curve that you want to look at: the steeper the slope, the more benefit you'll get from having a faster CPU; the flatter the curve, the less benefit a faster CPU will bring.
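
To make the idea of "slope" concrete, here's a minimal sketch (not from the article; the sample numbers are purely hypothetical) of how CPU dependence could be quantified from a handful of clock-speed/frame-rate pairs - fit a line and compare slopes in fps per GHz:

    # Minimal sketch: quantify how CPU-bound a card/resolution combo is by the
    # least-squares slope of its fps-vs-clock curve (fps gained per extra GHz).
    def fps_per_ghz(samples):
        n = len(samples)
        sx = sum(ghz for ghz, _ in samples)
        sy = sum(fps for _, fps in samples)
        sxx = sum(ghz * ghz for ghz, _ in samples)
        sxy = sum(ghz * fps for ghz, fps in samples)
        return (n * sxy - sx * sy) / (n * sxx - sx * sx)

    # Hypothetical (GHz, fps) samples, purely for illustration:
    cpu_limited = [(2.4, 85.0), (2.8, 95.0), (3.2, 103.0)]  # steep curve
    gpu_limited = [(2.4, 60.0), (2.8, 64.0), (3.2, 67.0)]   # flat curve
    print(fps_per_ghz(cpu_limited))  # ~22.5 fps per GHz - a faster CPU pays off
    print(fps_per_ghz(gpu_limited))  # ~8.8 fps per GHz - the CPU matters much less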

Although we're only showing two cards here, you can extrapolate the performance of faster and/or slower cards pretty easily. Using our Doom 3 Graphics Guide, you can see which cards are faster or slower than the two we're representing here; then just remember that a faster card will have a steeper (more CPU dependent) curve, while a slower card will have a flatter (less CPU dependent) curve.

Also keep in mind that the scaling will be relatively similar for both AMD and Intel platforms; we chose to stick with only a single platform here in the interest of time as well as keeping these pages simple.

First up, we have the 6800 Ultra at 800x600:

Here we have a decently steep curve, probably the steepest it will get, since we're dealing with one of the fastest GPUs at a relatively low resolution. The move from a 2.4GHz to a 3.2GHz processor resulted in a 21% increase in performance. Considering that this came from a 33% increase in clock speed, it's safe to say that at lower resolutions, the more money you put into a faster CPU, the higher your Doom 3 performance will be on a 6800 Ultra.
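
As a quick sanity check of those numbers (the percentages are the ones quoted above; the arithmetic is ours), the scaling efficiency works out to roughly two-thirds:

    # Rough scaling check using the figures quoted above
    clock_gain = (3.2 - 2.4) / 2.4  # ~0.33, i.e. a 33% clock-speed increase
    fps_gain = 0.21                 # the observed 21% performance increase
    print(fps_gain / clock_gain)    # ~0.63: about 2/3 of the extra clock shows up as frames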

At higher resolutions, the burden shifts to the GPU, as is evident from the change in the slope of the curve. Now we have a distinctly flatter curve, with only a 13% difference between the fastest and the slowest CPUs - not insignificant, but definitely not huge.

Looking at the 9800 Pro at 800x600, we see a curve that closely resembles the 6800 Ultra's curve at 1280x1024, once again with a 13% gain from the 2.4GHz processor to the 3.2GHz part.

Although the rest of our CPU tests use the 6800 Ultra, the standings and degrees of performance improvement will apply to other graphics cards as well. As we've just seen, at 1280x1024 the GeForce 6800 Ultra scales much like a Radeon 9800 Pro at 800x600 - keep that in mind as we compare CPUs under id's latest and most impressive 3D game to date.

Comments

  • kabob983 - Thursday, September 9, 2004 - link

    So where does the A64 3200+ fit on this scale anyways...above or below the P4 3.4EE?
  • T0918273645 - Monday, September 6, 2004 - link

    this statement from their review is partially wrong.


    "Doom 3 sees system memory as one big cache and drives performance up considerably. It is also the on-die memory controller that makes cache size less of an issue on the Athlon 64, while too small of a cache seems to make or break performance with the Pentium 4."

    notice the 3.2 E with its 1mb cache gets 72.8 while the 3.2 C with 512KB cache gets 68.7. the only difference between these chips is the cache, which results in a difference of 4.1 FPS.

    look at the FX53, with dual channel, and 1mb cache, it gets 103.4

    and the 3800+ with 2.4ghz, 512kb cache, and dual channel gets 99.8

    a difference of 3.6 FPS. So the larger cache does help. I think maybe AnandTech mixed up the fact that with AMD they were comparing the difference between dual channel and cache, while with Intel they were comparing only the difference between caches.

    i think looking at the AMD charts you can see how close AMD's tradeoff of dual channel for the loss of half the cache actually is, at least as far as Doom 3 is concerned.

    The 3400+ (no dual channel, 1mb cache, and 2.2ghz) performs at 80.4 FPS

    The 3500+ (dual channel, 512KB cache, 2.2ghz) turns in 79.5 FPS. So the dual channel/cache tradeoff is so close you'd never be able to tell the difference.

    So basically they mixed up the fact that the smaller cache size of the 3500+ compared to the 3400+ is masked by the dual channel ram, which nearly completely makes up for the loss in cache.

    I'm surprised they overlooked that fact.

    Or maybe they thought that a difference of 4.1 is more of an issue than a difference of 3.6, but really those differences are nearly the same.
  • manson909 - Monday, August 9, 2004 - link

    i can't believe no one has thought of this: since Doom 3 is capped at 60FPS, why not compare the MINIMUM framerates the CPUs encounter when playing? a system that has an average of 60FPS but is locked in at this speed (no dips) is far superior to a system that has an average of 70FPS but with lows in the 20s & highs in the 100s imo... anyone?
  • HermosaBeach - Friday, August 6, 2004 - link

    Short version:

    I would like to see the final graph comparing CPUs using 1600x1200, Highest Quality, 4xAA (Anti-Aliasing) and 16xAF (Anisotropic Filtering)

    With this graph we may actually see a flattening of the frames per second where an increase in the CPU has no real impact on frames per second. This is where the GPU (GeForce 6800 Ultra) hits the wall and becomes the limiting factor.

    Dave
  • HermosaBeach - Friday, August 6, 2004 - link

    Dear 53, I disagree. Here's another way to look at the problem or question. Let's say I own an ATI X800 XT PE. I would like to know what computer (CPU/MB) to get. I want to play games at 1600x1200 with 6xAA and 16xAF. Starting from the slowest computer, like the AMD XP 2000+, as you improve the CPU and climb the performance ladder we expect the frames per second to increase. But there may come a time where an increase in CPU does not appreciably increase the frame rate. At this CPU level, getting a faster CPU is not really going to help you. I was hoping to find this magical CPU point. Early 2003, with my ATI 9800 Pro, the sweet spot was the AMD XP 1800+. A faster CPU improved frames per second, but not by much, and was not worth the significant increase in cost. For example, with FarCry and Doom 3 the AMD 64 3000+ might be the sweet spot, where getting the AMD 64 3700+ really does not significantly increase your frame rate. If this is true, I would get the AMD 64 3000+. If the frames per second continue to significantly improve as you increase the CPU, then I would get the top-end CPU. Sadly, at 1280x1024 without 4xAA and 16xAF you really can't tell.

    Dave
  • skiboysteve - Friday, August 6, 2004 - link

    Dave, you're retarded. Try reading the review, esp. the first page.
  • HermosaBeach - Thursday, August 5, 2004 - link

    This article should have used 1280x1024 and 1600x1200. The 800x600 resolution was a waste of time and reading. Who's going to purchase any AMD 64, or any ATI X800 series or NVIDIA 6800 series card, and play at such pathetic resolutions? I found the article a waste of time, and it certainly did NOT answer the question - what CPU to get. I own an ATI 9800 Pro, and this certainly did not help with what CPU to get when I purchase my next graphics card - sigh.

    Dave
  • xxxfubar187xxx - Thursday, August 5, 2004 - link

    Thanks for the benchmarks guys! You do great work over there. Why wasn't the Athlon64 3700+ included in these benchmarks? Being the top of the line Socket 754 processor I figured it would be included in the tests.