Who Cares about Clock Speeds?

So far we've established that UT3 likes large caches and sees a huge benefit from two cores (and a minor improvement from four), but what about raw clock speed? We took an unlocked Intel Core 2 Duo processor and ran it in 333MHz increments from 2.0GHz up to 3.33GHz, plotting performance vs. frequency on the chart below for all three flybys:

At 1024 x 768, a reasonably CPU-bound resolution, the curve isn't as steep as you'd expect. Across a 66.5% increase in clock frequency, overall performance goes up by less than 28%. Factors like L2 cache size and microprocessor architecture appear to matter more here than raw clock speed.
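The gap between the clock increase and the performance gain can be expressed as a scaling efficiency. A minimal sketch of that calculation, using the clock speeds quoted above; the frame rates are hypothetical, chosen only to illustrate the math:

```python
def scaling_efficiency(base_clock_ghz, new_clock_ghz, base_fps, new_fps):
    """Fraction of the clock-speed increase realized as extra performance."""
    clock_gain = new_clock_ghz / base_clock_ghz - 1.0  # 3.33/2.0 - 1 = 0.665 (66.5%)
    perf_gain = new_fps / base_fps - 1.0               # e.g. 0.28 (28%)
    return perf_gain / clock_gain

# Hypothetical flyby frame rates at 1024 x 768
eff = scaling_efficiency(2.0, 3.33, base_fps=100.0, new_fps=128.0)
print(f"{eff:.0%}")  # roughly 42%
```

A workload that was purely clock-bound would land near 100% here; UT3's roughly 42% suggests the CPU is spending much of its time waiting on something the clock speed doesn't help with.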


72 Comments


  • Ryan Smith - Wednesday, October 17, 2007 - link

    See the post right above yours. ;-)
  • CRimer76 - Wednesday, October 17, 2007 - link

    Isn't this game supposed to use the Ageia crap to improve performance? Would love to see some benchies on that.
  • Ryan Smith - Wednesday, October 17, 2007 - link

    Yes, it can use the PhysX hardware to improve performance. However, flybys are completely useless for testing PhysX, because there is no physics work going on. Since I cover the PPU side of things, once we have the ability to play with demos we'll have an article up.
  • MadBoris - Wednesday, October 17, 2007 - link

    Great work showing all the different gradients on vid card, cache, cpu, threading.
    The game really scales impressively on hardware.
    How did Sweeney manage to make the 2900xt shine? hehe

    I've got some small reservations about using the flybys; hopefully you guys will have a demo run for next time around. Theoretically, a demo loop should be pretty similar, mainly more CPU intensive, but I'm curious to see.

    Good work showing off the game's scaling.
  • Sunrise089 - Wednesday, October 17, 2007 - link

    Not only does the X1950XT absolutely destroy the 7900GTX (the ATI card REALLY ended up being the better long-term buy), but the HD 2900XT looks absolutely great here. If this sort of performance is indicative of the future (and it may not be - perhaps there is something that will favor nVidia when the release copy of the game arrives with better graphics) then ATI looks much better than it did even last month.

    PLEASE do a follow up with this game at launch/when the new midrange GPUs launch. It's going to be very interesting to see the price/performance between the new 8800GT, an overclocked HD 2900pro, and the new ATI midrange card (HD 2950pro?)
  • tmx220 - Wednesday, October 17, 2007 - link

    They used an X1950XTX, though it would be about the same.
    They should have used the XT because it can still be found, and for a decent price.
  • Darth Farter - Wednesday, October 17, 2007 - link

    Anand,
    it's nice that the last two game reviews have 1024x768 CPU comparisons, but I think we're seeing too many pages of 1024x768, a resolution that probably only 2% of the guys planning on running these games will actually use.

    If the intention is to show CPU limitations, I'd suggest at least showing the scaling at the common resolutions (1280x1024, 1680x1050/1600x1200, and 1920x1200) to reflect what we as users will actually see on screen when swapping/upgrading a CPU/platform to another model/platform, while planning on playing at 1920x1200 with our obviously high-end GFX card.

    An overall CPU comparison at only 1024x768 left me severely disappointed, though I can understand the time constraints. This regrettably reminds me of Tom's with their beside-the-point tactics in articles.

    Just my .02 for making Anand's a bit better.

    Ty
  • IKeelU - Wednesday, October 17, 2007 - link

    I'd really like to see 1280x720. I usually play on my 720p TV and I imagine that resolution would be CPU-bound as well, though probably less so than 1024x768.
  • Roy2001 - Wednesday, October 17, 2007 - link

    I am more interested to see Phenom data.
  • hubajube - Wednesday, October 17, 2007 - link

    I'm officially done with Anandtech's conclusions on benchmarks. Why are you making these blanket "AMD is not competitive" statements? A 9% difference in fps performance and now AMD is shit?! Not to mention that whopping 9% difference only works out to roughly 5 whole frames per second! ROFLMAO!!!!!

    Brad: Hey Johnny! Did you see the new UT3 review from Anandtech? They showed the Intel CPU's KILLING AMD by five frames per second!

    Johnny: Holy shit Brad!!!!! AMD is going out of business with piss poor performance like that. I'll NEVER buy AMD again.
