3D Rendering Performance & Power Usage

3D Rendering Performance - 3dsmax 8  

Looking at 3D rendering performance, Intel's Core 2 Duo still comes out on top, but our focus this time around is on power consumption, so let's have a look at that.

3D Rendering Power Usage - 3dsmax 8  

There's a noticeable reduction in total system power consumption with the move to 65nm, but AMD's EE/EE SFF and Intel's Core 2 processors all draw less power than the new 5000+. 

3D Rendering Performance per Watt - 3dsmax 8  

Looking at efficiency, however, Brisbane is the best AMD has to offer.  It is still nowhere near the performance per watt you can get with Intel these days, but it's a step in the right direction.  If AMD's updated micro-architecture can narrow the performance gap next year, we may see some competition in the performance and performance per watt space once again.
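As a rough illustration of how a figure like this is derived, performance per watt is simply a performance rate divided by average system power under load. The sketch below is a minimal example, not the exact methodology used for these charts; the render time and Cinebench-style score in it are made-up values (only the 195.1W figure comes from this article).

```python
# Minimal sketch of a performance-per-watt calculation for render benchmarks.
# The render time and score below are hypothetical example values, not
# measurements from this review.

def perf_per_watt_from_time(render_seconds: float, avg_power_watts: float) -> float:
    """Time-based test (e.g. a 3dsmax render): performance = renders per hour."""
    renders_per_hour = 3600.0 / render_seconds
    return renders_per_hour / avg_power_watts

def perf_per_watt_from_score(score: float, avg_power_watts: float) -> float:
    """Score-based test (e.g. Cinebench): higher score is better."""
    return score / avg_power_watts

# Hypothetical usage
print(perf_per_watt_from_time(render_seconds=300.0, avg_power_watts=190.0))  # ~0.063 renders/hour/W
print(perf_per_watt_from_score(score=580.0, avg_power_watts=195.1))          # ~2.97 points/W
```

Either way, a chip that is a little slower but draws much less power can still come out ahead on this metric, which is why Brisbane improves AMD's standing here even without a performance boost.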

3D Rendering Performance - Cinebench 9.5  

The performance under Cinebench is far closer between the E6600 and the X2 5000+, with the slight nod going to the Core 2 CPU. 

3D Rendering Power Usage - Cinebench 9.5  

Power consumption is also relatively close between the two CPUs, with Intel once again coming in a bit lower at 195.1W.  The move from 90nm to 65nm shaves off about 15W of total system power consumption, which isn't bad given that there's no change in processor pricing. 

3D Rendering Performance per Watt - Cinebench 9.5  

Performance per watt is close between Intel and AMD, closer than in any of our other tests, but Intel ends up with the overall win.  Looking just at AMD CPUs, the Brisbane core continues to offer better performance per watt than even the most efficient 90nm X2s AMD had previously offered. 

Comments

  • dev0lution - Friday, December 15, 2006 - link

    Now that both companies claim to offer "platforms" it'd be interesting to see an Intel 965 board vs. an ATI board being used in these benches. Not sure that the NVIDIA models were the best choice here, especially since they're not apples to apples on generation and power consumption.
  • poohbear - Thursday, December 14, 2006 - link

    im quite shocked to see the gaming performance advantage @ 1600x1200!!!! isn't the cpu removed as a performance bottleneck @ such a high resolution? it's all about the gpu horsepower @ that resolution no? can't believe a cpu can show a 25fps diff @ 1600x1200?!?!?
  • JumpingJack - Friday, December 15, 2006 - link

    quote:

    im quite shocked to see the gaming performance advantage @ 1600x1200!!!! isn't the cpu removed as a performance bottleneck @ such a high resolution? it's all about the gpu horsepower @ that resolution no? can't believe a cpu can show a 25fps diff @ 1600x1200?!?!?


    Some more research here is in order. The G80 GPU, commonly known as the 8800 GTX or GTS on the market, has taken graphics performance to a new level. All of the reviews that used an AMD FX-60 or FX-62 CPU clearly showed the card being held back by the CPU in many if not most cases; only at the highest possible resolutions with AA + FSAA did GPU-limited scaling return with an FX-60. The X6800 unleashed the full potential of this card.

    The difference in framerate you see between the 5000+ and the E6600 is that the E6600 has pushed the bottleneck further ahead -- simply because the E6600 is a better gaming CPU.

    Tom's did a good article titled "The 8800 Needs the Fastest CPU."
    In essence, even for a single-card solution, an AMD CPU is not a good match for this card.
  • Makaveli - Friday, December 15, 2006 - link

    The reason for the 25fps difference must be that the GeForce is more CPU-bottlenecked on the AMD platform than on the Intel one.

    You gotta remember the 8800GTX is an insanely fast card, and it's still CPU-bottlenecked even on Conroe systems.
  • JarredWalton - Friday, December 15, 2006 - link

    This is why back at the Core 2 Duo launch we talked about CPU performance using games running at 1280x1024 0xAA. When a faster GPU comes out and shifts the bottleneck to the CPU (as has happened with the 8800 GTX), saying "Athlon X2 and Core 2 Duo are equal when it comes to gaming" is a complete fabrication. It's not unreasonable to think that there will be other games where that ~25% performance difference means you can enable high quality mode. Company of Heroes is another good example of a game that can chug on X2 chips at times, even with high-end GPUs.
  • XMan - Thursday, December 14, 2006 - link

    Do you think you might have gotten a higher overclock if you weren't running HTT at 1125MHz?!?
  • Sunrise089 - Thursday, December 14, 2006 - link

    The chart on page one claims to have pricing info. There is none as far as I can see.
  • JarredWalton - Friday, December 15, 2006 - link

    Fixed. :)
  • RichUK - Thursday, December 14, 2006 - link

    Same old sh!t. It's nice to see AMD finally kicking off their 65nm retail chips, but let's see this new core, for God's sake.

    They're lacking big time; this really is sad. **Thumbs-Down**
  • Stereodude - Thursday, December 14, 2006 - link

    How does a chip that uses less power run hotter? On the last page the 65nm X2 5000+ hit 51C under load, lower than any other chip, but it uses more power than any chip except the 90nm X2 5000+. How does that work?
