3D Rendering Performance using 3dsmax 7 & CineBench 9.5

We're looking at 3D rendering performance using two different applications: 3D Studio Max and Cinebench 9.5. Cinebench is a free performance testing utility based on the CINEMA 4D R8 rendering package. Our 3D Studio Max result is a composite of four rendering tests: CBalls2, SinglePipe2, UnderWater, and 3dsmax5 Rays.
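
To illustrate how a composite of this sort can be computed, here is a minimal Python sketch. The exact weighting AnandTech uses isn't published, so the equal-weight normalization below is an assumption, and the render times are hypothetical values chosen purely for illustration:

    # Hypothetical render times (seconds) for the four 3dsmax tests;
    # lower is better. These numbers are illustrative, not measured results.
    baseline = {"CBalls2": 100.0, "SinglePipe2": 120.0,
                "UnderWater": 150.0, "3dsmax5 Rays": 90.0}
    test_rig = {"CBalls2": 80.0, "SinglePipe2": 100.0,
                "UnderWater": 130.0, "3dsmax5 Rays": 75.0}

    # Normalize each test against the baseline (baseline / test_rig, so
    # values above 1.0 mean the test rig is faster), then average the
    # four ratios with equal weight to form a single composite score.
    ratios = [baseline[t] / test_rig[t] for t in baseline]
    composite = sum(ratios) / len(ratios)
    print(f"Composite score: {composite:.2f}x baseline")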

[Charts: General Performance - 3D Rendering]

3D rendering relies almost entirely on CPU performance, and cache sizes have very little impact. The end result is that our overclocked E6300 and E6400 place very near the top of the charts, and the overclocked E6400 actually manages to take the lead over the X6800 in the Cinebench multi-CPU rendering test. Clock for clock, Core 2 Duo holds about a 9-11% performance advantage in 3D rendering over the AMD X2 processors. The difference between the fastest and slowest systems tested here is roughly 60-70%, and due to the time-consuming nature of 3D rendering, even small performance increases are very welcome.
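
To make those percentages concrete, here is a quick sketch of the arithmetic; the job lengths and render times below are hypothetical, chosen only to show what a roughly 10% clock-for-clock advantage and a roughly 65% fastest-to-slowest spread mean in wall-clock terms:

    # A 10% clock-for-clock advantage applied to a long render job.
    job_hours = 10.0                    # hypothetical render job length
    faster_hours = job_hours / 1.10     # same job on a 10% faster CPU
    print(f"Time saved: {job_hours - faster_hours:.2f} hours")  # ~0.91

    # A ~65% gap between the fastest and slowest systems tested.
    slow_minutes = 100.0                # hypothetical slowest system
    fast_minutes = slow_minutes / 1.65  # fastest is ~65% faster
    print(f"Fastest: {fast_minutes:.1f} min vs. slowest: {slow_minutes:.0f} min")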

Once again the Core 2 Duo E6300 is slightly faster than the Athlon 64 X2 4200+, and once overclocked it's out of reach of even an FX-62. The E6400 is also an impressive little chip, offering performance at around the X2 4600+ and X2 5000+ level.

137 Comments

  • Gary Key - Wednesday, July 26, 2006 - link

    quote:

    ...that indeed you all get weekly checks from Intel for the favorable press.


    Damn, Intel must have lost my address. ;-)
  • coldpower27 - Wednesday, July 26, 2006 - link

    This is just so sad, how far AMD fanboys will go. I really wish moderation were allowed here; the user point system is hardly effective enough.
  • JarredWalton - Wednesday, July 26, 2006 - link

    I'm pretty sure that was sarcasm, Coldpower. LOL :)
  • coldpower27 - Wednesday, July 26, 2006 - link

    Well, I guess my bad, though without a /sarcasm tag it's hard to tell. This isn't real life, where you can hear the tone of people's voices. :P
  • Sunrise089 - Wednesday, July 26, 2006 - link

    You were unable to tell "the Magic Money Fairy" was sarcasm?

    Why not just come clean and admit that you didn't read carefully?
  • coldpower27 - Wednesday, July 26, 2006 - link

    I read plenty carefully, thanks.
  • Sunrise089 - Thursday, July 27, 2006 - link

    Seriously?

    You were seriously unable to tell the following was sarcasm:

    "...you all get weekly checks from Intel"
    "...most Intel processors really don't even work at all"
    "...Intel pays off the companies to say they're Intel Inside"
    and of course
    "the Magic Money Fairy."?

    Dude, it's understandable that you were reading fast and thought the post was fanboy-ism, which there is indeed a lot of. Refusing to admit that and stating instead that the original (actually quite funny) post wasn't clear is insulting to that poster and frankly somewhat alarming.
  • coldpower27 - Thursday, July 27, 2006 - link


    You're taking this way too seriously. If I can't recognize sarcasm without smiley faces or a tag, that's acceptable considering this is written language. I rely on the tone of the conversation, which is absent here.

    There is nothing to admit. There continues to be a lot of AMD fanboyism at this site, and even reading carefully, it sometimes isn't a simple task to separate the sarcasm from the rest of the fanboy drivel.

    I am not refusing to admit anything; the poster should have considered before posting that not everyone would catch the sarcasm, and I already said my bad in the post above. You may think the established indicators are sufficient; for me, they are not.

    Not everyone can perceive exactly the same things you can, all right, and assuming otherwise is ridiculous in itself.
  • lewisc - Thursday, July 27, 2006 - link

    lol - it was a bit of a knee-jerk response. I thought the same thing until I read all the replies before it, and then realised that it was indeed (hopefully!) sarcasm. You can't blame coldpower, with the amount of rubbish being spouted by some users about how 'biased' this site is.
  • VooDooAddict - Wednesday, July 26, 2006 - link

    Nice to see you get your own article again from time to time; it reminds me of when I started visiting AT.

    The following deals with the gaming performance question most people are asking. I understand that the true focus of the article is 2MB cache vs. 4MB cache and the overclocking impact. The E6600 is still at the top of my list for a powerful new SFF thanks to the article. Regardless, the article has prompted the following:

    (Maybe the following can be highlighted in a Budget/Midrange Gaming system buying guide...)

    I think it's undeniably clear that the Core 2 Duo chips offer the most headroom for future GPUs, and they should therefore be at the top of most gamers’ lists if they can afford it.

    What I think some people may still find important is that with any of these amazing CPUs ... gaming is still GPU limited. It begs for a comparison of the E6300/E6400, the X2 3800+/4200+, and the 3500+/3800+ single cores, with quality lower-cost boards and single video cards like the 7950 GX2 and X1900 XT. Do the lower-end CPUs really limit gaming with a single-card solution? I think the 7950 would also give a good showing of whether the new high-end dual cores are really needed for SLI, or whether the new "low end" (which used to be the high end) is enough to keep that 7950 going. Most gamers I find at LANs still only run 1280x1024 without massive AA/AF, simply due to the popularity of the cheap 17" and 19" sub-12ms LCDs. I also find that a large number of gamers (who enjoy gaming but don't really spend much time enthusiastic about the hardware) never turn on AA/AF.

    I'm not saying you didn't state that most games are still GPU bound. You clearly tell gamers in the article that it's probably best to buy an E6300 with a high-end video card rather than an E6600 with a midrange video card. I just think that it needs to be shown.
