Gaming Performance using F.E.A.R. & Rise of Legends

Our F.E.A.R. test should be fairly familiar by now, as it is the built-in performance test included with the game. Computer settings were left at "Maximum" while the graphics settings were set to "High", with the resolution cranked up to 1600 x 1200. Even with a pair of X1900 XTs at its disposal, F.E.A.R. remains more GPU than CPU bound at these settings, but we do see some separation among the processors:

Gaming Performance - F.E.A.R. v1.03

The top three spots still go to the top three Core 2 CPUs, with the E6300 falling around the level of the X2 4600+. A trend that we've been seeing throughout this review is that the performance of these CPUs effectively falls into three groups: Core 2 processors at the top, Athlon 64 X2s in the middle, and Pentium D at the very bottom of the charts. In a sense that's the easiest way to classify these three groups of processors: if you want the fastest it's Core 2, mid-range goes to the Athlon 64 X2, and if you don't like good performance there's always the Pentium D.

Rise of Legends is a newcomer to our game benchmark suite, and what an excellent addition it is. This Real Time Strategy game looks very good and plays well too; it serves as good filler for those looking for an RTS fix until the next Command & Conquer title eventually arrives. We ran at 1600 x 1200 with the graphics settings at their medium defaults. We recorded a custom 3 vs. 2 multiplayer battle, played it back at 4x speed, and recorded the average frame rate over 10 minutes of the battle. The 10 minutes we focused on contained a good mix of light skirmishes between opponents, base/resource management with very few characters on screen, and of course some very large scale battles.
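For the curious, deriving an average frame rate from a recorded playback boils down to counting frames rendered inside the measurement window. The sketch below is purely illustrative (invented function names and numbers, not our actual benchmarking harness):

```python
# Hypothetical sketch: average FPS over a time window, given a list of
# per-frame render times in milliseconds. Not AnandTech's actual tooling.

def average_fps(frame_times_ms, window_start_ms, window_end_ms):
    """Average frames per second over [window_start_ms, window_end_ms)."""
    elapsed = 0.0   # total render time of frames inside the window
    frames = 0      # number of frames that started inside the window
    t = 0.0         # running timestamp at the start of each frame
    for ft in frame_times_ms:
        if window_start_ms <= t < window_end_ms:
            elapsed += ft
            frames += 1
        t += ft
    return 1000.0 * frames / elapsed if elapsed else 0.0

# e.g. a 10-minute (600,000 ms) window of a playback where every frame
# takes 16 ms; only the frames inside the window count toward the average:
print(average_fps([16.0] * 50000, 0, 600_000))  # → 62.5
```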

Gaming Performance - Rise of Legends v1.0

As with most RTSes, Rise of Legends is extremely CPU bound. The performance variability between runs was fairly high in this test, mainly because of how disk intensive the playback can get. Differences in performance of up to 5% should be ignored, but the standings are correct - the Core 2 line of processors absolutely demolishes the competition: you're looking at true next-generation CPU performance here. The E6300 isn't nearly as impressive when compared to its more expensive siblings, but when you compare it to AMD's lineup it looks very good, especially considering its proposed cost.
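The "ignore differences under 5%" rule can be made concrete as a simple relative-gap check (a minimal sketch with made-up numbers, not scores from this review):

```python
# Sketch of a run-to-run noise filter for benchmark scores.
# The 5% threshold matches the variability observed in this test;
# the FPS values below are invented for illustration.

NOISE_THRESHOLD = 0.05

def meaningful_difference(fps_a, fps_b, threshold=NOISE_THRESHOLD):
    """True only if the relative gap exceeds the noise threshold."""
    return abs(fps_a - fps_b) / min(fps_a, fps_b) > threshold

print(meaningful_difference(100.0, 103.0))  # → False: within run-to-run noise
print(meaningful_difference(100.0, 120.0))  # → True: a real standings gap
```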


202 Comments


  • crystal clear - Saturday, July 15, 2006 - link

    Just to remind all that "Intel decides to release a B2 stepping of its Conroe
    processors." Also,
    BEWARE of engineering samples & reviews based on engineering samples, this one included.
  • OcHungry - Saturday, July 15, 2006 - link

    I don’t know if this question has been asked or answered (sorry, no time to read 120 posts),
    but Mr. Anand, can I kindly raise a couple of concerns:
    1) Why don’t we see any reference to temperatures under load? Temperature is a crucial factor in deciding whether or not I buy Conroe, since I live in a very hot climate.
    2) A lot of people make reference to 64-bit and Windows Vista, claiming that Conroe is a 32-bit architecture and will not perform as well in 64-bit. Is it true? Can we have some 64-bit benchmarks? It would be great to do this test in multitasking.
    3) I have also noticed (from so many who have had pre-release engineering samples at XS and other forums) that overclocking Conroe does not translate directly into performance, unlike the A64.
    What I mean is: if an A64 is overclocked 20%, the performance increases ~20% (more or less, in most cases), but I have not seen this hold with Conroe. So I am wondering what would happen if we put a low-end Conroe, such as the E6400, against an A64 X2 4000+ with 2x1MB cache (same price range after the price drop), overclocked them both to their limits using stock cooling, and ran the benchmarks (64-bit included). The reason I am interested in this type of review is that I am an average end user on a budget and would like to know which would give me better price/performance. I think I am speaking for at least 90% of consumers. Not everyone buys a $1000 CPU, and consumers on a budget are vital to the survival of Conroe and AM2 CPUs. This alone should give you enough incentive to put together a review oriented around us, the mainstream computer users. We can make or break any chipmaker.
    So please, Mr. Anand, can we have another review along the lines described above?
    We would greatly appreciate it.
    Thanks,
    ochungry
  • aznskickass - Saturday, July 15, 2006 - link

    Hey ochungry, I believe Xbitlabs did an overclocking comparison between a Conroe E6300 and an A64 X2 3800+, and while they didn't do any 64-bit benchmarks, the conclusion is that at stock speeds the E6300 is slightly faster than the X2 3800+, and when both are overclocked to the max, the E6300's lead increases even further.

    Here is the review/comparison:
    http://xbitlabs.com/articles/cpu/display/core2duo-...
  • OcHungry - Saturday, July 15, 2006 - link

    Those guys at X-bit Labs do not fool me. And I hope AnandTech conducts the same test with the best-suited memory modules for BOTH platforms. We know (as X-bit Labs knows) that, because of the IMC, the A64 performs its best at the tightest memory timings. X-bit Labs should have used this memory module (http://www.newegg.com/Product/Product.asp?Item=N82...) if the test was going to be fair and square. Furthermore, an A64 X2 3800+ @ 3GHz is 10x300, which means DDR2-667 at 1:1 could have been the better choice. This DDR2-667 @ 3-3-3-10 (http://www.newegg.com/Product/Product.asp?Item=N82...) would have given about 10% better performance than the DDR2 4-4-4-12 that they used. X-bit does not mention anything about the memory/CPU ratio. What divider was used? Was it 133/266? Or as close to 1:1 as possible? Sooner or later the truth will prevail when we end users try it for ourselves (oh, BTW, not on engineering samples), and we will see whether X-bit Labs and others were genuinely acting on behalf of consumers, or whether their interest served some other purpose. I will not accuse anyone, but it all looks very fishy.
    I am certain that Mr. Anand will clear up all these suspicious reviews and hand us another that concerns the majority of consumers: us average users.
  • IntelUser2000 - Saturday, July 15, 2006 - link

    quote:

    We know (as X-bit Labs knows) that, because of the IMC, the A64 performs its best at the tightest memory timings. X-bit Labs should have used this memory module if the test was going to be fair and square. Furthermore, an A64 X2 3800+ @ 3GHz is 10x300, which means DDR2-667 at 1:1 could have been the better choice. This DDR2-667 @ 3-3-3-10 would have given about 10% better performance than the DDR2 4-4-4-12 that they used.


    W-R-O-N-G!!! DDR2-800 at EVEN slower 5-5-5-15 timings is FASTER THAN DDR2-667 3-3-3-10: http://xbitlabs.com/articles/cpu/display/amd-socke...

    Prefer AT's results? http://www.anandtech.com/cpuchipsets/showdoc.aspx?...

    DDR2-800 is faster. The myth that it performs amazingly better (comparatively) with lower latency is just that, a myth. I believe there was a thread in the forums that says exactly that.
  • OcHungry - Sunday, July 16, 2006 - link

    I guess you don’t understand much about AMD's memory latency, direct connect architecture, and the 1:1 ratio.
    It is not like Intel's setup, where the illusion is that memory faster than the FSB is better. It is not, and it is useless; Anand proved it here. But this subject has a tendency to be dragged on forever by those who don’t understand the concept of the IMC, so it's better to leave it alone.
    But I still would like to see the tightest timings and a 1:1 ratio. It is now clear to me that those reviews in favor of Intel artfully evade this argument/request, knowing it would give AMD an advantage over Intel's FSB.
  • aznskickass - Sunday, July 16, 2006 - link

    *sigh*

    How is AMD disadvantaged if BOTH platforms are reviewed using the same RAM?

    AMD needs ultra-low-latency DDR2 to attain its best performance? Well, bad luck, Intel doesn't; there is no 'deliberate' conspiracy to put AMD in a bad light.

    Look, if you just want to hang on to the notion that AMD has been cheated in the reviews, then go ahead and get your X2 4200+ and see how close you can get to Conroe's numbers.

    I'll be using my E6600 @ 3.5GHz+ and laughing at your stupidity.
  • aznskickass - Saturday, July 15, 2006 - link

    I consider Xbitlabs to be one of the more trustworthy sites around. And do note that they are testing mainstream chips; expensive CL3 DDR2 just doesn't make sense in a budget setup, which further puts Conroe in a good light, as it doesn't require expensive CL3 DDR2 to perform well.
  • sum1 - Friday, July 14, 2006 - link

    A good read. For the editor: I found 4 errors; search for the full lines of text below at:
    http://www.anandtech.com/printarticle.html?i=2795

    That begin said,
    and H.264 in coding
    as soon as its available
    at the fastest desktop processor we've ever tested
    ("and" would be more readable than "at" in this sentence)
  • JarredWalton - Friday, July 14, 2006 - link

    Thanks. At least one of those (in coding) can be blamed on my use of Dragon NaturallySpeaking and not catching the error. The others... well, they can be blamed on me not catching them too. LOL. The last one is sort of a difference of opinion, and I've replaced the comma with a dash, as that was the intended reading. :)

    --Jarred
