Gaming Performance

It is very interesting that almost all of the game benchmarks are slightly higher on the 2.4GHz, 1MB cache 4000+ than on the dual-core 4200+ (2.2GHz with 512KB of cache per core). As Anand pointed out in the X2 launch article, this is in line with AMD's claims: gaming today is heavily single-threaded, so a dual core performs about the same as a single core of similar speed. Even so, the difference between the 4000+ and the 4200+ is generally very small.

Most of the gaming benchmarks respond very well to the X2 overclock to 2.7GHz, yielding impressive performance returns from the extra 500MHz.

[Gaming Performance - Single Video benchmark charts]

Looking closely, it is very interesting that two of the most recent games, Doom 3 and Half Life 2, have their performance almost entirely dictated by the graphics card. With the increases in graphics power, we tested all games at 1280x1024 where possible. Whether at 2.2GHz, 2.4GHz with double the cache, or 2.7GHz, Doom 3 and Half Life 2 performed about the same using the same graphics card at the same 1280x1024 resolution.

The game benchmarks we use for memory testing were much more responsive to processor speed increases. Wolfenstein ET saw an 18.4% increase from a 22.5% CPU speed boost, and Q3 increased 16%. UT2004 performed similarly at 16%, while Far Cry was in between at 9.5%. These results should give you a good idea of why we use Wolfenstein ET and Q3 for memory benchmarking.
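
As a rough sanity check on those percentages, here is a minimal Python sketch relating the 2.2GHz-to-2.7GHz overclock to the frame-rate gains quoted above (the pct_increase helper and the printed "scaling" figures are our own illustration, not part of the benchmark suite):

def pct_increase(old, new):
    # Percent increase going from old to new
    return (new - old) / old * 100

# Clock speed: stock X2 4200+ (2.2GHz) vs. the 2.7GHz overclock
clock_boost = pct_increase(2.2, 2.7)   # ~22.7%; quoted above as roughly 22.5%

# Frame-rate gains quoted above for that overclock
fps_gains = {"Wolfenstein ET": 18.4, "Q3": 16.0, "UT2004": 16.0, "Far Cry": 9.5}

for game, gain in fps_gains.items():
    # Scaling efficiency: how much of the clock increase shows up as frame rate
    print(f"{game}: {gain:.1f}% FPS gain, {gain / clock_boost:.0%} of the clock boost")

By this measure, Wolfenstein ET converts roughly 80% of the clock increase into frame rate and Q3 about 70%, which is exactly the CPU sensitivity that makes them useful for memory scaling tests.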
Comments

  • cryptonomicon - Friday, June 24, 2005 - link

    #42 you are a moron. If their 4200 was really cherry-picked, they would have it at 3GHz on air. I think this is valid. 2.6-2.7GHz is average ground on the DFI board with a 90nm proc, and I see overclocking results every day. It's not extraordinary.

    end of story
  • DavidHull - Friday, June 24, 2005 - link

    This article is crap. As long as Anandtech uses cherry-picked processors directly from the manufacturer, it is no more than hired advertising. What happened to journalistic integrity? Do you think that AMD is going to send you a randomly picked processor from the line? How about you test my processor that I bought from a retailer, and then we'll talk about how good XX CPU really is.

  • phaxmohdem - Friday, June 24, 2005 - link

    #38, I was kind of pissed about that myself. However, I have to wonder: a lot of the time, the prices chip manufacturers quote are for retailers at a quantity of 1000 chips or more. Perhaps that is why? Or perhaps too many geeks spooged prematurely and are willing to shell out extra cash on inflated prices for new and top of the line shiznat. Who knows. All I know is I'm stuck on Socket 940 for a while :(
  • Klaasman - Friday, June 24, 2005 - link

    I have tested Battlefield 2 with 1 gig of RAM and with 2 gigs. With 1 gig, it uses about 675MB. With 2 gigs, it uses about 750MB and still has 725MB in swap. It should use more but doesn't for some reason.
  • miketheidiot - Thursday, June 23, 2005 - link

    I agree with #17: Rome: Total War definitely needs to be added to the game benchmarks, especially for CPU tests. I'm not sure if it's RAM or processor limited, but it easily brings my 2.55GHz Winchester and 1 gig of RAM to their knees in some of the larger battles.
  • yacoub - Thursday, June 23, 2005 - link

    Btw I like how the AMD chart shows the 4200+ sold at around $530 but if you check the RTPE, the cheapest is $575 plus shipping. =P
  • SilthDraeth - Thursday, June 23, 2005 - link

    I believe Val brings up interesting points. I would like to see benchmarks of identical systems, except for the amount of RAM. That would show whether his perceptions have merit. I believe they do, but I do not have any benchmarks to prove it.
  • Gatak - Thursday, June 23, 2005 - link

    #35

    You are right that overclocking is highly random. All chips are made with lots of things in mind: different target models, performance, and, not to forget, minimum life expectancy.

    If you overclock and stress the components, you _WILL_ reduce their lifetime. Things like temperature also affect both achievable performance and lifetime: if you increase the temperature by 10°C, you would reduce the life expectancy by half!

    The amount you can overclock is usually the margin you have against lifetime and stability.
  • fishbits - Thursday, June 23, 2005 - link

    "We have asked AMD for a 4400+ and 4800+ for comparison"

    OK, non-overclocker questions about that: my understanding is that the overclocking headroom of a chip can vary from production run to production run, and even from individual CPU to individual CPU. So wouldn't any CPU (or GPU, etc.) manufacturer test some of their chips and set aside a box full of the best overclockers to be sent to review sites, to give the impression that that's what the average one can achieve? I guess this kind of arrangement is a necessity, though, so that you have a sample in a timely fashion.

    Granted in the real world there's money concerns, availability concerns, etc. But wouldn't it theoretically be better to buy a random one from a random vendor who doesn't know it's going to reviewers? If it is an issue in any way in OC situations, maybe it's worth noting early on that the sample was provided to the staff by the manufacturer, but then again perhaps that should go without saying.
  • val - Thursday, June 23, 2005 - link

    ***75-80 percent of available main...****
