The Test

While the major benchmark suites still lack support for the OS, we are trying our hand at testing CPUs under Windows Vista with this review. By now we know not to expect a significant performance difference between Windows Vista and XP, but given Vista's compelling feature set, we see it becoming the dominant PC OS for new system builds in the enthusiast community.

With a substantial number of our CPU benchmarks available in 64-bit versions, using the 64-bit version of Vista wasn't a difficult choice. Since we're also using modern components in our testbeds, driver support wasn't an issue either. Given what we saw in our Vista Performance Guide, the ability to use more memory for features like SuperFetch warrants the switch to 64-bit, as long as you don't have any legacy hardware lacking 64-bit driver support.

The only other change we've made to our test beds is the use of 4GB of memory; by no means is it necessary (yet), but Vista's added memory requirements, coupled with its better use of free memory, make 4GB a good target for enthusiasts.

CPU:                 AMD Athlon 64 X2 6000+ (3.0GHz/1MBx2)
                     AMD Athlon 64 X2 5600+ (2.8GHz/1MBx2)
                     AMD Athlon 64 X2 5000+ (2.6GHz/512KBx2)
                     AMD Athlon 64 X2 3800+ (2.0GHz/512KBx2)
                     Intel Core 2 Duo E6700 (2.66GHz/4MB)
                     Intel Core 2 Duo E6600 (2.40GHz/4MB)
                     Intel Core 2 Duo E6300 (1.86GHz/2MB)
Motherboard:         ASUS P5B Deluxe (P965)
                     ASUS M2N32-SLI Deluxe (nForce 590 SLI)
Chipset:             Intel P965
                     NVIDIA nForce 590 SLI
Chipset Drivers:     Intel 8.1.1.1010 (Intel)
                     Integrated Vista Drivers (NVIDIA)
Hard Disk:           Seagate 7200.9 300GB SATA
Memory:              Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 4)
Video Card:          NVIDIA GeForce 8800 GTX
Video Drivers:       NVIDIA ForceWare 100.54
Desktop Resolution:  1600 x 1200
OS:                  Windows Vista Ultimate 64-bit

34 Comments

  • Chadder007 - Monday, April 9, 2007 - link

    Considering that if you are getting a whole new PC, you would most likely be getting a new OS too, in which case I would go with 64-bit Vista as well.
  • SunAngel - Monday, April 9, 2007 - link

    Just think, when MS gets Vista fully patched and tweaked, and FSB1333 or 1600 and HT2400 or 3000 for quad and octo-cores is mainstream, you'll be able to take a 30 min. MS-DVR recording from Media Center and reduce it in like 2 minutes. 18-30 months from now I expect Intel and AMD to perform quite some magic with their processors. Good times ahead indeed.
  • yacoub - Monday, April 9, 2007 - link

    And we all know how certain benchmarks vary WIDELY between XP and Vista - often due to drivers or programming issues - so to say "oh the numbers should be close enough between the two OSes" would be completely untrue.
  • yacoub - Monday, April 9, 2007 - link

    quote:

    While Intel will still hold control of the world's fastest desktop processor title, AMD may actually offer better value at lower price points.

    As long as you don't bring overclocking into the equation, then yes. But if you allow for overclocking, even a modestly overclocked E4300 can match or beat an E6400 and thus best the 6000+.

    We rarely hear much about how the AMD chips overclock these days... is it just due to a lack of overclock-oriented boards? Have all the board manufacturers focused on Intel because that is where all the attention is and where they hope to get the most profit for their boards?

    It would be interesting to see a good update on AMD overclocking on AM2. Do the chips even have much headroom? If so, are there overclocker boards available to OC them with? etc
  • yyrkoon - Monday, April 9, 2007 - link

    From my experience, which by the way is not a lot, I have noticed that desktop-class CPUs from AMD generally do not OC well. That being said, I have an Opteron 1210, paid $150 USD for it, and have had it running at a 310MHz reference clock (2790MHz core clock overall, stock is 1.8GHz; the math is sketched out after the comments) on an ABIT NF-M2 nView motherboard, with inexpensive Corsair XMS2 ProMos memory, using stock cooling. Granted, the system immediately BSoD'd when trying to run SuperPI, but I have little doubt that with a better cooler (my case is very compact, so it is pretty difficult to find something small and efficient) it would have been able to run at this speed fine.

    From what I have read, the 3600+ can hit 3GHz using water cooling, but I have no hands-on experience with that myself. Personally, I am very happy with my Opteron; it runs every game I play just fine. The only real problem I have with my current system is that my video card is already showing its age, and it is only 6 months old (7600GT) :/
  • JarredWalton - Monday, April 9, 2007 - link

    I think the best AMD chips will OC to around 3GHz, give or take. The problem is, an E4300 will overclock to around 3.6GHz pretty easily (a better CPU cooler is all you need). At that point, the E4300 is so much faster than anything AMD currently has on the desktop that I think it's a bit silly to consider AMD for serious overclocking - unless you already have an AM2 board. At stock speeds, however, AMD does quite well on pricing, especially post price-cuts.
  • D4LDarksideD4L - Wednesday, March 18, 2009 - link

    No, I think you're wrong here. The best AMD CPU doesn't max out at 3.0GHz; it clocks way faster than that with very extreme cooling. The guys at AMD overclocked an AMD quad core to an insane 6.5GHz, and it was very stable. To achieve this speed they used liquid nitrogen and liquid helium to bring the temperature down to below -200 degrees F. To my understanding that's the world record.

    http://game.amd.com/us-en/landings/tomslanding.asp...
  • yyrkoon - Monday, April 9, 2007 - link

    I do not disagree with you 100%, Jarred; your points are completely valid. However, the system in question would cost more than a comparable AMD system, and in my case I am very specific about what I want in features and brand, so it would cost me a lot more. Also, the performance difference you speak of looks great on paper and in graphs, but in real-world use I bet you, me, or anyone else would be hard pressed to notice the difference. Perhaps if all you do is encode/decode, sure, but for general usage and game playing the noticeable difference just would not be there.

    Now, if you have the cash, I would recommend that anyone who plays games go with an Intel C2D, but know that you will pay for the speed difference, and chances are good that unless you spend a lot you would never know the difference. As I said before, build me a C2D system for ~$500, with an LCD monitor, and then we can talk. Granted, this ~$500 AMD system would not play FEAR, Oblivion, or any other graphically intensive game well, but my system does using an eVGA 7600GT KO (can be had easily for ~$130 USD, making the overall cost higher, but still very viable, even on a budget).
  • TA152H - Monday, April 9, 2007 - link

    I agree with you, mainly. My main system, if I have one that can be called that, is a Katmai running at 600 MHz. Why? Because it's completely fanless, it uses very little power, and I don't need anything better for what it does. I have a separate development machine, of course, and other machines I use for more demanding stuff, but 90% of the time, unless I'm testing, this computer is more than adequate for what I want.

    Overclocking is important only within the tiny context of the person who is going to buy a processor to do it. Intel and AMD don't care that much about it, unless it makes their chips less reliable, in which case they are against it since it hurts their reputation. It's done by such a small percentage of people that it's not going to greatly impact their sales. I almost got tarred and feathered about 10 years ago when I recommended that our company buy Celeron 300As and overclock them to 450 MHz, rather than using Pentium IIs, which were slower and way more expensive. I never made that mistake again.

    Back in the bad old days, you actually had to have a bit of a clue to overclock; you'd have to unsolder the crystal and replace it. A lot of people did this with the original PC/AT 139, so IBM changed it so you couldn't for the 239 (by putting a timing loop in the ROM). So, I guess it was somewhat more common then. But then again, the computer user back then was much more sophisticated since they were not mainstream devices like they are now.

    AMD is in deep trouble, as evidenced by their recent announcement of extremely poor sales. I just do not understand their timing with respect to ATI, because it gave Intel a golden opportunity. Intel knows AMD is cash strapped and can't fight a price war. AMD doesn't seem to realize this, though, and they are playing chicken driving a VW Bug against a Hummer. Sooner or later AMD must lose, and they are idiotic for thinking Intel does not know this. ATI made it impossible for AMD to follow the course they are on now, but they are following it anyway. Good luck to them. Cost cutting isn't the answer; Intel will just extend their manufacturing lead.

    One thing I don't get is the announced price cuts by Intel not being called aggressive. Call me crazy, but when you chop 1/3 of the price off of already attractive products, that's very substantial.

    A lot of this points to Barcelona (what a stupid name) being really good. Intel is trying to kill AMD before it comes out, and AMD seems to believe they have to keep market share at any cost. Obviously, AMD's path is unsustainable and they would eventually go out of business on it if nothing changed. If Barcelona is really good, they could suffer a few quarters of it while waiting for Barcelona, which would presumably sell quite well if it is as good as they say. The problem is, they are losing market share even with their low prices.

    One positive about all this is how much smarter consumers have become. It used to be that Intel could sell whatever they had even if it sucked. But when Intel had a bad product most recently, they lost share. Now that they have a better product, they have gained it back. It hasn't always been this way.
  • yyrkoon - Monday, April 9, 2007 - link

    Well, I do not know which way this is going to go, but it is either very, very good for AMD or very, very bad. Just for the 'monopoly' Intel would gain and the 'you have this product, and you have to like it' effect we would get from them, I think it would be very, very bad for everyone if AMD went out of business.

    I have been computing for a long, long time (since '82-'83), and am used to AMD being the underdog, so I do not really see this as the nail in their coffin just yet. Only time will tell, and AMD knows how to fight a price war from the bottom up (or middle up, if you ever really considered Cyrix a competitor).
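
As a footnote to the overclocking discussion above, here is a minimal sketch of the clock math behind yyrkoon's Opteron 1210 numbers. It assumes AM2's standard 200MHz reference clock and a locked 9x multiplier (inferred from the chip's 1.8GHz stock speed); neither value is spelled out in the comment.

    # Minimal sketch of the overclock arithmetic discussed above.
    # Assumptions (not stated in the comments): AM2's standard 200MHz
    # reference clock and a locked 9x multiplier on the Opteron 1210.

    STOCK_REF_MHZ = 200        # assumed stock reference (HTT) clock
    STOCK_CORE_MHZ = 1800      # Opteron 1210 stock core clock

    multiplier = STOCK_CORE_MHZ // STOCK_REF_MHZ   # works out to 9x

    def core_clock_mhz(ref_clock_mhz):
        # Core clock is simply the reference clock times the CPU multiplier.
        return ref_clock_mhz * multiplier

    print(core_clock_mhz(310))   # 2790 MHz (~2.79GHz), the overclock described
    print(core_clock_mhz(200))   # 1800 MHz stock, for comparison

The same arithmetic lines up with Jarred's E4300 figure: a 9x multiplier at a 400MHz FSB works out to 3.6GHz.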
