Supreme Commander

Supreme Commander is one of the better RTS games released in recent memory, although we are still huge fans of Command and Conquer 3 and Company of Heroes. We chose Supreme Commander because it is both a GPU and CPU hog when it comes to system resources. We utilize the built-in performance test to benchmark the game, setting all options to High and changing only the resolution between benchmark runs. This benchmark provides a cornucopia of results, but for our tests we will report the average frame rates during the benchmark. We generally find this game to be playable with frame rates at or above 35fps.
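As a side note on the metric we report: an "average frame rate" from a benchmark run is conventionally the total number of frames rendered divided by the total elapsed time, not the mean of instantaneous fps readings. The sketch below illustrates that calculation; it is our own minimal example, not the game's built-in performance test, and the frame times are hypothetical.

```python
# Minimal sketch (hypothetical data, not the game's own tooling):
# deriving an average frame rate from per-frame render times.

def average_fps(frame_times_ms):
    """Average FPS = frames rendered / total elapsed time in seconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# A run alternating 20 ms and 30 ms frames (200 frames, 5 s total):
times = [20.0, 30.0] * 100
avg = average_fps(times)
print(f"average: {avg:.1f} fps, playable: {avg >= 35}")
# -> average: 40.0 fps, playable: True
```

Note that averaging the instantaneous fps values (50 fps and 33.3 fps here) would give a slightly different, misleading number, which is why benchmarks total frames over time instead.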

Gaming Performance - Supreme Commander


Company of Heroes

Company of Heroes was released last year and is still proving to be a very addictive RTS game around the office. The game is extremely GPU intensive and also requires a hefty CPU at times. If this is beginning to sound a lot like Oblivion, that's because CoH is very similar to Oblivion in system requirements. The visuals and audio within the game will at times have you believing it is a first person shooter rather than a traditional real time strategy game. We set all options to High and turn on all additional video options.

The game contains a built-in performance test that utilizes the game engine to generate several different action scenes, along with a coffee argument as a sideline distraction before the war starts. We found the performance test gives a good indication of how well your system will perform throughout the game on average. We have found some of the in-game action sequences to be more demanding than the performance test and are working on a repeatable gameplay benchmark. We generally found the game to be enjoyable with an average frame rate above 35fps.

Gaming Performance - Company of Heroes


Prey

Prey offers some superb action sequences, unique weapons and characters, and visuals that are stunning at times. It still requires a very good GPU to run with all of the eye candy turned on. We set all graphics settings to their maximum except for AA/AF and utilize a custom timedemo that takes place during one of the more action-oriented sequences. We generally found the game to be enjoyable with an average frame rate above 35fps.

Gaming Performance - Prey


S.T.A.L.K.E.R.

S.T.A.L.K.E.R. is one of the favorite first person shooter games around the office, as it continues to provide a great deal of replay value, and the graphics are very good once the eye candy is turned on. What we especially like about the game is its atmosphere and the fact that it makes for a great system benchmark.

Gaming Performance - S.T.A.L.K.E.R.

Gaming Summary

Once again the P35 boards lead the pack, especially when operating at a 1333 FSB, although the separation between boards is minimal overall except in Prey and S.T.A.L.K.E.R. Gaming performance continues to be driven by the GPU, but every extra bit of performance always helps.

Comments

  • Comdrpopnfresh - Tuesday, May 22, 2007 - link

The power could be attributed to the DDR3. With it not being so mature, there may be a lot of signaling going on that isn't necessary. Also, with all the new technologies, these boards simply have more going on. Just as a CPU with more transistors is expected to use more power, more connections and circuits on a board would mean the same. Everything is running faster too. The power consumption doesn't make sense given the lack of matching real-world performance improvements, but as the article makes good sense in pointing out, the BIOS is a big contributing factor here.
  • TA152H - Tuesday, May 22, 2007 - link

    Except they ran the power tests with DDR2 on P35 based machines as well, and they were higher than P965 with the same memory. So, obviously, that isn't the cause in this instance.
  • Gary Key - Tuesday, May 22, 2007 - link

After speaking with the board manufacturers and Intel, our original thoughts (briefings/white paper review) were confirmed: the additional circuitry required on the P35 DDR3 boards and in the MCH results in the increased power consumption on the DDR3 platform compared to the DDR2 platform. This also holds true for the P35 DDR2 boards when compared to the DDR2 P965; the additional DDR3 circuitry/instruction set is still active even though it is not being used. This is why you will see the DDR2/DDR3 combo boards shortly. However, the BIOS engineers believe they can work a little magic with the SpeedStep and C1E wait states to reduce power consumption, though we are talking just a few watts at best. More on this subject in the roundup, at least we hope we will have more... ;)
  • TA152H - Tuesday, May 22, 2007 - link

    Gary,

    Thanks, it's useful to know. Are they going to shackle the x38 with DDR2 support too?

    Just confirms my earlier opinion, they should have gotten rid of DDR2 support. Intel is an interesting company, they can come out with a great product like the Core 2, and then have some monkey decide to include DDR2 and DDR3 on the P35. You never know if they'll have a clue, or not. I guess it's a good thing they make turkeys like this and the P7, otherwise we wouldn't have AMD. Although AMD might be the cause of this.

    The monkey that decided to do this probably thought, "Oh, look what we can do that AMD can't". It seems to me they did that with the P7, a technological marvel way beyond AMD's capability to design, thank goodness, and the groundbreaking Itanium. Except neither one worked great. AMD's pragmatism has paid off nicely, and even though they can't realistically support DDR2 and DDR3 on the same motherboard, I don't think they really care. Of course, I'm just guessing, when a company does something this stupid, it's always difficult to understand why they did it. It would have been so simple to just have DDR3 support for the P35, and let the P965 handle the DDR2 crowd. It's perfectly adequate.

    Thanks again for the information. It's disappointing, but with Intel you get used to it. They can't do everything right after all, and still be Intel.
  • strikeback03 - Wednesday, May 23, 2007 - link

    There might be a more practical reason, such as lack of production capability for DDR3 or HP and Dell threatening to use VIA chipsets instead of P35 in order to keep using DDR2 and keep their prices competitive. I doubt consumers would like their prices increasing by a few hundred dollars for no noticeable performance improvement. And if they only keep the computer 3 or 4 years they will probably spend less on energy than on that DDR3.

    Who knows about X38, I'd guess DDR2 support won't disappear until the chipset revision for Nehalem.
  • TA152H - Wednesday, May 23, 2007 - link

Well, I agree that if the P35 were the only choice from Intel, this would be the case, but again, would you buy VIA if you could get a P965? I wouldn't. If the P965 were a lousy and seriously obsolete chipset, then yes, sure, you'd have to come out with something to replace it. But they could have easily validated it for a 1333 FSB, and at that point the only thing really new in the P35 would be the DDR3 support. So, why would you need it?

I was going to get the P35 rather than the x38 because I figure the x38 will be even more of a power hog considering its, to me, useless features. I don't plan on getting two high-end video cards, and I don't think I will run anything that requires twice the performance of the current PCI-E, but if they drop the DDR2 support, it might be the one to go after. If you ever look at an Athlon 64 CPU, you can see the memory controller is simply enormous, so dropping DDR2 on the x38 could be significant. With it being high end, they may decide DDR2 isn't a high end technology and drop it. I hope so.
  • JarredWalton - Monday, May 21, 2007 - link

    Could be the Vista factor? I dunno what else to think about the power numbers.
  • XcomCheetah - Wednesday, May 23, 2007 - link

    Could you do a little testing on why the power numbers are so high?
    Secondly, if I remember correctly, the power difference between the 680i and P965 chipsets was greater than 20W, but in your current tests the difference is pretty small. Any guess what has caused this positive change?
    Reference
    http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
    http://www.xbitlabs.com/articles/chipsets/display/...

    current power numbers on Anandtech
    http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
