Starcraft 2

Our next game is StarCraft II, Blizzard's 2010 RTS megahit. StarCraft II is a DX9 title designed to run on a wide range of hardware, and given the growth in GPU performance over the years it's often CPU limited before it's GPU limited on higher-end cards.

[Charts: Starcraft 2 - GPU Bench]

Despite being heavily influenced by CPU performance, StarCraft II shows big gains when moving to Trinity. The improvement over Llano ranges from 16% to 27% in our tests, and the performance advantage over Ivy Bridge is huge.

 

The Elder Scrolls V: Skyrim

Bethesda's epic sword & magic game The Elder Scrolls V: Skyrim is our RPG of choice for benchmarking. It's altogether a good CPU benchmark thanks to its complex scripting and AI, but it can also end up pushing a large number of fairly complex models and effects at once. This is a DX9 game, so it isn't utilizing any new DX11 functionality, but it can still be demanding.

[Charts: The Elder Scrolls V: Skyrim]

We see some mild improvements over Llano in our Skyrim tests, and even Intel is able to catch up a bit. Trinity still does quite well; only NVIDIA's GeForce GT 640 can really deliver better performance than the top-end A10-5800K SKU.

Comments

  • dishayu - Thursday, September 27, 2012 - link

    Hate to be off-topic here, but I wanted to ask what happened to this week's podcast? I was really looking forward to a talk about IDF and Haswell.
  • Ryan Smith - Thursday, September 27, 2012 - link

    Busy. Busy busy busy. Perhaps on the next podcast Anand will tell you what he's been up to and how many times he's flown somewhere this month.
  • idealego - Thursday, September 27, 2012 - link

    I don't think the load GPU power consumption comparison is fair, and I'll explain why.

    The AMD processors are achieving higher frame rates than the Intel processors in Metro 2033, the game used for the power consumption chart. If you calculated watts per frame, AMD would actually be more efficient than Intel.

    Another way of running this test would be to use game settings that all the processors could handle at 30 fps and then cap all tests at 30 fps. Under these test conditions each processor would be doing the same amount of work. I would be curious to see the results of such a test.

    Good article as always!
  • SleepyFE - Thursday, September 27, 2012 - link

    True.
    But you are asking for consumption/performance charts, and you can do those yourself from the data given.
    They test consumption under max load because no one will cap all their games at 30 fps to keep consumption down. People use what they get, and that is what you would get if you played Metro 2033.
  • idealego - Thursday, September 27, 2012 - link

    Some people want to know the max power usage of the processor to help them select a power supply or help them predict how much cooling will be needed in their case.

    Other people, like me, are more interested in the efficiency of the processor's architecture in general and as a comparison to the competition. This is why I'm more interested in frames per watt, or watts at a set fps; otherwise it's like comparing the "efficiency" of a dump truck to a van by looking only at fuel economy.
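
A minimal sketch of the frames-per-watt comparison described above, using invented placeholder figures rather than this review's Metro 2033 measurements:

```python
# Rough frames-per-watt comparison. All numbers are invented placeholders,
# not measurements from this review; note the review's charts report
# whole-system power draw, not the CPU/APU's power by itself.

hypothetical_runs = {
    # chip label: (average fps in the benchmark, average system power in watts)
    "Chip A": (45.0, 115.0),
    "Chip B": (32.0, 95.0),
}

for chip, (fps, watts) in hypothetical_runs.items():
    frames_per_watt = fps / watts   # higher is better (more work done per watt)
    watts_per_frame = watts / fps   # lower is better (less power spent per frame)
    print(f"{chip}: {frames_per_watt:.3f} frames/W ({watts_per_frame:.2f} W per frame)")

# The alternative suggested above: cap every system at the same frame rate
# (e.g. 30 fps) so each chip does identical work, then compare wattage directly.
```
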
  • CeriseCogburn - Thursday, October 11, 2012 - link

    LMAO - faildozer is now a dump truck; sounds like amd is a landfill of waste and garbage. Does piledriver set the posts for the hazardous waste of PC purchase money signage?

    Since it's great doing 30fps in low low mode so everyone can play and be orange orange instead of amd losing terribly sucking down the power station, just buy the awesome Intel Sandy Bridge with its super efficient arch and undervolting and OC capabilities and be happy.

    Or is that like verboten for amd fanboys?
  • IntelUser2000 - Thursday, September 27, 2012 - link

    We can't even calculate it fairly because they are measuring system power, not CPU power.
  • iwod - Thursday, September 27, 2012 - link

    I think Trinity is a pretty good chip for a low cost PC, which seems to be the case for the majority of PCs sold today. I wonder why it is not selling well compared to Intel.
  • Hardcore69 - Thursday, September 27, 2012 - link

    I bought a 3870K in February. I've now sold it and replaced it with a G540. APUs are rather pointless unless you are a cheap-ass gamer who can't afford a 7870 or above, or for an HTPC. Even there, I built an HTPC with a G540; you don't really need more anyway. Match it to a decent Nvidia GPU if you want all the fancy rendering. Personally I don't see the point of MadVR, and I can't see the difference between 23.976 @ 23.976 or 23.976 at 50Hz.

    All that being said, I bet that on the CPU side, AMD has failed. Again. CPU grunt is more important anyway. A G620 can compete generally with a 3870K on the CPU side. That is just embarrassing. The 5800K isn't much of an improvement.

    Bottom line: a Celeron is better for a basic office/pornbox; skip the Pentium, skip the i3, get an i5 if you do editing or encoding, an i7 if you want to splurge. GPU performance is rather moot for most uses. Intel's HD 1000 does the job. Yes, it can accelerate via Quick Sync or DXVA, and yes, it's good enough for YouTube. Again, if you want to game, get a gaming GPU. I've given up on AMD. Its CPU tech is too crap and its GPU side can't compensate.
  • Fox5 - Thursday, September 27, 2012 - link

    A 7870 goes for at least $220 right now; that's a pretty big price jump.

    AMD has a market: it's for people who want the best possible gaming experience at a minimum price. You can't really beat ~$100 for decent CPU and graphics performance, when a graphics card of that performance level alone would cost you at least half that much (probably more). Also, in the HTPC crowd, form factor and power usage are critical, so AMD wins there; I don't want a discrete card in my HTPC if I can avoid it.
