Minecraft

Switching gears for the moment, we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go Minecraft is one of only a few recently released major titles using OpenGL. Minecraft is incredibly simple—it doesn't even use pixel shaders, let alone more advanced hardware—but this doesn't mean it's easy to render. Its use of massive numbers of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates at a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 7.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.

Minecraft does incredibly well on Trinity. While the improvement over Llano is only 15%, the advantage over Ivy Bridge is tremendous.

Civilization V

Our final game, Civilization V, gives us an interesting look at something other RTSes don't match, with a much weaker focus on shading the game world and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that most stresses DX11 performance in particular.
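Of the features listed above, driver command lists are worth a quick illustration. Direct3D 11 lets a game record rendering commands on a deferred context (potentially on worker threads) and then replay the finished command list on the immediate context, which spreads submission overhead across CPU cores when the driver supports it. The sketch below shows only that generic D3D11 mechanism, not Civilization V's actual code; the function name and the omitted resource bindings are placeholders.

    // Minimal D3D11 deferred-context/command-list sketch (C++), assuming a
    // Windows build linked against d3d11.lib. Error handling is omitted.
    #include <d3d11.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d11.lib")

    using Microsoft::WRL::ComPtr;

    // Record some draw work on a deferred context and hand back a command list.
    // The actual buffer/shader bindings are omitted; this only shows the mechanism.
    ComPtr<ID3D11CommandList> RecordWork(ID3D11Device* device)
    {
        ComPtr<ID3D11DeviceContext> deferred;
        device->CreateDeferredContext(0, &deferred);

        // ... IASetVertexBuffers / VSSetShader / DrawIndexed calls would go here ...

        ComPtr<ID3D11CommandList> commandList;
        // FALSE: the deferred context's state is cleared rather than carried over.
        deferred->FinishCommandList(FALSE, &commandList);
        return commandList;
    }

    int main()
    {
        ComPtr<ID3D11Device> device;
        ComPtr<ID3D11DeviceContext> immediate;

        // Create a hardware device plus the immediate context that owns submission.
        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION,
                          &device, nullptr, &immediate);

        // In a real engine the recording would happen on worker threads.
        ComPtr<ID3D11CommandList> commandList = RecordWork(device.Get());

        // Replaying the prerecorded list is cheap for the application; how much
        // CPU time it saves overall depends on driver support for command lists.
        if (commandList)
            immediate->ExecuteCommandList(commandList.Get(), FALSE);

        return 0;
    }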

Civilization V shows some of the mildest gains over Llano in any of our tests. The 5800K/7660D manages to outperform Llano by only 8-11%, depending on the test. The advantage over Intel is, of course, still huge.

Comments

  • dishayu - Thursday, September 27, 2012 - link

    Hate to be off-topic here, but I wanted to ask what happened to this week's podcast? I was really looking forward to a talk about IDF and Haswell.
  • Ryan Smith - Thursday, September 27, 2012 - link

    Busy. Busy busy busy. Perhaps on the next podcast Anand will tell you what he's been up to and how many times he's flown somewhere this month.
  • idealego - Thursday, September 27, 2012 - link

    I don't think the load GPU power consumption comparison is fair, and I'll explain why.

    The AMD processors are achieving higher frame rates than the Intel processors in Metro 2033, the game used for the power consumption chart. If you calculated watts per frame, AMD would actually be more efficient than Intel (a quick sketch of that arithmetic follows this comment).

    Another way of running this test would be to use game settings that all the processors could handle at 30 fps and then cap all tests at 30 fps. Under these test conditions each processor would be doing the same amount of work. I would be curious to see the results of such a test.

    Good article as always!
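The frames-per-watt figure idealego describes is easy to derive from any FPS/power pairing. Below is a minimal C++ sketch of the arithmetic; the chip names and numbers are hypothetical placeholders rather than figures from this review, and keep in mind that the measured wall power covers the whole system, not just the processor.

    #include <cstdio>

    // One benchmark result: average frame rate plus wall power during the run.
    struct Result {
        const char* name;
        double avgFps;
        double systemWatts;
    };

    int main()
    {
        // Hypothetical placeholder values, purely to show the arithmetic.
        const Result results[] = {
            { "Chip A", 45.0, 110.0 },
            { "Chip B", 30.0,  75.0 },
        };

        for (const Result& r : results) {
            double framesPerWatt = r.avgFps / r.systemWatts;   // higher is better
            double wattsPerFrame = r.systemWatts / r.avgFps;   // lower is better
            std::printf("%s: %.3f frames/W, %.2f W per frame\n",
                        r.name, framesPerWatt, wattsPerFrame);
        }
        return 0;
    }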
  • SleepyFE - Thursday, September 27, 2012 - link

    True.
    But you are asking for consumption/performance charts, and you can put those together yourself from the data given.
    They test consumption under max load because no one will cap all their games at 30 fps to keep consumption down. People use what they get, and that is what you would get if you played Metro 2033.
  • idealego - Thursday, September 27, 2012 - link

    Some people want to know the max power usage of the processor to help them select a power supply or help them predict how much cooling will be needed in their case.

    Other people, like me, are more interested in the efficiency of the processor's architecture in general and as a comparison to the competition. This is why I'm more interested in frames per watt, or watts at a set fps; otherwise it's like comparing the "efficiency" of a dump truck to a van by looking only at fuel economy.
  • CeriseCogburn - Thursday, October 11, 2012 - link

    LMAO - faildozer now a dump truck, sounds like amd is a landfill of waste and garbage, does piledriver set the posts for the hazardous waste of PC purchase money signage?

    Since it's great doing 30 fps in low low mode so everyone can play and be orange orange instead of amd losing terribly sucking down the power station, just buy the awesome Intel Sandy Bridge with its super efficient arch and undervolting and OC capabilities and be happy.

    Or is that like verboten for amd fanboys?
  • IntelUser2000 - Thursday, September 27, 2012 - link

    We can't even calculate it fairly because they are measuring system power, not CPU power.
  • iwod - Thursday, September 27, 2012 - link

    I think Trinity is a pretty good chip for low-cost PCs, which seem to be the majority of PCs sold today. I wonder why it isn't selling well compared to Intel.
  • Hardcore69 - Thursday, September 27, 2012 - link

    I bought a 3870K in February. I've now sold it and replaced it with a G540. APUs are rather pointless unless you are a cheap-ass gamer who can't afford a 7870 or above, or you're building an HTPC. Even there, I built an HTPC with a G540. You don't really need more anyway. Match it to a decent Nvidia GPU if you want all the fancy rendering. Personally I don't see the point of MadVR, and I can't see the difference between 23.976 fps played at 23.976Hz and 23.976 fps at 50Hz.

    All that being said, I bet that on the CPU side, AMD has failed. Again. CPU grunt is more important anyway. A G620 can compete generally with a 3870K on the CPU side. That is just embarrassing. The 5800K isn't much of an improvement.

    Bottom line: a Celeron is better for a basic office/pornbox; skip the Pentium, skip the i3, get an i5 if you do editing or encoding, and an i7 if you want to splurge. GPU performance is rather moot for most uses. Intel's HD 1000 does the job. Yes, it can accelerate via Quick Sync or DXVA, and yes, it's good enough for YouTube. Again, if you want to game, get a gaming GPU. I've given up on AMD. Its CPU tech is too crap and its GPU side can't compensate.
  • Fox5 - Thursday, September 27, 2012 - link

    A 7870 goes for at least $220 right now; that's a pretty big price jump.

    AMD has a market: it's for anyone who wants the best possible gaming experience at a minimal price. You can't really beat the ~$100 price for decent CPU and graphics performance, when a graphics card of that performance level alone would cost you at least half that much (probably more). Also, in the HTPC crowd, form factor and power usage are critical, so AMD wins there; I don't want a discrete card in my HTPC if I can avoid it.
