Minecraft

Switching gears for the moment, we have Minecraft, our OpenGL title. It's no secret that OpenGL usage on the PC has fallen by the wayside in recent years, and as far as major games go, Minecraft is one of only a few recently released titles using OpenGL. Minecraft is incredibly simple—not even utilizing pixel shaders, let alone more advanced hardware features—but this doesn't mean it's easy to render. Its use of massive numbers of blocks (and the overdraw that creates) means you need solid hardware and an efficient OpenGL implementation if you want to hit playable framerates with a far render distance. Consequently, as the most successful OpenGL game in quite some number of years (at over 7.5 million copies sold), it's a good reminder for GPU manufacturers that OpenGL is not to be ignored.
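
As an illustration of the rendering style described above, here is a minimal sketch (not Minecraft's actual code) of shader-free, fixed-function OpenGL: depth-tested, immediate-mode quads stacked on the same pixels, which is exactly where overdraw comes from. It assumes a C++ toolchain with freeglut/GLUT available.

```cpp
#include <GL/glut.h>

// Draw one axis-aligned block face as an immediate-mode quad at depth z.
// No vertex or pixel shaders are involved anywhere; this is pure
// fixed-function OpenGL, the same vintage of API Minecraft relies on.
static void drawFace(float z, float shade) {
    glColor3f(shade, shade, shade);
    glBegin(GL_QUADS);
    glVertex3f(-0.5f, -0.5f, z);
    glVertex3f( 0.5f, -0.5f, z);
    glVertex3f( 0.5f,  0.5f, z);
    glVertex3f(-0.5f,  0.5f, z);
    glEnd();
}

static void display(void) {
    glEnable(GL_DEPTH_TEST);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Worst case for overdraw: draw back-to-front so every face is fully
    // rasterized and then overwritten by the next, nearer face. A world
    // built from millions of cubes does this on a massive scale.
    for (int i = 0; i < 32; ++i)
        drawFace(-0.5f - 0.0125f * i, 0.2f + 0.025f * i);

    glutSwapBuffers();
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Fixed-function overdraw sketch");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```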

Minecraft does incredibly well on Trinity. While the improvement over Llano is only 15%, the advantage over Ivy Bridge is tremendous.


Civilization V

Our final game, Civilization V, gives us an interesting look at things other RTSes cannot match, with a much weaker focus on shading the game world and a much greater focus on creating the geometry needed to bring such a world to life. In doing so it uses a slew of DirectX 11 technologies, including tessellation for said geometry, driver command lists for reducing CPU overhead, and compute shaders for on-the-fly texture decompression. There are other games that are more stressful overall, but this is likely the game that most stresses DX11 performance in particular.
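
As a brief illustration of one of those DX11 features, the sketch below (not Civilization V's actual code) shows how an application can ask the Direct3D 11 runtime whether the installed driver natively supports driver command lists, the feature the game leans on to cut CPU overhead; drivers that report FALSE fall back to slower runtime emulation. It assumes a Windows toolchain linking d3d11.lib.

```cpp
#include <d3d11.h>
#include <cstdio>

// Note: link against d3d11.lib. Error handling is kept minimal on purpose.
int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    // Create a default hardware device; no swap chain is needed just to
    // query driver capabilities.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, &context);
    if (FAILED(hr)) {
        std::printf("D3D11 device creation failed (0x%08lx)\n",
                    static_cast<unsigned long>(hr));
        return 1;
    }

    // Ask whether the driver implements command lists natively. If it
    // does not, the D3D11 runtime emulates them in software and most of
    // the CPU-overhead savings evaporate.
    D3D11_FEATURE_DATA_THREADING caps = {};
    hr = device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                     &caps, sizeof(caps));
    if (SUCCEEDED(hr)) {
        std::printf("Driver command lists:        %s\n",
                    caps.DriverCommandLists ? "native" : "emulated");
        std::printf("Concurrent resource creates: %s\n",
                    caps.DriverConcurrentCreates ? "yes" : "no");
    }

    context->Release();
    device->Release();
    return 0;
}
```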


Civilization V shows some of the mildest gains in all of our tests vs. Llano. The 5800K/7660D manages to outperform Llano by only 8-11% depending on the test. The advantage over Intel is huge, of course.

Starcraft 2 & Skyrim Performance Compute & Synthetics
Comments Locked

139 Comments

View All Comments

  • Kougar - Friday, September 28, 2012 - link

    Given that no mention of a "preview" was made in the title, it would have been nice if the Terms of Engagement section had been at the very top of the "review" to be completely forthright with your readership.

    I read down to that section and stopped, then went looking through the review for CPU benchmarks, which didn't exist. I can thank The Tech Report for posting an editorial on AMD's "preview" clause before I realized what was going on.
  • Omkar Narkar - Friday, September 28, 2012 - link

    Would you guys review the 5800K in CrossFire with an HD 6670?
    I've heard that when you pair it with a high-end GPU like the HD 7870, the integrated graphics cores don't work.
  • TheJian - Friday, September 28, 2012 - link

    Why was this benchmark used in the two reviews before the 660 Ti launch, and here today, but not in the 660 Ti article Ryan Smith wrote? This is just more evidence of bias. He could easily have run it with the same patch as in the two reviews before the 660 Ti launch article. Is it because in both of those articles the 600 series dominated the 7970 GHz Edition and the 7950 Boost? This is, at the very least, hard to explain.
  • plonk420 - Monday, October 1, 2012 - link

    Are those discrete GPUs on the charts being run on the AMD board, or on a Sandy/Ivy Bridge system?
  • seniordady - Monday, October 1, 2012 - link

    Please, can you run some tests on the CPU as well, not only the GPU?
  • ericore - Monday, October 1, 2012 - link

    http://news.softpedia.com/news/GlobalFoundries-28n...

    Power leakage reduced by up to 550%; wow.
    What an unintended coup for AMD, haha, all because of GlobalFoundries.
    Take that, Intel.

    AMD is also the first one working on Java GPU acceleration.
  • shin0bi272 - Tuesday, October 2, 2012 - link

    This is cool if you want to game at 13x7 at low settings... but who does that anymore? When you bump up games like BF3 or Crysis 2 (which you didn't test but Tom's did), the FPS falls into the single digits. This CPU is fine if you don't really play video games or have a 17" CRT monitor. The thing I find funny is that in all the games, a 100-dollar NVIDIA GPU beat the living snot out of this APU. Other than HTPC people who want video output without having to buy a video card, or someone who doesn't play FPS games but wants to play FarmVille or Minecraft, no one will buy this thing. Yet people are still trying to make it out to be a gaming CPU/GPU combo, and it's just not going to satisfy anyone who buys it to play games on; that's disingenuous.
  • Shadowmaster625 - Tuesday, October 2, 2012 - link

    When you tested your GT 440, you didn't do it on this hardware, right? If you were to disable the Trinity GPU and put a GT 640 in its place, do you think it would still do better? Or would its score be pretty close to that of the iGPU?
  • skgiven - Sunday, October 7, 2012 - link

    No idea what the NVIDIA GT 440 is doing there; where are the old AMD alternatives?

    Given the all too limited review, I don't see the point in comparing this to NVIDIA's discrete GT 640.
    Firstly, it's not clear whether you are comparing the APUs to a DDR3 GT 640 (of which there are two: April's 797MHz and June's 900MHz) or to the GDDR5 version (all 65W TDP).
    Secondly, the GT 640 has largely been superseded by the GTX 650 (64W TDP).
    So was your comparison against the 612 GFlops model, the 691, or the 729 GFlops version?
    Anyway, the GTX 650 is basically the same card but is rated at 812 GFlops (30% faster than the April DDR3 model). Who knows, maybe you intended to add these details along with the GTX 650 Ti in a couple of days?

    If you are going to compare these APUs to discrete entry-level cards, you need to add a few more cards. Clearly the A10-5800K falls short against Intel's more recent processors for most things (nothing new there), but it totally destroys anything Intel has when it comes to gaming, so there is no point in over-analysing that. It wins that battle hands down, so the real question is: how does it perform compared to other gaming APUs and discrete entry-level cards?

    I'm not sure why you stuck to the same 1366x768 screen resolution. Can this chip not operate at other resolutions, or can the opposition not compete at higher ones?
    1366x768 is common for laptops, but I don't think these 100W chips are really intended for that market. They are for small desktops, home theatre, and entry-level (inexpensive) gaming systems.

    These look good for basic gaming systems and, in terms of performance per $ and per watt, even for some office systems, but their niche is very limited. If you want a good home theatre/desktop/gaming system, throw in a proper discrete GPU and run at a more sensible 1680 or 1920 resolution for real HD quality.
