Performance in Older Games

In response to our preview, a number of you asked about performance in older titles. We dusted off a couple of our benchmarks from a few years ago to see how Intel's HD Graphics 3000 and AMD's Radeon HD 6550D handle these golden oldies.

First up is a personal favorite: Oblivion. Our test remains unchanged from when we last ran it; the only difference is that integrated graphics can now actually deliver playable frame rates. We set the game to its High Quality defaults, although the Intel platform had to disable HDR in order for the game to render properly:

The Core i3-2105 with its HD Graphics 3000 can actually deliver a playable experience at 1024 x 768, at just over 40 fps. Move to higher resolutions, however, and you either have to drop quality settings or sacrifice playability. The A8-3850 forces no such tradeoff: even at 1920 x 1200, the A8 delivers over 40 fps using Oblivion's High Quality defaults.

We saw similar results under Half-Life 2: Episode Two:

Here the Core i3 maintains playability all the way up to 1920 x 1200, but you obviously get much higher frame rates from the Llano APU.

Comments

  • Roland00Address - Thursday, June 30, 2011 - link

    1366x768 would be preferable to 1024x768: very few things use 1024x768 anymore, and 1366x768 is the default resolution for 720p TVs, 18.5-inch monitors, and cheap monitors bought in the last 3 to 4 years.
  • BigDDesign - Friday, July 1, 2011 - link

    I like to play games. 1024x768 is usable in a pinch. So bragging is out. Who cares?
  • CSMR - Sunday, July 3, 2011 - link

    What do you mean, image quality comparison? If two graphics cards differ in image quality, one of them does not work and needs bug fixes. So your question is really: are there any bugs in the graphics output of this "APU"?

    Unfortunately, sites often do "image quality comparisons", but it is nonsense, actually marketing nonsense.
  • Musafir_86 - Tuesday, July 5, 2011 - link

    -Image Quality (IQ) here means the rendered 3D images of 3D games, the latest of which include Crysis 2, DiRT 3, etc.

    -In most games, different quality levels (settings like low, medium, high) are provided for scalability reasons (so they can cater to a wider range of customers).

    -Different GPUs use different algorithms/techniques, or 'tweaks', to squeeze maximum performance at a given quality setting. In applying those tweaks, the rendered output sometimes differs from one GPU to another even though the quality level is the same (e.g. High vs. High). Driver maturity is also another contributing factor. (A crude way to check this is sketched after this comment.)

    -FYI, AnandTech HAS provided IQ comparisons before, especially when comparing new Radeon and GeForce generations.

    -BTW, I hope the IQ comparison promised by Ganesh will be available soon.

    Regards.
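At its simplest, the comparison described above boils down to diffing two screenshots of the same scene captured on each GPU. Here is a minimal sketch of that idea; it assumes Pillow is installed, that both captures are the same resolution, and the file names are hypothetical stand-ins for real captures:

```python
# Hypothetical sketch of a crude IQ comparison: diff two same-scene,
# same-resolution screenshots captured on different GPUs.
from PIL import Image, ImageChops

# Made-up file names for illustration.
a = Image.open("radeon_high.png").convert("RGB")
b = Image.open("geforce_high.png").convert("RGB")

# Per-pixel absolute difference between the two renders.
diff = ImageChops.difference(a, b)

# Mean absolute per-channel error: 0 means pixel-identical output.
pixels = list(diff.getdata())
mae = sum(sum(p) for p in pixels) / (3 * len(pixels))
print(f"Mean absolute difference: {mae:.2f} (0-255 scale)")

diff.save("diff.png")  # inspect where the renderers disagree
```

A mean difference of zero means the renderers produced pixel-identical output; anything above that is worth eyeballing in diff.png, since small numeric differences may still be invisible in practice.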
  • ckryan - Thursday, June 30, 2011 - link

    Like the mobile version, Llano on the desktop is actually kinda impressive. Not necessarily for its current performance, but rather as an indication of what to expect from Trinity. With a significant performance boost to both halves of the chip, I can easily envision my next laptop using Llano's successor.

    While it's impossible not to compare these parts to Sandy Bridge, it's not like a Phenom II X4 is hopelessly obsolete. Llano's desktop power figures are pretty good, but it seems like the 2105 draws way more power than it needs at idle -- my i5 2500K/P67 with a GTX 460 pulls 49W from the wall according to my P3 Kill A Watt.

    Still, I'm glad to see AMD's Fusion initiative paying some dividends. It's good to have options, but I think it's time they stopped playing hard to get with Bulldozer.
  • L. - Thursday, June 30, 2011 - link

    Hey... I was just thinking: considering the scaling, would you run those tests again with a decent memory kit?

    I mean, most people buying Llano will probably get 1866 at the price of 1600, considering usual RAM market trends (1333 is dead / rising in price; 1600 is the bottom price, and many kits actually do much more than their SPD; 1866 will be the bottom within 3 months, I guess; etc.).

    So, what about some really neat kits @ 2.2+, or even relatively cheap ones around 2GHz?

    I'm pretty sure that's where Llano will start making sense (or... with quad-DDR5, 8 times the bandwidth should do the trick, right?). The rough bandwidth math is sketched below.
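Theoretical peak DDR3 bandwidth scales linearly with transfer rate, so the gains in question are easy to put numbers on. A minimal sketch, assuming dual-channel operation with 64-bit (8-byte) channels; these are theoretical peaks, and real-world throughput is lower:

```python
# Minimal sketch: theoretical peak bandwidth of dual-channel DDR3.
# Assumes two 64-bit (8-byte) channels; real throughput is lower.

def ddr3_peak_gbs(transfer_rate_mts, channels=2, bytes_per_channel=8):
    """Peak bandwidth in GB/s = transfers/s x bytes per transfer x channels."""
    return transfer_rate_mts * 1e6 * bytes_per_channel * channels / 1e9

for speed in (1333, 1600, 1866, 2133):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s peak")
```

Going from DDR3-1333 to DDR3-1866 is roughly a 40% jump in peak bandwidth (about 21.3 GB/s to 29.9 GB/s), which is why a bandwidth-starved IGP scales so well with memory speed.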
  • L. - Thursday, June 30, 2011 - link

    I forgot one detail: there is no benchmark showing how that 6550D fares when you add a bit of shiny to your gfx settings. Is it really so pitiful that no one would ever consider pressing the button?
  • mino - Sunday, July 3, 2011 - link

    i* cannot handle that, so writing such things is a big no-no.
  • jjj - Thursday, June 30, 2011 - link

    No overclocking section :(
    I know the ASRock A75 Extreme6 review covers it a bit, but not nearly enough (it doesn't even tell us idle voltages).
    I was curious to see how it overclocks in a few scenarios:
    - with a discrete GPU (IGP fully sleeping)
    - with 2 CPU cores disabled (if that's even possible)
    - with the GPU starved by limiting the amount of RAM it gets
    - how far it can be underclocked
  • beginner99 - Thursday, June 30, 2011 - link

    Marketed for years as something special (APU), but seriously, it's nothing special at all. Intel was 1.5 years earlier (Arrandale). Agreed, the power of the IGP was pretty bad, but still...

    This is basically only usable in the mobile version. And there it ain't too bad, especially in terms of power consumption. Considering AMD's mobile track record in the last couple of years, I would say it's a pretty good comeback. And this was obviously its main target.
