AMD’s Integrated HD 4290 vs. Intel Integrated HD Graphics

As I mentioned at the start of this article, AMD’s 890GX graphics perform identically to the 790GX’s integrated graphics; beyond the new model number (and checkbox additions like DirectX 10.1 and UVD2 support), nothing changes the performance picture. Earlier this year, Intel introduced its biggest leap to date in integrated graphics performance with Intel HD Graphics.

Found only on the physical CPU package of Core i5, i3 and Pentium G-series CPUs, Intel HD Graphics is the first step towards on-die graphics integration. Today, Intel uses graphics performance as a differentiator between CPU price points: depending on which CPU you buy, you get Intel’s HD Graphics clocked at 533MHz, 733MHz or 900MHz. In order to provide the clearest picture, I’ve included results from all Clarkdale CPUs.

AMD’s graphics core still lives on the motherboard, and thus all 890GX boards run their Radeon HD 4290 at the same 700MHz clock. To show the influence of the CPU on integrated graphics performance, I’ve included results from both a Phenom II X4 965 and an Athlon II X2 255. Since I’ve already shown that there’s no difference in performance between the 790GX and 890GX, I’m only using 890GX numbers in these charts.

Many of you have asked us to compare integrated graphics to low-end discrete graphics. For that purpose I’ve included results with a Radeon HD 5450.

Batman: Arkham Asylum

Intel technically has the same performance as AMD does, but it requires a $210 CPU to get it. If you look at the more mainstream Core i5s and i3s, they use the slower 733MHz graphics clock and aren’t as fast as the 890GX’s 4290.

Dragon Age Origins

We see a similar story under Dragon Age. With a Core i5 661 you get clearly better performance than AMD; pick any of the other i3s or i5s and you’re about equal. Pair the 890GX with a more reasonable AMD CPU, though, and Intel takes a small lead.

Dawn of War II

The Core i5s all post roughly the same numbers as the 890GX; opt for a more reasonable AMD processor and the Core i5s pull ahead. Compared to the Core i3s, it boils down to what CPU you have: with a quad-core AMD you’re probably faster on the 890GX, and with a dual-core AMD you’re probably faster with Intel.

The fact that picking the winner is this conditional means that Intel has come a long way in integrated graphics performance.

Modern Warfare 2

AMD has the clear victory here. The 890GX’s integrated graphics are a good 35% faster than Intel’s fastest.

BioShock 2

BioShock 2 is a pain to play on either platform, even at the lowest settings; upgrading to a $50 graphics card makes a huge difference in playability. Intel’s Core i5 661 is technically the fastest, followed by the 890GX with a Phenom II X4 965, then the rest of the i3/i5 lineup, and finally the 890GX with an Athlon II.

World of Warcraft

I had to drop the quality settings in my WoW test from Good to Fair in order to get something playable on these IGPs. That doesn’t change the results, though: Intel loses out on another key benchmark. WoW is noticeably faster on the 890GX, regardless of CPU.

HAWX

AMD scores another win in HAWX.

Overall Intel has significantly improved its integrated graphics, but it loses on a couple of key benchmarks (Modern Warfare 2 and World of Warcraft). I’d call Intel HD Graphics in the same league as AMD’s 890GX (something AMD could have avoided by actually increasing performance this generation), but when AMD wins it tends to be by a larger margin.

The 890GX also benefits from the much wider compatibility testing and optimization done for AMD’s other GPUs, which makes it the safer bet if your goal is simply being able to run every game.

I do have to say, though, that the fact that a $50 video card can offer roughly twice the performance of either integrated graphics solution is telling. Integrated graphics can be useful for playing a small subset of games, but you still need a discrete solution to have fun.

Power Consumption

While we're presenting power consumption data here, it's not a very balanced comparison. Our H55 test motherboard is a micro-ATX board without any fancy features like USB 3.0, and it uses scaled-down power delivery. Our 890GX board, however, is a fully equipped ATX board from Gigabyte and can easily draw another 10W+ over a pared-down micro-ATX design.

With nearly identical motherboards, we would expect x264 playback power consumption to be relatively similar between the Intel and AMD integrated platforms. It's also not much of a stretch to imagine an AMD system with similar idle power, given how close the Athlon II X2 255 is to the H55 setup. The only situation where AMD doesn't stand a chance is under 3D gaming load.

Dual Graphics Support

AMD’s 890GX chipset supports two PCIe x8 slots for running cards in CrossFire mode, but you also have one more option for flexibility. If you have a Radeon HD 5450 and happen to be running one of the following games: Battleforge, BioShock, Company of Heroes, HAWX, World in Conflict or 3DMark Vantage, then the 890GX’s IGP will work in tandem with the 5450.

Unfortunately I could not get Dual Graphics mode working on my testbed. AMD is calling this a preview at this point (hence the limited hardware and software support), so we may see a more extensive rollout in the future.

Generally, after a certain point it doesn’t make sense to use the integrated graphics core to boost performance. With 40 SPs the integrated Radeon HD 4290 can add 50% to the core count of a 5450. But go up to a 5670 and the 890GX can offer only 10%, and even less if you take into account clock frequencies.
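To put rough numbers on that scaling argument, here's a quick back-of-the-envelope sketch in Python. The 4290's 40 SPs at 700MHz come straight from this article; the 5450 (80 SPs at roughly 650MHz) and 5670 (400 SPs at roughly 775MHz) figures are the commonly quoted specs for those cards and are assumptions here, and real-world hybrid scaling will of course fall well short of these theoretical ceilings.

# Rough sketch: how much theoretical shader throughput the 890GX's IGP could
# add to a discrete Radeon in a hybrid setup. SP counts and clocks for the
# discrete cards are assumed, commonly quoted specs, not measured values.
igp = {"name": "Radeon HD 4290 (890GX IGP)", "sps": 40, "clock_mhz": 700}

discrete_cards = [
    {"name": "Radeon HD 5450", "sps": 80, "clock_mhz": 650},
    {"name": "Radeon HD 5670", "sps": 400, "clock_mhz": 775},
]

def added_throughput_pct(card, igp):
    """Theoretical gain from the IGP, weighting stream processors by clock."""
    return 100.0 * (igp["sps"] * igp["clock_mhz"]) / (card["sps"] * card["clock_mhz"])

for card in discrete_cards:
    by_sp_count = 100.0 * igp["sps"] / card["sps"]
    clock_weighted = added_throughput_pct(card, igp)
    print(f'{card["name"]}: +{by_sp_count:.0f}% by SP count, '
          f'+{clock_weighted:.0f}% once clocks are factored in')

Run it and the 5450 picks up roughly 50% more theoretical throughput while the 5670 gains under 10%, which is why the pairing only makes sense at the very bottom of the discrete lineup.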

The 890GX + 5450 (or similar class hardware) would seem to be the only combination that makes sense.

Comments

  • semo - Tuesday, March 2, 2010

    Hi Anand,

    I noticed something strange with my Sharkoon device ( http://www.sharkoon.com/html/produkte/docking_stat... ) recently and I thought you might find it interesting. I had a 3.5" SATA drive connected to it and I just switched off my PC. When it turned off, the case fans kept spinning (with the CPU fan spinning up and down constantly) and the card reader/temp sensor turned red and started beeping. It kept doing this even after I pulled the power cord. After 10 minutes of looking around and head scratching I remembered that the Sharkoon has its own power going to it. Unplugged that and the PC shut down. I don’t think I’ve seen this anywhere else, but the Sharkoon’s USB power is actually bidirectional (on the DriveLink at least). That doesn’t usually happen I think, and maybe different motherboards won’t like this.

    That’s unfortunate that your C300 died. I wonder why, when a non-essential device like a drive fails, the system doesn’t POST. There shouldn’t ever be such a condition (I actually have one SATA drive that does that). Something as simple as removing the DVD drive belt can cause the system to fail to POST, or at least take much longer to do so. Why, when the thing is not essential?
    Also, do you have an explanation for why the Vertex LE has such good write performance compared to read? I’d have assumed you’d expect the opposite from NAND flash.

    Looking forward to the full review of the 890FX; hopefully it will be more polished!
  • SunLord - Tuesday, March 2, 2010

    That has to be a defect... There is no way anyone would design something to send power into a computer via a USB port; it would cause all kinds of bad voodoo for the system.
  • SunLord - Tuesday, March 2, 2010

    Why does the index indicate that the 890GX is DirectX 10.1 and has UVD2 while the 790GX is DX10 and UVD1, if they are exactly the same? Is the index wrong, or do these changes require no hardware tweaks?
  • Anand Lal Shimpi - Tuesday, March 2, 2010

    I've updated the article a bit. The move from DX10 to 10.1 in AMD's case didn't require much of a change. Technically the 890GX is more like a 785G/790GX hybrid. Either way, performance is identical between all of the cores clocked at 700MHz.

    Take care,
    Anand
  • SunLord - Wednesday, March 3, 2010

    Put some active cooling on it and overclock it!
  • psychobriggsy - Tuesday, March 2, 2010

    At least the southbridge is better featured, with SATA3 and GigE, even if the former wasn't really tested in this review, and the latter wasn't utilised.

    Shame that AMD didn't bump the shader count to, say, 60; it would have made a massive difference now that Intel has actually put some effort into its recent attempts. Then again, the Cedar 5450 should have had 160 shaders in my opinion to make it a reasonable low-end purchase.

    Clearly it's a tide-over chipset until Llano changes everything.
  • nice123 - Tuesday, March 2, 2010

    Sadly they can't boost it to 60 because the shaders are arranged in blocks of 40 - the next step up is 80, which is of course Radeon 5450 territory, since they decided not to add any more shaders to that card and kept its performance exactly the same as the 4550's.
  • fiki959 - Tuesday, March 2, 2010

    I am a little disappointed with the new chipset. But there may be a reason AMD didn't improve IGP performance: doing so would probably hurt Radeon 5450 sales. An improvement of 30-50% would bring the IGP very close to low-end dedicated cards, so maybe that is the main reason, or maybe staying with the 55nm process had something to do with the decision. I don't know.

    By the way, I see some Athlon II laptops in my country; a review of these CPUs would be appreciated.
  • shrihara - Tuesday, March 2, 2010

    If it has USB 3.0, which is backward compatible, then what is the need of having USB 2.0 along with it? I was hoping the AMD 890 would come with only USB 3.0 on board, like it did with SATA 6Gbps.
  • strikeback03 - Tuesday, March 2, 2010

    Because, for whatever reason, AMD didn't include USB3, and they didn't want to spend the money/PCIe lanes on a bunch of external USB3 controllers?
