AMD’s Integrated HD 4290 vs. Intel Integrated HD Graphics

As I mentioned at the start of this article, AMD’s 890GX graphics are identical in every way (other than model number) to the 790GX integrated graphics. Earlier this year, Intel introduced its biggest leap to date in integrated graphics performance with its Intel HD Graphics.

Found only on the physical CPU package of Core i5, Core i3 and Pentium G-series CPUs, Intel HD Graphics is the first step towards on-die graphics integration. Today, Intel uses graphics performance as a differentiator between various CPU price points: depending on which CPU you buy, you get Intel’s HD Graphics clocked at either 533MHz, 733MHz or 900MHz. In order to provide the clearest picture, I’ve included results from all Clarkdale CPUs.
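The clock-by-model segmentation described above can be written out as a simple lookup. The per-model assignments are taken from Intel’s published Clarkdale specs; any model numbers beyond the i5 661 discussed in this article are my own additions for illustration:

```python
# Clarkdale integrated graphics clock by CPU model (MHz).
# Intel segments the lineup by IGP clock: one 900MHz part,
# a wide 733MHz middle, and 533MHz on the budget Pentium.
GRAPHICS_CLOCK_MHZ = {
    "Core i5 661":   900,  # the lone 900MHz Clarkdale
    "Core i5 660":   733,
    "Core i5 650":   733,
    "Core i3 540":   733,
    "Core i3 530":   733,
    "Pentium G6950": 533,
}

# Relative IGP clock vs. the 890GX's fixed 700MHz Radeon HD 4290:
for cpu, mhz in GRAPHICS_CLOCK_MHZ.items():
    print(f"{cpu}: {mhz}MHz ({mhz / 700:.0%} of the 890GX clock)")
```

Clock alone isn’t a performance predictor across architectures, of course, but it shows why the i5 661 behaves so differently from the rest of the lineup in the charts below.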

AMD’s graphics core still lives on the motherboard, and thus all 890GX boards run their Radeon HD 4290 at the same 700MHz clock. To show the influence of the CPU on integrated graphics performance I’ve included results from both a Phenom II X4 965 and an Athlon II X2 255. As I’ve already shown that there’s no difference in performance between the 790GX and 890GX, I’m only using 890GX numbers in these charts.

Many of you have asked for us to compare integrated graphics to low end discrete graphics. For this specific purpose I’ve included results with a Radeon HD 5450.

Batman: Arkham Asylum

Intel technically has the same performance as AMD does, but it requires a $210 CPU to get it. If you look at the more mainstream Core i5s and i3s, they use the slower 733MHz graphics clock and aren’t as fast as the 890GX’s 4290.

Dragon Age Origins

We see a similar story under Dragon Age. If you have a Core i5 661 you get clearly better performance than AMD; pick any of the other i3s or i5s and you’re about equal. If you pick a more reasonable AMD CPU, Intel takes a small lead.

Dawn of War II

The Core i5s all post roughly the same numbers as the 890GX. If you opt for a more reasonable AMD processor, the gap widens. Now compared to the Core i3s, it boils down to what CPU you have. If you have a quad-core AMD you’re probably faster on the 890GX. If you have a dual-core AMD, you’re probably faster with Intel.

The fact that picking the winner is this conditional means that Intel has come a long way in integrated graphics performance.

Modern Warfare 2

AMD has the clear victory here. The 890GX’s graphics core is a good 35% faster than Intel’s fastest.

BioShock 2

BioShock 2 is a pain to play on either platform, even at the lowest settings. Upgrading to a $50 graphics card makes a huge difference in playability. Intel’s Core i5 661 is technically the fastest followed by the 890GX with a Phenom II X4 965 and then the rest of the i3/i5 lineup and finally the 890GX with an Athlon II.

World of Warcraft

I had to drop the quality settings on my WoW test from Good to Fair in order to get something playable on these IGPs. That doesn’t change the results, though: Intel loses out on another key benchmark here. WoW is noticeably faster on the 890GX, regardless of CPU.

HAWX

AMD scores another win in HAWX.

Overall Intel has significantly improved its integrated graphics, but it loses on a couple of key benchmarks (Modern Warfare 2 and World of Warcraft). I’d call Intel HD Graphics in the same league as AMD’s 890GX (something AMD could have avoided by actually increasing performance this generation), but when AMD wins it tends to be by a larger margin.

The 890GX also benefits from the much wider compatibility testing and optimization done for AMD’s other GPUs, which makes it a safer bet when it comes to being able to run all games.

I do have to say though, the fact that a $50 video card can offer roughly twice the performance of either integrated graphics solution is telling. Integrated graphics can be useful for playing a small subset of games, but you still need a discrete solution to have fun.

Power Consumption

While we're presenting power consumption data here, it's not a very balanced comparison. Our H55 test motherboard is a micro-ATX board without any fancy features like USB 3.0 and uses scaled down power delivery. Our 890GX board however is a fully equipped ATX board from Gigabyte and thus can easily draw another 10W+ over a reduced micro-ATX design.

With nearly identical motherboards, we would expect x264 playback power consumption to be relatively similar between the Intel and AMD integrated platforms. It’s also not much of a stretch to imagine an AMD system with similar idle power given how close the Athlon II X2 255 is to the H55 setup. The only situation where AMD doesn’t stand a chance is under 3D gaming load.

Dual Graphics Support

AMD’s 890GX chipset does support two PCIe x8 slots for running cards in CrossFire mode, but you have one more option for flexibility. If you have a Radeon HD 5450 and happen to be running one of the following games: Battleforge, BioShock, Company of Heroes, HAWX, World in Conflict or 3DMark Vantage, then the 890GX’s IGP will work in tandem with the 5450.

Unfortunately I could not get Dual Graphics mode working on my testbed. AMD is calling this a preview at this point (hence the limited hardware and software support), so we may see a more extensive rollout in the future.

Generally, after a certain point it doesn’t make sense to use the integrated graphics core to boost performance. With 40 SPs the integrated Radeon HD 4290 can add 50% to the core count of a 5450. But go up to a 5670 and the 890GX can offer only 10%, and even less if you take into account clock frequencies.
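The shader-count math above can be sketched out. The 4290’s 40 SPs and 700MHz clock are from this article; the discrete cards’ SP counts and clocks (80 SPs at 650MHz for the 5450, 400 SPs at 775MHz for the 5670) are AMD’s published specs:

```python
# Rough upper bound on what the 890GX IGP could add to a discrete Radeon
# in Dual Graphics mode, weighting SP count by core clock. This ignores
# scaling efficiency entirely -- it's an estimate, not a benchmark.

IGP_SPS, IGP_MHZ = 40, 700  # Radeon HD 4290 on the 890GX

def igp_contribution(card_sps, card_mhz):
    """Fraction of extra raw shader throughput the IGP adds to a card."""
    return (IGP_SPS * IGP_MHZ) / (card_sps * card_mhz)

print(f"HD 5450: +{igp_contribution(80, 650):.0%}")
print(f"HD 5670: +{igp_contribution(400, 775):.0%}")
```

Weighting by clock actually nudges the 5450 figure slightly above 50%, since the 4290 runs a bit faster than the 5450’s 650MHz core, while the 5670 figure drops below 10%.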

The 890GX + 5450 (or similar class hardware) would seem to be the only combination that makes sense.

Comments

  • rachotilko - Thursday, March 25, 2010 - link

    What about RAID offered on desktop boards? What's the point in providing us with 6x6Gb SATA when the NB-to-SB link is capable of delivering a mediocre 2Gb/s? I would be more than happy with the soft/fake RAIDs that come with desktop boards, had there not been this absurd bottleneck. In that case I would be able to build a decent RAID10 or RAID5 (with writeback cache enabled) delivering mind-blowing disk performance. Now I have to buy overpriced Adaptec/3ware/Areca... or switch to Intel. Shame on you, nVidia and AMD.
  • stuhad - Thursday, March 11, 2010 - link

    Thanks for the article Anand, I really appreciate all the work you do in helping the layman understand new technology.

    So anyway, after reading your article on AMD's Radeon HD 5450, I was just wondering whether hybrid graphics could perhaps improve the 5450's deinterlacing ability? And would DX11 still run if you had the IGP and discrete card in hybrid mode, or would you need to run DX10.1?

    Hopefully an updated BIOS will eventually allow you to actually get hybrid graphics working and answer all these questions.

    Thanks again
  • Alouette Radeon - Wednesday, March 10, 2010 - link

    You know, in the last few years I've realized something and I have to admit I was a complete moron. I used to be your typical, stupid, "Intel and nVidia ONLY" computer user. Sure, I had good reason. I'd used Intel CPUs exclusively from the 8088 all the way to the Core 2 Duo. Then when I was working at TigerDirect I saw a HUGE difference in price and I wondered "Could AMD really be that bad?". Well after doing a TON of market research I came to discover that in fact, most of the bad things I'd heard about AMD had been put out by Intel themselves. This was all a part of why the EU and Korea fined them and the US FTC is now charging them. I also had been reading things about ATi's resurgence. Now, understand that ATi was nothing new to me. I'd owned an ATi EGA and then VGA Wonder card in the past. ATi used to RULE graphics and they were expensive but they were also the best. Things started to fall apart because management got stupid and they made some bad products. This was when nVidia grabbed the market by being less expensive and competitive at the same time.

    Now, the situation has changed. I was interested in a Core 2 Quad Q8200 and a GeForce GTX 260. After doing that research I decided to risk it and go with the AMD side of things. I figured they couldn't be THAT bad or they'd be out of business. Well, in the end I bought a Phenom II X4 940 and a Radeon HD 4870. I couldn't be more satisfied with my purchase because not only did I save over $300 on the cost of my rig, now that I've found out about the dishonesty that Intel and nVidia have been involved in, I'm so glad I didn't hand over my dollars to them. Intel and nVidia are a cancer on this industry and until they clean themselves up (which won't happen until AMD is big enough to force them to) those two corporations won't get a PENNY from me. Remember, when the HD 4870 came out, nVidia dropped the prices of the GTX 260 and GTX 280 by a whopping 62%! If they could do that and still make a decent profit, they must have been RAPING the public before that. Sorry nVidia and Intel, you blew it with me and anyone else who has a brain (or at least gives a shit about this industry.)
  • Mr Bill - Saturday, March 6, 2010 - link

    I wonder if the poor performance is simply a matter of driver signaling protocol? Maybe a driver rewrite will increase SSD performance.
  • juampavalverde - Thursday, March 4, 2010 - link

    Looking at the specifications of these two chips, and the previous naming path which AMD chose, with the 790FX, then 790X, then 770, then 780G, and later 790GX (780G at 700MHz with SidePort and CrossFire), and recently 785G, it is clear that today's new northbridge should be called "795GX" (785G at 700MHz with SidePort and CrossFire), because IMHO the southbridge change doesn't justify the addition of 105 to the model number. As far as I remember, when the 790FX dropped the SB600 in favor of the SB700 and later the SB750, there was no change in its name.
  • elerick - Thursday, March 4, 2010 - link

    Any talk of whether or not the 890 platform will support mini-ITX? Last I heard, until we get integrated chipsets on the CPU it would be next to impossible to get AM3 on mini-ITX.

    Thanks,

    Jason
  • amalinov - Wednesday, March 3, 2010 - link

    Anand,
    I greatly appreciate the work you do (recently the SSD articles are great). But in this case I find some inconsistencies/omissions that shouldn't be there IMHO (or I totally misunderstand something):
    1. In the table comparing SB750 with SB850 the "PATA" line shows "2 channels" for both. But SB750 has only 1 channel (of course for 2 devices: master+slave).
    2. About SB850 PATA - from what I see every mainboard with SB850 uses ANOTHER chip for the PATA channel, and this leads me to the conclusion that the AMD PowerPoint slides are partially copy-pasted from SB750 and the "Parallel ATA" is a wrong left-over (do you have explicit confirmation for PATA in SB850? Is there a single mainboard using PATA from SB850?).
    3. SidePort DDR3 - what is the width of the interface (8bit - 128bit)? Supported speeds? (apparently DDR1333). Is it used as a frame buffer or otherwise? Could UMA/shared memory be disabled and the IGP be forced to use SidePort memory only?
    4. Manufacturing technology, die area, transistor count, chip power consumption. Some of these are listed for the AMD 890GX (but not all), but the SB850 is totally in the dark. (I ask for these assuming that AMD doesn't hesitate to share this with the general public.)
    5. Other SB850 features. If I remember the leaked roadmaps correctly, this southbridge was to have integrated clock generator(s), DASH 8051 controllers, improved power management and hardware monitoring features - what happened to these? Also, it was stated there that SB850 would have 4 PCIe x1 ports (in addition to the A-Link III). Now we see only 2. Is this really the total count, or is it only the initial wave of mainboards that does not use the other 2?
    6. SATA/600 - I hope when you get back an SSD with over-300MB/s speeds that you will do a test of SB850 vs. Marvell-on-AMD vs. Marvell-on-Intel (in addition to SB850/300 vs. SB850/600 and the AHCI/MS/etc. driver issues).

    Here start the less important things:
    7. IGP audio - does the 890GX DisplayPort support audio? (And what about mainboards routing one of the USBs to the DisplayPort? The other of the pair could go to a combo eSATA/USB port... but I start wishful thinking here :))
    8. Gigabit Ethernet MAC - I don't see any mainboards using it, but I don't assume it is missing (like the PATA), because most mainboards don't use Intel's MAC either - but do you have any info on whether this will EVER be used (eg. what PHYs are compatible with SB850, etc. - maybe some from the nF590 times?)

    And here are some that could be considered off-topic:
    9. SLI support - after Nvidia started selling "licenses" to mainboard manufacturers for SLI support - are there any announcements of AMD-chipset boards with SLI support?
    10. You mentioned Hybrid CrossFire IGP+GPU briefly and it seems like a "beta", but when it is ready would it support IGP+GPU+GPU (two HD5450s with the 890GX) and IGP+GPU+GPU+GPU (this seems very theoretical, but maybe for some people, if the IGP can increase performance even by 1fps, it is better to use it than waste it... development/testing/certification costs-for-zero-benefit aside)?
    11. USB 3.0 (this is a general note on USB 3.0, not regarding this particular review) - most articles state that USB 3.0 is "10 times USB 2.0" or "like PCIe 2.0" or "4.8" or "5" GT/s (resulting in speeds between 480MB/s for 4.8GT/s with 8b/10b - and up to - 625MB/s for 5GT/s without applying 8b/10b in the calculation). Then we have other overheads, etc. Additionally, it is clear that 3.0 is bi-directional (4 wires) in contrast to 2-wire USB 2.0. I vaguely remember that in some USB.org file I read something that implied that the 4.8 or 5 number applies to the sum of both directions, thus in a single direction USB 3.0 is half that speed. But maybe I misunderstood something here... Anyway, I haven't found a good description of USB 3.0 speed - starting from the raw link speed in one direction, going through 8b/10b, etc., reaching pre-protocol-overhead speed (the equivalent of USB 2.0's 480Mbps/60MB/s), explicitly stating where they double to take into account bi-directional full-duplex capability. So that in the end we know that the USB 3.0 theoretical maximum is xxx MB/s per direction, 2*xxx MB/s total (to make assumptions about the influence of NEC chip-to-southbridge and north/CPU-to-south link limitations). It would be good to have these single-direction/bi-directional and 8b/10b issues sorted out in a nice table with the versions of UATA, SATA, PCI, PCIe, USB 2.0, USB 3.0, QPI, HT - where some will be single-direction only and others bi-directional, in addition to shared (like PCI and UATA)/not-shared types. Hmm, and what about the SATA controllers in the southbridges? Do we have 6x600MB/s in SB850, or 3x600MB/s where there are pairs (primary+secondary ports with masters only) sharing the same 600MB/s link between an internal two-port SATA controller and the chipset "backbone"? What about transfers between SATA ports on the same controller (eg. 1st primary to 1st secondary) or on different controllers (eg. 1st primary to 2nd primary) - do they pass through main memory/a shared SATA controller link/the north-to-south link - or does the SB850 have something like the direct PCIe-to-PCIe switch logic for SLI/CrossFire? But again, I am going off-topic here.

    Pffeu. This turned into a long post. Thank you for reading it and best regards!
  • lplatypus - Wednesday, March 3, 2010 - link

    The 790GX basically does have UVD2 capabilities, contrary to the table on the first page of this article.

    See this forum post from an AMD employee who supports their linux video driver:
    http://www.phoronix.com/forums/showpost.php?p=1056...
  • vol7ron - Tuesday, March 2, 2010 - link

    Great post, but can you overclock these IGPs?
  • Paladin1211 - Wednesday, March 3, 2010 - link

    It's the same IGP as in the 785G, so it can be overclocked as well as the HD 4200.
