Performance Improvement with 8 SIMDs

The maximum theoretical performance difference between the two configurations is 14.3%. We re-ran our tests at 1680x1050 for all games except Crysis and Age of Conan, which we re-ran at 1280x1024. We computed the percent increase between our previous data and the new numbers and plotted them on the following chart. To give a better sense of the significance of the performance increase, we set the maximum on the x-axis to 15 (to reflect the maximum performance increase we could possibly see).
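The 14.3% ceiling falls straight out of the ratio of enabled SIMDs (8 vs. 7), and the charted numbers use the same percent-increase arithmetic. A quick sketch (the helper name here is illustrative, not from our benchmark scripts):

```python
def percent_increase(old, new):
    """Percent increase going from an old score to a new score."""
    return (new - old) / old * 100.0

# Theoretical ceiling: going from 7 enabled SIMDs to 8 can raise
# shader/texture throughput by at most the ratio of SIMD counts.
print(f"{percent_increase(7, 8):.1f}%")  # → 14.3%, hence the chart's axis max of 15
```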

We did re-run all of our numbers, but rather than reporting them all here, we have simply updated our previous article with the corrected data. The corrected results show that there was indeed an impact, but that it wasn't as large as the theoretical maximum. Because the only differences between this card and the 4850 are clock speed and the number of SIMDs, this indicates that while more compute and texture hardware does improve performance, AMD hardware doesn't scale linearly with SP count. NVIDIA hardware comes closer to scaling linearly with SPs, as we've seen in past tests (and as evidenced by the GTX 260 Core 216), but the architecture is a bit different. It is possible that the bottleneck in the games we tested simply lies elsewhere, and that we would see more linear scaling with more shader-intensive code.
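For reference, perfectly linear scaling with shader count would predict the following uplifts (illustrative arithmetic only; the function name is ours):

```python
def linear_scaling_uplift(sp_before, sp_after):
    """Speedup (%) predicted if performance scaled purely with SP count."""
    return (sp_after / sp_before - 1) * 100.0

# HD 4830 review BIOS vs. retail: 7 SIMDs (560 SPs) -> 8 SIMDs (640 SPs)
print(f"{linear_scaling_uplift(560, 640):.1f}%")  # → 14.3%
# GTX 260 -> GTX 260 Core 216: 192 SPs -> 216 SPs
print(f"{linear_scaling_uplift(192, 216):.1f}%")  # → 12.5%
```

Measured results fall short of these figures whenever the bottleneck lies elsewhere (ROPs, memory bandwidth, or the CPU).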

In any case, while the increases are significant (in most cases), the impact on our conclusions isn't huge. It doesn't fundamentally change the class of the hardware.


24 Comments

  • JAKra - Saturday, October 25, 2008 - link

    Hi!

    I thought that these disabled SIMDs were fused off in the ASIC. What if you flash your HD4830 with an HD4850 BIOS, or a modified one enabling all the SIMDs? The PCB looks the same, and they use the same GPU (I think).
    Good old modding days are back: 9800SE to 9800XT, anyone? :D
  • Clauzii - Sunday, October 26, 2008 - link

    Heh, I have that :D Still rockin' :D
  • JAKra - Saturday, October 25, 2008 - link

    Wehehe Marlin1975, you beat me to it. You won. :D
  • DerekWilson - Saturday, October 25, 2008 - link

    we'll have to look into that :-)
  • Marlin1975 - Saturday, October 25, 2008 - link

    Does this mean we might be able to just do a firmware flash and get 4850-level performance??? :)
  • Gary Key - Saturday, October 25, 2008 - link

    I tried that on a retail card, no go.
  • PrinceGaz - Saturday, October 25, 2008 - link

    Did you down-clock the 4850 BIOS so that it was within the limits of the 4830 card? I'm thinking especially about the memory here which would be most likely to have problems at the 4850 clock-speed, unless you've already verified that your 4830 can run at the 4850 core and memory clocks. As an nVidia owner, I don't know what BIOS editing tools are available for AMD cards, but I doubt any have been tailored for the 4830 as it is brand new.

    Even if the 4830 could cope with 4850 clocks, there may well be other tiny but significant changes to the card components which mark it as a 4830 and prevent it being flashed with a 4850 BIOS. A modded 4830 BIOS with more than 8 blocks of 80 SPs enabled would therefore have the best chance of success, and may well be forthcoming now that AMD has inadvertently made BIOSes with both 7 and 8 blocks enabled public, especially if combined with what can be ascertained from the 4850 BIOS. I wouldn't know where to start with BIOS editing (I can work with x86 assembly, but that is child's play compared to understanding proprietary BIOS code). Some people have produced utilities which can edit at least certain parameters in BIOS files, so it's possible a tool to select the number of SIMD blocks could be produced.

    It's probably not worth the effort though, as the 4850 is only slightly more expensive anyway. Even if you could up the SPs from 640 (8 blocks) to 800 (all 10 blocks), then given the performance differences presented in this article from going from 560 (7 blocks) to 640 (8 blocks), you'd be lucky to add more than 10% at best - and that's assuming your card actually has a core with all 10 blocks capable of working correctly. It's not like the good old days of the 9500 non-Pro, which if you were lucky could have its pipelines upped from 4 to 8 (like a 9700 non-Pro), giving a staggering increase in performance.

    It's obvious how AMD messed up with the card samples they sent out. The only plausible explanation is that the 4830 was originally intended to have 560 SPs and the BIOS was designed with that limit. Late in development, serious price drops on 9800GT cards and the like forced AMD to give the 4830 a bit of extra oomph to compete at the price point they had already targeted. Whereas the physical configuration of the hardware can be adjusted pretty much as cores leave the fab (by disabling fewer blocks), it takes longer to modify the software and verify it to a level where they can be confident the card will actually function correctly, so some review samples went out with the older BIOS.

    As such, all 4830 card cores may well be limited to 8 functional blocks (including those samples sent out with a BIOS setting it to 7 blocks), but only AMD insiders would know.
  • Goty - Sunday, October 26, 2008 - link

    Or maybe it was much less complex than that and AMD actually tried more than one configuration during testing and a batch slipped through the cracks once everything was finalized. Y'know, that whole Occam's Razor thing.
  • iwodo - Saturday, October 25, 2008 - link

    ATI deliberately sent out samples with worse results, then released a better BIOS enabling proper performance to catch NVIDIA off guard.
  • SiliconDoc - Monday, January 19, 2009 - link

    How about ATI did this because they are morons? The ATI division just caused another huge chargeout loss for AMD, so there's bad karma at ATI, and this shows exactly why their drivers are so screwy and broken all over the place.
    "We were in a hurry" - can you imagine? Something as ignorant as applying the wrong BIOS - that really takes the cake.
    Congratulations ATI, you screwed up again - and no, your fanboys can't admit it; they think it's a masterful marketing plan to destroy NVIDIA...
    ATI blew it again.
    (They did do very well on this card though; it's the first one from them at its price point that I like. But since there's no CUDA, no PhysX, no driver profiles, and CCC is still bloated (just a bit less now), no, I don't plan on picking one up.)
    Yeah, despite the major, amateur, almost laughable mistake - it makes you wonder if anyone is home or in charge there - this is a nice card for the price right now.
