Bringing it All Together: Everything OC'd

So core, shader, and memory overclocking didn't produce dramatic results on their own, but when we put them all together we get quite a different picture. It is hard to set an upper limit on the maximum performance improvement when we are faced with several performance-limiting factors that can all interact, and throwing more factors in only complicates things further. I'm not a statistician or mathematician, but it stands to reason that we could never see a performance improvement greater than the product of the separate percent improvements to each subsystem (i.e., overclocked performance must be less than stock performance × 1.11 × 1.143 × 1.179).

The actual limit is lower than the roughly 50% potential gain implied by this, as there is no way for every subsystem to deliver its maximum benefit to overall performance simultaneously: gaining the maximum benefit requires that a subsystem be the sole significant bottleneck. I'm not sure how to model anything this complex, especially considering the fact that the performance of any one subsystem affects the efficiency of the other two. Please feel free to school me in the comments on this one.
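As a quick sanity check, the product-of-gains ceiling described above is simple arithmetic. This is just a sketch, assuming the 11%, 14.3%, and 17.9% figures are the best gains from the separate core, shader, and memory overclocking tests:

```python
# Theoretical ceiling on combined overclocking gains: treat the
# separate per-subsystem improvements as independent multiplicative
# factors. This is an idealization -- in practice only one subsystem
# can be the dominant bottleneck at a time, so real gains fall short.
core_gain, shader_gain, memory_gain = 0.11, 0.143, 0.179

upper_bound = (1 + core_gain) * (1 + shader_gain) * (1 + memory_gain)
print(f"theoretical ceiling: {(upper_bound - 1) * 100:.1f}% over stock")
# prints "theoretical ceiling: 49.6% over stock"
```

This is where the "roughly 50%" figure comes from; the 30%-ish best cases in the graphs below show how much of that ceiling is actually reachable.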

But the proof that you can get huge returns on overclocking is in the pudding.




[Performance graphs at 1680x1050, 1920x1200, and 2560x1600 appeared here.]


Call of Duty and Race Driver GRID get over a 30% boost at 1680x1050 when everything is overclocked simultaneously. Everything else sees respectable gains at 1680x1050 as well, while these huge boosts shrink at higher resolutions. An overall gain of 10% to 15% at 2560x1600 isn't too shabby at all, but it doesn't live up to the potential we clearly see in some of our other tests.

The complexity of the factors that go into these performance differences deserves a little more investigation. So we'll look at a few more tests before we throw out our raw numbers.

43 Comments

  • balancedthinking - Thursday, June 4, 2009 - link

Actually, you can save as much as 40W(!) at idle when you underclock a 4890/4870. The important part is to underclock the GDDR5, which automatically adjusts the voltage too.
    http://www.computerbase.de/news/hardware/grafikkar...">http://www.computerbase.de/news/hardwar...ten/ati/...
  • 7Enigma - Friday, June 5, 2009 - link

Any way you could translate the 4870 portion? Babelfish is not working for me for some reason....
  • jeramhyde - Thursday, June 4, 2009 - link

    great article thanks :-) good work with the graphs and setting it out in an easy to follow way :-)

    my housemate just got a gtx275 today, so we'll be trying those overclocks tonight :-)
  • nvalhalla - Thursday, June 4, 2009 - link

    Alright, I just spent the last 10 minutes looking for a 900 shader 4980 before I realized you meant a 900MHz 4890. It's wrong in every graph.
  • DerekWilson - Thursday, June 4, 2009 - link

    It's not wrong, it's just noted in a different way.

Like I say (702 core) for the GTX 275 with a 702MHz core clock, I say (900 core) for a 4890 with a 900MHz core clock.

I'm sorry for the confusion, but the graphs were already so crowded that I wanted to save as much space as I could.
  • nvalhalla - Thursday, June 4, 2009 - link

    no, you're not getting me. It's listed as a 4980, not a 4890. I thought it was a new card, the "NEXT TIER" if you will. The 900 I thought might be a reference to the number of SP. Once I realized you just transposed the numbers, I got the 900 was MHz.
  • DerekWilson - Thursday, June 4, 2009 - link

    oooooooooooohhhhhhhhh ... okay ... that's a fun typo. I can't really do search and replace on these graphs either. I'll try and get it cleaned up as soon as I can.

    thanks for clarifying.
  • walp - Thursday, June 4, 2009 - link

    Very nice article as always! :)

The GTX275 and 4890 are really evenly matched in every way (price, performance, overclocking headroom, etc.) except for the fact that the 4890 can be used with the $19 Accelero S1:

    http://translate.google.com/translate?prev=hp&...">http://translate.google.com/translate?p...mp;sl=sv...

, which makes it totally quiet and cool. Just watch those VRM temperatures and you will be just fine!

This is the main reason why I chose the 4890 over the GTX275, and the fact that I had a CF-compatible motherboard.

By the way, why didn't you include the 4890 (1/1.2GHz) idle power draw? Or is it just a little typo? :)

    Request: GTX275 SLI vs 4890 CF (And overclocking bonanza! :)))))

    \walp
  • SiliconDoc - Monday, June 22, 2009 - link

    I can't help but use google and check the very first site that comes up:
    http://www.techspot.com/review/164-radeon-4890-vs-...">http://www.techspot.com/review/164-radeon-4890-vs-...
    --
    and the GTX275 beats the TAR out of the 4890 !!!
    --
    I guess derek has once again used some biased bench or the special from the manufacturer 4890, and then downclocked the GTX275 to boot.
    WHAT A CROCK !
  • SiliconDoc - Saturday, June 6, 2009 - link

    It's hilarious - the extravaganza overclock they do here for Nvidia can't even match the stock timings of a common EVGA.
    http://www.newegg.com/Product/Product.aspx?Item=N8...">http://www.newegg.com/Product/Product.aspx?Item=N8...
    -
    core 713 derek weakness red rooster nvidia hate 703

    So what AT has done here is put the 4890 max overclocked in their other article against a low end stock gtx275, then in their extravaganza overclock gtx275 they put up a pathetic overclock that is BEATEN BY An EVGA gtx275 as it arrives !
    Worse yet, they jammed their ati 4890 maxxxxx overclocks in for comparison !
    ------------
    How to be a completely biased load of crap by Derek and AT :

    max overclock your ati 4890 and put stock gtx275's against it
    weak overclock your gtx275 and put maxx overclock 4890's against it
    ___

ROFLMAO - I HAVE TO LAUGH IT'S SO BLATANT AND PATHETIC.
    ---
    cocka doodle doooo ! cocka doodle dooo ! red rooster central.
