Pulling it Back Apart: Performance Interactions

Rather than test every combination of clock speeds and look at scaling as we did in our Radeon HD 4890 overclocking article, we wanted a streamlined way to get a better idea of how combined clock domain overclocking could help. Our solution was to add only one test configuration and use multiple comparison points to get a better sense of the overall impact of changing multiple clocks at a time.

Testing our hardware while overclocking both the core clock and the shader clock gives us four more key comparisons that fill in the gaps between what we've already seen and how the different aspects of the hardware interact with each other. First, and most obviously, we can see how much performance improvement we get beyond stock when overclocking both core and shader clocks.




[Graph: performance improvement over stock with both core and shader clocks overclocked — 1680x1050, 1920x1200, 2560x1600]


We see a larger performance improvement from overclocking both of these clocks at the same time than from overclocking either one alone. We can break this down into two components in order to answer two different questions: how much faster does overclocking the shaders make the GPU when the core is already overclocked, and how much faster does overclocking the core make the GPU when the shaders are already overclocked? These two graphs are closely related, but they can further help inform how to balance your overclock on NVIDIA hardware.
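For reference, the incremental comparisons we draw here boil down to simple percentage arithmetic. Here's a minimal sketch with made-up FPS numbers (purely illustrative, not taken from our data) showing how the shader overclock's gain is computed against a stock core versus an already-overclocked core:

```python
# Hypothetical frame rates for one game at one resolution (invented numbers,
# used only to illustrate how the comparisons are computed).
fps = {
    "stock":       60.0,  # core, shader, and memory at stock
    "core_oc":     64.5,  # core overclocked only
    "shader_oc":   63.0,  # shader overclocked only
    "core+shader": 68.5,  # core and shader both overclocked
}

def pct_gain(new, base):
    """Percent improvement of `new` over `base`."""
    return 100.0 * (new / base - 1.0)

# Shader overclock gain measured against a stock core...
shader_gain_stock_core = pct_gain(fps["shader_oc"], fps["stock"])
# ...versus against a core that is already overclocked.
shader_gain_oc_core = pct_gain(fps["core+shader"], fps["core_oc"])

print(f"shader OC gain, stock core: {shader_gain_stock_core:.1f}%")
print(f"shader OC gain, OC'd core:  {shader_gain_oc_core:.1f}%")
```

With these particular (invented) numbers the shader overclock is worth more on the overclocked core, which is the pattern we describe below.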




[Graph: gain from the shader overclock with the core already overclocked — 1680x1050, 1920x1200, 2560x1600]





[Graph: gain from the core overclock with the shaders already overclocked — 1680x1050, 1920x1200, 2560x1600]


If we look back and compare the additional performance improvement from increasing either the core or shader clock while the other is at its maximum, we can get a good idea of how scaling changes in this different context. In fact, we see that increasing the shader clock generally has a larger impact when the core is already overclocked than when the core is at stock speed. This could indicate that an increased core clock alleviates some bottleneck on the shader hardware, giving it more breathing room.

We see a similar relationship between core scaling with everything else at stock and core scaling with overclocked shaders. This could indicate a reciprocal relationship between the core and shader clocks, meaning that users may typically get a larger benefit from overclocking both at the same time rather than pushing one higher at the expense of the other.

The last question we want to answer with this test concerns memory. We saw that overclocking the GTX 275's RAM didn't return much on our investment. But would memory speed have a larger impact on performance when both the core and shaders are overclocked? If the bottleneck for performance scaling with memory overclocking is how fast the GPU can consume data, then we might see a better performance improvement from memory overclocking when the core and shaders are running faster.
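That intuition can be sketched with a toy bottleneck model (our simplification for illustration, not anything measured here; the per-MHz coefficients are invented, though 633MHz core and 1134MHz memory are the GTX 275's stock clocks): effective throughput is limited by whichever of the compute side or the memory side is slower.

```python
# Toy model: throughput is capped by the slower of compute and memory.
# The work_per_mhz / bytes_per_mhz coefficients are made up for illustration.
def throughput(core_mhz, mem_mhz, work_per_mhz=1.0, bytes_per_mhz=0.6):
    compute_rate = core_mhz * work_per_mhz  # how fast the GPU can consume data
    memory_rate = mem_mhz * bytes_per_mhz   # how fast memory can supply data
    return min(compute_rate, memory_rate)   # the slower side is the bottleneck

# At stock clocks this model is compute-bound, so a memory overclock
# alone changes nothing:
print(throughput(633, 1134), throughput(633, 1260))
# With the core pushed to 702MHz, memory becomes the limiter, and the
# same memory overclock now pays off:
print(throughput(702, 1134), throughput(702, 1260))
```

In this model the memory overclock only matters once the core is fast enough to outrun the memory, which is one way the bottleneck-shift explanation could play out.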




[Graph: gain from the memory overclock with core and shader overclocked — 1680x1050, 1920x1200, 2560x1600]


These results are certainly interesting, generally showing less benefit from memory overclocking at 2560x1600 when the GPU is overclocked. We also see less improvement at lower resolutions, where memory performance isn't as large an issue in the first place (it seems to become even less important). But at 1920x1200, overclocking memory has a greater impact when the GPU is fully overclocked. So at lower resolutions, memory speed isn't as important anyway and the GPU overclock delivers the prevailing benefit to overall speed. This makes sense, as does the increased scaling at 1920x1200. But the fact that the performance improvement we can attribute to faster memory at 2560x1600 is lower with faster core and shader clocks is a bit of an enigma.

While these glimpses into scaling give us a better feel for the effects of tweaking different aspects of the chip, it's still not possible from this data to definitively pin down the interactions between core, shader, and memory clock speeds. The benefit to different games depends on their demand for resources, and there's no real formula for knowing what you will get out.

But the thing to take away is that overclocking the GTX 275 should be done with balance between the three clocks in mind. No single aspect is a magic bullet, and NVIDIA has balanced things pretty well already. Maintaining the balance is the key to extracting good performance improvement when overclocking the GTX 275.

That sums up our analysis of overclocking the GTX 275. The following pages present our raw data for those more interested in direct/absolute comparisons.


43 Comments


  • balancedthinking - Thursday, June 4, 2009 - link

    Actually, you can save as much as 40W (!) at idle when you underclock a 4890/4870. The important part is to underclock the GDDR5, which automatically adjusts voltage too.
    http://www.computerbase.de/news/hardware/grafikkar...
  • 7Enigma - Friday, June 5, 2009 - link

    Any way you could translate the 4870 portion? Babelfish is not working for me for some reason....
  • jeramhyde - Thursday, June 4, 2009 - link

    great article thanks :-) good work with the graphs and setting it out in an easy to follow way :-)

    my housemate just got a gtx275 today, so we'll be trying those overclocks tonight :-)
  • nvalhalla - Thursday, June 4, 2009 - link

    Alright, I just spent the last 10 minutes looking for a 900 shader 4980 before I realized you meant a 900MHz 4890. It's wrong in every graph.
  • DerekWilson - Thursday, June 4, 2009 - link

    It's not wrong, it's just noted in a different way.

    like I say (702 core) for the GTX 275 with a 702MHz core clock, i say (900 core) for a 4890 with a 900MHz core clock.

    I'm sorry for the confusion, but the graphs were already so crowded that I wanted to save space as much as I could.
  • nvalhalla - Thursday, June 4, 2009 - link

    no, you're not getting me. It's listed as a 4980, not a 4890. I thought it was a new card, the "NEXT TIER" if you will. The 900 I thought might be a reference to the number of SP. Once I realized you just transposed the numbers, I got the 900 was MHz.
  • DerekWilson - Thursday, June 4, 2009 - link

    oooooooooooohhhhhhhhh ... okay ... that's a fun typo. I can't really do search and replace on these graphs either. I'll try and get it cleaned up as soon as I can.

    thanks for clarifying.
  • walp - Thursday, June 4, 2009 - link

    Very nice article as always! :)

    GTX275 and 4890 is really evenly matched in every different way(Price, performance, overclocking performance etc) except for the fact that 4890 can be used with the 19$ Accelero S1:

    http://translate.google.com/translate?prev=hp&...

    , which makes it totally quiet and cool. Just watch those VRM-temperatures and you will be just fine!

    This is the main reason why I chosed the 4890 over GTX275, and the fact that I had a CF-compatible motherboard.

    By the way, why didnt you include the 4890 (1\1.2)GHz idle power draw? Or is it just a little type-o? :)

    Request: GTX275 SLI vs 4890 CF (And overclocking bonanza! :)))))

    \walp
  • SiliconDoc - Monday, June 22, 2009 - link

    I can't help but use google and check the very first site that comes up:
    http://www.techspot.com/review/164-radeon-4890-vs-...
    --
    and the GTX275 beats the TAR out of the 4890 !!!
    --
    I guess derek has once again used some biased bench or the special from the manufacturer 4890, and then downclocked the GTX275 to boot.
    WHAT A CROCK !
  • SiliconDoc - Saturday, June 6, 2009 - link

    It's hilarious - the extravaganza overclock they do here for Nvidia can't even match the stock timings of a common EVGA.
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    -
    core 713 derek weakness red rooster nvidia hate 703

    So what AT has done here is put the 4890 max overclocked in their other article against a low end stock gtx275, then in their extravaganza overclock gtx275 they put up a pathetic overclock that is BEATEN BY An EVGA gtx275 as it arrives !
    Worse yet, they jammed their ati 4890 maxxxxx overclocks in for comparison !
    ------------
    How to be a completely biased load of crap by Derek and AT :

    max overclock your ati 4890 and put stock gtx275's against it
    weak overclock your gtx275 and put maxx overclock 4890's against it
    ___

    ROFLMAO - I HAVE TO LUAGH IT'S SO BLATANT AND PATHETIC.
    ---
    cocka doodle doooo ! cocka doodle dooo ! red rooster central.
