Shader Overclocking

We were able to really push the shader core on our GTX 275, hitting almost an 18% increase in clock speed with our 1656MHz shader clock. This is a very large overclock for an already highly clocked part of the hardware. While our core clock couldn't quite keep pace with other overclockers' results, our shader overclock made up for it and was quite high compared to what we've seen elsewhere on stock cooling.
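As a quick sanity check, the "almost 18%" figure follows from the GTX 275's reference shader clock. This is a minimal sketch; the 1404MHz stock value is NVIDIA's reference spec, not a number from our test card's BIOS:

```python
# Percent overclock from stock vs. overclocked shader clocks.
stock_shader_mhz = 1404  # GTX 275 reference shader clock
oc_shader_mhz = 1656     # shader clock reached in this article

overclock_pct = (oc_shader_mhz / stock_shader_mhz - 1) * 100
print(f"Shader overclock: {overclock_pct:.1f}%")  # Shader overclock: 17.9%
```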

In shader-heavy applications we should see a significant benefit from this, but the reality of the situation is a little disappointing.




[Graph: performance results at 1680x1050, 1920x1200, and 2560x1600]


At best we see about a 7.75% improvement in Age of Conan at 2560x1600. This doesn't come near our 18% theoretical maximum. Most of our other tests don't even see the kind of gains they got from a much more modest boost in core clock speed. In fact, in a couple of cases it makes more sense to overclock the memory than the shader core.
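To put the best case in perspective, we can express the realized gain as a fraction of the clock increase. The numbers below are from this article; the "scaling efficiency" framing is ours, not a standard benchmark metric:

```python
# How much of the theoretical shader clock gain showed up as real performance?
clock_gain_pct = 17.9  # shader clock increase over stock
fps_gain_pct = 7.75    # best-case improvement (Age of Conan, 2560x1600)

efficiency = fps_gain_pct / clock_gain_pct
print(f"Scaling efficiency: {efficiency:.0%}")  # Scaling efficiency: 43%
```

In other words, even in the most shader-bound test less than half of the clock headroom translated into frame rate.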

If we take each clock domain separately, the prospects for good performance improvement out of the GTX 275 don't look that great. But, even more so than with the 4890, combining the overclocks properly can yield large gains in realized performance.

43 Comments

  • balancedthinking - Thursday, June 4, 2009 - link

    Actually, you can save as much as 40W(!) at idle when you underclock a 4890/4870. The important part is to underclock the GDDR5, which automatically adjusts voltage too.
    http://www.computerbase.de/news/hardware/grafikkar...">http://www.computerbase.de/news/hardwar...ten/ati/...
  • 7Enigma - Friday, June 5, 2009 - link

    Any way you could translate the 4870 portion? Babelfish is not working for me for some reason....
  • jeramhyde - Thursday, June 4, 2009 - link

    great article thanks :-) good work with the graphs and setting it out in an easy to follow way :-)

    my housemate just got a gtx275 today, so we'll be trying those overclocks tonight :-)
  • nvalhalla - Thursday, June 4, 2009 - link

    Alright, I just spent the last 10 minutes looking for a 900 shader 4980 before I realized you meant a 900MHz 4890. It's wrong in every graph.
  • DerekWilson - Thursday, June 4, 2009 - link

    It's not wrong, it's just noted in a different way.

    like I say (702 core) for the GTX 275 with a 702MHz core clock, i say (900 core) for a 4890 with a 900MHz core clock.

    I'm sorry for the confusion, but the graphs were already so crowded that I wanted to save space as much as I could.
  • nvalhalla - Thursday, June 4, 2009 - link

    no, you're not getting me. It's listed as a 4980, not a 4890. I thought it was a new card, the "NEXT TIER" if you will. The 900 I thought might be a reference to the number of SP. Once I realized you just transposed the numbers, I got the 900 was MHz.
  • DerekWilson - Thursday, June 4, 2009 - link

    oooooooooooohhhhhhhhh ... okay ... that's a fun typo. I can't really do search and replace on these graphs either. I'll try and get it cleaned up as soon as I can.

    thanks for clarifying.
  • walp - Thursday, June 4, 2009 - link

    Very nice article as always! :)

    The GTX275 and 4890 are really evenly matched in every way (price, performance, overclocking potential, etc.) except for the fact that the 4890 can be used with the $19 Accelero S1:

    http://translate.google.com/translate?prev=hp&...">http://translate.google.com/translate?p...mp;sl=sv...

    , which makes it totally quiet and cool. Just watch those VRM-temperatures and you will be just fine!

    This is the main reason why I chose the 4890 over the GTX275, and the fact that I had a CF-compatible motherboard.

    By the way, why didn't you include the 4890 (1/1.2GHz) idle power draw? Or is it just a little typo? :)

    Request: GTX275 SLI vs 4890 CF (And overclocking bonanza! :)))))

    \walp
  • SiliconDoc - Monday, June 22, 2009 - link

    I can't help but use google and check the very first site that comes up:
    http://www.techspot.com/review/164-radeon-4890-vs-...">http://www.techspot.com/review/164-radeon-4890-vs-...
    --
    and the GTX275 beats the TAR out of the 4890 !!!
    --
    I guess derek has once again used some biased bench or the special from the manufacturer 4890, and then downclocked the GTX275 to boot.
    WHAT A CROCK !
  • SiliconDoc - Saturday, June 6, 2009 - link

    It's hilarious - the extravaganza overclock they do here for Nvidia can't even match the stock timings of a common EVGA.
    http://www.newegg.com/Product/Product.aspx?Item=N8...">http://www.newegg.com/Product/Product.aspx?Item=N8...
    -
    core 713 derek weakness red rooster nvidia hate 703

    So what AT has done here is put the 4890 max overclocked in their other article against a low end stock gtx275, then in their extravaganza overclock gtx275 they put up a pathetic overclock that is BEATEN BY An EVGA gtx275 as it arrives !
    Worse yet, they jammed their ati 4890 maxxxxx overclocks in for comparison !
    ------------
    How to be a completely biased load of crap by Derek and AT :

    max overclock your ati 4890 and put stock gtx275's against it
    weak overclock your gtx275 and put maxx overclock 4890's against it
    ___

    ROFLMAO - I HAVE TO LAUGH IT'S SO BLATANT AND PATHETIC.
    ---
    cocka doodle doooo ! cocka doodle dooo ! red rooster central.
