Core Overclocking

When G80 arrived (the first NVIDIA GPU to employ a separate clock domain for its shaders), any core clock increase silently raised the shader clock along with it. At first this made sense, as NVIDIA only exposed core and memory clock adjustments and the shader clock was not directly adjustable by the end user. Of course, we went to some trouble back then to try our hand at BIOS flashing for shader overclocking. Even after NVIDIA finally exposed a separate shader adjustment, core and shader clocks remained tied together to some degree.

Since the middle of last year, NVIDIA's driver-based clock speed adjustments have been "unlinked," meaning that the shader clock is no longer affected by the core clock as it used to be. This certainly makes things a lot easier for us, and we'll start by testing core clock speed adjustment.

The maximum core clock we could hit on our reference GTX 275 was 702 MHz. Try as we might, we just could not get it stable beyond that speed, but it's still a good enough overclock to get a solid handle on scaling. We know some people have GTX 275 parts that will get up toward 750 MHz, so it is possible to squeeze more speed out of this part. Still, we have an 11% increase in core clock speed (up from the stock 633 MHz), which should net us some decent results.
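
As a quick sanity check on that number, here's a minimal sketch of the arithmetic in Python (the 633 MHz stock clock is the reference GTX 275 specification; the 60 fps baseline is purely illustrative):

    # Back-of-the-envelope math for our GTX 275 core overclock.
    stock_core = 633   # MHz, reference GTX 275 core clock
    oc_core = 702      # MHz, our maximum stable overclock

    headroom = oc_core / stock_core - 1
    print(f"Core overclock: {headroom:.1%}")   # ~10.9%, the 11% quoted above

    # In a purely core-bound game, frame rate scales with core clock,
    # so ~11% is also the theoretical maximum performance gain.
    stock_fps = 60.0   # hypothetical baseline frame rate
    print(f"Ideal scaling: {stock_fps:.1f} -> {stock_fps * (1 + headroom):.1f} fps")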




[Performance scaling charts: gains from the 702 MHz core overclock at 1680x1050, 1920x1200, and 2560x1600]


Call of Duty edges up toward the theoretical maximum but drops off at 2560x1600, which is much more resource intensive. Interestingly, most of the other games see more benefit at the highest resolution we test, gaining over 5% there but generally between 2 and 5 percent at lower resolutions. Far Cry 2 and Fallout 3 seem not to gain as much from core overclocking as our other tests.

It could be that we aren't seeing numbers closer to the theoretical maximum because of a bottleneck either in memory or in the shader hardware. This makes analysis a little more complex than with the AMD part, as there are more interrelated factors: some aspects of a game may be accelerated, but if a significant amount of work is going on elsewhere, we'll still be waiting on one of the other subsystems.
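
To put a rough number on that intuition, here's a minimal Amdahl's-law-style sketch (the core-bound fractions below are hypothetical, chosen only to bracket the 2 to 5 percent gains we measured): only the portion of each frame that is limited by the core clock speeds up, while time spent waiting on memory or shader hardware does not.

    # Amdahl-style model: only the core-bound fraction of frame time
    # scales with a core overclock; the rest waits on other subsystems.
    def expected_gain(core_bound_fraction, clock_gain=0.11):
        sped_up = core_bound_fraction / (1 + clock_gain)  # core-bound time shrinks
        unchanged = 1 - core_bound_fraction               # memory/shader-bound time
        return 1 / (sped_up + unchanged) - 1

    # Hypothetical core-bound fractions:
    for f in (0.25, 0.5, 1.0):
        print(f"{f:.0%} core-bound -> {expected_gain(f):.1%} overall gain")
    # ~2.5% at 25% core-bound, ~5.2% at 50%, 11% when fully core-bound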

Let's move on to the last independent aspect of the chip and then bring it all together.
