Overclocking

As we briefly mentioned in the introduction, the GT 240 has a 70W TDP, one we believe was chosen specifically to get as much out of the card as possible without breaking the 75W limit of a PCIe slot. Given the lower core and shader clocks of the GT 240 compared to the GT 220 (not to mention the 8800 GT), it’s reasonable to assume the GPU is capable of a great deal more, so long as you’re willing to throw the 75W limit out the window.

There’s also the matter of the RAM. The DDR3 card is a lost cause anyhow thanks to its low performance and the fact that it’s only 10MHz below its RAM’s rated limit, but the GDDR5 cards have great potential. The Samsung chips on those cards are rated for 4000MHz effective, some 17% more than the stock speed of 3400MHz effective. If the memory bus can hold up, then that’s a freebie overclock.

So with that in mind, we took to overclocking our two GDDR5 cards: the Asus GT 240 512MB GDDR5 and the EVGA GT 240 512MB GDDR5 Superclocked.

For the Asus card, we managed to bring it to 640MHz core, 4000MHz RAM, and 1475MHz shader clock. This is an improvement of 16%, 17%, and 10% respectively. We believe that this card is actually capable of more, but we have encountered an interesting quirk with it.

When we attempt to define custom clock speeds on it, our watt meter shows the power usage of our test rig dropping by a few watts, which under normal circumstances doesn’t make any sense. We suspect that the voltage on the GPU core is being reduced when the card is overclocked; however, there’s currently no way to read the GPU voltage of a GT 240, so we can’t confirm this. It does fit all of our data, though, and makes even more sense once we look at the EVGA card.

For the EVGA card, we managed to bring it to 650MHz core, 4000MHz RAM, and 1700MHz shader clock. This is an improvement of 18%, 11%, and 27% respectively.
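
For reference, here’s a quick sketch of the arithmetic behind these percentages. The stock clocks are back-computed from the figures above (550MHz core / 1340MHz shader / 3400MHz effective memory for a reference GT 240 GDDR5, with the EVGA card shipping its memory at 3600MHz effective), so treat the baselines as inferred rather than measured:

```python
# Sketch of how the quoted overclock percentages are derived. Stock clocks
# are back-computed from the article's own figures, not independently measured.

def oc_gain(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage gain over stock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

cards = {
    "Asus GT 240 GDDR5": {
        "core": (550, 640), "memory": (3400, 4000), "shader": (1340, 1475),
    },
    "EVGA GT 240 Superclocked": {
        "core": (550, 650), "memory": (3600, 4000), "shader": (1340, 1700),
    },
}

for name, domains in cards.items():
    gains = ", ".join(
        f"{domain} +{oc_gain(*clocks):.1f}%" for domain, clocks in domains.items()
    )
    print(f"{name}: {gains}")
# Asus GT 240 GDDR5: core +16.4%, memory +17.6%, shader +10.1%
# EVGA GT 240 Superclocked: core +18.2%, memory +11.1%, shader +26.9%
```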

Unlike with the Asus card, we do not get any anomalous power readings when overclocking the EVGA card, so if our GPU core voltage theory is correct, this would explain why the two cards overclocked so differently. In any case, the significant shader overclock should be quite beneficial in whatever shader-bound situations exist for the GT 240.

Overclocking gives us some very interesting results in Crysis, and this data helps cement the idea that Crysis is ROP-bound on the GT 240. Compared to a stock GT 240, our overclocked Asus and EVGA cards get roughly 15% and 17% more performance respectively. More interestingly, they’re only a few tenths of a frame apart, even though the EVGA card has its shaders clocked significantly higher.

As the performance difference almost perfectly matches the core overclock, this makes Crysis an excellent candidate for proving that the GT 240 can be ROP-bound. It’s a shame that the GT 240’s core doesn’t overclock more than the shaders, as given the ROP weakness we’d rather have more core clockspeed than shader clockspeed in our overclocking efforts.
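
To make the ROP argument concrete, we can divide each card’s performance gain by each domain’s clock gain. If one domain’s ratio sits near 1.0 on both cards while the others swing around, that domain is the likely limiter. A quick sketch using the rounded percentages quoted above:

```python
# Bottleneck attribution sketch: compare Crysis performance scaling against
# per-domain clock scaling, using the article's rounded percentage figures.

gains = {
    "Asus": {"perf": 15, "core": 16, "shader": 10, "memory": 17},
    "EVGA": {"perf": 17, "core": 18, "shader": 27, "memory": 11},
}

for card, g in gains.items():
    ratios = ", ".join(
        f"perf/{domain} = {g['perf'] / g[domain]:.2f}"
        for domain in ("core", "shader", "memory")
    )
    print(f"{card}: {ratios}")
# Asus: perf/core = 0.94, perf/shader = 1.50, perf/memory = 0.88
# EVGA: perf/core = 0.94, perf/shader = 0.63, perf/memory = 1.55
# Only the core ratio is consistent across both cards, which is what we'd
# expect if the core-clocked ROPs are the limit in Crysis.
```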

When it comes to Far Cry 2, the significant difference in shader speeds finally makes itself apparent. Our cards are still suffering from RAM limitations, but the extra shader speed is enough to get another 10% out of the EVGA card.

With Battleforge we can finally see a full stratification of results. The overclocked EVGA card is some 19% faster than a stock GT 240, and 5% faster than the overclocked Asus. Meanwhile the Asus is 14% faster than stock, even with its more limited shader overclock.

Finally, a quick note on power usage. As we mentioned previously, only the EVGA card behaved normally when overclocking; on that card we saw a 21W jump under load, from 172W to 193W. If the card is indeed at or close to 70W under normal circumstances, as NVIDIA’s specifications call for, then when overclocked it’s drawing over 90W. It becomes readily apparent here that the clock speeds of the GT 240 were picked by NVIDIA to meet the PCIe power limit rather than to match the capabilities of the GPU itself.
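
The “over 90W” figure comes from adding the 21W wall delta straight onto the 70W TDP. Strictly speaking, the wall delta includes PSU losses, so here’s a quick sketch of both estimates; the 82% PSU efficiency is an assumed round number for illustration, not something we measured:

```python
# Back-of-the-envelope GPU power estimate from wall measurements.
# NVIDIA's 70W TDP is the stock baseline; the 0.82 PSU efficiency is an
# assumed value for illustration, not a measured one.

WALL_STOCK_W = 172      # total system draw at the wall, stock clocks, load
WALL_OC_W = 193         # total system draw at the wall, overclocked, load
STOCK_TDP_W = 70        # NVIDIA's spec for the GT 240
PSU_EFFICIENCY = 0.82   # assumption: typical mid-range PSU at this load

wall_delta = WALL_OC_W - WALL_STOCK_W  # 21W extra at the wall

naive = STOCK_TDP_W + wall_delta                          # treats the delta as all-GPU
adjusted = STOCK_TDP_W + wall_delta * PSU_EFFICIENCY      # discounts PSU losses

print(f"Naive estimate:       ~{naive:.0f}W")        # ~91W
print(f"Efficiency-adjusted:  ~{adjusted:.0f}W")     # ~87W
# Either way, the overclocked card is well past the 75W a PCIe slot alone
# can supply.
```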

Comments

  • BelardA - Wednesday, January 6, 2010

    Anyone notice the lack of SLI on these cards? Of course, they are soooo slow.

    Okay, the ATI 4670 (DX 10.1) came out over a year ago with an MSRP of $90~100. Considering its age, it's about the same wattage and noise as the GT240, and in many cases it's a slower card.

    Why bother even making such a card? Other than that the profit from selling a $90 GT240 is much better than from a $90 9800GT... except nobody in their right mind would bother with a GT240.

    If the GT240 was a $65~80 part, nobody would complain.

    But what happens when ATI releases their $100 5600 series cards? Since the 5700s are pretty much on par with the 4800s, I'm not expecting the 5600s to be that exciting, other than being $100 DX11 cards that are faster than 4670s but with maybe around 4830 performance.
  • Penti - Wednesday, January 6, 2010

    OEMs, OEMs would.
  • BelardA - Thursday, January 7, 2010

    Yeah yeah, I know. OEMs love such things.

    Kind of sick to look at ordering forms on sites like Dell: a basic desktop has a default price... add something like an ATI 4670 or GT240 and the price goes up $150. Apple is the WORST with their quad-SLI setup with GT120 (I think) video cards... wow, 4 slow cards at about $150 a pop! Meanwhile, on the same Apple order form, a single $200 ATI 4870 is available and should be faster.

  • aegisofrime - Wednesday, January 6, 2010

    I might be nitpicking, but you have listed all the ASUS results as "nVidia Geforce GT 240" instead of "ASUS Geforce GT 240" in the charts. :p
  • Ryan Smith - Wednesday, January 6, 2010

    For the performance data, that is correct. Not to slight Asus of course, but their cards are stock cards. Hence they're the reference values I'm using for the GT 240, and are listed as such.
  • aegisofrime - Wednesday, January 6, 2010

    Ah I see. Thanks for the clarification!
  • lopri - Wednesday, January 6, 2010

    Thank you Ryan for this excellent review. It's refreshing to read a sensible piece without personal drama and baseless conspiracy theories.
  • Devo2007 - Wednesday, January 6, 2010

    Might want to fix the power charts as they currently list an NVidia Geforce 4870 X2 card. Unless of course that is how they have decided to compete with ATI (rebranding Radeons). :)
  • korbendallas - Wednesday, January 6, 2010

    The load temperature graph has to be wrong; there's no way two cards with the same cooler and the same power consumption have such a difference in temperature.
  • korbendallas - Wednesday, January 6, 2010

    Oh, the fan is bugged out... nevermind :)
