The GTX 260 Super Overclock

In order to run a card so far beyond NVIDIA's specifications, you have to build a card that equally exceeds those specifications. It's not enough to just bin chips to find the best performers; the memory and PCB need to be capable of supporting such a chip. So while the GTX 260 Super Overclock's main claim to fame is its high clock speeds, it also has to be built as a better card than what you can get away with at stock.

To accomplish this, Gigabyte has given the card the UltraDurable treatment that many of their other high-end products get. This includes a 2oz copper PCB, solid capacitors, low RDS(on) MOSFETs, and ferrite core chokes. Added to that UltraDurable qualifier is that they only use "1st tier" Samsung and Hynix memory, although this is a bit of a redundant claim since those two companies supply most high-speed GDDR3 these days anyhow.

The card is otherwise indistinguishable on the outside from any other stock GTX 200 series card; Gigabyte is using the same GTX cooler, adorned with Gigabyte decals. Gigabyte has, however, programmed in a different cooling profile than the stock GTX 260 uses, and this, along with the greater heat generated by the overclocked card, means that cooling performance will differ in spite of it being the usual GTX cooler.

Gigabyte has made an interesting choice for the port layout on the GTX 260 Super Overclock. Instead of the standard 2x DVI + TV-Out configuration, they're using 1x DVI + VGA + HDMI. We find this somewhat odd since the GTX 260 Super Overclock is not a card targeted at HTPC use, so the HDMI port would see little use unless someone is using an HDMI-only TV as a desktop monitor. On the flip side, it means that a special audio-carrying DVI->HDMI dongle isn't required to get full HDMI output.

With the different port layout come different dongles. One DVI->VGA dongle is included, along with an HDMI->DVI dongle for a second DVI port. Also included is a pair of Molex->PCIe power adapters, plus an S/PDIF connector cable.

Rounding out the collection of included goodies are the obligatory manual and driver CD. Gigabyte also includes their Gamer Hud Lite VGA utility. You won't see this advertised on the box, which really says all that needs to be said: NVIDIA's own drivers do a better job of covering things here.

The warranty is listed at 3 years, which, along with limited lifetime coverage, is one of the two most common warranty terms for higher-end video cards. Truth be told, we would rather have seen a lifetime warranty on an overclocked video card, particularly since the core speeds for this card are beyond anything the GT200b was designed and validated for. Furthermore, Gigabyte's competitors such as eVGA and BFG offer lifetime warranties on their overclocked parts, leaving Gigabyte the odd man out. Three years is still a fair warranty, and realistically it should be good enough, but it's just that: fair.

Comments

  • palladium - Monday, October 12, 2009 - link

    Just wondering, with HAWX, is DX10.1 enabled for ATI cards?
  • Ryan Smith - Monday, October 12, 2009 - link

    No.
  • Nfarce - Monday, October 12, 2009 - link

    I just ask because I bought a stock EVGA 275 and have it overclocked quite nicely, which puts it above the performance of this o/c 260. Even AT posted about the 275's performance capabilities in an article back on June 4. You aren't really comparing apples to apples here other than one being purchased factory overclocked and the others being purchased factory stock. No serious gamer ever keeps a video card at stock, any more than a CPU.
  • Ryan Smith - Monday, October 12, 2009 - link

    Absolutely. User overclocking is by no means guaranteed, whereas factory overclocking is as good as anything else sold.

    As I stated in the article this card is a poor choice if you intend to do any overclocking on your own, but if you're the kind of person that does not do any overclocking (and I do know "serious gamers" that don't touch their card's clocks) then this is just as good as a GTX 275.
  • Abhilash - Monday, October 12, 2009 - link

    It is not worth the 25% premium over a stock GTX 260.

    Where are the power consumption results?
  • Ryan Smith - Monday, October 12, 2009 - link

    My Kill A Watt decided to kill itself during some testing this weekend. There wasn't time to get it replaced and run new tests while still meeting all of the article deadlines this week. It'll be back soon™.
  • SirKronan - Monday, October 12, 2009 - link

    That's what I was wondering from the first page of the review: "Ok, so it performs like a 275, but how much power does it consume to do the same amount of work?" The title and conclusion indicate the performance is there for $10 to $20 less, but I kept looking on the review pages for the only thing I really wanted to know: "How do they differ on power?"

    I am typically one who praises Anand's articles, but I wouldn't have even published this without at least some kind of power figures. I understand that your Killawatt got "killed" (er... died, heh), but at least give us figures from a UPS that has a wattage meter built in. What was the difference in overall power consumption? That would at least give us an idea of how much extra power the 260 OC'd is going to use versus a 275. If you game enough, the power savings might even nearly negate the extra $10 you save over 2 or 3 years, depending on where you live.
  • Finally - Monday, October 12, 2009 - link

    Thanks for pointing this out. I was about to ask that.
    I guess that is the card's weak spot that would stand in the way of a "recommendation"...

    Under the rug, under the rug...
  • 7Enigma - Monday, October 12, 2009 - link

    Ryan did mention in the comments above that his Kill-A-Watt died during testing so that would explain why the info is not there.

    What should have been mentioned (I may have missed it) in the article was this explanation. Nowhere did I find it, and like most of us, my first thought was: OK, but how much more power is this thing using? That makes a big difference in my personal buying decisions (and why the 5850 is so darn likable across the board).
  • Stas - Sunday, October 11, 2009 - link

    As noted, the 4890 is $20 cheaper than this Gigabyte card, with almost equal performance. But don't forget that you can easily get an extra 100-150MHz out of the 4890 GPU with stock cooling, and 100-200MHz on the memory, which would make it 5-10% faster. So now we have a card (HD4890) that's both cheaper (by $20) AND faster than the Gigabyte GTX260 O/C. I think it's a no-brainer. Of course, Gigabyte did a great job with this card (I love Gigabyte), but you can only compete so well when the limitation is set by the chip's architecture. Out of all GTX260 cards, this one is probably the best. But it isn't the best value or performance when compared to the HD4890.
    P.S. Even with both cards at stock, in games where the GTX260 prevails, it only does so by 10% or so. Wherever the HD4890 comes out on top, it beats the other card by up to 30%.
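
For readers who want to run SirKronan's back-of-the-envelope power-cost math from the thread above, here is a minimal Python sketch. Every figure in it is a hypothetical placeholder, since the review's actual power measurements were unavailable:

```python
# A minimal sketch of the gaming power-cost math in SirKronan's comment.
# All figures are hypothetical placeholders, not measured values.

def yearly_cost_delta(extra_watts, hours_per_week, price_per_kwh):
    """Extra electricity cost per year from drawing extra_watts while gaming."""
    extra_kwh = extra_watts * hours_per_week * 52 / 1000.0
    return extra_kwh * price_per_kwh

# Hypothetical: one card draws 15W more under load, 20 hours of gaming
# per week, $0.12 per kWh.
per_year = yearly_cost_delta(15, 20, 0.12)
print(f"${per_year:.2f}/year, ${per_year * 3:.2f} over 3 years")
# -> $1.87/year, $5.62 over 3 years: a meaningful chunk of a $10 price gap.
```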
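
Stas' cheaper-and-faster argument likewise reduces to a simple price/performance ratio. The sketch below illustrates it with purely illustrative numbers: a made-up $199 baseline price for the GTX 260 OC, the comment's $20 price gap, and a rough 7% performance edge for the overclocked HD 4890; none of these are measured data.

```python
# A minimal sketch of the value comparison in Stas' comment.
# Prices and performance deltas are illustrative assumptions only.

def perf_per_dollar(perf_points, price_usd):
    """Performance points per dollar; higher is better."""
    return perf_points / price_usd

gtx260_oc = perf_per_dollar(100, 199)  # hypothetical 100-point baseline at $199
hd4890_oc = perf_per_dollar(107, 179)  # $20 cheaper, ~7% faster per the comment
print(f"GTX 260 OC: {gtx260_oc:.3f} pts/$  HD 4890 OC: {hd4890_oc:.3f} pts/$")
# -> roughly 0.503 vs 0.598: a card that is both cheaper and faster
#    wins the ratio on both axes, which is the comment's point.
```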
