The GTX 260 Super Overclock

In order to run a card so far out of NVIDIA's specs, you have to build a card that equally exceeds those specifications. It's not enough to just bin chips to find the best performers; your memory and PCB need to be capable of supporting such a chip. So while the GTX 260 Super Overclock's main claim to fame is its high clock speeds, it also has to be built as a better card than what you can get away with at stock.

To accomplish this, Gigabyte has given the card the Ultra Durable treatment that many of their other high-end products get. This includes a PCB with 2oz copper layers, solid capacitors, low RDS(on) MOSFETs, and ferrite core chokes. On top of the Ultra Durable qualifier, Gigabyte only uses “1st tier” Samsung and Hynix memory, although this is a somewhat redundant claim since those two companies supply most high-speed GDDR3 these days anyhow.

The card is otherwise indistinguishable on the outside from any other stock GTX 200 series card; Gigabyte is using the same GTX cooler, adorned with Gigabyte decals. Gigabyte has, however, programmed in a different cooling profile for this card than for the stock GTX 260, and this, along with the greater heat generated by the overclocked card, means that cooling performance will differ in spite of it being the usual GTX cooler.

Gigabyte has made an interesting choice for the port layout on the GTX 260 Super Overclock. Instead of the standard 2x DVI + TV-Out configuration, they're using 1x DVI + VGA + HDMI. We find this somewhat odd since the GTX 260 Super Overclock is not a card targeted at HTPC use, so the HDMI port would see little use unless someone is using an HDMI-only TV as a desktop monitor. On the flip side, this means that a special audio-carrying DVI->HDMI dongle isn't required to get full HDMI output.

With the different port layout come different dongles. One DVI->VGA dongle is included, along with an HDMI->DVI dongle for a second DVI port. Also included are a pair of Molex->PCIe power adapters and an S/PDIF connector cable.

Rounding out the collection of included goodies are the obligatory manual and driver CD. Gigabyte also includes their VGA Tools Gamer HUD Lite utility. You won't see this advertised on the box, which really says all that needs to be said: the NVIDIA drivers do a better job of covering things here.

The warranty is listed at 3 years, which, along with limited lifetime, is one of the two most common warranty terms for higher-end video cards. Truth be told, we would rather have seen a lifetime warranty on an overclocked video card, particularly since the core speeds for this card are beyond anything the GT200b was designed and validated for. Furthermore, Gigabyte's competitors such as EVGA and BFG offer lifetime warranties on their overclocked parts, leaving Gigabyte the odd man out. 3 years is still a fair warranty, and we believe that realistically it should be good enough, but it's just that: good enough.

29 Comments

  • Leyawiin - Thursday, February 18, 2010 - link

At this late stage of the game I decided to buy one of these. I game at 1680 x 1050 on XP, and at that resolution this seems to be the best-performing card for the money. The way it was built (better quality techniques and materials) is just icing on the cake. I don't want to wait for Fermi any longer (and I bet they will be out of my price range when/if they appear), and I don't want to spend $85-100 more for an HD 5850 for the small improvement it would give me on a 22" monitor. Should be a good match for my X4 955.
  • chizow - Monday, October 12, 2009 - link

While it's understandable why it took so long to do a review like this (first retail part clocked this high), these kinds of OC scaling results would be much more useful closer to the launch of a product line, to better determine the impact of individual clockspeeds and core/functional unit modifications.

Derek did a few of these OC scaling drill-downs for the ATI 4890 and GTX 275, I believe, but they were also very late given that the GTX 260/280 and 4850/4870 had already been released for months. They would've been much more helpful if they were done at launch alongside the main reviews, to give prospective buyers a better idea of the impact of actual hardware differences vs. software/artificial differences like clockspeed.

The problem is that Nvidia and ATI both mix hardware and clockspeed differences on these parts to obfuscate the actual performance delta between them, which is particularly significant because the ASICs are typically the same sans artificially neutered functional units. At the very least, launch reviews should normalize clockspeeds when possible to give a better idea of the impact of actual hardware differences.

For example, with the 4850 vs. 4870, you have 625MHz vs. 750MHz on the core along with GDDR3 vs. GDDR5. You can't really do much about the bandwidth disparity, but you can try to clock that 4850 up to 750MHz, which would give you a much better idea of the impact of the actual hardware differences and bandwidth. Similarly, for the GTX 260 vs. 275, the original GTX 260s were clocked at 576/1242 and 275s were clocked at 633/1350. Normalizing clocks would then isolate the differences coming from the additional cluster of 8 TMUs and 24 SPs, rather than the artificial/binned difference in clockspeeds.

To bring it full circle, doing these comparisons earlier wouldn't give you the guarantee that a product would run at max overclocks like this factory OC'd part does; it would just set expectations when it actually mattered, so that when an OC'd part like this does come along, you could just refer to the comparison done months ago and say, "Yeah, it performs and scales just as we thought it would when we did this Overclocking Comparison on GT200 parts 18 months ago."
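
The clock-normalization arithmetic described above is easy to sketch. The example below assumes best-case linear scaling with core clock, uses the clock figures cited in the comment, and plugs in a purely hypothetical baseline framerate; it is an illustration of the idea, not a measured result.

    # Sketch of clock normalization between a stock GTX 260 and GTX 275.
    # Clocks (core/shader, MHz) are the figures cited above; the GTX 260
    # framerate is a hypothetical placeholder, not a measurement.
    gtx260_core, gtx260_shader = 576, 1242
    gtx275_core, gtx275_shader = 633, 1350

    core_ratio = gtx275_core / gtx260_core        # ~1.099, ~10% from core clock
    shader_ratio = gtx275_shader / gtx260_shader  # ~1.087, ~9% from shader clock

    gtx260_fps = 50.0  # hypothetical stock GTX 260 result in some game

    # Best-case expectation for a GTX 260 clocked up to GTX 275 speeds;
    # any measured GTX 275 lead beyond this comes from its extra hardware
    # (one more cluster: +24 SPs, +8 TMUs) rather than from clockspeed.
    normalized_fps = gtx260_fps * core_ratio
    print(f"Expected from clocks alone: {normalized_fps:.1f} fps")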
  • SirKronan - Monday, October 12, 2009 - link

There are only two conclusions that matter here:

    Performance per dollar
    and
    Performance per watt

They got the performance per dollar, and the results aren't surprising at all. But this is half an article without the performance-per-watt aspect. As soon as they get a new Kill A Watt (or measure with a UPS that has a meter) and update the results, this article will actually be meaningful.

An article comparing a lesser card overclocked against a faster card at stock MUST contain a power usage comparison, or it's missing half its teeth.
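
Both metrics the comment asks for are simple ratios, sketched minimally below. All framerate, price, and power numbers here are hypothetical placeholders rather than measured values.

    # The two metrics the comment asks for, expressed as simple ratios.
    # All numbers below are hypothetical placeholders, not measurements.
    def perf_per_dollar(fps: float, price_usd: float) -> float:
        return fps / price_usd

    def perf_per_watt(fps: float, watts: float) -> float:
        return fps / watts

    # name: (average fps, street price in USD, load power draw in watts)
    cards = {
        "GTX 260 Super Overclock": (60.0, 199.0, 210.0),
        "HD 5850":                 (70.0, 299.0, 170.0),
    }
    for name, (fps, price, watts) in cards.items():
        print(f"{name}: {perf_per_dollar(fps, price):.3f} fps/$, "
              f"{perf_per_watt(fps, watts):.3f} fps/W")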
  • chizow - Tuesday, October 13, 2009 - link

I typically don't find vertical price-to-performance comparisons relevant because you're always going to have trouble competing with those dirt cheap options that are free after rebate. Sure, you may get great FPS per dollar, but what good does that do you when the aggregate FPS aren't playable? Similarly, with CF or SLI, it's very difficult for the higher-end parts to keep ahead of the price and performance that multi-GPU setups offer from lower-end parts.

Performance per watt in this range of parts isn't hugely useful either, IMO; it's been this way for some time, and high-end graphics cards consume a lot of power. There's no panacea that's going to fix this: if a card is in the upper tier of performance for its time, it's probably going to consume a similar amount of power to its predecessors. This only changes as newer processes and faster parts are introduced, where they typically outperform older parts in the same segment while using similar power, or slightly less.

I personally prefer comparisons to be made based on performance; then you can do a lateral comparison of price and power consumption within that performance segment. That way you have an expected performance level and can make an informed decision about other key characteristics, like price and power consumption.
  • shotage - Monday, October 12, 2009 - link

I agree with this. Power utilization, followed by noise, is something I would be concerned about with overclocked cards.

Personally I don't think it's worth the money for a DX10 card anymore, even if it is fast. If I want a card, it's going to be DX11 capable.
  • Leyawiin - Monday, October 12, 2009 - link

From the early reviews I've read, there's no contest between the HD 5770 and a stock GTX 260 (216) in all but the most ATI-friendly titles. The gap will be even greater with this card. The fact that it's very cool and quiet for the performance makes it even more compelling. And yes, the HD 5770 will be $160, if you can get it for that price (or get it at all for weeks after its release). I agree with Ryan on this one: darn good deal for the price.
  • macs - Monday, October 12, 2009 - link

My stock GTX 260 can overclock to those frequencies as well. I can't see any reason to buy these overclocked (and overpriced) video cards...
  • mobutu - Monday, October 12, 2009 - link

Yep. At $160 the 5770 will (probably; we will find out tomorrow) be very, very close in performance to this card priced at $200.
So the choice is/will be clear.
  • poohbear - Monday, October 12, 2009 - link

I'm not quite sure what the point of this review is since the 5800 series has been released. In your conclusion you didn't even mention anything about DX11, or how the 260 is not even capable of handling DX10.1, let alone the future DX11. For people who keep their graphics cards for 2 years before upgrading, this is an important factor.
  • Alexstarfire - Monday, October 12, 2009 - link

I fail to see how DX11 matters since no game is even slated to use it yet, that I've heard of anyway. It might be useful if you keep your card 2 years, but I'm guessing that by that time the card probably won't be able to handle DX11, much like the first DX10 cards couldn't handle DX10 very well when it debuted. Of course, just going off of pricing and performance I wouldn't get this card anyway, or any NVIDIA card for that matter. I'm just saying.
