Final Words

We tested seven games. AMD and NVIDIA split them, each winning three outright and virtually tying in the seventh. I hate to disappoint those looking for a one-sided fight here, but this one is a wash. NVIDIA would want to point out that CUDA and PhysX are significant advantages that should put the Core 216 over the top, but honestly there's no compelling application for either (much like the arguments for Havok and DirectX 10.1 from the AMD camp).

Our recommendation is to first see if either card happens to run a game you care about better than the other; if not, just buy whichever is cheaper. Today that would be the Radeon HD 4870: it's currently very tough to find stock-clocked Core 216s, and those that are available are priced above $300. Even if we could find availability at $279, the 4870 would still be cheaper. Until the price comes down, the Radeon HD 4870 remains our pick at the $250 - $300 price point. While NVIDIA has closed the performance gap, the part it used to do so still carries a price gap.

NVIDIA says the silicon will be available, but that only two manufacturers will have parts out of the gate, which does give us pause. If the GTX 260 had originally been released with 9 TPCs (216 SPs), it would have been a better competitor to the Radeon HD 4870 and we wouldn't need this slightly retuned part. The card generally doesn't come close to its 12.5% maximum theoretical performance improvement, and it really seems like little more than a thinly veiled attempt to win a couple more benchmarks than usual.
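
For those keeping score, the 12.5% ceiling falls straight out of the shader counts. Here is a minimal back-of-the-envelope sketch in Python; it assumes the launch GTX 260's 8 TPCs at 24 SPs each (192 SPs total), figures that come from the GT200 architecture rather than from this article:

    # Theoretical shader-throughput gain of the GTX 260 Core 216 over the original GTX 260,
    # assuming equal clocks and 24 SPs per TPC (GT200).
    sps_per_tpc = 24
    original_sps = 8 * sps_per_tpc        # 192 SPs on the launch GTX 260
    core_216_sps = 9 * sps_per_tpc        # 216 SPs with the ninth TPC enabled
    gain = core_216_sps / original_sps - 1
    print(f"{gain:.1%}")                  # 12.5% -- a ceiling the card rarely approaches in games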

Yes, the Core 216 does win those extra benchmarks, and yes, the consumer benefits, even if only ever so slightly. But what none of us benefit from is an overabundance of parts released at nearly the same price point with nearly the same name and nearly the same specs. NVIDIA really needs to stop this trend. ATI tried this a few generations ago, but thankfully (at least since the AMD merger) it seems to have cleaned up its act a bit. There is no reason to have a continuum of hardware with increasingly complex naming as the gaps between parts are filled in.

What we need is less confusion in the marketplace and a focus on fairly pricing competitive hardware. Trying to get around supply and demand by cluttering up the market with different parts that have similar names and slightly different pricing isn't a consumer-friendly way to go.

Comments

  • helldrell666 - Tuesday, September 16, 2008 - link

    This is an overclocked new GTX 260, because the stock one has the same core and shader clocks as the original GTX 260.
    You should have included a 4870 TOP or an XOC 4870 in this test.
  • strikeback03 - Wednesday, September 17, 2008 - link

    If you had actually read the article, you would see in multiple places that they ran it at both stock clocks and overclocked (as received) and showed both results.

    After all the complaining the AMD fanbois did when they showed a 9600GSO in the 4670 article, why would they bring in an overclocked AMD card and hear the same thing from the NVIDIA fanbois?
  • toyota - Tuesday, September 16, 2008 - link

    what are you talking about? those are the same clocks as the standard GTX260.
  • Staples - Tuesday, September 16, 2008 - link

    The ridiculous amount of power draw for an idle card has been going on too long. I have a 4850, and my system consumes 30 W more than it did before with a 7950GT. Most people do not pay attention to this number, but I sure do. I am glad to see that NVIDIA has done more than just bump up the chips inside this; there is significantly less power draw at idle.

    And here's hoping that ATI can actually come out with some better drivers this month. The 8.8 drivers cause all kinds of trouble with a 780G chipset (in Vista 32) and a 4850 (in XP). It's amazing, but I have to run 8.7 on both computers because the 8.8 drivers are really problematic.
  • Vidmo - Wednesday, September 17, 2008 - link

    Power Savings??? Where? 160-200 watts for sitting there? Give me a break. These GPUs are a massive waste of power. ATI/nVidia should be ashamed of themselves.
  • MrSpadge - Wednesday, September 17, 2008 - link

    Take a calm look at the power consumption of current cards, e.g. here:
    http://www.xbitlabs.com/articles/video/display/zot...

    You'll see that the 4870 draws 65 W at idle (it seems PowerPlay doesn't work there either). Assuming 80% power supply efficiency, that means a draw of 81 W at the wall. Therefore AT's system draws 122.5 W without the GPU, and the NV cards consume about 36 W from the wall, or 29 W for the cards themselves. That's way better than previous-generation NV cards, which consumed 40 - 60 W at idle. That's what the previous poster meant by "power savings".

    (It seems PowerMizer is not working for Xbit's NV cards, whereas for AT's it works.)

    MrS
  • bespoke - Tuesday, September 16, 2008 - link

    I know you were reviewing the chip and not really the EVGA product, but looking at the product images on Newegg, I see all the GTX 260s look exactly the same, so the fan noise of this card should be fairly representative of other GTX 260 Core 216s. (Wow, that name is a mouthful.)
  • piroroadkill - Tuesday, September 16, 2008 - link

    Why didn't they just call it the GeForce GTX 270?
  • cabul - Thursday, May 21, 2009 - link

    I read somewhere else recently that it is because the 260 and the 260 Core 216 can still be SLIed together. If they called it anything other than a 260, they felt that consumers would be confused.

    It makes perfect sense to me now.
  • gaiden2k5 - Tuesday, September 16, 2008 - link

    or GTX 265 since it's a variant of the 260 card
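
A quick footnote on MrSpadge's idle-power arithmetic above: it's easy to reproduce. Here is a minimal sketch that assumes the 65 W Xbit Labs idle figure for the HD 4870 and a flat 80% PSU efficiency; the total-system wall readings are hypothetical values back-computed from the comment's 122.5 W and 36 W numbers, not measurements from this review:

    # Reproduce the idle-power estimate from MrSpadge's comment.
    # Assumed inputs: 65 W idle at the HD 4870 itself (Xbit Labs) and 80% PSU efficiency.
    psu_efficiency = 0.80
    hd4870_card = 65.0                                  # W, drawn by the card
    hd4870_wall = hd4870_card / psu_efficiency          # ~81 W pulled from the wall

    # Hypothetical system-level wall readings, chosen to match the comment's figures.
    system_4870_wall = 203.75                           # W, whole system idling with the 4870
    baseline_wall = system_4870_wall - hd4870_wall      # ~122.5 W for everything but the GPU

    system_gtx260_wall = 158.5                          # W, whole system idling with a GTX 260
    gtx260_wall = system_gtx260_wall - baseline_wall    # ~36 W from the wall for the card
    gtx260_card = gtx260_wall * psu_efficiency          # ~29 W at the card itself

    print(hd4870_wall, baseline_wall, gtx260_wall, gtx260_card)  # 81.25 122.5 36.0 28.8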
