A Quick Look Under The Hood

Our first concern, upon hearing about this hardware, was whether NVIDIA could fit two GTX 260 class GPUs on a single card without melting PSUs. With only a 6-pin + 8-pin PCIe power configuration, that doesn't seem like quite enough to push the hardware. But then we learned something interesting: the GeForce GTX 295 is the first 55nm part from NVIDIA. The logical conclusion is that single-GPU 55nm hardware might not be far behind, but that's not what we're here to talk about today.


Image courtesy NVIDIA

55nm is only a half-node process, so we won't see huge changes in die size (we don't have one yet, so we can't measure it), but the part should get a little smaller and cheaper to build, as well as a little easier to cool and lower power at the same performance levels (or NVIDIA could choose to push performance a little higher).
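To put the half-node shrink in perspective, here's a minimal sketch of the ideal area scaling. The 576mm² figure for the 65nm GT200 die is an assumption for illustration only; as noted, we can't measure the new part yet, and real shrinks never hit the ideal number.

```python
# Rough sketch of why a 65nm -> 55nm "half node" shrink only shaves a
# modest amount off the die: linear dimensions scale by 55/65, so area
# scales by the square of that ratio (in the ideal case).

def shrink_area(area_mm2: float, old_nm: float, new_nm: float) -> float:
    """Ideal die area after an optical shrink (real shrinks scale worse)."""
    return area_mm2 * (new_nm / old_nm) ** 2

gt200_65nm = 576.0  # approximate 65nm GT200 die area in mm^2 (assumed)
print(f"area ratio: {(55 / 65) ** 2:.2f}")                    # ~0.72x
print(f"ideal 55nm die: {shrink_area(gt200_65nm, 65, 55):.0f} mm^2")
```

Even the best case is only about a 28% area reduction, which is why the article expects "a little smaller and cheaper" rather than a dramatically different chip.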


Image courtesy NVIDIA

As we briefly mentioned, the GPUs strapped onto this beast aren't your stock GTX 260 or GTX 280 parts. These chips are something like a GTX 280 with one memory channel disabled, running at GTX 260 clock speeds. You could also look at them as GTX 260 silicon with all 10 TPCs enabled. Either way, you end up with something that has higher shader performance than a GTX 260, but lower memory bandwidth and fillrate than a GTX 280 (remember that ROPs are tied to memory channels, so this new part has only 28 ROPs instead of 32). This is a hybrid part.
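The hybrid configuration can be sketched with a little back-of-the-envelope arithmetic. This assumes each 64-bit memory channel carries 4 ROPs (consistent with the 28-vs-32 ROP figures above), and the memory clocks used are the stock GTX 260 and GTX 280 figures, assumed here purely for illustration.

```python
# Back-of-the-envelope comparison of the hybrid GTX 295 GPU against the
# stock GTX 260 and GTX 280. Bus widths, SP counts, and memory clocks are
# the well-known stock figures; the GTX 295 row assumes GTX 260 memory
# clocks, per the article.

def bandwidth_gbs(bus_bits: int, mem_mhz: float) -> float:
    """Theoretical GDDR3 bandwidth: bus width (bytes) x DDR x clock."""
    return bus_bits / 8 * 2 * mem_mhz / 1000  # GB/s

parts = {
    #               SPs, bus (bits), memory clock (MHz)
    "GTX 260":     (192, 448, 999),
    "GTX 295 GPU": (240, 448, 999),   # GTX 280 shaders, GTX 260 memory
    "GTX 280":     (240, 512, 1107),
}

for name, (sps, bus, mem) in parts.items():
    rops = bus // 64 * 4  # ROPs are tied to 64-bit memory channels
    print(f"{name}: {sps} SPs, {rops} ROPs, "
          f"{bandwidth_gbs(bus, mem):.1f} GB/s")
```

The 448-bit bus gives the hybrid chip 28 ROPs and GTX 260 class bandwidth, while the full 240 SPs give it GTX 280 class shader throughput, which is exactly the "hybrid part" the article describes.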


Image courtesy NVIDIA

Our first thought was binning (or what AMD calls harvesting), but since this is also a move to 55nm, we have to rethink that. It isn't clear whether this chip will make its way onto a single-GPU board. But if it did, it would likely be capable of higher clock speeds due to the die shrink and would fall between the GTX 260 Core 216 and the GTX 280 in performance. Of course, this part may not end up on single-GPU boards at all. We'll just have to wait and see.

What is clear is that this is a solution gunning for the top. It is capable of quad SLI and sports not only two dual-link DVI outputs but an HDMI output as well. It isn't clear whether all boards built will include the HDMI port found on the reference board, but more flexibility is always a good thing.


Image courtesy NVIDIA

69 Comments

  • SuperGee - Saturday, December 20, 2008 - link

    There are dozens of previews out there, and from what I see it doesn't come near GTX 280 SLI.
    The specs of each GPU fall between a GTX 260 and a GTX 280, and the results line up with that.

    Sure, I've seen results where they come out equal, but there the scaling is capped by a CPU bottleneck; it all flattens out, with three to five cards off the pack. (I've seen the benches on that site.)

    In situations where the CPU leaves room for SLI scaling, the GTX 295 is very near GTX 260+ SLI, and GTX 280 SLI takes a larger lead over the GTX 295. It's more of a GTX 260, just with the full shaders. Could be that some games aren't shader heavy but fill-rate heavy.

    So if you pick out CPU-limited results, look only at 280 SLI versus the 295, and ignore everything else, you get a very wrong impression. Game settings that barely fit in 1GB make the 295 dive, where the 4870 X2 leads.

    So in short: wherever SLI can scale beyond what 280 SLI shows, like tri GTX 285 :), the GTX 295 is like a GTX 260++ X2 (240 shaders).

    Didn't you notice that wherever the GTX 295 equals GTX 280 SLI, it also equals the 4870 X2, GTX 260+ SLI, and 4870 1GB CF, with only small differences? Within seconds that gives me the impression that such a game at such settings is CPU bound with these cards.

  • SiliconDoc - Sunday, December 21, 2008 - link

    Well, if the 295 is CPU bound, then it has longer legs to go higher.
    I find it funny that the 4870 X2 isn't referred to as CPU bound...
    So that means it's all legged out.
    It's also absolutely clear this 295 is two GPUs that sit between the 260 and 280, so the already completed SLI tests will compare just like the already completed 4870 CF tests (512MB and 1GB X2 respectively).
    So the numbers are already out there, and have been for some time, one could say.
    The FPS numbers have said for a long time that the 260 is slightly better than the 4870, and the 280 is definitely better.
    The 260 wins in more games, not in everything (I noticed the 4870 won Devil May Cry 4 every time, a notable exception to the other game stats), but the cut is clear for those not willing to tell lies. Period.
    Sure it's very close, and a bundle, a free game, a whim, slickness of looks, or whatever it is (type of chipset, wanting a single-card solution, power connectors, length of case, etc.) is plenty good enough to make the choice between the two.
    Now the 295 (equal to two of the 260s as released, with extra help from all 240 shaders instead of 192 or 216) is a bit more, and so the stats spread will jump a bit more.
    That's just the way it is going to be. That's the way it already is.
    I know, it's very dangerous saying so. My golly, the world might explode.
  • DigitalFreak - Thursday, December 18, 2008 - link

    I'm thinking $599.
  • Lord 666 - Thursday, December 18, 2008 - link

    Sheer size of the current 200's is what is holding me back from purchasing. While I have a P180 case, all bays are populated with drives for work.

    The current SLI'd 8800GTS 640's just squeeze in there.
  • tejas84 - Thursday, December 18, 2008 - link

    Why don't the AMD fanboys stop criticizing Derek Wilson when he speaks the truth? Crossfire is a load of garbage...

    AMD has the worst driver release schedule and game support, thus SLI with its TWIMTBP support is superior in every way... Oh, and if NVIDIA sells the GTX 295 under the 4870 X2 MSRP, then what will the debt-ridden AMD do??

    Plus, in case the AMD fanboys don't know, your beloved AMD/ATI is circling the drain, and I doubt they will survive the economic problems. Layoffs all over, consecutive losses for two years... this is not a company that is "pwning" NVIDIA.

    The only company that has been run as ineptly as AMD is General Motors, and look at their fate... AMD for Chapter 11 in 2009.
  • The0ne - Thursday, December 18, 2008 - link

    Nvidia hasn't been doing well themselves. And please don't make comparisons of how bad things are for each company. If the company is not doing well it's NOT doing well.

    Both companies have their share of driver problems. I won't vouch for either at this point in time. My take on all this newer hardware is that the software side (mainly drivers) isn't getting the time it needs to catch up. That's the biggest drawback from my perspective.
  • CadESin - Thursday, December 18, 2008 - link

    Why is it bad that AMD beat NVIDIA? I still have friends who say to me day after day that the 4870 is going to kill the PC market, because it's going to hurt NVIDIA, and since AMD is going to die, anything that hurts NVIDIA is bad....

    I just can't understand why having AMD back in the game makes so many Nvidia fan boys all upset. Sure, it means that the Green team is not number 1 100% of the time anymore, but it also means you fanboys got your GTX 280s for a lot less than they would have cost you without the 4800 line of cards.

    I myself run two systems, an X48 with two 4870s and an X58 with two GTX 260s, and to be honest they both play about the same. I will note that I have fewer issues with ATI's drivers than I do with NVIDIA's: not FPS-wise in game, but crashing to desktop and driver failures.

    I agree with Derek that SLI is a bit better than CF in SOME games, but I'd also like to point out that CF destroys SLI in some games as well: CoD4, GRID, and DiRT to name a few.

    AMD saved us all from $500+ GTX 280s, and that's good news for everyone. But I do see this new card just turning into the next 9800 GX2: running too hot, drawing too much power, and totally pwning NVIDIA's next high-end card, just the way the 9800 GX2 beat the GTX 280. Stop-gap products are great, soft-launch stop-gap products are less great, and soft-launch stop-gap products right before the biggest retail time of the year... that's just being desperate.

    NVIDIA knows they are losing, both market share and money. AMD proved to them that they will not always be on top, and that's good. Maybe in the end we can get another 8800 GTX out of this, and by that I mean something truly amazing... not the GTX 280, which is something truly not amazing...
  • MadMan007 - Thursday, December 18, 2008 - link

    "I just can't understand why having AMD back in the game makes so many Nvidia fan boys all upset."

    You answer your own question in the same sentence - 'fanboys.'
  • SiliconDoc - Saturday, December 20, 2008 - link

    It seems to me the problem with fanboyism is the LYING that goes along with it.
    I certainly have seen endless threads where reds jump in and start slamming away, and they ALL claim they have never had a single issue with an ATI card, and most of them have some never-seen-anywhere-else-by-me story about an NVIDIA card problem.
    Well, that tells me what the problem is right there. I'm not sure what is causing it; I've seen reviews skewed by the same phenomenon. I suppose they are "mad" that NVIDIA got their money in the not-so-distant past, and now they have a "revenge" company to cling to. I guess that's it. Maybe it's just my-shiny-car-is-better-than-yours syndrome. Maybe if they realize (but can't admit in public) that they made the wrong choice, they have to try to convince everyone they "meet" that they made the right one. It certainly is a wild pile of lies from what I have seen, and imagine the immense nerd arrogance in the front-end enthusiast region that fuels it. Oh, woe unto thee who criticizes the enthusiast "expert of all areas". You might as well call his girlfriend ugly or spit on his muscle car at the show; it would be the same thing as not agreeing his video card is the very best... (haha)
    Anyway, a good friend of mine, with an amazing amount of tech knowledge for his young age and a finely skilled overclocker, was treated to my purportedly objective opinion on this whole topic. We had some laughs with others in our tech-talk team, but he had perhaps some lingering red favoritism, it seemed, IMO (unwarranted).
    So he ordered a video card recently (an ATI 3650) and waited and waited for it from the great white north... finally it came... and the nightmare began: it couldn't overclock (oh, he finally got that unglitched), then the thing wouldn't run ANY 3D game. I was biting my tongue so hard, but with all the hassles he was exploding, lol.
    So anyway, driver changes, this and that utility....
    He left just earlier, after the several-day nightmare, as he discussed his three-system upgrade for the lawyers' office, and after I suggested a certain board with ATI 2100 integrated graphics, his comment was "I don't want ATI on any of those boards."
    That's just the way it goes, or rather that's how it goes, all too often. I know, I know, this is the only time a 3650 ever had any kind of issue, and the 8600 GT is way, way, way more prone to, like, totally exploding... it might destroy your case innards... it's probably radioactive, and baby seals are killed to produce it.
  • SkullOne - Thursday, December 18, 2008 - link

    Spoken like a true nVidia fanboy. Can you say Hello Kettle, I'm Pot?

    Nobody should be wishing for AMD to go under, because there's nobody else to pick up the slack. So unless you like $600 video cards and high-priced CPUs, go ahead and keep that mentality.
