A Quick Look Under The Hood

Our first concern, upon hearing about this hardware, was whether NVIDIA could fit two GTX 260 class GPUs on a single card without melting PSUs. With only a 6-pin + 8-pin PCIe power configuration, that doesn't seem like quite enough to push the hardware. But then we learned something interesting: the GeForce GTX 295 is the first 55nm part from NVIDIA. Of course, the logical conclusion is that single GPU 55nm hardware might not be far behind, but that's not what we're here to talk about today.
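
For a quick sanity check on the power question, the PCIe slot plus one 6-pin and one 8-pin connector cap an in-spec board at 300W. The sketch below just adds up those published connector limits; we can't confirm the GTX 295's actual board power yet, so treat 300W purely as the ceiling this card has to live under.

```python
# Rough power-delivery ceiling for a 6-pin + 8-pin PCIe card.
# Values are the standard PCIe limits; the GTX 295's real board power
# is unconfirmed here, so this is only the in-spec upper bound.

PCIE_SLOT_W = 75    # power available through the x16 slot itself
SIX_PIN_W   = 75    # 6-pin PCIe auxiliary connector
EIGHT_PIN_W = 150   # 8-pin PCIe auxiliary connector

ceiling_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"In-spec power ceiling: {ceiling_w} W")  # 300 W
```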


Image courtesy NVIDIA

55nm is only a half-node process, so we won't see huge changes in die size (we don't have a board in hand yet, so we can't measure it), but the part should get a little smaller and cheaper to build, as well as a little easier to cool and lower power at the same performance levels (or NVIDIA could choose to push performance a little higher).
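
To put a rough number on what a half-node shrink buys, an ideal optical shrink from 65nm to 55nm scales die area by (55/65)², or roughly 0.72x. The sketch below runs that arithmetic against the widely reported ~576 mm² figure for the 65nm GT200; real layouts never shrink perfectly, so consider the result an optimistic bound rather than a measurement.

```python
# Idealized area scaling for a 65nm -> 55nm half-node optical shrink.
# The 576 mm^2 starting point is the commonly cited 65nm GT200 die size;
# actual 55nm dies won't shrink this cleanly, so this is an upper bound
# on the savings, not a measurement.

old_node_nm = 65.0
new_node_nm = 55.0
gt200_65nm_area_mm2 = 576.0  # widely reported figure, not our own measurement

scale = (new_node_nm / old_node_nm) ** 2
print(f"Ideal area scaling: {scale:.2f}x")                             # ~0.72x
print(f"Ideal 55nm die area: {gt200_65nm_area_mm2 * scale:.0f} mm^2")  # ~412 mm^2
```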


Image courtesy NVIDIA

As we briefly mentioned, the GPUs strapped onto this beast aren't your stock GTX 260 or GTX 280 parts. These chips are something like a GTX 280 with one memory channel disabled, running at GTX 260 clock speeds. I suppose you could also look at them as GTX 260 ICs with all 10 TPCs enabled. Either way, you end up with something that has higher shader performance than a GTX 260, but lower memory bandwidth and fill rate than a GTX 280 (remember that ROPs are tied to memory channels, so this new part only has 28 ROPs instead of 32). This is a hybrid part.
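
To make the hybrid positioning concrete, here's a rough comparison of theoretical memory bandwidth and pixel fill rate for this configuration against the GTX 260 Core 216 and the GTX 280, assuming the GTX 295's GPUs really do run at GTX 260 reference clocks (576MHz core, 999MHz GDDR3). Final clocks haven't been confirmed to us, so the GTX 295 line is an estimate.

```python
# Theoretical throughput comparison, per GPU. The GTX 295 entry assumes
# GTX 260 reference clocks with all 240 SPs and a 448-bit bus / 28 ROPs,
# as described above; its final shipping clocks are not confirmed here.

def mem_bandwidth_gbs(bus_bits, mem_mhz):
    # GDDR3 transfers twice per clock; /8 for bits -> bytes, /1000 for GB/s
    return bus_bits / 8 * mem_mhz * 2 / 1000

def pixel_fill_gpix(rops, core_mhz):
    return rops * core_mhz / 1000

#                     (SPs, core MHz, mem MHz, bus bits, ROPs)
parts = {
    "GTX 260 Core 216":  (216, 576,  999, 448, 28),
    "GTX 295 (per GPU)": (240, 576,  999, 448, 28),  # assumed clocks
    "GTX 280":           (240, 602, 1107, 512, 32),
}

for name, (sps, core, mem, bus, rops) in parts.items():
    print(f"{name}: {sps} SPs, {mem_bandwidth_gbs(bus, mem):.1f} GB/s, "
          f"{pixel_fill_gpix(rops, core):.1f} Gpixel/s")
```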


Image courtesy NVIDIA

Our first thought was binning (or what AMD calls harvesting), but since this is also a move to 55nm, we have to rethink that. It isn't clear whether this chip will make its way onto a single GPU board, but if it did, it would likely be capable of higher clock speeds due to the die shrink and would fall between the GTX 260 Core 216 and GTX 280 in performance. Of course, this part may not end up on single GPU boards. We'll just have to wait and see.

What is clear is that this is a solution gunning for the top. It is capable of quad SLI and sports not only two dual-link DVI outputs but an HDMI output as well. It isn't clear whether every board built will include the HDMI port found on the reference design, but more flexibility is always a good thing.


Image courtesy NVIDIA

69 Comments

  • SiliconDoc - Sunday, December 21, 2008

    Jiminy crickets, can I buy you a TWIMTBP dartboard so you can get out some anger?
    I think you missed the boat entirely. I think it's great the two companies are competing, and they have a war going - I bet it's motivational for both developer teams.
    I'd also bet the "sandwich" method NVidia used will be good learning for cards down the line, because it doesn't look like shrinkage is going to beat rising transistor counts.
    Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. That's just simple plain common sense - that your analysis in that matter didn't have. A bitter putdown it was.
    Now, the point you missed that is valid, is what BOTH card companies do, and so do the cpu makers - and they cover it up by claiming problems with "yield". And the enthusiast public sucks it down like mental giant candy - oh so smart is the geek end user he knows why these things happen - and he or she blabbers it to all his or her peers... oh so smart the blabbering corporate protector geek end user is...
    The bottom line is they CRIPPLE cards, gpu's, cpu's, bios, whatever it is they need to do - cut pipelines, chop shaders, whack cache...
    And somehow the supposedly so bright geek enthusiast squads babble the corporate line "Well, no we uhh.. we have a bad chip come out, and uhh... well we have to disable part of it - err, uhh, we just like disable it and mark it lower..."
    (add the speed binning - which occurs - to validate all the other chanting BS - and it works - for the corporate line)
    When the real truth is THEY PLAN THE MARKET SEGMENT - THEY CHOP AND DISABLE AND SLICE AND CUT AND REMOVE AND ADD BACK IN - TO GET WHAT THEY WANT.
    Yes, there are some considerations - # of ram chips to be on vidcard for example for bit width - but this they decide...
    They don't just shoot it straight out the way they HAVE TO because of "low yields".
    The only part that is valid is SPEED BINNING.
    Somehow though, the excuses for 'em are everywhere, as if by some wonder the cpu or GPU die "has the exactly correct portion magically come out retarded" - the portion they want disabled, or changed, for their TIER insertion of exactly the proper product class.
    I can't stand it when BS becomes widespread belief.
  • Razorbladehaze - Sunday, December 21, 2008

    What does this even mean?

    "Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. "

    First of all this is a total contradiction to logic if I'm indeed reading this correctly. I'll even let you keep the idea that 260 performs better (which it never has, although the 216 does border with the 4870 much better).

    But why then does nvidia release another version (216) with increased shaders? It's not because they suddenly have better yields, which could be the case, but once again it was reactionary. From the overall landscape of nvidia's moves they would have been happy to go on with their original design.

    Furthermore, the logic once again is not there that nvidia would increase shaders again, when releasing their product and waiting for the decrease in process size (probably the larger of the two factors) before releasing the GT295. Unless of course they felt they HAD to in order to ensure they would beat out the 4870X2. If what is posited in the quote by silicon doc were true, logically nvidia would have wasted no time and likely released a GT295 much closer to the release date of the 4870X2.

    Finally, what the hell are you talking about with "for more of an advantage in the X2 boost."

    Does this mean the 4870X2 had a boost? Or are you trying to state that there is some special feature on the 4870X2 that provides a boost (which there is not, unless this is a reference to OC'ing through overdrive)? Perhaps, though, what I interpret this as is a mistake or typo: that you really said X2 when referencing the GT295.

    And with that interpretation the whole comment, " they well knew to open it to 240 shaders for more of an advantage in the X2 boost. " only goes to reinforce the logic that nvidia felt they had to make another REACTIONARY change to their own designs to gain the edge.

    Finally, I'm done posting in here, so doc you're more than welcome to flame all you want. I've already spent enough of my precious time replying to the senseless drivel.

  • SiliconDoc - Monday, December 22, 2008

    You're a red freak dude. "Reactionary design". LOL
    roflmao
    You need some help.
    Quoting Aristotle to Nvidia - and you want to send it to their people. Like I said, you've got a huge chip on your shoulder, so you yap out big fat lies.
    Can you stop yourself sometime soon? I hope so.
  • SiliconDoc - Sunday, December 21, 2008

    You're the one who said they tried it (dual) with the 260 and found it failed, not to mention you claiming (pulling the rabbit out of your spat) they tried it with the 280 and found heat or watts untenable... another magic trick with your mind.
    Obviously the reason they opened up to 240 shaders is because they could. There's ZERO room left in the 4870 - none, fella. Zip. Nada.
    I guess you know what Nvidia engineers do because you figured it out all on your own, and they are so stupid, they tried your idiot idea, and found it didn't work.
    (NO, YOU SAID IT BECAUSE YOU HAVE A HUGE CHIP ON YOUR SHOULDER - AND YOU'RE OBVIOUSLY WRONG BECAUSE IT'S DUMB, DUMB, DUMB.)
    That's the very type of stuff that we've had to put up with for so long now - and the next round is already here.
    Endless lying pukehead statements made by the little war beasts.
    I'm sick of it - I don't care if you're not lying, don't care if you're just making up stupid dissing stuff, but to lie up a big can of crud then pretend it's true - and expect others to accept it - NO.
  • SiliconDoc - Sunday, December 21, 2008

    Perhaps you forgot the 4870 had a 512MB version released, then a 1024MB one.
    Maybe that would explain it to you, but I doubt it.
  • Razorbladehaze - Sunday, December 21, 2008

    This is funny, "I can't stand it when BS becomes widespread belief. "

    I've noticed that this poster is quite guilty of spinning BS for others to buy into.

  • rocky1234 - Friday, December 19, 2008

    Problems in Far Cry 2 with CrossFire? I have a 4870X2 & the game has worked fine from the day I got it, & I got it about three days after it was released... I play on maxed out settings & 1080P without issues & the game always ran smooth. The only reason I stopped playing it is because I got bored of the game.

    This release of the 295 is just a PR move for Novidia, nothing more. They believe they have to be Number 1 or nothing, because if they are number 1 they can charge whatever they want for their video cards.

    I am just waiting for all those websites to start doing the reviews & when the Novidia card wins a few they trash the AMD card because it lost by a few frames, even though the card will let you play pretty much any game at high res & maxed settings. Oh yeah I did find a use for my old 8800GT/9800GT I now use it as a slave card for PhysX. The only problem is, where are the games? So far the ones that use it are low in numbers & the gameplay does not seem to get better by having it, except now when the Novidia card kicks in & does the CPU's work my quad core can take a small nap...lol

  • SiliconDoc - Tuesday, December 30, 2008

    " This release of the 295 is just a PR move for Novidia nothing more they beleive they have to be Number 1 or nothing because if they are number 1 they can charge what ever they want for their video cards. "

    Oh, so I guess if I (like tens of millions of others) only have a single 16x PCI-E slot like on the awesome Gigabyte P45 > GA-EP45-UD3R, or GA-EP45-UD3L, or UD3LR, or ASUS P5Q SE, P5Q, P5Q SE+, or Biostar T-Force, or an endless number of P35 motherboards, or 965, G31, G33, etc... I should just buy the ATI product, and screw me if I can't stand CCC, or any number of other things, I'll just have to live with the red card for top frames... YOU DON'T WANT ME OR TEN MILLION OTHERS ALLOWED TO BUY A TOP DUAL NVIDIA?!?!?!

    " I am just waiting for all those websites to start doing the reviews & when they Novidai card wins a few they trash the AMD card because it lost by a few frames even though the card will let you play pretty much any game at high res & maxed settings."

    Well I am waiting for it too, and I can hardly wait, since those same websites compared 2 gpu cores to one core, and declared, by total BS, 2 cores the winner. If you start screaming it's the best single slot solution like dozens have for 6 months, then tell me and mine it's a waste for NVidia to do the same thing, don't mind me when I laugh in your face and consider you a liar, ok?

    " Oh yeah I did find a use for my old 8800GT/9800GT I now use it as a slave card for PhysX"

    Ahh, good. You won't be doing that with a red card anytime soon, and if you do, we see by the news, IT WILL BE NVIDIA with a rogue single developer supporting it! (not ati - supporting their own cards for it - so much for customer care and cost consciousness)
    _________________________________________________________________

    So, you think the tens of millions of single slot 16x (so NO 16x 16x, or just 16x 4x) PCI-E users deserve a chance at a top end card other than ATI? OH, that's right, you already basically said "NO!" - and squawked that it's a PR crown stunt...

    How about SLI board owners who don't want to go ATI? You think they might be interested in DUAL in one slot or QUAD?
    No, you didn't give it a single brain cell of attentiveness or thought...
    Duhh... drool out a ripping cutdown instead....

    I'm sure I'll hear, I'm the one with the "problem", by another spewing propaganda.
  • initialised - Thursday, December 18, 2008

    [quote=Anand]Making such a move is definitely sensible, but it is at the highest end (2560x1600 with tons of blur (I mean AA, sorry))[/quote]

    No, according to the Geordie translation (that we coined on the way home from CyberPower in a traffic jam), AF is Deblurrin' while AA is Dejaggin'; please amend the article.
  • JarredWalton - Thursday, December 18, 2008

    No, antialiasing adds some blurriness to images. It smooths out jaggies on edges by blurring those edges. It's one of the reasons I don't think 4xAA or even AA in general should be at the top of the IQ features people should enable; I only turn it on when everything else is enabled, and I start at 2xAA and progress from there. I'm with Derek in that 2xAA with high resolution displays is more than enough for all but the most demanding people.
