A Quick Look Under The Hood

Our first concern, upon hearing about this hardware, was whether NVIDIA could fit two GTX 260 class GPUs on a single card without melting PSUs. A 6-pin plus 8-pin PCIe power configuration doesn't seem like quite enough to push that much hardware. But then we learned something interesting: the GeForce GTX 295 is the first 55nm part from NVIDIA. Of course, the logical conclusion is that single GPU 55nm hardware might not be far behind, but that's not what we're here to talk about today.
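For reference, the ceiling here comes straight from the PCI Express power specs (a back-of-the-envelope sketch, not anything NVIDIA has stated about this particular board): the slot is good for 75W, a 6-pin connector for another 75W, and an 8-pin connector for 150W.

\[
75\,\text{W}_{\text{slot}} + 75\,\text{W}_{\text{6-pin}} + 150\,\text{W}_{\text{8-pin}} = 300\,\text{W}
\]

So whatever NVIDIA has done to get two GPUs onto one card, the board as a whole has to live under roughly 300W.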


Image courtesy NVIDIA

55nm is only a half-node process, so we won't see huge changes in die size (we don't have a board in hand yet, so we can't measure it), but the part should get a little smaller and cheaper to build, as well as a little easier to cool and lower power at the same performance levels (or NVIDIA could choose to push performance a little higher).
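As a rough sketch of what a half-node shrink can buy in the ideal case (perfect linear scaling of the same design, which real shrinks never quite achieve since pads, analog blocks, and routing don't scale cleanly):

\[
\left(\frac{55\,\text{nm}}{65\,\text{nm}}\right)^{2} \approx 0.72
\]

So the best case is a die at roughly 72% of the 65nm part's area, about a 28% reduction; the real number will be less dramatic.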


Image courtesy NVIDIA

As we briefly mentioned, the GPUs strapped onto this beast aren't your stock GTX 260 or GTX 280 parts. These chips are something like a GTX 280 with one memory channel disabled, running at GTX 260 clock speeds. You could also look at them as GTX 260 ICs with all 10 TPCs enabled. Either way, you end up with something that has higher shader performance than a GTX 260, but lower memory bandwidth and fillrate than a GTX 280 (remember that ROPs are tied to memory channels, so this new part only has 28 ROPs instead of 32). This is a hybrid part.
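To put rough numbers on the memory side (a sketch assuming the GTX 260's 448-bit bus and 999MHz / 1998MT/s effective GDDR3 carry over, against the GTX 280's 512-bit bus at 1107MHz / 2214MT/s; final clocks could differ):

\[
\text{per GPU: } \tfrac{448}{8}\,\text{B} \times 1998\,\text{MT/s} \approx 111.9\,\text{GB/s}
\qquad
\text{GTX 280: } \tfrac{512}{8}\,\text{B} \times 2214\,\text{MT/s} \approx 141.7\,\text{GB/s}
\]

And since GT200 hangs 4 ROPs off each 64-bit memory channel, 7 active channels is where the 28 ROP figure comes from.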


Image courtesy NVIDIA

Our first thought was binning (or what AMD calls harvesting), but since this is also a move to 55nm we have to rethink that. It isn't clear whether this chip will make its way onto a single GPU board, but if it did, it would likely be capable of higher clock speeds thanks to the die shrink and would fall between the GTX 260 Core 216 and the GTX 280 in performance. Of course, this part may not end up on single GPU boards at all. We'll just have to wait and see.

What is clear is that this is a solution gunning for the top. It is capable of quad SLI and sports not only two dual-link DVI outputs but an HDMI output as well. It isn't clear whether every board built will include the HDMI port found on the reference design, but more flexibility is always a good thing.


Image courtesy NVIDIA

69 Comments

  • AdamK47 - Thursday, December 18, 2008 - link

    The clock speeds are way lower than what I had expected, especially since this is 55nm.
  • SuperGee - Saturday, December 20, 2008 - link

    Then you're not taking a lot of factors into account.

    1) The GTX 280 isn't a bigger chip only because it's on 65nm.
    It also has about 1.5 times the transistors of RV770.
    So it's still a bigger chip; 55nm just makes it less extreme.
    2) Because of that, a 55nm GT200 at GTX 280 speeds would draw a lot more power to beat RV770.
    3) While the audience blames the 65nm GT200 as the power draw king, that doesn't make RV770 a green chip. It also dissipates far over 100 watts.
    4) The 4870 X2 with its 275 watts is just as much of a heater as the GTX 280, even more so, and ATI is pushing it too.
    5) To avoid crowning a new king of power and heat, 300 watts is the limit.
    6) GT200 at 55nm is too power hungry to pull the full potential out of GT200. They could do GTX 285 speeds, but that would be around 365 watts; GTX 280 speeds, around 320 watts.
    7) So you get a low-voltage GT200 x2 at 289 watts, a tad more than its direct competition and enough power to beat it by a small margin.

    Mistakes in that view:
    A) 55nm is no miracle solution. GT200 would only get down to RV770-ish size at 40nm. GT200 at 40nm makes sense, to fill the gap to GT300, the possible next-gen DX11 part. NVIDIA doesn't need a new 40nm DX10 chip; they have GT200.
    B) RV770 isn't a candidate for environmental prizes either; it still draws a lot of power.

    In my own history, systems have gone from 250 watt supplies in the 80386 days to 550 watts.

    So to me it's just as expected. I speculated on the name GTX 265 X2, but they dropped the X2 for the new number, GTX 295.
  • SiliconDoc - Sunday, December 21, 2008 - link

    But the 4870 should be compared to either 260, because that's where its stats land.
    What has already been documented endless times is that the 4870 draws 1-3 watts less than the 260 in full 3D use, while the 260 draws 30 watts less at idle.
    So the 4870 is the WORSE card when it comes to power consumption.
    Now if you want to compare it to the 280, then you're comparing it to the card that beats it soundly, not just a little bit like the 260.
    I saw all the charts with the 4870 supposedly beating the 260 in power consumption, because the 3D consumption was 1-3 watts less, and the 30 watt idle advantage for the 260 was "secondary".
    No, that doesn't make sense to me unless you're gaming 10x-30x more than you're in 2D or on your desktop - and even then it would be a tie - but people DON'T have their cards in 3D gaming mode at that percentage of time compared to 2D or "idle".
    So there was plenty of skew out there.
    I don't understand how "judgement" can be so far off, except by thinking the charts I referred to are "auto generated" and use the 3D mode score ONLY for the ordering. Must be too difficult or too much of a hassle to manually change the order, so the reviewer, instead of apologizing for the chart generation method, just goes along with the twisted explanation.
    Then the "fans" just blabber on repeating it.
    That is exactly what I have seen.
    The 260 uses less power than the 4870, and it beats it slightly overall in game benches.
    Now there's the truth. That's actually exactly what all the data says. Oh well.
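    (To spell out the 10x break-even in the argument above - a sketch using roughly a 30 watt idle advantage for the 260 against a 3 watt load advantage for the 4870 - average draw only ties when

    \[
    3\,\text{W} \times t_{\text{gaming}} = 30\,\text{W} \times t_{\text{idle}}
    \quad\Rightarrow\quad
    t_{\text{gaming}} = 10 \times t_{\text{idle}},
    \]

    that is, when the machine spends ten times as many hours gaming as sitting idle.)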
  • s d - Thursday, December 18, 2008 - link

    4 simple words ...

    n o t ... direct ... x ... 11

    :)
  • SuperGee - Saturday, December 20, 2008 - link

    It could take a while before the first DX11 cards come, and they'll first have to prove themselves with DX10 for a long time.
    The DX11 SDK will probably come out close behind the cards, but the games take a lot longer.

    It could be six months after the DX11 runtime and hardware release before the first DX11 games drop in.

    Don't expect it in 2009; more like mid 2010 if DX11 is released at the end of 2009.

    Till then I would enjoy games with a GTX 285.
  • michaelheath - Thursday, December 18, 2008 - link

    I know the engineering process for a new card release starts a few years in advance, but what is Nvidia thinking by producing nothing but high-end cards? Sure, a dual 200-series GPU card is a prize pony you can trot out to say, "Yeah, we got the fastest, prettiest one out there." In this day and age, though, the 'halo effect' goes to the company who can produce a high-performance product without associating a high-performance cost-to-own or cost-to-run.

    Nvidia needs to fill the void left in the 200-series' wake. Price-conscious shoppers might go for the 9x00 cards, but tech-savvy, price-conscious buyers know well enough that the 9x00 cards are nothing but renamed or, at best, die-shrunk versions of two-year-old technology. ATI, on the other hand, has a full range of cards in the 4xx0 series that is new(er) technology, covers a fuller spectrum of prices ($50-300 before spiking to $500 for the X2 cards, and $100-200 buys you very respectable performance), and the new generation consistently outpaces the last-gen products.

    Now if ATI's driver team would spend more time on QA and fix that shadow bug in Left 4 Dead, I'd dump my 8800 GT 512 in a heartbeat for a Radeon 4870 1GB. The only thing that would keep me with the green team is if they die-shrink the GTX 260, bump the clock speeds considerably, and put an MSRP of $200 on it.
  • SiliconDoc - Sunday, December 28, 2008 - link

    ps - Maybe you should have just been HONEST and come out with it:
    "My 8800 GT is hanging really tough, it's still plenty good enough to live with, and going with the same card company won't be a different experience, so the 4870 1GB looks real good, but it's $300 and that's a lot of money when it isn't better than the 260.
    Why can't NV make a 4870 1GB killer FOR ME, one that makes it worth replacing my HIGH VALUE 8800 GT that's STILL hanging tough, for like $200?"
    _________________________________________________________________

    Yeah, good luck. With all the cards out there one could still say it's "moving rather slowly" - since years-older tech (the 8800 series) still runs most monitors (pushing it at 1600x****) and plays games pretty well.

    You're sitting in a tough spot; the fact is your card has had a lot of lasting value.
  • SiliconDoc - Sunday, December 28, 2008 - link

    Oh wait a minute, I guess I misinterpreted. ATI took their single 4000 series chip and went wackado on it - and made a whole range of cards - and NVidia took their single G80 series core and went wackado on it - and made a whole range of cards - oh, but NV even went a step further, reworked the dies, and came up with not just G80 but G82, G84, G90, G92...
    Umm, yeah, did ATI do that? YES?
    _________________________________________

    I REST MY CASE!
  • SiliconDoc - Sunday, December 28, 2008 - link

    So the cards NV offers all the way down to the 8400 GS, for 20 bucks at the egg, just don't exist according to you.
    I guess NV should take another corporate board/CEO wannabe's net advice and remake all the chips that already match the knocked-down versions of ATI's latest great single chip, which can't even match the NV one.
    For instance, the 4850 is compared to the 9800 GT, the 9800 GTX and the 9800 GTX+. Why can't ATI make a chip better than NVidia's several-generations-older chips? Maybe that's what you should ask yourself. It would be a lot more accurate than what you said.
    Let me know about these cards "filling the lower tiers":
    GeForce 9400 GT
    GeForce 9500 GT
    GeForce 9600 GT
    GeForce 9600 GSO
    GeForce 9800 GT
    GeForce 8400 GS
    GeForce 8500 GT
    GeForce 8600 GT
    GeForce 8800 GT

    ______________________________-

    Those don't exist, right? The reason NV doesn't make a whole new chip lineup just to please you - you know, one that is "brand new" like the six-month or year-old 4000 series is now - is BECAUSE THEY ALREADY HAVE A WHOLE SET OF THEM.
    _________________________________

    Whatever, another deranged red bloviator on wackoids.
  • sandman74 - Thursday, December 18, 2008 - link


    There is hard performance data on this card at www.bit-tech.net

    Basically it performs like a 280 in SLI in most cases (or thereabouts), which is pretty good, and it does indeed beat the 4870 X2.

    I doubt this card is for me, though. Too expensive, too hot to run, too much power. I'm hoping the GTX 285 will be more appropriate for me.
