Preliminary Thoughts

With a board power of 289W, this thing isn't going to be cheap to run. Plugging two into a system is going to push the envelope, but 3-way GTX 280 will still consume more power. It is likely that NVIDIA cut memory bandwidth in order to save on a couple hundred megabytes of RAM that would have drawn too much additional power. Making such a move is definitely sensible, but it is at the highest end (2560x1600 with tons of blur (I mean AA, sorry)) where tons of RAM are needed to push performance.

Of course, with two cards (especially if a game is capable of alternate frame rendering (AFR)), memory-limited performance issues will be mitigated quite a bit, and opening up the shader power of two GTX 280-class GPUs in a single slot is big for games that use a lot of compute. How future games will balance compute and memory has yet to be seen, but NVIDIA has been saying for years that the future lies in continuing to increase the compute ratio.

We like hard launches. This isn't one. While that's disappointing, we do really want to get our hands on this hardware. The GTX 295 definitely looks like it will best the Radeon HD 4870 X2 in terms of raw power. Beyond that, it is clear that AMD hasn't taken driver development seriously enough, and CrossFire just isn't as robust as SLI. Relying on a CrossFire-based solution for their highest-end part means it is necessary to provide reliable performance and stability across all games, new and old, and on all platforms. User-defined profiles that allow forcing different CrossFire modes in certain games would go a long way toward helping, but the real relief will come when AMD decides to fix their broken driver development model.

As it stands, SLI is a better solution than CrossFire, and the GPUs on the GTX 295 will really put the screws to RV770. We will very likely see NVIDIA take back the crown in terms of single-card performance.

That said, how sad is it that NVIDIA had to push this press info out three weeks before availability just to try to slow AMD's momentum during the holiday season?

View All Comments

  • AdamK47 - Thursday, December 18, 2008 - link

    The clock speeds are way lower than what I had expected, especially since this is 55nm.
  • SuperGee - Saturday, December 20, 2008 - link

    Then you're not taking a lot of factors into account.

    1) The GTX 280 isn't a bigger chip only because of 65nm;
    it also has about 1.5 times as many transistors as RV770.
    So it's still a bigger chip, but 55nm makes it less extreme.
    2) Because of that, a 55nm GT200 at GTX 280 speeds would draw a lot more power to beat RV770.
    3) While the audience blames 65nm GT200 as the power-draw king, that doesn't make RV770 a green chip. It also dissipates far over 100 watts.
    4) The 4870 X2, at 275 watts, is just as much a heater as the GTX 280, even more so, and ATI is pushing it too.
    5) To avoid crowning a new king of power heating, 300 watts is the limit.
    6) GT200 at 55nm is too power hungry to take full advantage of GT200's potential. They could do GTX 285 speeds, but that would be around 365 watts; GTX 280 speeds, around 320 watts.
    7) So you get a low-voltage GT200 x2 at 289 watts, a tad more than its direct competition, with enough power to beat it by a small margin.

    Common misconceptions:
    A) 55nm is no miracle solution. GT200's size would only become RV770-ish at 40nm, so GT200 at 40nm makes sense, to fill the gap until GT300, the possible next-gen DX11 part. NVIDIA doesn't need a new 40nm DX10 chip; they have GT200.
    B) RV770 is no candidate for environmental prizes either; it still draws a lot of power.

    From my own history: since the 80386 days, power has gone from 250 watts to 550 watts.

    So to me it's just as expected. I speculated about the name GTX 265 X2, but they dropped the X2 for the new number GTX 295.
  • SiliconDoc - Sunday, December 21, 2008 - link

    But the 4870 should be compared to the 260, because that's where its stats rest.
    What has already been documented endless times is that the 4870 draws 1-3 watts less than the 260 under full 3D load, while the 260 draws 30 watts less at idle.
    So the 4870 is the WORSE card when it comes to power consumption.
    Now if you want to compare it to the 280, why, then you're comparing it to the card that beats it soundly, not just a little bit like the 260.
    I saw all the charts with the 4870 supposedly beating the 260 in power consumption, because the 3D consumption was 1-3 watts less and the 30-watt idle advantage for the 260 was "secondary".
    No, that doesn't make sense to me unless you're gaming 10x-30x more than you're in 2D or on your desktop, and even then it would be a tie; but people DON'T have their cards in 3D gaming mode at that percentage of time compared to 2D or "idle".
    So there was plenty of skew about there.
    I don't understand how "judgement" can be so far off, except by thinking the charts I referred to are "auto-generated" and use the 3D mode score ONLY for the ordering. Must be too difficult or too much of a hassle to manually change the order, so the reviewer, instead of apologizing for the chart generation method, just goes along and makes the twisted explanation.
    Then the "fans" just blabber on, repeating it.
    That is exactly what I have seen.
    The 260 uses less power than the 4870, and it beats it slightly overall in game benches.
    Now there's the truth. That's actually exactly what all the data says. Oh well.
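    [Ed: the duty-cycle arithmetic in the comment above can be checked with a quick sketch. All wattages below are illustrative assumptions echoing the claimed 1-3 W load gap and 30 W idle gap, not measured figures.]

    ```python
    def average_power(idle_w, load_w, load_fraction):
        """Time-weighted average draw, given the fraction of time spent under 3D load."""
        return load_w * load_fraction + idle_w * (1.0 - load_fraction)

    # Assumed (hypothetical) figures: the 4870 draws ~2 W less under load
    # but ~30 W more at idle than the GTX 260.
    gtx260 = {"idle": 45.0, "load": 160.0}
    hd4870 = {"idle": 75.0, "load": 158.0}

    idle_gap = hd4870["idle"] - gtx260["idle"]  # 30 W, favoring the 260
    load_gap = gtx260["load"] - hd4870["load"]  # 2 W, favoring the 4870

    # Break-even fraction of time under 3D load, where both cards
    # average the same draw: idle_gap * (1 - f) = load_gap * f.
    break_even = idle_gap / (idle_gap + load_gap)
    print(f"break-even load fraction: {break_even:.4f}")  # 0.9375
    ```

    [With these numbers the two cards only tie in average draw if the machine spends about 94% of its time under 3D load, i.e. gaming roughly 15x as much as idling, which is consistent with the "10x-30x more" range claimed above.]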
  • s d - Thursday, December 18, 2008 - link

    4 simple words ...

    n o t ... direct ... x ... 11

  • SuperGee - Saturday, December 20, 2008 - link

    It could take a while before the first DX11 cards arrive, and they'll first have to prove themselves on DX10 for a long time.
    The DX11 SDK may come out close after the cards, but the games take a lot longer.

    It could be six months after the DX11 runtime and hardware release before the first DX11 games drop in.

    Don't expect it in 2009; more like mid-2010, if DX11 is released at the end of 2009.

    Till then I would enjoy games with a GTX 285.
  • michaelheath - Thursday, December 18, 2008 - link

    I know the engineering process for a new card release starts a few years in advance, but what is Nvidia thinking by producing nothing but high-end cards? Sure, a dual 200-series GPU card is a prize pony you can trot out to say, "Yeah, we got the fastest, prettiest one out there." In this day and age, though, the 'halo effect' goes to the company who can produce a high-performance product without associating a high-performance cost-to-own or cost-to-run.

    Nvidia needs to fill the void left in the 200-series' wake. Price-conscious shoppers might go for the 9x00 cards, but the tech-savvy price-conscious buyers know well enough that the 9x00 cards are nothing but renames or, at best, die-shrinks of 2-year-old technology. ATI, on the other hand, has a full range of cards in the 4xx0-series that is new(er) technology, covers a fuller spectrum of price ($50-300 before spiking to $500 for the X2 cards, and $100-200 buys you very respectable performance), and the new generation consistently outpaces the last-gen products.

    Now if ATI's driver team would spend more time on QA and fix that shadow bug in Left 4 Dead, I'd dump my 8800 GT 512 in a heartbeat for a Radeon 4870 1GB. The only thing that would keep me with the green team is if they die-shrink the GTX 260, bump the clock speeds considerably, and put an MSRP of $200 on it.
  • SiliconDoc - Sunday, December 28, 2008 - link

    PS - Maybe you should have just been HONEST and come out with it:
    "My 8800 GT is hanging really tough, it's still plenty good enough to live with, and going with the same card company won't be a different experience, so the 4870 1GB looks real good, but it's $300 bucks and that's a lot of money when it isn't better than the 260.
    Why can't NV make a 4870 1GB killer FOR ME, that makes it worth replacing my HIGH VALUE 8800 GT that's STILL hanging tough, for like $200?"

    Yeah, good luck. With all the cards one could still say it's "moving rather slowly" - since years-older tech (the 8800 series) still runs most monitors (pushing it at 1600x****) and plays games pretty well.

    You're sitting in a tough spot; the fact is your card has had a lot of lasting value.
  • SiliconDoc - Sunday, December 28, 2008 - link

    Oh wait a minute, I guess I misinterpreted. ATI took their single 4000-series chip and went wackado on it and made a whole range of cards, and NVIDIA took their single G80-series core and went wackado on it and made a whole range of cards - oh, but NV even went a step further and reworked the dies and came up with not just G80 but G82, G84, G90, G92...
    Umm, yeah, did ATI do that? YES?

  • SiliconDoc - Sunday, December 28, 2008 - link

    So the cards - NV, all the way down to the 8400 GS for 20 bucks at the egg... just don't exist according to you.
    I guess NV should take another corporate-board/CEO-wannabe's net advice and remake all their chips that already match the knocked-down versions of ATI's latest great single chip, which can't even match the NV one.
    For instance, the 4850 is compared to the 9800 GT, the 9800 GTX, and the 9800 GTX+. Why can't ATI make a chip better than NVIDIA's several-generations-older chips? Maybe that's what you should ask yourself. It would be a lot more accurate than what you said.
    Let me know about these cards "filling the lower tiers".
    GeForce 9400 GT
    GeForce 9500 GT
    GeForce 9600 GT
    GeForce 9600 GSO
    GeForce 9800 GT
    GeForce 8400 GS
    GeForce 8500 GT
    GeForce 8600 GT
    GeForce 8800 GT


    Those don't exist, right? The reason NV doesn't make a whole new chip lineup just to please you, you know, one that is "brand new" like the six-month or year-old 4000 series now, is BECAUSE THEY ALREADY HAVE A WHOLE SET OF THEM.

    Whatever, another deranged red bloviator on wackoids.
  • sandman74 - Thursday, December 18, 2008 - link

    There is hard performance data on this card at

    Basically it performs like a 280 in SLI in most cases (or thereabouts) which is pretty good, and does indeed beat the 4870 X2.

    I doubt this card is for me though. Too expensive, too hot to run, too much power. I'm hoping the GTX 285 will be more appropriate for me.

