Preliminary Thoughts

With a board power of 289W, this thing isn't going to be cheap to run. Plugging two into a system will push the envelope, though 3-way GTX 280 SLI will still consume more power. It is likely that NVIDIA made the changes to memory bandwidth in order to save the power that a couple hundred extra megabytes of RAM would draw. Making such a move is definitely sensible, but it is at the highest end (2560x1600 with tons of blur (I mean AA, sorry)) where lots of RAM is needed to push performance.

Of course, with two GPUs (especially in games capable of alternate frame rendering (AFR)), memory-limited performance issues will be mitigated quite a bit, and opening up the shader power of two GTX 280 class GPUs on a single card is big for games that lean heavily on compute. How future games will balance compute against memory remains to be seen, but NVIDIA has been saying for years that the future lies in an ever-increasing compute ratio.

We like hard launches. This isn't one. While that's disappointing, we do really want to get our hands on this hardware. The GTX 295 definitely looks like it will best the Radeon HD 4870 X2 in terms of raw power. Beyond that, it is clear that AMD hasn't taken driver development seriously enough and CrossFire just isn't as robust as SLI. Relying on a CrossFire-based solution for its highest end part means AMD must provide reliable performance and stability across all games, new and old, and on all platforms. User-defined profiles that allow forcing different CrossFire modes in specific games would go a long way toward helping, but the real relief will come when AMD decides to fix its broken driver development model.

As it stands, SLI is a better solution than CrossFire, and the GPUs on the GTX 295 will really put the screws to RV770. We will very likely see NVIDIA take back the crown in terms of single-card performance.

That said, how sad is it that NVIDIA had to go and push this press info out there three weeks before availability just to try to slow AMD's momentum during the holiday season?

69 Comments


  • chizow - Thursday, December 18, 2008 - link

    Well I do agree that the 295 preview is clearly a marketing ploy to sway holiday buyers to wait until after New Year's. And I also agree that the GTX 295 isn't integral to NV's strategy, it's just a band-aid with only 1 real purpose: to take back the single card crown. I don't think you can really fault Nvidia for that though, they're clearly very good at more than just designing GPUs.

    I also don't fault you for not condemning the 4870X2 soft launch as you were one of the few select sites to receive one. Obviously there's pressure in your business just as any other to not fall behind the competition. I just figured AT had changed their stance due to industry pressures, which is why I was surprised by some of the comments here.

  • SiliconDoc - Sunday, December 28, 2008 - link

    So for the endless thousands of people without 2 pci-e x16 slots on their motherboards, this was just a war for top crown with ATI ?
    I guess choice is a big fat zero in our new socialist economy.
    I suppose everyone here has 2 or 3 pci-e 16x slots on their motherboards.
    Yes, what a pig of a thing for NVidia to do, actually make a card that will offer THE MAJORITY OF CONSUMERS WHO BOUGHT MOTHERBOARDS - the best possible framerates because they don't have 2 pci-e 16x slots.
    What terrible crown freaks they are.
  • jordanclock - Thursday, December 18, 2008 - link

    No, it is pathetic. Both companies have been doing just fine with hard launches in the past few years (I don't know if you recall the days of soft launches a month in advance of retail availability, or when an announced product never showed up at all) and for either company to take a step back with a paper launch is just stupid.

    If AMD did the same thing, I'm sure they would have been called stupid too. Rightfully so, I might add.

    My problem with this is that nVidia has yet to release a mid-range product in how long? The only way to get a mid-range card from nVidia is to buy the upper end of a previous generation. I would really like to see a sub-$200 MSRP 200-series card. That would be newsworthy. Not another $500 card that even nVidia doesn't expect to sell all that well.
  • SiliconDoc - Friday, December 19, 2008 - link

    I'm just thrilled by the 3 week early release. I've been attacked relentlessly by red fanboys for merely telling the truth then providing the link.
    So the madder they are about a three-week early announcement, the happier I am. But that's not the only reason. Despite the redfans screaming unfair, and crying that their base is being brought down unfairly by "marginalized" xmas purchases..LOL... I'd be one ticked off hombre if NVidia kept their claptrap shut and on Jan. 8th announced their new 4870x2 killer - and of course if the whiners were honest about anything at all, they'd say so as well. That ONE consideration outweighs ANYTHING ELSE THEY WHINE ABOUT, PERIOD.
    So, I have to say there are so many I consider complete raving loons because they are so far off the mark they don't have even their own personal pocketbook in mind - which of course is akin to self immolation. Yes, they are fired up. Burning, burning down the house. They just forgot to step outside first, and remember, that my golly, that is their house they might be upgrading.
    So, I certainly HOPE that ATI releases a gigantic secret cheap upgrade card on Jan. 8th, after all the red fanboys splurged their xmas cookie monies on something that DIDN'T have an "early announced release that is harmful to the end user".
    YES, WHAT A CROCK.
  • Mr Perfect - Thursday, December 18, 2008 - link

    Yes, the lack of new cards below the 260 is disappointing. They have already renamed 8000 series parts to make the 9000 series parts, and there were reports of the 9000 series parts being renamed as GT 100 parts. Hopefully that was just a rumor.

    You know there is at least one poor sap out there who's going to replace his 8800GT with a 9800GT, and then upgrade again to a GTS 150. The last time you could make a whole chain of "upgrades" and get essentially the same thing was what? The GeForce 2MX/4MX/4000 string?
  • RagingDragon - Sunday, December 21, 2008 - link

    How about Geforce 7600 -> 8600 -> 9500?
  • mczak - Thursday, December 18, 2008 - link

    A GeForce 4 MX is in no way a renamed 2 MX. While it's true that it doesn't have the feature set of a "real" GeForce 4, it is indeed a very different chip than the 2 MX (and a faster one).
  • StevoLincolnite - Thursday, December 18, 2008 - link

    The Geforce 4 MX was basically a Geforce 2 on steroids, with its enhanced memory controller and (advanced for the time) bandwidth-saving technologies.

    It added dual-monitor support and a multi-sampling anti-aliasing unit from the Ti series, along with the improved 128-bit DDR memory controller, which was crucial to solving the bandwidth limitations that appeared on the GeForce 2 chips.

    It allowed gamers to play games on the Geforce 4 MX that were originally almost unplayable on a Geforce 2.

    It was also eventually released with PCI-E support and a wider memory bus; the funny thing was how the card outperformed the Geforce FX5200 despite having an inferior feature set.
  • larson0699 - Friday, December 19, 2008 - link

    P.I. and way overborrowed from http://en.wikipedia.org/wiki/GeForce_4_Series

    In architecture, the 4MX was an original design, not a rebrand as in 8800/9800. In terms of its feature set, however, it *is* most closely related to the 2 Ti and performs almost proportionally better (which would lead some to think of it as a derivative).

    The refresh in 2004 put this GPU on a PCI-E card with a bridge chip in between, which I wouldn't call "PCI-E support". It's a damn smart move economically when you're trying to sell the rest of your chips during a transition to a new platform, but I know not a single being who ever splurged on such a mediocre design (at least not the PCX series -- I've got a fried 6600 AGP in my hands from someone who didn't know any better).

    And I don't know where you're getting your data, but the FX5200 dusted the MX440 by about 2:1 in most of everything. I had a Radeon 8500 once and was playing on a LAN alongside a buddy with a Radeon 7500, and his card outperformed mine because it wasn't rendering everything mine was. But when I disabled the difference in effects, my card was well ahead. When I hear of an inferior new GPU, I think GMA 3100 < GMA 950, not this.

    Besides, GP's point wasn't about the GF4 but rather the twisted path of "upgrades" we're seeing more and more.
