A Quick Look Under The Hood

Our first concern, upon hearing about this hardware, was whether NVIDIA could fit two GTX 260 class GPUs on a single card without melting PSUs. With only a 6-pin + 8-pin PCIe power configuration, the board is capped at 300W (75W from the slot, 75W from the 6-pin, and 150W from the 8-pin), which doesn't seem like enough to push two of the 65nm chips. But then we learned something interesting: the GeForce GTX 295 is the first 55nm part from NVIDIA. Of course, the logical conclusion is that single-GPU 55nm hardware might not be far behind, but that's not what we're here to talk about today.
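
As a quick back-of-the-envelope check, here's the arithmetic. The connector limits come from the PCIe specification; the ~182W GTX 260 TDP is the commonly cited figure, used here purely for illustration:

```python
# PCIe power budget for a single card with 6-pin + 8-pin connectors.
# Per the PCIe spec: 75W from the slot, 75W per 6-pin, 150W per 8-pin.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

board_limit = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300W ceiling

# Two 65nm GTX 260s at their commonly cited ~182W TDP each would
# overshoot that ceiling, hence the need for a 55nm shrink.
two_65nm_gpus = 2 * 182                          # 364W

print(board_limit, two_65nm_gpus)                # 300 364
```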


Image courtesy NVIDIA

55nm is only a half-node process, so we won't see huge changes in die size (we don't have a board in hand yet, so we can't measure it), but the part should get a little smaller and cheaper to build, as well as a little easier to cool and lower power at the same performance levels (or NVIDIA could choose to push performance a little higher).
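
For a rough sense of what the shrink could buy, here's an idealized sketch: it assumes perfect linear scaling, which real half-node shrinks never fully achieve, and plugs in the widely reported 576mm² size of the 65nm GT200:

```python
# Idealized die-area scaling for a 65nm -> 55nm half-node shrink.
# Area scales with the square of the linear feature size; real-world
# shrinks recover less than this, so treat the result as a best case.
old_node, new_node = 65.0, 55.0
area_ratio = (new_node / old_node) ** 2          # ~0.716

gt200_area_65nm = 576                            # mm^2, widely reported
ideal_area_55nm = gt200_area_65nm * area_ratio

print(f"{area_ratio:.3f} scaling -> ~{ideal_area_55nm:.0f} mm^2")
```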


Image courtesy NVIDIA

As we briefly mentioned, the GPUs strapped onto this beast aren't your stock GTX 260 or GTX 280 parts. These chips are something like a GTX 280 with one memory channel disabled, running at GTX 260 clock speeds. I suppose you could also look at them as GTX 260 ICs with all 10 TPCs enabled. Either way, you end up with something that has higher shader performance than a GTX 260, but lower memory bandwidth and fillrate than a GTX 280 (remember that ROPs are tied to memory channels, so this new part only has 28 ROPs instead of 32). This is a hybrid part.
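
To make the hybrid configuration concrete, here is a minimal sketch based on the publicly documented GT200 topology (24 shader processors per TPC, 4 ROPs per 64-bit memory channel); the per-configuration tallies are our own arithmetic, not anything off an NVIDIA spec sheet:

```python
# GT200 building blocks: shader processors come in TPCs, while ROPs
# are tied to the 64-bit memory channels (4 ROPs per channel).
SPS_PER_TPC = 24        # 3 SMs per TPC x 8 scalar processors per SM
ROPS_PER_CHANNEL = 4
CHANNEL_BITS = 64

def gt200_config(tpcs, mem_channels):
    return (tpcs * SPS_PER_TPC,                 # shader processors
            mem_channels * CHANNEL_BITS,        # memory bus width
            mem_channels * ROPS_PER_CHANNEL)    # ROPs

print(gt200_config(10, 8))  # GTX 280:          (240, 512, 32)
print(gt200_config(8, 7))   # GTX 260:          (192, 448, 28)
print(gt200_config(10, 7))  # each GTX 295 GPU: (240, 448, 28)
```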


Image courtesy NVIDIA

Our first thought was binning (or what AMD calls harvesting), but since this is also a move to 55nm, we have to rethink that. It isn't clear whether this chip will make its way onto a single-GPU board, but if it did, it would likely be capable of higher clock speeds due to the die shrink and would fall between the GTX 260 Core 216 and the GTX 280 in performance. We'll just have to wait and see.

What is clear is that this is a solution gunning for the top. It is capable of quad SLI and sports not only two dual-link DVI outputs but an HDMI output as well. It isn't clear whether all retail boards will include the HDMI port found on the reference design, but more flexibility is always a good thing.


Image courtesy NVIDIA

Comments

  • chizow - Thursday, December 18, 2008 - link

    Well, I do agree that the 295 preview is clearly a marketing ploy to sway holiday buyers into waiting until after New Year's. And I also agree that the GTX 295 isn't integral to NV's strategy; it's just a band-aid with only one real purpose: to take back the single-card crown. I don't think you can really fault Nvidia for that though, they're clearly very good at more than just designing GPUs.

    I also don't fault you for not condemning the 4870X2 soft launch, as you were one of the select few sites to receive one. Obviously there's pressure in your business, just as in any other, not to fall behind the competition. I just figured AT had changed their stance due to industry pressures, which is why I was surprised by some of the comments here.

  • SiliconDoc - Sunday, December 28, 2008 - link

    So for the endless thousands of people without 2 PCIe x16 slots on their motherboards, this was just a war for the top crown with ATI?
    I guess choice is a big fat zero in our new socialist economy.
    I suppose everyone here has 2 or 3 PCIe x16 slots on their motherboards.
    Yes, what a pig of a thing for NVidia to do, actually making a card that will offer THE MAJORITY OF CONSUMERS WHO BOUGHT MOTHERBOARDS the best possible framerates, because they don't have 2 PCIe x16 slots.
    What terrible crown freaks they are.
  • jordanclock - Thursday, December 18, 2008 - link

    No, it is pathetic. Both companies have been doing just fine with hard launches in the past few years (I don't know if you recall the days of soft launches a month in advance of retail availability, or when an announced product never showed up at all) and for either company to take a step back with a paper launch is just stupid.

    If AMD did the same thing, I'm sure they would have been called stupid too. Rightfully so, I might add.

    My problem with this is that nVidia has yet to release a mid-range product in how long? The only way to get a mid-range card from nVidia is to buy the upper end of a previous generation. I would really like to see a sub-$200 MSRP 200-series card. That would be newsworthy, not another $500 card that even nVidia doesn't expect to sell all that well.
  • SiliconDoc - Friday, December 19, 2008 - link

    I'm just thrilled by the 3-week-early release. I've been attacked relentlessly by red fanboys for merely telling the truth and then providing the link.
    So the madder they are about an early announcement, the happier I am. But that's not the only reason. Despite the redfans screaming unfair, and crying that their base is being hurt by "marginalized" xmas purchases..LOL... I'd be one ticked off hombre if NVidia kept their claptrap shut and on Jan. 8th announced their new 4870X2 killer - and of course if the whiners were honest about anything at all, they'd say so as well. That ONE consideration outweighs ANYTHING ELSE THEY WHINE ABOUT, PERIOD.
    So, I have to say there are so many I consider complete raving loons, because they are so far off the mark they don't even have their own personal pocketbook in mind - which of course is akin to self-immolation. Yes, they are fired up. Burning, burning down the house. They just forgot to step outside first and remember that, my golly, that is their house they might be upgrading.
    So, I certainly HOPE that ATI releases a gigantic secret cheap upgrade card on Jan. 8th, after all the red fanboys splurged their xmas cookie monies on something that DIDN'T have an "early announced release that is 'harmful' to the end user".
    YES, WHAT A CROCK.
  • Mr Perfect - Thursday, December 18, 2008 - link

    Yes, the lack of new cards below the 260 is disappointing. They have already renamed 8000 series parts to make the 9000 series parts, and there were reports of the 9000 series parts being renamed as GT 100 parts. Hopefully it was just a rumor.

    You know there is at least one poor sap out there who's going to replace his 8800GT with a 9800GT, and then upgrade again to a GTS 150. The last time you could make a whole chain of "upgrades" and get essentially the same thing was what? The GeForce 2MX/4MX/4000 string?
  • RagingDragon - Sunday, December 21, 2008 - link

    How about Geforce 7600 -> 8600 -> 9500?
  • mczak - Thursday, December 18, 2008 - link

    A GeForce 4 MX is in no way a renamed 2 MX. While it's true that it doesn't have the feature set of a "real" GeForce 4, it is indeed a very different chip from the 2 MX (and a faster one).
  • StevoLincolnite - Thursday, December 18, 2008 - link

    The Geforce 4 MX was basically a Geforce 2 on steroids, with its enhanced memory controller and (for back then) advanced bandwidth-saving technologies.

    It gained dual-monitor support and a multi-sampling anti-aliasing unit from the Ti series, plus the improved 128-bit DDR memory controller, which was crucial to solving the bandwidth limitations that appeared on the GeForce 2 chips.

    It allowed gamers to play games on the Geforce 4 MX that were originally almost unplayable on a Geforce 2.

    It was also eventually released with PCI-E support and a wider memory bus; the funny thing was how the card outperformed the Geforce FX5200 despite having an inferior feature set.
  • larson0699 - Friday, December 19, 2008 - link

    P.I. and way overborrowed from http://en.wikipedia.org/wiki/GeForce_4_Series

    Architecturally, the 4MX was an original design, not a rebrand as with the 8800/9800. If you need a reasonable depiction of its features, however, it *is* most closely related to the 2 Ti and performs almost proportionally better (which would lead some to think of it as a derivative).

    The refresh in 2004 put this GPU on a PCI-E card with a bridge chip in between, which I wouldn't call "PCI-E support". It's a damn smart move economically when you're trying to sell the rest of your chips during a transition to a new platform, but I know not a single being who ever splurged on such a mediocre design (at least not the PCX series -- I've got a fried 6600 AGP in my hands from someone who didn't know any better).

    And I don't know where you're getting your data, but the FX5200 dusted the MX440 by about 2:1 in just about everything. I had a Radeon 8500 once and was playing on a LAN alongside a buddy with a Radeon 7500, and his card outperformed mine because it wasn't rendering everything mine was. But when I disabled the difference in effects, my card was well ahead. When I hear of an inferior new GPU, I think GMA 3100 < GMA 950, not this.

    Besides, GP's point wasn't about the GF4 but rather the twisted path of "upgrades" we're seeing more and more.
