Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective, for PC DIMMs an ECC DIMM will be 9 chips per channel (9 bits per byte) hooked up to a 72-bit bus instead of 8 chips on a 64-bit bus. However NVIDIA doesn’t have the ability or the desire to add even more RAM channels to their products, not to mention that 8 doesn’t divide cleanly into 10/12 memory channels. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC they can just allocate RAM for the storage of ECC data. When ECC is enabled the available RAM will be reduced by 1/8th (to account for the 9th ECC bit) and then ECC data will be distributed among the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
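The arithmetic of the carve-out is straightforward; a minimal sketch (the board size below is hypothetical, not an NVIDIA figure) looks like this:

```python
def usable_memory_mb(total_mb, ecc_enabled):
    """With ECC on, 1/8th of the RAM is reserved to hold ECC data
    (the '9th bit' per byte), leaving 7/8 of the capacity usable."""
    return total_mb * 7 // 8 if ecc_enabled else total_mb

# Hypothetical 3072 MB board: ECC on leaves 2688 MB for data.
print(usable_memory_mb(3072, True))   # 2688
print(usable_memory_mb(3072, False))  # 3072
```

The same 1/8th figure is where the worst-case 12.5% bandwidth cost comes from: every eighth byte moved across the bus is check data rather than payload.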

On the technical side, despite this difference in implementation NVIDIA tells us that they’re still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore NVIDIA tells us that the performance hit isn’t a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize the performance hit. This is their “secret sauce,” as they call it, and it’s something they don’t intend to discuss in detail at this time.
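NVIDIA hasn’t disclosed the exact code it uses, but SECDED itself is textbook material: an extended Hamming code with one extra overall parity bit. A toy sketch at a 4-bit word size (real implementations protect 64-bit words with 8 check bits, but the mechanics are identical) shows how single errors get corrected while double errors are flagged:

```python
def encode(nibble):
    """Encode 4 data bits as an 8-bit extended Hamming codeword.
    Index 0 holds the overall parity bit; 1..7 is Hamming(7,4)."""
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]          # covers positions 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]          # covers positions 6, 7
    p4 = d[1] ^ d[2] ^ d[3]          # covers positions 5, 6, 7
    word = [0, p1, p2, d[0], p4, d[1], d[2], d[3]]
    overall = 0
    for b in word[1:]:
        overall ^= b
    word[0] = overall                # overall parity over the whole word
    return word

def decode(word):
    """Return (status, nibble): 'ok', 'corrected', or 'double_error'."""
    s = 0
    for pos in range(1, 8):          # syndrome: XOR of positions of set bits
        if word[pos]:
            s ^= pos
    overall = 0
    for b in word:                   # should XOR to 0 for a valid codeword
        overall ^= b
    if s == 0 and overall == 0:
        status = 'ok'
    elif s != 0 and overall == 1:    # single-bit error at position s: fix it
        word = word[:]
        word[s] ^= 1
        status = 'corrected'
    elif s != 0 and overall == 0:    # two errors: detectable, not correctable
        return 'double_error', None
    else:                            # the overall parity bit itself flipped
        status = 'corrected'
    nibble = word[3] | (word[5] << 1) | (word[6] << 2) | (word[7] << 3)
    return status, nibble
```

Flipping any one of the eight stored bits still decodes to the original data; flipping two is reported rather than silently mis-corrected, which is exactly the reliability guarantee NVIDIA is claiming parity with.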

Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions, 3DVision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it’s late.

Neither 3DVision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn’t been any guidance on when in April these drivers will be released, so at this point it’s anyone’s guess whether they’ll arrive in time for the GTX 400 series retail launch.

197 Comments

  • WiNandLeGeNd - Saturday, March 27, 2010 - link

    I think this was a great review, as mentioned previously, very objective. I think though that I may get a 480, because when I buy a card I keep it for 3 to 4 years before I get a new one, aka every other gen. And seeing that tessellation is really the gift horse of DX11 and how much more tessellation power is in the 480's, I think it could very much pay off in the future. If not then I spent an extra $85 for a tad extra performance as I just pre-ordered one for 485 and the 5870's are at $400 still.

    My only concern is heat and power, but most of the cards have a lifetime warranty. Hopefully my OCZ GamerXtreme 850W can handle it at max loads. The two 12V rails for the two 6-pin PCIe connectors are 20A each. I saw 479W max consumption, however that was FurMark; at 12V that's 39.9 amps, so it would be extremely close if there is ever a game that utilizes that much power. Although if I recall, ATI specifically stated a while back not to use that as it pushes loads that are not possible to see in an actual game; I think they had an issue with the 4000 series burning out power regulators, correct me if I'm wrong.
    Reply
  • Alastayr - Saturday, March 27, 2010 - link

    I'm with Sunburn on this one. Your reasoning doesn't make much sense. You must not have followed the GPU market for the last few years, because

    first) "every other gen" would mean a 2 year cycle
    second) Nothing's really gonna pay off in the future, as the future will bring faster cards for a fraction of the price. You'd only enjoy those questionable benefits until Q4, when AMD releases Northern Islands and nVidia pops out GF100b or whatever they'll call it.
    third) Tessellation won't improve further that fast. If anything, developers will focus on the lowest common denominator, which would be Cypress. Fermi's extra horsepower will most likely stay unused.
    fourth) Just look at your power bill. The 25W difference with a "typical" Idle scheme (8h/day; 350d/y) comes to 70kWh which where I live translates to around $20 per year. That's Idle *only*. You're spending way more than just $85 extra on that card.
    fifth) The noise will kill you. This isn't a card that just speeds up for no reason. You can't just magically turn down the fan from 60% to 25% and still enjoy temps of <90°C like on some GTX 260 boards. Turn up your current fan to 100% for a single day. Try living through that. That's probably what you're buying.

    In the end everyone has to decide this for himself. But for someone to propose keeping a GTX 480 in his PC for a whopping 3-4 years... I don't know man. I'd rather lose a finger or two. ;)

    tl;dr I know, I know. But really, people. Those cards aren't hugely competitive, they're priced too high, and nV's drivers suck as much as ATi's (allegedly) do nowadays. Which is to say, neither do.

    I could honestly kick myself right now. I had a great deal on a 5850 in Nov. and I waited for nV to make their move. Now the same card will cost me $50 more, and I've only wasted time waiting for the competitive GTX 470 that never was. Argh.
    Reply
  • Sunburn74 - Saturday, March 27, 2010 - link

    That's kind of bad logic, imo. I'm not a fanboy on either side, but it's clear to me that NVIDIA targeted the performance of their cards to fit exactly between the 5970, the 5870, and the 5850. It's much harder to release a card not knowing what the other guy truly has, as opposed to releasing a card knowing exactly what sort of performance levels you have to hit.

    Two, realistically, think of the noise. I mean, if you've ever heard a GTX 260 at 100 percent fan speed, that's the sort of fan noise you're going to be experiencing on a regular basis. It's not a mild difference.

    And three, realistically, for the premium you're paying for the extra performance (which is not useful right now, as there are no games to take advantage of it) as well as for the noise, heat, and power, you could simply buy the cheaper 5870, save that $85-150 extra, and sell off the 5870 when the time is right.

    I just don't see why anyone would buy this card unless they were specifically taking advantage of some of the compute functions. As a consumer card it is a failure. Power and heat be damned; the noise, the noise! Take your current card up to 100 percent fan speed, listen to it for a few minutes, and that's about what you should expect from these GPUs.
    Reply
  • andyo - Saturday, March 27, 2010 - link

    I too am getting the warning message with Firefox 3.6.2. Posting this on IE. Here's the message:

    http://photos.smugmug.com/photos/820690277_fuLv6-O...
    Reply
  • JarredWalton - Saturday, March 27, 2010 - link

    We're working on it. Of course, the "Internet Police" have now flagged our site as malicious because of one bad ad that one of the advertisers put up, and it will probably take a week or more to get them to rescind the "Malware Site" status. Ugh....
    Reply
  • jeffrey - Saturday, March 27, 2010 - link

    Give the advertiser that put up the bad ad hell!
    Reply
  • LedHed - Saturday, March 27, 2010 - link

    The people who are going to buy the GTX 480/470 are enthusiasts who most likely bought the GTX 295 or had 200 series SLI. So not including the 295 in every bench is kind of odd. We need to see how the top end of the last gen does against the new gen's top end.
    Reply
  • Ryan Smith - Saturday, March 27, 2010 - link

    What chart is the 295 not in? It should be in every game test.
    Reply
  • kc77 - Saturday, March 27, 2010 - link

    Well, the 295 beats the 470 in most benches, so there's no real need to include it in all benches. Personally I think the 480 is the better deal, although I am not buying those cards until a respin/refresh; those temps and power requirements are just ridiculous.
    Reply
  • bigboxes - Saturday, March 27, 2010 - link

    I know you "upgraded" your test PSU to the Antec 1200W PSU, but did you go back and try any of these tests/setups with your previous 850W PSU to see if could handle the power requirements. It seemed that only your 480 SLI setup drew 851W in total system in the Furmark load test. Other than that scenario it looks like your old PSU should handle the power requirements just fine. Any comments? Reply
