Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective, on the PC side an ECC DIMM uses 9 chips per channel (9 bits per byte) hooked up to a 72-bit bus, instead of 8 chips on a 64-bit bus. However, NVIDIA doesn't have the ability or the desire to add even more RAM channels to their products, not to mention that 8 doesn't divide cleanly into 10 or 12 memory channels. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC, they can simply allocate a portion of RAM for the storage of ECC data. When ECC is enabled the available RAM is reduced by 1/8th (to account for the 9th ECC bit), and the ECC data is distributed across the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
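
To put some rough numbers on that, here is a minimal C sketch of the capacity math. The board sizes below are hypothetical placeholders, and the real reservation scheme is NVIDIA's own; this only illustrates the 1/8th reduction:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Hypothetical board capacities in MB. With inline ECC, 1/8th of
     * the RAM is set aside to play the role of the 9th bit per byte. */
    uint64_t physical_mb[] = { 1536, 3072, 6144 };
    size_t count = sizeof physical_mb / sizeof physical_mb[0];

    for (size_t i = 0; i < count; i++) {
        uint64_t reserved = physical_mb[i] / 8;   /* ECC storage */
        uint64_t usable   = physical_mb[i] - reserved;
        printf("%5llu MB physical -> %5llu MB usable (%llu MB for ECC)\n",
               (unsigned long long)physical_mb[i],
               (unsigned long long)usable,
               (unsigned long long)reserved);
    }
    return 0;
}
```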

On the technical side, despite this difference in implementation, NVIDIA tells us that they're still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore, NVIDIA tells us that the performance hit isn't a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize it. This is their “secret sauce,” as they call it, and it's something they don't intend to discuss in detail at this time.
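
NVIDIA isn't sharing their exact code, but SECDED itself is textbook material. Here is a minimal sketch of the principle, not NVIDIA's implementation: a Hamming(7,4) code extended with an overall parity bit, which corrects any single-bit error and detects, without correcting, any double-bit error. Production SECDED schemes protect 64-bit words with 8 check bits (hence the 1/8th overhead), but the logic is the same:

```c
#include <stdio.h>
#include <stdint.h>

/* Encode 4 data bits into an 8-bit SECDED codeword: Hamming(7,4)
 * plus an overall parity bit in bit 0. Bits 1..7 follow the classic
 * layout: parity at positions 1, 2, 4; data at positions 3, 5, 6, 7. */
static uint8_t secded_encode(uint8_t data)
{
    uint8_t d0 = (data >> 0) & 1, d1 = (data >> 1) & 1;
    uint8_t d2 = (data >> 2) & 1, d3 = (data >> 3) & 1;

    uint8_t p1 = d0 ^ d1 ^ d3;  /* covers positions 1, 3, 5, 7 */
    uint8_t p2 = d0 ^ d2 ^ d3;  /* covers positions 2, 3, 6, 7 */
    uint8_t p4 = d1 ^ d2 ^ d3;  /* covers positions 4, 5, 6, 7 */

    uint8_t cw = (uint8_t)((p1 << 1) | (p2 << 2) | (d0 << 3) |
                           (p4 << 4) | (d1 << 5) | (d2 << 6) | (d3 << 7));
    uint8_t overall = 0;
    for (int i = 1; i < 8; i++)
        overall ^= (cw >> i) & 1;
    return cw | overall;        /* overall parity lands in bit 0 */
}

/* Returns 0 = clean, 1 = single-bit error corrected,
 * 2 = double-bit error detected (uncorrectable).
 * On return 0 or 1, *data holds the recovered 4 data bits. */
static int secded_decode(uint8_t cw, uint8_t *data)
{
    uint8_t b[8];
    for (int i = 0; i < 8; i++)
        b[i] = (cw >> i) & 1;

    /* The syndrome names the flipped position (1..7), or 0 if none. */
    uint8_t s1 = b[1] ^ b[3] ^ b[5] ^ b[7];
    uint8_t s2 = b[2] ^ b[3] ^ b[6] ^ b[7];
    uint8_t s4 = b[4] ^ b[5] ^ b[6] ^ b[7];
    uint8_t syndrome = (uint8_t)(s1 | (s2 << 1) | (s4 << 2));

    uint8_t overall = 0;        /* parity over the whole codeword */
    for (int i = 0; i < 8; i++)
        overall ^= b[i];

    int status = 0;
    if (overall) {              /* odd parity: exactly one flip */
        if (syndrome)
            cw ^= (uint8_t)(1 << syndrome);
        /* syndrome == 0 means the overall parity bit itself flipped */
        status = 1;
    } else if (syndrome) {      /* even parity, nonzero syndrome */
        return 2;               /* two flips: detect, don't correct */
    }
    *data = (uint8_t)(((cw >> 3) & 1) | (((cw >> 5) & 1) << 1) |
                      (((cw >> 6) & 1) << 2) | (((cw >> 7) & 1) << 3));
    return status;
}

int main(void)
{
    uint8_t cw = secded_encode(0xB);   /* protect the nibble 0b1011 */
    uint8_t out;

    int one = secded_decode(cw ^ 0x20, &out);   /* one bit flipped  */
    printf("1 flip : status=%d data=0x%X\n", one, out);

    int two = secded_decode(cw ^ 0x22, &out);   /* two bits flipped */
    printf("2 flips: status=%d data=0x%X\n", two, out);
    return 0;
}
```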

Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions, 3DVision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that these features would launch with what is now the GTX 400 series, but as with everything else related to Fermi, they're late.

Neither 3DVision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers, due in April. There hasn't been any guidance on when in April these drivers will be released, so at this point it's anyone's guess whether they'll arrive in time for the GTX 400 series retail launch.

196 Comments

  • mcnabney - Friday, March 26, 2010 - link

    You make the most valid point.

    As long as the consoles are in the driver's seat (this isn't going to change), DX11 and the features it provides won't be widely found in games until the next generation of consoles arrives in 2-3 years.

    So really, without growth in the PC gaming market there is no need to upgrade from the last generation. Too bad, really.
  • GourdFreeMan - Friday, March 26, 2010 - link

    Thank you for listening to our feedback on improving your test suite of games, Ryan. I think your current list much better represents our interests (fewer console ports, a selection of games that better represent the game engines being used in current and future titles, fewer titles with GPU vendor bias, inclusion of popular titles that have staying power like BF:BC2, etc.) than the one you used to review the 58xx's when they were released. The only title from our suggestions that I feel is missing is Metro 2033. Kudos!
  • yacoub - Friday, March 26, 2010 - link

    Good review. The grammar errors are prolific, but I guess this was rushed to release or something.

    So it's a hot, power-hungry card with a high pricetag. Not too surprising.

    Would have liked to see a $150-range Fermi-based card sometime this year so I can ditch my 5770 and get back to NVidia, but the high temps and prices on these cards are not a good sign, especially when comparing their performance against the 5800 series.
  • AznBoi36 - Saturday, March 27, 2010 - link

    Fanboy much?
  • yacoub - Saturday, March 27, 2010 - link

    Fanboy of what?
    The ATI card I have now that I can't wait to get rid of?
    The desire for NVidia to release something competitive so I can get back to a stabler driverset and remove all traces of ATI from this PC?
  • mcnabney - Saturday, March 27, 2010 - link

    Ah yes, get back to Nvidia, whose last trick was releasing a driver that turned off GPU fans, causing instant card death.

    With the 480, turning off the fan might actually start a fire.
  • Headfoot - Monday, March 29, 2010 - link

    I bet you experienced that fan error IRL, right?

    Just like how everyone who owned a Phenom got a TLB error 100% of the time, right?
  • numberoneoppa - Friday, March 26, 2010 - link

    You know you have the best tech site around when a product review makes it seem like a DDoS is in progress.

    As far as the review itself, it's very comprehensive, so thanks Ryan! The new NVIDIA cards seem to be just where most people thought they would be. It really makes me anticipate the next HD58xx card and the AMD price cuts on the current lineup that will come with it.
  • Devo2007 - Friday, March 26, 2010 - link

    Great review, although you may want to edit this sentence:

    "NVIDIA meanwhile had to deal with the fact that they were trying to produce a very large chip on a low-yielding process, a combination for disaster given that size is the enemy of high yields."

    Shouldn't it be "large size is the enemy of low yields?" Either way, that end point seems a bit redundant.
  • SlyNine - Saturday, March 27, 2010 - link

    No, large size would be a friend of low yields. Low yields are our enemy.
