Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective: on a PC, an ECC DIMM uses 9 chips per channel (9 bits per byte) hooked up to a 72-bit bus, instead of 8 chips on a 64-bit bus. However, NVIDIA has neither the ability nor the desire to add even more RAM channels to their products, not to mention that 8 doesn't divide cleanly into 10/12 memory channels. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC they can just allocate RAM for the storage of ECC data. When ECC is enabled the available RAM will be reduced by 1/8th (to account for the 9th ECC bit) and then ECC data will be distributed among the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
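The arithmetic of that trade-off is straightforward. As a back-of-the-envelope sketch (not NVIDIA's actual allocator, and the 1.5 GB figure is just an assumed Fermi-class capacity for illustration), reserving 1 byte in 8 for ECC data leaves 7/8ths of the RAM usable:

```python
def ecc_effective_capacity(total_bytes):
    """Model the in-band ECC scheme described above: 1/8th of RAM is
    set aside for ECC data (the equivalent of the 9th bit per byte),
    leaving the remaining 7/8ths usable by applications."""
    reserved = total_bytes // 8
    usable = total_bytes - reserved
    return usable, reserved

# A hypothetical 1.5GB card with ECC enabled:
usable, reserved = ecc_effective_capacity(1536 * 1024**2)
print(usable // 1024**2, "MB usable,", reserved // 1024**2, "MB for ECC")
# 1344 MB usable, 192 MB for ECC
```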

On the technical side, despite this difference in implementation NVIDIA tells us that they're still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore, NVIDIA tells us that the performance hit isn't a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize the hit. This is their "secret sauce," as they call it, and it's something they don't intend to discuss in detail at this time.
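To illustrate what SECDED behavior means in practice, here is a toy extended-Hamming code over a single byte. This is only a sketch of the general algorithm class NVIDIA names, not their implementation: real GDDR5 ECC would operate on much wider words (e.g. 64 data bits), but the correct-one/detect-two behavior has the same shape.

```python
DATA_POSITIONS = [3, 5, 6, 7, 9, 10, 11, 12]  # non-power-of-2 slots hold data

def encode(byte):
    """Pack 8 data bits into a 13-bit extended Hamming codeword.
    Bits 1..12 form a standard Hamming(12,8) layout with parity bits
    at positions 1, 2, 4, 8; bit 0 is the overall parity bit."""
    word = 0
    for i, pos in enumerate(DATA_POSITIONS):
        if (byte >> i) & 1:
            word |= 1 << pos
    for p in (1, 2, 4, 8):
        parity = 0
        for pos in range(1, 13):
            if pos & p and (word >> pos) & 1:
                parity ^= 1
        word |= parity << p
    word |= bin(word).count("1") & 1  # overall parity -> even total weight
    return word

def decode(word):
    """Return (byte, status). Corrects any single-bit error and
    detects (but cannot correct) any double-bit error."""
    syndrome = 0
    for pos in range(1, 13):
        if (word >> pos) & 1:
            syndrome ^= pos  # XOR of set-bit positions
    overall = bin(word).count("1") & 1  # 0 if total weight is even
    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:
        # Odd overall parity -> exactly one flipped bit; fix it.
        word ^= 1 << syndrome if syndrome else 1
        status = "corrected"
    else:
        # Non-zero syndrome but even parity -> two bits flipped.
        return None, "double-bit error detected"
    byte = 0
    for i, pos in enumerate(DATA_POSITIONS):
        byte |= ((word >> pos) & 1) << i
    return byte, status
```

For example, `decode(encode(0xA5) ^ (1 << 6))` recovers `0xA5` with status `"corrected"`, while flipping two bits yields `(None, "double-bit error detected")`.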

Shifting gears to the consumer side: back in January NVIDIA was showing off their Eyefinity-like solutions, 3DVision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it's late.

Neither 3DVision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn't been any guidance on when in April these drivers will be released, so at this point it's anyone's guess whether they'll arrive in time for the GTX 400 series retail launch.

197 Comments

  • Headfoot - Monday, March 29, 2010 - link

    Great review, great depth but not too long. Concise but still enough information.

    THANK YOU SO MUCH FOR INCLUDING MINIMUM FRAME RATES!!! IMO they contribute the most to a game feeling "smooth"
    Reply
  • niceboy60 - Friday, August 20, 2010 - link

    This review is not accurate. Badaboom is not compatible with GTX 400 series cards yet; however, they already posted the test results.
    I have a GTX 480 and it does not work with Badaboom; Badaboom's official site confirms that.
    Reply
  • slickr - Sunday, March 28, 2010 - link

    I thought that after the line-up of games thread, you would really start testing games from all genres, so we can actually see how each graphic cards performs in different scenarios.

    Now you have 80% first person shooters, 10% racing/action-adventure, and 10% RPG and RTS.
    Where are the RTS games, isometric RPG's, simulation games, etc?

    I would really like Battleforge thrown out and replaced by Starcraft 2, DOW 2: Chaos Rising, Napoleon Total War. All these RTS games play differently and will give different results, and thus better knowledge of how graphic cards perform.
    How about also testing The Sims 3, NFS:Shift, Dragon Age Origins.
    Reply
  • Ryan Smith - Monday, March 29, 2010 - link

    Actually DAO was in the original test suite I wanted to use. At the high end it's not GPU limited, not in the slightest. Just about everything was getting over 100fps, at which point it isn't telling us anything useful.

    The Sims 3 and Starcraft are much the same way.
    Reply
  • Hsuku - Sunday, March 28, 2010 - link

    On Page 9 of Crysis, your final sentence indicates that SLI scales better than CF at lower resolutions, which is incorrect from your own graphs. CF clearly scales better at lower resolutions when video RAM is not filled:

    @ 1680x1050
    480 SLI -- 60.2:40.7 --> 1.48
    5870 CF -- 53.8:30.5 --> 1.76 (higher is better)

    @ 1920x1200
    480 SLI -- 54.5:33.4 --> 1.63
    5870 CF -- 46.8:25.0 --> 1.87 (higher is better)

    This indicates that CF technology scales better than SLI, even if the brute performance of the nVidia solution comes out on top. This is diametrically opposed to your conclusion on page 9 ("Even at lower resolutions SLI seems to be scaling better than CF").

    (Scaling ability is a comparison of ratios, not a comparison of FPS)
    Reply
  • Ryan Smith - Monday, March 29, 2010 - link

    You're looking at the minimums, not the averages.
    Reply
  • Hsuku - Tuesday, March 30, 2010 - link

    My apologies, I was looking at the wrong graphs.

    However, even so, your assertion is still incorrect: at the lowest listed resolution, CF and SLI scaling are tied.
    Reply
  • Ryan Smith - Wednesday, March 31, 2010 - link

    Correct. The only thing I really have to say about that is that while we include 1680 for reference's sake, for any review of a high-end video card I'm looking nearly exclusively at 1920 and 2560.
    Reply
  • Hrel - Thursday, September 02, 2010 - link

    I get that, to test the card. But if you don't have a monitor that goes that high, it really doesn't matter. I'd really like to see 1080p (1920x1080) thrown in there, as that's the only resolution that matters to me and most everyone else in the US.
    Reply
  • Vinas - Sunday, March 28, 2010 - link

    It's pretty obvious that AnandTech was spanked by NVIDIA the last time they did a review. No mention of the 5970 being superior to the 480 is a little disturbing. I guess the days of "trusting AnandTech" are over. Come on guys, not even a mention of how easily the 5870 overclocks? The choice is still clear: dual 5870s with full cover blocks FTW!
    Reply
