Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective: a PC ECC DIMM uses 9 chips per channel (9 bits per byte) on a 72-bit bus instead of 8 chips on a 64-bit bus. However, NVIDIA has neither the ability nor the desire to add more RAM channels to their products, not to mention that a 9-for-8 chip arrangement doesn't map cleanly onto their 10- and 12-chip memory configurations. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC they can just allocate RAM for the storage of ECC data. When ECC is enabled the available RAM will be reduced by 1/8th (to account for the 9th ECC bit) and then ECC data will be distributed among the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
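As a back-of-the-envelope illustration (this is our own arithmetic, not NVIDIA's actual bookkeeping, and the 3GB board size is a hypothetical Tesla-class example), the in-band reservation works out like this:

```python
def ecc_capacity(total_mib: int) -> tuple[float, float]:
    """Split a board's RAM into usable space and in-band ECC storage.

    One ECC bit per data byte means 1/8th of capacity is reserved,
    mirroring the 9-chips-per-8 ratio of a conventional ECC DIMM.
    """
    reserved = total_mib / 8
    return total_mib - reserved, reserved

usable, reserved = ecc_capacity(3072)  # hypothetical 3GB Tesla-class board
print(f"usable: {usable:.0f} MiB, ECC data: {reserved:.0f} MiB")
# 3072 MiB -> 2688 MiB usable, 384 MiB reserved for ECC
```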

On the technical side, despite this difference in implementation NVIDIA tells us that they're still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore, NVIDIA tells us that the performance hit isn't a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize the penalty. This is their "secret sauce," as they call it, and it's something they don't intend to discuss in detail at this time.
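NVIDIA hasn't published its code layout, but the SECDED scheme itself is the textbook Hamming-code-plus-overall-parity construction. Here's a minimal sketch for an 8-bit data word (real hardware protects wider words, and these function names are our own, purely for illustration):

```python
def _num_parity_bits(n_data: int) -> int:
    # Smallest r such that 2^r >= n_data + r + 1 (Hamming bound)
    r = 0
    while (1 << r) < n_data + r + 1:
        r += 1
    return r

def secded_encode(data: int, n_data: int = 8) -> int:
    """Encode n_data bits as Hamming SEC-DED: Hamming parity bits at
    power-of-two positions (1-indexed), overall parity at bit 0."""
    r = _num_parity_bits(n_data)
    total = n_data + r
    code = [0] * (total + 1)
    d = 0
    for pos in range(1, total + 1):
        if pos & (pos - 1):          # not a power of two -> data bit
            code[pos] = (data >> d) & 1
            d += 1
    for i in range(r):               # set each Hamming parity bit
        p = 1 << i
        code[p] = 0
        for pos in range(1, total + 1):
            if pos & p and pos != p:
                code[p] ^= code[pos]
    overall = 0                      # overall parity over the whole codeword
    for pos in range(1, total + 1):
        overall ^= code[pos]
    word = overall
    for pos in range(1, total + 1):
        word |= code[pos] << pos
    return word

def secded_check(word: int, n_data: int = 8):
    """Return ('ok'|'corrected'|'double', data) for a received word."""
    r = _num_parity_bits(n_data)
    total = n_data + r
    code = [(word >> pos) & 1 for pos in range(total + 1)]
    syndrome = 0                     # XOR of positions covered by each parity
    for i in range(r):
        p = 1 << i
        parity = 0
        for pos in range(1, total + 1):
            if pos & p:
                parity ^= code[pos]
        if parity:
            syndrome |= p
    overall = 0
    for pos in range(total + 1):
        overall ^= code[pos]
    if syndrome and not overall:     # two flipped bits: detect, can't fix
        return 'double', None
    if syndrome:                     # single-bit error: syndrome = position
        code[syndrome] ^= 1
        status = 'corrected'
    else:
        status = 'corrected' if overall else 'ok'
    data = 0
    d = 0
    for pos in range(1, total + 1):
        if pos & (pos - 1):
            data |= code[pos] << d
            d += 1
    return status, data
```

Any single flipped bit is corrected via the syndrome (which points at the bad position), while the extra overall-parity bit is what lets the decoder tell a two-bit error apart from a one-bit error and flag it instead of mis-correcting.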

Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions, 3D Vision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it's late.

Neither 3D Vision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn't been any guidance on when in April these drivers will be released, so at this point it's anyone's guess whether they'll arrive in time for the GTX 400 series retail launch.


196 Comments


  • kc77 - Saturday, March 27, 2010 - link

    Yeah, I mentioned it too. ATI got reamed for almost an entire page for something that didn't really happen, while this review mentions it in passing, almost like it's a feature.
  • gigahertz20 - Friday, March 26, 2010 - link

    "The price gap between it and the Radeon 5870 is well above the current performance gap"

    Bingo. Nvidia may have the fastest single GPU out now, but not by much, and there are tons of trade-offs for just a little bit more FPS over the Radeon 5870. High heat/noise/power for what? Over 90% of gamers play at 1920x1200 resolution or less, so even just a Radeon 5850 or CrossFired 5770s are the best bang for the buck.

    If all you're going to play at is 1920x1200 or less, I see no reason why educated people would want to buy a GTX 470/480 after reading all the reviews for Fermi today. Way too expensive and way too hot for not much of a performance gain. Maybe it's time to sell my Nvidia stock before it goes down any further over the next year or so.
  • ImSpartacus - Friday, March 26, 2010 - link

    "with a 5th one saying within the card"

    Page 2, Paragraph 2.

    Aside from minor typos, this is a great article.

  • cordis - Friday, March 26, 2010 - link

    Hey, thanks for the folding data, very much appreciated. Although, if there's any way you can translate it into something that folders are a little more used to, like ppd (points per day), that would be even better. I'm not sure what the benchmarking program you used is like, but if it folds things and produces log files, it should be possible to get ppd. From the ratios, it looks like above 30kppd, but it would be great to get hard numbers on it. Any chance of that getting added?
  • Ryan Smith - Friday, March 26, 2010 - link

    I can post the log files if you want, but there's no PPD data in them. It only tells me nodes.
  • cordis - Tuesday, March 30, 2010 - link

    Eh, that's ok, if you want to that's fine, but don't worry about it too much, it sounds like it was an artificial nvidia thing. We'll have to wait for people to really start folding on them to see how they work out.
  • ciparis - Friday, March 26, 2010 - link

    I had a weird malware warning pop up when I hit page 2:

    "The website at anandtech.com contains elements from the site googleanalyticz.com"

    I'm using Safari (I also saw someone with Chrome report it). I wonder what that was all about...
  • Despoiler - Friday, March 26, 2010 - link

    I'd like to see some overclocking benchmarks given the small die vs big die design decisions each company made.

    All in all, ATI has this round in the business sense. The performance crown is not where the money is. ATI out-executed Nvidia in a huge way. I cannot wait to see the financial results for each company.
  • LuxZg - Saturday, March 27, 2010 - link

    Agree.. no overclocking at all.. feels like a big part of the review is missing. With the GTX 480 having such high consumption/temperatures, I doubt it would go much further, at least on air. On the other hand, there are already many OCed HD58xx cards out there, and even those can easily be overclocked further. With that much wattage headroom, I think AMD could easily catch up to the GTX 480 and still be a bit cooler and less power hungry. And less noisy as a consequence as well, of course.
  • randfee - Friday, March 26, 2010 - link

    very thorough test as expected from you guys, thanks... BUT:

    Why on earth do you keep using an arguably outdated Core i7 920 for benchmarking the newest GPUs? Even at 3.33GHz it's no match for an overclocked 860, a common high-end gaming-rig CPU these days. I got mine to 4.2GHz air cooled?!

    sorry... don't get it. In any GPU review I'd try to eliminate any possible bottleneck so the GPU is what gets limited. Why use an old CPU like this?!

    anyone?
