Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective, for PC DIMMs an ECC DIMM uses 9 chips per channel (9 bits per byte) hooked up to a 72-bit bus instead of 8 chips on a 64-bit bus. However, NVIDIA has neither the ability nor the desire to add still more RAM channels to their products, not to mention that 8 doesn't divide cleanly into 10 or 12 memory channels. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC they can just allocate RAM for the storage of ECC data. When ECC is enabled the available RAM will be reduced by 1/8th (to account for the 9th ECC bit) and then ECC data will be distributed among the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
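
As a rough sketch of that capacity math (illustrative figures only: the GTX 400 series doesn't expose ECC, and these board sizes are simply examples), reserving one eighth of memory works out as follows:

```c
#include <stdio.h>

/* Illustrative only: NVIDIA hasn't detailed how their ECC carve-out is
 * laid out, so this just shows what reserving 1/8th of a board's memory
 * for ECC data would cost at a few example sizes. */
int main(void)
{
    const unsigned long long boards_mb[] = { 1536ULL, 1280ULL, 3072ULL };

    for (int i = 0; i < 3; i++) {
        unsigned long long total  = boards_mb[i];
        unsigned long long ecc    = total / 8;      /* 1/8th reserved for ECC data */
        unsigned long long usable = total - ecc;    /* what software would see     */
        printf("%4llu MB board -> %3llu MB reserved (12.5%%), %4llu MB usable\n",
               total, ecc, usable);
    }
    return 0;
}
```

On a hypothetical 1536MB board, that works out to 192MB set aside for ECC data and roughly 1344MB left for applications.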

On the technical side, despite this difference in implementation NVIDIA tells us that they're still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore, NVIDIA tells us that the performance hit isn't a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize the penalty. This is their "secret sauce," as they call it, and it's something they don't intend to discuss in detail at this time.
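
For readers curious what SECDED actually does, the toy encoder/decoder below protects a single byte with a Hamming code plus an overall parity bit. Fermi protects much wider words and NVIDIA hasn't disclosed their exact code, so treat this strictly as an illustration of the single-error-correct / double-error-detect principle rather than their implementation:

```c
#include <stdio.h>
#include <stdint.h>

/* Toy SECDED code: 8 data bits -> 13-bit codeword, with 4 Hamming parity
 * bits at positions 1, 2, 4 and 8 plus an overall parity bit at position 0.
 * Purely illustrative; this is NOT NVIDIA's implementation. */

static uint16_t secded_encode(uint8_t data)
{
    uint16_t code = 0;
    int d = 0;

    /* Data bits go in the non-power-of-two positions 3,5,6,7,9,10,11,12. */
    for (int pos = 1; pos <= 12; pos++) {
        if (pos == 1 || pos == 2 || pos == 4 || pos == 8)
            continue;                                /* reserved for parity */
        if ((data >> d++) & 1)
            code |= (uint16_t)1 << pos;
    }

    /* Parity bit p covers every position whose index has bit p set. */
    for (int p = 1; p <= 8; p <<= 1) {
        int parity = 0;
        for (int pos = 1; pos <= 12; pos++)
            if (pos & p)
                parity ^= (code >> pos) & 1;
        if (parity)
            code |= (uint16_t)1 << p;
    }

    /* Overall parity over bits 1..12 distinguishes one error from two. */
    int overall = 0;
    for (int pos = 1; pos <= 12; pos++)
        overall ^= (code >> pos) & 1;
    if (overall)
        code |= 1;

    return code;
}

/* Returns 0 = clean, 1 = single-bit error corrected, 2 = double-bit error
 * detected (uncorrectable). The recovered byte is written to *data. */
static int secded_decode(uint16_t code, uint8_t *data)
{
    int syndrome = 0;
    for (int p = 1; p <= 8; p <<= 1) {
        int parity = 0;
        for (int pos = 1; pos <= 12; pos++)
            if (pos & p)
                parity ^= (code >> pos) & 1;
        if (parity)
            syndrome |= p;                           /* syndrome = error position */
    }

    int overall = 0;
    for (int pos = 0; pos <= 12; pos++)
        overall ^= (code >> pos) & 1;

    int status = 0;
    if (syndrome && overall) {                       /* single error: flip it back */
        code ^= (uint16_t)1 << syndrome;
        status = 1;
    } else if (syndrome && !overall) {               /* two errors: detect only */
        status = 2;
    } else if (!syndrome && overall) {               /* the overall parity bit flipped */
        code ^= 1;
        status = 1;
    }

    uint8_t d = 0;
    int i = 0;
    for (int pos = 1; pos <= 12; pos++) {
        if (pos == 1 || pos == 2 || pos == 4 || pos == 8)
            continue;
        d |= (uint8_t)(((code >> pos) & 1) << i++);
    }
    *data = d;
    return status;
}

int main(void)
{
    uint8_t out;
    uint16_t cw = secded_encode(0xA7);
    cw ^= (uint16_t)1 << 6;                          /* flip one bit "in memory" */
    int status = secded_decode(cw, &out);
    printf("status=%d recovered=0x%02X\n", status, out);  /* expect 1 and 0xA7 */
    return 0;
}
```

A (72,64) code follows the same pattern with 8 check bits covering a 64-bit word, which is where the 12.5% storage overhead figure comes from.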

Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions, 3DVision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it's late.

Neither 3DVision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn't been any guidance on when in April those drivers will arrive, so at this point it's anyone's guess whether they'll be out in time for the GTX 400 series retail launch.

196 Comments

  • ReaM - Saturday, March 27, 2010 - link

    I don't agree with the final words.

    480 is crap. On top of being expensive, it adds a huge power consumption penalty for only slightly better performance.

    However (!), I see potential for future chips and I can't wait for a Fermi Quadro to hit the market :)
  • Patrick Wolf - Saturday, March 27, 2010 - link

    6 months and we get a couple of harvested, power-sucking heaters? Performance king, barely, but at what cost? Cards aren't even available yet. This is a fail.

    This puts ATI in a very good place to release a refresh or revisions and snatch away the performance crown.
  • dingetje - Saturday, March 27, 2010 - link

    exactly my thoughts

    and imo the reviewers are going way too easy on nvidia over this fail product (except maybe hardocp)
  • cjb110 - Saturday, March 27, 2010 - link

    You mention that both of these are cut-down GF100's, but you've not done any extrapolation of what the performance of a full GF100 card would be?

    We do expect a full GF100 gaming-oriented card, probably before the end of the year, don't we?
    Is that going to be 1-9% quicker or 10%+?
  • Ryan Smith - Saturday, March 27, 2010 - link

    It's hard to say since we can't control every variable independent of each other. A full GF100 will have more shading, texturing, and geo power than the GTX 480, but it won't have any more ROP/L2/Memory.

    This is going to heavily depend on what the biggest bottleneck is, possibly on a per-game basis.
  • SlyNine - Saturday, March 27, 2010 - link

    Yea, and I had to return two 8800GTs that burnt up. I will not buy a really hot running card again.
  • poohbear - Saturday, March 27, 2010 - link

    Oh how the mighty have fallen. :( I remember the days of the 8800GT, when nvidia did a hard launch and released a cheap & excellent performing card for the masses. With the Fermi release you would never know it's the same company. Such a disappointment.
  • descendency - Saturday, March 27, 2010 - link

    I think the MSRP is lower than $300 for the 5850 ($259) and lower than $400 for the 5870 ($379). Just thought that was worth sharing.

    I have to believe that demand will now shift back evenly and price drops for the AMD cards can ensue (if nothing else, the cards should return to their MSRPs because competition is finally out). I would imagine the price gap between the GTX 480 and the AMD 5870 could be as much as $150 when all is said and done. Maybe $200 initially, as this kind of release is almost always followed by a paper launch (major delays and problems before launch = supply issues).
  • AnnonymousCoward - Saturday, March 27, 2010 - link

    ...for two reasons: power and die size.

    So the 5870 and 470 appear to be priced similarly, while the 5870 beats it in virtually every game and uses 47W less at load! That is a TON of additional on-die power (like 30-40A?).

    We saw this coming last year when Fermi was announced. Now AMD is better positioned than ever.
  • IVIauricius - Saturday, March 27, 2010 - link

    I see why XFX started making ATI cards a few years ago with the 4000 series. Once again nVidia has made a giant chip that requires a high price tag to offset the cost of manufacturing and materials. The same thing happened with the nVidia GTX200 cards and the ATI 4000 cards: XFX realized that they weren't making as much money as they'd like with GTX200 cards and started producing more profitable ATI 4000 cards.

    I bought a 5870 a couple months ago for $379 at Newegg with a promotion code. I plan on selling it not to upgrade, but to downgrade. A $400 card doesn't appeal to me anymore when, like many posters have mentioned, most games don't take advantage of the amazing performance these cards offer. I only play games like MW2, Borderlands, Dirt 2, and Bioshock 2 at 1920x1080, so a 4870 should suffice for my needs for another year. Maybe then I'll buy a 5850 for ~$180.

    First post, hope I didn't sound too much like a newbie.

    -Mauro
