Odds & Ends: ECC & NVIDIA Surround Missing

One of the things we have been discussing with NVIDIA for this launch is ECC. As we just went over in our GF100 Recap, Fermi offers ECC support for its register file, L1 cache, L2 cache, and RAM. The latter is the most interesting, as under normal circumstances implementing ECC requires a wider bus and additional memory chips. The GTX 400 series will not be using ECC, but we went ahead and asked NVIDIA how ECC will work on Fermi products anyhow.

To put things in perspective, a PC ECC DIMM uses 9 chips per channel (9 bits per byte) hooked up to a 72-bit bus instead of 8 chips on a 64-bit bus. However NVIDIA doesn’t have the ability or the desire to add even more RAM channels to their products, not to mention that 8 doesn’t divide cleanly into 10/12 memory channels. So how do they implement ECC?

The short answer is that when NVIDIA wants to enable ECC they can just allocate RAM for the storage of ECC data. When ECC is enabled the available RAM will be reduced by 1/8th (to account for the 9th ECC bit) and then ECC data will be distributed among the RAM using that reserved space. This allows NVIDIA to implement ECC without the need for additional memory channels, at the cost of some RAM and some performance.
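The carve-out arithmetic is straightforward; here is a quick sketch of it in Python (the 3GB board size is purely illustrative, not a GTX 400 series spec):

```python
def ecc_usable_bytes(total_bytes: int) -> int:
    """Usable framebuffer once 1/8th is reserved for ECC data."""
    return total_bytes - total_bytes // 8

# Hypothetical 3GB board: with ECC on, 1/8th of the framebuffer
# (384MB) is set aside for ECC data, leaving 2.625GB for applications.
total = 3 * 1024**3
usable = ecc_usable_bytes(total)
print(usable / 1024**3)  # 2.625
```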

On the technical side, despite this difference in implementation NVIDIA tells us that they’re still using standard Single Error Correction / Double Error Detection (SECDED) algorithms, so data reliability is the same as in a traditional implementation. Furthermore NVIDIA tells us that the performance hit isn’t a straight-up 12.5% reduction in effective memory bandwidth; rather, they have ways to minimize the performance hit. This is their “secret sauce”, as they call it, and it’s something they don’t intend to discuss in detail at this time.
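For reference, SECDED is conventionally built on an extended Hamming code: a syndrome pinpoints (and corrects) any single flipped bit, while an extra overall-parity bit flags double errors as uncorrectable. The toy sketch below works on 8-bit codewords carrying 4 data bits; real memory controllers use wider codes such as Hamming(72,64), and this is generic SECDED, not NVIDIA’s actual hardware implementation:

```python
def secded_encode(nibble: int) -> int:
    """Encode 4 data bits as an 8-bit extended Hamming (SECDED) codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]      # data bits d0..d3
    p1 = d[0] ^ d[1] ^ d[3]                        # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                        # parity over positions 2,3,6,7
    p4 = d[1] ^ d[2] ^ d[3]                        # parity over positions 4,5,6,7
    bits = [p1, p2, d[0], p4, d[1], d[2], d[3]]    # codeword positions 1..7
    bits.append(sum(bits) & 1)                     # position 8: overall parity (enables DED)
    word = 0
    for i, b in enumerate(bits):
        word |= b << i
    return word

def secded_decode(word: int):
    """Return (data, status); status is 'ok', 'corrected' or 'double'."""
    bits = [(word >> i) & 1 for i in range(8)]
    syndrome = 0
    for pos in range(1, 8):                        # XOR the positions of all set bits
        if bits[pos - 1]:
            syndrome ^= pos
    overall = sum(bits) & 1                        # parity across the whole codeword
    if syndrome == 0 and overall == 0:
        status = 'ok'
    elif overall == 1:                             # odd parity: exactly one bit flipped
        bits[(syndrome or 8) - 1] ^= 1             # syndrome 0 means the parity bit itself
        status = 'corrected'
    else:                                          # even parity, nonzero syndrome: two flips
        return None, 'double'
    data = bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3)
    return data, status
```

Any single-bit flip anywhere in the codeword (including in a check bit) is corrected, while any two flips are reported as uncorrectable rather than silently mis-corrected, which is exactly the SECDED guarantee NVIDIA describes.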

Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions, 3DVision Surround and NVIDIA Surround, on the CES show floor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it’s late.

Neither 3DVision Surround nor NVIDIA Surround is available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn’t been any guidance on when in April these drivers will be released, so at this point it’s anyone’s guess whether they’ll arrive in time for the GTX 400 series retail launch.


196 Comments


  • arjunp2085 - Friday, March 26, 2010 - link

    For dealing with suck fake geometry, Fermi has several new tricks.

    is that supposed to be such??

    850 Watts for SLI.. man Air Conditioning for my room does not consume that much electricity

    Might have to go for industrial connections to use such high Electricity consumptions lol

    Green Team NOT GREEN....
  • Leyawiin - Friday, March 26, 2010 - link

    Guess I'll keep my GTX 260 for a year or so more and hope for better days.
  • hangfirew8 - Friday, March 26, 2010 - link

    Launch FAIL.

    All this waiting and a paper launch. They couldn't even manage the 1/2 dozen cards per vendor at Newegg of some previous soft launches.

    All this waiting and a small incremental increase over existing card performance. High power draw and temps. High prices, at least they had the sense not to price it like the 8800 Ultra, which was a game changer. It had a big leap in performance plus brought us a new DX level, DX10.

    I've been holding off buying until this launch, I really wanted nVidia to pull something off here. Oh, well.

  • softdrinkviking - Friday, March 26, 2010 - link

    so by the time a "full" gf100 is available, how close will we be to the next gen AMD card?
    and how low will the prices on the 58XX series be?

    this article never made an explicit buying recommendation, but how many people out there are still waiting to buy a gf100?
    6 months is a long time.
    after xmas and the post holiday season, anybody on the fence about it (i.e. not loyal nvidia fans) probably just went for amd card.
    so the question (for a majority of potential buyers?) isn't "which card do i buy?", it's "do i need/want to upgrade from my 58xx amd card to a gf100?"


    also, i'm curious to find out if fermi can be scaled down into a low profile card and offer superior performance in a form factor that relies so heavily on low temps and low power consumption.
    the htpc market is a big money maker, and a bad showing for nvidia there could really hurt them.
    maybe they won't even try?

  • shin0bi272 - Friday, March 26, 2010 - link

    great review as usual here at Anandtech. I would have thought in your conclusions you would have mentioned that, in light of the rather lack luster 5% performance crown that they now hold, that it wasnt the best idea for them to disable 6% of their cores on the thing after all.

    Why make a 512 core gpu then disable 32 of them and end up with poorer performance when youre already 6 months behind the competition, sucking up more juice, have higher temps and fan noise, and a higher price tag? That's like making the bugatti veyron and then disabling 2 of its 16 cylinders!

    That will probably be what nvidia does when amd releases their super cypress to beat the 480. They'll release the 485 with all 512 cores and better i/o for the ram.
  • blyndy - Saturday, March 27, 2010 - link

    "Fermi is arranged as 16 clusters of 32 shaders, and given that it is turning off 64 shaders, it looks like the minimum granularity it can fuse off is a single cluster of 32. This means it is having problems getting less than two unrecoverable errors per die, not a good sign."

    from: http://www.semiaccurate.com/2009/12/21/nvidia-cast...
  • shin0bi272 - Saturday, March 27, 2010 - link

    dont quote semi accurate to me. If you wanna call 1 in 100 claims being correct as Semi accurate then fine you can... me I call it a smear. Especially since the guy who wrote that article is a known liar and hack. If you google for gtx480 and click on the news results and click on semi accurate you will see its listed as satire.
  • Jamahl - Friday, March 26, 2010 - link

    the same Ryan Smith who panned the 5830 for being a "paper launch" even though it was available one day later?

    What's wrong this time Ryan? Maybe there are so many bad things to say about Fermi, being "paper launched" was well down the pecking order of complaints?
  • AnandThenMan - Friday, March 26, 2010 - link

    I was thinking the same thing. The 5830 got slammed for being a paper launch even though it wasn't, but Fermi gets a pass? Why? This isn't even a launch at all despite what Nvidia says. Actual cards will be available in what, 17 days? That's assuming the date doesn't change again.
  • jeffrey - Saturday, March 27, 2010 - link

    I'll third that notion.

    Even though Ryan Smith mentioned that Fermi was paper launched today, the tone and way that the article read was much harsher on AMD/ATI. That is ridiculous considering that Ryan had to eat his own words with an "Update" on the 5830's availability.

    To be tougher on AMD/ATI, when they did in fact launch the 5830 that day and have hard-launched, to the best of their ability, the entire 5XX0 stack gives an impression of bias.

    A paper launch with availability at least two and a half weeks out for a product six months late is absurd!
