Why NVIDIA Is Focused On Geometry

Up until now we haven’t talked a great deal about the performance of GF100, and to some extent we still can’t. We don’t know the final clock speeds of the shipping cards, so we don’t know exactly what the card will be like. But what we can talk about is why NVIDIA made the decisions they did: why they went for the parallel PolyMorph and Raster Engines.

The DX11 specification doesn’t leave NVIDIA with a ton of room to add new features. Without caps bits, NVIDIA can’t put new features on their hardware and easily expose them, nor would they want to at the risk of having those features (and hence die space) go unused. DX11 rigidly defines what features a compliant video card must offer, and leaves very little room to deviate.

So NVIDIA has taken a bit of a gamble. There’s no single wonder-feature in the hardware that immediately makes it stand out from AMD’s hardware – NVIDIA has post-rendering features such as 3D Vision and compute features such as PhysX, but when it comes to rendering they can only do what AMD does.

But the DX11 specification doesn’t say how quickly you have to do it.


Tessellation in action

To differentiate themselves from AMD, NVIDIA is taking the tessellator and driving it for all it’s worth. While AMD merely has a tessellator, NVIDIA is counting on the tessellator in their PolyMorph Engine to give them a noticeable graphical advantage over AMD.

To put things in perspective, between NV30 (GeForce FX 5800) and GT200 (GeForce GTX 280), the geometry performance of NVIDIA’s hardware increased only roughly 3x, while the shader performance of their cards increased by over 150x. GF100, by comparison, has 8x the geometry performance of GT200 alone, and NVIDIA tells us this is something they have measured in their labs.

So why does NVIDIA want so much geometry performance? Because tessellation allows them to take the same assets from the same games as AMD and generate something that will look better. With more geometry power, NVIDIA can use tessellation and displacement mapping to generate more complex characters, objects, and scenery than AMD can at the same level of performance. This is why NVIDIA has 16 PolyMorph Engines and 4 Raster Engines: they need a lot of hardware to generate and process that much geometry.
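
To make the idea concrete, here’s a minimal CPU-side sketch of what displacement-mapped tessellation boils down to: the fixed-function tessellator generates sample points across a low-polygon patch, and the domain shader stage then displaces each generated vertex along the surface normal by a height read from a displacement map. Everything below is illustrative rather than anyone’s actual shader code; the tessellateQuadPatch helper and the procedural height() function standing in for a displacement-map lookup are our own assumptions.

```cpp
// Conceptual CPU-side sketch of DX11 tessellation + displacement mapping for
// one patch: the tessellator generates (u,v) sample points, and the domain
// shader stage evaluates the surface and pushes each new vertex out along the
// patch normal by a height read from a displacement map.
// Names and the procedural height() stand-in are illustrative assumptions.
#include <cmath>
#include <cstdio>
#include <initializer_list>
#include <vector>

struct Vertex { float x, y, z; };

// Stand-in for sampling a displacement map at texture coordinates (u, v).
static float height(float u, float v) {
    return 0.05f * std::sin(6.2831853f * u) * std::cos(6.2831853f * v);
}

// "Tessellate" a unit quad patch lying in the XZ plane (normal = +Y) at the
// given factor, producing (factor + 1)^2 displaced vertices.
static std::vector<Vertex> tessellateQuadPatch(int factor) {
    std::vector<Vertex> verts;
    verts.reserve((factor + 1) * (factor + 1));
    for (int i = 0; i <= factor; ++i) {
        for (int j = 0; j <= factor; ++j) {
            float u = static_cast<float>(i) / factor;  // domain coordinates,
            float v = static_cast<float>(j) / factor;  // like SV_DomainLocation
            // Base surface position, then displacement along the +Y normal.
            verts.push_back({u, height(u, v), v});
        }
    }
    return verts;
}

int main() {
    // A higher tessellation factor means more generated triangles and finer
    // displacement detail from the very same low-polygon source patch.
    for (int factor : {1, 8, 64}) {
        std::vector<Vertex> v = tessellateQuadPatch(factor);
        std::printf("factor %2d -> %zu vertices, %d triangles\n",
                    factor, v.size(), 2 * factor * factor);
    }
    return 0;
}
```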

NVIDIA believes their strategy will work, and if geometry performance is as good as they say it is, then we can see why they feel this way. Game art is usually created at far higher levels of detail than what eventually ends up being shipped, and with tessellation there’s no reason why a tessellated and displacement mapped representation of that high-quality art can’t come with the game. Developers can use tessellation to scale detail down to whatever the hardware can do, and in NVIDIA’s world they won’t have to scale it down very far for GF100.
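
As a rough sketch of what that scaling could look like in practice, the snippet below derives a per-edge tessellation factor from view distance and clamps it to both DX11’s hard ceiling of 64 and a quality cap a developer might tune per GPU tier. The tessFactorForEdge helper, its constants, and the gpuDetailCap parameter are hypothetical illustrations, not values from NVIDIA or the DX11 spec (beyond the factor-of-64 limit).

```cpp
// A minimal sketch of scaling tessellation detail to the hardware: pick a
// per-edge tessellation factor from view distance, then clamp it to the DX11
// maximum (64) and to a quality cap tuned per GPU tier. The constants and the
// gpuDetailCap parameter are illustrative assumptions, not published values.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

float tessFactorForEdge(float distanceToCamera, float gpuDetailCap) {
    const float kMaxDx11Factor = 64.0f;      // hard limit in the DX11 spec
    const float kFullDetailDistance = 5.0f;  // distance at which full detail is used
    // Factor falls off inversely with distance beyond the full-detail range,
    // so doubling the distance halves the requested detail.
    float factor = kMaxDx11Factor * kFullDetailDistance /
                   std::max(distanceToCamera, kFullDetailDistance);
    return std::clamp(factor, 1.0f, std::min(kMaxDx11Factor, gpuDetailCap));
}

int main() {
    // Same art, different hardware budgets: a faster tessellator simply gets a
    // higher cap and keeps more of the authored detail.
    for (float cap : {16.0f, 64.0f}) {
        std::printf("cap %.0f: ", cap);
        for (float d : {2.0f, 10.0f, 40.0f})
            std::printf("d=%g -> %.1f  ", d, tessFactorForEdge(d, cap));
        std::printf("\n");
    }
    return 0;
}
```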

At this point tessellation is a message that’s much more for developers than it is for consumers. Since DX11 is required to take advantage of tessellation, only a few suitable games exist today, with plenty more on the way. NVIDIA needs to convince developers to ship their art with displacement maps detailed enough to match GF100’s capabilities, and while that isn’t too hard, it’s also not a walk in the park. To that end they’re also extolling the other virtues of tessellation, such as the ability to do higher-quality animations by animating only the control points of a model and letting tessellation take care of the rest. A lot of the success of the GF100 architecture is going to ride on how developers respond to this, so it’s going to be something worth keeping an eye on.


NVIDIA's water tessellation demo


NVIDIA's hair tessellation demo

Comments

  • DanNeely - Monday, January 18, 2010 - link

    For the benefit of myself and everyone else who doesn't follow gaming politics closely, what is "the infamous Batman: Arkham Asylum anti-aliasing situation"?
  • sc3252 - Monday, January 18, 2010 - link

    Nvidia helped get AA working in Batman, which also works on ATI cards. If the game detects anything besides an Nvidia card, it disables AA. The reason some people are angry is that when ATI helps out with games, it doesn't limit who can use the feature; at least that's what they (AMD) claim.
  • san1s - Monday, January 18, 2010 - link

    The problem was that Nvidia did not do QA testing on ATI hardware.
  • Meghan54 - Monday, January 18, 2010 - link

    And Nvidia shouldn't have, since Nvidia didn't develop the game.

    On the other hand, you can be quite certain that the devs did run the game on ATI hardware but locked out the "preferred" AA mode only because of the money Nvidia invested in the game.

    And that can be plainly seen when the game is "hacked" into seeing an Nvidia card installed even though an ATI card is being used: AA works flawlessly, the ATI cards end up faster than current Nvidia cards, and the game is exposed for what it is. Purposely crippling a game to favor one brand of video card over another.

    But the nvidiots seem not to mind this at all. Yet this is akin to Intel writing their compiler to make AMD CPUs run slower or worse on programs compiled with the Intel compiler.

    Read about the debacle Intel's now suffering from; the outrage is fairly universal. You'd think Nvidia would suffer the same nearly universal outrage for intentionally crippling a game's function to favor one brand of card over another, yet nvidiots make apologies and say "ATI cards weren't tested." I'd like to see that as a fact instead of conjecture.

    So, one company cripples the function of another company's product and the world's up in arms, screaming "Monopolistic tactics!!!" and "Fine them to hell and back!"; another company does essentially the same thing and it gets a pass.

    Talk about bias.
  • Stas - Tuesday, January 19, 2010 - link

    If nV continues like this, it will turn around on them. It took MANY years for the market watchdogs to finally say, "Intel, quit your sh*t!" and actually do something about it. Don't expect immediate retaliation in a multibillion-dollar worldwide industry.
  • san1s - Monday, January 18, 2010 - link

    "yet nvidiots make apologies and say "Ati cards weren't tested." I'd like to see that as a fact instead of conjecture. "
    here you go
    http://www.legitreviews.com/news/6570/
    "On the other hand, you can be quite certain that the devs. did run the game on Ati hardware but only lock out the "preferred" AA design because of nvidia's money nvidia invested in the game. "
    Proof? That looks like conjecture to me. Nvidia says otherwise.
    AMD doesn't deny it either.
    http://www.bit-tech.net/bits/interviews/2010/01/06...
    they just don't like it
    And please refrain from calling people names such as "nvidiot"; it doesn't help your image as unbiased.
  • MadMan007 - Monday, January 18, 2010 - link

    Oh for gosh sakes, this is the 'launch' and we can't even have a paper launch where at least reviewers get hardware? This is just more details for the same crap that was 'announced' when the 5800s came out. Poor show NV, poor show.
  • bigboxes - Monday, January 18, 2010 - link

    This is as close to a paper launch as I've seen in a while, except that there is not even an unattainable card. Gawd, they are gonna drag this out a lonnnnngg time. Better start saving up for that 1500W psu!
  • Adul - Monday, January 18, 2010 - link

    I suppose this is a vaporlaunch then.