No More Memory Bandwidth

Again, we have a 256-bit (4x 64-bit) memory interface to GDDR3 memory. The local graphics memory setup is not significantly different from that of the 6800 series of cards, running only slightly faster at a 1.2 GHz effective data rate. This will work out in NVIDIA's favor as long as newer games continue to put a heavier burden on pixel shader processing. NVIDIA sees texture bandwidth outweighing color and z bandwidth in the not-too-distant future. This doesn't mean the quest for ever-increasing bandwidth will stop; it just means that the reasons we need more bandwidth will change.

A good example of the changing needs of graphics cards is Half-Life 2. While the game runs very well even on older graphics cards like the 9800 Pro, the engine is designed such that increased memory bandwidth is far less important than having more shader processing power. This is why we see the 6600 GT cards significantly outperform the 9800 Pro. Even more interesting is that in our testing, we found that enabling 4xAA on a 9800 Pro barely affected HL2's performance, while increasing the resolution from 1024x768 to 1280x1024 had a substantial impact on frame rates. If the HL2 model is a good indicator of the future of 3D engines, NVIDIA's decision to increase pixel processing power while leaving memory bandwidth for the future makes a lot of sense.

On an interesting side note, the performance tests in this article are mostly based on 1600x1200 and higher resolutions. Memory usage at 2048x1536 with 32-bit color and z-buffer runs a solid 144MB for double-buffered rendering with 4xAA. This makes a 256MB card a prerequisite for this setup, but depending on the textures, render targets, and other local memory usage, 256MB may be a little short. PCI Express helps a little to alleviate any burden placed on system memory, but it is conceivable that some games could get choppier when swapping in and out large textures, normal maps, and the like.

We don't feel that ATI's 512MB X850 really brings anything necessary to the table, but with this generation, we could start to see a real use for 512MB of local memory. MRTs, larger textures, normal maps, vertex textures, huge resolutions, and a lack of hardware compression for fp16 and fp32 textures all mean that we are on the verge of seeing games push memory usage way up. Processing these huge stores of data requires GPUs powerful enough to utilize them efficiently, and the G70 begins to offer that kind of power. For the majority of today's games, 256MB of RAM is fine, but moving into the future, it's easy to see how more would help.

In addition to these issues, a 512MB card would be a wonderful fit for Dual-Link DVI. This would make the part a nice companion to Apple's largest Cinema Display (whose native resolution is currently beyond the maximum supported by the GeForce 7800 GTX). In case anyone is curious, a double-buffered 4xAA 32-bit color+z framebuffer at 2560x1600 is about 190MB.
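The framebuffer figures quoted above are straightforward to sanity-check. The sketch below assumes the simple back-of-the-envelope model implied by the article: 4 bytes per pixel for both color and depth, every buffer stored fully per-sample with no compression, and two color buffers plus one z-buffer for double-buffered rendering.

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, aa_samples=4,
                    color_buffers=2, z_buffers=1):
    """Estimate multisampled framebuffer footprint in MiB.

    Assumes color and depth are both stored per-sample with no
    compression (a simplification; real hardware compresses both).
    """
    per_buffer = width * height * bytes_per_pixel * aa_samples
    total = per_buffer * (color_buffers + z_buffers)
    return total / (1024 * 1024)

# Double-buffered 4xAA, 32-bit color + z at 2048x1536:
print(framebuffer_mib(2048, 1536))  # 144.0

# Same setup at the 30" Cinema Display's 2560x1600:
print(framebuffer_mib(2560, 1600))  # 187.5, i.e. "about 190MB"
```

Both of the article's numbers fall out directly: exactly 144MB at 2048x1536, and roughly 190MB at 2560x1600.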

In our briefings on G70, we were told that every part of the chip has been at least slightly updated from NV4x, but the general architecture and feature set are the same. There have been a couple of more significant updates as well, namely the increased performance of a single shader pipe and the addition of transparency antialiasing. Let's take a look at these factors now.


