No More Memory Bandwidth

Again, we have a 256-bit (4x 64-bit) interface to GDDR3 memory. The local graphics memory setup is not significantly different from that of the 6800 series and only runs slightly faster, at a 1.2 GHz effective data rate. This will work out in NVIDIA's favor as long as newer games continue to put a heavier burden on pixel shader processing. NVIDIA sees texture bandwidth as outweighing color and z bandwidth in the not-too-distant future. This doesn't mean the quest for ever-increasing bandwidth will stop; it just means that the reasons we need more bandwidth will change.
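
As a quick sanity check, the peak theoretical bandwidth of this interface falls out of simple arithmetic. A minimal sketch in Python; nothing here comes from NVIDIA beyond the bus width and the quoted effective data rate:

    # Peak theoretical bandwidth of a 256-bit GDDR3 interface
    # running at a 1.2 GHz effective (DDR) data rate.
    bus_width_bits = 256      # four independent 64-bit channels
    effective_rate = 1.2e9    # transfers per second, per pin

    bytes_per_transfer = bus_width_bits // 8               # 32 bytes
    bandwidth_bytes_per_sec = bytes_per_transfer * effective_rate

    print(f"{bandwidth_bytes_per_sec / 1e9:.1f} GB/s")     # 38.4 GB/s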

A good example of the changing needs of graphics cards is Half-Life 2. While the game runs very well even on older graphics cards like the 9800 Pro, its engine leans far more heavily on shader processing power than on memory bandwidth. This is why we see the 6600GT cards significantly outperform the 9800 Pro. Even more interesting, in our testing we found that enabling 4xAA on a 9800 Pro barely affected HL2's performance, while increasing the resolution from 1024x768 to 1280x1024 had a substantial impact on frame rates. If the HL2 model is a good example of the future of 3D engines, NVIDIA's decision to increase pixel processing power while leaving memory bandwidth for the future makes a lot of sense.

On an interesting side note, the performance tests in this article are mostly run at 1600x1200 and higher resolutions. Memory usage at 2048x1536 with 32-bit color and a 32-bit z-buffer runs a solid 144MB for double-buffered rendering with 4xAA. This makes a 256MB card a prerequisite for this setup, and depending on the textures, render targets, and other local memory usage, even 256MB may be a little short. PCI Express helps a little to alleviate the burden placed on system memory, but it is conceivable that some games could get choppy when swapping large textures, normal maps, and the like in and out.

We don't feel that ATI's 512MB X850 really brings anything necessary to the table, but with this generation, we could start to see a real use for 512MB of local memory. MRTs, larger textures, normal maps, vertex textures, huge resolutions, and a lack of hardware compression for fp16 and fp32 textures all mean that we are on the verge of seeing games push memory usage way up. Processing these huge stores of data requires GPUs powerful enough to utilize them efficiently, and the G70 begins to offer that kind of power. For the majority of today's games, 256MB of RAM is fine, but moving into the future, it's easy to see how more would help.

In addition to these issues, a 512MB card would be a wonderful fit for Dual-Link DVI. This would make the part a nice companion to Apple's largest Cinema Display (whose native resolution is currently beyond the maximum that the GeForce 7800 GTX supports). In case anyone is curious, a double-buffered 4xAA 32-bit color+z framebuffer at 2560x1600 is about 190MB.
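
For the curious, both framebuffer figures are easy to reproduce. Here is a minimal sketch; note that the per-pixel accounting (both color buffers and the z-buffer held at 4 samples per pixel, 4 bytes per sample) is our inference from the numbers rather than a detail NVIDIA spelled out:

    # Framebuffer footprint for double-buffered rendering with 4xAA,
    # 32-bit color, and a 32-bit z-buffer. Assumes both color buffers
    # and the z-buffer are stored at full multisample resolution:
    # (2 color buffers * 4 samples + 4 z samples) * 4 bytes = 48 B/pixel.
    def framebuffer_mb(width, height, aa_samples=4,
                       color_buffers=2, bytes_per_sample=4):
        color = color_buffers * aa_samples * bytes_per_sample
        depth = aa_samples * bytes_per_sample
        return width * height * (color + depth) / 2**20

    print(framebuffer_mb(2048, 1536))  # 144.0 MB
    print(framebuffer_mb(2560, 1600))  # 187.5 MB, i.e. "about 190MB"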

In our briefings on G70, we were told that every part of the chip has been at least slightly updated from NV4x, but the general architecture and feature set are the same. There have been a couple of more significant updates as well, namely the increased performance of a single shader pipe and the addition of transparency antialiasing. Let's take a look at these factors now.

Comments

  • CrystalBay - Wednesday, June 22, 2005 - link

    Does this card play Riddick smoothly @ shader 2++ ?????
  • fishbits - Wednesday, June 22, 2005 - link

    "In aboot 5 years i figure we'll be paying 1000 bucks for a video card. These prices are getting out of control, every generation is more expensive then the last. Dont make me switch to consoles damnit."

    Funny, I can't afford the very best TVs the minute they come out. Same for stereo components. But I don't cry about it and threaten "Don't make me switch to learning the ukulele and putting on my own puppet shows to entertain myself!" Every time a better component comes out, it means I get a price reduction and feature upgrade on the items that are affordable/justifiable for my budget.

    Seriously, where does the sense of entitlement come from? Do these people think they should be able to download top-of-the-line graphics cards through BitTorrent? Do they walk around Best Buy cursing out staff, manufacturers and customers for being so cruel as to buy and sell big-ass plasma TVs?

    On second thought, get your console and give up PC gaming. That way you can stop being miserable, and we can stop being miserable hearing about your misery.
  • tazdevl - Wednesday, June 22, 2005 - link

    Funny how the single-card deltas here are higher than at any other site.

    Underwhelmed by the amount of money and the lack of performance increase.

    Have to commend nVIDIA for ensuring retail availability at launch.
  • archcommus - Wednesday, June 22, 2005 - link

    Impressive, but I'm still happy with my X800 XL purchase for only $179. From the looks of it, with a 1280x1024 display, I won't need the kind of power this card delivers for a very long time. And less than $200 compared to $600, with still-excellent performance for now and the foreseeable future? Hmm, I'll take the former.
  • Lonyo - Wednesday, June 22, 2005 - link

    I would have liked some 1280x1024 benchmarks with 8xAA from the nVidia cards and 6xAA from ATi to see if it's worth getting something like a 7800GTX with a 17/19" LCD to run some super-high-quality settings in terms of AA/AF.
  • segagenesis - Wednesday, June 22, 2005 - link

    I'm not disappointed. For one thing, the price of current cards will likely drop now, and there will also be mid-range parts to choose from soon. I think the transparency AA is a good idea for, say... World of Warcraft. The game is loaded with alpha textures, and too often you can see the blockiness of trees/grass/whatever.

    #44 - Actually are you new to the market? :) I remember when early "accelerated" VGA cards were nearly $1000. Or more.

    Everybody lambasted NVIDIA last year for the lack of product (6800GT/Ultra) in the market, so NVIDIA actually having a retail presence this year instead of a paper launch should also be commended. Of course, now what is ATI gonna pull out of its hat?
  • KeDaHa - Wednesday, June 22, 2005 - link

    "The screenshot shows very clearly that SSAA provides quite a quality improvement over no AA"

    The difference is bloody minuscule. Perhaps if you used an image SLIGHTLY larger than 640x480 to highlight the difference?
  • L3p3rM355i4h - Wednesday, June 22, 2005 - link

    Wowzers. Time to get rid of teh 9800...
  • shabby - Wednesday, June 22, 2005 - link

    In aboot 5 years I figure we'll be paying 1000 bucks for a video card. These prices are getting out of control; every generation is more expensive than the last.
    Don't make me switch to consoles, damnit.
  • Xenoterranos - Wednesday, June 22, 2005 - link

    Hell, for the same price as an SLI setup, I can go out and get a 23-inch Cinema Display... And since these cards can't handle the 30-inch model's native resolution anyway, it's a win-win. And yeah, what's up with the quality control on these benchmarks? I mean really, I almost decided to wait for ATI's next-gen part when I saw this (GeForce man since the GeForce2 GTS!)
