No More Memory Bandwidth

Again, we have a 256-bit (4x 64-bit) memory interface to GDDR3 memory. The local graphics memory setup is not significantly different from that of the 6800 series of cards and only runs slightly faster, at a 1.2 GHz effective data rate. This will work out in NVIDIA's favor as long as newer games continue to put a heavier burden on pixel shader processing. NVIDIA sees texture bandwidth outweighing color and z bandwidth in the not-too-distant future. This doesn't mean the quest for ever-increasing bandwidth will stop; it just means that the reasons we need more bandwidth will change.
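As a sanity check on those numbers, peak theoretical bandwidth is just bus width times effective data rate. A quick back-of-the-envelope sketch, using the 256-bit bus and 1.2 GHz effective rate quoted above:

```python
# Peak theoretical memory bandwidth: bus width (in bytes) x effective data rate.
def peak_bandwidth_gb_s(bus_width_bits, effective_rate_ghz):
    return bus_width_bits / 8 * effective_rate_ghz

print(peak_bandwidth_gb_s(256, 1.2))  # 38.4 GB/s
```

Real-world throughput will of course come in under this theoretical ceiling.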

A good example of the changing needs of graphics cards is Half-Life 2. While the game runs very well even on older graphics cards like the 9800 Pro, its design is such that increased memory bandwidth is far less important than having more shader processing power. This is why we see the 6600GT cards significantly outperform the 9800 Pro. Even more interesting is that in our testing, we found that enabling 4xAA on a 9800 Pro barely affected HL2 performance at all, while increasing the resolution from 1024x768 to 1280x1024 had a substantial impact on frame rates. If the HL2 model is a good example of the future of 3D engines, NVIDIA's decision to increase pixel processing power while leaving memory bandwidth for the future makes a lot of sense.

On an interesting side note, the performance tests in this article are mostly based around 1600x1200 and higher resolutions. Memory usage at 2048x1536 with 32-bit color and z buffers runs a solid 144MB for double-buffered rendering with 4xAA. This makes a 256MB card a prerequisite for this setup, but depending on the textures, render targets, and other local memory usage, even 256MB may be a little short. PCI Express helps a little to alleviate any burden placed on system memory, but it is conceivable that some games could get choppier when swapping large textures, normal maps, and the like in and out.

We don't feel that ATI's 512MB X850 really brings anything necessary to the table, but with this generation we could start to see a real use for 512MB of local memory. MRTs, larger textures, normal maps, vertex textures, huge resolutions, and a lack of hardware compression for fp16 and fp32 textures all mean that we are on the verge of seeing games push memory usage way up. Processing these huge stores of data requires GPUs powerful enough to utilize them efficiently, and the G70 begins to offer that kind of power. For the majority of today's games, 256MB of RAM is fine, but moving into the future it's easy to see how more would help.

In addition to these issues, a 512MB card would be a wonderful fit for Dual-Link DVI. This would make the part a nice companion to Apple's largest Cinema Display (which is currently beyond the maximum resolution supported by the GeForce 7800 GTX). In case anyone is curious, a double buffered 4xAA 32bit color+z framebuffer at 2560x1600 is about 190MB.
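Both of those framebuffer figures fall out of simple arithmetic if you assume three full-resolution multisampled surfaces: two 32-bit color buffers (front and back, for double buffering) plus one 32-bit z buffer, each storing four samples per pixel at 4xAA. A quick sketch of that calculation (the exact layout a driver allocates may differ; this is simply the assumption that reproduces the numbers above):

```python
def framebuffer_mib(width, height, samples=4, bytes_per_sample=4, buffers=3):
    """Estimate framebuffer memory in MiB.

    Assumes `buffers` full-resolution multisampled surfaces: here, two
    32-bit color buffers (double buffering) plus one 32-bit z buffer,
    each holding `samples` samples per pixel at 4xAA.
    """
    return width * height * samples * bytes_per_sample * buffers / 2**20

print(framebuffer_mib(2048, 1536))  # 144.0 -- the figure quoted earlier
print(framebuffer_mib(2560, 1600))  # 187.5 -- "about 190MB"
```

Textures, render targets, and vertex data all come on top of this, which is why 256MB starts to look tight at these resolutions.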

In our briefings on G70, we were told that every part of the chip has been at least slightly updated from NV4x, but the general architecture and feature set remain the same. There have also been a couple of more significant updates, namely the increased performance of a single shader pipe and the addition of transparency antialiasing. Let's take a look at these now.

127 Comments

  • mrdeez - Thursday, June 23, 2005 - link

    also: maybe gaming in hi def... on a big screen
  • mrdeez - Thursday, June 23, 2005 - link

    #114
    Dude, just stfu... we are here to comment on what we want and say it freely... minus threats and name calling... as I said before, this card is not for gamers... maybe elite gamers that have a monitor that does these resolutions, but most gamers I know have gone to LCD, and I have yet to see any LCD [I'm sure there are some] do these resolutions, so this card really is a card for CRT elite gamers... lol, at those resolutions on a 21-inch monitor you would need binoculars as glasses to play the game... the tanks in BF2 would be ant-like small...
  • bob661 - Thursday, June 23, 2005 - link

    #114
    I am SO glad that Anand remains in business despite all the bitches that are in these comment sections.
  • Locut0s - Thursday, June 23, 2005 - link

    Those who are complaining that they should have reviewed at lower resolutions should think for a minute. First of all, you are talking about a 600 buck card; most people who have that kind of money to spend on a card also have a monitor capable of 1600x1200 or better. Also, benchmarking at any lower resolution on a card like this in today's games is almost pointless, as you are almost entirely CPU bound at those resolutions. Do you really want to see page after page of 1024x768 charts that differ by only 4-5 percent at the most?

    Also, give the editors a break when it comes to writing these articles. As others have said, this is not a subscription site, and given the number of visitors and the quality of the articles I'm amazed, and gratified, that the people of AnandTech keep putting out article after long article despite all the whining that goes on over spelling mistakes and graph errors that more often than not are corrected within a few hours.
  • SDA - Thursday, June 23, 2005 - link

    That's a really great comparison, #112, especially seeing as how we pay for AnandTech and any problems with it could leave us stranded in the middle of nowhere. And so witty, too!

    Jarred, ah, thanks.
  • Questar - Thursday, June 23, 2005 - link

    "Our Web Editor is on vacation and we are all doing our own HTML and editing for the next 10 days. In our usual process, the article goes from an Editor to the Web Editor who codes the article, checks the grammar, and checks for obvious content errors. Those steps are not in the loop right now."


    " do know Derek as a very conscientious Editor and I would ask that you please give him, and the rest of us, a little slack this next week and a half"


    Dear Mr. Fink,
    I am sorry to hear about the problems you have had with your vehicle breakdowns. You see, our quality inspector was on vacation that week, so we just shipped our vehicles straight off the assembly line. Please cut us a little slack, as we usually build much better vehicles.

    Sincerely,
    Buncha Crap,
    CEO
    Crappy Moters Inc.

  • frisbfreek - Thursday, June 23, 2005 - link

    My question is: how did they do hi-res 2048x1536 when the card is only single-link DVI? Shouldn't either an analog connection or dual-link be necessary?
  • yacoub - Thursday, June 23, 2005 - link

    #108 - what do you want, a $2000 CPU to go with your $1200 in GPUs? =P
  • CtK - Thursday, June 23, 2005 - link

    so dual display is still not available with dual 7800s?!?!?!?!?
  • smn198 - Thursday, June 23, 2005 - link

    Come on Intel & AMD. Keep up!
