No More Memory Bandwidth

Again, we have a 256-bit (4x 64-bit) memory interface to GDDR3 memory. The local graphics memory setup is not significantly different from that of the 6800 series and only runs slightly faster, at a 1.2 GHz effective data rate. This will work out in NVIDIA's favor as long as newer games continue to put a heavier burden on pixel shader processing. NVIDIA sees texture bandwidth outweighing color and z bandwidth in the not too distant future. This doesn't mean that the quest for ever increasing bandwidth will stop; it just means that the reasons we need more bandwidth will change.
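
As a quick sanity check on those numbers, peak theoretical bandwidth falls straight out of the bus width and the effective data rate. The sketch below is our own back-of-the-envelope arithmetic, not an NVIDIA-supplied figure:

```python
# Peak theoretical memory bandwidth from bus width and effective data rate.
# Back-of-the-envelope arithmetic; assumes the full 256-bit bus transfers
# data on every effective clock.

BUS_WIDTH_BITS = 256        # 4x 64-bit GDDR3 channels
EFFECTIVE_RATE_HZ = 1.2e9   # 1.2 GHz effective data rate (600 MHz DDR)

bytes_per_transfer = BUS_WIDTH_BITS / 8          # 32 bytes per transfer
bandwidth_gbps = bytes_per_transfer * EFFECTIVE_RATE_HZ / 1e9

print(f"Peak theoretical bandwidth: {bandwidth_gbps:.1f} GB/s")  # 38.4 GB/s
```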

A good example of the changing needs of graphics cards is Half-Life 2. While the game runs very well even on older graphics cards like the 9800 Pro, the design is such that increased memory bandwidth is far less important than additional shader processing power. This is why we see the 6600 GT significantly outperform the 9800 Pro. Even more interesting is that in our testing, enabling 4xAA on a 9800 Pro barely affected HL2 performance, while increasing the resolution from 1024x768 to 1280x1024 had a substantial impact on frame rates. If HL2 is a good model for the future of 3D engines, NVIDIA's decision to increase pixel processing power while leaving memory bandwidth for the future makes a lot of sense.

On an interesting side note, the performance tests in this article mostly focus on 1600x1200 and higher resolutions. Memory usage at 2048x1536 with 32-bit color and a 32-bit z-buffer runs a solid 144MB for double buffered rendering with 4xAA. This makes a 256MB card a prerequisite for this setup, and depending on textures, render targets, and other local memory usage, even 256MB may come up a little short. PCI Express helps alleviate some of the burden placed on system memory, but it is conceivable that some games could get choppier when swapping large textures, normal maps, and the like in and out.

We don't feel that ATI's 512MB X850 really brings anything necessary to the table, but with this generation, we could start to see a real use for 512MB of local memory. MRTs, larger textures, normal maps, vertex textures, huge resolutions, and a lack of hardware compression for fp16 and fp32 textures all mean that we are on the verge of seeing games push memory usage way up. Processing these huge stores of data requires GPUs powerful enough to utilize them efficiently, and the G70 begins to offer that kind of power. For the majority of today's games, 256MB of RAM is fine, but moving into the future, it's easy to see how more would help.
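
To put the compression point in perspective, here is a rough illustration of what a single large texture costs in each format. This is our own arithmetic for the base mip level only (a full mip chain adds roughly a third more), not a figure from NVIDIA or from any particular game:

```python
# Footprint of a single 2048x2048 texture under different formats.
# DXT1 block compression stores each 4x4 texel block in 8 bytes
# (0.5 bytes/texel); fp16 and fp32 RGBA have no hardware compression
# on this generation, so they cost their full per-texel size.

FORMATS = {
    "RGBA8 (uncompressed)": 4.0,   # bytes per texel
    "DXT1 (compressed)":    0.5,
    "RGBA fp16":            8.0,
    "RGBA fp32":           16.0,
}

texels = 2048 * 2048
for name, bytes_per_texel in FORMATS.items():
    print(f"{name:>22}: {texels * bytes_per_texel / 2**20:6.1f} MB")
```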

In addition to these issues, a 512MB card would be a wonderful fit for dual-link DVI. This would make the part a nice companion to Apple's largest Cinema Display (whose native resolution is currently beyond the maximum supported by the GeForce 7800 GTX). In case anyone is curious, a double buffered 4xAA 32-bit color+z framebuffer at 2560x1600 is about 190MB.
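
Both framebuffer numbers above follow from simple arithmetic. The sketch below is our own reconstruction, assuming two multisampled color buffers plus one multisampled depth buffer at 4 bytes per sample each; actual driver allocations (such as an extra non-AA front buffer for the resolve) may differ slightly:

```python
# Framebuffer footprint for double-buffered rendering with 4xAA, 32-bit
# color, and a 32-bit z-buffer. Our own reconstruction of the figures in
# the text; assumes two multisampled color buffers plus one multisampled
# depth buffer, each storing 4 bytes per sample.

def framebuffer_mb(width, height, aa_samples=4, surfaces=3, bytes_per_sample=4):
    """Total size in MB of `surfaces` full-screen buffers at `aa_samples`x AA."""
    return width * height * aa_samples * surfaces * bytes_per_sample / 2**20

print(f"2048x1536: {framebuffer_mb(2048, 1536):.1f} MB")  # 144.0 MB
print(f"2560x1600: {framebuffer_mb(2560, 1600):.1f} MB")  # 187.5 MB (~190MB)
```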

In our briefings on G70, we were told that every part of the chip has been at least slightly updated from NV4x, but the general architecture and feature set remain the same. There have been a couple of more significant updates as well, namely the increased performance of a single shader pipe and the addition of transparency antialiasing. Let's take a look at these changes now.

Comments

  • multiblitz - Sunday, June 26, 2005 - link

It would be great if you could do a comparison between the 6800 and the 7800 in video/DVD playback quality, similar to the comparison between the X800 and the 6800 that you did last year.
  • at80eighty - Saturday, June 25, 2005 - link

    OMG! I've never seen so many bitching whiners come outta the woodworks like this!!

    You A-holes oughta remember that this site has been kept free

    F
    R
    E
    E

    The editors owe YOU nothing. At all.

    AT team - accidents happen. Keep up the great work!

/#121 : well said. Amazing how these turds don't realise that the knife cuts both ways...
  • mrdeez - Friday, June 24, 2005 - link

    #124
You can stfu too... j/k... point taken.

I guess the real issue for me is that this card is a beast, but I'll never have it in my SLI rig... I want all settings maxed at playable resolutions, that's just me... and I will not go back to CRT... lol, CRT, that was lame, dude.
  • Momental - Friday, June 24, 2005 - link

#122 The problem with your solution regarding "all of us just getting two 6800U's" is that it works perfectly only for those with an SLI-capable board, yes? Some of us, like myself, anticipated the next generation of GPUs like the 7800 series and opted to simply upgrade to one of those once the dust settled and prices slid back a bit.

    Additionally, telling someone to "STFU" isn't necessary. We can't hold a conversation if we're all silent. Knowhuddamean, jellybean? Hand gestures don't work well over the internet, but here's one for you..........
  • SDA - Friday, June 24, 2005 - link

    LCD gamers shouldn't be bothering with new graphics cards, they should get new monitors.

Kidding, I have nothing against LCDs. The real advantage of showing the card run at 2048x1536 is that it lets you see how well the card scales to more stressful scenarios. A card that suddenly gets swamped at higher resolutions probably won't fare well in future games that need more memory bandwidth.

On a side note, you can get a CRT that will run 2048x1536 at a reasonable refresh rate for about $200 shipped (any Sony G520 variant, such as the Dell P1130). The only things that would actually be small in games are the 2D objects that have set pixel sizes; everything else looks beautiful.
  • mrdeez - Friday, June 24, 2005 - link

    #121
lol, ty for your insight... anyway, like I said, this card is not for LCD gamers, as most have a 12x10 or 16x12 panel... so what purpose does this card have? Answer me this, Batman, and you have the group that should buy this card. Otherwise, the rest of us should just get two 6800Us... this card is geared more toward workstation graphics than gaming... unless you game on a hi-def CRT, and even then max res would be 1920x1080i... or something like that...
  • SDA - Friday, June 24, 2005 - link

#116, if people in the comments thread are allowed to give their opinions, why shouldn't #114 give his too? Surely even an illiterate like you should realize that arguing that everyone is entitled to his or her own opinion means that the person you're arguing with is too.

    #119, some people have different requirements than others. Some just want no visible blur, others want the best contrast ratio and color reproduction they can get.
  • bob661 - Thursday, June 23, 2005 - link

#118
    Oh yeah. The monitor goes up to 16x12.
  • bob661 - Thursday, June 23, 2005 - link

    #118
    I play BF2 on a Viewsonic VP201b (20.1") at work and it's very good. No streaking or ghosting. Video card is a 6800GT. I play at 1280x960.
  • Icehawk - Thursday, June 23, 2005 - link

    Well, I for one think 1280x1024 is pretty valid as that is what a 19" LCD can do. I'd just want to see a maxed out 12x10 chart to see how it does - I know a 6800 can't do it for every game with full AA and AF. Otherwise I agree - a 12x10 with no options isn't going to show much with current games.

    See, I'm considering swapping my two 21" CRTs for two 19" LCDs - and they won't do more than 12x10. I'd love to do two 20-21" LCDs but the cost is too high and fast panels aren't to be found. 19" is the sweet spot right now IMO - perhaps I'm wrong?

    Thanks AT for a nice article - accidents happen.
