256MB vs. 512MB - The Real World Performance Difference

More local GPU memory is never a bad thing, but it has to be put to use to be worth its high cost. That means we need games with larger textures and higher detail levels to truly require 512MB cards, and given that the majority of gamers still have 64MB or less on their graphics cards, it's going to be a while before 512MB is necessary. Game developers are notorious for developing "for the masses" and thus will spend very little time on features that can only be taken advantage of by owners of $500+ graphics cards, today's 512MB cards included.

ATI's own marketing literature claims that the X800 XL 512MB offers up to a 40% performance increase over the 256MB X800 XL...at 1600 x 1200, with 6X anti-aliasing and 16X anisotropic filtering enabled. The problem is that at such high resolutions with AA/AF cranked up, the X800 XL doesn't have the fill rate or the memory bandwidth to offer reasonable frame rates in most games, which is why we find the X800 XL 512MB to be more of a mismatch than anything else. A faster GPU with more memory bandwidth would see more real-world benefit from 512MB of memory than the X800 XL does.
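
To put some rough numbers behind that mismatch, here is a quick back-of-envelope sketch. The constants (an assumed ~31 GB/s of memory bandwidth for the X800 XL's 256-bit GDDR3, and a guess of three framebuffer touches per sample per frame) are our own illustrative assumptions, not measured values:

    # Back-of-envelope math: framebuffer size and bandwidth demand at
    # 1600 x 1200 with 6X multisample AA. All constants are illustrative
    # assumptions, not measured or ATI-published figures.
    WIDTH, HEIGHT = 1600, 1200
    BYTES_PER_PIXEL = 4          # 32-bit color
    AA_SAMPLES = 6               # 6X multisample AA
    ASSUMED_BANDWIDTH_GBS = 31   # assumed X800 XL memory bandwidth

    pixels = WIDTH * HEIGHT
    # Multisampled color plus Z/stencil, stored per sample
    framebuffer_bytes = pixels * BYTES_PER_PIXEL * AA_SAMPLES * 2
    print(f"AA framebuffer alone: {framebuffer_bytes / 2**20:.0f} MB")  # ~88 MB

    # Guess: each sample is read/written about three times per frame
    traffic_per_frame = framebuffer_bytes * 3
    gb_per_s_at_60fps = traffic_per_frame * 60 / 2**30
    print(f"Framebuffer traffic at 60 fps: {gb_per_s_at_60fps:.1f} GB/s")
    print(f"...out of an assumed {ASSUMED_BANDWIDTH_GBS} GB/s total")

Even under these charitable assumptions, framebuffer traffic alone would eat roughly half of the card's memory bandwidth before a single texture is fetched, which is exactly why frame rates collapse at these settings long before the extra 256MB of memory can pay off.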

That being said, let's look at the performance breakdown for the X800 XL 256MB vs. X800 XL 512MB at 1600 x 1200 with 4X AA and 8X AF enabled - pretty aggressive settings for the X800 XL to begin with.

As you can see, in four of the five benchmarks there is less than a 1% performance advantage to having 512MB on the X800 XL, even at these aggressive settings. In Half Life 2, the 512MB card actually offers a fairly reasonable 11% increase in performance, but in the other games, the advantage is effectively nil. The other thing to keep in mind is that 1600x1200 with 4X AA and 8X AF enabled is not the sweet spot for the X800 XL. In Chronicles of Riddick, for example, frame rates at these settings just aren't smooth at all.

The Half Life 2 result is particularly interesting, but it was the only game we encountered where the gain was not only reasonable, but the game also remained fairly smooth in actual gameplay. However, at the price of the X800 XL 512MB, you are better off simply purchasing an X850 XT and getting better performance across the board, Half Life 2 included.

Although the single graph on this page pretty much tells the story of the X800 XL 512MB, we've included performance results from both X800 XL cards, the X850 XT, and NVIDIA's GeForce 6800GT and 6800 Ultra on the coming pages, if you want to see things in perspective. We included the X850 XT and 6800 Ultra in the comparisons because they are priced similarly to the X800 XL 512MB's suggested retail price.

The Test

AMD Athlon 64 Configuration

Athlon 64 4000+ Socket-939 CPU
2 x 512MB OCZ PC3200 EL Dual Channel DIMMs 2-2-2-10
ASUS nForce4 SLI Motherboard
ATI Catalyst 5.4 Drivers
NVIDIA 71.89 Drivers

70 Comments

  • aliasfox - Wednesday, May 4, 2005 - link

    The x800xl is not the best card on the Mac side... the 6800 Ultra DDL and x800XT have been available for some time, both with DDL support for the 30" display.

    Of course, I'm just waiting for someone to hook a massive SLI system up to a 30" and actually *play* a modern game at 2560x1600... that would surely be a sight to see...
  • Cuser - Wednesday, May 4, 2005 - link

    GED: I agree with you on the OpenGL thing. I thought NVIDIA was in the lead with OpenGL support. My old ATI would not play in OpenGL at all!

    However, I disagree with your second post. If it were that simple, why do rendered games need so much RAM? I am pretty sure there are other things stored in video RAM when rendering the OS in 3D, including textures. I think the days of VRAM being used as JUST a frame buffer in the OS are numbered.
  • Ged - Wednesday, May 4, 2005 - link

    "2) On a MACINTOSH with Tiger, extra VRAM is a very good thing for the future, considering how the Quartz 2D Extreme will work (utilizing the GPU for OS rendering, caching it in VRAM)"

    You shouldn't need close to 512MB of VRAM for displaying OSX's 'Quartz 2D Extreme'.

    Consider that Apple's largest display is 2560x1600 pixels (4,096,000 pixels total) x 24 bpp (I'm assuming 24bpp) = 98,304,000 bits per frame (12,288,000 bytes per frame).

    Even if you had 20 full-screen frames which you composited together for the full effect, it would only take 245,760,000 bytes, which is still within the 256MB on current generations of cards (at 32bpp, 16 complete frames still fit within 256MB).

    I seriously doubt that anything Quartz would do would need that much VRAM. If Quartz does need that much VRAM, I think something is really, really wrong.

    Someone please correct my math if I'm off, but something's wrong if you need 512MB to display 2D OS graphics.
  • Ged - Wednesday, May 4, 2005 - link

    "1) ATI is better at OpenGL than Nvidia"

    Everything I have read and all the benchmarks I have seen are the opposite of this claim.

    What am I missing?
  • racolvin - Wednesday, May 4, 2005 - link

    Anand:

    Two things to remember about these cards:

    1) ATI is better at OpenGL than Nvidia
    2) On a MACINTOSH with Tiger, extra VRAM is a very good thing for the future, considering how the Quartz 2D Extreme will work (utilizing the GPU for OS rendering, caching it in VRAM)

    Everything on the Mac is OpenGL, so ATI testing the waters with this part is not surprising when I think of it in the Mac-future context. The X800XL would still be the best card available for the Mac folks should it make it to that side of the fence ;)

    R
  • StrangerGuy - Wednesday, May 4, 2005 - link

    Yet another pointless release from ATI (or NVIDIA, for that matter) with their respective 512MB cards.

    BTW, I saw an advertisement for a 512MB X300SE by ECS via a link from the AT forums. Anyone have a 486 with 512MB yet?
  • Cuser - Wednesday, May 4, 2005 - link

    Bersl2: I sympathize with where you are coming from. However, there are two things that you are not considering.

    First, Microsoft controls the operating system, so it stands to reason that they control (if not heavily influence) the ways and methods by which developers, programmers, and hardware alike interact with that system (and vice versa).

    Secondly, the DirectX API is developed in collaboration with the graphics industry (or at least so I am led to believe).

    I think the real reason to get upset by this is if Microsoft abuses this power (do they?).

    Furthermore, I myself would like to see more titles using OpenGL, if only to promote diversity and avoid dependence on a single API.

    I am not a developer, but it seems that more and more games are using only the DirectX API because Microsoft pushes it (i.e., it's easier to get information, training, and SDKs, and it's advertised everywhere).

    OpenGL is an open standard and does not seem to have that type of support. But this is just my observation so far. Maybe someone in the industry could shed some light on this...
  • bersl2 - Wednesday, May 4, 2005 - link

    Cuser: Your assertion I agree with, if not your reasoning.

    Think about this: *Why* should Microsoft be the one controlling the API? The games they've "made" have (almost?) all been bought from somebody else. They don't make hardware either. Why is everybody being led around on a leash by the middleman?

    It's frustrating when there's very little reason (and if there is a halfway decent reason, I'd like to hear about it---and don't give me that "ease of use" crap; any competent programmer can use either API rather well) not to use OpenGL over Direct3D, other than the "Nobody ever got fired for choosing Microsoft" mentality.
  • aliasfox - Wednesday, May 4, 2005 - link

    I'm sure Aero Glass in all its glory will want more than 64 MB of VRAM, given that a) it's Microsoft (not known for its sleek software) and b) it's still a year away.

    My reasoning? Mac OSX (don't shoot me). OSX's Aqua interface is also 3D in the absolute barest sense of the word, and the rendering engine uses the GPU for textures and such (my understanding, at least). Features such as Exposé on Panther (which has been out for ~18 months) are much, much happier with 64 MB of VRAM than with 32.

    In addition, some of Apple's new features in Tiger need a GPU with DX9-class features. I have a feeling Microsoft will do the same thing with Longhorn. But perhaps it will need no more than 128 MB of VRAM, I would hope.
  • fishbits - Wednesday, May 4, 2005 - link

    LOL, guess we'll have to wait and see what the real reasoning for designing the card was, and how good a buy it will be (and for whom and when). Anand nailed it when he said it was a raw deal for most of us as things stand now.
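
For what it's worth, the VRAM arithmetic in Ged's comments above is easy to check. A minimal sketch, assuming a 2560x1600 display, tightly packed 24bpp/32bpp frames, and binary megabytes:

    # Quick check of the frame-count arithmetic from Ged's comments above.
    WIDTH, HEIGHT = 2560, 1600
    pixels = WIDTH * HEIGHT        # 4,096,000 pixels

    frame_24bpp = pixels * 3       # 12,288,000 bytes per packed 24bpp frame
    frame_32bpp = pixels * 4       # 16,384,000 bytes per 32bpp frame
    vram = 256 * 2**20             # 268,435,456 bytes (256MB)

    print(vram // frame_24bpp)     # 21 full frames fit at 24bpp
    print(vram // frame_32bpp)     # 16 full frames fit at 32bpp

Even a generous stack of full-screen frames fits comfortably within 256MB, which supports the point that a 2D desktop compositor alone is unlikely to justify 512MB of VRAM.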
