Image Quality, Feature Tests, and Power

Something we'd like to look at in a bit more depth for this review is image quality. It's no secret that, because ATI and NVIDIA render scenes differently, there will always be some variation in output from one brand to the other. Most of the time this variation is too subtle to notice, but on closer inspection certain patterns emerge.



[Image quality comparison: Black and White 2, ATI vs. NVIDIA screenshots (no AA, in-game "high" AA, driver maximum quality)]


With Black and White 2, we can see how well the in-game maximum AA does at cleaning up the image. Note the significant difference between the edges in the picture without AA and the one with "high" AA enabled in the game. However, we don't see the same kind of difference between the image without AA and the one with maximum quality enabled in the graphics driver. This is a good example of in-game AA doing a much better job, both in quality and in performance, than the maximum quality settings in the control panel. We suspect that Black and White 2 implements a custom AA algorithm and has issues running stock MSAA, so we recommend using the game's in-game AA rather than the control panel's AA settings.
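For context on what the control panel is overriding: a Direct3D 9 title normally requests multisampling itself when it creates its device. The snippet below is a minimal sketch of that query (the calls are standard D3D9 API, but the format and sample count chosen here are purely illustrative, and whether Black and White 2 uses this exact path is our assumption). When a game replaces this path with its own AA, driver-forced MSAA has to be bolted on from outside, which is one reason results can differ.

```cpp
#include <d3d9.h>
#include <cstdio>

// Minimal sketch: ask the runtime whether the adapter supports 4x MSAA
// for a typical back-buffer format before creating the device.
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,            // back-buffer format (illustrative)
        FALSE,                      // FALSE = fullscreen
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);

    if (SUCCEEDED(hr))
    {
        // A game would now fill in the MultiSampleType/MultiSampleQuality
        // fields of D3DPRESENT_PARAMETERS before CreateDevice().
        // Control-panel AA forced in the driver bypasses this step.
        printf("4x MSAA supported, %lu quality level(s)\n", qualityLevels);
    }
    else
    {
        printf("4x MSAA not supported on this adapter/format\n");
    }

    d3d->Release();
    return 0;
}
```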

Both ATI and NVIDIA hardware look great and render very similar images, and luckily for ATI, an upcoming patch should improve its performance in this game.



[Image quality comparison: Quake 4, ATI vs. NVIDIA screenshots]

In the Quake 4 images, pay special attention to the white binder on the console and how the edges of the object change with the different settings. This gives us a very clear view of how AA gets rid of jaggies and cleans up the overall image. Again, the difference between the image without AA and the one with maximum settings enabled in the driver is much more subtle than what the in-game AA achieves.



[Image quality comparison: Battlefield 2, ATI vs. NVIDIA screenshots]


Battlefield 2 gives us a good view of how the maximum quality settings in the control panel (specifically transparency AA) fix certain graphical problems in games. Fences in particular have a tendency to render inaccurately, especially when viewed through at certain angles; they are typically drawn as alpha-tested cut-out textures, and ordinary MSAA only smooths polygon edges, so it never touches the edges the alpha test carves out. While the in-game AA without adaptive or transparency AA cleans up a lot of jagged edges (the flag pole, for instance), it still has trouble with parts of the fence.
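Here is a minimal sketch of the Direct3D 9 render states behind that cut-out technique, assuming a D3D9 renderer (the render states are standard D3D9; the 0x80 threshold is an illustrative value, not anything taken from Battlefield 2):

```cpp
#include <d3d9.h>

// Sketch: enable classic D3D9 alpha testing for a fence/foliage texture.
// Pixels whose alpha falls below the reference value are discarded, which
// carves the chain-link pattern out of a flat quad. MSAA never sees these
// cut-out edges as geometry, so it can't smooth them; that's the gap
// transparency/adaptive AA fills in the driver by supersampling such
// surfaces instead.
void EnableAlphaTest(IDirect3DDevice9* dev)
{
    dev->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
    dev->SetRenderState(D3DRS_ALPHAREF, 0x80);                 // illustrative threshold
    dev->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_GREATEREQUAL); // keep pixels >= threshold
}
```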

As for power, we ran the multitexturing and pixel shader feature tests in 3DMark06 and measured the maximum power load with our trusty Kill-A-Watt. This measures AC power at the wall, before the PSU, so the numbers reflect total system draw rather than the graphics cards alone.

We can see that the CrossFire and SLI systems pull enormous amounts of power, but even as a single card the X1900 XTX is a very hungry part.
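Since the Kill-A-Watt reads AC power at the wall, estimating what a card actually draws takes a little arithmetic: subtract the single-card system's reading from the dual-card reading, then scale by the PSU's efficiency to move from the AC side to the DC side. A quick sketch of that estimate (the 459 W and 341 W readings and the ~78% efficiency figure come from the reader discussion below and are illustrative, not lab-verified):

```cpp
#include <cstdio>

// Estimate the DC draw of the second card from two wall (AC) readings.
// Wall power includes PSU conversion losses, so multiply the delta by
// efficiency to approximate the power actually delivered to the card.
double EstimateCardWatts(double dualCardWallW, double singleCardWallW,
                         double psuEfficiency)
{
    double extraWallW = dualCardWallW - singleCardWallW; // AC-side delta
    return extraWallW * psuEfficiency;                   // DC-side estimate
}

int main()
{
    // Illustrative figures: 459 W (CrossFire) vs. 341 W (single X1900 XTX)
    // at the wall, with an assumed ~78% PSU efficiency at this load.
    double watts = EstimateCardWatts(459.0, 341.0, 0.78);
    printf("Estimated second-card draw: %.1f W\n", watts); // ~92.0 W
    return 0;
}
```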

[Chart: Load Power (watts at the wall)]


Comments

  • bob4432 - Thursday, January 26, 2006 - link

    Good for ATI; after some issues in the not-so-distant past, it looks like the pendulum has swung back in their direction.

    I really like this. It should drop 7800 GT prices, maybe to the ~$200-$220 range (hoping, as NVIDIA wants to keep its hold on the market...), which would actually give me a reason to switch to some flavor of PCI-E based m/b. But being display-limited at 1280x1024 with an LCD, my X800 XT PE is still chugging along nicely :)
  • Spoelie - Thursday, January 26, 2006 - link

    It won't; they're in a different price range altogether. Prices on those cards will not drop before ATI brings out a capable competitor.
  • neweggster - Thursday, January 26, 2006 - link

    How hard would it be for this new series of cards from ATI to be optimized for all benchmarking software? Well, ask yourself that. I just got done talking to a buddy of mine who's working at MSI. I swear I freaked out when he said that ATI is using an advantage they found by optimizing the new R580s to work better with the newest benchmarking programs like 3DMark06 and such. I argued with him that's impossible, or is it? Please let me know: did ATI possibly use optimizations built into the new R580 cards to gain this advantage?
  • Spoelie - Thursday, January 26, 2006 - link

    How would dedicating die space on a GPU to cheats make any sense? If there is any cheat, it's in the drivers. And no, the only thing is that 3DMark06 needs 24-bit DSTs for its shadowing; that wasn't supported on the X1800 XT (it used a hack instead) and it is supported now. Is that cheating? The X1600 and X1300 have support for this as well, btw, and they came out at the same time as the X1800.

    If architecturally optimizing for one kind of rendering counts as cheating, then NVIDIA would be a really bad company for what they did with the 6x00 series and the Doom 3 engine. But no one is complaining about higher framerates in those situations now, are they?
  • Regs - Thursday, January 26, 2006 - link

    ....Where in this article do you see a 3D Mark score?
  • mi1stormilst - Thursday, January 26, 2006 - link

    It is not impossible, but unless your friend works in some high-level capacity, I would say his comments are questionable at best. I don't think working in shipping qualifies him as an expert on the subject.
  • coldpower27 - Wednesday, January 25, 2006 - link

    http://www.anandtech.com/video/showdoc.aspx?i=2679...

    "Notoriously demanding on GPUs, F.E.A.R. has the ability to put a very high strain on graphics hardware, and is therefore another great benchmark for these ultra high-end cards. The graphical quality of this game is high, and it's highly enjoyable to watch these cards tackle the F.E.A.R demo."


    Wasn't use of this considered a bad idea, since NVIDIA cards take a huge performance penalty in it, and the final build was supposed to be much better?
  • photoguy99 - Wednesday, January 25, 2006 - link

    I noticed 1920x1440 is commonly benchmarked.

    Wouldn't the majority of people with displays in this range have 1920x1200 since that's what all the new LCDs are using? And it's the HD standard.

    Aren't LCDs getting to be pretty capable game displays? My 24" Acer has a 6 ms (claimed) gray-to-gray response time, and can at least hold its own.

    Resolution for this monitor and almost all others this large: 1920x1200 - not 1920x1440.
  • Per Hansson - Wednesday, January 25, 2006 - link

    Doing the math:

    CrossFire = 459 W; single 1900 XTX = 341 W; difference = 118 W. PSU efficiency at a ~400 W load = 78%, so 118 x 0.78 = 92.04 W.
  • Per Hansson - Friday, January 27, 2006 - link

    No replies, huh? 'Cause I've read on other sites that the card draws up to 175 W... Seems like quite a stretch, so that was why I did the math to start with...
