ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Battlefield 2 Performance
Battlefield 2 has been a standard in our performance benchmarks for some time, and it's probably one of our most important tests. Thanks to its impressive engine, this game still stands out as one that makes the best use of the newest generation of graphics hardware available right now.
One of the first things to note here is a theme that runs throughout all of the performance tests in this review: the X1900 XTX and X1900 XT perform very similarly to each other, in some cases differing by only a couple of frames per second. This is significant considering that the X1900 XTX costs about $100 more than the X1900 XT.
Below we have two sets of graphs for three different settings: no AA, 4xAA/8xAF, and maximum quality (higher AA and AF settings in the driver). Note that our Battlefield 2 benchmark had problems with NVIDIA's SLI, so we were forced to omit those numbers. With and without AA, the ATI cards perform very similarly to one another, as do the NVIDIA cards. Since ATI tends to handle AA a little better than NVIDIA, though, ATI holds a slight edge here. With the maximum quality settings, we see a large (and expected) reduction in performance. Keep in mind that in the driver options NVIDIA can enable AA up to 8X while ATI tops out at 6X, so those numbers aren't directly comparable.
120 Comments
DerekWilson - Tuesday, January 24, 2006 - link
this is where things get a little fuzzy ... when we used to refer to an architecture as being -- for instance -- 16x1 or 8x2, we referred to the pixel shaders' ability to texture a pixel. Thus, when an application wanted to perform multitexturing, the hardware would perform about the same -- single pass graphics cut the performance of the 8x2 architecture in half because half the texturing power was sitting idle ... this was much more important for early DX, fixed pipe, or OpenGL based games. DX9 threw all that out the window, as it is now common to see many instructions and cycles spent on any given pixel. in a way, since there are only 16 texture units you might be able to say it's something like 48x0.333 ... it really isn't possible to texture all 48 pixels every clock cycle ad infinitum. in an 8x2 architecture you really could texture each of 8 pixels with 2 textures every clock cycle forever.
to put it more plainly, we are now doing much more actual work with the textures we load, so the focus has shifted from "texturing" a pixel to "shading" a pixel ... or fragment ... or whatever you wanna call it.
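To make the fill-rate arithmetic above a bit more concrete, here's a rough back-of-the-envelope sketch (purely illustrative numbers for the 8x2, 16x1, and 48-shader/16-TMU cases; not a claim about real-world throughput):

```python
# Rough texel-throughput arithmetic, illustrative only.
# "NxM" designs: N pixel pipes, each able to apply M textures per clock.
def texels_per_clock(pixel_pipes, textures_per_pipe):
    return pixel_pipes * textures_per_pipe

classic_8x2 = texels_per_clock(8, 2)    # 16 texels/clock, 2 per pixel, sustainable every cycle
classic_16x1 = texels_per_clock(16, 1)  # 16 texels/clock, 1 per pixel

# R580-style: 48 pixel shader units but only 16 texture units, so if every
# shader wants a texel on the same clock, each pixel can only be textured
# about once every 3 clocks on average -- the "48x0.333" figure above.
shader_units = 48
texture_units = 16
effective_textures_per_pixel_per_clock = texture_units / shader_units  # ~0.333

print(classic_8x2, classic_16x1, round(effective_textures_per_pixel_per_clock, 3))
```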
it's entirely different than Xenos, as Xenos uses a unified shader architecture.
interestingly though, R580 supports a render to vertex buffer feature that allows you to turn your pixel shaders into vertex processors and spit the output straight back into the incoming vertex data.
but i digress ....
aschwabe - Tuesday, January 24, 2006 - link
I'm wondering how a dual 7800GT/7800GTX setup stacks up against this card. i.e., is the brand new system I bought literally 24 hours ago going to be able to compete?
Live - Tuesday, January 24, 2006 - link
SLI figures are all over the review. Go read and look at the graphs again.
aschwabe - Tuesday, January 24, 2006 - link
Ah, my bad, thanks.
DigitalFreak - Tuesday, January 24, 2006 - link
Go check out the review on hardocp.com. They have benchies for both the GTX 256 & GTX 512, SLI & non-SLI.
Live - Tuesday, January 24, 2006 - link
No, my bad. I'm a bit slow. Only the GTX 512 SLI numbers are in there. Sorry!
Viper4185 - Tuesday, January 24, 2006 - link
Just a few comments (some are very picky, I know):
1) Why are you using the latest hardware with an old Seagate 7200.7 drive when the 7200.9 series is available? Also, no FX-60?
2) Disappointing to see no power consumption/noise levels in your testing...
3) You are like the first site to show Crossfire XTX benchmarks? I am very confused... I thought there was only an XT Crossfire card, so how do you get Crossfire XTX benchmarks?
Otherwise good job :)
DerekWilson - Tuesday, January 24, 2006 - link
crossfire xtx indicates that we ran a 1900 crossfire edition card in conjunction with a 1900 xtx ... this is as opposed to running the crossfire edition card in conjunction with a 1900 xt. crossfire does not synchronize GPU speed, so performance will be (slightly) better when pairing the faster card with the crossfire edition card.
fx-60 is slower than fx-57 for single threaded apps
power consumption was supposed to be included, but we have had some power issues. We will be updating the article as soon as we can -- we didn't want to hold the entire piece in order to wait for power.
hard drive performance is not going to affect anything but load times in our benchmarks.
DigitalFreak - Tuesday, January 24, 2006 - link
See my comment above. They are probably running an XTX card with the Crossfire Edition master card.
OrSin - Tuesday, January 24, 2006 - link
Are gamers going insane? $500+ for a video card is not a good price. Maybe it's just me, but are bragging rights really worth that kind of money? Even if you played a game that needs it, you should be pissed at the game company that puts out a bloated mess that needs a $500 card.