Pixel Shader Performance Tests

ShaderMark v2.0 is a program designed to stress test the shader performance of modern DX9 graphics hardware using Shader Model 2.0 programs written in HLSL, run on a handful of simple shapes in a scene.

We haven't used ShaderMark in the past because we don't advocate trying to predict the performance of real world game code from a synthetic set of tests designed to push the hardware. Honestly, as we've said before, the only way to determine the performance of a certain program on specific hardware is to run that program on that hardware. As both software and hardware become more complex, the results of any given test become less and less generalizable, and games, graphics hardware, and modern computer systems are some of the most complex entities on earth.

So why are we using ShaderMark, you may ask. There are a couple of reasons. First, this is only a ballpark test. ATI and NVIDIA both have architectures that should be able to push a lot of shader operations through. It is a fact that NV3x had a bit of a handicap when it came to shader performance. A cursory glance at ShaderMark should tell us whether that handicap carries over to the current generation of cards, and whether or not R420 and NV40 are on the same playing field. We don't want to make a direct comparison; we just want to get a feel for the situation. With that in mind, here are the benchmarks.

 

ShaderMark v2.0 results in frames per second (higher is better); a dash marks shaders for which no score was reported for that card.

| Shader | Radeon X800 XT PE | Radeon X800 Pro | GeForce 6800 Ultra | GeForce 6800 GT | GeForce FX 5950 U |
|-------:|------------------:|----------------:|-------------------:|----------------:|------------------:|
| 2      | 310               | 217             | 355                | 314             | 65                |
| 3      | 244               | 170             | 213                | 188             | 43                |
| 4      | 238               | 165             | -                  | -               | -                 |
| 5      | 211               | 146             | 162                | 143             | 34                |
| 6      | 244               | 169             | 211                | 187             | 43                |
| 7      | 277               | 160             | 205                | 182             | 36                |
| 8      | 176               | 121             | -                  | -               | -                 |
| 9      | 157               | 107             | 124                | 110             | 20                |
| 10     | 352               | 249             | 448                | 410             | 72                |
| 11     | 291               | 206             | 276                | 248             | 54                |
| 12     | 220               | 153             | 188                | 167             | 34                |
| 13     | 134               | 89              | 133                | 118             | 20                |
| 14     | 140               | 106             | 141                | 129             | 29                |
| 15     | 195               | 134             | 145                | 128             | 29                |
| 16     | 163               | 113             | 149                | 133             | 27                |
| 17     | 18                | 13              | 15                 | 13              | 3                 |
| 18     | 159               | 111             | 99                 | 89              | 17                |
| 19     | 49                | 34              | -                  | -               | -                 |
| 20     | 78                | 56              | -                  | -               | -                 |
| 21     | 85                | 61              | -                  | -               | -                 |
| 22     | 47                | 33              | -                  | -               | -                 |
| 23     | 49                | 43              | 49                 | 46              | -                 |

These benchmarks are run at fp32 on NVIDIA hardware and fp24 on ATI hardware. That isn't quite an apples-to-apples comparison, but with some of the shaders used in ShaderMark, partial precision (fp16) floating point causes error to accumulate (which is not surprising, since this is a benchmark designed to stress shader performance).
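To illustrate the error-accumulation point, here is a stdlib-only Python sketch that emulates the three significand widths in play (11 bits for fp16 partial precision, 17 bits for ATI's fp24, 24 bits for fp32, implicit leading bit included) by rounding after every addition. This illustrates the numerics only; it is not a model of how either GPU actually computes.

```python
# Repeatedly add a small step while rounding the running total to a
# reduced significand width, emulating fp16 / fp24 / fp32 arithmetic.
import math

def quantize(x, sig_bits):
    """Round x to sig_bits significant binary digits (round-to-nearest)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** sig_bits
    return math.ldexp(round(m * scale) / scale, e)

def accumulate(sig_bits, n=10000, step=1e-4):
    """Add `step` n times, rounding the running total after every add."""
    total = 0.0
    for _ in range(n):
        total = quantize(total + step, sig_bits)
    return total

fp32 = accumulate(24)   # lands very close to the exact answer, 1.0
fp24 = accumulate(17)   # still close to 1.0 for this workload
fp16 = accumulate(11)   # stalls: once the total is large enough, each
                        # small add rounds away to nothing
print(fp32, fp24, fp16)
```

With these parameters, the fp16 total gets stuck around 0.25, because the gap between adjacent representable fp16 values there is already larger than twice the increment being added; fp24 and fp32 have headroom to spare.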

ShaderMark v2.0 clearly shows a huge increase in pixel shader performance from NV38 to either flavor of NV40. Even though the results can't really be compared apples to apples (because of the difference in precision), NVIDIA manages to keep up with the ATI hardware fairly well. In fact, the diffuse lighting and environment mapping, shadowed bump mapping, and water color shaders don't show ATI wiping the floor with NVIDIA.

Looking at data collected with the 60.72 version of the NVIDIA driver, no frame rates changed, and a visual inspection of the images output by each driver raised no red flags.

We would like to stress again that these are not apples-to-apples numbers, but the relative performance of each GPU indicates that the ATI and NVIDIA architectures are very close to comparable from a pixel shader standpoint (with each architecture favoring different types of shaders and operations).

In addition to getting a rough idea of performance, we can also look deeper into the heart of NV40 and see what performance gains come from enabling partial precision rendering mode. As we have stated before, there were a few image quality issues with the types of shaders ShaderMark runs, but this bit of analysis sticks only to how much work gets done in the same amount of time, without regard to the relative quality of that work.
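As a rough illustration of the accuracy half of that trade-off, the sketch below (plain Python, rounding every intermediate to an emulated significand width; again, not a model of NV40's actual ALUs) normalizes a vector at fp16 and fp32 precision and measures how far the result lands from unit length:

```python
# Normalize a vector with every intermediate rounded to an emulated
# significand width, then check how close the result is to unit length.
import math

def quantize(x, sig_bits):
    """Round x to sig_bits significant binary digits (round-to-nearest)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** sig_bits
    return math.ldexp(round(m * scale) / scale, e)

def normalize(v, sig_bits):
    """Normalize v, rounding every intermediate result to sig_bits."""
    sq = quantize(sum(quantize(c * c, sig_bits) for c in v), sig_bits)
    norm = quantize(math.sqrt(sq), sig_bits)
    return [quantize(c / norm, sig_bits) for c in v]

v = (1.0, 2.0, 3.0)
len_fp32 = math.sqrt(sum(c * c for c in normalize(v, 24)))
len_fp16 = math.sqrt(sum(c * c for c in normalize(v, 11)))
print(f"fp32 |n| = {len_fp32:.7f}")   # essentially 1.0
print(f"fp16 |n| = {len_fp16:.7f}")   # measurably off unit length
```

The fp16 result drifts off the unit sphere by a small fraction of a percent, which is usually invisible in a lit pixel but can compound when shaders chain many dependent operations.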

Partial precision (PP) versus full precision results in frames per second:

| Shader | GeForce 6800 U PP | GeForce 6800 GT PP | GeForce 6800 U | GeForce 6800 GT |
|-------:|------------------:|-------------------:|---------------:|----------------:|
| 2      | 413               | 369                | 355            | 314             |
| 3      | 320               | 283                | 213            | 188             |
| 5      | 250               | 221                | 162            | 143             |
| 6      | 300               | 268                | 211            | 187             |
| 7      | 285               | 255                | 205            | 182             |
| 9      | 159               | 142                | 124            | 110             |
| 10     | 432               | 389                | 448            | 410             |
| 11     | 288               | 259                | 276            | 248             |
| 12     | 258               | 225                | 188            | 167             |
| 13     | 175               | 150                | 133            | 118             |
| 14     | 167               | 150                | 141            | 129             |
| 15     | 195               | 173                | 145            | 128             |
| 16     | 180               | 161                | 149            | 133             |
| 17     | 21                | 19                 | 15             | 13              |
| 18     | 155               | 139                | 99             | 89              |
| 23     | 49                | 46                 | 49             | 46              |

The most obvious thing to notice is that, overall, partial precision mode increases shader rendering speed. Shaders 2 through 8 are lighting shaders (with 2 being a simple diffuse lighting shader). These lighting shaders (especially the point and spot light shaders) make heavy use of vector normalization. Since we are running in partial precision mode, this should translate to a partial precision normalize, which is a "free" operation on NV40: almost any time a partial precision normalize is needed, NV40 can schedule the instruction immediately. This is not the case for full precision normalization, so the many ~50% performance gains coming out of those lighting shaders are probably due to the partial precision normalization hardware built into each shader unit in NV40. The smaller performance gains (which, interestingly, occur on the shaders that have image quality issues) are most likely the result of decreased bandwidth requirements and decreased register pressure: a single internal fp32 register can hold two fp16 values, making scheduling and resource management much less of a task for the hardware.
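To put rough numbers on those gains, the percentages can be computed straight from the GeForce 6800 Ultra columns of the table above (frame rates as transcribed on this page; shader numbering as reported by ShaderMark):

```python
# Partial-precision gain on the GeForce 6800 Ultra: frames per second at
# full precision vs. partial precision (PP), from the table above.
full = {2: 355, 3: 213, 5: 162, 6: 211, 7: 205, 9: 124, 10: 448, 11: 276,
        12: 188, 13: 133, 14: 141, 15: 145, 16: 149, 17: 15, 18: 99, 23: 49}
pp   = {2: 413, 3: 320, 5: 250, 6: 300, 7: 285, 9: 159, 10: 432, 11: 288,
        12: 258, 13: 175, 14: 167, 15: 195, 16: 180, 17: 21, 18: 155, 23: 49}

# Percentage speedup from enabling partial precision, per shader.
gains = {s: round((pp[s] / full[s] - 1.0) * 100.0, 1) for s in full}
for s, g in sorted(gains.items()):
    print(f"shader {s:2d}: {g:+6.1f}%")
```

The normalization-heavy lighting shaders 3 and 5 through 7 do cluster around +40% to +54%, while shader 10 actually loses a few percent with partial precision enabled and shader 23 doesn't move at all.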

As we work on our image quality analysis of NV40 and R420, we will be paying close attention to shader performance in both full and partial precision modes (as we want to look at what gamers will actually see in the real world). We will likely bring ShaderMark back for those tests as well. This is a new benchmark for us, so please bear with us as we get used to its ins and outs.

95 Comments

  • Ritalinkid - Monday, June 28, 2004 - link

    After reading almost all of the video card reviews posted on AnandTech, I start to get the feeling that AnandTech has a grudge against NVIDIA. The reviews seem to put NVIDIA down no matter what area they excel in. With leading OpenGL support, PS3.0 support, and the 6800 shadowing the x800 in DirectX, it seems like NVIDIA should not be counted out as the "best card."
    I would love to see a review that tested all the features that both cards offer, especially if it showed the games that would benefit the most from each card's features (if they are available). Maybe then I could decide which is better, or which could benefit me more.
  • BlackShrike - Saturday, May 8, 2004 - link

    Hey, if anyone is gonna be buying one of these new cards, would anyone want to sell their 9700 Pro or 9800 Pro/XT for like 100-150 bucks? If you do, contact me at POT989@hotmail.com. Thanks.
  • DonB - Saturday, May 8, 2004 - link

    No TV tuner on this card either? Will there be an "All-In-Wonder" version soon that will include it?
  • xin - Friday, May 7, 2004 - link

    (my bad, I didn't notice that I was on the first page of the posts, and replied to a message there heh)

    Well, since everyone else is throwing their preferences out there... I guess I will too. My last 3 cards have been ATI cards (9700Pro & 9500Pro, and an 8500 "Pro"), and I have not been let down. Right at this moment I lean towards the x800XT.

    However, I am not concerned about power since I am running a TruePower550, and I will be interested in seeing what happens with all of this between now and the next 4-6 weeks when these cards actually come to market... and I will make my decision then on which card to buy.
  • xin - Friday, May 7, 2004 - link


    Besides that, even if it were true (which it isn't), there is a world of difference between having *some* level of support and requiring it. (*some* meaning the initial application of PS3.0 technology to games, which will likely be as sloppy as your first time in the back of a car with your first girlfriend).

    Game makers will not require PS3.0 support for a long, long time... because it would alienate the vast majority of the people out there, or at least, for the time being, any person who doesn't have an NV40 card.

    Some games may implement it and look slightly better, or even look the same and just run faster.... but I would put money down that by the time PS3.0 usage in games comes anywhere close to mainstream, both manufacturers will have their new, latest and greatest cards out, probably two generations or more past these cards.
  • xin - Friday, May 7, 2004 - link


    first of all... "alot of the upcoming topgames will support PS3.0!" ??? They will? Which ones exactly?
  • Z80 - Friday, May 7, 2004 - link

    Good review. Pretty much tells me that I can select either NVIDIA or ATI with confidence that I'm getting a lot of "bang for my buck". However, my buck bang for video cards rarely exceeds $150, so I'm waiting for the new low to mid range cards before making a purchase.
  • xin - Friday, May 7, 2004 - link


    I love how a handful of stores out there feel the need to rip people off by charging $500+ for the x800PRO cards, since the XT isn't available yet.

    Anyway, something interesting I noticed today:

    http://www.compusa.com/products/product_info.asp?p...

    http://www.compusa.com/products/product_info.asp?p...

    Notice the "expected ship date"... at least they have their pricing right.
  • a2y - Friday, May 7, 2004 - link

    Trog, I also agree. The thing is, it's true I do not have complete knowledge of the deep details of video cards. You see, my current video card is now 1 year old (GeForce4 MX440), which is terrible for gaming (50fps and less), and some games actually do not support it (like Deus Ex 2). I wanted a card that would be future proof; every consumer goes in thinking this way. I do not spend everything I earn, but to me and some others, $400-$500 is OK if it means it's going to last a bit longer.
    I especially worry about the technology used more than the other specs of the cards; more technologies mean future games are going to support it. I DO NOT know what I've just said actually means, but I have felt it during the past few years and am affected by it right now (like the Deus Ex 2 problem!). It just doesn't support it, and my card performs TERRIBLY in all games.

    Now my system is relatively slow for hardcore gaming:
    P4 2.4GHz - 512MB RDRAM PC800 - 533MHz FSB - 512KB L2 Cache - 128MB GeForce4 MX440 card.

    I wanted a big jump in performance, especially in gaming, so that's why I wanted the best card currently available.
