Q1 2004 Vendor Graphics Card Roundup

by Derek Wilson on 2/4/2004 7:30 AM EST

44 Comments


  • vss1980 - Wednesday, March 10, 2004 - link

    I'm surprised there weren't more DirectX9 games tested.
    I must admit it's annoying when review sites test with only one or two older games and focus on just 3DMark and new game tests, but considering all those cards are DX9 cards, the lack of DX9 testing isn't right.
    Reply
  • anthonyv - Monday, March 01, 2004 - link

    please recheck the charts on p.25... I think the charts, chart titles, and text are mixed up.
    Cheers,
    ..anthony
    Reply
  • Atlas5 - Sunday, February 29, 2004 - link

    I'm not sure that a comparison of such limited depth is all that useful to someone trying to figure out if upgrading to a new card is worthwhile. Try this review...

    http://www6.tomshardware.com/graphic/20031229/inde...

    ...if you're looking to compare more than what's hit the streets in the last 9 months.
    Reply
  • mkseidl - Thursday, February 19, 2004 - link

    You can download the CoolBits reg edit, which will add a tab to your display settings to overclock the video card with simple sliders. Just to compare with your FFXI score, mine was 3800 :) (I haven't run the benchmark with my card OC'd.)
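For anyone unfamiliar with the tweak mkseidl mentions: the CoolBits "reg edit" is typically a single registry value. The key path and value below are the commonly circulated ones for NVIDIA drivers of this era; treat them as an assumption, since the exact value has varied between driver releases, and back up your registry before merging anything like this.

```reg
Windows Registry Editor Version 5.00

; CoolBits tweak (sketch): adds a clock-frequency (overclocking) tab
; with core/memory sliders to the NVIDIA display properties.
; The DWORD value 3 is the one most commonly circulated for
; Detonator/ForceWare drivers of this era, but it has varied between
; releases -- verify against your driver version before relying on it.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

Save the text as a .reg file and double-click to merge it; after reopening the display properties, the new tab should appear under the NVIDIA settings.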

    I have

    ASUS A7V8X
    Athlon XP 1700+ @ 2.25GHz
    eVGA GeForce FX 5900SE
    1GB PC2100
    240GB SATA RAID 0 array
    DVD+/- burner
    Lian Li case


    I have a question: on HotHardware they got their eVGA up to 471/871MHz,

    but I can only get mine up to 429/820, which is about a 50MHz difference. Is that average for cards? I know not all cards/procs can OC the same, but shouldn't it be closer?

    Martin
    mkseidl@hotmail.com
    Reply
  • mincheng - Wednesday, February 18, 2004 - link

    Can anyone tell me how to overclock video cards??? I know that when you OC a CPU you gotta go into the BIOS, but I just don't understand how video card overclocking works. I currently own a PNY FX5600 Ultra and a P4 3.06GHz/533 overclocked to 3.45GHz, with 1GB of PC800 RDRAM. Scored 3437 on FFXI benchmark ver. 2. I really want to OC my video card so I can get better speed. And if you know how to OC, can you please e-mail me at hhsu@socal.rr.com? THANKS!
    Reply
  • mkseidl - Wednesday, February 18, 2004 - link

    Just out of curiosity, how did you guys overclock the video cards? I can't find anything in my drivers or BIOS to overclock my card. After reading your benchmarks, I purchased an eVGA 5900SE. I upgraded from a GeForce2 Ultra.

    In 3DMark01, my GeForce2 scored 5013 and my FX scored 9109.
    Is that a big jump?
    And in the FFXI benchmark I went from 1900 to 3800.

    But I want to overclock the crap out of it like my proc ;)

    Martin
    Reply
  • Pete - Wednesday, February 11, 2004 - link

    Last nitpicks: box shots and bundle listings would have been useful, and part numbers and stock speeds should have been mandatory on each card's page. Considering most vendors offer multiple versions of each product, not clearly marking which card was tested in this overclocking roundup seems like a rather glaring omission. NewEgg now seems to offer a High Tek (HIS) 9600XT, but I can't tell if that's the same one as in your review, or even whether the stock memory speed for HIS cards is 600 or 650MHz. Your only mentions of memory speeds are at the ends of pages 15 and 16, somewhat removed from each card's product page. I think it would have been much clearer and more helpful if you had mentioned that both Sapphire's and HIS's 9600XTs ship with their memory at 650MHz, either on their product pages or directly under the memory overclocking graph.
    Reply
  • Pete - Sunday, February 08, 2004 - link

    Derek, Indig0's reply is slightly worrying. I hope people aren't getting the wrong impression of anyone's performance based on a single sample's overclocked performance in only three benchmarks. If the samples were sent to you by each AIB, that's even more worrying, as who's to say they weren't cherry-picked?

    In fact, not indicating both the clock speeds and the fact that the cards were overclocked on each benchmark graph seems like a gross oversight. If the Flash format is preventing you from doing so, then changing it should be a priority. But if you can add two lines of description for each card (which it looks like you have plenty of room for), I'd amend the review.

    For future overclocking reviews, I think it'd be much more useful to at least show the percentage improvements in both clock speed and framerate in the game benchmark charts. Sure, you've covered this earlier in the article, but it's a long article, and there's no point in forcing people to continually flip back and forth when it's relatively simple to add this data to the graphs. Ideally, I'd have liked to see two graphs for each card, one for stock and one for OC'ed speed, with the clock speeds in the card title and the percent improvement over stock speed right after the OC'ed speed bar. I think you can cram a lot more useful data into those graphs. :)
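The annotation Pete is asking for amounts to a one-line percentage calculation; a minimal sketch follows (the card names and numbers are invented for illustration, not taken from the review):

```python
# Sketch of the "percent gain over stock" annotations suggested above.
# Card names and figures below are hypothetical examples, not review data.
def pct_gain(stock, oc):
    """Percent improvement of an overclocked result over stock."""
    return 100.0 * (oc - stock) / stock

cards = [
    # (name, stock core MHz, OC core MHz, stock fps, OC fps)
    ("Example 9600XT", 500, 560, 45.0, 49.5),
    ("Example 5900SE", 400, 450, 50.0, 54.0),
]

for name, core0, core1, fps0, fps1 in cards:
    print(f"{name}: core {core0}->{core1}MHz (+{pct_gain(core0, core1):.1f}%), "
          f"{fps0}->{fps1} fps (+{pct_gain(fps0, fps1):.1f}%)")
```

Putting both percentages side by side makes it immediately clear how much of a clock-speed gain actually translates into framerate.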
    Reply
  • Stlr22 - Sunday, February 08, 2004 - link


    I would love to see benchmarks at 1600x1200. That's the sweet spot that I always play games at, so I'd like to see what kinda CPU/GPU combo it's gonna take to get smooth gameplay at that level.
    Reply
  • Nemesis77 - Friday, February 06, 2004 - link

    This review is getting a lot of flak from the folks at Beyond3D: http://www.beyond3d.com/forum/viewtopic.php?t=1013...
    Reply
  • DerekWilson - Thursday, February 05, 2004 - link

    We have added an update to the first page (and other relevant pages) of the article explaining a couple of things that were omitted from the original publication. One is the fact that NVIDIA cards are underclocked when running in 2D; the other is the fab process of ATI's R300/R350/R360 GPUs.

    I hope including these points helps to explain some of the results we see here. Thanks again, everyone, for your feedback and input.
    Reply
  • Indig0 - Thursday, February 05, 2004 - link

    Derek, this article is great, so many questions answered, hooray for NVIDIA!!! I was very disappointed by NVIDIA's previous lackluster performance, and I am pleased to see that NVIDIA figured out how to close the gap.

    I have some questions, though. Valve said that in HL2, the NVIDIA FX cards ran better when treated as DirectX 8 cards instead of DirectX 9. Does this judgment still stand with the new NVIDIA drivers? What about with the new 5700 and 5950 cards?
    Reply
  • DerekWilson - Thursday, February 05, 2004 - link

    We have an article dedicated specifically to image quality, and in terms of IQ, all vendors' cards performed on par with the reference cards they extend from.

    http://anandtech.com/video/showdoc.html?i=1931
    Reply
  • impar - Thursday, February 05, 2004 - link

    I think Pumpkinierre hit the jackpot!

    It is nice to read such a thorough review of graphics cards' speed, noise, and temperature, but in my opinion the power required by each card should also be taken into consideration!
    For example, in the mainstream we have the 96XT, 57U, and 59SE. Of those, which one is the least/most power demanding?

    It would be nice to see that kind of info displayed...

    I imagine IQ wasn't a part of the review due to time constraints?
    Reply
  • DerekWilson - Thursday, February 05, 2004 - link

    Thanks for the linkage ...

    "As it stands, I'd say we've got ourselves a riddle wrapped in an enigma with hand-coded conundrum planes."

    Intensely complex proprietary hardware, running equally intricate closed-source software, having to do with very involved mathematics, will often hand you a situation that can be described as such. :-) ;-)
    Reply
  • Pete - Thursday, February 05, 2004 - link

    Thanks for the reply, Derek. I look forward to future articles and your Halo investigation.

    FYI, I posted a thread about that at Beyond3D. If it turns out that the limiting factor in Halo is not pixel shaders, then the default benchmark would seem to become less than useful in indicating PS2.0 performance. As it stands, I'd say we've got ourselves a riddle wrapped in an enigma with hand-coded conundrum planes.

    http://www.beyond3d.com/forum/viewtopic.php?t=1014...
    Reply
  • DerekWilson - Thursday, February 05, 2004 - link

    qquizz: Thanks for the suggestion; we'll include that card next time.

    Pete: As always, your input and comments are appreciated.

    The way we do our graphics makes showing multiple series for large datasets extremely difficult. We could have done this for the single card type graphs though, and will in the future if that's something that would help.

    As for game choice: it would have been impossible to pick a PS2.0 game benchmark that was not part of TWIMTBP (Tomb Raider and Tron are both TWIMTBP games). The Unreal engine is widely used in the gaming industry, and we could not have left UT2K3 out. You are, however, wrong about JK:JA. It is not an Xbox port; the Xbox version was released near the same time as the PC version from which it was ported. JK:JA is very much a sequel to the other titles in its series and is based on the OpenGL Q3A engine (it's called (Direct)Xbox rather than GLBox for a reason).

    In the end, games were chosen for popularity of engine or title only.

    As for graphs, it's a mixed bag. Some people prefer percentages while others prefer pure frame rates. Of course, this was really the first article of its kind in quite a long while, and we've got a few nice tricks up our sleeves for the next incarnation that will involve percentage graphs... The overall performance graphs were more of an afterthought, as the goal of the review was to show the performance gains OEMs could make in their category.

    On HIS: we actually have an IceQ in the lab, and we were very excited about including it in the roundup. Unfortunately, we plugged it in and it didn't work. Nothing but the fan came up. On the whole, we were very impressed by HIS, and we look forward to including them in future reviews and roundups.

    Seeing that the NVIDIA cards reach higher peak temperatures should tell you more about the cooling than the idle numbers do. But your point is well taken, and in future reviews we will include a page about the reference cards themselves. A page is necessary because idle clock speeds aren't the only caveat we must pay attention to when looking at these cards.

    As for your Halo specific question, that's a matter we are still researching. :-)

    Thanks for your questions and thoughts, they are always appreciated.
    Reply
  • jbirney - Thursday, February 05, 2004 - link

    Bummer,

    1) No mention of the lack of trilinear filtering on the FX cards.

    2) No mention that Halo allows for partial precision vs. full precision for the ATI cards.


    "Many people crucified the NVIDIA cards with Tomb Raider, Aquamark3, ShaderMark and Tron benchmarks. The problem is that there are popular games out there that NVIDIA cards perform better in, and that info got lost in the shuffle."

    No, it's that these titles utilize more of the PS2.0 engine. We all know the ATI cards have a better PS2.0 engine. And while driver tricks on NV's part will help to close the gap, the fact remains. I also agree with Pete: I find it very strange that you hand-picked DX9 games that NV was good in. Why not include some of the others that NV did not do so well in?
    Reply
  • Pete - Thursday, February 05, 2004 - link

    Hey Derek,

    Thanks for the article. An impressive amount of data, as usual, and some surprises WRT the OCs. The noise data are especially useful.

    A few criticisms and questions follow, if you're willing to suffer them. It's clear which is which, but if you feel I'm displaying an anti-nV bias (it's not intentional, and perhaps the pro-nV bias I perceive in this article is also unintentional on your part--or all in my head), at least consider responding to the questions.

    1. Too many graphs! Could you show plain and AA+AF scores in the same graph, or are you locked into a default format?

    2. I also found the choice of games somewhat curious, in that all three seem skewed toward nV. UT2K3 is a TWIMTBP title, and both Halo and JK:JA were developed for Xbox (exclusively among the consoles) in addition to the PC. It's not a big deal--I focused mainly on the heat, noise, and OC'ing results--but it doesn't seem entirely appropriate. The fact that you settled on a more "mainstream" CPU, included all the cards in one graph, and listed framerates rather than percentages implies to me that you expect the benchmarks to be considered in and of themselves, rather than just as OC'ing yardsticks.

    2a. The Halo bench, is that the game's built-in one that measures the intro cut-scene? Is that a really good test of PS2.0 effects and speed? I ask because I noticed in the huge recent THG roundup that the 5800U scored very closely to the 5900U, yet the latter was (IIRC) supposed to offer much-improved shaders. I thought it strange that the NV35's extra FP units didn't translate into performance gains in a title that used FP shaders. Perhaps NV3x's PS performance in Halo is register-limited?

    3. Could you perhaps add comments on why the cards score as they do in the games (CPU-/bandwidth-/fillrate-/shader-limited?), or is that outside the scope of the article? If so, then why not just show OC'ing gains as percentages, to make it easier to compare with the gains over stock core and memory speeds?

    4. In light of the HIS 9600XT's exemplary OC'ing, I was curious why you didn't also test an HIS 9800 Pro with the well-reviewed IceQ HSF? Did HIS not have one to ship to you, or is HIS not selling the latter card in the U.S.? It's curious that I can't find a single HIS card on Pricewatch or PriceGrabber, because I'm almost positive I found them there before.

    5. I, too, forgot that nV cards are underclocked when idle. I think you'd be remiss not to remind your readers of that. If people are going to take the risk of OC'ing their cards, they'd be better off with all the info. At first, I thought the differences in idle temps indicated the FX cards have more effective HSF's and thus OC better, but that's not really something I can draw from the data. OTOH, the IceQ Cooler seems to allow for much greater OC's of 9800 Pro's precisely because it cools the GPU so much better than the stock HSF.

    Thanks again.


    BTW, I think 12x9/10 +/- AA&AF is a good compromise res to test cards with in the future (high-end 4xAA, mainstream 2xAA).
    Reply
  • Pumpkinierre - Thursday, February 05, 2004 - link

    The temperature measurements are a welcome change in a graphics card review. For too long, reviews have gotten by with statements such as 'it runs warm'. When I built my system, I probably wouldn't have bought the 9800 Pro had I known the levels of heat produced: 70C on the heatsink at load. I had to rework my cooling to target the GPU, and I now keep it within 60C.
    Presently, with justifiable criticism levelled at the Prescott P4 regarding excessive heat, it would be nice to know the max heat dissipation, max case temp, current draw, and wattage of these GPUs.
    Reply
  • qquizz - Thursday, February 05, 2004 - link

    They should have included the 9700 Pro in the benchmarks, IMO.
    Reply
  • dgrady76 - Thursday, February 05, 2004 - link

    Great article. Almost like old-school AT, except better in many ways. Mr. Wilson, your thoroughness is appreciated and made for a great read.
    Reply
  • TheSnowman - Thursday, February 05, 2004 - link

    If you look back, you will note that I did ask if you knew why the discrepancy existed. If I didn't know myself, I would have simply asked why, and not directed my question at your knowledge of the situation. While I cannot argue against your comment that the information is available elsewhere, I doubt you can argue that omitting it from your review doesn't serve to mislead the portion of the community who is not aware that you are comparing underclocked and undervolted cards to overclocked ones. As for linking to the original technology overviews or creating a page or two on the feature sets, that doesn't seem like a good idea when a single sentence explaining the situation would be much more effective in avoiding any possible confusion.
    Reply
  • DerekWilson - Wednesday, February 04, 2004 - link

    Why didn't you ask that question in the first place, snowman? ;-)

    The info was left out because we didn't feel it was relevant (the operation of reference NVIDIA and ATI cards has been well documented on this site and on others).

    If you would prefer, in future roundups we can point to our articles introducing whichever GPUs we are including. If that's not enough, we can have a page or two dedicated to going over all the features the various reference cards have.
    Reply
  • TheSnowman - Wednesday, February 04, 2004 - link

    Ya, actually I knew that NVIDIA underclocks and undervolts their cards in 2D mode; I was just curious if Derek did. It struck me as rather odd that he would go through the trouble of presenting core temps while leaving out such a vital factor in understanding the results.
    Reply
  • TrogdorJW - Wednesday, February 04, 2004 - link

    Snowman (and Derek), remember that the FX5900 cards (all of them?) have 2D and 3D speeds. This has been the case since the FX 5800. Basically, they usually run at 300 MHz core and memory (DDR = 600 MHz) and when a 3D application loads, they bump up to the higher speed. It was done initially so that they could run the "leaf blower" 5800 fan at lower RPMs when you didn't need the extra power.

    This is okay, I guess, but it might be an issue if you were to run games for 48 hours straight or something. Technically, I think the one speed option by ATI is better for peace of mind - if it can always run at that speed, you don't need to worry as much about instability after prolonged gaming sessions.

    From the benchmarks here, I have to agree that the FX5900 SE cards seem to be the best choice right now. Relatively close to the 5900 Ultra and 5950 Ultra, at almost half the cost! I think more of these cards should have been included instead of the 5700 Ultras. Who would spend $175+ on a 5700 Ultra when the 5900 SE/XT/regular costs about the same ($185+) and beats it pretty soundly in every benchmark? There are also some 5900 cards that list 850MHz as the RAM speed (the eVGA lists 700); they might cost $30 to $50 more, but maybe they use the 2.2ns or 2.0ns RAM? They could potentially be as fast as the Ultra cards and cost $100 less!
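A quick note on the memory figures being quoted in this thread: DDR RAM transfers data on both clock edges, so the "effective" speed vendors list is simply double the real clock. A trivial sketch using the numbers mentioned above:

```python
# DDR (double data rate) memory moves data on both the rising and
# falling edges of the clock, so marketed "DDR" speeds are twice the
# actual clock. Example figures match those quoted in the thread.
def ddr_effective(real_clock_mhz):
    """Marketed DDR speed for a given real memory clock."""
    return 2 * real_clock_mhz

print(ddr_effective(300))  # 300MHz 2D-mode memory clock, quoted as "DDR = 600MHz"
print(ddr_effective(350))  # a card listed at 700MHz runs a 350MHz real clock
print(ddr_effective(425))  # an 850MHz-rated card runs a 425MHz real clock
```

So a card "listing 850MHz RAM" is running its memory at a 425MHz real clock, which is why faster-rated (2.2ns/2.0ns) chips matter for overclocking headroom.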
    Reply
  • skiboysteve - Wednesday, February 04, 2004 - link

    Also, the Vcore of the NVIDIA chips is 1.2V in 2D mode and 1.6V in 3D mode.
    Reply
  • par - Wednesday, February 04, 2004 - link

    Noise and heat are a huge concern for me, so I'll probably put the 9600XT Ultimate in my SFF. BUT, when Doom 3 and some of the heavier games drop, are the 4 pipes (as opposed to 8) gonna hurt me enough to buy a new card?
    Reply
  • AnonymouseUser - Wednesday, February 04, 2004 - link

    The eVGA 5900SE is missing from the temp graphs.
    Reply
  • DerekWilson - Wednesday, February 04, 2004 - link

    If you go back and look at the older scores for Unreal Tournament and Halo you'll see that (with the exception of 4xAA/8xAF in UT) ATI's high end cards haven't really done better in those two games:

    UT: http://anandtech.com/video/showdoc.html?i=1896&...
    Halo: http://anandtech.com/video/showdoc.html?i=1896&...

    If we had turned on AA/AF we would have seen ATI do better in UT.

    Many people crucified the NVIDIA cards with Tomb Raider, Aquamark3, ShaderMark and Tron benchmarks. The problem is that there are popular games out there that NVIDIA cards perform better in, and that info got lost in the shuffle.

    When it comes down to it, ATI's 9800XT is the best card out there if you aren't going to overclock. The 9600XT and the 5700U are just about the same in terms of goodness, and if you want a card in that price range, I'd recommend a cheap 5900 based card instead.

    TheSnowman:
    There are some issues with that. Since ATI and NVIDIA don't measure temp the same way, and we haven't been able to come up with a really solid way of standardizing temperature measurements between the two, we have to rely on what the driver says. We are looking for better methods of measuring the temperature of each card.

    That being said, I can venture a guess as to why NVIDIA cards idle lower. PowerStrip offers me settings for 2D and 3D operation with NVIDIA cards, but I only get one setting to play with from ATI. The standard 2D clock speed for NVIDIA GPUs is anywhere from 100 to 200MHz lower than their 3D clock speed (generally, 2D for the 5950 is 300MHz core). The memory clocks were locked between 2D and 3D operation and couldn't be set separately.
    Reply
  • KristopherKubicki - Wednesday, February 04, 2004 - link

    Very well done.
    Reply
  • Icewind - Wednesday, February 04, 2004 - link

    I'd like to know how the freak the NVIDIA cards outdid the ATIs in Halo and UT2K3; that's just beyond me.
    Reply
  • TheSnowman - Wednesday, February 04, 2004 - link

    Hmm, Derek, I don't suppose you know why the NVIDIA-based high-end cards idle at so much lower temperatures compared to the ATI-based offerings?
    Reply
  • AnonymouseUser - Wednesday, February 04, 2004 - link

    Nice roundup. The 5900SE (priced similarly to the 5700 Ultra and 9600XT) is what I find most impressive.
    Reply
  • Abraxas - Wednesday, February 04, 2004 - link

    Great review. This is the first of its type that I've seen, and it really changed my mind on what card to buy. I would like to see 1280x1024 or even 1600x1200 in a future review, but even at 1024 it is nice.

    53.03 is really that much faster? That's just amazing.

    #7: ATI held a huge advantage in Halo on older drivers, just as much as in HL2. If the new drivers are that much faster... it appears that NVIDIA should never have been doubted :)
    Reply
  • DerekWilson - Wednesday, February 04, 2004 - link

    Sorry Icewind, we didn't include numbers for a stock 9800 Pro, 5900 (Ultra/non-Ultra/SE/XT), or any stock card other than the newest releases. Most other reviews cover reference cards running stock numbers, but we just needed one reference point to show where these numbers fell and give people a basis to judge performance increases.

    Iger, there are a few reasons for what you are seeing. I would say that your questions were the correct ones to ask.

    We could only use a couple benchmarks, and the couple we chose are standardish (UT2K3), based on very common engines (JKJA), or one of the few available (Halo having PS2.0 support). These were not the games with huge performance gaps between them (like Tomb Raider or Tron). Also, since we were including 5700 and 9600 parts, we wanted to stick with the standard-but-lowish 1024x768 resolution rather than bump up a 1280 flavor.

    There is also one other thing that has been overlooked. Since the fall, there have been some driver changes. We've moved up to 53.03 for NVIDIA (which brought some noticeable performance increases) and the CATALYST 4.1 drivers, which we have yet to give a good workout.

    In future reviews of this type, we plan on going with higher resolutions even if we include midrange cards. So the question we leave to the readers is this: how high do we go? 1152x864, 1280x960, 1280x1024 or 1600x1200 ...
    Reply
  • Lonyo - Wednesday, February 04, 2004 - link

    Most 9800s seem to be able to hit about 450MHz at the very max, even the 9800 non-Pros (mine can get to 440MHz, but I run at 430MHz).
    Seems like a limit of the chip at about that sort of level.
    Reply
  • drpepper1280 - Wednesday, February 04, 2004 - link

    To answer a few questions: the passively cooled 9600XT is on Newegg; it's the Ultimate version. If you search by category, it is at the bottom. Also, the reason the NVIDIA cards do well against the ATI cards is because they are overclocked in the benchmarks (I'm pretty sure); also, none of the benchmarks are Half-Life 2, lol. I had one question even before viewing the article: how does Sapphire's 9600XT 256MB stand up? Unfortunately it was not reviewed, but I did read that the 9600XT could benefit from a memory increase. This makes me wonder if the 9600XT 256MB is actually a really good deal (it only costs 170 dollars), or if it is like many 256MB cards that actually decrease performance.
    Reply
  • Iger - Wednesday, February 04, 2004 - link

    It's strange; it almost doesn't correspond with the fall test of the FX5950 against the 9800XT... There, the 9800XT looked much stronger... Now even the reference XT looks weaker than the FX. Maybe that's because the fall test was at higher resolutions? Or just not enough tests to see the big picture?
    Reply
  • tfranzese - Wednesday, February 04, 2004 - link

    Good article. Impressed with both camps' overclocking headroom.
    Reply
  • Icewind - Wednesday, February 04, 2004 - link

    Where are the comparison charts between the overclocked and stock-speed 9800 Pros? I must be blind, because I can't see them.
    Reply
  • par - Wednesday, February 04, 2004 - link

    Where can I find the passively cooled 9600XT by Sapphire? Newegg shows Sapphire's 9600XT with a fan.
    Reply
  • DerekWilson - Wednesday, February 04, 2004 - link

    The Seagate HD: Barracuda 7200.7 PATA ... I'll add that to the table.
    Reply
  • mostlyprudent - Wednesday, February 04, 2004 - link

    Nice article. A passively cooled 9600XT?! I've found my next video card. There is one thing that I am unclear about: the Seagate hard drive used in the test setup - is it a SATA drive?
    Reply
