ATI's X1000 Series: Extended Performance Testing

by Derek Wilson on 10/7/2005 10:15 AM EST

93 Comments


  • DonPMitchell - Tuesday, October 11, 2005 - link

    I wish they had benchmarked Half-Life 2, which is sort of a standard thing to look at. I don't disagree with their final conclusions, but I can't help but wonder if HL2 spanked nVidia, and that made them choose another game. It would be more unbiased for them to stick to a more or less standard set of tests, and it would let us compare to prior tests as well. Reply
  • Larso - Monday, October 10, 2005 - link

    First I would like to thank the AT crew for another excellent article! The graphs are very intuitive and easy to read.

    I think one aspect is missing from just about every X1000 article I have found on the net: how does this new family of cards compare with the previous generation(s)? For example, what midrange card should I pick up as an upgrade for a 9600XT, and how much improvement will I get?

    Has anybody stumbled across a review that compares these new cards with the previous generations?

    -

    Another question: I selected the 9600XT back then because it can easily be passively cooled. Now I wonder if any of the new cards can be silenced without turning the case into a bake oven?
    Reply
  • MrJim - Sunday, October 09, 2005 - link

    I don't understand why AnandTech keeps not testing games other than shooters with these high-end cards, like demanding flight sims (Lock On is very graphics-intensive) and/or Pacific Fighters (not as GPU-dependent as Lock On). Also, why no racing sims? We who play these do use filtering and FSAA a lot, or at least the 350 I know do. Cheers! Reply
  • dimitrio - Saturday, October 08, 2005 - link

    Someone said they liked the way the graphs were done. I must say that I found it a little difficult to do comparisons, since you have to look back and forth to see what card each symbol represents. After you give up figuring that out, you try to look at the data below, but again, since it's not ordered "better to worse", it takes some time to figure out that data. Things got even more complicated with the "Lower is Better" graphs.

    I acknowledge that it makes everything much cleaner, and with the number of benchmarks in this article you would end up with dozens of graphs across several pages, and you can clearly sense the writer's desire to improve the presentation with this new format. But sometimes things are better kept simple, and I would still like to see the many bar graphs, as they are much more intuitive and informative, to me at least.
    Reply
  • photoguy99 - Saturday, October 08, 2005 - link

    This article is a prime reason - editors listen, participate, improve.

    Tom's doesn't even link their articles directly to discussions! Why? Can't handle the feedback?

    Glad to be here.
    Reply
  • Spoelie - Saturday, October 08, 2005 - link

    Isn't anyone else confuzzled by the X1600 XT's lackluster showing? I was really hoping to make it my next upgrade, but its current performance is only value-card worthy. Just looking at the specs (12 SM3.0 pixel pipes @ 600MHz), you would expect it to cream the 6600GT (8 SM3.0 pixel pipes @ 500MHz), but it's barely even competing with it. This is without considering other architectural advances and faster memory! My guess is that the ultimate fillrate is determined by the TMUs, and only having 4 makes this card a worthy 9600XT follow-up -which had 4 TMUs @ 500MHz- but nowhere near the mainstream card of this day and age. Extremely bad decision on ATI's part if this is it. I can't think of any other reason for this card to perform so pathetically. It would be nice to have it clarified if I'm completely missing the issue, though. Reply
  • taylore2003 - Saturday, October 08, 2005 - link

    What AnandTech really needs to do is benchmark the X1300 Pro on a non-FX-55 system. People who buy that GPU will not have a top-of-the-line PC, so do it on a 3200+ AMD, not an FX-55. I mean, come on - then people like me (damn all 16 y/o's) can see what kind of framerates we would be getting. The X1300 Pro should go up against the 6600 non-GT! Because we can see the X1800 is great at the top of the line, but ATI's mid-to-low-range GPUs are not so hot. So let's see an X1300 Pro vs. a 6600 non-GT, with a reasonable test setup. Reply
  • coldpower27 - Saturday, October 08, 2005 - link

    From what I can see, it seems the 4 TMUs are very crippling when compared to the 6600 GT, which has 8 TMUs (and 4 ROPs). The X1600 XT beats the 6600 GT, yes, but not by as much as we would expect.

    Compared to the 6600 GT

    Pure Pixel Fillrate: X1600 XT 7.08 GP vs 6600 GT 4.0 GP - 77% more (ATI)
    Output Pixel Fillrate: X1600 XT (4 TMUs) 2.36 GP vs 6600 GT (4 ROPs) 2.0 GP - 18% more (ATI)
    Vertex Shader Fillrate: X1600 XT 737.5 MT vs 6600 GT 375 MT - 96.6% more (ATI)
    Memory Bandwidth: X1600 XT 1380MHz 22.08 GB/s vs 6600 GT 1000MHz 16 GB/s - 38% more (ATI)

    Add to that the 256MB vs 128MB comparison, and "more efficient Shader Model 3.0 implementation".

    Battlefield with AA.
    X1600 XT is ~ 50-60% Faster

    Day of Defeat Source with AA
    X1600 XT is ~ 33% Faster (Starting 12x9)

    Doom 3 with AA (OpenGL)
    X1600 XT is ~ 6600 GT

    FarCry with AA
    X1600 XT is ~ 37%-38% Faster (10x7 to 12x9)

    Chronicles of Riddick No AA (OpenGL)
    X1600 XT is ~ 6600 GT

    Splinter Cell Chaos Theory with AA&AF
    X1600 XT is ~ 12-18% Faster.

    Everquest No AA
    X1600 XT is 12-18% Faster.

    In some cases it's just faster up to the difference in their output fillrates, with Battlefield 2, FarCry & DOD: Source being its biggest wins, and the two OpenGL games its poorest showings.

    Not much of this is a surprise however.
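    Those figures can be sanity-checked in a few lines. This is a rough sketch of peak theoretical rates only; the 590 MHz X1600 XT core clock is inferred from the 7.08 GP figure above, and the memory clocks are the effective data rates quoted.

    ```python
    def pixel_fillrate_gp(units, clock_mhz):
        # Peak pixel fillrate in gigapixels/sec: functional units x core clock
        return units * clock_mhz / 1000.0

    def mem_bandwidth_gbs(bus_bits, effective_mhz):
        # GB/s: (bus width in bits / 8 bits per byte) x effective data rate
        return bus_bits / 8 * effective_mhz / 1000.0

    # X1600 XT: 12 pixel shader pipes, 4 TMUs/ROPs, 128-bit bus
    x1600_shader = pixel_fillrate_gp(12, 590)    # ~7.08 GP/s
    x1600_output = pixel_fillrate_gp(4, 590)     # ~2.36 GP/s, limited by the 4 TMUs
    x1600_bw     = mem_bandwidth_gbs(128, 1380)  # ~22.08 GB/s

    # 6600 GT: 8 pixel pipes, 4 ROPs, 128-bit bus
    gt_shader = pixel_fillrate_gp(8, 500)        # 4.0 GP/s
    gt_output = pixel_fillrate_gp(4, 500)        # 2.0 GP/s
    gt_bw     = mem_bandwidth_gbs(128, 1000)     # 16.0 GB/s

    print(round(x1600_output / gt_output - 1, 2))  # 0.18 -> the 18% output-fillrate edge
    ```

    The key point of the model: the shader fillrate gap is huge, but the output fillrate gap - the ROP/TMU bottleneck - is only 18%, which tracks the actual benchmark deltas much more closely.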
    Reply
  • AtaStrumf - Saturday, October 08, 2005 - link

    I was wondering something along those lines too - especially, why is the X1800 XT so much faster than the X1800 XL, and why is the X1300 Pro not that much slower than the X1600 XT?

    I don't get this new ring bus memory controller. Maybe it has something to do with that, as well as the TMUs and co. In the past we had 256-bit on the high end, 128-bit in the middle and 64-bit on the low end; now it seems as though all have _the same_ memory controller, which seems a bit odd to me. What is also peculiar is the fact that all but the highest-end X1800 XT have 256 MB of memory, while the X1800 XT has 512 MB. Does more memory now somehow equal more "bits" -- bandwidth?
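    For what it's worth, memory size and memory bandwidth are independent: bandwidth comes from bus width times clock rate, not from the amount of RAM on board. A quick sketch (the 1.5 GHz and 1.0 GHz effective memory clocks for the XT and XL are the launch figures, so treat them as assumptions):

    ```python
    def bandwidth_gbs(bus_width_bits, effective_clock_mhz):
        # GB/s = (bits per transfer / 8 bits-per-byte) x transfers per second
        return bus_width_bits / 8 * effective_clock_mhz / 1000.0

    # Both X1800 cards use the same 256-bit bus; the 512MB XT has more bandwidth
    # only because its memory is clocked higher, not because it has more of it.
    xt_bw = bandwidth_gbs(256, 1500)  # X1800 XT: 48.0 GB/s
    xl_bw = bandwidth_gbs(256, 1000)  # X1800 XL: 32.0 GB/s
    ```

    Extra capacity helps only when a game's textures and buffers overflow the smaller card's memory, which is a separate effect from bus "bits".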
    Reply
  • haelduksf - Friday, October 07, 2005 - link

    I'm guessing you're actually having a "peek" at DOD:S performance ;) Reply
  • flexy - Saturday, October 08, 2005 - link

    There is an interesting article (in German, sorry) where they compare the old cards' (X850) performance with the new adaptive antialiasing turned on.

    You can see that some games do pretty well with only minor performance loss, but e.g. FarCry takes a HUGE hit from enabling adaptive antialiasing. I also did some tests of my own (X850XT), and the hit is as big as 50% in the FarCry benchmark.

    My question would be how the new cards handle this and how big the performance hit would be, e.g. with an X1800 XL/XT in certain engines.

    Also, I think the 6x antialiasing modes are a bit under-represented - I for my part am used to playing HL2 at 1280x1024 with 6xAA and 16xAF... and I am not that interested in 4xAA 8xAF, since I ASSUME that a high-end card like the X1800 XT should be predestined to run the higher AA/AF modes PLUS adaptive antialiasing. Please also note that a big number (?) of people might not even be able to run monster resolutions like 2048x, but MIGHT certainly be interested in resolutions up to 1600x with max AA/AF/adaptive modes on.

    Reply
  • flexy - Saturday, October 08, 2005 - link

    here the link, sorry forgot above:

    http://www.3dcenter.org/artikel/2005/10-04_b.php


    Reply
  • cryptonomicon - Friday, October 07, 2005 - link

    I want to see ATI release a product that takes back the performance crown... only then can they sit on the high price premiums for their cards again, because they own the highest performance. Until then, they can get busy slashing prices... Reply
  • ElFenix - Friday, October 07, 2005 - link

    you guys fail to realize that, at retail prices for nvidia cards, the ati cards slot quite nicely. best buy and compusa still sell 6600GTs for nearly $200, and 6800GTs for nearly $300. so, comparing those prices to the ATi prices reveals that ATi is quite price competitive. of course, no one who reads this site buys at retail (unless it's a hot deal), but there isn't any reason to think that ATi cards can't come down in price as quickly as the nvidia cards. Reply
  • bob661 - Saturday, October 08, 2005 - link

    Ummm yeah. We, the geeks, don't shop at CompUSA or Best Buy. Therefore, ATI's new hardware is NOT price competitive. Also, if Nvidia 6600GT's are $200 and 6800GT's are $300 at said stores, how would the ATI cards magically not get a price gouging too? Reply
  • ElFenix - Tuesday, October 11, 2005 - link

    Did you even bother to read my post before shooting off your idiotic post? I said that no one here shops at Best Buy. And you don't know whether ATI's hardware is price competitive or not because, at the moment, you can't buy it. Once it gets out into the channel, maybe Newegg and ZZF and Monarch will stock them at prices as competitive as the nvidia parts. Maybe not. But you don't know that yet, so making blanket statements like 'ATI is not price competitive' is stupid!

    I'm not really sure what this 'price gouging' is you're referring to, but because you've already demonstrated your inability to comprehend the English language, I'm going to assume it's because you think Best Buy and CompUSA are selling for more than MSRP. They're not. They are selling at MSRP. And at Best Buy and CompUSA, ATI cards will sell at MSRP. And ATI cards at MSRP are quite price competitive with nvidia cards at MSRP.
    Reply
  • shabby - Friday, October 07, 2005 - link

    Lets see some hdr+aa benchmarks. Reply
  • DerekWilson - Saturday, October 08, 2005 - link

    There are no games where we can test this feature yet. Reply
  • TinyTeeth - Friday, October 07, 2005 - link

    You make up for the flaws of the last review and show that you still are the best hardware site out there. Keep it up! Reply
  • jojo4u - Friday, October 07, 2005 - link

    The graphs give a nice overview, good work.

    Please consider including in the graphs what AF level was used. This is something all recent reviews here have been lacking.

    About the image quality: The shimmering was greatly reduced with the fixed driver (78.03), so it's down to NV40 level now. But 3DCenter.de [1] and Computerbase.de conclude that only enabling "high quality" in the Forceware brings image quality comparable to "A.I. low". Perhaps you'll find the time to explore this issue in the image quality tests.

    [1] http://www.3dcenter.de/artikel/g70_flimmern/index_...
    This article is about the unfixed quality. But to judge the G70 today, have a look at the 6800U videos:
    http://www.hexus.net/content/item.php?item=1549&am...
    This article shows the performance hit of enabling "high quality".
    Reply
  • waldo - Friday, October 07, 2005 - link

    I have been one of those critical of the video card reviews, and am pleasantly surprised with this review! Thanks for the work, Derek - I can only imagine the hours of overtime you had to pull to put this together. That is why I love AnandTech! Great site, and responsive to the readers! Cheers! Reply
  • DerekWilson - Friday, October 07, 2005 - link

    Anything we can do to help :-)

    I am glad that this article was satisfactory, and I regret that we were unable to provide this amount of coverage in our initial article.

    Keep letting us know what you want and we will keep doing our best to deliver.

    Thanks,
    Derek Wilson
    Reply
  • supafly - Friday, October 07, 2005 - link

    Maybe I missed it, but what system are these tests being done on?

    The tests from "ATI's Late Response to G70 - Radeon X1800, X1600 and X1300" were using:
    ATI Radeon Express 200 based system
    AMD Athlon 64 FX-55
    1GB DDR400 2:2:2:8
    120 GB Seagate 7200.7 HD
    600 W OCZ PowerStreams PSU

    Is this one the same? I would be interested to see the same tests run on a NF4 motherboard.
    Reply
  • supafly - Friday, October 07, 2005 - link

    Ahh, I skipped over that last part.. " The test system that we employed is the one used for our initial tests of the hardware."

    I would still like to see it on a NF4 mobo.
    Reply
  • photoguy99 - Friday, October 07, 2005 - link

    Vista will have DirectX 10, which adds geometry shaders and other bits.

    The ATI cards will run vista of course, but do everything DX10 hardware is capable of.
    Reply
  • photoguy99 - Saturday, October 08, 2005 - link

    Sorry, I meant the new ATI cards will *not* be DX10 compatible.

    The biggest difference is that DX10 will introduce geometry shaders, which are a whole new architectural concept.

    This is a big difference that will make the X1800XT seem out of date.

    The question is when will it seem out of date. Another year for Vista to be released with DX10, and then how long before a game not only has a DX10 rendering path, but has it do something interesting?

    Hard to say - it could be the games with a DX10 rendering path show little difference, it could be you see a lot more geometry detail in UT2007.

    Make your predictions, spend your money, good luck.
    Reply
  • Chadder007 - Friday, October 07, 2005 - link

    Sooo...the new ATI's are pre-DX10 compliant? If so, what about the new Nvidia parts? Reply
  • DerekWilson - Friday, October 07, 2005 - link

    This is not true -- DX10-specific functions will not be compatible with either the new ATI or NVIDIA hardware.

    Games written for Vista will initially be required to support DX9, with DX10 as the advanced feature set. This is to support hardware from the Radeon 9700 and GeForce FX series through the Radeon X1K and 7800 series.

    There is currently no hardware that is DX10 capable.
    Reply
  • Xenoterranos - Saturday, October 08, 2005 - link

    I'm just hoping NVIDIA doesn't go braindead again on the DX compliance. I'm still stuck with a non-fully-compatible 5900 card. It runs HL2 very well even at high settings, but I know I'm missing all the pretty DX9 stuff. I probably won't get another card until DX10 hits, and then I'll buy the first card that fully supports it. Reply
  • JarredWalton - Saturday, October 08, 2005 - link

    Well, part of that is marketing. DX9 graphics are better than DX8.1, but it's not a massive difference on many games. Far Cry is almost entirely DX8.1, and other than a slight change to the water, you're missing nothing but performance.

    It's like the SM2.0 vs. SM3.0 debate. SM3.0 does allow for more complex coding, but mostly it just makes it so that the developers don't have to unroll loops. HDR, instancing, displacement mapping, etc. can all be done with SM2.0; it's just more difficult to accomplish and may not perform quite as fast.

    Okay, I haven't ever coded SM2.0 or 3.0 (advanced graphics programming is beyond my skill level), but that's how I understand things to be. The SM3.0 brouhaha was courtesy of NVIDIA marketing, just like the full-DX9 hubbub was from ATI marketing. Anyway, MS and Intel have proven repeatedly that marketing is at least as important as technology.
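    The loop-unrolling point can be illustrated outside of shader code. This is purely illustrative - real shaders are written in HLSL/GLSL, and the function names and light values here are invented for the example:

    ```python
    def shade_rolled(lights):
        # SM3.0-style: dynamic loops are allowed, so one shader handles any light count
        total = 0.0
        for intensity in lights:
            total += intensity
        return total

    def shade_unrolled_3(lights):
        # SM2.0-style: no dynamic loop counts, so the compiler (or developer) had to
        # emit a fixed, fully unrolled variant for each possible light count
        return lights[0] + lights[1] + lights[2]

    lights = [0.25, 0.5, 0.125]
    assert shade_rolled(lights) == shade_unrolled_3(lights)  # same result either way
    ```

    Both forms compute the same answer; SM3.0 just lets one shader cover every case instead of shipping a separate unrolled variant per light count.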
    Reply
  • bob661 - Friday, October 07, 2005 - link

    Are they fully DX10 or partially? If partially, will that be enough to be Vista compliant? Reply
  • Clauzii - Friday, October 07, 2005 - link

    I'm pretty amazed that ATI can accomplish almost the same as a 7800GTX with only 2/3 the pipeline capacity, thanks to the higher clockrate.

    I'll look even more forward to the R580.
    Reply
  • MemberSince97 - Friday, October 07, 2005 - link

    Ahh Thank You Derek, this is much more AT style. Reply
  • Madellga - Friday, October 07, 2005 - link

    Derek, nice update. Thanks for including 1920x1200 in the benchmarks, it is a good move and I hope that other sites follow AT on that.

    It is interesting to see how the performance of some higher-end cards falls off after 1600x1200. Anyone buying WS monitors should pay attention to this.

    I was not convinced that the X1800 XT was a better performer than the 7800GTX, but looking at the WS high resolutions with AA+AF pretty much settles the discussion.

    Don't let the critics bug you. Use it as feedback and source of ideas for future reviews.

    On the next article, please do not forget to check the famous "shimmering" effect.
    Does the R520 family handle this issue better than the G70?

    Take care
    Reply
  • JNo - Monday, October 10, 2005 - link

    Well put! This is extremely helpful for 1920x1200 LCD owners Reply
  • erinlegault - Friday, October 07, 2005 - link

    I think an important point missing from all the reviews is the importance of a Vista-compatible graphics card. The X1xxx's are the first graphics cards compatible with the new spec.

    So the price premium may be worthwhile if you are interested in upgrading to Vista, whenever it is finally released.
    Reply
  • bob661 - Friday, October 07, 2005 - link

    All you need is DX9 to be Vista compatible. Reply
  • bob661 - Friday, October 07, 2005 - link

    Oops, DX8. Reply
  • tfranzese - Friday, October 07, 2005 - link

    From the article:
    quote:

    But performance is absolutely critical on current and near term games.


    Yet you guys tested none. I think benchmarking available versions of FEAR, Call of Duty 2, Serious Sam 2, Black and White 2, etc. would be much more interesting than some of the choices made here. All the cards tested handle today's games well, but I would expect that most who buy these cards are buying them for games that are soon to be released or coming in the next one or two quarters.
    Reply
  • karlreading - Friday, October 07, 2005 - link

    I must admit it seems to me everyone's just giving AnandTech a hard time. The review seemed pretty reasonable; they responded to the massive backlash they got from their first review, and I think that's where they deserve the credit. Sheesh guys! Give 'em a break!
    karlos
    Reply
  • tfranzese - Friday, October 07, 2005 - link

    These points were brought up about the first article too. It's a big improvement, I agree, but it's still not to the level that this site was founded on. Reply
  • DerekWilson - Friday, October 07, 2005 - link

    We already commented on the FEAR Demo -- we won't test a game that doesn't show shipping performance characteristics. We are working on getting our hands on a prerelease copy of the shipping game for testing.

    We have had Black and White 2 in house since it became available at Best Buy (as today is the official US launch, we got it a couple days early). We just haven't had enough time to finalize tests for it.

    We will look into the recently released Call of Duty 2 demo, among others. I agree that we could have done things better, and hopefully our coming follow-up will hit all the points people want covered.

    If you have any other suggestions, please let us know -- we will try our best to include them.

    Reply
  • PrinceGaz - Friday, October 07, 2005 - link

    As for other suggestions, how about some games other than FPS or FPS-view (EQ2) style games? A driving game like NFS:U2 or Colin McRae Rally 2005 would be an excellent addition. Then you should have some sort of flight/space sim, like maybe X2. And a roleplaying game that isn't viewed from a first-person perspective. By including games that have totally different styles of graphics, you'll get a better idea of how well the card performs. X2, for instance, would require totally different graphics performance than Half-Life 2.

    I know some of the games I've mentioned don't have a benchmarking mode, but you can use FRAPS to get the average framerate. And the minimum framerate. In fact, the minimum framerate is more important than the average, so you should include it as a matter of course in *all* tests, even to the point of dropping the average framerate if you don't have space. No one is too bothered whether their average framerate in a game is 45 or 50fps, but the difference between minimum framerates of 20 and 25fps would definitely be noticeable. I'm sure others will agree.

    This (and many other) site seems to think FPS games are all people play, but a lot of us play games from all genres, so including them would be useful.
    Reply
  • coldpower27 - Saturday, October 08, 2005 - link

    I support the use of X2: The Threat as a benchmark, I also support the use of shipping games to compare numbers, so Anandtech should benchmark Black & White 2 as it is now available, plus Call of Duty 2 & Fear when they become available. Reply
  • bob661 - Friday, October 07, 2005 - link

    Even though I found the original article very informative (I guess I can read well), this one was much better. The bar graphs don't show how the performance goes down as you raise the resolution and turn on the eye candy. Reply
  • zmanww - Friday, October 07, 2005 - link

    What, no overclocking?

    Come on, I want to see what this baby can really do.
    Reply
  • Peldor - Friday, October 07, 2005 - link

    I don't think the usual overclocking utilities are working for the X1x00 cards yet. Reply
  • DerekWilson - Friday, October 07, 2005 - link

    this is true -- we wanted to test slower versions and couldn't because of this.

    also reference board overclocking isn't always the best indication of retail board performance.
    Reply
  • Lonyo - Friday, October 07, 2005 - link

    Would be nice to see some analysis of ATI's SM3 implementation, with SM2 vs. SM3 benchmarks in the games which support both paths. Reply
  • tfranzese - Friday, October 07, 2005 - link

    From the article:
    quote:

    But that's not all the coverage that we have planned for the new ATI parts. Stay tuned for some more in-depth Shader Model 3.0, image quality, and market analysis soon.


    I think the analysis and the conclusions up to this point have been shortsighted. It seems that the games using SM3.0 are taking considerable advantage of the new architecture. The Tech Report, Hexus and others were able to show that much.

    Reply
  • bob661 - Friday, October 07, 2005 - link

    If AT is going to cover that in a later article, then the observations ARE NOT short sighted. Since we're picking nits, if the other sites are giving you ALL the information you require, why do you insist on bagging on AT? Just go to get your info, make your decision, and STFU. Reply
  • tfranzese - Friday, October 07, 2005 - link

    Because, unlike you, I like to compare the results between all my favorite sites. Maybe you're not mature enough to understand the reasons for that, so I'll fill you in: humans make mistakes, so trusting one person's judgement or methodology is not an intelligent decision in the real world.

    Also, the article makes no mention that near-term games are to be tested in a future update of this sort. Yes, I expect they'll be around once retail boards are reviewed, but if they plan on continuing tests on IQ, shader abilities, etc., then what sense does it make to pass judgement until those tests are complete?
    Reply
  • bob661 - Friday, October 07, 2005 - link

    Did you not read tfranzese's post that you replied to? Scroll up about 1/2 inch. That quote is from the article. Your quote,
    quote:

    so trusting one persons judgement or methodology
    , says you like to sample different websites to get the whole picture, yet you bag on one site (AT) that doesn't give you enough info. If you go to different sites to get the whole picture, why bag on any of them? You're still getting all of the info you need.
    Reply
  • bob661 - Friday, October 07, 2005 - link

    LOL... oops... you made that quote yourself. I can't believe you quoted it and yet still imply that AT isn't going to do any further testing in the areas you would like to see tested. I quit.. lol! Reply
  • tfranzese - Friday, October 07, 2005 - link

    The further testing I am referring to are with near-term titles, not SM3.0 analysis. I think you're misunderstanding what I'm saying or I'm not being clear enough.

    I also do believe it's short-sighted to judge an architecture before all the tests are complete. Right now I know IQ and SM3.0 examinations are coming up, but it looks like they're done with the game benchmarks until the suite is updated and retail boards are available.

    Anyway...

    Sure, I can get the majority of my data from a collection of sites, but if I voice my criticism I could hope that someday I may only have to visit three sites instead of ten to confirm and compare results and analysis. Not that I didn't enjoy reading all those articles during my downtime at work.
    Reply
  • JarredWalton - Friday, October 07, 2005 - link

    quote:

    I also do believe it's short-sighted to judge an architecture before all the tests are complete. Right now I know IQ and SM3.0 examinations are coming up, but it looks like they're done with the game benchmarks until the suite is updated and retail boards are available.


    Naw, we're not done. And while it's true that you can purchase an X1800XL (http://labs.anandtech.com/search.php?q=x1800&p...), we're still missing the X1800 XT. $440 or so for the XL isn't much cheaper than a 7800 GTX (http://labs.anandtech.com/search.php?q=7800%20gtx&...), and while the X1800 XT might be faster overall, the 7800 GTX beats the XL in nearly every test.

    Also, one big question mark that still remains is SLI vs. Crossfire performance. SLI is here now and working for the 7800 cards. X1800 XT is still a month out, and Crossfire X1800 XT... who knows? Three months, maybe more? After the delays of the X800 Crossfire parts, I'm not even ready to venture a guess on X1800 CF. :|
    Reply
  • DerekWilson - Friday, October 07, 2005 - link

    We will test the near term games along with SM3.0 as many people have asked us for this. Let us know if you need anything else. Reply
  • DigitalFreak - Friday, October 07, 2005 - link

    What about running EQ2 with AA turned on via the setting in EQ2.ini? I would assume that the results would be similar to the other tests though.

    Benching the Call of Duty 2 demo would be cool as well. A couple of sites have seen a performance increase when using a 512MB card vs a 256MB one. May actually be the first game where 512MB is worth having.
    Reply
  • tfranzese - Friday, October 07, 2005 - link

    Donka! Reply
  • phaxmohdem - Friday, October 07, 2005 - link

    I know all my "gaming" monitors at home run at 1280x1024. What gives with benchmarking at this uncommon(?) resolution instead? That's 81,920 extra pixels unaccounted for in the graphs for many of us running 17 and 19 inch LCDs. Reply
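    The pixel arithmetic in that comment checks out; a two-line sanity check:

    ```python
    extra = 1280 * 1024 - 1280 * 960   # native 5:4 LCD resolution vs the benchmarked 4:3 one
    print(extra)                                 # 81920
    print(round(extra / (1280 * 960) * 100, 1))  # 6.7 -> about 6.7% more pixels to fill
    ```

    A ~6.7% larger framebuffer is usually within a frame or two per second in fill-limited tests, which is why the thread below concludes testing both resolutions is redundant.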
  • bob661 - Friday, October 07, 2005 - link

    1280x960 is actually in keeping with the 4:3 aspect ratio. 1280x1024 actually stretches the height of your display although it's a little hard to tell the difference. Reply
  • TheInvincibleMustard - Friday, October 07, 2005 - link

    The actual physical dimension of a 1280x1024 screen is larger than a 1280x960 if the pixel size is the same -- there's no "stretching" of anything, as 5:4 is just more square than 4:3 is but you've got more pixels to cover the "more squareness" of it.

    -TIM
    Reply
  • DerekWilson - Friday, October 07, 2005 - link

    It would be more of a squishing if you ran 1280x1024 on a monitor built for 4:3 with a game that didn't correctly manage the aspect ratio mapping.

    The performance of 1280x1024 and 1280x960 is very similar and it's not worth testing both.
    Reply
  • TheInvincibleMustard - Friday, October 07, 2005 - link

    True enough, but most 17" and 19" LCD monitors (the monitors in question in this line of posts) are native 1280x1024, and therefore no squishing is performed.

    I do agree with you that it is redundant to perform testing at both 1280x1024 and 1280x960, as those extra ~82,000 pixels don't mean a whole lot in the big picture.

    -TIM
    Reply
  • JarredWalton - Saturday, October 08, 2005 - link

    Interesting... I had always assumed that 17" and 19" LCDs were still 4:3 aspect ratio screens. I just measured a 17" display that I have, and it's 13.25" x 10.75" (give or take), nearly an exact 5:4 ratio. So 1280x1024 is good for 17" and 19" LCDs, but 1280x960 would be preferred on CRTs. Reply
  • TheInvincibleMustard - Saturday, October 08, 2005 - link

    By Jove, I think he's got it! :D

    -TIM
    Reply
  • bob661 - Friday, October 07, 2005 - link

    That might explain why I can't tell the difference. Thanks much for the info. Reply
  • intellon - Friday, October 07, 2005 - link

    Bang on with the graphs in this article... top notch. I guess the difference in performance of these cards makes them less congested.
    On another note, I was wondering: would it be too much hassle to set up ONE more computer with a mass-market CPU (say the 3200+) and value RAM, and just run a couple of the different game engines on it to show how the new cards perform? You don't have to run this "old" setup with every card... just the new launches. It would be very helpful to common people who won't buy the FX-55.
    I, for one, make estimates about how much slower the cards would run on my computer, but those estimates would be much better grounded with a slower processor tested.
    I understand that the point of the review is to let the GPU run free and keep the CPU from holding it back, but testing with a common setup is helpful for someone with limited imagination (about how the card will run on their system) or not-so-deep pockets.
    Of course you can just go right ahead and ignore this post and I won't complain again, but if you do add such a system in the next review (it just has to be run with the new cards), I'll be the one who'll thank you deeply.
    Reply
  • Sunrise089 - Friday, October 07, 2005 - link

    2nd, even if only for a few tests Reply
  • LoneWolf15 - Friday, October 07, 2005 - link

    One other factor in making a choice is that there are no ATI X1000 series cards available at this point. Once again, every review site covered a paper launch, despite railing on the practice in the past. No one is willing to be the first to get scooped and say, "We won't review a product that you can't buy."

    I have an ATI card myself (it replaced a recent nVidia card a year ago, so I've had both), but right now I'm pretty sick of announcements for cards that aren't available. This smacks of ATI trying to boost its earnings or its rating in the eyes of its shareholders, and ignoring its customers in the process. It's going to be a long time before I buy a graphics card again, but if I had to choose a vendor based on the past two years, both companies' reputations fall far short of the customer service I'd hope for.
    Reply
  • TheInvincibleMustard - Friday, October 07, 2005 - link

    Hard|OCP didn't include the X1800XT in their review for precisely the reason you point out -- it's nowhere near available yet.

    -TIM
    Reply
  • Xenoterranos - Saturday, October 08, 2005 - link

    I think the prob was that they couldn't get one either. That's one of the advantages of being as big as AnandTech. Hard|OCP is big, mind you, just not ANANDABIG! Reply
  • DerekWilson - Friday, October 07, 2005 - link

    By that logic they shouldn't have reveiwed any of the cards -- we can't buy any yet.

    Honestly, we aren't an advertising agency, we aren't trying to sell anyone anything. We are a hardware review site. We review new hardware and technology as we are able to get our hands on it. That does include guiding our readers towards making good purchasing decisions at times, and the lack of availability is definitely a factor in that recommendation.

    But we can't be expected to ignore hardware we get just because it's not available. In fact, some of our best articles are on products that we got our hands on from 3rd party contacts months, weeks or days before the product is launched. For instance, we got CrossFire parts a long time before ATI sent us product. The products wouldn't be available for months, but we wanted to review them anyway. The reason is that the hardware is interesting, and people want to know the performance characteristics and details of new products.

    We certainly won't recommend the X1800 XT to anyone until we see it ship with the 625/1.5 clock speeds we tested at a reasonable price.

    There are two sides to our philosophy on reviews, but our curiosity usually gets the better of us.

    Also note that we deliberately didn't include the X1800 XT in all of our benchmarks in the launch article, for the purpose of de-emphasizing it. Quite a few people complained about it being left out.

    In any case, we want to bring readers the articles they want to read. And I'm not convinced that you guys really want us to leave out hardware. If we didn't test the hardware we had available, we would not only be avoiding telling the whole story, but we would be assuming our readers didn't have the sense to realize what is available for purchase and what is not.

    You guys are smarter than that :-)
    Reply
  • TheInvincibleMustard - Friday, October 07, 2005 - link

    Oh true enough, and I'm not calling into question any of AT's journalistic integrity or anything like that. I do readily admit that I get as drool-y over The Next Big Thing(tm) as the next hardcore techno-enthusiast, but I also find it very frustrating (along with many others) that products get a "review" slapped on them even if they're not shipping ... for that matter, I seem to recall a review this past winter about a product that never even shipped.

    I'm fairly certain that I'm not alone in saying that I like the fact that AT (and other sites, like Hard|OCP, TechReport, etc.) put out numbers on something at the very first available opportunity (ie, when the NDA lifts). On the other hand, I'm also fairly certain that I'm not alone in finding it very frustrating, as consumers, to see products being reviewed that won't be out for weeks, if not months (if ever). I'm also fairly certain that I'm not alone in believing that hardware manufacturers will continue to perform these "paper launches" so long as major hardware review sites (such as AT) treat a non-shipping product in the way that they currently do.

    In my personal opinion, I would absolutely love it if a major site, such as AT, would simply do away with reviews of products that consumers cannot buy. Purchase an X1800XL from NewEgg (or wherever) once it becomes available, and conduct a full review of it then. Leave the non-shipping versions as technology previews, or sneak peeks, or early looks, or whatever it is you'd like to call them. Video game review sites, such as Gamespot, make it a point to clearly distinguish between a "late-beta hands-on" or "release copy" and a "retail shipping product," and if they review a game and I like it, I can head on over to EBGames or wherever and purchase it that same day. That's the sort of approach that I feel would be most appropriate to curtail these paper launches ... so long as manufacturers think they can continue to get away with a full in-depth comparison and review of a non-shipping product, they'll continue to do so. Personally, I think that ATi was not slammed hard enough for not having the parts available on store shelves as of earlier this week.

    -TIM
    Reply
  • bob661 - Saturday, October 08, 2005 - link

    Speak for yourself bro...lol....I like knowing about the product BEFORE it ships because people ask me about these products and I can inform them WELL before the instant gratification kicks in. Feed me Seymour!!!! Reply
  • DerekWilson - Saturday, October 08, 2005 - link

    How do we balance doing an architectural/technology article without performance tests of hardware?

    The problem is that a review looks like a tech article or a preview on sites like ours ...

    It's a bit easier to make the distinction with a game. Much can change even after release. But with hardware, ATI is locked into the silicon, and everything but the clock speeds and memory sizes is fixed. If you clock an R520 part at 625/1.5, you will get the performance we showed. The only thing we would really like to do (that we can't yet, because ATI doesn't build it into their drivers) is underclock the cards to look at the cheaper SKUs, or do a frequency scaling article. That way we would end up covering something the end users will eventually see.

    I agree that there is a problem in general with paper launches, but I think that the public has not been fooled this time, especially after what happened last winter with the X800/X850 series and this past summer with CrossFire. On our end, it's difficult to balance a solution.
    Reply
  • TheInvincibleMustard - Saturday, October 08, 2005 - link

    No, I do see your point of reference, and agree with your statement that it is "easier to make the distinction with a game." I guess that I'm just frustrated that what you folks are reviewing is not what I'll be buying (completely ignoring that whole "golden sample" mumbo-jumbo) right when new technology is launched. In fact, it was your review of the eVGA 7800GTX that helped me make my decision to purchase that card instead of a 6800GT that I was considering. It wasn't the review of the Reference Board that made me wish to purchase the 7800GTX, it was the review of an actual, shipping product, as that review drove home (to me, at least) much more so exactly what my money is going towards and what I'll be getting from spending that much on a piece of hardware.

    I've seen this extend beyond the graphics card segment, as well. Wesley's review of the Reference Board for the CrossFire ATi motherboards included high praise for what the chipset is capable of (which is well-deserved, aside from the USB performance, which I seem to recall ATi stating that they'd fix in this particular release) ... ATi actually went so far as to reference the article in their Q4 financial statement. While I realize that this is ATi's spin on things and something that is probably beyond the control of anyone at AT, it's frustrating to me to think about how people that aren't "in the know" see that quote coming from AT and think therefore that the CrossFire mobo is the Best Thing Since Sliced Bread(tm), when the actual review was of a Reference Board -- everyone has seen things in the past that are similar to this, such as SiS making very excellent chipsets that perform very well but end up getting relegated to the "value" segment for motherboards and therefore don't get the recognition that they deserve with retail shipping products. Is the Reference Board review incorrect? No, I don't think so, but I have a hard time accepting that the CrossFire Reference Board is truly indicative of retail shipping products, which, on the surface at least (ie, without an in-depth look into the review), it appears to be. Note that I am aware of the fact that DFI is, I believe, going to be taking this chipset and performing their voodoo magic on it, which will more than likely make it into a monster of a motherboard, but that's in the future and is not what Wesley's review contained.

    Now, I certainly am also well aware of the fact that motherboards are a very different beast when it comes to manufacturing and the capabilities and performance built into them at that stage (eg, DFI vs Chaintech with an nForce4 chipset), but in my opinion the principle is the same -- what you're reviewing is not truly indicative of what the consumer will be purchasing, and for that, I'm saddened.

    This, of course, in no way means that I'm going to be leaving AT to go read somewhere else. I've been a loyal reader of AT for probably five or six years now, and can remember reading about the 1.13GHz P3 fiasco, so from at least that timeframe. I certainly plan on continuing to read AT into the future, and have based many of my computer purchase decisions upon what I read in reviews at this site, and will probably continue to do so.

    I guess, basically what it boils down to, is that I'm frustrated that it seems somehow acceptable -- to a certain extent -- that these "paper launches" are tolerated. I hope that my point of view is a bit more clear to you, now, and while I realize that my voice does not have a huge weight attached to it in the large scheme of things, since a forum is provided for feedback, that's what I'm doing.

    On a much more positive note, I am extremely happy that AT is one of those places where there is a nice direct connection (these comments) between the site operators/writers and the site visitors. In my opinion, nothing harmful can come from improved communication, which is something that I truly enjoy about this site.

    Keep up the good work, everyone at AT.

    -TIM
    Reply
  • BikeDude - Saturday, October 08, 2005 - link

    I've been holding off upgrading my graphics system for several months now. This review helped me realise the next big thing is still a way off, unless nVidia rushes a 7800Ultra out in time for the holiday season. (I'm also going for that Apple monitor, hence I'd like a card that can deliver on resolutions higher than 2048x1536)

    Knowing what's down the road a month or two from now definitely helps in making a buying decision today. Yes, the schedule might slip, but someone buying a 6800 Ultra in April would probably have liked to know about the 7800 GTX's pending launch.

    Having access to more information can never be considered bad. The problem here is that the manufacturers enjoy giving us good news ("Next generation part performs well!"), but seldom tell us the bad ("Next generation part delayed due to unforeseen problems"). But IMO I prefer that to having no news at all!
    Reply
  • JarredWalton - Saturday, October 08, 2005 - link

    So, correct me if I'm wrong, but your basic issue is with the reference product articles being called "reviews". If we change the title to "preview" then you're happy? At least from my perspective, it really doesn't change anything. I mean, a review of the technology and potential performance or a "preview" of the same thing... "A rose by any other name would smell as sweet," right?

    Anyway, I don't really have any issues with changing articles like this to have "preview" in the title, but that's not my call. Take it up with the big man (Anand) if you'd like. :) I just assume that everyone realizes that when we say the product is not yet available in stores, then we're not giving a recommendation to buy any specific hardware until it shows up.
    Reply
  • TheInvincibleMustard - Saturday, October 08, 2005 - link

    That is it, to a certain extent, but it goes further than that: should this new part be as heavily focused on as a retail shipping product? Forgive me for beating a dead horse, but if we go back to the game (p)review concept -- nearly all big-budget games get a hands-on late-beta or release-candidate preview, as would be expected by the readership (they want to know what's going on), and AT should be no different with hardware. However, the depth of coverage the gaming site puts into the preview is nowhere near the amount that they put into the retail shipping product -- previews of big-budget games could be three or four paragraphs and a few more screenshots, maybe some beta gameplay video, while the final review would be several pages, some form of ratings or scoring system for direct comparison with other products, usually an in-depth video review featuring gameplay elements, and the like. In other words, there's a very clear distinction between what constitutes a preview and what constitutes a review, not only in the title of the article but in how much and what type of information is actually packed into it.

    Does this mean that I think AT or other sites should not publish any information if there's no retail shipping product? No, of course not, information is helpful. However, when the line between a "technology preview" and a "retail product review" becomes blurred -- not only in terms of the title of the article but in the content of the article as well -- then that is when things seem to be not so "rosy" (pun intended).

    I hope that this post has made my point a little more clear, and if not, I'll try again. Just this thread, though, is exactly my point about what makes AT great -- the people that are doing the stuff on the site are willing to listen to me, one of the million lowly peons that click through every day. If everything else about the site changes, I most sincerely hope that at least this aspect of it would remain the same.

    Thanks again for your time.

    -TIM
    Reply
  • nserra - Friday, October 07, 2005 - link

    I agree.

    When publishing an article, the site should say whether it is a preview, review or overview.
    Reply
  • Questar - Friday, October 07, 2005 - link

    "High quality anisotropic filtering is definitely something that we have begged of NVIDIA and ATI for a long time and we are glad to see it, but the benefits just aren't that visible in first-person shooters and the like."

    So you like all the texture shimmering on a 7800?!?
    Reply
  • DerekWilson - Friday, October 07, 2005 - link

    We will absolutely be looking further in depth on the shimmering issue.

    But texture shimmering and the impact of ATI's new High Quality AF option aren't the same problem. Certainly angle-independent AF will help games where both ATI and NV have shimmering issues, but those instances occur less often and in things like space and flight games.

    I don't like shimmering, and I do like the option for High Quality AF. But I simply wanted to say that the option for High Quality AF is not worth the price difference.
    Reply
  • PrinceGaz - Friday, October 07, 2005 - link

    We're not talking about ATI's new angle-independent HQ AF option. It's nVidia's over-aggressive trilinear-filtering optimisations that all 7800 series cards are doing, almost to the point of it being bilinear filtering. They did that a couple of years ago and are doing it again now, but only on the 7800 series cards (6800 and under get normal filtering).

    If you want an example of this, just look at the transitions between mipmaps on the 7800 in the first review of the new ATI cards. I'm not talking about spikes on certain angles, but how the 7800 almost immediately jumps from one mipmap to the next, whereas ATI blends the transition far better. In fact, that is the main thing that struck me about those AF patterns in the review.

    Over-aggressive trilinear optimisation is a problem even on 6800 series cards after supposedly disabling it in the drivers (disabling it merely reduces its impact). I just wish it could be turned off entirely, as some games need full, true trilinear filtering to avoid shimmering.
    Reply
  • DerekWilson - Saturday, October 08, 2005 - link

    I know what you are talking about.

    The issue is that *I* was talking about the new HQ AF option in ATI hardware in the sentence Questar quoted in the original post in this thread.

    He either thought I was talking about good AF in general or that the HQ AF has something to do with why ATI doesn't have a texture shimmering problem.

    I just wanted to clear that up.

    Also, the real problem with NVIDIA hardware is the combination of trilinear and anisotropic optimizations alongside the "rose" style angle-dependent AF. Their "brilinear" method of waiting until near the mipmap transition to blend textures is a perfectly fine solution if just using trilinear filtering (the only point of which is to blur the transition lines between mipmaps anyway).
    Reply
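    [Editor's note: the trilinear vs "brilinear" distinction discussed above can be sketched in a few lines. This is an illustrative approximation, not ATI's or NVIDIA's actual hardware logic; the band width and the exact placement of the blend window are assumptions, and real drivers vary them per texture stage.]

    ```python
    def trilinear_weight(frac):
        """Full trilinear: blend linearly between adjacent mip levels
        across the entire fractional-LOD range [0, 1]."""
        return frac

    def brilinear_weight(frac, band=0.125):
        """'Brilinear': sample only the nearest mip level (weight 0 or 1)
        over most of the range, blending only inside a narrow band around
        the mip transition -- cheaper, but mipmap boundaries can become
        visible and contribute to shimmering."""
        lo, hi = 0.5 - band, 0.5 + band
        if frac <= lo:
            return 0.0          # pure bilinear from the coarser... er, nearer mip
        if frac >= hi:
            return 1.0          # pure bilinear from the next mip
        return (frac - lo) / (hi - lo)  # short linear blend across the band
    ```

    The narrower the band, the fewer texels the hardware has to fetch on average, which is where the performance savings (and the visible transition lines PrinceGaz describes) come from.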
  • TheInvincibleMustard - Friday, October 07, 2005 - link

    Hard|OCP did some image quality comparisons between the 7800GT and the X1800XL in their "X1000" launch article, and there was a noticeable difference between ATi's HQAF and nVidia's AF, and in a FPS no less. Add in the fact that they pretty much said that you could enable HQAF for hardly any performance drop, and that's a pretty nice point in ATi's favor.

    I think that AnandTech should look at an IQ comparison again, if they're not seeing any difference.

    -TIM
    Reply
  • nserra - Friday, October 07, 2005 - link

    I agree. New image quality tests must be done.

    Or maybe NVIDIA cards with 2x the performance of ATI but XGI/SiS image quality would be OK?
    I don't think so.

    S3 and XGI have been plagued by their texture quality (image quality). But no one cares when those problems come from an NVIDIA card.

    The X8xx was supposed to offer lower image quality than the R3xx, but no one has really shown that.
    Reply
  • bob661 - Friday, October 07, 2005 - link

    I've never experienced image quality issues on NVidia or ATI cards. They both look the same to me. YMMV. Reply
  • ChrisSwede - Friday, October 07, 2005 - link

    I was wondering which card available now compares to my 9800 PRO? i.e. which card should I look for in reviews and equate to mine?

    Maybe none? :)

    Thanks
    Reply
  • ChrisSwede - Friday, October 07, 2005 - link

    Thanks Reply
  • Spacecomber - Friday, October 07, 2005 - link

    I think that the 6600GT is a bit faster than the 9800 Pro, but essentially in the same league. HTH

    Space
    Reply
  • Peldor - Friday, October 07, 2005 - link

    The closest card to a 9800 Pro in these reviews is the 6600GT. Generally the 6600GT will be a bit faster than the 9800Pro, but not huge (except in OpenGL). Reply
  • Spacecomber - Friday, October 07, 2005 - link

    Oops, didn't mean to be redundant. I guess I took too long to post my comment. Reply
