Introduction

With the introduction of the X700 line (codenamed RV410), ATI is breaking its tradition of using revamped and tweaked previous-generation parts as its current-generation midrange product. RV410 is officially an R423 derivative. We see this as a very big step for ATI, and we hope they maintain this direction with future generations. The immediate impact on the consumer space will be better performance in the midrange and compatibility on par with high-end products.



This last point is important to realize. Last year, many games or features that ran fine on 9700 and 9800 cards would have strange problems or incompatibilities on 9600 cards. Previous-generation RV-series cards also lacked ATI's F-Buffer, which enables the GPU to run shader programs that exceed a certain length. These issues were usually cleared up in driver updates or game patches, but attention to the midrange tended to follow attention to the high-end segment. Now that the high-end ATI GPU shares the same core design as the midrange, any performance improvements or fixes that apply to the X800 will also apply to the X700 line.

Unlike last year (and the year before), ATI's product launches have lagged NVIDIA's. Our 6600 numbers are exactly two weeks old today. While some may speculate that this gives ATI an advantage because they have seen the performance of the competition, ATI needs to carefully balance yield, performance, and price for itself before it can worry about the competition. Bringing a product to market second in such a competitive space would only give ATI an advantage if it could maintain profitable yields at higher performance than necessary (and so could lower clocks and increase yield while still leading in performance). Of course, all of this goes out the window when you have NVIDIA and ATI both throwing insanely low-yield, high-performance, limited-availability parts at each other trying to claim the performance crown. Hopefully, the 6600 GT and the X700 XT will end up being less vaporous than the 6800 Ultra Extreme and the X800 XT Platinum Edition.

But all speculation aside, this is when the battle really heats up. Both NVIDIA and ATI now have affordable midrange products on the market that perform very well relative to previous-generation parts. We've got all the details inside; read on to find out who comes out on top in the most important competition of this GPU generation.

Comments

  • ThePlagiarmaster - Friday, September 24, 2004 - link

    #39 ianwhthse

    Very funny stuff. OK, I don't want a porsche now...LOL. I think I get it. If you own a porsche, you spend so much time working to pay for it you're too worn out to have sex? :)

    I get your point on the obvious. I guess I just like to show someone a printout to go along with the stuff I tell them. They seem to like it better when they can SEE it too. Years of dealing with mechanics and appliance repair people (and bad PC people, I guess) have taught most people who step into a PC store that the guy selling them anything must be lying or stupid. Which is true in a lot of cases. Can't really blame them for being skeptics. Then again, some people are just VISUAL and can't seem to get it when the same thing is SAID to them instead of shown to them.

    In any case, I hope they do another review soon with more benches.
  • ianwhthse - Thursday, September 23, 2004 - link

    #31, ThePlagiarmaster:

    Actually, check this out.

    http://www2.autospies.com/article/index.asp?articl...

    But maybe that's just because you have to be rich to afford one. Then you're old.

    #38, (also ThePlagiarmaster) they were given limited time with the card. With a limited number of cards to pass around and a deadline to meet, ATi didn't just give them the card and say, "send it back when you're done." In such a situation, Anandtech would need to try to do the most important tests. And which would we rather see? A comparison where every card has the exact same framerates +- .1 frames? Or benchmarks run in a fashion where we can actually TELL which card is better?

    I totally understand where you're coming from. I just moved from 10x7 up to 12x10 on my 19 incher only about 3 months ago myself, but you need to face facts too. You're a smart person who obviously has a lot of experience with computers, and like you said, we influence the people we talk to. So tell them the obvious point you brought up: that a weaker cpu is gonna limit how fast the gpu will be (to an extent). We all know that here. Most of us keep that actively in mind when we're reading the graphs. We can figure that out (though not to an exact amount) on our own. It's more important (especially for a limited-time initial review) that we find out what the graphics card is capable of.

    I’m sure once retail boards start hitting shelves at a store near you, there will be plenty of “real-world” tests that will check things out at the level you’re talking about, but you can’t expect them to do what you’re asking for in an initial review of a reference card.
  • ThePlagiarmaster - Thursday, September 23, 2004 - link

    #37 blckgrffn

    ROFL. Depends on your video card AND your monitor. Mine has GREAT text at 120hz@1024x768. If you'd read my post you'd already know I said it goes to 150 but looks like crap. Do you honestly think I wouldn't know to try other refresh rates? LOL. I'm an A+ certified PC Tech (among other certs, this is the relevant one here) and own a PC business. I've owned a computer of some sort since the Apple // (back in the green/amber monitor days). I'd say you don't know that many people if everyone you know runs above 1024x768. I, however, SELL these things every day.

    I didn't say anandtech shouldn't publish the numbers they did. I merely suggested they add one resolution to the picture. If the world was using higher resolutions as a whole, the web would be optimized for it. But it's not, IS IT? Apparently you don't play many games online. At 1600x1200 a LOT of games out there can't run without tanking. Even without being online, a top end machine can't cut it at 1600x1200 (with all the candy turned on) in EVERY game out there as you suggest. Your experience would definitely not be described as BUTTER SMOOTH. I'd further think you don't play that many games, or you're just full of crap. Unless everything in your machine is overclocked you don't have a prayer of doing what you say. Congrats, you own a machine that the rest of us haven't figured out how to create yet.

    If you don't like people talking back in a comments forum then leave. I merely responded to people who replied to my posts. Discussing the review is what this is all about, or did you not get that? What makes you think your favorite res is more important than mine (and the rest of the world's)? Jeez, do you actually think the world revolves around you? Do you actually think most of the world owns the top cpu/gpu in their machines? Get real. I'd venture to guess that less than 5% of anandtech readers own both the top cpu (amd or intel) and the top gpu (Nvidia or Ati). Apparently you have no idea that the "middle class" even exists in our society. Sales of Intel/AMD's top cpus say I'm right. They're so low they won't even tell us how many they sold in a quarterly report.
  • blckgrffn - Thursday, September 23, 2004 - link

    ThePlagiarmaster, ENOUGH. We get your argument. I have a 21" and I NEVER play a game under 1600*1200. When the rig can't handle it anymore, it is time to upgrade, end of story. This is why I have a pc and not a console. I think everyone I know plays at higher than 1024*768; even my dad on his 17" plays @ 1152*864. Thank you Anandtech for publishing numbers that I can use. The 1024*768 numbers are useless for me, just as the higher-res ones are useless for ThePlagiarmaster. By the way, running ultra high refresh rates tends to make text more blurry than it would be at 75 or 80 hertz. Try it once and you will see what I mean. Try setting your desktop to 1152*864, you will probably like that too.
  • Staples - Thursday, September 23, 2004 - link

    I am kind of worried about Xbox 2. ATI is doing a horrible job of putting a good price/performance card out there.
  • ThePlagiarmaster - Thursday, September 23, 2004 - link

    #32 AtaStrumf

    Actually, I have 20/15 vision with almost 20/10 in my left eye. Lasik surgery is great; for $3000 I can almost see like an eagle. If you'd read my post you'd already know I don't enjoy headaches. As such I run at 120hz. My Viewsonic P225F will actually run at 150hz@1024x768 but my vidcard doesn't handle that too well. This is another reason to run at a lower res (no, not 800x600 or 640x480, mind you): you get a really high refresh rate. Most 19's can't handle 100hz at 1600x1200. Like TrogdorJW said, there isn't much difference in the look above 1024x768. Can you really see the difference at 1280x1024? In most games I can't. I'd rather have a super high refresh rate, and never see that dip in fps that happens in some games when the action really heats up.


    Anandtech readers encompass a small number of people. However, we advise many people (as is the case with my customers). If I sell someone a $500 videocard and it runs no faster than their neighbor's $200 card (because the guy is CPU limited in his games), I look like a fool or, worse, get bitched out. Sadly a lot of people do buy 17's, and sometimes with a $200 vid card or more. I'd like to have a better idea of where the cpu/gpu line is drawn in the sand.

    I'm not saying throw away the high end benchmarks. Just saying I'd like to see the res a HUGE portion of the population runs in tested. Change your res to 1600x1200 and look at anandtech's website (or any other for that matter). See those huge bars of wasted space on the sides? Why does anandtech (and everyone else) optimize for 1024x768? Because 90% of the world runs in this res! On top of this, most don't like switching to a different res for each game depending on the fps they can get in each. It's just a pain in the A$$.

    PrinceGaz

    I agree CRT's are superior. But I don't agree 1600x1200 is the best res on a 19 or a 21 (well, 21 maybe, but only if you're using graphics apps or cad-type apps where higher res is VERY important and useful). You really like browsing the web while losing 1/3 of your screen? I take it you don't mind switching res all day (I highly doubt you browse that high). Most cards can't cut it at 1600x1200 without major frame hits (only the latest and greatest, and even then you'll switch to lower res often). The TI4200 (I have a 4400 in one machine) certainly is a card where you must be switching all day long on a game-by-game basis. That gets old to me, and I wouldn't even want to go there with my PC ILLITERATE customers (as #32 called them - perhaps rightly so).

    Perhaps you're happy switching, and I'm not trying to take that away from you here. I'd just like to see a res that benefits recommendations to the average user (the largest population of pc users, that is). Is a hardware review site supposed to cater to the top 5% of the population, or the other 95% of the world? Don't get me wrong, I love knowing what the advantage is at the high cpu/high gpu end, but I don't get the opportunity to recommend that stuff very often. Do game designers make their games for the top 5% of pc users or aim them at the masses? Witness the popularity (still! ouch) of Geforce4MX cards and you'll see my point. I'm not asking them to throw out the highend stuff, nor to add 640x480 or 800x600 (ugh!). But 1024x768 is NORMAL. Of all the people you know, how many have 19's or bigger? If it's more than 50%, you apparently have a small circle of affluent people. Again, not all that normal. While high res can be argued on large monitors, I'd argue right back that most monitors sold are 17in or smaller. The percentages just don't lie.
  • PrinceGaz - Thursday, September 23, 2004 - link

    #30- The Plagiarmaster

    Actually I *don't* have an LCD monitor myself, I was just saying that many people do. My main monitor is a 22" CRT and I would never consider exchanging it for even a top-of-the-range 20" LCD (same viewable area) as I feel CRTs are superior.

    As #32 said, anyone who buys a 19" or worse still a 21" and only uses it at 1024x768 is nuts. 1600x1200 is usually the best resolution for 21/22" CRTs, and 1280x960 or 1280x1024 for 19" CRTs.

    I generally play recent games at 1280x960, or 1280x1024 if that is all that is offered, but do sometimes need to drop that to 1024x768, and even 800x600 for Doom 3 as that is all my Ti4200 can manage. No point my upgrading it as I'm about to build a PCI-e system. In older games I play at 1600x1200 if it is available and it looks great. If not available I play at the highest resolution offered and crank up the AA. There is no point playing at a lower resolution if your card and monitor are capable of working well at a higher resolution.

    #33- TrogdorJW

    I assume you use an ATI rather than an nVidia card then? If you do use an nVidia card, there's an option in the drivers (since 55.xx I believe) under nView Display Modes -> Device Settings button -> Device adjustments... -> Display Timing tab, where you can tick 'Enable doublescan for lower resolution modes'. For me that makes 800x600 scan just like 1600x1200, and 640x480 like 1280x960. They look *far* better with doublescan enabled than without on my 22" CRT. It just extends what is done normally at 512x384 to higher resolutions. For me, 1024x768 is unaffected by it because I choose a high refresh rate (well above what my monitor or card could do at 2048x1536).

    If ATI don't have that option available, then they should add it, as it can't be very difficult to implement. Like I say, the drivers do it anyway at up to 512x384, so it's just a case of extending it.
  • TrogdorJW - Wednesday, September 22, 2004 - link

    32 - Hey, back off the 21" users! ;) I have a 21" monitor that I routinely use at 1024x768 in games. The difference between that and 1280x960/1024 is not that great, and 1600x1200 is really still too small for my liking. Performance is also an issue. If I can run 1280x1024 at good frame rates, I will, but I also have no problem running 1024x768 where required. 800x600 and lower, of course, are a different story. I start to see horizontal lines on my monitor at those resolutions.

    Anyway, the nice thing is that a $200 card is coming out that will have about the same performance as the 9800 Pro, and in some cases better performance. Hmmm... but my 9800 Pro cost $200 back in April. Heheh. The added features might be nice, but I'm not that concerned. If you want a 6600GT or X700XT in AGP flavor, the 9800 Pro is still a viable option if you can find it for $200 or less. JMHO.
  • AtaStrumf - Wednesday, September 22, 2004 - link

    ThePlagiarmaster, enough with the 1024x768 rant.

    You made your point, but you're forgetting that most of your computer illiterate customers are not reading this site.

    People who buy 21" monitors to run them at 1024x768, must have a few screws loose in their heads or are suffering from a serious vision impairment. I suppose you also run it at 50 Hz or sth like that.

    Anyho' I bet most AT readers run at least 1280x1024 on a 19" monitor and that includes their games.

    And anyway, if a customer refuses to part with $50 in return for a much better monitor, what makes you think they will surrender $200 for a graphics card???

    They deserve Intel's extremely sh*tty integrated graphics engine and nothing else.
  • ThePlagiarmaster - Wednesday, September 22, 2004 - link

    #30 PrinceGaz

    So what you're saying is, everyone buys LCD's? NO? So everyone buys laptops to play games then? Neither of these is true. Most laptops are sold to business users. It's a rare person that specifically buys a laptop for games. LCD's are too expensive in large sizes (I know I don't sell many), and suck for games anyway. Only a FEW can run fast enough to play games that don't give you headaches (eye aches?..whatever). I hate BLUR. 1280x960 is not common. Unless you think a ton of people have widescreen lcd's at home (or widescreen laptops?) and bought them to play games?

    Apparently you missed most of the point of the post. Is it worth upgrading from older cards at a normal resolution (meaning 1024x768; do a poll, you'll find most run here)? Most people buy these things then wonder why they don't run much faster. With which gpus are we cpu limited at 1024x768? By throwing the 9700pro (and lower) into these super high resolutions, it might make someone (eh, a lot of people) think their card is junk and an upgrade to one of these will solve all problems. NOT... If you tossed a few 1024x768 tests in, someone might find they're cpu limited with the current card already. Tossing on an even more powerful card is pointless for these people. Too bad they wouldn't figure that out in a review such as this.

    Why do you think people used to always run in 640x480 when testing cpus (which I hated, it isn't real-world)? Because some games are/were GPU limited above this. In order to eliminate the GPU they would test at 640x480. So yea, running in a lower resolution will sometimes let your cpu run wild (eh, produce high fps :) ). The point was, we're pretty much seeing the highest of both ends here. How does that help someone trying to figure out if a $200 card will help them get more fps? Look at #29's question.

    I have a 21in and a 19in, both are in 1024x768. My dad has a 21in he runs in the same res. Most web pages are designed for this res, a lot of games are too. So most people run in this res. A Porsche is designed to do about 150+mph but do you see anyone doing that on the highway? No, but that doesn't mean getting from 0-60 is any less fun now does it? Even though you don't run it at 150mph it still gets the women doesn't it? Not too many high performance cars advertise their top speed. Why? Because nobody uses it anyway.

    PC's weren't designed to play games. But some of them sure are fun today eh?

    #28, I know that, you know that, but most of the world still saves a buck or two on the monitor. As much as I push 19inchers, people are just CHEAP. I still sell a good number of 15's! Even when I tell them a decent 19 would only cost them $50 more and they have to live with it for the next 5yrs or so at 15in. Even on my 21 I don't see how 1024x768 is tunnel vision though. The web is great, pics are fine, I don't have to screw with font sizes all the time to get some things right, game interfaces are ALL designed to make 1024x768 LOOK perfect. They may design for others also, but they make sure this res is working as it's the most used.
