ATI's Radeon X850 XT Platinum Edition

Compared to the X800 XT:

Compared to ATI's previous flagship, the X850 XT PE offers a 0 to 10% increase in performance, with the biggest gains coming in Battlefield and Doom 3. The performance improvements aren't negligible, but they're definitely no reason to upgrade from an X800 XT. If you're stuck choosing between the two, though, what's another $50 when you're already spending $500 on a video card?

Compared to NVIDIA's GeForce 6800 Ultra:

Next up we have the X850 XT PE compared to NVIDIA's flagship, the GeForce 6800 Ultra, which is currently only available through OEMs in a PCI Express version.

ATI has always done better in Battlefield than NVIDIA has, so it's no surprise to see the X850 XT PE with a huge advantage there. The rest of the games are basically a wash with the exception of Doom 3 and Half Life 2. Under Doom 3, the X850 XT PE is about 15% slower than the GeForce 6800 Ultra, but the tables are turned as soon as you look at Half Life 2, where the X850 XT PE is almost 17% faster than the GeForce 6800 Ultra. So which card do you pick? Well, both happen to run every single game out on the market just fine at the highest resolutions/detail settings so you can't really go wrong either way. The issue here is predicting whether more developers will use Valve's Source engine or id's Doom 3 engine for future games, and at this point that's a tough prediction to make.
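The 15% and 17% figures above are simple relative frame-rate comparisons; a minimal sketch of that arithmetic (the frame rates below are hypothetical placeholders, not the review's actual benchmark numbers):

```python
def pct_lead(card_a_fps: float, card_b_fps: float) -> float:
    """Percent by which card A leads (positive) or trails (negative) card B."""
    return (card_a_fps - card_b_fps) / card_b_fps * 100.0

# Hypothetical numbers for illustration only:
# X850 XT PE trailing the 6800 Ultra in Doom 3...
print(round(pct_lead(85.0, 100.0), 1))   # about -15% (slower)
# ...but leading it in Half Life 2.
print(round(pct_lead(117.0, 100.0), 1))  # about +17% (faster)
```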

The Radeon X850 XT Platinum Edition basically offers smoother playability at 1600 x 1200 in all of today's games (including Half Life 2 and Doom 3) than either of the previous reigning champions, the X800 XT and the GeForce 6800 Ultra. Now let's have a look at the rest of the X850 line...


69 Comments


  • kmmatney - Wednesday, December 1, 2004 - link

    Man, it's way too confusing buying a card these days. There were already way too many ATI model numbers out there, and now this! Why can't they just offer entry-level, low mid-range, high mid-range, and high-end cards and leave it at that?
  • sophus - Wednesday, December 1, 2004 - link

    model numbers are out of control. just go look at pricewatch.com and get bombarded by all of the models...

    i've heard that the purpose of all the numbers/acronyms is to confuse the consumer into buying a "newer" part, read: more profitable for them.

    the prices are getting too high. $500 for a card?! too much money for a (practical) gamer's most frequently upgraded part.

    also, the availability of these cards is way too low. How long after the release do we have to wait until we can actually see these in stores? is demand that high and supply that low? is there a leak in their bank accounts? are their manufacturing costs too high?

    a small tweak in their product and they demand top-dollar? or rather, "well this NEW product is just a little bit better than our last one. so instead of lowering the price on our OLD product, we'll just set the bar higher for our NEW product."
  • Mykal Starclem - Wednesday, December 1, 2004 - link

    Quote:
    I just hope that the large increase in the variety of cards means that a couple of them might actually be available to buy

    lol
  • Kasper4christ - Wednesday, December 1, 2004 - link

    *cough* Spell check :P
    Page 2
    "before it's clock soeeds are officially set in stone."
  • Steve Guilliot - Wednesday, December 1, 2004 - link

    #31
    Software HDTV decoders/encoders require working DXVA (i.e. ATI cards) for best performance, and sometimes to work at all. For people interested in going the HTPC route, the video processor is very important, just not to you.
  • miketheidiot - Wednesday, December 1, 2004 - link

    extremely unimpressive. My 6800 is still fine.
  • Regs - Wednesday, December 1, 2004 - link

    This does not look like a refresh to me. It just seems like overclocked parts trying to win a performance crown that "no 'one'" can afford. I expect NVIDIA to do the same. ATI even has the nerve to charge 400 dollars for a 12-pipe design. At least have it include dual DVI. I know it may not need the extra 4 pipes enabled, but it just seems like they're taking you for a ride for a few extra MHz.
  • Araemo - Wednesday, December 1, 2004 - link

    #24:
    I stand corrected, though iDCT is VERY old tech that nVidia has had since at least the GeForce 4 series (if not the GF2 series), and ATI has had since sometime in the RAGE series.

    And the "motion compensation" sounds like what ATI has enabled via their drivers in a couple of apps (DivX Player and RealPlayer, if I'm not mistaken).

    Motion estimation is what I was thinking the only new use was; my mistake. Though I DO hope that nVidia at least has iDCT working, if not the motion compensation as well. The main selling point I saw with regards to the video processor was the ENCODE side, since I've never had a CPU usage problem while decoding a video, even on a Pentium 2 running Windows 2000.

    #26: I don't think 'decent' is the right word. Their drivers are decent, they just aren't fast.

    I draw the distinction because of the number of video cards I've had with UNSTABLE drivers. I am very happy with the stability of the Catalyst 4.x series drivers.


    That said.. nVidia has stable AND fast openGL drivers.. hello ATI?
  • ViRGE - Wednesday, December 1, 2004 - link

    #4, it's not worth getting worked up over anyone's video processor at this point. Nvidia's 68xx processor may be broken, but even if it worked, it wouldn't make a difference. There aren't any MPEG4 decoders on the market that can use either company's card, and WMV acceleration on my X800 Pro has no impact: frame rates and CPU usage stay the same. And let's not even talk about hardware-assisted encoding...

    The whole "video processor" idea has so far turned out to be a joke from both sides.
  • Zebo - Wednesday, December 1, 2004 - link

    #10: the USD has lost 33% of its value since GF4 days, so in reality video cards are the same price; your dollar just isn't worth as much.
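Zebo's claim boils down to simple arithmetic; a quick sketch, taking the 33% devaluation figure at face value (the figure is the commenter's, not verified here):

```python
def in_old_dollars(nominal_price: float, devaluation: float) -> float:
    """Convert today's nominal price into 'old' dollars, given the fraction
    of purchasing power the currency has lost since then."""
    return nominal_price * (1.0 - devaluation)

# If the dollar really has lost 33% of its value, a $500 card today
# costs roughly what a $335 card did in GeForce 4-era dollars.
print(round(in_old_dollars(500.0, 0.33), 2))
```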
