Final Words

Without a doubt, AMD is back in the graphics game. When the Radeon HD 2900 XT launched, we couldn't have been more surprised at how poorly the product did. The lack of competition allowed NVIDIA to sit back and relax as orders for more 8800-based products kept flowing in. While the Radeon HD 3870 isn't faster than the GeForce 8800 GT, if AMD can hit its price point, it is a viable alternative if you're looking to save money.

AMD is in a lot of trouble, however, if the 8800 GT's pricing/availability problems get worked out; the 8800 GT offers better performance-per-watt and better performance in general, so at the same price the decision is clear. Luckily for AMD, the two don't appear to be selling at the same price.

The Radeon HD 3850 is a bit slower than its more expensive sibling and as such ends up being tremendous competition for current mid-range cards like the GeForce 8600 GTS or Radeon HD 2600 XT. We only compared it to the 8600 GTS in this review, but the 3850 similarly obsoletes the 2600 XT.

Both cards from AMD are quite competitive today, but the balance of competition could easily shift depending on pricing and availability of either these cards or their competition. If AMD can't deliver on the prices it is so adamant about meeting, it loses serious cool points. Similarly, if NVIDIA can get enough 8800 GTs in the market, or if the 256MB version actually hits at $179 - $199, AMD would be in a lot of trouble.

Today the Radeon 3870 seems like a nice, albeit slower, alternative to the 8800 GT. But it's difficult to make a thorough recommendation without knowing how the 256MB 8800 GT will stack up and where it'll be priced. Given how the 8800 GTs sold out, if you're truly interested in the 3870 pick one up now, but if you're like us and want to carefully weigh all options - wait a couple of weeks and see what happens with the 8800 GT 256MB.

There is one more point to discuss: what happens to the high-end GPU market? AMD is talking about sticking two 3800 GPUs on a single card, and NVIDIA has been very quiet about its next-generation high-end GPU plans, but with games like Crysis and Gears of War out on the PC, it'd be nice to see advances in peak performance as well as affordable performance. What we do like about these new affordable GPUs is that they finally leave us with the feeling that you're getting something for your money, whereas mid-range GPUs of recent history seemed to just give you mediocre performance while lightening your wallet far more than they should.

While this may seem like a blip in an otherwise very profit-centric product lineup, we'd love to see similar performance revolutions at other price points in the graphics market. Give us a $100 graphics card that's actually worth something, and maybe we'll end up seeing a resurgence in PC gaming after all.

View All Comments

  • Agent11 - Sunday, November 18, 2007 - link

    I was very disappointed with the use of a P35 chipset to compare CrossFire to SLI.

    You use a motherboard with x16/x16 PCIe lanes for SLI but one with x16/x4 for CrossFire... and then make a point of CrossFire not scaling as well!

    Ask any bencher, it does matter.
  • SmoulikNezbeda - Sunday, November 18, 2007 - link


    I would like to know what the numbers in the graphs really represent. Are they average FPS or something like (min + max + avg)/3 FPS?

  • Agent11 - Monday, November 19, 2007 - link

    If it isn't the average then there's a problem.
  • TheOtherRizzo - Saturday, November 17, 2007 - link

    What would you need a 512 MB frame buffer for? That's enough room for about 80 1080p images. Sounds to me like someone at ATI is stuck in 1994, when framebuffers were the only memory on a graphics card...
  • wecv - Monday, August 14, 2017 - link

    Hello, I am from the future.
    We now have 2GB GPUs with GDDR5 as entry level, 4GB-8GB GPUs for midrange with GDDR5 and 8GB GDDR5/GDDR5X/HBM2 or 11GB GDDR5X for High-end and enthusiast!

    You may go and live back in the past.
  • ZipFreed - Friday, April 13, 2018 - link

    Lol, this comment is awesome and cracked me up. I am reading these older GPU reviews researching something and have been thinking similar sentiments to myself as I go.

    Glad you necro'd this.
  • 0roo0roo - Saturday, November 17, 2007 - link

    The convoluted naming systems of GPUs guarantee that pretty much only geeks in the know will make good purchasing decisions. This matters to the health of the PC game industry; I'm sure many have been turned off by the experience of going to their local store, buying a card within their budget with little other useful information, and getting a lousy experience. I'm sure retailers actually benefit from the confusion, since they can charge more and just hope the customer bases their decision on their price range.
  • Shark Tek - Saturday, November 17, 2007 - link

    Finally GPU manufacturers are thinking right. Instead of making oven-like, power-hogging GPUs, they're trying to make things right like Intel and AMD are doing with their CPU lines, with less heat and power consumption.

    Let's see how the upcoming generations perform. ;)
  • araczynski - Friday, November 16, 2007 - link

    I'm assuming this is a mid-range card with better stuff coming out?

    Otherwise I don't see the point of getting anything other than an 8800 GT; prices are too close to give up top of the line for merely 60 or so bucks. Better yet, wait a few more months till the 8900s roll out.
