114 Comments

  • bbqchickenrobot - Wednesday, May 07, 2008 - link


    But now that new Catalyst drivers have been released, an updated benchmark needs to be done, as the drivers provide better support for the hardware and thus better performance.

    Also, you used a non-AMD mobo and chipset... if you went with CrossFire + AMD 790 chipset + Phenom X3/X4 processor (the Spider platform) you would have seen better performance as well. There are other benchmarks that were done with these (Spider) components, and the results weren't nearly as mediocre. Just a little tip...
    Reply
  • Adamseye - Tuesday, February 12, 2008 - link

    I can't see how every review I have read differs from your charts; the 2900 XT can't be faster than the 3850. I mean, I spent a month researching cards and the winner was the 3850, overclocking it to 3870 speeds. To think that AMD spent all that time to make a new 2900 XT and name it the 3850-70 is just foolish. From the benchmarks you provided, only an idiot would buy the new-gen cards for 60-100 bucks more when the 2900 XT is on par. Could you please explain to me how this happened? I feel like ordering a 3850 was a waste of money because the old 2900 is better anyway. Reply
  • aznboi123 - Saturday, February 02, 2008 - link

    Welll dang that bothers me...666...>,< Reply
  • spaa33 - Monday, December 03, 2007 - link

    It looked to me that the biggest complaint on the HD Video Decode article was that the 2600/2900 options did not provide an off switch for the Noise Reduction. Did you notice if this option appeared to be present in the newer drivers of this card (3850)?

    Regards,
    Dan
    Reply
  • emilyek - Tuesday, November 27, 2007 - link

    So AMDTI is still getting stomped by year old hardware?

    That's what I read.
    Reply
  • jpierce55 - Saturday, November 24, 2007 - link

    This is really a good review, some others are very Nvidia biased. I would like to see you do an update with the new drivers in the near future if possible. Reply
  • gochichi - Friday, November 23, 2007 - link

    Anand,

    First Nvidia with its 8800GT... I clearly recall seeing those at about $200; now they're $300 or more. At least these may come bundled with a game... they also "hold the crown".

    Now the HD 3870 has gone up to $269.99 (at newegg) and availability is every bit as bad as the 8800GT.

    This review assumes that AMD/ATI was going to deliver in volume, at a fixed price and they haven't delivered either. It would be really nice if you could slap their wrists... as individual consumers we are being tossed about and we don't have the "pull" to do anything other than "take it".

    Shouldn't AMD be accountable to deliver on their promises?
    Reply
  • SmoulikNezbeda - Thursday, November 22, 2007 - link

    Dear Anand,

    I would like to ask what exactly the results for individual games represent. Are they average FPS, or something like (min + max + avg)/3 FPS? On one Czech website there were results similar to what was presented here, but they were showing (min + max + avg)/3 FPS, which is complete nonsense, as it would be advantageous for cards with more volatile results. When they compared average FPS, the Radeon had the same results as the GT card. I would also like to ask whether you used the same demo for both cards, or whether you were playing the game and therefore testing each card in different situations.

    Thanks in advance

    Petr
    Reply
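    A quick numeric sketch (hypothetical FPS samples, not from the review) shows why the (min + max + avg)/3 blend the comment above worries about can flatter a card with volatile frame rates:

```python
def blended_score(samples):
    # The questionable (min + max + avg)/3 metric discussed above.
    avg = sum(samples) / len(samples)
    return (min(samples) + max(samples) + avg) / 3

# Two hypothetical cards with the same true average FPS (50):
steady = [48, 50, 52, 50, 50]  # stable frame rate
spiky = [35, 95, 40, 45, 35]   # volatile frame rate with a big upward spike

print(blended_score(steady))  # 50.0 -- matches the true average
print(blended_score(spiky))   # 60.0 -- looks 20% "faster" despite the same average
```

    A single upward spike drags the metric well above the true average, which is exactly the advantage for volatile cards the commenter describes.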
  • Sectoid - Sunday, November 18, 2007 - link

    If I'm not mistaken, the 8800GT is DX10 only, right? Is DX10.1 so insignificant as to not count in favor of the 3800s over the GTs? Don't get me wrong, I'm not trying to defend AMD; I just want to know if it's a good idea to sell my 8800GTS 320MB now, while it still fetches a good price (I live in Brazil and they're still pricey here), and buy a 3870 or an 8800GT with 512MB. I recently bought a 22" monitor and the GTS is somewhat disappointing at 1680x1050. Nah, it's just that crappy game World in Conflict. It runs similar to the Crysis demo at max! I have to play at medium, and the textures are really crappy for an 8-month-old high-end PC :(
    Who knows, maybe I'm already CPU or memory bound with a Core 2 Duo 6400@24xxMHz and dual OCZ Platinum 2 1GB 800MHz (2GB total)...
    Thanks in advance for any more input on the qualities of DX10.1 :)
    Reply
  • Agent11 - Sunday, November 18, 2007 - link

    I was very disappointed with the use of a P35 chipset to compare CrossFire to SLI.

    You use a motherboard with 16x/16x PCIe lanes for SLI but one with 16x/4x for CrossFire... and then make a point of CrossFire not scaling as well!

    Ask any bencher: it does matter.
    Reply
  • SmoulikNezbeda - Sunday, November 18, 2007 - link

    Hi,

    I would like to know what the numbers in the graphs really represent. Are they average FPS or something like (min + max + avg)/3 FPS?

    Thanks
    Reply
  • Agent11 - Monday, November 19, 2007 - link

    If it isn't the average then there's a problem. Reply
  • TheOtherRizzo - Saturday, November 17, 2007 - link

    What would you need a frame buffer of 512 MB for? That's enough room for about 80 1080p images. Sounds to me like someone at ATI is stuck in 1994 when framebuffers were the only memory on a graphics card... Reply
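    The arithmetic behind "about 80 1080p images" roughly checks out, assuming 24-bit color (at 32 bits per pixel it drops to about 64):

```python
width, height = 1920, 1080
bytes_per_pixel = 3  # assuming 24-bit color; 32-bit (4 bytes) is also common

frame_bytes = width * height * bytes_per_pixel  # 6,220,800 bytes per 1080p frame
frames_in_512mb = (512 * 1024 * 1024) // frame_bytes

print(frames_in_512mb)  # 86 -- in the ballpark of the "about 80" above
```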
  • 0roo0roo - Saturday, November 17, 2007 - link

    The convoluted naming systems of GPUs guarantee that pretty much only geeks in the know will make good purchasing decisions. This matters to the health of the PC game industry; I'm sure many have been turned off by the experience of going to their local store, buying a card within their budget with little other useful information to go on, and getting a lousy experience. I'm sure retailers actually benefit from the confusion, since they can charge more and just hope the customer bases their decision on their price range. Reply
  • Shark Tek - Saturday, November 17, 2007 - link

    Finally GPU manufacturers are thinking right. Instead of making oven-like, power-hog GPUs, they're trying to do things right, like Intel and AMD are doing with their CPU lines: less heat and power consumption.

    Let's see how the upcoming generations perform. ;)
    Reply
  • araczynski - Friday, November 16, 2007 - link

    I'm assuming this is a mid-range card, with better stuff coming out?

    Otherwise I don't see the point of getting anything other than an 8800GT; prices are too close to give up top of the line for merely 60 or so bucks. Or better yet, wait a few more months till the 8900s roll out.
    Reply
  • Iger - Friday, November 16, 2007 - link

    Another interesting question is warranty. The main manufacturers of the 8800GT (eVGA and XFX) give a lifetime warranty on their products - that's much more impressive than Sapphire's 1 year... Reply
  • Odeen - Thursday, November 15, 2007 - link

    In one word, kinky. Reply
  • Xcom1Cheetah - Thursday, November 15, 2007 - link

    At TechReport they show that the 3870's power usage under full load is a full 39W less than the 8800GT's... that's a huge difference.
    Any idea why there is such a large difference?

    http://www.techreport.com/articles.x/13603/9
    Reply
  • GlassHouse69 - Thursday, November 15, 2007 - link

    That makes immensely more sense than the results shown here. The 3850-70 isn't a massive leap, and the RAM requires a lot less juice to run as well (according to other articles comparing GDDR3 vs. GDDR4). Reply
  • just4U - Thursday, November 15, 2007 - link

    First off, excellent article.

    As to the sightings... OK, here in Canada we tend to get things a little later than those in the States, BUT the local shop I deal with already has 3850/70s in stock for
    179/239 respectively.

    For some reason, even though our dollar is finally on par (actually higher than the US greenback), we still seem to be paying higher prices. Perhaps price gouging from the retail stores... (whatever.) But anyhow, it's in stock here in Calgary, and there's not an 8800GT to be had... which sells at 329(ish). So yeah... I think I'll pick one up based on this article.
    Reply
  • forPPP - Friday, November 16, 2007 - link

    quote:

    ... 3850/70s in stock for 179/239 respectively. For some reason even tho our dollar is finally on par (accually higher then the us greenback) we still seem to be paying higher prices.


    You are extremely lucky. In Poland the 3870 is listed for $440!!! OK, that's with VAT ($360 without), but it's the same price as the 8800GT. Well, who will buy it then??? It's a joke - same price, much slower, and more power hungry! ATI, what happened!?
    Reply
  • just4U - Thursday, November 15, 2007 - link

    mmm ok so I was wrong. Shops around here have them listed but don't yet have them in stock. They are expected over the next few days. Reply
  • falacy - Thursday, November 15, 2007 - link

    This is something we should all keep in mind, given that nothing has fundamentally changed in PC computing in the last 8 years. There is still a lot of fun to be had from the plethora of older PC games, which even the lowest-end hardware can play in full detail (with the exception of Unreal, which really taxes older hardware). And hey, if you're not going to complain about waiting 10 seconds for it to load, Open Office works great on low-end hardware too. Heck, even the lowest-end Conroe CPUs trounce the 3.0GHz Pentium 4 line in video transcoding (and it would be interesting to see how the new Celeron 4xx series stacks up against a Pentium 4 with 512K of cache, as they are both single core...).

    I just purchased an EVGA e-GeForce 8600GT Superclocked (567MHz, 256MB 1.5GHz GDDR3, PCI-E, dual DVI-I, HDTV out) video card for $95 CAD, which sure beats the $110 CAD that the 8500GT was priced at a couple of weeks ago. As far as usefulness goes in the $100 price segment, the 8600GT is a great buy, as it has playable graphics at 1280x1024 and 1024x768 in many games, where the 8500GT just does not.

    Hopefully now the passively cooled 8500GT models, which have smaller heatsinks and price tags than the passively cooled 8600GT, will become the standard for HD-player PCs, and we can all forget about the 8400 line of cards.

    It would have been nice to get one of the 3850s, but for the extra $80 the performance boost is not really worth it for people like me who are still using a 1024x768 CRT and Windows XP, playing older games, and who have perhaps gotten too old to want to chase the latest gaming craze. I do have the hardware for Vista 64-bit, but it's not worth the hassle of the side-grade when there isn't anything out there I feel compelled to play in DirectX 10. Maybe in a couple of years there will be enough DX10 titles that it will be worth upgrading the OS and monitor, rather than spending money on hardware.

    I'm running an ASUS P5K-VM, Pentium Dual-Core E2160, and 1GB DDR2-667, which leaves my Pentium 4 531 and 1GB DDR in the dust! Apart from only supporting PCI-E 1, this board will stand the test of time so long as games/applications become more quad-core optimized, but for right now it's a super fast, super cheap computer compared to what I paid for the Celeron 300A-based uber-computer I had less than 10 years ago!
    Reply
  • poeticmoons - Thursday, November 15, 2007 - link

    It seems like you just skimmed over the fact that the 3870 is a dual-slot card. Now, I know the 8800GTS and GTX were dual-slot, but the 8800GT isn't, and I feel that is a very important factor. I don't see how you would run 4 dual-slot GPUs in an ATX form factor case. Yes, I know the 3850 is a single-slot card, but the high-memory GT isn't competing with that card; it's competing with the 3870. With a die shrink I would have just assumed that a dual-slot card would be unnecessary. Reply
  • Spoelie - Thursday, November 15, 2007 - link

    Text hints at 3870 actually being quieter, while the slide mentions otherwise. Any data to back this up? Also, is the quieter part during idle or load, or both? Reply
  • nowayout99 - Friday, November 16, 2007 - link

    I don't see a noise slide...

    But actually, Anand, noise may be a deciding factor for me. I'd really like to know what the cards sound like vs. the 8800 GT, particularly the 3870, if you guys could come back to it.
    Reply
  • starjax - Thursday, November 15, 2007 - link

    What about testing with updated drivers? I understand that the HIS HD3870 cards are shipping with catalyst 8.43 drivers. Reply
  • Comdrpopnfresh - Thursday, November 15, 2007 - link

    How do you guys decide the intermediate slopes of the graphs between them? Some of them look like cubic regressions... Reply
  • Bram van der Heijden - Thursday, November 15, 2007 - link

    Just one thing i want to add.

    I think AMD has really been screwing up this last year... dunno what they're doing, but they aren't able to beat Intel, and they aren't able to beat nVidia. Something went totally wrong over there. Marketing, financial, corporate launch strategies, whatever... they're screwing up.
    Reply
  • Leadthorns - Thursday, November 15, 2007 - link

    Anand,
    How about the image quality? Some reviews claim it's marginally better on the ATI card than the 8800GT. What's your take?
    Reply
  • Bram van der Heijden - Thursday, November 15, 2007 - link

    Best Anand, and other readers.
    I find it startling to see you present as a hypothetical something that's already a fact: "The Radeon HD 3870 becomes even more attractive the more expensive the 8800 GT is and the opposite is true the cheaper it gets; if the 8800 GT 512MB was available at $219, then the 3870 doesn't stand a chance."

    I'm already able to order Club3D 8800GT 512MB cards for 208 euros, and even XFX ones for about 212.45 euros... so that's even less in dollars. Club3D is a company that builds good-quality reference cards, so no surprises afterwards, and XFX you all know.

    So... for such a good site as Anand's, I find it a bit strange you are not aware of this. Going by the quote stated above, this already blows away the 3870... tough luck again, AMD.

    Anyone interested in these cards haha, check out BEE-CT

    Regards,

    a Dutch bloke.
    Reply
  • MrKaz - Thursday, November 15, 2007 - link

    There are rumors that in my country the ATI 3850 256MB version will cost 140€.
    160€ for the 512MB and around 200€ for the 3870.
    So this is in line with what you say.
    (all values have VAT)

    About the good luck, I think even with the slightly slower card the DX10.1 capabilities will be a selling point.
    Just ask the guys that bought the faster X800 over the 6800 and now can’t play some SM3 games.
    Reply
  • jcromano - Thursday, November 15, 2007 - link

    One Euro is worth about 1.46 USD these days, no?

    So the 208 Euro card would cost about 304 USD, right?

    Jim

    Reply
  • Bram van der Heijden - Thursday, November 15, 2007 - link

    Hmmm. back to school...

    That would mean 208/1.46 = 142 something...
    Reply
  • Bram van der Heijden - Thursday, November 15, 2007 - link

    but it's not like that... damn.. ur right... that's pretty expensive...

    I was wrong... sry.
    Reply
  • Parhel - Thursday, November 15, 2007 - link

    Even though you're wrong, do you mind if I use your math on my upcoming trip to Europe? It would really help me out. :) Reply
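    For the record, the exchange-rate arithmetic in this sub-thread works out like this (1.46 USD per euro is the rate quoted above; rates move daily):

```python
USD_PER_EUR = 1.46  # rate quoted in the thread above

def eur_to_usd(eur):
    # Euros to dollars: MULTIPLY by the USD-per-EUR rate...
    return eur * USD_PER_EUR

def usd_to_eur(usd):
    # ...dividing converts the other way, dollars back to euros.
    return usd / USD_PER_EUR

print(round(eur_to_usd(208)))  # 304 -- the 208-euro card really is ~$304
print(round(usd_to_eur(208)))  # 142 -- the mistaken divide answers a different question
```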
  • MrKaz - Thursday, November 15, 2007 - link

    No one seems to have asked, but since when does CrossFire work on the Nvidia 680i?

    Also you said this correctly over load power:
    "The difference is negligible, but when you include the fact that the 8800 GT is faster, the Radeon HD 3870 actually has worse performance-per-watt than the competition. "

    But you unfortunately failed to mention this over idle power:
    "The difference is huge, especially when comparing to the older ATI and NVIDIA offers, even when comparing to the new 8800GT it’s still a 40 Watts difference."
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    It doesn't, we used a P965 board for CrossFire, but you couldn't have known that - thus I've updated the test table :)

    And I've included commentary on the idle power of the 3800 series, my apologies for the oversight.

    Take care,
    Anand
    Reply
  • MrKaz - Thursday, November 15, 2007 - link

    Anand, do you think CrossFire scaling would improve if you used an X38 or RD580?
    Or does the 4x PCIe slot on the 965 not affect it much?

    Do the 790 and Phenom get reviewed this month?

    Continue the good work!
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    Personally I don't think the chipset is at fault for poor scaling here, but you do make a good point - I'll see if we can run some numbers internally and figure that out.

    Indeed this isn't the only AMD product that gets reviewed this month...

    :)
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    Woops, my mistake, Derek ran the CF tests and they were on a P35 board and not a P965. I've updated the article accordingly.

    Take care,
    Anand
    Reply
  • MrKaz - Thursday, November 15, 2007 - link

    Thanks Anand! Reply
  • jcromano - Thursday, November 15, 2007 - link

    From page 5:
    quote:

    Looking at our own price search engine we see that only Amazon is listing a card available at $249, but it's not in stock, nor are any of the other more expensive 8800 GTs listed.


    I have been unable to use the RTPE for the past two weeks or so. What's the trick? Here is the error it gives me:
    quote:

    Warning: mysql_pconnect(): Lost connection to MySQL server during query in /var/www/vweb/rtpeserve/php/login.php on line 53


    Jim
    Reply
  • Crassus - Thursday, November 15, 2007 - link

    Yep. Same error message here, in both Firefox and IE. I've been trying to make use of the RTPE for weeks now, without success. Or was it converted to AnandTech "staff only" use? ;c) Reply
  • jcromano - Thursday, November 15, 2007 - link

    Ok. Thanks for the quick response. I look forward to the return of the RTPE, but maybe your shopping page can substitute in the meantime.

    Cheers,
    Jim
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    I was talking about http://anandtech.shopping.com - currently working on getting a solution to the RTPE issues :)

    Take care,
    Anand
    Reply
  • dm0r - Thursday, November 15, 2007 - link

    Loved the review, and also the 3850... this is the real mid-range card I'll buy... excellent power consumption.
    The only thing missing is the GPU temperatures, but anyway, excellent review.
    Thanks
    Reply
  • yacoub - Thursday, November 15, 2007 - link

    Why are you recommending people wait for the 256MB version of the GT? That model has no bearing on anything for people playing Crysis, CoD4, World in Conflict, etc. All the testing done on the 512MB GT shows that 512MB is really the new minimum vRAM for gamers running a 19" or larger display, and the 256MB model is well and truly irrelevant to their purchase options.
    Instead, the reason they should wait a couple of weeks is just to see how the 512MB's availability and pricing changes.
    Reply
  • yacoub - Thursday, November 15, 2007 - link

    I really really like the new style to the charts and graphs. Everything is very easy to read and understand! Much improved over some older review designs! =)

    Also, lol @ how pathetic the 8600GT performs! :D
    Reply
  • Iger - Thursday, November 15, 2007 - link

    Actually, in terms of power consumption I would call this round a win for AMD. My home PC is on 24/7, but I really get to play on it for maybe a couple of hours a day at best (actually, probably much less). AMD leads idle consumption by 40W while losing on load power by 5W. I think for pretty much everyone the 3870 will turn out cheaper than the 8800GT. And I think it's important enough to be mentioned in the article (no offence - just trying to be helpful).

    About prices: currently on overclocker.co.uk the 8800GT 512 is preorderable for $350, the 8800GT 256 for $290, the 3870 for $320, and the 3850 for $235 (and the AMD cards actually are listed in stock(!!) - impressive).
    With such a disposition I would be close to buying a 3850 atm, btw... But anyway, Europe's prices are terrible :(

    Thanks very much for the article - it'll serve to satisfy at least some hunger before Phenom's ;)

    Ilya.
    Reply
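    Iger's idle-vs-load tradeoff can be sanity-checked with a rough sketch; the 40W idle and 5W load deltas are from the comment above, while the usage hours and electricity price are assumptions for illustration:

```python
# Usage pattern from the comment: PC on 24/7, ~2 hours of gaming a day.
IDLE_HOURS, LOAD_HOURS = 22, 2
IDLE_SAVING_W = 40   # 3870 draws ~40W less at idle (per the comment)
LOAD_PENALTY_W = 5   # ...but ~5W more under load

net_wh_per_day = IDLE_SAVING_W * IDLE_HOURS - LOAD_PENALTY_W * LOAD_HOURS
net_kwh_per_year = net_wh_per_day * 365 / 1000
PRICE_PER_KWH = 0.15  # assumed electricity price in USD/kWh

print(net_wh_per_day)           # 870 Wh/day in the 3870's favor
print(round(net_kwh_per_year))  # ~318 kWh/year
print(round(net_kwh_per_year * PRICE_PER_KWH))  # roughly $48/year saved
```

    Under these assumptions the idle savings dwarf the load penalty, which supports the "win for AMD" reading for an always-on machine.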
  • Leadthorns - Thursday, November 15, 2007 - link

    Some review sites suggest that the IQ is marginally better on the 3870. I would be interested to know your take on this. Reply
  • lux4424 - Thursday, November 15, 2007 - link

    In 2006 there were number of articles and presentations about benefits of new WDDM (Windows Vista Driver Model). These also mentioned WDDM 2.1, coming with DX10.1, and the benefits it should bring. Couple of examples:
    quote:

    WinHEC 2006, Future Directions In Graphics (http://download.microsoft.com/download/5/b/9/5b970...):
    *) Move to preemptive context switching and page-level memory management
    *) Video, Glitch-resilience: Preemptive context switching in WDDM 2.1 is key
    *) WDDM 2.1 – efficient GPU virtualization


    quote:

    WinHEC 2006, Desktop And Presentation Impact On Hardware Design (http://download.microsoft.com/download/5/b/9/5b970...):
    *) Advanced Scheduling with page level context switching
    *) Direct impact on desktop scenarios



    Since then it's absolute silence on the matter. It would be really great if Anandtech would cover the promises made WRT WDDM 2.1 (DX10.1) or even WDDM 2.0 (DX10) after SP1 for Vista is released.

    Regards
    Reply
  • GTMan - Thursday, November 15, 2007 - link

    Sentence with no ending...

    "Hopefully with DX11 Microsoft will be a little more used to the"

    Thanks for the article, interesting reading.
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    eep, thanks :)

    Take care,
    Anand
    Reply
  • NullSubroutine - Thursday, November 15, 2007 - link

    I am extremely disappointed in the review of the product.

    1) Only Vista was used, though XP has a lot larger user base.

    2) Limited variety of games.

    3) Limited variation of AF/AA

    4) No UVD tests.

    All could be forgiven if the title had included "First Look: DX10". I understand there is limited time to do tests, and it seems you had trouble getting your samples, which could explain the problem. I usually look to Anand for the most complete review of products (rather than having to piece together the many incomplete ones other sites run), but I believe this review is incomplete and not what I expect from AnandTech.

    I await follow-up reviews to restore my faith in this site. (And yes, I am sure I will be modded down, as I will probably be seen as a 'hater' rather than as trying to give constructive criticism.)

    Reply
  • Locut0s - Thursday, November 15, 2007 - link

    1) Only Vista was used, though XP has a lot larger user base.

    You answered your own question there. Remember this card is aimed at the midrange not the enthusiast and even more of these consumers are using XP.

    2) Limited variety of games.

    The games covered though are all the important big names that actually stress these cards and show what they are made of.

    3) Limited variation of AF/AA

    See Anand's reply above

    4) No UVD tests.

    You can see previous reviews to see UVD performance. I doubt this has changed at all since the hardware is identical.
    Reply
  • NullSubroutine - Thursday, November 15, 2007 - link

    I was saying XP should have been benchmarked because it has the largest user base, and most people, especially at this price range, will be using XP.

    When you limit the number of games benchmarked, you do not show an accurate picture of a video card's performance; it has been shown that certain games play better on certain cards. Some sites only do reviews with games that are biased towards a certain brand or GPU; I expect that AnandTech is not one of those sites, and expect a variety of games that show the true performance of the cards.
    Reply
  • Locut0s - Thursday, November 15, 2007 - link

    Sorry, misread your question about XP/Vista. Yes, they could test on XP. However, it has been shown that the performance difference between the two is fairly small now, and is in XP's favour, meaning that games should run as well as or better than what they show here. Reply
  • NullSubroutine - Thursday, November 15, 2007 - link

    I would have to disagree. There was at least a 20 to 25 percent difference between XP very high settings and Vista very high settings in Crysis. If you look at any number of games, there is still a performance deficit between XP and Vista; while the gap is shrinking, it is still very pronounced. Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    I think the real solution to the XP/Vista issue is to do a separate article looking at driver performance in XP vs. Vista. Derek was working on such a beast before the 8800 GT launched, and as far as I remember he found that with the latest driver releases that there's finally performance parity between the OSes (and between 32-bit/64-bit versions of Vista as well, interestingly enough).

    As far as more titles go, we tried to focus on the big game releases that people were more likely upgrading their hardware for. Time is always limited with these things, but do you have any specific requests for games you'd like to see included? As long as they aren't overly CPU limited we can always look at including them.

    I'll have to confirm with Derek, but I believe UVD performance hasn't changed since our last look at UVD with these GPUs: http://www.anandtech.com/showdoc.aspx?i=3047

    Thanks for the suggestions, I aim to please so keep it coming :)

    Take care,
    Anand
    Reply
  • NullSubroutine - Friday, November 16, 2007 - link

    The biggest discrepancy I have seen between all the reviews has been the drivers used (especially if you take into consideration the difference from, say, XP to Vista 64-bit, and new to old drivers).

    Many review sites are using the drivers that came on the disks, 8.43 or 8.44, which are supposed to be out November 15th for download (I couldn't find them on AMD's site earlier today). These new drivers (must be beta drivers) seem to give a huge boost in performance for the 3800 series.

    What I cannot figure out is why they test the 3800 series with 8.43/8.44 but the 2900s with 7.10. So it's hard to tell if the newer drivers are good for the HD series in general or more specific to the 3800s.

    Has Anand tested the different drivers?
    Reply
  • Lonyo - Thursday, November 15, 2007 - link

    They can't really test in XP that easily.
    Either they test in Vista, or they test in Vista AND XP (to be able to run DX10 benchmarks).
    I expect it's just easier to do all the tests in one OS, rather than having half run in Windows XP, except for DX10 which they run in Vista.
    Reply
  • MGSsancho - Thursday, November 15, 2007 - link

    I agree with you on that. I think there will be another UVD article later - nothing but which video cards can offload parts of the video decode, and what minimal CPU is needed for, say, an HTPC to run HD movies.

    XP would be cool.

    But Anand, could you do a 32-bit vs. 64-bit comparison? I know you mentioned it in the article, but can you do 1GB, 2GB, 4GB, and 8GB configs? Maybe current games with a single core (the AMD FX-57, the old king), then a dual core, then a quad core? I bring up 8GB for a reason: nowadays we can get 2GB DIMMs, and some of us use our computers for other tasks, like running a few virtual machines minimized. We minimize our work, game for a 30-minute break, then go back to work. Or maybe we're running Apache for a home website, or many other tasks that simply eat up RAM (leaving FF open for weeks).
    I'm not asking for a dual-socket god machine, but with current mobos it's possible to do 8GB of RAM. Thanks for reading this, and take care.
    Reply
  • Locut0s - Thursday, November 15, 2007 - link

    With all the buzz in the CPU world nowadays being about more cores and not more MHZ it's interesting to see that the latest graphics cards have been all about more MHZ and more features. It seems to me that it's in the graphics card world that more cores would make the most sense given the almost infinite scalability of rendering. Instead of making the next generation of GPUs more and more complex than the previous generation why not work instead on making these GPUs work together better. Then your next generation card could just be 4 or 5 of the current generation GPUs on the same die or card. Think of it, if they can get the scaling and drivers down pat then you could churn out blazingly fast cards just by adding more cores to the card. And as long as you are manufacturing the same generation chip and doing so at HUGE volumes the cost per chip should go down too.

    Think this is something we will start to see soon?
    Reply
  • Gholam - Thursday, November 15, 2007 - link

    In case you haven't noticed, graphics cards have been packing cores by the dozens from the beginning - and lately, by the hundreds. Reply
  • Locut0s - Thursday, November 15, 2007 - link

    Well yes I know but the "cores" that they are using are extremely simplified, more so than I was thinking of. Instead I was thinking of each "core" as being able to perform most if not all of the steps in the rendering pipeline. Reply
  • Guuts - Thursday, November 15, 2007 - link

    I think the simple answer is that in the CPU world, they hit a clockspeed wall due to thermal issues and had to change their design strategy to offer greater performance, which was to go to multiple cores.

    The GPU makers haven't reached this same wall yet, and it must be cheaper and/or easier to make one high-performing chip than to redesign for multi-GPU boards... though there are some boards with 2 GPUs that act like SLI/Crossfire, but in a single-board package.

    I'm sure when the GPUs start suffering the same issues, we'll start seeing multi-core graphic cards, and I would assume that nvidia and AMD are already researching and planning for that.
    Reply
  • dustinfrazier - Thursday, November 15, 2007 - link

    Going on a year of Nvidia dominance, and boy does it feel good. I bought my 8800GTX pair the first day they were available last year and never expected them to dominate this long. God, I can't wait to see what comes out next for the enthusiasts. I get the feeling it is gonna rock! I really wanna see what both companies have up their sleeves, as I am ready to retire my 8800s.

    I understand that these latest cards are great for the finances and good energy savers, but what does it matter if they already have a hard time keeping up with current next-gen games at reasonable frame rates at 1920x1200 and above? What good does saving money do if all the games you purchase in '08 end up as nothing but a slide show? I guess I just want AMD to release a card that doesn't act like playing Crysis is equivalent to solving the meaning of life. Get on with it. The enthusiasts are ready to buy!
    Reply
  • Gholam - Thursday, November 15, 2007 - link

    For reference, over here in Israel, the 8800GT is promised to arrive next week - for approximately $380 + VAT (11.5%). For comparison, the 8800GTS 640MB costs a bit over $400 + VAT; the 8800GTS 320MB used to cost in the low to mid 300s, but they're no longer available. I wonder when the 38xx will get here, and at what price... Reply
  • abhaxus - Thursday, November 15, 2007 - link

    Let me just say that I love my 8800 GTS. However, as a person stuck with a Socket 939 Athlon X2 @ 2.5GHz who wants to upgrade to a quad-core setup, I've been freaking out lately about what motherboard to buy, and the lack of new video cards has made that very difficult. If the 320MB GTS dropped in price in relation to the new GT, I'd buy a 650i/680i board in a heartbeat and just SLI it up. But the fact that no innovation is going on has kept prices too high for too long. I've had this card since March, and prices are actually higher now than when I bought it originally.

    At least intel isn't resting on their laurels the way nvidia has been. I want new cards... so the old ones get cheaper!

    also if anyone wants to go really OT with a reply and tell me whether an Asus P5N32 SLI Plus would be a good choice to O/C a Q6600 to about 3.2 ghz and run 2 8800 GTS 320mb cards in SLI... let me know :)
    Reply
  • wolfman3k5 - Thursday, November 15, 2007 - link

    No, the P5N32-SLI wouldn't be a good choice to overclock a quad. Neither would the Striker. The fact of the matter is that both of these ASUS boards have a hard time putting out high FSB clocks and sustaining them with quad cores. Either go EVGA 680i (LT) if you want to retain SLI capability, or I would suggest a P35 or X38 based motherboard.
    Just my 0.02C.
    Reply
  • abhaxus - Thursday, November 15, 2007 - link

    I've read that... but then I've also read on AT and HardOCP that with current BIOS releases the ASUS boards are fine to around 360-400 FSB. I haven't OC'ed an Intel chip since the Celeron 300A, so I am pulling my hair out trying to decide if it's worth it to plan on going SLI or just get a P35 board and stay with a single card. Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    <font color=black> Reply
  • abhaxus - Thursday, November 15, 2007 - link

    I apologize for breaking the comments... silly me for mentioning another site :) Reply
  • bupkus - Thursday, November 15, 2007 - link

    Just highlight the blank areas with your mouse.
    Click and drag.
    Reply
  • ViRGE - Thursday, November 15, 2007 - link

    Testing
    Reply
  • abhaxus - Thursday, November 15, 2007 - link

    err, tried to do a hardocp logo and it hid everything in the previous post.
    text is:

    I've read that... but then I've also read on AT and HardOCP that with current BIOS releases the ASUS boards are fine to around 360-400 FSB. I haven't OC'ed an Intel chip since the Celeron 300A, so I am pulling my hair out trying to decide if it's worth it to plan on going SLI or just get a P35 board and stay with a single card.

    sorry to go so OT. the article was very good in typical anand style.
    Reply
  • JonathanYoung - Thursday, November 15, 2007 - link

    Just browsing through the article and this graphic caught my eye:

    Monitors command buffer to *ASSES* level of GPU utilization

    Not sure if this is an AMD or AT graphic, but you guys might want to correct it!
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    That'd be an AMD graphic, if I had an editable source I'd correct it, but all I've got is the PDF :)

    Take care,
    Anand
    Reply
  • imaheadcase - Thursday, November 15, 2007 - link

    I have heard quite a few people dislike those line graphs you use; the eye just doesn't register lines as well as bars (one reason road markings use dashed bars down the center rather than one long line). Why not stick to bar graphs like you do on the power consumption page?

    The eye likes things to conform to a shape - or should I say, the brain does. :) A quick glance at a bar graph is easier for the brain to process than following lines.
    Reply
  • strikeback03 - Thursday, November 15, 2007 - link

    Regarding the lines on the road, this is somewhere on US83 between La Pryor and Leakey in southern Texas.

    http://img.photobucket.com/albums/v315/strikeback0...

    I have no idea what that road marking means.

    As for the line charts, I like them better than a multi-bar chart for displaying all the same info on a single chart.
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    It's always tough finding a good balance, since I can cram so much more information into a line graph than a bar graph. I've just been toying with these things for the 8800 GT and this review, I'll see if I can come up with something better for the next round :)

    Take care,
    Anand
    Reply
  • feraltoad - Thursday, November 15, 2007 - link

    Can't please everyone I guess. I really like the line graphs. I think it is much easier to compare cards scaling across resolutions and gives a better overview of performance in relation to one another.

    You could use hand puppets and then everyone would be happy. I know I would :)
    Reply
  • JNo - Thursday, November 15, 2007 - link

    I second that - lines ftw Reply
  • ChronoReverse - Friday, November 16, 2007 - link

    I'll third that. The lines are great. Let's me know the resolution scaling quickly too. Reply
  • peldor - Thursday, November 15, 2007 - link

    The line graphs are an improvement over the bar graphs. Good use of colors on these charts too. Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    I think this is honestly one of the best ideas I've ever heard. If I were talented enough to make a good looking hand puppet...

    Take care,
    Anand
    Reply
  • xsilver - Thursday, November 15, 2007 - link

    pfft. hand puppets,
    you need 3d virtual godzilla representing nvidia and 3d virtual king kong representing amd. BTW godzilla would win because it can shoot flames out of its mouth. :P


    about the article - is it not feasible that when the price of the 8800GT drops to $220 or lower, the 3870 just needs to drop to 85% of that mark? With the 3870 being on the smaller die process, they could afford it, or at least try to?
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    ohhhhhk, I just said that I'm not talented enough to make a couple of hand puppets, you expect me to be able to create 3D models of reptiles and animate them? I picked text as a medium for my artistic expression for a reason - I'm not exactly artistic otherwise ;)

    The thought of a $220 8800 GT and a $187 3870 (and thus an even cheaper 3850) is just too much for my mind to handle at this point. I think it'll eventually happen, but not in the near term, these things are too new and both companies like making money.

    Take care,
    Anand
    Reply
  • Anonymous Freak - Thursday, November 15, 2007 - link

    Will you do a comparison on CrossFire performance, say, comparing two 3870s to two 8800GTs on X48 vs. 680i? Or even two 3870s on X48 vs. even ONE 8800 Ultra? If 3870 can really sell for $219, or 3850 for less than $200, two of them might well blow away an 8800 Ultra in $/fps terms, even worse than 8800 GT SLI does. Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    Ask and you shall receive, I just added two new pages to the article - the new Page 10 tackles the CrossFire question.

    Take care,
    Anand
    Reply
  • wpapolis - Friday, November 16, 2007 - link

    First off, great article! I still surf other web sites but when I read articles produced by Anandtech, they are usually more informative and better presented. Anand, you have a great sense of the relevant information to include in your articles and I do appreciate your effort.

    Just one small thing regarding CrossFire performance. I was a little disappointed when I read your comment ...

    "Scaling looks pretty good from the Radeon HD 3850, however it's still not as good as what NVIDIA is able to achieve with the 8800 GT. NVIDIA consistently achieves about 11% better scaling from one to two GPUs than AMD."

    You mentioned that scaling of the 3850s isn't as good as the 8800 GT's. In this case, did you compare ...
    2 x 3850's with 256MB per card
    vs. 2 x 8800 GT's with 512MB per card?

    If so, I would be interested in how well ...
    2 x 3870's with 512MB per card compares.

    I suspect that running a CrossFire configuration sucks up more RAM, so using 256MB cards doesn't scale as well as using 512MB+ cards.

    I know, I know, you didn't get 2 x 3870's, but maybe you can get one more now? Maybe one more page to this article?

    Thanks again for your tremendous effort!

    Bill
    Reply
  • chrispyski - Thursday, November 15, 2007 - link

    Nice crossfire chart. I know many people will be thinking about CF'ing the 3850's over a single high-end card (although I totally agree, it does not really work out well in the end) Reply
  • chucky2 - Thursday, November 15, 2007 - link

    Those that could have waited for the 3850 and instead bought a 8600 or 2400/2600 are probably kicking themselves right now...

    Chuck
    Reply
  • peldor - Thursday, November 15, 2007 - link

    Only if it's an HT+gaming PC. If it's just an HTPC, an 8600 or 2400 is still lower power and lower noise (with fanless options). Reply
  • semo - Thursday, November 15, 2007 - link

    i'm still kicking myself for buying an ati 7500. Reply
  • bryanW1995 - Thursday, November 15, 2007 - link

    I must be psychic. I called that about 30 minutes b4 article was posted. Anand must be reading my mind...:) Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    Wanna go double or nothing? How do you think Phenom is gonna turn out? ;)

    Take care,
    Anand
    Reply
  • chucky2 - Thursday, November 15, 2007 - link

    10% improvement over WHAT Anand? Come on, tell us... :)

    Chuck
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    C'mon Chuck, one AMD launch at a time :)
    Reply
  • GlassHouse69 - Thursday, November 15, 2007 - link

    Nice article :)

    The 3870 can run games decently at 1920x1200 resolutions. Being that I don't care about Crysis (oh no! taboo comment!) or Xbox 360 games on the PC (Gears o' War), it seems like it could be the card to get..... if the retailers do not price gouge. Waiting for Newegg to inflate this one.

    It seems that the 3850 is the same card as the 3870 in many ways. Any attempt at OC'ing will be really fascinating. I wonder if 1GB of GDDR4 will make this card more competitive; even 768MB would be nice/adequate.
    Reply
  • Kougar - Thursday, November 15, 2007 - link

    Newegg has stock on three HD 3870 cards, all three are priced at $220 right now. Reply
  • DrMrLordX - Thursday, November 15, 2007 - link

    I have to ask, was there any antialiasing in these benchmarks? I suspect not but I'd like to hear an answer anyway.

    The 3850 looks like a good card for overclockers, since it's just a downclocked 3870. At least it's nice to see that the 2900XT and 2900Pro have mostly been rendered obsolete by a cooler, quieter product that can be brought up to snuff with some overclocking.
    Reply
  • Anand Lal Shimpi - Thursday, November 15, 2007 - link

    We included AA numbers with Oblivion (look for Oblivion AA in the graphs). The problem with AA these days is that most newer games don't really run well enough to have AA enabled and quality settings turned up (read: Crysis). While it's not a problem when testing pairs of 8800 GTXes, we felt it wasn't top priority for the more affordable and less powerful cards.

    That being said, I'll talk it over with Derek and see what we can do for some of our future articles.

    Take care,
    Anand
    Reply
  • Roy2001 - Thursday, November 15, 2007 - link

    Well, once I played games with AA enabled, I would never turn it off. I would rather lower the resolution. Reply
  • falacy - Thursday, November 15, 2007 - link

    That's a giant "ME TOO!" for me.

    My old ATi 9800XT would run 4x AA at 1024x768 in most games, and I found that more enjoyable than running 1280x1024 without AA. The 60Hz flicker of the monitor at 1280x1024 played a role in that, I am sure, but mostly the trouble with gaming without AA is that objects in the distance tend to shimmer in an unnatural way that pulls me out of the moment. So, indeed, lower resolution + 4x AA = a better experience than higher resolution with distracting artifacts.
    Reply
  • DrMrLordX - Thursday, November 15, 2007 - link

    Alright, thanks. I actually overlooked the AA tests on Oblivion. Silly me.

    Mostly I was interested in knowing if the 3870 had better results running with 4x AA than the 2900XT. Interestingly enough, the 3870 doesn't seem to lose a lot with 4x AA, especially at high resolutions. The 8800GT is another story.
    Reply
  • munky - Thursday, November 15, 2007 - link

    But... I'd like to see more games benchmarked, and with AA preferably. Reply
  • StormRider - Friday, November 16, 2007 - link

    Is anyone else bothered by the transistor count of 666 million? Couldn't they have done something so that it was 665 million or 667 million instead? Reply
  • aeternitas - Tuesday, December 11, 2007 - link

    lol How stupid. As you go out using this card to obviously kill some sort of opponent, you're bothered by this? Reply
  • Kaleid - Friday, November 23, 2007 - link

    Just a number, nothing more to it. Reply
