The Radeon HD 4870 1GB: The Card to Get

by Derek Wilson on 9/25/2008 12:00 AM EST



  • happytony - Thursday, October 02, 2008 - link

    Just asking, thanks.

    How does the price compare to the equivalent GeForce?
    Is it worth it?
  • LazyGarfield - Tuesday, September 30, 2008 - link

    I wonder how many gamers have a 30" TFT. Like 0.3% of the community?
    So how interesting do you think the 2560 x 1600 resolution is to your readers?

    How about featuring some useful resolutions like 1680 x 1050 or 1920 x 1200? (Not that it ever showed much difference in such a comparison.)

    Then again, when you look at the figures, just one game shows real gains anyway, because it's not really an improvement to be playing Crysis at 33 instead of 30 fps... even more so since this is most likely an average figure. Not to mention that the improvement most likely comes from a higher GPU/memory clock rather than from the quantity of memory.

    Minimum fps would be very interesting as well, because the biggest improvement from more memory on graphics cards mostly shows up there.
  • Gannon - Tuesday, September 30, 2008 - link

    I wish people would stop mouthing off these truisms as if they were the only way to look at the situation.

    Consider that the real world is a physical place; competition does not always bring benefits. Take a look at game console development: three radically different development platforms, which makes it a royal PITA to port a game across platforms when it doesn't have sufficient hardware power, and what about all the duplication with joysticks and mice? A lot of waste goes on.

    The idea that competition "always benefits the consumer" is nonsense. I can name hundreds of negative externalities caused by competition, and ridiculous amounts of redundant items and work that merely clog up the market.

    Oftentimes we make too much stuff at too poor a quality, and we design things to go obsolete without thinking about the future. At some point this is environmentally unsustainable. So let's just remember that things are made from nature, not anywhere else; that's where all true value comes from. All we do is merely reshape what was already there. All that value already existed in potential form: you can't get a glass bottle, computer, microchip, or factory if you don't have pre-existing material to make it out of.
  • helldrell666 - Monday, September 29, 2008 - link

    The 4870 1GB version does beat the 280gtx in 3 out of 6 games at 2560x1600 res. At 1920x1200 it beats the 280gtx in 4 out of 6 and matches it in The Witcher.

    Nice job, ATI... AMD.
  • tiggers - Monday, September 29, 2008 - link

    Like some of the other posts above, I wouldn't mind having an older tech, mid-level card (1 to 3 generations back) thrown in as a reference point.

    It would be nice to have a benchmark to help people judge cost/performance, to see whether it's worthwhile to upgrade or not.
  • aldy402 - Saturday, September 27, 2008 - link

    derek, looking at the 512mb and 1gb pcb pic comparisons, they look the same besides the double-density ram chips. I just wanted to confirm that all the full-cover WC blocks for the 512 model should fit the 1gb model as well, correct?
  • unclebud - Saturday, September 27, 2008 - link

    i bought one from best buy with the 10% off before the 4850/4870 launch, making it $117 before tax, and didn't want to bother with buying the 4850 (this week) for $135, especially since i play on a 17" CRT.
  • jedz - Friday, September 26, 2008 - link

    the hd4870 does it well again this time by adding another 512mb of video memory and making it a 1gig card. AMD is really on the right track this time: the more memory, the more fps can be yielded in high-resolution gaming (which is still a bottleneck for nvidia). way to go AMD....

    I'm looking forward to the release of the hd4850x2....
  • yacoub - Friday, September 26, 2008 - link

    How noisy is it? Why did you not include that in the Power page? Usually you cover Power and Noise.
  • robotslave - Thursday, September 25, 2008 - link

    This is all well and good for those who have a $300 graphics card budget, but those of us who shop in the sub-$200 range are left scratching our heads a bit.

    The 1GB 4850 variant has been in stores for weeks now, but none of the major hardware sites have reviewed one. Newegg has five models in stock; have none of the manufacturers sent review samples to you guys?

    What's the holdup? How long are we going to have to wait to see whether or not the 1GB 4850 is going to make a difference for the 22"-24" 1920x1200 panels currently in the display sweet spot?
  • SiliconDoc - Friday, October 03, 2008 - link

    No offense robotslave, none to anandtech either, but they've reached the point that the corpo-politician model of perma e-boner has been erected, and there's no going back.
    They need no gamer left behind legislation. lol
    It must be blocked from the floor overclocking vote. tee hee
    ( don't you care about team cohesion and enthusiasm levels ?) -rofl
    I mean do you expect these silicon valley monster review boys to settle for 1024x768x32 on a $150.00 vidcard lineup ?
    That sounds like a hostile work environment.
    The "secondary" market (or is that 3rd world now ?) requires too much viagra, the team has to think about morale and hype, man.
    I think we'd better move to the "less interesting" review pages where gamerz aren't "seen or heard" even if they "are there".
    ( I'll see ya there, or rather, I won't ... wink)

  • Gaz - Thursday, September 25, 2008 - link

    Which of these two scenarios would be the better one to implement: 4 x 4870 1GB, or 2 x 4870X2?
    I have an MSI K9A2 Platinum that can handle 4 cards in CrossFireX. The only problem is the power connections to the cards and the size of the power supply needed to run all 4 4870 1GB cards, compared to the 2 x 4870X2, which would only use 2 six-pin and 2 eight-pin connectors.
    The 4 x 4870 1GB setup would, I think, use 8 six-pin power connectors to run the video cards, but what size PSU would you really need? A 1500W, or would a 1350W or 1000W PSU do?
    I haven't bought any cards yet.
  • Comdrpopnfresh - Thursday, September 25, 2008 - link

    I think the changes in the 4870 show that AMD was smart in designing their current generation of graphics products.

    They were able to provide an appreciable performance increase by adding more memory. In doing this, they saved money by not having to tape out new die alterations, and didn't drastically alter power consumption.

    Nvidia, on the other hand, didn't seem to be able to provide as much of a performance improvement, while increasing power consumption and probably having to spend on the new core.
  • Paladin1211 - Thursday, September 25, 2008 - link

    Unlike the GTX 260 Core 216, this card isn't an epic fail at its price point...
  • AlB80 - Thursday, September 25, 2008 - link

    "It still draws a significant amount more power at idle and load than the GTX 260."

    They're still using Catalyst 8.7.
  • dennilfloss - Thursday, September 25, 2008 - link

    I am confused when you say it does not tax memory as much, because with QTP3 I sometimes reach around 950MB of VRAM use, so I'd think the extra 512MB would make quite a bit of difference in not having to use the hard disk instead of some VRAM. I think most of the Oblivion players interested in the 1GB version of the 4870 will have extra texture mods installed. Do you use one of the more detailed texture packs in Oblivion, or is your Oblivion just vanilla?

  • grmnasasin0227 - Thursday, September 25, 2008 - link

    So now that the 1GB is out, when will we be able to see the 4870X2 compared to a 4870 1GB in CF? I'd like to know which is the better buy.
  • Kulamata - Thursday, September 25, 2008 - link

    I would have liked to have seen the 4850 XFire included; I think it (they?) can run with this pack. I'm keeping my eye open for the promised 4850 X2; we'll see if it's a real product. If so, I'm quite interested for 1920 x 1200.
  • emilyek - Thursday, September 25, 2008 - link

    Can you guys show frames for something other than 2560 x 1600 and maxed AA/AF? Not everyone has a 30" Apple Cinema Display or better.

    Cards that are neck and neck at that resolution very well might have anywhere from a 10-20 fps difference at, say, 1680 x 1050, where most people game.
  • AnnonymousCoward - Thursday, September 25, 2008 - link

    Look again: the line graphs have your resolution.
  • devonz - Thursday, September 25, 2008 - link

    Perhaps I have misunderstood something, but particularly for 32-bit Vista and Win XP, I THOUGHT that video memory essentially "used" part of the 4 gigabyte limit for RAM usable by the OS. So, would that mean that if you had two of these in CF you would limit your OS RAM to 2 gigabytes? That seems like a considerable drawback to me. Yes, people could go with 64-bit Vista, but my impression (perhaps not correct) is that this STILL isn't as functional (drivers, application/game compatibility, etc.) as the 32-bit version.
  • yyrkoon - Thursday, September 25, 2008 - link

    The only time you *may* have to worry about dual cards encroaching on 2GB of RAM is *if* you use two 1GB cards as singles for a 4-head display. I am fairly certain (but have no hands-on experience) that SLI and Crossfire are both viewed by Windows as a single piece of hardware, at least as far as the memory is concerned. Again, fairly certain but not positive: if you had 2GB of RAM, and somehow your card(s) had 2GB of RAM visible to the OS, you would not have to worry until you went over 2GB of system RAM. Beyond that, your system having less than 2GB available for processes and applications could become a problem.

    This is a reason why you will see system vendors selling systems with 3GB of RAM paired with video cards that have 256MB of RAM or better. Depending on the BIOS and the motherboard's ability to 'hoist' memory, with 4GB of physical system RAM installed you typically 'lose' at least 500MB to system resources (which include stuff like I/O ports, the graphics card's 'aperture', etc.). Just as an example, I have two motherboards from the same company here: one is an nVidia AM2 board, and the other is an Intel P35/ICH9R board. The AM2 board can report 3.5GB available to the user in Windows, while the Intel chipset board can only give the user 3.25GB. Both used the exact same additional hardware (NV 7600GT 256MB, and an Intel GbE Pro PCIe card).

    All the above said, I tend to view 'upgrading' a system to 3GB of system RAM as retrograding, with no thought given to future upgradeability. My thinking here is that you may *someday* want to upgrade to 8GB of RAM, and while it *is* possible to keep existing 1x1GB sticks and buy single 2GB sticks, it is better to buy matched pairs per channel for use with dual-channel memory, even if you do not plan on getting into 64-bit operating systems soon. Also, that extra 256-512MB of RAM can make a noticeable difference at times.

    Another thing to think about is the 2GB vs 4GB 'issue'. A lot of people seem to think that if a game can only address up to 2GB of memory (and most games are in this category), it makes little to no sense to upgrade to more memory. This is false. The reason is very similar to why a dual-core CPU at the same clock rate as a single-core CPU will perform better: availability of system resources, and competition for said resources between the OS and other applications. I.e., if you have an additional 1.5GB of RAM, your game can use up to 2GB, while the OS can do whatever it wants with the additional 1.5GB before going to the swap disk. Windows hitting the swap disk can be very bad for performance, especially if running applications and the swap file are on the same hard drive. Granted, this performance gain from increasing system memory *is* noticeable, but it is not as good as 'doubling' your GPU power. So it really depends on what you plan on doing with the system, and on other constraints, to know which makes more sense for you personally.

    As another example: Photoshop can and will use up to 2GB of system memory without the /3GB boot switch, but can also use memory above 2GB as swap (scratch disk), up to another 2GB I believe. This is before it starts hitting the hard disk(s). For casual Photoshop users this may not be a huge deal, but for professional image retouchers who work in 16-bit color depth (or greater) at very high resolutions, it can make a very big difference, especially when using math-intensive filters.

    Anyway, if you're interested in learning more about the limitations of 32-bit Windows, you can always visit Microsoft's website and start by searching for '/PAE'. You can also dig through their pages (although some of it will take some time to find) and find every last detail on the matter.
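    To put rough numbers on the 'lost' memory above, the arithmetic can be sketched in a few lines. The MMIO reservation sizes here are illustrative guesses chosen to match the two boards mentioned, not measured values:

```python
# Sketch of 32-bit physical address space arithmetic (all values in MB).
# The reserved_mmio_mb figures below are illustrative assumptions only.
ADDRESS_SPACE_MB = 4096  # 2^32 bytes = 4GB of addressable space

def usable_ram(installed_mb, reserved_mmio_mb):
    """RAM the OS can report once MMIO (video aperture, I/O ports,
    etc.) has been carved out of the top of the 4GB address space."""
    return min(installed_mb, ADDRESS_SPACE_MB - reserved_mmio_mb)

print(usable_ram(4096, 512))  # 3584 MB = 3.5GB, like the AM2 board
print(usable_ram(4096, 768))  # 3328 MB = 3.25GB, like the Intel board
print(usable_ram(3072, 768))  # 3072 MB: 3GB installed is unaffected
```

    Without memory hoisting (remapping RAM above the 4GB boundary), whatever falls under the reserved ranges is simply unusable on a 32-bit OS.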
  • Ezareth - Thursday, September 25, 2008 - link

    My gaming rig is running good ole 32-bit XP with 2 gigabytes of system memory. I have NEVER come close to using all 2 gigabytes in any gaming, or while using Photoshop, surfing the web, etc.

    My graphics card is a 4870X2, which, while it has 2GB of memory, only has 1GB addressable by the system. If I'm not mistaken, these dual-GPU cards are nice because the system only addresses a single card's memory while the graphics card itself uses the full amount. When I installed my 4870X2 I still showed the full 2GB of system memory, so I know this is the case, as my sound card and other internals take up around half a gig combined.

    I think if you crossfire on the mobo, the system has to address both cards' memory, but XP is much better at using memory than Vista. I still consider Vista a giant bloatware operating system; DirectX 10 is just not enough reason for me to switch. I'm waiting on Windows 7.
  • yyrkoon - Thursday, September 25, 2008 - link

    In XP x32, a single application can only use up to 2GB of memory without using the /3GB boot switch. The /3GB boot switch *can* cause system instabilities, AND the application has to be written to take advantage of this additional memory to begin with.

    Now on to your situation: as far as I know, you could have up to 3GB of memory installed on your system and not be affected, memory-wise, by your video cards. Right now the biggest consumer-grade card I know of only has 1GB of memory, so the 4GB 32-bit limitation is satisfied. If however there were a consumer-grade card with 2GB addressable... I am not 100% sure either way how this would affect things, but I am pretty sure you would lose some or all of that extra 1GB of system RAM.

    My knowledge of how SLI works (at a low level) is very limited, but for all intents and purposes, the reason why 2x 1GB cards would never affect system memory up to 3GB is that the OS sees these cards as a single piece of hardware (as far as I know).
  • Ezareth - Thursday, September 25, 2008 - link

    Oh, I also have to note that I use a Samsung 305T (30 inch) monitor and can play Crysis on High settings at playable framerates using DirectX 9 and XP with this card, and I'm using a severely outdated Athlon 64 3700+ processor. I just don't understand how you guys are getting such low framerates at 2560x1600, unless it is Vista and DirectX 10.
  • The0ne - Thursday, September 25, 2008 - link

    2 gigs is OK for normal usage, but if you go from being a medium user to a power user, then 2 gigs is not enough. Your Photoshop use probably deals with small and simple files, for example. I don't play enough games to comment on Crysis, but from what I've read it's a memory hog.
  • piroroadkill - Thursday, September 25, 2008 - link

    2GB is fine for almost all usage, and I have 4 monitors on my main rig. I do see the need for more RAM though: once you start taking the piss by, for example, running two copies of WoW @ 1680x1050 at max settings on the same rig, as well as IRC, DC++, a music player, a web browser, etc. etc., it's swapping time.
  • devonz - Thursday, September 25, 2008 - link

    Oops, I didn't read the last comment regarding memory usage. Still, it seems like I'm correct and the answer to my question is yes. That's a BIG drawback for doing CF with these cards then.
  • abzillah - Thursday, September 25, 2008 - link

    Well, your perception is wrong. I haven't had any issues with 64-bit Vista. Maybe you should try 64-bit Vista and refresh your perception of it.
  • smilingcrow - Thursday, September 25, 2008 - link

    I have Vista x64 installed and it supports 99% of the features that are important to me in my everyday XP SP3 install. The problem is that the 1% of features that are missing are more important to me than the improvements that Vista brings so I’m sticking with XP for now.

    Missing features: ATI Remote Wonder (only works with Media Centre), DVBViewer (doesn't remember playback position for videos), iTunes etc. (the multimedia keyboard keys don't work when the application doesn't have focus).
  • DerekWilson - Thursday, September 25, 2008 - link

    there is no longer any reason to use 32bit vista over 64bit vista ... the initial problems have been resolved, and the apps that have 64bit specific issues are few and far between. if you use one of those apps, sure that's an issue, but in general 64 bit vista is the way to go.

    in fact, we'd recommend 64bit over 32bit every time for the gamer. we do all of our game testing with 64bit as well, and we haven't had any issues for a very long time that were related to the 64bit operating system.
  • yyrkoon - Thursday, September 25, 2008 - link

    "While the 4870 512MB part can be had for $20 or $30 cheaper than the 4870 1GB (if you shop around), many of the 512MB variants are still priced in the same range as the 1GB cards. There is no reason to buy the 512MB part if prices are equal, so we hope to see a downward shift in price for the 512MB version."

    I can think of at least one reason why not to buy the 1GB version given the prices are the same.

    A person using a 32-bit version of Windows with 4GB of memory installed. More video RAM used = less system memory available for use in Windows. Granted, for some this may not be a HUGE reason, but for people like me who try to build the best all-around system for gaming, image editing, etc., this can make a difference (especially if you're not cranking the resolution up). Eventually I will be going to a 64-bit OS, so buying 2x2GB makes more sense than wasting money on a 1x1GB stick and losing that little extra performance from a dual-channel setup.

    Now here is to Windows 7 being released tomorrow ; )
  • DerekWilson - Thursday, September 25, 2008 - link

    you are right about the effect of that case, but i think, at least for gaming, because graphics is the largest bottleneck, you would want your graphics card to have the best advantage possible.

    in the general case there is no longer any reason to use 32bit vista. we only use 64bit for all our graphics testing.

    there may be a few isolated apps here and there that have issues, but the problem is nowhere near as widespread as it was the first year or so. for anyone who wants vista, we recommend 64 bit.

    i do get the continued attraction to windows XP though ... i'm not a huge fan of vista myself, and xp 64bit still isn't a great option.

    but if we're talking about 32bit XP then some of our benchmarks would not make any sense (DX10) and that might change things as well ... so really it's both more and less of an issue at the same time :-)
  • yyrkoon - Thursday, September 25, 2008 - link

    Well, in my case, I did not adopt early into Vista, and yes if I were to buy Vista right now, it would probably be Retail Ultimate. So, I would have a 32bit version, and a 64bit version. Anyways, it will probably be around the time SP2 comes out for Vista before I will consider buying it, and until then I will stick with what I have (which is XP Pro x32).

    About my Windows 7 comment, I was sort of serious in that I may even wait until it comes out before making a move, but I know it is not due out right now. The thing for me concerning Direct3D 10 is that most games right now are not using it properly, or to its full potential. That, and it really does not (right now) make games look all that much better, OR perform better than Direct3D 9 (which would be the main reason I would want 'DX10' to begin with).

    Also, just an FYI: if you purchase a 32bit copy of Vista, Microsoft will ship you a 64bit copy for the cost of shipping. Some versions of Vista may not retail as 64bit copies, but all versions except Home Basic (I believe) have their 64bit counterparts. At the very least, I am sure Business on up have 64bit versions.

    XP Pro x64 and Windows 2003 share a lot of architecture, as you may already know, so maybe this is why XP x64 does not work well for everyone. However, I have a customer that runs XP x64 on an Asus SLI board with SLI 7800GTXs, and aside from the occasional quirk, it seems to run very well. I actually had to reinstall the OS for him a few months back because he had somehow hosed his install, and all drivers (except one) installed very easily. I do not recall which driver gave me hassles, but it was just a PITA that lasted at most 30 minutes until I sorted it out. Most games, as I understand it, will work with XP x64 just fine, with a few that require workarounds if they do not already outright work.
  • BikeDude - Thursday, September 25, 2008 - link

    [quote]i'm not a huge fan of vista myself[/quote]

    You're a technically competent person. If you disable SuperFetch, the glass look, UAC, and a handful of other features, you will have a very XP-like experience, except with proper DX10 support and better 64-bit drivers.


  • JarredWalton - Thursday, September 25, 2008 - link

    Actually, SuperFetch is one of the features I like (for the most part - the increased hard drive noise at idle can be irritating). What irks me is stuff like the modified dialogs. It now takes a couple more clicks and opened windows to get to screen savers, resolution, network settings, etc. I can live with all that, of course, so I'm using Vista on virtually all of my PCs. (I've still got an XP laptop I use for a few things, though.)
  • Spoelie - Thursday, September 25, 2008 - link

    The difference between 3.5GB and 3.0GB will hardly be noticeable. If it is, then installing the 32-bit version for a new build is not the smartest thing to do.
  • ilkhan - Thursday, September 25, 2008 - link

    So you are saying that doubling the RAM (without any speed improvements) on the 4870 gives a bigger increase than increasing the raw power on the GTX 260?

    I like the accessibility of your benchmarks, but sometimes they are just really hard to trust.
  • JarredWalton - Thursday, September 25, 2008 - link

    Yes, that's what we're saying. The Core 216 has 12.5% more computation power, but it has the same amount of memory and the same memory bandwidth. It appears we are now at the point where at 1920x1200 and especially 2560x1600, 1GB GPUs see a decent performance boost over 512MB cards. It's not guaranteed, but personally I think we'll see more titles using larger textures in the future - or at least giving users the option for higher res textures.

    But of course we still have the platform question to a certain degree. If you want SLI or CrossFire down the road, you need to consider the choice between NVIDIA or Intel/ATI chipsets (until X58 at least). I give Intel the advantage on chipset, but personally I think SLI is better overall than CrossFire.

    Power is something else to consider; there's really no reason the 4870 can't idle at 4850 levels; that a single 4870 at idle uses more power than 4850 CF is pretty shocking - to me at least. Load power is almost tied, but a 45W idle power increase is really lousy. Sure, that only works out to $30 to $40 per year running 24/7, but that's also heat, which generally means more noise and wear and tear.
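    (That dollar range is straightforward arithmetic; a quick sketch, assuming electricity rates of $0.08-0.10/kWh, which are my assumption rather than a figure from the article:)

```python
# Yearly cost of a 45W idle-power difference, running 24/7.
# The $0.08-0.10/kWh rates are assumed, not from the article.
delta_watts = 45
kwh_per_year = delta_watts * 24 * 365 / 1000  # 394.2 kWh

for rate in (0.08, 0.10):
    print(f"${kwh_per_year * rate:.2f}/year at ${rate:.2f}/kWh")
# -> $31.54/year at $0.08/kWh
# -> $39.42/year at $0.10/kWh
```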

    My personal opinion differs somewhat from Derek here. If you want maximum performance for the dollar, the HD 4870 1GB appears the best choice. If you're looking at the entire package, I'm more inclined to go with the $260 GTX 260 for the above reasons (power being a major one). If you're then looking at SLI or CrossFire, while I would still take the X38/X48 over the 780i/790i, and the NVIDIA chipsets are notoriously power hungry, it's close but I'd still go with GTX 260. Since I'm already quite happy with my X38 board, however, I don't think I'll be upgrading my GPU to the NVIDIA camp for a while yet.
  • Finally - Thursday, September 25, 2008 - link

    Yes, please! Check on that. A German site has dug into the undervolting story and found that a properly undervolted HD4870 uses less power at idle than an HD4850!

    If you are ready to flash, you have the lower-power card right there.
  • JarredWalton - Thursday, September 25, 2008 - link

    I would hope that someday soon AMD will address this with drivers or something... but seriously, they dropped the ball here. I mean, the 4850 and 4870 are the same GPU, so the only difference is clock speed and voltages. You can't expect me to believe that in this day and age they can't get clock and voltage adjustments to work on the fly. A BIOS flash can work, it seems, but that just begs the question: why wasn't the BIOS programmed "properly" in the first place? (Possibly they discovered in testing that there were problems with the different voltages?) Users should *NOT* have to flash a GPU BIOS for stuff like proper power saving.
  • Finally - Thursday, September 25, 2008 - link

    Hmm. You are the test-guy. You should know (and tell us, please! :p)

    @HD4870 vs. HD4850: You forgot one thing: the HD4870 has GDDR5, but the HD4850 has GDDR3. As has been proven, this makes a big difference, so you can't say they are the same. GDDR5 seems to be much more undervolting- and power-saving-friendly.
  • Spoelie - Thursday, September 25, 2008 - link

    Could you check with ATi if powerplay (down throttling clockspeed *and* voltage) is in the pipeline for a future driver release?

    I've been hearing forum voices saying "it's in the next release" for quite some time now.

  • Jedi2155 - Thursday, September 25, 2008 - link

    At higher resolutions, I think this is a reality.

    I think the article is very truthful, and quite a few other sites have come to back this up.

    There is a problem with the frame buffer at higher resolutions and settings, especially if you understand how anti-aliasing, among other things, works.

    Use RivaTuner to check the memory usage on the frame buffer yourself at those resolutions....
  • NullSubroutine - Thursday, September 25, 2008 - link

    I don't have a problem with them using 8.7 for the 4870; as far as I've heard it's a great driver for that card. But that is a horrible driver to use for the 4870 X2. While it wasn't the card being looked at, it can skew the results if you are trying to decide which card to get.
  • Tiamat - Thursday, September 25, 2008 - link

    Page 2:

    512MB -> 1024MB is a 100% improvement (i.e. double the ram) not 50% improvement. 50% improvement would have been to 768MB ram.
  • DerekWilson - Thursday, September 25, 2008 - link

    heh ... you are quite right. sorry about that. i'll fix this.
  • Spoelie - Thursday, September 25, 2008 - link

    Power consumption: "Significantly" more in *both* idle and load?

    idle yes, load no
  • DerekWilson - Thursday, September 25, 2008 - link

    by significant i mean the difference is not negligible
  • Diosjenin - Thursday, September 25, 2008 - link

    I was rather under the impression that the 1GB-per-GPU/2GB total RAM on the 4870 X2 was generally the reason it was found, in many cases, to scale better than two 4870s, since the latter option included only 512MB per GPU/1GB total RAM.

    Now that we have 4870s with 1GB RAM, can you stick two of them together and do a 2x 4870 1GB vs 4870 X2 comparison to see how that can affect the scaling disparities we've seen there before?
  • carmaster22 - Thursday, September 25, 2008 - link

    How come you don't include the NVIDIA 9800 series cards anymore?

    They were proven to perform as well as or better than the GTX series, and many people have them.
  • SiliconDoc - Friday, October 03, 2008 - link

    It's to make it as confusing as possible to us, the consumers. No matter WHAT review site I go to, they are absolutely CERTAIN to leave out a couple of KEY cards in the reviews, so that it's nearly IMPOSSIBLE to make a reasonable decision without endless HOURS of finding, comparing, and checking the system stats of various reviews....
    It's like corporate code - but what really happens is the goobers are thinking, thinking, thinking - and they think so much and so !bleepin! hard that they come up with some cool points and interesting facts - but alas - you still don't know what you want to know.
    Whatever, it's so frustrating - I'm sending $100 by PayPal to the first author that actually delivers a good lineup in a review.
  • Goty - Thursday, September 25, 2008 - link

    The only 9800 series card that could outperform the 4870 was the 9800GX2, and even that fell behind the 4870 and the GTX200s when you started cranking up the resolution and IQ.
  • daniyarm - Thursday, September 25, 2008 - link

    8800gt SLI beat the 4750 in several tests even at high res. People that own a single or SLI 8800 or 9800 want to know how the cards compare. What's the point of a review that compares only new-gen cards and gives absolutely no information for people that want to know whether they need to upgrade or not?
  • daniyarm - Thursday, September 25, 2008 - link

    I meant 4870.
  • Patrick Wolf - Thursday, September 25, 2008 - link

    Where do you get your info?

    The 9800 GX2 can be had for under $300 and is also very comparable to the 8800gt SLI. If you have an SLI mainboard and an 8800gt, a very cheap and viable upgrade would be to throw in another 8800gt. If you lack the SLI mainboard, bite on the GX2 and sell your current card.

    Same goes for the 9800 GTX and GTX+ if you're going SLI with them.

    The following graphs speak for themselves. All the above solutions are still great contenders, worthy of inclusion.

  • Patrick Wolf - Thursday, September 25, 2008 - link

  • JarredWalton - Thursday, September 25, 2008 - link

    Considering you just linked a recent review that has all of the pertinent information, and throwing in the cluttered nature of those charts, I'm of the opinion that dropping most of the cards and keeping just the more recent stuff makes a lot of sense.

    FYI, outside of a few games (The Witcher 2xAA, ETQW 4xAA, COD4 4xAA) the 9800 GTX+ is very close to the performance of the GTX 260. That's understandable, since they have similar architectures. Here's the theoretical performance overview:

    GTX 260:
    192 SPs at 1242MHz = 715.4 GFLOPS
    Core clock of 576 MHz = 36.9 GT/s texture fillrate
    28 ROPs at 576 MHz = 16.1 GP/s pixel fillrate
    448-bit RAM at 999MHz = 111.9 GB/s

    9800 GTX+:
    128 SPs at 1836MHz = 940 GFLOPS
    Core clock of 738 MHz = 47.2 GT/s texture fillrate
    16 ROPs at 738 MHz = 11.8 GP/s pixel fillrate
    256-bit RAM at 1100MHz = 70.4 GB/s

    So the GTX 260 has substantially more bandwidth (59%) and pixel fillrate (36%), while the 9800 GTX+ has more theoretical GFLOPS (31%) and texture processing power (28%). The GTX 260 ends up faster overall - I'm not sure it ever trails - but there are many games where the difference between the two is only about 10%. The 8800 GT, for the record, is usually 65 to 75% of the performance of GTX 260.
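    For anyone who wants to check those figures, the fillrate and bandwidth numbers fall out of simple multiplications. A sketch follows; the 64-TMU count for each GPU is my assumption (it is not stated above), and GFLOPS is left out since it depends on how the dual-issue MUL is counted:

```python
# Theoretical throughput from clocks and unit counts.
# TMU counts (64 for both the GTX 260 and 9800 GTX+) are assumed here.
def theoretical(rops, tmus, core_mhz, bus_bits, mem_mhz):
    texture_gts = tmus * core_mhz / 1000                 # GTexels/s
    pixel_gps = rops * core_mhz / 1000                   # GPixels/s
    bandwidth_gbs = (bus_bits / 8) * mem_mhz * 2 / 1000  # GDDR3: 2 transfers/clock
    return texture_gts, pixel_gps, bandwidth_gbs

print(theoretical(28, 64, 576, 448, 999))   # GTX 260:   ~(36.9, 16.1, 111.9)
print(theoretical(16, 64, 738, 256, 1100))  # 9800 GTX+: ~(47.2, 11.8, 70.4)
```

    The 59% bandwidth and 36% pixel-fillrate advantages quoted above are just the ratios of these numbers.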
  • Spoelie - Thursday, September 25, 2008 - link

    The 9800 GT is a rebadged 8800 GT, which sits below any ATI 4-series card.
    The 9800 GTX+ is competitive with the 4850, but that's a lower price/market segment (target resolution 1600x1200/1680x1050?).

    This is an article about cards that run newer games comfortably at 1920x1200 and higher, and, well, the 9 series just doesn't make that cut anymore. Anandtech included all possible contenders except maybe some SLI configurations.
  • Jovec - Thursday, September 25, 2008 - link

    AT is wrong about what many, if not most, of us want in these reviews. Simply, we have our specific systems and want to know whether a single upgraded part (or parts, when it comes to new platforms) justifies the purchase price. Obviously impractical, but there is no reason AT can't build and keep 1-2 systems per year, over a 2-3 year moving window, of the most common mid-range builds and include those benchmarks as a reference in all reviews.

    The best example of this is the 8800GTS which I assume many of us still own. We don't care how the 4870 runs on the Intel Core 2 Extreme QX9770 @ 3.20GHz used in the review, we care if the card will boost our FPS enough to warrant purchasing for our computer. It's a different type of comparison than the GPU-limited and CPU-limited tests they currently run, but very useful to the majority of us.

    We can say "My system is very close to the Fall '07 system, so if I buy this new card I'll get a similar performance increase."

  • poohbear - Sunday, September 28, 2008 - link

    "The best example of this is the 8800GTS which I assume many of us still own. We don't care how the 4870 runs on the Intel Core 2 Extreme QX9770 @ 3.20GHz used in the review, we care if the card will boost our FPS enough to warrant purchasing for our computer. It's a different type of comparison than the GPU-limited and CPU-limited tests they currently run, but very useful to the majority of us."


    Please, Anandtech, show us graphs that tell us whether it's worth it to upgrade from an 8800 GT or similar hardware; that's what the majority of us own and are interested in knowing. This 1GB 4870 is interesting and all, but why would you do an article like this before addressing the aforementioned question that is pertinent to 90% of the rest of us?
  • PrinceGaz - Friday, September 26, 2008 - link

    I must concur with this guy.

    Whilst I am more than capable of internally extrapolating other test results to these latest ones, I have an 8800 GTS (640MB), so with all reviews on AT I have to do a double comparison against the closest card in the review to what I have.

    Please, Anand & co, include at least the high-end cards from NVIDIA and AMD from two generations ago in all graphics-card reviews, and mid-range cards from the previous generation where relevant. We don't all buy a new card every few months; most people will buy a new graphics card every two years or less often, even quite a few of us who visit this site every day.

    When I only see comparisons to other current cards, as in this review, I invariably stop reading and find a review on another site that includes a wider range of cards. That's a shame, as I enjoy the technical info here, but if the only comparison provided is between current-generation competitors, neither of which I have, then all the graphs are pointless to me (and to most other people with previous-generation cards).
  • SiliconDoc - Friday, October 03, 2008 - link

    That would be informative, and then we'd be informed. In this case, you must be a $$$$$$$$$$$$$$$$$$$ loaded supergame monster - and so it doesn't matter.
    Just crank up another 4 or 5 hundred on the collegiate Mastercard; if you blow the first couple of grand, that can be the geekfest LAN machine or whatever.
  • saiku - Wednesday, October 01, 2008 - link

    I must also agree with the two posters above me. I too have an 8800 GTS and would like to know if this card makes sense as an upgrade. Can AT please keep 1-2 year old tech around when they talk about new parts?
  • formulav8 - Saturday, September 27, 2008 - link

    Anandtech finally gave AMD the props they deserved a long time ago, instead of kissing up to NVIDIA, who had no care in the world about charging almost literally an arm and a leg. Not that AMD/ATI cares all that much, but at least they didn't milk us for all we're worth like NVIDIA did the past couple of years. Just my opinion of course :)

    Decent review, by the way. I do agree, though, that reviews should be more real-world in the sense that cards be tested with more reasonable, mainstream components like a Phenom, a higher-end X2, and a low-to-mid-end Core 2, with memory configs like 2GB at 667/800 MHz, etc., since that is where most gamers will be at. Again, just my opinion :)

  • SiliconDoc - Saturday, December 27, 2008 - link

    How about mainstream monitor resolutions?
    Oh that's right, the high-end cards blow away mainstream monitor resolutions now.
    Well, I guess they had better start including the 2 grand required for proper monitor upgrades to run the resolutions they post all their reviews in now.
    Somehow that giant 30 inch Dell WFP (or two or three) at 2560x1600 sitting in the test lab isn't magically appearing attached to their readers' gaming rigs.
  • Jorgisven - Monday, September 29, 2008 - link

    The trouble with using mid-range components is that they can unnecessarily bottleneck performance in ways that affect different GPUs differently in different situations. If you scale back your components and then put in a $500 GPU, the results would be skewed, especially at higher detail settings. Since this is a higher-end GPU, one would expect components relative in price as well.
  • SiliconDoc - Friday, October 03, 2008 - link

    That's not the worst of it, Jordan...
    How often do you run a 2550x1900 or whatever supermega massive giganto resolution?
    I guess I'll have to take a trip to Taiwan for my 37" super rezz megoo mighty mouse monitor... that's only $2,567.98 plus tax and overseas shipping to even run the ding dang rezzzzzzzzzzzzzes they test in nowadays.
    Yeah, gee, we've all got these monitors that run this extraordinary resolution that you CANNOT EVEN FIND in the retail store market - and barely online when you try.
    (A friend of mine can't stop cranking up his rezzz - he just got 2 more monitors last night - so he cranks the thing up till the monitors are buzzing and hissing LOL - then I send him a pic or a screenshot or something - and say blah blah - and he can't read the thing - the lettering is the size of the ding dang microdot.)
    .000000035 um lettering. lol
    He doesn't READ much. rofl.
  • Patrick Wolf - Thursday, September 25, 2008 - link

    While you can go back to previous articles which have basically the same Test Setup with the same games, it would be very convenient if they would include cards from the 9 series in their newer articles.
  • Hxx - Thursday, September 25, 2008 - link

    Why should they be compared? 9800s are old tech, not in high demand anymore. That's why.

    About the article, the writer did a very good job. I'm surprised to see the 4870 coming out in front of the GTX 280 in some games. The 4870 1GB is the best card from a price/performance perspective.
    Looks like NVIDIA is due for another price drop, lol. Good job, ATI.
  • ZoZo - Friday, September 26, 2008 - link

    Comments like yours are exactly why NVIDIA decided to rename the GeForce 9 line to GT 100.
  • Griswold - Friday, September 26, 2008 - link

    So, you claim that the rebadged GF8 aka GF9 is the same as the two GTX models they have at the top now? Go take a nap...
  • homerdog - Monday, September 29, 2008 - link

    No, NVIDIA is supposedly releasing some G92b-based cards under the moniker of "GT 100-series".
  • SiliconDoc - Friday, October 03, 2008 - link

    Oh, forgot the stupid Google link.
  • SiliconDoc - Friday, October 03, 2008 - link

    Gosh, I only need $1,549.99 to get me my 2560x1600 monitor - yeah, wouldn't I spend an extra 20 bucks on that 1GB video card...
    Yah buddy, makes sense to me - I'm a hick from hickville - gonna gets me that corporate CAD monitor real soon now.
    (good gawd)
    No worries, I love Anandtech - it's just we don't get all the superfreebies - so ya know the 2 grand monitor is kinda wife-repelling.
  • fcx56 - Tuesday, December 16, 2008 - link

    Ha, consider yourself lucky! I paid $3K for mine back in 2004, WITH the student discount.
  • SiliconDoc - Saturday, December 27, 2008 - link

    Yes - you, and almost no one else, is the point. BTW, keep that student grant money spending thing on the low low. :-)
    Oh, yes of course, Chancellor, my education would have suffered absolutely immeasurable harm without my 2560x1600 gaming... err.. uhh... ahh.. ummm. I meant CAD workshop / artistic design monitor. I thank you, Sir, and the taxpayers, for your concern.
  • harbin - Thursday, September 25, 2008 - link

    But I am no gamer, so I really don't know.
