Final Words

So here's the deal. We can find the GTX 280 for about $340 without looking very hard (it can actually be had right now for $325 at Newegg before mail-in rebate, but we'll give the GTX 285 the benefit of the doubt). Compared to the $380 we can grab the new GeForce GTX 285 for, that's nearly 12% more money for only about 10% more performance. There are more aggressively overclocked parts out there, of course, but they tend to cost a bit more as well. Decreasing value with increasing performance is something we often see, but it's not something we like. And if you don't mind mail-in rebates, the GTX 280 can be had for $300.

The real benefit to the consumer here looks to be the unloading of GTX 280 hardware at prices that put it into better competition with the Radeon 4870 1GB. The 4870 1GB is still quite a bit cheaper, of course, but at only 20% more expensive the GTX 280 starts to get a little more attractive, since much of the time its performance advantage is larger than that. There are exceptions, of course.

It is a little more difficult to compare the GeForce GTX 285 to AMD hardware because of the price point. AMD doesn't have a card that hits the $400 mark (without mail-in rebates, that is: the 4870 X2 can hit $400 after rebates). At about $50 more, as we've noted, the 4870 X2 is just over 13% higher in price. Even in games that don't favor AMD's architecture, the 4870 X2 typically leads the GeForce GTX 285 by more than that, often by about 18% at 2560x1600. This indicates that even at the higher price, value (price/performance) is higher with the 4870 X2.

In spite of the potential advantages offered by the Radeon 4870 X2, we have qualms about recommending it based on our experiences since October, with the introduction of Core i7 and X58 and the multitude of software titles released since then. Driver support just isn't where it needs to be for us to really get behind an AMD single-card dual-GPU solution right now. The issue is less about what's out now and more about support for titles as they come out and fast responses to issues, which AMD hasn't been providing. The 8.12 hotfix (which is listed as necessary only for 4850 CrossFire) has actually improved stability and performance on all the single- and dual-GPU setups we've tested on Core i7. We haven't finished putting it through its paces, but so far it is a real step in the right direction. Unfortunately, it will be months before we see this hotfix rolled into a WHQL driver. We definitely recommend it to anyone using AMD hardware on Vista x64 with a Core i7 platform.

In summary, despite its typical 10% performance advantage, the GeForce GTX 285 offers less price/performance than the GTX 280. The closest price competitor to the GTX 285, the Radeon HD 4870 X2, also offers better value, but at a higher price. At the same time, we have reservations about putting our weight behind the 4870 X2 with the driver issues we've experienced lately.

Comments

  • nyran125 - Friday, January 16, 2009 - link

    even the 8800GTS 512MB runs all these games with a decent frame rate of 30-80 fps, no point in upgrading at the moment till the next big graphical enhancement or graphical revolution in games, like the next big Crysis or Oblivion-type enhancement. Till then, it's a complete waste of money buying a newer card if you already have an 8800. Because you know that when the next big graphics game comes out, like a new Elder Scrolls (Oblivion) or something, the newest card out at that time won't even be enough to run it till 6 months to a year after the game is out. Sometimes it takes even longer for the graphics cards to catch up to the games on maximum settings. So stop wasting your money lol.
  • Captain828 - Friday, January 16, 2009 - link

    so you're telling me you're getting >30FPS when playing Crysis @ Very High, 1680x1050 + 2xAA + 4xAF ??!
  • ryedizzel - Thursday, January 15, 2009 - link

    Not sure if anyone has said this yet but why the heck are you guys running all your benchmarks at 2560x1600? I mean seriously, how many people are really using monitors that big? AT THE LEAST please show some benchmarks for 1680x1050!
  • Iketh - Friday, January 16, 2009 - link

    lol they are there... are you unable to read a line chart?
  • JarredWalton - Friday, January 16, 2009 - link

    You don't buy parts like the GTX 285/295 or 4870X2 to play at 1680x1050 for the most part. In fact, 2560x1600 and 30" LCDs is the primary reason that I bought a 4870X2 around the time Fallout 3 came out. You can see that even at higher resolutions, there are several titles that are somewhat system limited (i.e. the GPU is so powerful that the benchmark isn't fully stressing the GPU subsystem).
  • MadMan007 - Friday, January 16, 2009 - link

    That's certainly true, and I think we understand why the charts are for the highest resolution; it's also nice that data is provided for lower resolutions. Aside from making a graph for each resolution, perhaps it would be possible to make them interactive somehow... say I click on 1920x1200 below the graph, and that data is charted. What would be really top notch is if I could choose which cards and which resolutions to compare.
  • GhandiInstinct - Friday, January 16, 2009 - link

    MadMan,

    I only wish they did that. Then their reviews would be my #1 source.
  • Stonedofmoo - Friday, January 16, 2009 - link

    But that's just the point though. Most people are still running 22" monitors at 1680x1050 res. We don't NEED top end powerful cards that Nvidia and ATI seem only interested in building.

    What we're looking for are upper midrange parts like a hypothetical GTX 240/220 if they were to exist to replace the aging and now redundant Geforce 9 series.

    Seriously:
    ATI have more midrange parts than Nvidia but really need to work on their rubbish power consumption, especially at idle.
    Nvidia need to actually have some midrange parts, but they have the power consumption sorted.

    Both need to refocus. I've never seen Nvidia go for so long without releasing a completely new series of cards from top to bottom end.
  • SiliconDoc - Monday, January 19, 2009 - link

    Well, more cards are always better and keep us entertained and interested, but this continuous call for "midrange" from Nvidia is somewhat puzzling to me.
    Since the 9800 series takes on the 4850, then there's the 9600 on down and the 8800s -- I mean, it's all covered...
    ATI just released the 4830, their newest core crippled to take on the 9800 GT, so they claim...
    I guess if I were Nvidia I wouldn't waste my company's time or money on people wanting to read "new technology reviews" based upon cards that REFILL an already filled space where the competition, after two years of near NADA, just recently brought something to market.
    Since ATI was out of it for so long, why should Nvidia retool the GT200 core to STOMP all over their own 9800, 8800, 9600 and the like?
    You want them to destroy their own line so you can say "Hey, new tech in the midrange -- now I can decide if I want a 4830 or 4670 or 4850 or one of these crippled GT200s" -- then moments later you'll say to yourself, "Wait a minute, why should I get rid of my 9800 GTX?!"...
    I mean REALLY...
    Would someone please explain to me what I'm missing?
    It is all "I want a crippled cheap GT200 core", isn't it?
    Maybe part of it is: why are we still here when the 8800 was released in Nov 2006?
    Maybe the question should be why ATI is still taking on two-year-old Nvidia tech.
    AMD just took another huge charge/loss from its ATI division, and I'm not certain Nvidia is doing much better (though GPU-Z shows 65% of the market is Nvidia's) -- so why would Nvidia do an expensive die/core rollout that crushes their already standing cores that compete with ATI's midrange just fine?
    It just does not make any sense.