Final Words

So here's the deal. We can find the GTX 280 for about $340 without looking very hard (it can actually be had right now for $325 at Newegg before mail-in rebate, but we'll give the GTX 285 the benefit of the doubt). Compared to the $380 the new GeForce GTX 285 sells for, that's over 11% more money for only about a 10% performance improvement. There are more aggressively overclocked parts out there, of course, but they tend to cost a bit more as well. Decreasing value with increasing performance is something we see often, but it's not something we like. And if you don't mind mail-in rebates, the GTX 280 can be had for $300.
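If you want to sanity-check that math yourself, here is a minimal back-of-the-envelope sketch in Python. It uses the street prices quoted above and treats the roughly 10% performance gap as a single average figure; both are assumptions that will shift with rebates and with the games you actually play.

# Back-of-the-envelope value comparison: GeForce GTX 285 vs. GTX 280.
# Prices and the ~10% average performance delta are the figures quoted
# above; treat them as rough assumptions, not fixed numbers.
gtx280_price = 340.0   # typical GTX 280 street price, before mail-in rebate
gtx285_price = 380.0   # typical GTX 285 street price at launch
perf_gain = 0.10       # GTX 285 averages roughly 10% faster

price_premium = gtx285_price / gtx280_price - 1.0
print(f"price premium: {price_premium:.1%}")        # ~11.8% more money
print(f"performance premium: {perf_gain:.1%}")      # ~10% more speed

# performance per dollar, with the GTX 280 normalized to 1.00
relative_value = (1.0 + perf_gain) / (gtx285_price / gtx280_price)
print(f"GTX 285 perf/dollar vs. GTX 280: {relative_value:.2f}")  # ~0.98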

It looks like the real benefit to the consumer here will be the unloading of GTX 280 hardware at prices that put it in better competition with the Radeon HD 4870 1GB. Of course, the 4870 1GB is still a good deal cheaper, but the GTX 280 starts to look a little more attractive at only 20% more expensive than the 4870 1GB, since much of the time its performance advantage is larger than that. There are exceptions, of course.

It is a little more difficult to compare the GeForce GTX 285 to AMD hardware because of the price point. AMD doesn't have a card that hits the $400 mark (without mail-in rebates, that is: the 4870 X2 can hit $400 after mail-ins). At about $50 more expensive, as we've noted, the 4870 X2 is just over 13% higher in price. Yet the 4870 X2, even in games that don't favor AMD's architecture, typically leads the GeForce GTX 285 by more than that, often by about 18% at 2560x1600. This indicates that even at the higher price, value (performance per dollar) is better with the 4870 X2.
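The same rough arithmetic applies to the AMD comparison. The sketch below assumes a ~$430 street price for the 4870 X2 (about $50 over the GTX 285) and the ~18% average lead at 2560x1600 cited above, so it is an approximation rather than a measured result:

# Rough value comparison: Radeon HD 4870 X2 vs. GeForce GTX 285.
# Prices and the ~18% performance lead are the approximate figures from the
# text above, not measured constants.
gtx285_price = 380.0
hd4870x2_price = 430.0   # roughly $50 more than the GTX 285
perf_lead = 0.18         # typical 4870 X2 lead at 2560x1600

price_premium = hd4870x2_price / gtx285_price - 1.0
relative_value = (1.0 + perf_lead) / (hd4870x2_price / gtx285_price)
print(f"4870 X2 price premium over GTX 285: {price_premium:.1%}")  # ~13.2%
print(f"4870 X2 perf/dollar vs. GTX 285:    {relative_value:.2f}") # ~1.04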

In spite of the potential advantages offered by the Radeon 4870 X2, we have qualms about recommending it based on our experiences since October with the introduction of Core i7 and X58 and the multitude of software titles released since then. Driver support just isn't what it needs to be for us to really get behind an AMD single-card, dual-GPU solution right now. The issue is less about what's out now and more about support for titles as they come out and fast responses to problems (which AMD hasn't been able to provide). The 8.12 hotfix (which is listed as only necessary for 4850 CrossFire) has actually improved stability and performance on all the single- and dual-GPU setups we've tested on Core i7. We haven't finished putting it through its paces, but so far it is a real step in the right direction. Unfortunately, it will be months before we see this hotfix rolled into a WHQL driver. We definitely recommend the hotfix to anyone using AMD hardware on Vista x64 with a Core i7 platform.

In summary, despite its typical 10% performance advantage, the GeForce GTX 285 offers slightly worse price/performance than the GTX 280. The GTX 285's closest price competitor, the Radeon HD 4870 X2, also offers better value, though at a higher price. At the same time, we have reservations about putting our weight behind the 4870 X2 given the driver issues we've experienced lately.

Comments (76)

  • Gorghor - Tuesday, January 27, 2009 - link

    Actually more of a rhetorical question than anything else. Poor sales and the lack of dual DVI support when using Hybrid Power mode are the reasons I've heard about. I still don't understand why they don't push these Hybrid technologies.

    I mean, in a day and age where everybody's talking about saving our planet, it just seems idiotic to push ever more power-hungry graphics cards eating up as much as a 600-liter marine aquarium. What a damned waste, not to mention the fact that electricity is far from cheap (any of you tried living in Germany?). The worst part is that the technology exists and works (both from ATI and Nvidia) in laptops, so it can't be all that complicated to make a decent version for the desktop. It's just that no one seems to care...

    Well I for one can't stand the idea of wasting power on an idle graphics card that could just as well be disabled when I'm not gaming (read: 99% of the time). And I wish more people would think the same.

    Sorry 'bout the rant, just makes me angry!
  • veryhumid - Monday, January 26, 2009 - link

    One thing I'd like to see is some older cards in the mix... just a couple. Maybe a GX2, 8800GT... recently popular cards. I'm curious to see how big the performance gap gets, and how quickly, over a fairly short period of time.
  • MadBoris - Monday, January 19, 2009 - link

    A couple things...

    Here we have a die shrink, and you're apparently more interested in whether you can save a few watts than in how high the OC can go?

    I think most people looking at this card would be more interested in whether it can do a 10% overclock than in 10% power savings.

    How high does it OC? Did I miss it somewhere?

    People aren't buying brute-force, top-end video cards for their power consumption, just like a Ferrari isn't purchased for its gas mileage. I'm not saying power consumption doesn't have a place, but it's a distant second to the performance a die shrink offers in stable OC potential.

    I also have to echo the statements that 2560x1600, as the standard chart, may need some rethinking. I get that those resolutions are where these cards shine and start leaving weaker cards behind. BUT it's a very small percentage of the readership that has 30" monitors. I'm at 24" with 1920, and even that is probably not common. It would seem to make the best sense to target primarily the most common resolutions people may actually be buying for, probably 1680 or 1920. Cheaper video cards do much better in comparison at smaller resolutions, which are the resolutions most of us are actually using. I get that the chart below shows the different resolutions, but that is where 2560 should be found; it shouldn't be the de facto standard. It reminds me of using 3DMark and how the numbers don't reflect reality: of course these cards look good at 2560, but that isn't what we have.
    ~My 2 cents.

  • SiliconDoc - Monday, January 19, 2009 - link

    Yes, the power savings freakism to save the whales, and the polar bears, is out of control. After all that, the dips scream get a 750 watt or 1000 watt, or you're in trouble, park that 400-500 you've been using - or the 300 in many cases no doubt. Talk about blubbering hogwash...and ATI loses in the power consumption war, too, and even that is covered up - gosh it couldn't be with that "new tech smaller core" .....
    Now of course when NVidia has a top-end card for high resolution, it's all CRAP - but we NEVER heard that whine with the 4870 - even though it only excelled at HIGHER and HIGHEST resolutions and AA/AF - so that's how they got into the RUT of 2560x - it showed off the 4870 in all its flavors for 6 months - and they can't help themselves doing it, otherwise the 4870 SUCKS, and SUCKS WIND BADLY... and gets spanked about quite often.
    The high-rez shows and reviews are for the red team bias win - and now suddenly, after six months of endless red raving for 2560x, all the eggheads realize they don't have that resolution sitting in front of them - because guess who - BLABBERED like mad about it.
    :-) YEP
    I was the only one to point it out on the 4870 X2 and the like - and boy was I crushed... gosh, what a fanboy...
    But now it's all the rage - so long as it's directed at NVidia.
    AND - the lower-resolution benchies make the ATI top cards look CRAPPY in comparison.
    Oh well, I'm sure the reds know it - maybe they love the possibility of higher fps if they ever pop for a 30".
    _______________________________________________________

    All I can say is thank you NVidia for finally straightening out the DERANGED MINDS of the 4870 loverboys.

    Thank you, Nvidia - you may have cured thousands this time.
  • hk6900 - Saturday, February 21, 2009 - link


    Die painfully okay? Prefearbly by getting crushed to death in a
    garbage compactor, by getting your face cut to ribbons with a
    pocketknife, your head cracked open with a baseball bat, your stomach
    sliced open and your entrails spilled out, and your eyeballs ripped
    out of their sockets. Fucking bitch

    I really hope that you get curb-stomped. It'd be hilarious to see you
    begging for help, and then someone stomps on the back of your head,
    leaving you to die in horrible, agonizing pain. *beep*

    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.

    You're dead if I ever meet you in real life, f ucker. I'll f ucking
    kill you.

    I would love to f ucking send your f ucking useless ass to the
    hospital in intensive care, fighting for your worthless life.

    http://www.youtube.com/watch?v=Po0j4ONZRGY

    I wish you a truly painful, bloody, gory, and agonizing death, *beep*
  • MadBoris - Monday, January 19, 2009 - link

    All the green and red aside... I didn't mean to bash AT or Derek.
    Just wondering what happened to the reviews of years back, when we had more than just one chart to gawk at and make determinations about a product. With more charts we could analyze more easily, or maybe there was a summary. The paper launch review was just that, and maybe this limited review is kind of a message to Nvidia not to paper launch, but... we lose.

    Even though this is just a refresh with a die shrink, I still think it's worth revisiting and restating some of what may be obvious to those who breathe GPUs, but which I don't remember from 6 months ago.

    Like... the whole landscape for today's purchasing decisions.
    Effects of CPU scaling, AA and AF penalties for a card, which resolutions make sense for a high-end card with a GB of memory.
    You don't have to test it and graph it, but give us a reminder, like: 4x AA is free with this card and not that one, or CPU scaling doesn't mean as much as it once did, or this card doesn't make sense below 1680x1050, or this amount of memory isn't really needed these days unless you are at 2560, etc. A reminder is good. I was surprised not to see a mention of the highest OC and what % of performance that gets you in an article about a die shrink, since things have changed. OC isn't the black art it once was; it's often supported right in the manufacturer's drivers, so it shouldn't be passed up. I always run my hardware at a comfortable OC (several notches back from the highest stable); why wouldn't I if it's rock solid?

    I like 1GB of memory, but there are only a few games that can break my 640MB GTS barrier (Oblivion with tweaks, Crysis with tweaks, maybe others I don't have). I'd like 1GB more if Win 7 weren't still supporting 32-bit, since then we could see devs start using bigger footprints. But with multiplatforming and one-size-fits-all, lowest-common-denominator console hardware as the cookie cutter, 1GB of memory means less than it probably ever has in the last couple of years, since games won't be utilizing it.
  • SiliconDoc - Tuesday, January 20, 2009 - link

    I like all your suggestions there. As for the reviewer (Derek), I can't imagine the various pressures and mess involved in putting together even one review, so I don't expect much all the time, and I don't expect a reviewer not to have a favorite - and even push for it in their pieces - consciously or otherwise. The reader needs to allow for and understand that.
    I think what we do get should be honestly interpreted and talked about (isn't that really the whole point, besides the interest and entertainment value), so I'm more about straightening out longstanding hogwash the overclocking masses repeat.
    You made some fine points; I hope Derek sees them. I just imagine they are crushed for time of late, and perhaps the economic cutbacks have occurred at AnandTech as well - but a bit of what you pointed out should be easy enough. Maybe they will roll it in.
  • hk6900 - Saturday, February 21, 2009 - link

    Get murdered, cunt
  • MadBoris - Monday, January 19, 2009 - link

    In all fairness, Derek does summarize some of what one can draw from the numbers, at least mainly the 2560 graphs and how the card compares to the 280 in price/performance. But as my subject says, you can't please everyone; I guess I would just like to see more. Maybe that's just me.
  • Muzzy - Saturday, January 17, 2009 - link

    The box said "HDMI", but I don't see any HDMI port on the card. What am I missing?
