POST A COMMENT

76 Comments

Back to Article

  • Gorghor - Tuesday, January 27, 2009 - link

    Actually more of a rhetorical question than anything else. Poor sales and the lack of dual-DVI support in Hybrid Power mode are the reasons I've heard about. I still don't understand why they don't push these Hybrid technologies.

    I mean, in a day and age where everybody's talking about saving our planet, it just seems idiotic to push ever more power-hungry graphics cards eating up as much as a 600 liter marine aquarium. What a damned waste, not to mention the fact that electricity is far from cheap (any of you tried living in Germany?). The worst part is that the technology exists and works (both from ATI and Nvidia) in laptops, so it can't be all that complicated to make a decent version for the desktop. It's just that no one seems to care...

    Well I for one can't stand the idea of wasting power on an idle graphics card that could just as well be disabled when I'm not gaming (read: 99% of the time). And I wish more people would think the same.

    Sorry 'bout the rant, just makes me angry!
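    For what it's worth, the waste is easy to ballpark. Here's a rough back-of-the-envelope sketch; all three inputs are assumed, illustrative values I made up, not measurements:

    ```python
    # Rough sketch of what an idle high-end GPU costs per year if a
    # hybrid-power scheme could have switched it off. All three inputs
    # are assumed, illustrative values, not measurements.

    IDLE_DRAW_W = 60      # assumed extra idle draw of the card, in watts
    HOURS_PER_DAY = 8     # assumed time the PC is on but not gaming
    PRICE_PER_KWH = 0.20  # assumed electricity price (German-ish), EUR/kWh

    kwh_per_year = IDLE_DRAW_W / 1000 * HOURS_PER_DAY * 365
    cost_per_year = kwh_per_year * PRICE_PER_KWH
    print(f"{kwh_per_year:.0f} kWh/year, about {cost_per_year:.2f} EUR/year")
    ```

    Even with these guessed numbers it's tens of euros a year per machine, for nothing.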
    Reply
  • veryhumid - Monday, January 26, 2009 - link

    One thing I'd like to see is some older cards in the mix... just a couple. Maybe a GX2, an 8800GT... recently popular cards. I'm curious to see how big the performance gap gets, and how fast, over a fairly short period of time. Reply
  • MadBoris - Monday, January 19, 2009 - link

    A couple things...

    Here we have a die shrink, and you're apparently more interested in whether you can save a few watts than in how high the OC can go?

    I think most people looking at this card would be more interested in whether it can do a 10% overclock than in 10% power savings.

    How high does it OC, did I miss it somewhere?

    People aren't buying brute-force top-end video cards for their power consumption, just like a Ferrari isn't purchased for its gas mileage. I'm not saying power consumption doesn't have a place, but it's a distant second to the performance that a die shrink offers in stable OC potential.

    I also have to echo the statements that 2560x1600 as the standard chart may need some rethinking. I get that those resolutions are where these cards shine and start leaving weaker cards behind. BUT it's a very small percentage of the readership that has 30" monitors. I'm at 24" with 1920, and even that is probably not common. It would seem to make the best sense to target the most common resolutions people may be tempted to purchase for - probably 1680 or 1920. Cheaper video cards do much better in comparison at smaller resolutions, which are the resolutions most of us are actually using. I get that the chart below shows the different resolutions, but that is where 2560 should be found; it shouldn't be the de facto standard. Reminds me of using 3DMark and how the numbers don't reflect reality - of course these cards look good at 2560, but that isn't what we have.
    ~My 2 cents.

    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    Yes, the power savings freakism to save the whales, and the polar bears, is out of control. After all that, the dips scream get a 750 watt or 1000 watt, or you're in trouble, park that 400-500 you've been using - or the 300 in many cases no doubt. Talk about blubbering hogwash...and ATI loses in the power consumption war, too, and even that is covered up - gosh it couldn't be with that "new tech smaller core" .....
    Now of course when NVidia has a top end card for high resolution, it's all CRAP - but we NEVER heard that whine with the 4870 - even though it only excelled at HIGHER and HIGHEST resolution and AA/AF - so that's how they got in the RUT of 2560x - it showed off the 4870 in all its flavors for 6 months - and they can't help themselves doing it, otherwise the 4870 SUCKS, and SUCKS WIND BADLY... and gets spanked about quite often.
    The high rez shows and reviews are for the red team bias win - and now suddenly, after six months of endless red raving for 2560x, all the eggheads realize they don't have that resolution sitting in front of them - because guess who - BLABBERED like mad about it.
    :-) YEP
    The only one to point it out on the 4870x2 and the like - and boy was I crushed... gosh what a fanboy...
    But now it's all the rave- so long as it's directed at NVidia.
    AND - the lower benchies make the ati top cards look CRAPPY in comparison.
    Oh well, I'm sure the reds know it - maybe they love the possibility of higher fps if they ever pop for a 30".
    _______________________________________________________

    All I can say is thank you NVidia for finally straightening out the DERANGED MINDS of the 4870 loverboys.

    Thank you, Nvidia - you may have cured thousands this time.
    Reply
  • hk6900 - Saturday, February 21, 2009 - link


    Die painfully okay? Prefearbly by getting crushed to death in a
    garbage compactor, by getting your face cut to ribbons with a
    pocketknife, your head cracked open with a baseball bat, your stomach
    sliced open and your entrails spilled out, and your eyeballs ripped
    out of their sockets. Fucking bitch

    I really hope that you get curb-stomped. It'd be hilarious to see you
    begging for help, and then someone stomps on the back of your head,
    leaving you to die in horrible, agonizing pain. *beep*

    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.

    You're dead if I ever meet you in real life, f ucker. I'll f ucking
    kill you.

    I would love to f ucking send your f ucking useless ass to the
    hospital in intensive care, fighting for your worthless life.

    http://www.youtube.com/watch?v=Po0j4ONZRGY">http://www.youtube.com/watch?v=Po0j4ONZRGY

    I wish you a truly painful, bloody, gory, and agonizing death, *beep*
    Reply
  • MadBoris - Monday, January 19, 2009 - link

    All the green and red aside...I didn't mean to bash AT or Derek.
    Just wondering what happened to the reviews of years back, when we had more than just one chart to gawk at and make determinations about a product. With more charts we could analyse more easily, or maybe there was a summary. The paper-launch review was just that, and maybe this limited review is kind of a message to Nvidia not to paper launch, but...we lose.

    Even though this is just a refresh with a die shrink, I still think it's worth revisiting and restating some of what may be obvious to some who breathe GPU's, but which I don't remember from 6 months ago.

    Like...the whole landscape for today's purchasing decisions.
    Effects of CPU scaling, AA and AF penalties for a card, which resolutions make sense for a high end card with a GB memory.
    You don't have to test it and graph it, but give us a reminder, like: 4x AA is free with this card and not this one, or CPU scaling doesn't mean as much as it once did, or this card doesn't make sense below 1680x1050, or this amount of memory isn't quite needed these days unless you are at 2560, etc. A reminder is good. I was surprised not to see a mention of the highest OC and what % perf that gets you, in the same article talking die shrink - so things have changed. OC isn't the black art it once was; it's often supported right in the manufacturer's drivers, so it shouldn't be passed up. I always run my HW at a comfortable OC (several notches back from highest stable) - why wouldn't I, if it's rock solid?

    I like 1 GB of memory, but there are only a few games that can break my 640MB GTS barrier (Oblivion w/ tweaks, Crysis w/ tweaks, maybe others I don't have). I'd like 1GB more if Win 7 weren't still supporting 32-bit; then we could see devs start using bigger footprints. But due to multiplatforming and the one-size-fits-all lowest-common-denominator console hardware being the cookie cutter, 1GB of memory means less than it probably ever did in the last couple of years, since games won't be utilizing it.
    Reply
  • SiliconDoc - Tuesday, January 20, 2009 - link

    I like all your suggestions there. As for the reviewer (Derek), I can't imagine the various pressures and mess involved in putting together even one review, so I don't expect much all the time, and I don't expect a reviewer not to have a favorite - and even push for it in their pieces, consciously or otherwise. The reader needs to allow for and understand that.
    I think what we do get should be honestly interpreted and talked about (isn't that really the whole point, besides the interest and entertainment value), so I'm more about straightening out longstanding hogwash the overclocking masses repeat.
    You made some fine points - I hope Derek sees them. I just imagine they are crushed for time of late; perhaps the economic cutbacks have occurred at AnandTech as well - but a bit of what you pointed out should be easy enough. Maybe they will roll it in.
    Reply
  • hk6900 - Saturday, February 21, 2009 - link

    Get murdered, cunt Reply
  • MadBoris - Monday, January 19, 2009 - link

    In all fairness, Derek does summarize some of what one can draw from the numbers, at least for the 2560 graphs and how the card compares to the 280 in price/perf. But like my subject states, you can't please everyone; I guess I would like to see more, but maybe that is just me. Reply
  • Muzzy - Saturday, January 17, 2009 - link

    The box said "HDMI", but I don't see any HDMI port on the card. What am I missing? Reply
  • DerekWilson - Saturday, January 17, 2009 - link

    It comes with a DVI-to-HDMI adapter, which also carries sound. Reply
  • Daeros - Sunday, January 18, 2009 - link

    With Nvidia you only get HDMI sound if your mobo or sound card has the 2-pin SPDIF connector. With ATI, the card actually has an audio controller built in. Reply
  • jay401 - Friday, January 16, 2009 - link

    Call me when it's <$300. Reply
  • Average Joe - Friday, January 16, 2009 - link


    As the owner of a 22" LCD, I think 22" is the perfect size. 24" didn't seem to be worth the extra 200 dollars they cost at the time, not to mention that my poor 8800 GT 512 would be stuck on medium for everything.
    I play games at 1680 x 1050.

    I don't like to run SLI rigs because I'm not willing to deal with the noise, power, heat the other 85% of the time I'm not playing Crysis.

    I buy single cards and usually midrange motherboards like the P45. The 4870X2 seems like a waste to me personally. I'm as likely to be playing Civ, Guild Wars, or Total War as I am Fallout or Crysis, so I'm only going to be using that 2nd chip some of the time. I think I'm still more likely to buy one of those X2 boards someday than a second graphics card that's the exact make and model as the one I have. The cards change too fast.

    Fortunately, having the 22" LCD means I can get by pretty easily with just a single card. I keep reading in forums that ATI driver support ain't where it should be for Vista 64. I don't know if that is deserved or not, but I'm avoiding ATI for now. I'm probably going to buy a GTX 280 or GTX 285 simply because it CAN play Crysis at 1680x1050 with a single card at max settings, and a GTX 260 can't, or barely can. 38.5 FPS is playable with some choppiness; 30 FPS is "time to think about medium quality" when things get busy. I'm not sure how much less power a 280 uses vs. two GTX 260s, but I bet it makes less noise. I don't want to sit in front of a leaf blower when I'm running TurboTax.


    I saw a 285 on Newegg for $370 today; that's less than some GTX 280s. I might wait for the pricing-war carnage to bring the 280s down and get one of those.
    Reply
  • FAHgamer - Saturday, January 17, 2009 - link

    You have heard that ATI's Vista 64 drivers are sub-par?

    However, I've heard that nVidia's drivers are far behind ATI's when it comes to (any flavor of) Vista.

    While it might be true that the Vista 64 Catalyst drivers have issues (I don't know; I am using Vista 32 Ultimate and the drivers are fine), I believe that you would be far worse off with an nVidia card.
    Reply
  • Stonedofmoo - Saturday, January 17, 2009 - link

    I don't know where you heard Nvidia's x64 drivers are behind ATI's, as that's nonsense. If you look, you will see all review sites use Vista x64 SP1 to do their reviews. They wouldn't do that if they were being held back by poor drivers.

    In my experience the x64 drivers are every bit as good as the x86 drivers, and Nvidia's are better than ATI's despite ATI's more frequent driver releases. That's half ATI's trouble: by sticking to their monthly schedule, quality control suffers.
    Reply
  • FAHgamer - Sunday, January 25, 2009 - link

    Huh?

    Who are you replying to? You got me a little confused here...
    Reply
  • Average Joe - Friday, January 16, 2009 - link


    As the owner of a 22" LCD, I play games at 1680 x 1050. I run a mild overclock because the Core 2s run so cool. I don't like to run SLI rigs because I'm not willing to deal with the noise, power, and heat the other 85% of the time when I'm not playing Crysis. I tend to buy single cards. The 4870 X2 seems like a waste to me personally because I'm as likely to be playing Civ or Guild Wars as I am Fallout or Crysis.

    Having the 22" LCD means I can get by pretty easily with just a single card. The word on the street is ATI driver support ain't where it should be. I don't know if that is deserved or not. I'm probably going to buy a GTX 280 or GTX 285 simply because I can play at 1680x1050 with a single card at max settings with one, and a GTX 260 can't, or barely can. 38.5 FPS is quite a bit better than 30. I'm not sure how much less power a 280 uses vs. two GTX 260s, but I bet it makes less noise.

    I saw a 285 on Newegg for $370 today; that's less than some GTX 280s. I might wait for the pricing-war carnage to bring the 280s down and get one of those.
    Reply
  • nyran125 - Friday, January 16, 2009 - link

    Even the 8800GTS 512MB still runs all these games at a decent frame rate of 30-80 fps, so there's no point in upgrading at the moment, not until the next big graphical enhancement or graphical revolution in games - the next big Crysis or Oblivion-type leap. Till then, it's a complete waste of money buying a newer card if you already have an 8800. Because you know that when the next big graphics game comes out, like a new Elder Scrolls (Oblivion) or something, the newest card out at that time won't even be enough to run it until 6 months to a year after the game is out. Sometimes it takes even longer for the graphics cards to catch up to the games on maximum settings. So stop wasting your money lol. Reply
  • Captain828 - Friday, January 16, 2009 - link

    So you're telling me you're getting >30 FPS when playing Crysis @ Very High, 1680x1050 + 2xAA + 4xAF??! Reply
  • ryedizzel - Thursday, January 15, 2009 - link

    Not sure if anyone has said this yet, but why the heck are you guys running all your benchmarks at 2560x1600? I mean seriously, how many people are really using monitors that big? AT THE LEAST please show some benchmarks for 1680x1050! Reply
  • Iketh - Friday, January 16, 2009 - link

    lol they are there... are you unable to read a line chart? Reply
  • JarredWalton - Friday, January 16, 2009 - link

    You don't buy parts like the GTX 285/295 or 4870X2 to play at 1680x1050, for the most part. In fact, 2560x1600 on a 30" LCD is the primary reason that I bought a 4870X2 around the time Fallout 3 came out. You can see that even at higher resolutions, there are several titles that are somewhat system-limited (i.e. the GPU is so powerful that the benchmark isn't fully stressing the GPU subsystem). Reply
  • MadMan007 - Friday, January 16, 2009 - link

    That's certainly true, and I think we understand why the charts are for the highest resolution; it's nice to provide data for the lower resolutions too. Aside from making a graph for each resolution, perhaps it would be possible to make them interactive somehow: say I click on 1920x1200 below the graph, and then that data is charted. What would be really top-notch is if I could choose which cards and which resolutions to compare. Reply
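    The data side of that idea is simple enough to sketch: one results table, filtered down to whatever resolution and cards the reader selects. The data shape, card names, and FPS numbers below are all invented for illustration:

    ```python
    # Sketch of an interactive benchmark chart's data side: one table of
    # results, filtered to the resolution and cards the reader picks.
    # Card names and FPS numbers are invented for illustration.

    def filter_results(results, resolution, cards):
        """Return {card: fps} for one resolution, restricted to `cards`."""
        return {
            card: fps
            for (card, res), fps in results.items()
            if res == resolution and card in cards
        }

    # Hypothetical results keyed by (card, resolution).
    results = {
        ("GTX 285", "2560x1600"): 38.5,
        ("GTX 285", "1920x1200"): 55.1,
        ("GTX 285", "1680x1050"): 68.0,
        ("HD 4870", "1920x1200"): 48.3,
        ("HD 4870", "1680x1050"): 61.2,
    }

    # "Click" 1920x1200 and compare just these two cards:
    print(filter_results(results, "1920x1200", {"GTX 285", "HD 4870"}))
    ```

    Wire each resolution label under the graph to a call like that and the same page could chart any combination without re-running a single benchmark.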
  • GhandiInstinct - Friday, January 16, 2009 - link

    MadMan,

    I only wish they did that. Then their reviews would be my #1 source.
    Reply
  • Stonedofmoo - Friday, January 16, 2009 - link

    But that's just the point. Most people are still running 22" monitors at 1680x1050; we don't NEED the top-end powerful cards that Nvidia and ATI seem only interested in building.

    What we're looking for are upper-midrange parts, like a hypothetical GTX 240/220 if they were to exist, to replace the aging and now-redundant GeForce 9 series.

    Seriously:
    ATI has more midrange parts than Nvidia but really needs to work on its rubbish power consumption, especially at idle.
    Nvidia needs to actually have some midrange parts, but has the power consumption sorted.

    Both need to refocus. I've never seen Nvidia go for so long without releasing a completely new series of cards from top to bottom end.
    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    Well, more cards are always better and keep us entertained and interested, but this continuous call for "midrange" from NVidia is somewhat puzzling to me.
    Since the 9800xxxx takes on the 4850, then there's the 9600 on down and the 8800GTX -- I mean, it's all covered...
    ATI just released the 4830, their newest core crippled to take on the 9800(GT), so they claim...
    I guess if I were NVidia I wouldn't waste my company's time or money on people wanting to read "new technology reviews" based upon cards that REFILL an already filled space, where the competition just a bit ago, after 2 years of near NADA, finally brought something to market.
    Since ATI was dang out of it for so long - why should NVidia retool the GT200 core to STOMP all their 9800, 8800, 9600 and the like?
    You want them to destroy themselves and their own line so you can say "Hey, new tech in the midrange - now I can decide if I want a 4830 or 4670 or 4850 or one of these crippled GT200s" - then moments later you'll say to yourself "Wait a minute, why should I get rid of my 9800gtx?!"...
    I mean REALLY...
    Would someone please explain to me what I'm missing?
    It is all "I want a crippled cheap GT200 core", isn't it?
    Maybe part of it is: why are we still here when the 8800 was released in Nov 2006?
    Maybe the question should be why ATI is still taking on 2-year-old NVidia tech.
    AMD just took another huge charge loss from its ATI division, and I'm not certain NV is doing much better (though gpu-z shows 65% of the market is NV's) - so why would NV do an expensive die/core rollout that crushes their already-standing cores that compete with ATI's midrange just fine?
    It just does not make any sense.
    Reply
  • MadMan007 - Thursday, January 15, 2009 - link

    The benchmark numbers are there below the graphs but I agree that charting 2560x1600 isn't very realistic. Maybe the benchmarkers are getting a little out of touch with what real people have for monitors. Reply
  • Beno - Thursday, January 15, 2009 - link

    ffs, it's been 2 years and we still can't get past the 100 fps barrier in Crysis at 1680x!!

    Every new card ATI and NV make only gives around an extra 10 fps in that game :(
    Reply
  • MadMan007 - Thursday, January 15, 2009 - link

    One detail that's not clear - and this is partly because of NV's confusingly named releases - is which GTX 260 is included in the charts. We know it's not the 55nm one, but is it the 192- or 216-shader version? Lots of websites forget to put this detail in their testing; just writing GTX 260-192 or -216 would make it clear. Thanks. Reply
  • jabber - Thursday, January 15, 2009 - link

    ....those bizarre S-Video outputs?

    Why not something more useful? Or just drop them completely.
    Reply
  • Odeen - Thursday, January 15, 2009 - link

    The S-Video outputs are industry standard and are used to connect to SD TV sets. I don't see what's so bizarre or useless about them. Reply
  • jabber - Friday, January 16, 2009 - link

    But who uses them?

    I've never seen anyone use them, and I haven't read about anyone trying for years. Back when they did, all those years ago, the VIVO thing was a mess or a pain to get working.

    Just seems pointless now especially for SDTV.
    Reply
  • MadMan007 - Friday, January 16, 2009 - link

    While it looks like an S-Video output, it's not just for S-Video; I believe they are used for component output as well. Reply
  • SpacemanSpiff46 - Thursday, January 15, 2009 - link

    Any reason the 4850 X2 is being neglected so much? I have not seen any reviews with this card. Also, it would be nice to see how the 9800GX2 is stacking up with these cards. Reply
  • bob4432 - Thursday, January 15, 2009 - link

    I wonder the same thing myself - the 4850 is a good card on its own and the price is very nice. Add to that that many people are running a 4850, and this could be a very attractive upgrade - let's see some 4850 CF setup #s/comparisons too. Reply
  • Sunagwa - Friday, January 16, 2009 - link

    I have to agree. I always go for the most value when I purchase my parts.

    Granted, "value" can easily be taken out of context, considering obviously wide-ranging incomes.

    For me however the 4850 (this time around, I am a PC gamer at heart) was the absolute choice when I purchased it.

    Getting back on topic, I would love to see the CF setup as well as the dual GPU setup included in your review. If only to be able to compare the performance and possible upgrade potential of my current computer to your test bed.

    Just a side note for those who care, but the C2Duo Wolfdale OC'd to 4GHz that I paid $160 US for has me very happy, and I could care less about Core i7...wait...no I could not. 8)

    Regards,
    Sunagwa
    Reply
  • GhandiInstinct - Thursday, January 15, 2009 - link

    Derek,

    These articles need to start concluding with: "So if you have to buy a video card(cpu...etc) today, buy the ____________.

    Thank you.
    Reply
  • Beoir - Thursday, January 15, 2009 - link

    For a similar price you could get a 4870 X2.
    Do they think the customer is stupid? NVIDIA was not thinking about pricing, making it only $100 less than a dual-GPU card. But, to see the glass as half full: wait a few months and I'm sure the price will drop significantly.
    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    It's stupid to get a dual-GPU solution when you can have one GPU smoking up the fps in ALL GAMES.
    The problem, of course, is that ATI doesn't have a single GPU that is even close. ATI is sucking wind.
    That's a problem, so somehow, some line of hogwash must be blabbered about to fix that problem.
    If you like screwing with games endlessly, trying to get CF working, then you go to another title and it takes an fps HIT, and comes out lower and nothing can fix it - go for your "pick".
    There's just no way a sane person would take the 4870X2 over the GTX 285, unless they loved trouble, no driver profiles, no CUDA, and no PhysX - and a huge power suction to go with all that crap - toasting any "savings" on the electric bill.
    Try the Warmonger demo, it even runs at 17fps on a 3650 at 1280x1024. Check out how cool PhysX is - you've never seen any game like it - try it... then you'll see.
    http://www.nvidia.com/content/graphicsplus/us/down...">http://www.nvidia.com/content/graphicsplus/us/down...
    top one - download that puppy and have some real fun...
    HUGE POTENTIAL, and it's already something to behold.
    Reply
  • JarredWalton - Thursday, January 15, 2009 - link

    Or for $50 more you can get a GTX 295 instead of the 4870X2. I'd be more inclined to go that route, personally. I don't think the importance of drivers and multi-GPU driver profiles can be overstated on the single card dual-GPU solutions. Reply
  • Goty - Thursday, January 15, 2009 - link

    The GTX295 would make sense if it weren't NVIDIA's practice to shove out a halo product with issues and then stop supporting it entirely farther down the road a la the 9800GX2. Reply
  • Thar - Thursday, January 15, 2009 - link

    These reviews left me a little confused. Were you comparing a single card to configurations with two cards in Crossfire and/or two cards in SLI? If I see SLI or X2 at the end of a card name, am I to assume the test bed was running two cards, and if I don't see it, assume a single card was used?

    In your conclusion you say the 295 has captured the halo, yet not one benchmark showed it at the top. The only thing I could figure is you were benching a single 295 against 2-card SLI and Crossfire setups.
    Reply
  • Toolius - Thursday, January 15, 2009 - link

    Would it be possible to update the results with 4870X2s in Crossfire? I mean, a GTX 295 SLI setup is over $1000 and a GTX 285 SLI setup is close to $800. Considering that 4870X2s in Crossfire also cost just about the same, can we have some figures for 4870X2s in Crossfire as well, PLEASE?? Pretty please? With a cherry on top :) Reply
  • Thar - Friday, January 16, 2009 - link

    Head over to Tom's Hardware for 4870s in Crossfire - a good review that makes it clear how many cards are running in each test.

    Best I could tell, this test on AnandTech did not have the GTX 295 in a dual-card SLI setup, and while they completely failed to mention it, I do believe the GTX 285 and 260 were both in a two- or maybe three-card SLI setup.
    Reply
  • IKeelU - Thursday, January 15, 2009 - link

    Third sentence should probably read: "As we weren't able to get power tests done on time..." instead of: "As we weren't able to get power tests done time..." Reply
  • Kroneborge - Thursday, January 15, 2009 - link

    I'd like to second the request for info on sound levels. I do music production, AND play games on my computer. So it's important to be able to find a nice balance. I know some people don't care if their cards are loud, but there are many others that do.

    Thanks,
    Reply
  • mczak - Thursday, January 15, 2009 - link

    I'm not sure those power consumption figures are relevant. You're measuring an overclocked card. Even though you've downclocked it to standard speeds, it could well be set up to run at slightly higher voltages, to guarantee stable operation at the overclocked frequencies without the manufacturer having to do much further qualification. Reply
  • gungan3 - Thursday, January 15, 2009 - link

    Oh, and the GTX 285 has only 2 x 6-pin PCIe connectors, while the GTX 280 had one 6-pin and one 8-pin connector. Reply
  • gungan3 - Thursday, January 15, 2009 - link

    Yes, I would have really liked to see some tests with the 4850 X2 as well. At a $299 price point for the 2x1GB version, it should offer higher performance than a GTX 280/285. You could throw in 9800 GTX+ SLI there as well; it should probably smoke its own brother.

    Also, why oh why are there no tests of fan noise and GPU temperatures? Those would be very useful to consumers. Another test could be case temperature, which would be a big help to buyers of the GTX 295, which dumps hot air inside the case itself. How about overclocking tests? No time for those either?

    And some more insight into the actual changes in the hardware would also be appreciated. Pictures of the front of the PCB and the cooling system would be helpful. To quote from your review: "The hardware looks the same as the current GeForce GTX 280. There really isn't anything aside from the GPU that appears different (except the sticker on the card that is)"

    Might I point out that, as is the case with the 55nm GTX 260 (as well as the GTX 295), all the memory chips are now on the front of the card, as opposed to the original PCBs, which had memory on both sides and thus required more layers in the PCB (afaik 16 layers as opposed to 12). Possibly some changes in the power/memory voltage circuitry as well. Was that too hard to notice?
    Reply
  • Daeros - Thursday, January 15, 2009 - link

    I was just noticing something about several gfx card reviews I've seen here lately: the lack of CF results to compare with SLI. Of course the top of the chart is full of Nvidia cards when you don't test any multi-card solutions from ATI. I know the new test platform supports this, so I really don't understand the reasoning.

    Also, there is an excellent competitor for the GTX 285: the 4850 X2. It comes with 2x1GB of GDDR3, so it will be slightly stronger than two standard 4850s in CF, and Newegg has them for $299 w/ free shipping.
    Reply
  • Goty - Thursday, January 15, 2009 - link

    Why include Crossfire results when you have the 4870X2 in the mix? It's nearly identical to two 4870s in a Crossfire configuration, so there's no need to run another set of benchmarks if you're going to get the same numbers. Reply
  • Daeros - Thursday, January 15, 2009 - link

    My point is that the 4870X2 is designed to compete with the GTX280/285 cards from Nvidia. All I was saying was it would be nice to have multi-card comparisons for both brands at similar price points (i.e. GTX285 SLI = $760, GTX280 SLI = $650, 4870X2 CF = $860, 4870X2 + 4870 1GB = $670, 4850X2 CF = $600). So why not test a couple more cards in similar brackets and give more useful, fully fleshed-out reviews? Reply
  • elerick - Thursday, January 15, 2009 - link

    Wow, I believe the price wars between AMD and Nvidia are going to be good for consumers. I can't wait to see the new pricing for the GTX 280 with these rolling out. Glad to see performance increases this early in the year. Reply
  • Stonedofmoo - Thursday, January 15, 2009 - link

    Really, I'm bored of reading about top-end parts eating hundreds of watts of power.

    I'd really like to see the GT200 technology migrated to midrange parts. In the UK we have a situation where Nvidia does not have a single competitive part for sale between £150-200. The GTX 260's are all above £200, and the GeForce 9 series parts are not worth considering when you see how much faster the ATI 48xx cards are in that price range.

    Nvidia really needs to forget the race for top performance cards that eat power for breakfast, and start taking note that not everyone wants the most powerful card, some of us are looking for the new 8800GT of this generation...
    Reply
  • Goty - Thursday, January 15, 2009 - link

    I think you hit the nail on the head when you said the only benefit to the end user from the GTX285 is that it will drop the price of the GTX280, Derek. You get more performance, but it still slots into exactly the same spot performance-wise: faster than the GTX260/HD4870, slower than the 4870X2. Add in the fact that there are no power savings and you've got a pointless product, aside from saving NVIDIA a little money.

    As for the review itself, why only results using 4xAA? I'd like to see how performance falls off with 8xAA vs the HD4870 and see if the marginally increased clockspeeds help at all in that department.
    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    Here, take a look at the power usage:
    .
    http://www.neoseeker.com/Articles/Hardware/Reviews...
    .
    There you see the SICKNESS in all its silicon cooking glory... the 4870x2, the 4850x2 AND the 3870x2 ALL CONSUME MORE POWER THAN THE GTX295(x2) .

    Just look at that CRAP.....
    .
    Now it's time to say WHAT HASN'T BEEN SAID:
    .
    Since the 4870/4870x2/512/1024/2048 are all on a SMALLER silicon die - and all have a HIGHER core clockspeed, and ALL have more electricity surging through them, all causing MORE HEAT ... with those smaller cores FILLED UP WITH DATA and electricity more often at a DENSER level - guess what's coming?!?
    It's only been 5 months- and SOON - the 4870 monsters - due to ELECTROMIGRATION - are going to start BURNING OUT...
    .
    YES BURN OUT TIME FOR THE SMALLER ATI CORES IS COMING SOON TO A RIG NEAR YOU !
    .
    lol - Another gigantic POINT - the raging reds have for 5 months never considered.... never brought up - never talked about...
    .
    I can HARDLY WAIT till the failures start hitting hard.. in bigger and bigger numbers - it's inevitable.
    Reply
  • hk6900 - Saturday, February 21, 2009 - link


    I really hope that you get curb-stomped. It'd be hilarious to see you
    begging for help, and then someone stomps on the back of your head,
    leaving you to die in horrible, agonizing pain. *beep*

    Shut the *beep* up f aggot, before you get your face bashed in and cut
    to ribbons, and your throat slit.
    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    The 4870 uses more power in idle - look at the power charts.
    " No power savings" = 4870
    Gosh I am really sick of the lies.
    How did you manage to get yourself to spew that out ?
    The 285 is lower in idle than a 260 which also beats the 4870.
    "no power savings" = deranged redfan.
    Reply
  • sam187 - Thursday, January 15, 2009 - link

    I still would like to know if the GTX285/295 support Hybrid SLI -> Hybrid Power... Reply
  • Daeros - Thursday, January 15, 2009 - link

    AFAIK, the entire GTX2XX line dropped Hybrid Power. The only cards that support it are the high-end 9XXX cards. Reply
  • sam187 - Thursday, January 15, 2009 - link

    The GTX260/280 also support it:
    http://www.nvidia.com/object/hybrid_sli_desktop.ht...
    Reply
  • Daeros - Thursday, January 15, 2009 - link

    wow, I totally missed that. sorry Reply
  • Aberforth - Thursday, January 15, 2009 - link

    man, some of these reviews are getting very generic and boring day by day. You take pictures of a product at different angles and do the number game. Maybe it's because the technologies and innovations these days are becoming so forced and profit-oriented. Reply
  • GaryJohnson - Thursday, January 15, 2009 - link

    As opposed to the before time or the long long ago when technology was all about mystery and wonder? Reply
  • Aberforth - Thursday, January 15, 2009 - link

    Yes Reply
  • crimson117 - Thursday, January 15, 2009 - link

    If we tell him, will he help us stop Fatass? Reply
  • Stonedofmoo - Thursday, January 15, 2009 - link

    ...Could AnandTech post a review of the 55nm GTX 260's? In other reviews I have seen, it appears the 65nm and 55nm GTX 260's use the same amount of idle power, which is odd and at odds with the pattern we have seen with the 285 and 295 cards at 55nm. Reply
  • Stonedofmoo - Thursday, January 15, 2009 - link

    Thank you for posting a more comprehensive review this time compared to the GTX 295 review posted earlier this week.

    The power and heat statistics are very useful and helps provide the information we need to make a decision.

    Keep up the good work!
    Reply
  • SiliconDoc - Monday, January 19, 2009 - link

    I note Derek doesn't mention if PhysX is enabled or disabled in the NV CP, nor in Crysis for instance.
    I do wonder - since it could take a few frames from the NV cards. I wonder why that whole deal is SILENCE.
    I noted one comment "still shoving PhysX down our throats".
    I know I used to see HERE, "PhysX is disabled" before they got to the fps testing.
    At other sites, they claim "enthusiast settings" in Crysis, for instance, and even on a few charts here - meaning, as the charts here say (with "PhysX" misspelled as "Physics"), it is ON.
    So, I wonder how that whole deal is in this bench set.
    I suppose it's left on for NV; then when the red cards' turn comes, a simple click and no problem - no PhysX, and no framerate hit of a couple frames or a dozen either.
    Sorry, I certainly can't say good job.
    Reply
  • hk6900 - Saturday, February 21, 2009 - link

    Remove yourself from the gene pool, retard Reply
  • TheDoc9 - Thursday, January 15, 2009 - link

    This review was much better than the original gtx 295 review. Every question I had about these cards was answered and I was able to decide which will be my next purchase. Reply

Log in

Don't have an account? Sign up now