NVIDIA GeForce GTX 295: Leading the Pack

by Derek Wilson on 1/12/2009 5:15 PM EST

  • LinkedKube - Tuesday, January 20, 2009 - link

    I am not a fan boy of any kind. I recently considered buying a 4870x2, until I heard about the 295, which I will be purchasing soon.

    The article was a bore, as I currently own a G80 Ultra, and still regret paying the $750 for it a year and a half ago.

    I'm currently thankful for the competition between card makers, because I don't have to pick through the high-end cards at a ~$750 price range. It was the best card at the time, and that's what I bought; if ATI had had a better card I'd own an ATI xxxx, whatever it may have been.

    If you're willing to spend $400+ on a card, power consumption shouldn't be an issue to argue about.

    This is two 280's slapped together, we all know that, and yes, it's the best card currently. Is it worth the money? Probably not if you have bills to pay and fears of feeding yourself afterwards.

    To each his own though. It looks like this silicon guy has about 7 months worth of ati hate in his blood. I'm wondering if he has a black and green case to go along with his nvidia "aura."
    Reply
  • rocky1234 - Monday, January 19, 2009 - link

    First off, this was a pretty good article; it shows that Nvidia finally has a card that can topple the mighty ATI card in some of the tests, but not all. Keep up the good work.

    I also own an ATI 4870x2 card; it works great and I have not had any problems with it as of yet, except for minor driver issues with CrossFire in 2 games, but those were worked out. I used to own only Nvidia, but got tired of their lame-assed renaming of old products to a newer, higher number to try to fool people into buying these "new" cards. I still own an 8800/9800GT card & it will be the last Nvidia card I buy for a while, until they decide to stop trying to screw people over by rehashing old cards as new cards. This GeForce 295 does look like a good card, but for the small amount it gives over a 4870x2 card, I think I will wait for the next gen of cards to come out before I spend money again on a new card.

    Oh yeah, to SiliconDoc: no need to point out any spelling mistakes or missed periods here & there, as I don't really care what you've got to say. You proved over & over again that you are an Nvidia fanboy & really have nothing useful to say that we all didn't already know. It's nothing personal, but until you take your head out of your butt & wake up & smell the daisies, we just don't care. Also, no need to reply to this, as I won't care to reply to what you've got to say, SiliconDoc. I only singled you out because you attacked so many people in this thread only because they had personal views to express.

    enough said
    rocky1234
    Reply
  • chuouwee - Thursday, January 15, 2009 - link

    Imo... the 295 seems like it only averages around 5-6 fps more than the 4870 X2 across the games reviewed in this article... I don't think that justifies the extra 50-100 dollars we would be spending on the card... Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    I think the combination of that, along with CUDA, the PhysX factor, the gaming profiles ready for instant use in the supplied driver on the CD that comes with the card, AND the ability to use your 8-series-or-above card as a PhysX processor in another PCI-e slot, EASILY makes up for price differences. Not to mention the 60 watts at idle and 45 watts in 3D power $avings the GTX295 offers over the 4870x2...

    When you add it all up - IT'S VERY CLEAR WHY THERE ARE PRICE DIFFERENCES AND WHY THE ATI CARD IS DROPPING LIKE A ROCK.
    Reply
  • Hrel - Tuesday, January 13, 2009 - link

    Can you PLEASE start including 3DMark numbers in your GPU reviews? PLEASE! Reply
  • san1s - Tuesday, January 13, 2009 - link

    can you please tell me how to "play" 3dmark?
    It must be so fun watching the same scene being rendered on your screen over and over again
    Reply
  • kzVegas - Tuesday, January 13, 2009 - link

    It seems that nearly every author at Anandtech has been told to view nVidia products with the same enthusiasm as looking for dimes in a well-used cat box. The author of this article brackets his test of the GTX295 with two tests of games that are not very popular, just to show that an AMD/ATI card can best the nVidia offering. He then goes on to argue in his closing comments that AMD needs to do better with its drivers, and points out that this card is $50 more than a cheap 4870x2, which is 10% more expensive. WHO CARES? When someone is going to spend $450 to $500 for a card, the difference in price isn't all that important. I'd like to see more articles that don't appear to be so biased against one company or the other. Reply
  • strikeback03 - Wednesday, January 14, 2009 - link

    I always love the bias comments after every graphics card review, because no matter what hardware is being reviewed or what they thought of it somebody will claim they are biased. Go check the comments after pretty much any AMD GPU review - there will be at least one person (and usually several) complaining because they spent too many words in the article talking about the nVidia competition and which is better at what price, with no "This is the best card EVAR!1!" endorsement.

    Anandtech crew, thanks for all your work!
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Now show me in the 4870x2 review where a 30" monitor is described as the very necessity of purchase...
    I checked; there is one line buried between two others that mentions resolution needs...
    I couldn't care less what IDIOTS state in comments as "your proof" that people complain about bias from "both sides".
    Unless you address the SPECIFIC POINTS of any one complaint by REFUTING THEIR CONTENT WITH THE ARTICLE'S OWN WORDS ...
    you have failed.
    It's not my fault you're lazy in that regard, and would rather just whine that there is no bias (because of opposing complaints) and then spew your anand praise points as your infantile final analysis.
    I ADDRESSED a half dozen and more bias issues in my posts here, and you completely ignored every single one of them - in favor of "there are complaints from either quarter".
    Address this then, mister even steven - in these posts we have a good number of cussing, raging redfans spewing death toward my posts and those of a few others - who pointed out redcard issues. We don't see that going the other way here, AT ALL.
    Now, who then is the raging, cussing, spewing monkey set of fan bias ?
    Take a look and read the postings.
    One commenter even spammed his response to me, like half a dozen times.
    No matter HOW MUCH you claim it to be true, we do NOT see that type of behavior going the other way - green fans wishing death upon the red supporters. I have never seen it, in fact, at dozens of review comment sites.
    Have you ? Would you like to look ?
    Once again, the easiest way out is whining that there are both types, and therefore neither is correct.
    The problem is, I have not been constantly lying and exaggerating and twisting the truth; I have been the one correcting the lies and shenanigans. There is a difference, and you'd know it if you cared to notice.
    Now refute any of my arguments in this entire thread from any of my posts, I'm WAITING MR FAIR !
    If you refute most of them, why then you've proven your point, huh.
    Otherwise, you're a bag of lying wind.
    Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    I read the entire 4870x2 review before saying what I did. Of course you didn't do the same. Reply
  • strikeback03 - Thursday, January 15, 2009 - link

    Just check the first page of comments.

    From CyberHawk:
    Been waiting for this one...
    ... but I find a response a bit cold.

    It's the fastest card for God sake!


    From formulav8:
    Yeps, this is one of the worst reviews Anand himself has ever done. He continues to praise nVideo who just a month or 2 ago was charging $600 for their cards.

    Give credit where credit is do. He even harps on a sideport feature that doesn't mean much now and AMD says it didn't provide no real benefit even when it was enabled.

    I've been a member of this site since 2000 and am dissappointed how bad the reviews here are getting especially when they have a biased tone to them.


    Those are just two examples from the first page of comments on an article you yourself pointed out. Just for kicks, here is one from another article (4830 launch):

    From Butterbean:
    I jumped to "Final Words" and boom - no info on 4830 but right into steering people to an Nvidia card. That's so Anandtech.

    I still state that no matter whose hardware they review or what they say, someone will accuse them of bias.
    Reply
  • SiliconDoc - Tuesday, January 13, 2009 - link

    AMEN. Good post. Reply
  • jmg873 - Tuesday, January 13, 2009 - link

    You stated in the article that the GTX260 SLI beating out the 4870 X2 showed that SLI was superior to CrossFire. I don't have an opinion at this point on which is better, but saying that SLI beats CrossFire based on that isn't accurate. The 4870 X2 isn't two 4870's in CrossFire; it's two of them basically "welded" together on one card. If you have two 4870's in CrossFire, that will probably yield a different result - maybe worse, maybe better. Reply
  • Jovec - Tuesday, January 13, 2009 - link

    My understanding is that it is still CrossFire/SLI for dual-GPU, single-slot cards like the 295 and 4870X2. The advantage of such cards is that you don't need a CF/SLI mobo, while also being a bit cheaper than purchasing two separate cards. You could also go quad with these cards on a CF/SLI mobo. Reply
  • JarredWalton - Tuesday, January 13, 2009 - link

    Just because ATI disables the "CrossFire" tab with the 4870X2 doesn't mean it isn't CrossFire. Trust me: I bought one and I'm disappointed with the lack of performance in new games on a regular basis. I'm thinking I need to start playing games around four months after they launch, just so I can be relatively sure of getting working CF support - and that's only when a game is an A-list title. There are still plenty of games where CrossFire can negatively impact performance or cause other quirkiness (Penny Arcade Adventures comes to mind immediately). Reply
  • af530 - Wednesday, January 14, 2009 - link

    Die of aids you shithead Reply
  • thevisitor - Tuesday, January 13, 2009 - link

    PLEASE LEARN !

    The first graph in this review should be dollars per frame,
    but you, Anand, cannot show it, right!
    Because everyone could then see how Nvidia sells total crap, AGAIN.

    Price is 99% of the buying decision, so you must consider it when you write something again !
    Reply
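For what it's worth, the dollars-per-frame chart requested above is simple arithmetic. A minimal sketch in Python; the prices and average frame rates below are hypothetical placeholders, not figures from this review:

```python
# Hypothetical street prices (USD) and average fps across a game suite.
# These numbers are illustrative placeholders, not review data.
cards = {
    "GTX 295":    {"price": 500, "avg_fps": 60.0},
    "HD 4870 X2": {"price": 450, "avg_fps": 55.0},
}

for name, d in cards.items():
    # Lower is better: how many dollars you pay per average frame per second.
    dollars_per_frame = d["price"] / d["avg_fps"]
    print(f"{name}: ${dollars_per_frame:.2f} per average fps")
```

Swapping in real street prices and a review's measured averages would produce the chart being asked for.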
  • Kroneborge - Tuesday, January 13, 2009 - link

    Different people care about different things when choosing a product. For some dollar per frame might be the most important thing, for others (especially at the high end) all they care about is having the most powerful product, and so they gladly pay a high premium for that last little extra bit.

    Neither is wrong, what's right for one person, can be wrong for another.

    A reviewer's job is to tell you about the performance; it's up to you to decide if you think that's worth the money. They can't make up your mind for you.
    Reply
  • elerick - Tuesday, January 13, 2009 - link

    Graphics are in a transition phase right now. With the economy, the high end is being ditched and vendors are forced to compete for the midrange. That is why the competition is much more severe in the 4850/GTX260 camp.

    It's sad that Anand's readers have to blog and flame everything that is written, all the time. If power consumption is left out of the *initial* launch review, it is for good reason; perhaps they are forced to get the review out or it will become yesterday's news. The most important thing here is benchmarks. Power consumption and SLI numbers will follow once they can get their hands on more video cards.

    I'm tired of reading comments where everyone just bitches about everything. Grow up.

    I look forward to reading your next review, Cheers!
    Reply
  • araczynski - Tuesday, January 13, 2009 - link

    nvidia still sucking in price as usual. Reply
  • sam187 - Tuesday, January 13, 2009 - link

    Does anyone know if the GTX295/285 supports Hybrid Power? It doesn't seem so (nvidia homepage, product descriptions from shops, ...).

    For me it looks like nvidia is dropping the technology (remember the GeForce 9300/9400 chipset?) :-(

    Reply
  • Stonedofmoo - Tuesday, January 13, 2009 - link

    Hey what's happened with this review?

    Anandtech is my review site of choice because your reviews are usually in depth and informative. There are more decisions to be made with a graphics card now than frame rates.

    I for one would very much like to have seen some information on the power consumption of this card.

    Let's hope that your GTX 285 review is better and has more information about the 55nm transition and how it affects power consumption, because for some of us this is becoming a big issue.
    I just sent back a GTX 280 because the power consumption was ridiculous; at idle it uses 35-40W more power with 2 monitors connected than with one.

    I do agree though, it's the midrange parts I'd far rather see using the new G200 process. The 9xxx series are old hat and should be replaced.
    Reply
  • danchen - Tuesday, January 13, 2009 - link

    How about multiple monitor setups ?
    If I'm planning to power 3 x 24" LCD monitors (1920x1200), which card would be best? (only expecting to use "high"/"very high" settings)
    This card only has 2 DVI outputs, right?
    Should I skip this and just get 2 x GTX285 SLI (for the ports) ?
    Which brand performs the best for multiscreen setup ?
    Reply
  • nubie - Tuesday, January 13, 2009 - link

    To game on? As far as I know nVidia is not opening up more than one "screen" (can span multiple monitors on a single video card though) for SLI.

    I don't think they can support 3 active DirectX monitors on the GTX295 (or can they?) If they do then it would be the one to get.

    I crammed 3 PCI-e cards into the extra PCI-e x1 slots on my old motherboard a couple years back, and was a bit disappointed by the state of multi-monitor gaming. http://picasaweb.google.com/nubie07/PCIeX102#51748...

    Best to buy a Matrox TripleHead2Go (or a DualHead2Go) http://www.matrox.com/graphics/en/products/gxm/th2... , or use SoftTH http://www.kegetys.net/SoftTH/


    Sadly, multi-display setups (and the VR/3D setups that depend on multiple displays/outputs) are a feature sorely lacking from DirectX and most game engines (the only one that comes to mind is MS Flight Simulator).


    Reply
  • yacoub - Monday, January 12, 2009 - link

    Prices seem to be creeping back up again, and that's not good (for consumers, especially in this economy). If the GTX 285 can't MSRP around $349 for 1GB models, we're in trouble. And it needs to see a sub-$300 price point before most gamers will give a crap, even though it appears to be the single-GPU card to get. Reply
  • nubie - Tuesday, January 13, 2009 - link

    It is on a 55nm process, so it can get a price cut. They just need to move out the rest of the 65nm stock first.

    I wish they had some decent mid-range with new tech (like you-know-who), instead of peddling the "GTX-100" series, AKA G92, as the mid-range.

    Not that it isn't great tech, but at least the other team is actually trying.
    Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    What you really should have said is: it's too bad ATI, with its flagship core, can only match the years-old tech of the 9800GT, or the 9800GTX or plus, unless it uses GDDR5 memory, as it does on the 4870.

    So the REAL TRUTH IS - all this "new tech" from ATI - the "other one that is trying" according to you - amounts to (to be overly fair to YOU) GDDR5 memory...

    If we just go by the GPU core tech - like I said, ATI's latest flagship core - in the 4850 and 4870 - in the former, gets knocked around by VERY MUCH OLDER NVIDIA "tech".
    Like 2 or 3 years older ....

    I guess the whole ding dang "new tech" whine is another twisted, repeatable, babbling fool's errand promoted by the red market managers, and spewed about by the non-thinking sheeple with red wool issues.
    If NOT - please, pray tell... do correct me...

    Unfortunately, that correction won't be forthcoming...

    I saw the 4850 just the other day compared in benchmark to a 9600GSO and it was "very close"...

    The question is, how good is that core really? How much "new tech" is there - it certainly appears it's all core clockspeed and GDDR5... if not - why does the 4850 fall below (or barely above) the 98xx GTX series all the time ?

    Is "new technology" really what you want in the range you claimed you wanted it ? LOL
    If NVidia puts out new technology in that region, do you expect several years old ATI cores to match or beat it ? I bet you don't.
    Reply
  • Mithan - Monday, January 12, 2009 - link

    Don't worry about it, those prices will come down or be held in check. Reply
  • Hxx - Monday, January 12, 2009 - link

    I wish you guys would have discussed power consumption, heat, and noise. Other than that, I enjoyed reading it. As for the card, it's slightly faster than a 4870 X2 in the majority of the games, but that's about it. Nothing innovative, just another sandwich card designed by Nvidia with a poorly designed cooler letting hot air inside the case... how disappointing. Reply
  • SiliconDoc - Monday, January 12, 2009 - link

    The GTX295 is LOWER in power consumption AND in noise.
    That's why it wasn't included - you know who really likes a certain team...
    Reply
  • cactusdog - Tuesday, January 13, 2009 - link

    SiliconDoc, you sound like a tool. Reply
  • TheDoc9 - Tuesday, January 13, 2009 - link

    Actually, while I don't agree with everything SiliconDoc says, I also read some very light, well-hidden bias in this article. I simply called it indifference in one of my posts. Reply
  • zebrax2 - Tuesday, January 13, 2009 - link

    you know who really likes a certain team...
    Plain bullshit. If they really wanted to favor the red team, they would have rigged the tests. You have no basis for accusing them, and I'm sure they had some reason for not adding it.

    if you don't like what they write just go to another site
    Reply
  • SiliconDoc - Tuesday, January 13, 2009 - link

    If you don't like what I wrote, follow your own advice and leave.
    How about it there fella - if that's your standard, take off.
    If not, make excuses.
    I think it's Derek, to be honest, and specific.
    That's fine, I think not noting it - and not being able to adjust for it, IS a problem.
    You cannot really expect someone that is into hardware and goes that deep into testing to wind up in the middle.
    So use your head.
    I used mine and pointed out what was disturbing, and if that helps a few think clearly on their purchases, no matter their decision, that's good by me.
    However, your comment was not helpful.
    Reply
  • Goty - Monday, January 12, 2009 - link

    Uses less power than... what?

    By bit-tech's numbers, the GTX295's power consumption is about 70W more than a single GTX280 and only about 40W less than a 4870X2 at full load.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Oh, let's not have any whining about where's the link
    http://vr-zone.com/articles/nvidia-geforce-gtx-295...
    Not that you ever thought about doing that at all - although I'm sure others did.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    lol- wrong comparison - again....

    " By bit-tech's numbers, the GTX295's power consumption is about 70W more than a single GTX280 and only about 40W less than a 4870X2 at full load. "

    Let's correct that.

    " By vr-zone's numbers, the GTX295's power consumption is about 60W LESS than a single 4870x2 at idle, and about 45W LESS than a 4870X2 at full load. THE GTX295 stomps the 4870x2 into the GROUND when it comes to power savings, and BETTER PERFORMANCE. "

    There we go , now the red ragers can cuss and call names again - because the truth hurts them, so, so bad.

    Reply
  • AdamK47 - Monday, January 12, 2009 - link

    It's interesting you got the game working with these drivers. The game crashes for most with these. Reply
  • Goty - Monday, January 12, 2009 - link

    Doesn't the documentation state that the 8.12 hotfix is only needed for 4850 crossfire systems? Reply
  • Marlin1975 - Monday, January 12, 2009 - link

    A $500 video card that uses 289 watts of power. Just what everyone wants.


    I was happy with AMD's new video cards. Not because they were at the top of all the charts, but because they offered a lot of power for a fair price and did not use too much power. Maybe some geek who wants to "brag" about spending $500 on a video card will get this. But for the other 99.9% of users, this brings nothing useful to the table.
    Reply
  • MadMan007 - Monday, January 12, 2009 - link

    The HD4000s were and are certainly great bang for the buck, but not using too much power... not so much. The idle power draw was behind NV, and the load power draw is generally in proportion to performance. Reply
  • SiliconDoc - Monday, January 12, 2009 - link

    The GTX260 BEATS the 4870 in power consumption - as in, IT'S LOWER for the GTX260.
    Just like the GTX295 BEATS the 4870x2 in power usage.
    In the 260/4870 case, full 3D was within 1-3 watts, and in 2D/idle the 260 was 30 watts lower - taking the CROWN.
    Similar in this case - although NVidia declares 289 watts max, which red fans love to cite, the actual draw is less, as the sites that have tested it all show.
    Oh well. More FUD from the reds will be all over the place.
    Tonight I learned that this 2-GPU thing on one card, with framerates like the raved-about and glorified 4870x2, is just one big waste without 2560x rez - but before, the 4870x2 was praised beyond measure for actually winning, very often, JUST THAT SINGLE HIGHEST REZ in the benchmarks.. lol
    It's like politics, total spin and twist, and forget anything else.
    Reply
  • SlyNine - Tuesday, January 13, 2009 - link

    OK, you're an Nvidia fanboi, we get it.

    The 4870 was the better deal. Get over it.
    Reply
  • SiliconDoc - Tuesday, January 13, 2009 - link

    Merely a fan of the truth. Is the truth that hard for you tools to face ?
    Apparently a lot of freaks have decided to face the truth now = as in, get a 30" or forget this level of card, which DOES INCLUDE the 4870x2 - only even more so - as you see, disabling CrossFire with that card is a no-go...
    So it took NVidia releasing the slightly better card to wake up the ATI fanboys. 6 months of lies and fanning their red flames all over the place, and now Nvidia has brought most of them to their senses.
    "No one needs a card this good" - is the half-hearted response from the red tools now. Or better yet, since the 4870x2 just dropped from 500 bucks to a lower pricepoint because it got slayed, now they want the no-profiles, driver-issue red piece of crap anyway.
    Whatever...
    Now, WOULD YOU LIKE TO REFUTE ANYTHING ? Including the power consumption I brought up ? Please, you're on the internet, go look, before you make a fool of yourself calling me a fanboy.
    GO LOOK.
    Reply
  • kk650 - Wednesday, January 14, 2009 - link

    too bad a fatarse burger eating yank like yourself didn't die on 9/11 along with several thousand of your fucking countrymen, you cunt. Now go die of a heart attack, retard Reply
  • DJMiggy - Wednesday, January 14, 2009 - link

    That comment was very much uncalled for. You are a very crass individual. I hope you never have to lose a friend or loved one to something that could have been prevented and wish you and your family well. Reply
  • kk650 - Tuesday, January 13, 2009 - link

    He's a fucking moron Reply
  • mhouck - Monday, January 12, 2009 - link

    I have to agree with the others that this article was disappointing. I would think that the Nvidia dual-GPU would be compared to their last dual-GPU, the GX2. I was really expecting to see a comparison between the last-generation solution and this generation. Maybe a look at driver support for the GX2 and how it's doing compared to ATI's driver support for their X2's. From all the articles I've read, we are constantly asking whether the driver support will be there to make dual GPUs in SLI and CrossFire worthwhile. WELL, TAKE A LOOK AND REPORT BACK! These solutions have been out for a year now!! Maybe I expected too much. :-( Reply
  • MadMan007 - Monday, January 12, 2009 - link

    Which GTX 260 is included in the charts, GTX 260-192 or GTX 260-216? Reply
  • BSMonitor - Monday, January 12, 2009 - link

    Don't you guys usually show us the system power consumption charts for these hefty GPUs? Curious where it stands on that front against the 260, 280 SLI, and 4870x2. Reply
  • SiliconDoc - Monday, January 12, 2009 - link

    Pssst! The GTX295 wins hands down in both those departments... that's why it's strangely "left out of the equation".
    (Most all the other sites already reported on that - heck, it was in NVidia's literature - and no, they didn't lie - oh well, better luck next time.)
    Reply
  • Amiga500 - Wednesday, January 14, 2009 - link

    Well... to be honest...


    If leaving out power consumption pisses people like you off - good one anandtech!


    (I guess your nice and content your nvidia e-penis can now roam unopposed?)
    Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    First of all, it's "you're" when you are referring to my E-PENIS. (Yes, please also capitalize ALL the letters to be properly indicative of size.)
    Second, what were you whining about ?
    Third, if you'd like to refute my points, please make an attempt, instead of calling names.
    Fourth, now you've fallen to the lowest common denominator, pretending to hate a fellow internet poster, and supporting shoddy, slacker work, with your own false fantasy about my temperament concerning power and the article.
    What you failed to realize is me pointing out the NVidia advantage in that area, has actually pissed you off, because the fanboy issue is at your end, since you can't stand the simple truth.
    That makes your epeenie very tiny.
    Reply
  • kk650 - Tuesday, January 13, 2009 - link

    Remove yourself from the gene pool, fuckwit americunt Reply
  • darckhart - Monday, January 12, 2009 - link

    from all the games where the gtx295 beats the 4870x2, it's only a 3-5 fps win. i don't see how that "gets the nod even considering price." at best, that's $10 per frame. i think we need to see thermals and power draw (i don't recall if you talked about these in the earlier article) to better justify that extra $50. Reply
  • JarredWalton - Tuesday, January 13, 2009 - link

    I bought a 4870X2 a couple months back... if I had had the option of the GTX 295, it would have been my pick for sure. I wanted a single-slot, dual-GPU card (because I've got a decent platform but am tired of dual video cards). I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. I'm thinking I need to start playing games around four months after they launch, just so I can be relatively sure of getting working CF support - and that's only when a game is an A-list title. There are still plenty of games where CrossFire can negatively impact performance or cause other quirkiness (Penny Arcade Adventures comes to mind immediately). Reply
  • TheDoc9 - Monday, January 12, 2009 - link

    Definitely a half-ass review, something I don't expect from Anandtech. Something more to come later?

    Many questions on these cards can still be asked;
    -Testing at other resolutions, not just a recommendation to stay away unless playing at 2560 res. on a 30" monitor.
    -Testing on other rigs, such as a mid-range quad-core and dual-core, to give us an idea of how it might perform on our rigs (for those of us who don't own a mega-clocked i7)

    I don't like to sound negative, but honestly there was no enthusiasm in this preview/review/snapshot/whatever it's supposed to be. It's kind of disappointing that every other major site has had complete reviews since launch day. Was this written on the plane trip home?
    Reply
  • bigonexeon - Friday, January 16, 2009 - link

    I don't see the point of using an Intel i7, as Intel released an article saying the i7's L3 cache has a memory leak whose only attempted fix is a software patch, which they're not going to fix until the second generation of i7s. Also, why compare standard cards to a newer card? Why not put the older cards' newer designs up against the newer cards - the Superclock/XXX editions, which are the latest revisions of the old models? Reply
  • theprodigalrebel - Monday, January 12, 2009 - link

    Might want to take a second look at the line graph and the table below it. Reply
  • Iketh - Monday, January 12, 2009 - link

    This article was just right. I had no enthusiasm to read about this card because there isn't anything to get excited about. Apparently Derek didn't either. I'm sure there will be that enthusiasm again when a next-gen card appears and there is something new to talk about.

    It's also having to follow the Phenom II article.
    Reply
  • Gasaraki88 - Monday, January 12, 2009 - link

    It's stupid to get this card if you don't have a 30" monitor and a high-end CPU. They are testing the video card here, not CPUs. Testing on a slower CPU will just show every card pegged at the same frame rate.

    This review was fine, thanks. =)
    Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    Gee, suddenly the endlessly bragged-about "folding" means absolutely zero (ATI cards suck at it BTW) ... and you've discounted CUDA, and forgotten about PhysX ... and spit in the face of hundreds of thousands of single PCI-e 16x motherboard owners.
    Now go to the 4870x2 reviews and type that same crap you typed above - because I KNOW none of you were saying it THEN in the 4870x2 reviews...
    In fact, I WAS THE ONLY ONE who said it at those reviews... the main reason BEING THE 4870X2 WAS RECEIVING ENDLESS PRAISE FOR WINNING AT THAT ONE 2560X resolution....
    Yes, they couldn't stop lauding it over how it excelled at 2560x - oh, the endless praise, and the fanboys drooling and claiming top prize...
    Now all of a sudden, when the red card gets SPANKED hard....
    ____________________________________________________________

    Yes, when I posted it at the red reviews - I didn't have folding or PhysX to fall back on... to NEGATE that... not to mention EXCELLENT dual-GPU usage and gaming profiles OUT OF THE BOX.

    The facts are, the 4870x2 had better be at least 100 bucks cheaper, or more - who wants all the hassles ?
    Reply
  • TheDoc9 - Tuesday, January 13, 2009 - link

    It wasn't fine for me, and I don't believe this card should only be purchased by those with a 30" monitor and a bleeding-edge CPU. Someone with a fast Core proc might be able to find some use for this product vs. the next slowest alternative. Prove to me I'm wrong. Reply
  • Nfarce - Tuesday, January 13, 2009 - link

    OK. I can look at various data results on this website with the GTX280 paired with an i7 and a stock-clocked E8600, and do some interpolation of that data against the results here.

    See Exhibit A from the Jan. 8 article on AMD's Phenom II X4 940 & 920 using a single GTX280:

    Crysis Warhead @ 1680x1050 (mainstream quality, enthusiast on) results:

    Stock E7200 @ 2.53 GHz-> 66.2 fps
    Stock E8600 @ 3.30 GHz-> 84.0 fps
    Stock i965 @ 3.20 GHz-> 86.8 fps

    Now back to this report with the same game resolution (but using gamer quality with enthusiast on) with a single GTX280:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 36.6 fps

    Now using the GTX295:

    Stock E7200 @ 2.53 GHz -> ???
    Stock E8600 @ 3.30 GHz -> ???
    Stock i965 @ 3.20 GHz -> 53.1fps

    With the above data, it shouldn't take an M.I.T. PhD to come up with a reasonable estimate of what slower CPUs and lower resolutions would do.
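The interpolation described above can be sketched as a quick back-of-the-envelope calculation. The numbers come from the posts in this thread; the `estimate` helper is a hypothetical illustration that assumes the "gamer quality" results scale with CPU speed at most as strongly as the CPU-sensitive "mainstream quality" results did, so it gives a worst-case floor:

```python
# Known GTX280 results from the Jan. 8 Phenom II article
# (Crysis Warhead, 1680x1050, mainstream quality, enthusiast on):
fps_280_mainstream = {"E7200": 66.2, "E8600": 84.0, "i965": 86.8}

# Known results from this review (gamer quality, same resolution, i965 only):
fps_280_gamer_i965 = 36.6
fps_295_gamer_i965 = 53.1

def estimate(card_fps_on_i965, cpu):
    """Scale the i965 result by the CPU's relative showing at the
    CPU-sensitive (mainstream) setting. Since the heavier 'gamer'
    setting is more GPU-bound and scales less with CPU speed, this
    is a pessimistic floor, not a prediction."""
    ratio = fps_280_mainstream[cpu] / fps_280_mainstream["i965"]
    return card_fps_on_i965 * ratio

print(round(estimate(fps_280_gamer_i965, "E7200"), 1))  # GTX280 floor
print(round(estimate(fps_295_gamer_i965, "E7200"), 1))  # GTX295 floor
```

Under those assumptions, a stock E7200 would land somewhere between roughly 28 fps and the i965's 36.6 fps on the GTX280, and between roughly 40 fps and 53.1 fps on the GTX295.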

    Reply
  • TheDoc9 - Wednesday, January 14, 2009 - link

    That actually is informative. Reply
  • A5 - Monday, January 12, 2009 - link

    If you're not playing at 25x16, this card isn't going to make anything playable that isn't already on an existing, cheaper solution.

    In that same vein, the people who will drop $500 on a video card will most likely have a high-end CPU - there isn't a reason to test it on systems that aren't the top of the heap. They're testing the card, not the whole system - the conclusion to be made is that on any given set of hardware, Card X will outperform Card Y.
    Reply
  • SiliconDoc - Monday, January 12, 2009 - link

    Just like the bloated pig the 4870x2 has been for so many months - a costly, outlandish, unnecessary, good-for-nothing card at a bad price point, unless you're playing at 2560x - and let's add, even at that level the framerates are sometimes not playable anyway.
    Glad to see people have finally come to realize what a piece of crap the 4870x2 solution is - thanks, NVidia, for finally straightening so many out.
    This is great.
    Reply
  • Hxx - Tuesday, January 13, 2009 - link

    You obviously have no clue about video cards, or you cannot afford one, which explains your attitude. First off, the 4870 X2 is an awesome card, much faster than any other card available except the 295. Second, it is reasonably priced at $400 after MIR, which is not bad for a high-end product. This card can run every game out there just as well as the GTX 295. There is no difference between the two, because once a game runs at 35 fps or above you will not notice a difference. In other words, the 16 fps difference in CoD5 between the cards has no value, because the game plays fine at 40 fps. The GTX 295, priced at $500, is a waste of money. And BTW, I am not an ATI fanboy - I own a GTX 280. Reply
  • strikeback03 - Wednesday, January 14, 2009 - link

    The 4870x2 launched at $550, so unless you need a new card RIGHT NOW you can wait until the initial rush on the GTX295 is over and the price settles down some. Reply
  • SiliconDoc - Tuesday, January 13, 2009 - link

    So tell me about the forced dual mode in ATI? Oh, that's right, Mr. Know-It-All - they don't have that.
    SLI does.
    Yes, I obviously know nothing, but you made a fool of yourself.
    BTW - please give me the link for the $400.00 4870x2 - because I will go buy one - then post all the driver and crash issues.

    Waiting....
    Reply
  • Hxx - Friday, January 16, 2009 - link

    Siliconduc,

    I said that a 4870 X2 can be had for $400 after MIR - learn to read next time. Both companies have driver and crash issues with their cards; Nvidia is not an exception, especially if you run Vista 64-bit, so it's unfair to judge ATI based on that. Second, Nvidia is known for overpricing their cards, and the GTX 295 is no exception. Third, the GTX 295 DOES NOT wipe the floor with the 4870 X2 - it's slightly faster, like 5-10% in the majority of games, not all of them. There isn't any game out there that will be playable on a GTX 295 but not on a 4870 X2, NONE, regardless of your display resolution. So why pay $100 extra? Especially since the lifespan of these video cards is so short that you would need a better card every year if you want to keep maxing out your games.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Oh, on the 7th SPAM reposting, you deleted some of your idiocy - the last few lines of overtly excessive bs - as compared to the plain-bs lines you decided to keep.
    So, psycho $3v3n spamfanboy, do you feel corrected now? rofl
    I asked about your feelings because that's what you post about. lol

    Reply
  • Hxx - Friday, January 16, 2009 - link

    Siliconduc,

    I said that a 4870 X2 can be had for $400 after MIR - learn to read next time. Both companies have driver and crash issues with their cards; Nvidia is not an exception, especially if you run Vista 64-bit, so it's unfair to judge ATI based on that. Second, Nvidia is known for overpricing their cards, and the GTX 295 is no exception. Third, the GTX 295 DOES NOT wipe the floor with the 4870 X2 - it's slightly faster, like 5-10% in the majority of games, not all of them. There isn't any game out there that will be playable on a GTX 295 but not on a 4870 X2, NONE, regardless of your display resolution. So why pay $100 extra? Especially since the lifespan of these video cards is so short that you would need a better card every year if you want to keep maxing out your games. So that's why I consider this card pointless: you're paying extra for physics (which is used by a handful of games), and it's not noticeably faster than a 4870 X2. As for power consumption, if you have the money to throw at either one, then you don't care about the difference in wattage.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    After you claim one card is cheaper than another, you then claim that anyone with money to throw at either one doesn't care about the power savings from NVidia.
    Clearly you are deranged.
    Reply
  • Hxx - Saturday, January 17, 2009 - link

    You will understand in time that not everything that's been released is actually worth the money, especially computer hardware with a high depreciation factor. Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Now you're off on another argument, since you made a fool of yourself on the former one.
    Tell us how Corvettes are always a waste of money, too. I'm sure we're all waiting for your considerate opinion on the matter. Do tell us as well how the adults in this forum, even I, do not understand such a concept; I'm sure some other idiot will believe you.
    Are you done stuffing your own shoe in your mouth?
    I certainly don't believe you are as ignorant as the last statement you typed, nor that anyone here is as ignorant as you claim possible.
    Adults on a tech site not understanding that some modern items purchased may be overpriced, even beyond their perceived consumer value?
    Surely you jest, bs artist.
    Reply
  • Hxx - Saturday, January 17, 2009 - link

    Do you actually read what you post, or what you reply to? You make no sense, sir. I never said Corvettes are not worth the money, lol, or that they are a waste of money. Honestly, I don't wanna argue with you, but you surely don't act like an adult, as you previously posted. Bottom line is that this particular card is not worth $500 - that has been my whole point right from the beginning. It doesn't bring a noticeable performance lead over the older video cards available, such as the 4870 X2. Now you can act like an Nvidia fanboy and say that I'm full of *** or you can take it as a given. Reply
  • SiliconDoc - Sunday, January 18, 2009 - link

    If you didn't post your BS, you wouldn't be making a fool of yourself in argument. Clearly you blabbered out total crap in your first post, and it has been utterly refuted, and not just by me but by other posters in this thread.
    Heck, you even refuted yourself, as I pointed out.
    What card you do or don't have makes absolutely ZERO difference, since you slapped yourself down and still pretend you can't comprehend that. In fact you can't even comprehend a metaphor - Corvette flunky.
    Now, after you yapped that 500 is too expensive, you yakkered that 400 is not, then you blabbered that electricity costs don't matter to anyone buying a high-end card - which would, whether you admit it or not, include the 4870x2, which you, once again, blabbered is at a good affordable price, before and after you blabbered that these high-end cards are out of reasonable cost range, and quacked that I couldn't understand such a thing. But I do indeed understand YOUR OPINION, and HOW FALSE IT IS, proven so by your OWN WORDS.
    Now you want to claim all you ever said was "blah blah blah 4870x2"...
    Well, you said a lot more, and yeah, you should take nearly all of it back, since the majority was BS - full of contradiction.
    Stating your piece without blabbering lies that contradict your former spewing is apparently nowhere in your repertoire of posting skills.
    Others disagree with your stupidity as well, and have posted as much; no need for them to have directed it at you.
    Let me remind you, YOU blabbered this at ME, you initiated your little SPEW here, and the very FIRST thing you did was throw an INSULT:
    "You obviously have no clue about video cards, or you cannot afford one, which explains your attitude. First off, the 4870 X2 is an awesome card, much faster than any other card available except the 295. Second, it is reasonably priced at $400 after MIR, which is not bad for a high-end product. This card can run every game out there just as well as the GTX 295. There is no difference between the two, because once a game runs at 35 fps or above you will not notice a difference. In other words, the 16 fps difference in CoD5 between the cards has no value, because the game plays fine at 40 fps. The GTX 295, priced at $500, is a waste of money....................."

    Now go BLOW your stupid lies at someone else, someone DUMB enough to buy your total BS.
    Reply
  • totenkopf - Sunday, January 18, 2009 - link

    All of you guys are ridiculous. It's good to see that brand loyalty is all it takes to reduce you all to a rabble of immature E-men. The only incorrect thing here is this unyielding and blind loyalty. Utility dictates that none of you are really wrong, just misled :)

    The 295 is an awesome card. Period. Even though it only gets marginally better frames than its rival in many cases, it's still the undisputed champion of the graphics realm.

    CUDA and PhysX (or whatever) have severely limited value to many folks. Certainly, one buying this card will want all of those bells to go with his whistles, but they are not necessary (nor are they necessarily welcome for the debatably small premium you pay). However, there is a small chance they will become widely supported in the lifespan of this card, and therefore useful.

    Frankly, I believe power consumption is really a silly thing to argue about. It's not like the difference in idle power is going to pay for the price difference between cards. While I like saving polar bears, they aren't going to influence this one luxury item I might afford myself.

    The biggest thing is the price difference. Most of the 4870x2s look like regular old reference boards, so if I were to get one I would surely be getting a $400 one with MIR (at that price it makes the gtx285 a terrible value). I think the cheapest 295 I saw was $479. When every dollar counts, it's a tough decision. If you're building a new system, that difference is a cooler and an optical drive, or maybe an upgrade to a Raptor. If you are just upgrading, it's still a new game or mouse. Maybe I'm just cheap :) The best bet for 1920x1200 may now be the gtx280 @ $300 after MIR, before the gtx285 makes it disappear. Or perhaps a 4870 1GB for ~$220 with MIR (value/performance, and a possible upgrade for super cheap later if you have the right PSU).

    I know one thing for sure: without the hd4000 series and the 4870x2, the gtx280 would still cost $500+ and the gtx295 would cost $750-$850, if it were produced at all. Whatever team you like, be glad they are both here competing! What I want is for nvidia to actually compete at the mainstream level with cards that aren't going on two years old. I like AMD's scalable approach to GPUs. Their 55nm part is able to perform at every price point. Sure, they used GDDR5 to help them compete... they were instrumental in its development; they kinda planned it that way (don't worry, nvidia will take advantage of it soon). I'm also a little let down that it took nvidia almost ~5 months(?) to compete for the top spot AND have it be just a wee bit better.

    Both cards are amazing for different reasons... get over it!
    Reply
  • SiliconDoc - Sunday, January 18, 2009 - link

    Another jerkoff that insults first thing, then red-fanboys out, then tells everyone to get over it - after spewing the same tired talking points all through this place, ignoring the facts and sloughing them off.
    It's CLEAR you never played a PhysX game, and don't even know what it is. I'll leave it at that - go get WARMONGER from the nvidia graphics plus 2 site
    http://www.anandtech.com/video/showdoc.aspx?i=3498...

    It's the one on top - NOW GO TRY IT - IT RUNS ON AN ATI 3650 at about 17 frames - my friend just tried it for the first time and loved it. IT IS SOMETHING ENTIRELY NEW AND BETTER - now go try it, DUMMY.
    As to the rest of your spew, it's clear you're a red fanboy and can't help yourself - or you're so dumb you just repeat ad infinitum the red talking points of stupidity - like wanting NV to make new chips at the mid and lower end when their TWO-YEAR-OLD CHIPS SLAP ATI'S TOP CORE AND ALL ITS LOWER RANGE DOWN.
    ________________________________________________

    How about this raging red freak that wants everyone else to "get over it" - how about we take a TWO-YEAR-OLD ATI CORE and see what she does? You up for it?
    NO, OF COURSE YOU AREN'T.
    Ati is still WAY BEHIND - they need a totally new core to go anywhere, and without GDDR5 they'd have nearly NOTHING - since the 8800s and 9800s SMACK DOWN THE 4850, ati's gimped top core.
    GET A CLUE.
    Reply
  • SiliconDoc - Sunday, January 18, 2009 - link

    corrected warmonger link http://www.nvidia.com/content/graphicsplus/us/down... Reply
  • Hxx - Monday, January 19, 2009 - link

    LOL, warmonger is one of those nvidia crap demos, just like cellfactor - complete BS. It's not a complete game, just a demo to show how "good" and "powerful" - hence the name power packs - Nvidia cards are. They bought Ageia, who failed, and now they're trying to market this physics crap that won't take off unless game developers adopt it. Since Mirror's Edge is garbage, Nvidia failed with their physics, for now at least.

    IF it wasn't for ATI, you would post in here like a gullible kid saying that the gtx 295 is so worth it at $850 (8800 ultra, anyone?). The gtx 295 is the first NVIDIA high-end card priced at only $500 - hmm, I wonder why. I bet you think that games with the Nvidia logo play only on Nvidia hardware. Your friend must've switched from playing WOW if he/she thinks that game is the best thing ever. Stop drooling your BS that nobody cares about.
    Reply
  • overcast - Thursday, January 15, 2009 - link

    http://www.newegg.com/Product/Product.aspx?Item=N8...
    http://www.newegg.com/Product/Product.aspx?Item=N8...
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    I'm running 2 x 4870x2 on an I7 machine with no crash issues.

    Now shut up.
    Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    Go tell this guy to shut up now, bloviator.
    " I cannot overstate the importance of drivers, and frankly ATI's drivers still disappoint on a regular basis. Until and unless ATI can get driver profiles into their drivers for CrossFire, I don't think I can ever feel happy with the solution. Also, 4870X2 drivers need to bring back the "CrossFire disable" option in the drivers; we all know there are two GPUs, and there are still occasional games where CrossFire degrades performance over a single GPU. "

    I didn't see any cards at $400 at your links either, but one thing is clear - the shabby 4870x2, with driver issues out the wazoo for EVERYONE, is getting very cheap very quickly for a reason. That reason is IT GOT STOMPED by the GTX295, and the GTX295 runs better in a far wider variety of games AT LAUNCH than the 4870x2 EVER will.

    Now it's time for you to shut up.
    Reply
  • overcast - Thursday, January 15, 2009 - link

    What part of, ($399.99 after $40.00 Mail-In Rebate) , don't you understand?

    Congratulations to the Nvidia team for developing a new card that beats something released last August, typically by less than 10fps. Amazing.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Here's what you should understand: we'll take the very best prices we know are available - usually the best savings are at Newegg - not the higher prices average users often end up paying.

    4870x2 = $449, $550, $505, $475, $505, $495, $430, $480

    Now those are the BEST prices.

    I think your lies need some work.
    Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    Congratulations to ati for releasing the 4850 in August that gets beat by two-year-old standard and hopped-up 8800 nvidia cores. I just love new technology, don't you? Reply
  • SiliconDoc - Thursday, January 15, 2009 - link

    Of course you are, you all are, none of you have any driver issues on your red cards - you all say so.
    So no, I won't shut up - I will listen to your lies, and respond to them.
    It isn't reasonable at ALL to claim no driver issues on a 4870x2 - it's a flat-out lie from the word go.
    Reply
  • kumquatsrus - Monday, January 12, 2009 - link

    great, all i have to do now is find me a 30 inch monitor to go along with this card, lol. Reply
  • SiliconDoc - Saturday, January 17, 2009 - link

    How about overclocking, did you say - strangely missing in the review for some unknown "reason". lol AMAZING.

    Let's use vr-zone; at least they included overclocking, since the GTX295 DOES SO WELL - so much better than the 4870x2, that egg-frying power hog that uses 45 more watts in 3D and 65 more watts at idle - you know, the card you couldn't wait to get for 6 months, before NVidia trounced it.

    "The GTX 295 lends itself fairly well to overclocking.

    Clock     Reference (MHz)   Max. Overclocked (MHz)   Increase
    Core           576                  670               16.32%
    Shader        1242                 1444               16.26%
    Memory         999                 1220               22.12%

    Crysis Warhead fps: 30.91 standard, 35.74 OVERCLOCKED"

    A GOOD SOLID 15% GAIN - great overclock scaling!

    I guess they "forgot" here. Maybe I should read it all again, about expensive monitors, since I didn't see the usual overclocking drop-down page tab... huh... it's strangely MISSING ENTIRELY.

    http://vr-zone.com/articles/nvidia-geforce-gtx-295...
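The percentage gains quoted from vr-zone follow directly from the clock pairs; a quick sketch of the arithmetic (a hypothetical check, using only the numbers quoted in the post above):

```python
# Verify the overclock percentages from the vr-zone numbers quoted above.
clocks = {
    "Core":   (576, 670),    # (reference MHz, max overclocked MHz)
    "Shader": (1242, 1444),
    "Memory": (999, 1220),
}

for name, (stock, oc) in clocks.items():
    gain = (oc / stock - 1) * 100
    print(f"{name}: {stock} -> {oc} MHz (+{gain:.2f}%)")

# Crysis Warhead scaling: 30.91 fps stock vs 35.74 fps overclocked,
# i.e. roughly a 15.6% gain from a ~16% core/shader overclock.
fps_gain = (35.74 / 30.91 - 1) * 100
print(f"Crysis Warhead: +{fps_gain:.1f}%")
```

The near 1:1 scaling between the core/shader overclock and the framerate gain is what "great overclock scaling" refers to here.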
    Reply
  • SiliconDoc - Monday, January 12, 2009 - link

    That was never an argument for the 4870x2.
    I knew, though, that the red-blooded geekicans would make sure they pointed it out over and over in their latest GTX295 reviews - top, front, and center.
    It's AMAZING.
    Reply
