109 Comments

  • Blackchild1101 - Saturday, April 28, 2012 - link

    Looks badass...

    Hopefully I'll be able to purchase one by 2014...
  • michaelheath - Sunday, April 29, 2012 - link

    Indeed. The amount of money asked for this card is more than what I was planning on spending overall for a system upgrade.

    Dear Nvidia: Hurry up with that GTX 660Ti. $500-1k graphics cards are useful and within financial reach of only a handful of gamers. The rest of us have other financial priorities.
  • MonkeyPaw - Sunday, April 29, 2012 - link

Well, at least they didn't call it the "NVIDIA GeForce 680 4GB Superclocked 3072 Core Edition." :p
  • CeriseCogburn - Sunday, April 29, 2012 - link

    Or the "Double D" knockers, and then the many amd fanboys claimed it was the sexiest card they had ever seen. ROFL
    (this card actually looks better, period)
    +
Or the gigahertz ! edition... GHz ! (I guess AMD was proud they got their second-class card up past... no, never mind - it was a marketing scheme as dishonest as could be from the moment the word GHz appeared).
    -
    I say thank God Nvidia is coming in at $999 or we'd have AMD's $1,200 dual to stare our wallets down.
  • michaelheath - Sunday, April 29, 2012 - link

Perhaps I should phrase my sentiments another way: Why the eff are Nvidia and AMD bothering at all with putting out dual GPU cards that all of five people can afford *and* actually see any benefit from?

    I'm just as enthusiastic about computer hardware as the next person who responds to my pithy statements, but still, all things considered, another dual GPU, 300+ watt, $1k graphics card is hardly exciting in a day and age when $250-350 will do it for 99.9% of gamers.
  • daniel142005 - Sunday, April 29, 2012 - link

But that 0.1% is still profitable to them. You'd be surprised how many people will actually buy this card - likely even two. GPUs aren't just used for games anymore either... and most of those $250-$350 cards won't do it for 3D games, at least not at max resolution and at least high settings.
  • Yuriman - Monday, April 30, 2012 - link

GK104 has been castrated for compute, though. Even smaller target audience, but I bet they'll still sell every one they build.
  • CeriseCogburn - Monday, April 30, 2012 - link

I somewhat agree with you michaelheath, but you have to remember inflation.
Heck, a decade-plus ago people were paying a solid 3 grand for a lousy 386 system, so anyone who isn't a teenager or barely above understands fairly well - these are high-end devices - not much worse than SSDs of any decent size.
Heck, the Dell 30-inchers were over 2 grand forever, and can you imagine 3 of those?
So it's not a low price, but it's not outrageous; it's just that it would be nice if it were $200 instead of $1,000.
It's a person's hobby, and many hobbies easily cost thousands of dollars.
The other upside is it drives tech forward, so that's good as well.
  • medi01 - Monday, April 30, 2012 - link

"Something overpriced costs 2 times more, hence the price isn't outrageous" - seriously? Very silly argument.
  • CeriseCogburn - Monday, April 30, 2012 - link

That's your silliness, not mine. You BM&C about price over and over and over, yet it's cheaper than the AMD release, it shoved prices down for those 79xx's, and still, silly you, you moan and complain forever.
3-grand systems 15 years ago, and you didn't register a word... those had half a megabyte or even 256K (a quarter of a megabyte) of RAM - one eight-thousandth the amount of memory in these...
ROFL - ONE EIGHT-THOUSANDTH, and all you can do is moan and call me names.
  • MrBrownSound - Monday, April 30, 2012 - link

Now those were the days... so I hear, lol.
I agree with you that, all things considered and at the prices these cards are moving, $500 is very competitive pricing. I'm planning on buying one as soon as they let me. Were it not for AMD, Nvidia would gladly charge $680 for the 680, or much more.
  • leignheart - Monday, April 30, 2012 - link

Well dude, would you rather be a part of the 1% of America that is mega rich, or would you rather be the 99% who are poor and at best average? And what do you mean, "actually see any benefit from"? Play Witcher 2 at full tilt and see if your cracker-jack-box video card can actually play it. Because it can't. I'm happy Nvidia and AMD produce 1000-dollar cards, just as I'm happy they produce 50-dollar cards, so that there is a card for everyone. This card clearly isn't aimed at you, so you have no business commenting on it, just as I have no business commenting on low-end or mid-range cards because they are not aimed at me. And I thank the Lord Almighty that not everyone thinks like you, because we'd never have advancement.
  • taken0prisoners - Monday, April 30, 2012 - link

These cards are for multi-monitor. Most people do not buy a card like this for a single monitor.
  • euler007 - Sunday, April 29, 2012 - link

The GTX 680 has been sold out for a month on Newegg; better for them to focus on producing that en masse at the moment.
  • leignheart - Monday, April 30, 2012 - link

No, it's better for them to produce what they think is best at the moment. Man, I'm amazed at how many people think they know how to run Nvidia better than they do.
  • leignheart - Monday, April 30, 2012 - link

Dude, it's pretty rude of you to say people who buy this card have no other financial priorities - like you're better, or work harder, or are somehow more intelligent in your PC buys, or some garbage. Get a better job or manage your finances better and then you can afford a card like this, but don't disparage anyone else. You sound like a jerk.
  • smilingcrow - Sunday, April 29, 2012 - link

Do what I will do; get a second paper round and you should be able to afford one in 2013.
  • CeriseCogburn - Sunday, April 29, 2012 - link

The 7970 and 7950 were near paper launches as well for 2.5 months - until one day before the GTX 680 launch, when they suddenly had stock at the egg.
Once we're 2.5 months past the GTX 680 launch, we can make a comparison and see who really supplied closer to launch date.
  • piroroadkill - Monday, April 30, 2012 - link

This just isn't true. A friend of mine bought a 7970 barely after launch day.
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    Three friends of mine bought a 680 barely after launch day, AND I watched stock on the 7970 for the 2.5 months interim.
    I am correct, and you are wrong, period.
  • Parhel - Tuesday, May 01, 2012 - link

Since maybe a week after launch, the 7970 was in stock all day, every day, at Newegg. Several models in fact.
  • medi01 - Monday, April 30, 2012 - link

What do you do with your PC, may I ask?
  • ct760ster - Monday, April 30, 2012 - link

Somp pody pliz colt an amber lamp... ccolt an amber lamp, my pocket is bleeding ;)
  • jav6454 - Sunday, April 29, 2012 - link

I was looking forward to this card until I saw the price trend... not a chance. It seems logical for extreme users, but if they really want to push AMD, they should lower that cost down to the $800s. I can dream, can't I?

    However, price aside, this will be a worthwhile card to have. Let's see if it plays out as expected.
  • cmdrdredd - Sunday, April 29, 2012 - link

Yep...might as well SLI or X-Fire instead and save some cash.
  • Death666Angel - Sunday, April 29, 2012 - link

This card is for people who either want 4 GPUs or don't have easy CF/SLI capabilities (PSU not providing enough power / connectors, mainboard not sufficient, not enough space). I doubt many people cross-shop between high-end single GPUs in CF/SLI and high-end dual-GPU cards.
  • ImSpartacus - Sunday, April 29, 2012 - link

Why would Nvidia want to push AMD? There's no <$1000 7990 on the market.
  • CeriseCogburn - Sunday, April 29, 2012 - link

On the other hand, without Nvidia's superior GTX 680, the 7970x2 price would be a minimum of $579.99 times two, or $1,159.98 - nearly $161 more than the 690's purported coming price. Two AMD flagships now run $260 less than that - $130 bucks per.
+
So Nvidia saved everyone money, while AMD scalped early adopters like me out of 3 free games and over $130 on a single 7970.
Boo!
Why didn't AMD just price their cards reasonably to begin with?
Obviously they feel the 7970 is worth much less - 3 free games (two, and sometimes all 3, of which are missing nearly all the time) and $50-$250 less than the GTX 680 - now that the benches, drivers, and added features (of Nvidia cards) are widely known.
Nvidia always launches at $499, by the way - something else all the genius reviewers always fail to mention. Even as they raked the coals on pricing, they couldn't put the absolutely standard Nvidia flagship launch price together in their tiny conspiracy minds - which helped out AMD, of course!
LOL
Thanks so much! I got burned!
  • cknobman - Monday, April 30, 2012 - link

AMD's cards were priced high from the get-go because:

A. Nvidia did not have anything to compete with.

B. There were idiots like you willing to pay the high price!
  • CeriseCogburn - Tuesday, May 01, 2012 - link

No, they were priced high because AMD is a greedy, no-good, scalping, dishonest company, and all the online review sites are idiots and supporters in that.
This site praised AMD shorting its 6000 series in the channel so it "wouldn't be competing with itself".
In other words, there are a lot of dishonest and uncaring people around, despite their protesting lips.
  • Kenenniah - Tuesday, May 01, 2012 - link

Last I checked, a high-end GPU isn't something most people NEED. It's a luxury item and you chose to buy it at that price. The only person that "scalped" you is yourself. How was it dishonest? The price was listed and you paid it.

A corporation's job is to make money for its shareholders, not to feed you roses and sunshine. A smart business prices its products at what it feels the market can bear. Since people like you were willing to buy at that price... basic supply and demand economics.
  • CeriseCogburn - Thursday, May 10, 2012 - link

Come on, you should be smarter than that. Most of what we buy in the West is a luxury item, so no, a gaming card isn't uniquely a luxury item. The problem, which apparently flew over the top of your not-so-shiny scalp, is the immediate price drop by the pathetic AMD within 2.5 months - and no matter your protestations, or your millionth pretend-corporate-board internet spew, the bottom line is the jerks lost money for themselves, because people like myself have a lot more than 2 watts of hate-filled fanboyism.
AMD cannot be trusted at all now - any time they release before Nvidia they are held in gigantic pricing suspicion.
They blew away years of perception in one stupid move.
  • B3an - Sunday, April 29, 2012 - link

    Looking good for a dual GPU card.

But I remember reading about a higher-end and larger GPU based on Kepler, called GK112 or something like that? Any news on this, or is the 690 the same thing? This could be old incorrect news, but I've not been keeping up...
  • Dracusis - Sunday, April 29, 2012 - link

Rumours of GK112 aka GTX 685 are still around, but I've no idea how accurate they are.
  • Sabresiberian - Sunday, April 29, 2012 - link

    Last rumor I heard about "Big Kepler" was September. It is supposed to be a larger die version of Kepler (more transistors). I'm not sure how likely it is, really.

    (The GTX 690 is two regular GTX 680 GPUs on one board, so not "Big Kepler".)
  • Kevin G - Sunday, April 29, 2012 - link

It is very likely, as the larger chip will also be aimed at the GPGPU crowd. The current GTX 680 was surprisingly a step backward, with the previous-generation GTX 580 able to outrun it in GPGPU-focused tests. The real question is whether nVidia is going to launch consumer video cards based upon the larger chip or keep it isolated to the Quadro and Tesla lines.
  • tipoo - Sunday, April 29, 2012 - link

Yeah, that's what I was thinking: the 680's compute performance is cut down drastically, and the GK110 seemed very compute-oriented from the rumours, so I think that might just be a Quadro/Tesla.
  • Dracusis - Sunday, April 29, 2012 - link

That makes more sense seeing the price point the GTX 680 was positioned at.
  • CeriseCogburn - Sunday, April 29, 2012 - link

It makes no sense; it was already announced to be a gamers' card, as were all former CUDA slam cards.
More FUD for AMD, again!
I love it!
  • CeriseCogburn - Sunday, April 29, 2012 - link

So you think it might just be a Quadro because the Fermis were just Quadros, right? LOL
Good Lord help me.
HELP!
  • plopke - Sunday, April 29, 2012 - link

    My gaming pc's is most of the time build with a budget around 1000 dollar :). But I guess some people out there feeling a bit nutty. Now I only wished the nextgen console would come out :\. These kind of cards for me personally are getting each new iteration more silly because of very very aging console hardware that come with ports that utilies only part of what a modern gaming pc can do. Plus the best/funniest games for me personally these days for pc thanks to steam are small indie games.

  • StevoLincolnite - Sunday, April 29, 2012 - link

Well, there is a market for these kinds of cards - those with only 1 PCI-E slot being one of them, and small-form-factor cases.

Also... Getting 2 of these cards and chucking them in SLI would probably do some wonders. (And use a lot less space and power than 4x 680's.)

However, these days the high-end GPUs can slaughter any game at 1080p, so I would hazard a guess this is more targeted towards the 2560x1440, 2560x1600 and Surround Vision/Eyefinity users.
Which, let's face it... if you can afford that kind of display set-up, you can afford a high-end GPU, and vice versa.

Also can't forget the GPU compute market either; packing 2 GPUs in a single card is a good way to drive up performance per card.
  • Granseth - Sunday, April 29, 2012 - link

I partly agree with you, but this time around you don't get a compute platform; if you want GPU compute you have to go back to the 580/590 or AMD.
And I'm curious why they skimped on the memory and didn't go for 4GB per GPU, since that might be an upcoming problem and you are paying a premium for these cards.

But I see the point of the 690: if you are going SLI, or have a small form factor, you might as well get this card.
  • Sabresiberian - Sunday, April 29, 2012 - link

    I imagine that they didn't want to increase the noise level of the GTX 690, and more memory would have meant more power, so more heat, and a noisier card.

FXAA is much more memory-efficient; the GTX 680 with 2GB standing up to the Radeon 7970 with 3GB is a strong indication there. Still, the last video card I bought was a bit of a mistake because it doesn't have quite enough memory, and I'm a bit gun-shy about getting a video card with "only" an effective 2GB capacity.

    ;)
  • Nfarce - Sunday, April 29, 2012 - link

    "However, these days the high-end GPU's can slaughter any game at 1080P"

    I just got my hands on an EVGA 680 Superclocked Signature card (with backplate) after replacing a 570SC (camping on NewEgg for half a day hitting F5 paid off). For now I'm running a 25.5" 1920x1200 monitor, and whereas I was running high-30s in frames on Crysis 1 & Warhead at 4xx, I'm now right at about 50FPS with the EVGA overclocked card. That's hardly slaughtering the frames. I still have some fine tuning to do and haven't even fired up BF3 yet. Moving up to three 24" screens at 5760x1080 or one 30" 2560x1600 (which I'm still trying to decide on for the same price) will require another card in SLI to keep the frames up.
  • Hrel - Sunday, April 29, 2012 - link

    Utilize* "These kinds* of cards, for me personally, are getting more silly with each new iteration. Because of a very very aging console hardware that comes with ports that utilize* only part of what a modern gaming PC can do".

    Don't even say I'm a grammar Nazi, I left mistakes in there. Just trying to help.
  • Sabresiberian - Sunday, April 29, 2012 - link

    Clearly you aren't bothering to try to understand why, so I'm not sure what your point is unless you think it's proper to brag about spending less money on gaming than someone else.

    It's not; it just shows your ignorance.
  • Latzara - Sunday, April 29, 2012 - link

@Sabresiberian ... I don't think he's bragging - personally, I don't think that being able to perceive value as opposed to price is anything to brag about; quite the opposite, we should all be more attuned to this than just checking price tags...

This card, to me ofc, just isn't worth its price, and I will never pay for a top-of-the-line card because they are always, without fail, overpriced - an upper mid-range card is better value for money and has never failed me in terms of performance needed, and I don't think it would fail the vast majority of users on the performance/cost/available-titles scale. You can always pump up the resolution to something that, again, 95% (or more) of users don't use to somewhat explain where the money is going, but I still think it's pure waste...

There is always a market for them, to be sure; whoever wants to pay more than it's worth purely for bragging rights is welcome to it.
  • Blindsay04 - Sunday, April 29, 2012 - link

At that price I'd rather just get 2x 680s.
  • B3an - Sunday, April 29, 2012 - link

Well, if this turns out to be quieter than two 680's as is indicated, then I'd go for this instead. It seems it will have almost identical performance to 2x 680's (memory clocks are the same, and the GPU boost clock is very nearly as high). And most people, if not everyone, will notice a quieter card over something like a 1 - 2% frame rate increase, which is basically nothing.

And with two of these GPUs it's going to run literally every game at 60+ FPS even at 2560x1600; plus, since most monitors don't have a higher refresh rate than 60Hz at native res, it would be impossible to notice any FPS increase from 2x 680's anyway.
  • Iketh - Sunday, April 29, 2012 - link

you'll use 50% more power going that route too...
  • Iketh - Sunday, April 29, 2012 - link

err 33%....
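[Editor's note: using NVIDIA's official board-power figures (195 W TDP for the GTX 680, 300 W for the GTX 690), the correction above lands closer to 30%. Real-world draw under load will differ, but a back-of-the-envelope sketch:]

```python
# Rough power comparison: two GTX 680s in SLI vs. one GTX 690.
# TDPs from NVIDIA's official specs: GTX 680 = 195 W, GTX 690 = 300 W.
GTX_680_TDP = 195  # watts
GTX_690_TDP = 300  # watts

sli_power = 2 * GTX_680_TDP          # 390 W for two cards
extra = sli_power / GTX_690_TDP - 1  # fractional increase over a 690

print(f"2x GTX 680 draws {sli_power} W, {extra:.0%} more than one GTX 690")
# → 2x GTX 680 draws 390 W, 30% more than one GTX 690
```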
  • CeriseCogburn - Sunday, April 29, 2012 - link

The GTX 680 is very, very power efficient, so it will beat AMD's monstrous compute power hogs anyway. In either case - one dual 690 or two 680's - it's still less power than the power-hungry, housefire-flaming 7970 with excessive overvoltage and a crankable PowerTune, if you even want to get close to matching the 680.
Save power? Buy the Nvidia GTX 680s.
  • RaistlinZ - Sunday, April 29, 2012 - link

Yeah, $1,000 is pretty steep. But until the 7990 comes out they can keep the price high, and I'm sure it'll still sell. Besides, Nvidia seems to have a shortage of chips even with the 680 at $500. Pricing the 690 below $1,000 would just make their supply issues even worse.
  • garadante - Sunday, April 29, 2012 - link

Yeah, but even though many NVIDIA-trolling haters will ignore it, and forget about the fact as soon as possible, in my opinion NVIDIA really surprised us all with their $500 680 pricing. Even if supply appears to be utter crap in certain areas (like the States), how many of us, had we been told a month before the 680 launch that it would sell for $500 and, at least in my area, not be horribly overpriced by retailers, would've believed it? Most of us would've probably just laughed, but they went ahead and did it, and even if it hasn't seemed to be a huge topic, I'm sure it's left an impression on us. If they keep up the surprises with pricing, it can only help their image, and as these cards won't sell a hundred thousand units, perhaps that PR would be better in the long run than the margins they'd lose selling it for lower.

    But it won't surprise me if it does sell for $1000, and it'll just be a wake-up call, that the computer hardware industry doesn't really want progress; that they are trying not to shake the game up and increase the pace of development. I see so many tech companies flub something that could've been huge, but they don't seem to care. I see AMD as having flubbed by releasing a very minor improvement for their flagship next-gen card, NVIDIA doing no better and rebranding their mid-range card as their new high-end card because they can, and are just capitalizing off the margins while they can. Even with the hardware industry stagnating (come on, this incremental performance increase generation over generation is just TOO consistent... there are technologies that could give us ten fold the performance, nobody wants to risk it however) the software industry is doing even worse, with gaming feeling like a dead end for me. Ah, the days when I eagerly anticipated AAA title launches, and now I couldn't care less!

    However, I did remember something during that last paragraph: the 680 WAS intended to be a mid-range chip. It was likely going to be sold for $300-400 originally, and NVIDIA was fully aware of that fact and the margins it would've brought, so going off that mid-range mentality, perhaps, again, sub $900 or even $800 isn't all too far-fetched. I just hope they're currently producing GK110s or GK112s and stockpiling them, binning the good chips, and will hit a home-run when they decide to refresh this series hopefully like they did when they launched to 500 series.
  • garadante - Sunday, April 29, 2012 - link

Oh jesus christ. I didn't see the update to the page, as I read it right after the keynote ended. Apparently it is $1000 official pricing... Way to flub a winning strategy, NVIDIA. Oh well, no surprise; that's what every tech giant seems to be doing. Doesn't really matter I suppose - I've never been a fan of dual-GPU cards since they always seem to cause more problems than they're worth, and present the issue of microstuttering, which behemoth single-chip cards don't face. So now I await the GK110/112 behemoth and the mid-range 600 parts, or, if those are lacking, the 700-series refresh. Hopefully the mid-range cards support 4 monitors just like the 680 does! :3
  • CeriseCogburn - Tuesday, May 01, 2012 - link

No, not at all. Charlie D of SemiAccurate said $299, P E R I O D. Not three hundred or four hundred - and just as the 7970 was hitting, the 590 and 6990 were $700 and $729+ respectively.
So what you brainwashed fools believe is that a card near equal to a $700 dual flagship in performance was coming in at the liar Charlie D's reported $299 rumor.
Now of course, all you just-as-soured industry insiders, all ten thousand of you on drooler gamer blogs like this one, can make up your court-fool speculation and never once pay attention to the exact prices the releases themselves came in against, even though there are only two companies and barely a half dozen cards, with at least a year and a half of across-the-board pricing stable as a rock immediately prior.
It's like instant amnesia, and moaning anti-bean-counting tightwad syndrome is a virus that no amount of facts can exterminate.
The Nvidia flagship was just rumored to be a coming 7 billion transistors, the largest chip ever made in all of human history, but you, the oh-so-wise know-it-all, claim 10X the performance is sitting before the scientists and engineers if only someone had the guts "to risk it".
That sort of spew is equivalent to Charlie D at SemiAccurate; go tell him, I'm certain he'll publish it for you, the secret insider know-it-all, so we can listen to a million moaning crybabies for the next ten years.
I doubt a third of you could successfully use a soldering iron, let alone route 7 billion on-off switches no matter the funding or training, but you sure know tin foil and all who are holding the entire world back, as you've got lots of plans for East Asian high-tech production, and certainly would have GlobalFoundries whipped into 10X the speed output if only given the chance... with your basement nanobot tech...
  • jwcalla - Sunday, April 29, 2012 - link

    All that hype for something that no more than 500 people are going to buy.

    PC games aren't even pushing it these days, unless one has some kind of ridiculous non-existent display resolution.

    Better to buy a 1080p projector with that money and game on a 100" screen. JMO of course.
  • garadante - Sunday, April 29, 2012 - link

Well, from what little I've seen, it looks like there's going to be a (relative) mad rush by the industry to 2160p resolution in the near future. 1080p has been around quite a while, and has been mainstream for a handful of years. The 4K panels I've heard are launching mainstream in 2013 will likely be priced quite high and won't fully supersede 1080p for a while, just as 1080p took a significant amount of time to supersede SDTV and still isn't the absolute standard, but I won't be surprised if even initial prices are relatively affordable, in the few-grand range, as it seems the days of $8000+ TVs are gone for good.

But even if the information I've seen is horribly wrong and everything I said turns out to be completely false, hopefully the launch of mainstream 4K panels, even if they're priced exorbitantly, drives the price of 1080p panels even lower. If quality 24" 1080p monitors ever push $100, I'd really have no reason not to buy 3 or 4 and go for a multi-monitor setup. It seems extravagant, but from a purely productivity viewpoint, once you go multi-monitor it's hard to go back, and having even more screens just allows for more possibilities of tangibly productive multitasking. And 5760x1080 gaming would definitely benefit from these higher-end graphics cards, though I still haven't been able to try it or 3D gaming myself, so I don't know whether or not that's a viable selling point for me, but it would be fun!
  • garadante - Sunday, April 29, 2012 - link

Well, this pricing is purely speculation right now. What was the 590 at launch? $650-$700? Granted, this is anticipated to have closer to 2x 680 SLI performance, but it wouldn't surprise me if this came in close to $800 or less. Unless they pull an AMD move and price it sky high while the market has no competition - but then they did pleasantly surprise us with 680 pricing, and if you'll remember, the speculation for the 680 was $600+, and NVIDIA turned around with a $500 price point at launch, which surprised me. Newegg in the States had cards up for very close to $500 on launch day; it was just stock that has been, and currently is, the Achilles heel of the 680, at least from what I've seen in the States.
  • garadante - Sunday, April 29, 2012 - link

And I was wrong. Horribly. AnandTech refreshed this page with official pricing after I posted my comment, and it is $1000, so: small news, huge flop, moving on. Dual-GPU cards don't bring enough tangible benefits for me to really see double-the-pricing as viable, but that's just me. I can't wait to see the fabled GK110/112. Now, I'd say it might be a relative game-changer, but I'd in all probability be wrong. Horribly, horribly wrong. But I still prefer behemoth dies, since they won't suffer the same issues SLI'd cards can.
  • CeriseCogburn - Sunday, April 29, 2012 - link

Do any of you realize Nvidia released the last 4 flagships at $499? No doubt anyone with two watts of brainpower, especially those reviewing cards for a living, knew $499 was the coming price for the GTX 680 a long, long time in advance.
It's really amazing how much ignorance flies about, even at the highest "reviewer" levels.
I quite understand the reason is "We don't believe you!" or
"What rip-off scheme are you up to this time?!"
Unfortunately we can now add AMD into that mental brainfart mix for the future, which leaves us gaming end purchasers further into idiotsville, thanks to our clueless "reporters" who can't put two and two together 4 times in a row.
Months later the "public" here is still clueless.
Nvidia flagships, 4 times in a row, $499 launch price... DUH, it's so easy; the too-smart-for-their-own-good missed it, and they still miss it.
    +
    Wonderful. I love being surrounded by incompetence.
  • sausagefingers - Monday, April 30, 2012 - link

What a tiresome load of vainglorious crap. I would append this to all your recent posts but... the effort.
  • CeriseCogburn - Tuesday, May 01, 2012 - link

Except I'm 100% correct. And I'll add: besides being surrounded by complete idiots who can never learn, they're attacking potty mouthers, too.
  • _vor_ - Tuesday, May 01, 2012 - link

Do us all a favor and break your keyboard; over your face.
  • RaistlinZ - Sunday, April 29, 2012 - link

Of course they're going to price it high while there's no competition. If they priced it at $800 they'd lose money and run into supply issues. If I were running the company I'd do the same thing to maximize revenue and profits.
  • SlyNine - Sunday, April 29, 2012 - link

It's why my GTX 680 is on its way. Still not impressed with pricing this generation - it doesn't touch what my 9700 Pro, 1900XT, 8800GT, or 5870 did for their time (and let's be honest, the 5870 is still a fast card 2 1/2 years later).

I wanted something 2x as fast, and in at least some situations, like min frames and really stressful games, the 680 is 2x as fast.

If the 7970 had come out at about $450, I might have bought one. Not that I'm jumping AMD's ship; all things being equal I'd probably go with AMD.
  • SoneeOO7 - Sunday, April 29, 2012 - link

I would rather wait for the 680s to come down in price later on down the road. I have one now and don't plan on getting another one unless it's at least ~150 bucks less than it is now. Plus it's easier to resell 2 GTX 680s than one 690.
  • leomg - Sunday, April 29, 2012 - link


But CrossFired 7970s are much faster!!
  • ol1bit - Sunday, April 29, 2012 - link

    Wow, so I can get 3 360's or maybe 4 for one video card.

    Too costly....I guess they better make their money before the next gen consoles come out!
  • tipoo - Sunday, April 29, 2012 - link

You know a 150-200 dollar card in a modern computer you might have anyway will run every game coming out at better-than-console settings, right? Not everyone needs a 1000-dollar card just to play games; this is just for the uber-high-end niche market.
  • kallogan - Sunday, April 29, 2012 - link

Enough about high-end stuff. Release low and mid-range GPUs, Nvidia, and not only to OEMs with crappy rebadged names. Thank you in advance.
  • Hrel - Sunday, April 29, 2012 - link

    "We won’t be publishing our review until Thursday, but in the meantime let’s take a look about what we know so far about the GTX 690."

    "let's take a look at* what we know so far..."
  • Breach1337 - Sunday, April 29, 2012 - link

The only benefits of getting this over 2x 680 seem to be lower watts and quad-SLI headroom, no?
  • Kevin G - Sunday, April 29, 2012 - link

I was hoping to see 4 GB per GPU on these cards. It is clearly aimed at ultra-resolution gaming and the extra memory would help in this area, especially with MSAA. Speaking of ultra-resolution gaming, it'd have been nice to equip the card with six mini-DP connectors, foregoing DVI entirely. 6000 x 1920 resolution (five 1920 x 1200 displays in portrait) gaming, anyone?
  • wagoo - Sunday, April 29, 2012 - link

    That's the res I'm running at already. Agree about the VRAM being disappointing.. Reply
  • Wreckage - Sunday, April 29, 2012 - link

    So this card should be twice as fast as AMD's fastest card? Reply
  • tipoo - Sunday, April 29, 2012 - link

    In games that scale well on SLI, close to that. Reply
  • Leyawiin - Sunday, April 29, 2012 - link

    Another card that won't be in stock anywhere? Nothing but a glorified paper launch. Reply
  • shatteredstone - Sunday, April 29, 2012 - link

    NVidia keeps doing this with their high-end cards -- you have a mandatory second slot occupied just for the third DVI connector. It's comparatively easy to put on a decent watercooler to make everything else fit in a single slot (with ample cooling), but that lone DVI connector spoils the whole thing.

    At AMD, even the highest-end cards can be converted to single slot (even the 7970 and 6990 -- and probably the 7990 then too).

    The only way to fit 7 gtx690 (or 680, for that matter) into one computer would be to desolder the double-dvi bracket and replace it with a single one (foregoing the additional connector). This is not an easy job by any stretch of the imagination (look at the soldering points on your card to see what I mean), far beyond what is required to replace the cooling system.

    It may be just as well since compute performance in the current Geforce cards is somewhat lacking -- and for some algorithms I'm interested in, AMD blows NV out of the water due to different design choices regarding the ALUs -- plus it is a lot easier to stuff 14 AMD GPUs into one board (although the Windows drivers crash using this setup on 6990s, the Linux drivers work somewhat fine, I'm told).

    There have been some gtx680 cards with a single-slot design long after introduction, though. Maybe this will happen with the 690, too.
    Reply
  • CeriseCogburn - Sunday, April 29, 2012 - link

    The double dvi bracket is not soldered on. You take off the screws and the video cable nuts, and it comes right off.
    Obviously you don't know what you're talking about and have never even tried on even a single card, but that just makes you part of the usual crowd here.
    Congratulations for your completely incorrect attack on Nvidia, you're among great soul brothers and sisters.
    Reply
  • Stahn Aileron - Sunday, April 29, 2012 - link

    He's not talking about the bracket. He's talking about the DVI connector itself in the second slot. Those are soldered to the PCB. His choice of wording, I'll admit, is kinda poor right there. "Double connector" would've been more accurate than "double bracket". Still, you should've been able to infer that from his statement if you thought about it for a moment instead of instantly attacking him for a perceived error. Reply
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    Well, thank you for pointing that out.
    I'll add I'm not certain what 7-slot PCIe motherboard the fantasy moaner is talking about, and let's not forget current CPUs fail at scaling beyond 3 cards; they are out of steam by then.
    Essentially, it's another trivial complaint for no good reason at all, barring some obscure trade-show wall of 24 monitors, which was already accomplished with 4 Eyefinity 6 Radeons. I guess Mr. Moaner wants 7x4=28 for the 680, and fails to whine about no 7x6=42 with 7 Eyefinity DP cards, right?
    Reply
  • ven - Sunday, April 29, 2012 - link

    So, soon we can expect an Asus Mars III, maybe for $1900. Seriously, did the Mars II sell at least one unit? Reply
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    No, mars 2 sold zero units so they crushed all the cards at the automobile junkyard, then held a free beer and brats party for people like you. Reply
  • Nfarce - Sunday, April 29, 2012 - link

    I dedicated an entire day to Newegg hitting the F5 button and finally scored an EVGA Superclocked Signature card w/backplate (just to give you an example of how fast they are going, I had three occasions where a 680 was captured in the cart, but by the time I got to checkout after just a few seconds the card was already gone).

    Good luck to anyone who wants one of these 690s. Nvidia has a serious 28nm production problem on their hands with TSMC, and they need to find at least one other wafer manufacturer. This is the worst capacity/supply issue I can ever recall for a new GPU release. We're going on two months now since release and the card STILL cannot be kept in stock anywhere, and the ones that do stock them, like Tiger Direct or independent vendors on Amazon, are charging a $50-$100 markup.

    God only knows what's going to happen with the 670. Good luck to all. Now I've got to get in line and fight for another 680.
    Reply
  • BrunoLogan - Sunday, April 29, 2012 - link

    Nice technical achievement but the price isn't reachable for most of us.
    I'd like to see the 660Ti. Will we have to wait for it much longer?
    Reply
  • toutbeau - Sunday, April 29, 2012 - link

    Car makers have a model which is head and shoulders above the rest, e.g. the BMW M3. The majority of the sales are the bog-standard 320i or 320d, but the top model gives an aura to the range.

    I gotta admit having this super-duper model, which they will sell 17 units of, will set a vibe for the sales staff and the fanbois who buy the 660 Ti or what have you.

    I personally have ordered an overclocked Gigabyte 680. Don't know when it will arrive, but I think that's preferable to a dual GPU.
    Reply
  • Saffleur - Sunday, April 29, 2012 - link

    This is one gorgeous looking card. It is also badass lol

    I like the minimalism of it. It doesn't scream "OMFG look at me." It stands out on its own. Don't have the $500 to drop on it but it is still nice.
    Reply
  • Saffleur - Sunday, April 29, 2012 - link

    Make that $1000 not sure why I pegged it at $500 Reply
  • MUYA - Monday, April 30, 2012 - link

    Just imagine come May 3rd...they announce an MSRP of less than $999 ...hmm *wishful thinking* $799 and lo and behold, the GTX 680 gets a price cut too!

    *slap*
    ..ya a GPU pipelinedream
    Reply
  • wwwcd - Monday, April 30, 2012 - link

    $1k for one piece of hot (very hot) sh*t. Sorry Nvidia. Useless! Reply
  • Golgatha - Monday, April 30, 2012 - link

    I think nVidia can afford to throw on a high flow bracket on their flagship card. With the stacked DVI ports, it actually does make a substantial difference this generation. Reply
  • Golgatha - Monday, April 30, 2012 - link

    Just noticed one other thing. WTF is with 3 DVI ports and no HDMI?! Reply
  • EJ257 - Monday, April 30, 2012 - link

    Seriously. I wish they'd dropped the 2nd and 3rd DVI and given us another 2x DP and HDMI.

    Also, the first chart has a typo? How are the 580 and 680 both $499?
    Reply
  • prophet001 - Monday, April 30, 2012 - link

    First, I would definitely get 2 680s and SLI them. 10% is a significant amount of performance. Especially when it comes to extending the life of your system.

    Second, do dual GPU cards suffer from the same micro-stutter problems as SLI setups do?

    If they do, why would they sell cards like that? If they don't, why don't they take whatever they learned and apply it to SLI setups?
    Reply
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    They did. Read some reviews, SLI is smooth as silk with frame rate target.
    CF 7970 is not as good a gaming experience, and the drivers suck.
    Even when 7970CF has a frame rate win, the gaming sucks in comparison.
    Glad to help.
    Reply
  • Farkus - Monday, April 30, 2012 - link

    I'll have to say I was disappointed with my 5970s: even with the voltage tweaker, I could barely get them OC'd at all. The price of them almost equalled two 5870s, which would easily out-perform a single 5970. They had a cold bug which required them to warm up first, and you had to jack up the idle clock if you didn't want 2D artifacting or lock-ups going into games off the desktop.

    Bottom line here: the only reason to buy one of these is if you plan to SLI them and deal with any idiosyncrasies, if you're up for it. Otherwise get two singles and SLI them. As much as I am a fan of technology, I think it's asking a lot for a dual card (not to mention two of them) to perform like singles. Besides, it's easier to cool two singles.

    When I did have the 5970s warmed up and running in CrossFire it was quite the deal. I've since replaced them with two 5870s @ 900 MHz and never had a peep out of them, and they do fly. My two PNY liquid-cooled 580s are absolute screamers at 915 MHz. I'm not sure if quad-SLI is that much of an advantage over two single cards maxed out. I wouldn't do it again, just to avoid any headaches. Now go ahead and rip on me. Reply
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    AMD drivers suck, Nvidia drivers do not suck. You experienced the AMD suck of their drivers.
    The reviewers already reported how AMD CF drivers suck with the 79xx: many problems, and some games just won't run at all, at any resolution.
    Now recently the 12.4 drivers have been breaking CF installs.
    Get with it bro, buy Nvidia, and don't blame the massive suck of AMD on the winning green team, who really cares for their end users and proves it.
    BTW - now if you had a clue instead of a fanboy brainwash for the last 5 years, Nvidia just gave you another gigantic present: FXAA, Adaptive V-sync, and frame rate target (with card partner software) in the latest 301.24 Nvidia driver, all the way back to the 8 series, meaning it includes ALL the Nvidia rebrands the entire AMD fanboy anger and rage crew spewed about for years on end.
    ROFL - HA HA HA guess who has the only laugh, let alone the last?
    Reply
  • von Krupp - Monday, April 30, 2012 - link

    ...yup.

    Now we must wait to see if the second half of what we thought might happen will indeed happen. That is, Nvidia is going to hold GK110 and refine it until AMD launches its next generation response. I would be surprised if they bothered to release a $1500+ GK110-based consumer solution to trump the dual-GPU 7990 when it comes out. As awesome as GK110 sounds, I remain skeptical that a single GK110 could trump two HD 7970s or two GTX 680s all by itself. So yes, I believe they will wait for the HD 8970 before firing back with another architecture.

    That all said, that is one very pretty card. The recent price shifts make me feel silly for spending $1100 on my cards, but c'est la vie. This is the tech world, get what you need when you need it (within a three month margin) or else you'll always be waiting.
    Reply
  • Zingam - Monday, April 30, 2012 - link

    I want one of those in my laptop for $50. Until I get that performance for $50 in 20W TDP notebook... I can wait. Reply
  • MrBrownSound - Monday, April 30, 2012 - link

    It's amazing the 680's are flying off the shelf faster than they can be stocked. I hope they get it together and pick up the pace with distribution in north America. I still can't buy one from my favorite retailer. I'd rather stay away from amazon and ebay, in case it is damaged. Some people are taking advantage of the situation and are charging way more than the 680 are worth on those sites too. Reply
  • just4U - Monday, April 30, 2012 - link

    "Availability will also be a significant issue. As it stands NVIDIA cannot keep the GTX 680 in stock in North America.."
    ----

    I think we have seen maybe half a dozen or so in Calgary, so that's an understatement. It's not that they're flying off the shelves; it's that there is a very limited supply.
    Reply
  • CeriseCogburn - Tuesday, May 01, 2012 - link

    The egg has HUNDREDS of confirmed purchaser reviews on the 680, they trickle in at one or two a day, meaning of course, if you have two watts and two eyes, that's how stock is flowing as well. Reply
  • Daggarhawk - Tuesday, May 01, 2012 - link

    Listing the 680 at $499 is a LAFF RIOT. You can't get one for less than $650 right now, if you can find it at all.

    In most markets MSRP is meant for markdowns. For bleeding-edge graphics it's just a :P
    Reply
  • papapapapapapapababy - Tuesday, May 01, 2012 - link

    Nice paperweight. In a few short years I will have this much power integrated into the graphics solution of my $45 CPU. Reply
  • xgmdx - Wednesday, May 02, 2012 - link

    Oh sweet, only one DisplayPort and no HDMI..... Reply
  • Will Robinson - Wednesday, May 02, 2012 - link

    Ladies and gentlemen I present...."Silicon Doc"
    New name....same old ugly hate talk.
    Reply
  • jacobdrj - Thursday, May 03, 2012 - link

    The price for these cards is simply not high enough. That is why they can't keep them stocked. If they raised the price, there would be more stock, and at the same time more profit for NVIDIA. People who truly want this card would have more availability because fewer people would value this card at the higher price or have the means to purchase the card at the higher price.

    Supply and Demand 101...

    Prices for this card need to go WAY up until they can ramp up production...
    Reply
