While we've been holding our breath for a die-shrunk GT200, NVIDIA's first announcement of something genuinely new since the introduction of GT200 is a promise of a single-card multi-GPU solution based on a GPU that falls between a GTX 260 Core 216 and a GTX 280 in performance. The fact that the first thing we hear after GT200 is another ultra-high-end launch is interesting. If the end result is pushing the GeForce GTX 260 under $200 and the GTX 280 under $300, then we can definitely get behind that: it would amount to a midrange re-introduction, with current GT200 parts pushed down in price. While we'd love to see parts from NVIDIA designed for budget-minded consumers based on their new technology, the current direction does appear to be a viable alternative.


Image courtesy NVIDIA

To be fair, we don't know yet what will happen to GTX 260 and GTX 280 pricing. It is possible today, through combinations of instant and mail-in rebates, to find the GTX 260 for $200 and the GTX 280 for $300, but these deals are the exception rather than the rule. If pre-rebate pricing could fall to these levels and below, much of NVIDIA's failure to price its excellent technology affordably would be fixed. Of course, this seems like a tough pill for NVIDIA to swallow, as the GT200 die is huge: pricing these parts so low has to be eating into their margins.


Image courtesy NVIDIA

And yes, this is a complete divergence from a hard launch. This announcement precedes retail availability by exactly three weeks; hardware will not be available until January 8th. While we are happy to talk about a product whenever we are allowed, it is still our opinion that hard launches are better for everyone. Talking about something before it launches can (and has in the past, with both ATI and NVIDIA) lead to changes before launch that reduce performance or eliminate products entirely. This is especially true around the holidays, which is the most tempting and the worst time to announce a product without availability.

But be that as it may, we have the information and there's no reason to deny it to our avid readers just because we wish NVIDIA were acting more responsibly.

A Quick Look Under The Hood
Comments

  • SiliconDoc - Sunday, December 21, 2008 - link

    Jiminy crickets, can I buy you a TWIMTBP dartboard so you can get out some anger?
    I think you missed the boat entirely. I think it's great the two companies are competing, and they have a war going - I bet it's motivational for both developer teams.
    I'd also bet the "sandwich" method NVidia used will be good learning for cards down the line, because it doesn't look like die shrinks are going to keep up with rising transistor counts.
    Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. That's just simple plain common sense - something your analysis in that matter didn't have. A bitter putdown it was.
    Now, the point you missed that is valid is what BOTH card companies do, and so do the CPU makers - and they cover it up by claiming problems with "yield". And the enthusiast public sucks it down like mental giant candy - oh so smart is the geek end user, he knows why these things happen - and he or she blabbers it to all his or her peers... oh so smart the blabbering corporate protector geek end user is...
    The bottom line is they CRIPPLE cards, gpu's, cpu's, bios, whatever it is they need to do - cut pipelines, chop shaders, whack cache...
    And somehow the supposedly so bright geek enthusiast squads babble the corporate line "Well, no we uhh.. we have a bad chip come out, and uhh... well we have to disable part of it - err, uhh, we just like disable it and mark it lower..."
    (add the speed binning - which occurs - to validate all the other chanting BS - and it works - for the corporate line)
    When the real truth is THEY PLAN THE MARKET SEGMENT - THEY CHOP AND DISABLE AND SLICE AND CUT AND REMOVE AND ADD BACK IN - TO GET WHAT THEY WANT.
    Yes, there are some considerations - the number of RAM chips on the card for a given bus width, for example - but this they decide...
    They don't just shoot it straight out the way they HAVE TO because of "low yields".
    The only part that is valid is SPEED BINNING.
    Somehow though, the excuses for 'em are everywhere, as if by some wonder the CPU or GPU die "has exactly the correct portion magically come out retarded" - the portion they want disabled, or changed, for their TIER insertion of exactly the proper product class.
    I can't stand it when BS becomes widespread belief.
  • Razorbladehaze - Sunday, December 21, 2008 - link

    What does this even mean?

    "Since the 260 / 192 and 216 only slightly beat the 4870 / 512 and 1024 - I disagree NVidia "tried it with them" and found it not good enough - they well knew to open it to 240 shaders for more of an advantage in the X2 boost. "

    First of all, this is a total contradiction of logic if I'm indeed reading this correctly. I'll even let you keep the idea that the 260 performs better (which it never has, although the 216 does come much closer to the 4870).

    But why then does nvidia release another version (216) with increased shaders? It's not just because they suddenly have better yields, which could be the case; once again it was reactionary. From the overall landscape of nvidia's moves, they would have been happy to go on with their original design.

    Furthermore, the logic once again is not there that nvidia would increase shaders again when releasing this product, and wait for the decrease in process size (probably the larger of the two factors) before releasing the GT295 - unless of course they felt they HAD to in order to ensure they would beat out the 4870X2. If what is posited in the quote by SiliconDoc were true, logically nvidia would have wasted no time and likely released a GT295 much closer to the release date of the 4870X2.

    Finally, what the hell are you talking about with "for more of an advantage in the X2 boost"?

    Does this mean the 4870X2 had a boost? Or are you trying to state that there is some special feature on the 4870X2 that provides a boost (which there is not, unless this is a reference to OC'ing through Overdrive)? Perhaps, though - and this is how I interpret it - it's a mistake or typo and you really meant the GT295 when you said X2.

    And with that interpretation the whole comment, " they well knew to open it to 240 shaders for more of an advantage in the X2 boost. " only goes to reinforce the logic that nvidia felt they had to make another REACTIONARY change to their own designs to gain the edge.

    Finally, I'm done posting in here, so doc, you're more than welcome to flame all you want. I've already spent enough of my precious time replying to this senseless drivel.

  • SiliconDoc - Monday, December 22, 2008 - link

    You're a red freak dude. "Reactionary design". LOL
    roflmao
    You need some help.
    Quoting Aristotle to Nvidia - and you want to send it to their people. Like I said, you've got a huge chip on your shoulder, so you yap out big fat lies.
    Can you stop yourself sometime soon? I hope so.
  • SiliconDoc - Sunday, December 21, 2008 - link

    You're the one who said they tried it (dual) with the 260 and found it failed, not to mention you claiming (pulling the rabbit out of your hat) that they tried it with the 280 and found heat or watts untenable... another magic trick with your mind.
    Obviously the reason they opened up to 240 shaders is because they could. There's ZERO room left in the 4870 - none, fella. Zip. Nada.
    I guess you know what Nvidia engineers do because you figured it out all on your own, and they are so stupid, they tried your idiot idea, and found it didn't work.
    (NO, YOU SAID IT BECAUSE YOU HAVE A HUGE CHIP ON YOUR SHOULDER - AND YOU'RE OBVIOUSLY WRONG BECAUSE IT'S DUMB, DUMB, DUMB.)
    That's the very type of stuff that we've had to put up with for so long now, - and the next round is already here.
    Endless lying pukehead statements made by the little war beasts.
    I'm sick of it - I don't care if you're not lying, don't care if you're just making up stupid dissing stuff, but to lie up a big can of crud then pretend it's true - and expect others to accept it - NO.
  • SiliconDoc - Sunday, December 21, 2008 - link

    Perhaps you forgot the 4870 had a 512MB version released, then a 1GB version.
    Maybe that would explain it to you, but I doubt it.
  • Razorbladehaze - Sunday, December 21, 2008 - link

    This is funny, "I can't stand it when BS becomes widespread belief. "

    I've noticed that this poster is quite guilty of spinning BS for others to buy into.

  • rocky1234 - Friday, December 19, 2008 - link

    Problems in Far Cry 2 with CrossFire? I have a 4870X2 & the game has worked fine from the day I got it, & I got it about three days after it was released... I play on maxed-out settings & 1080p without issues & the game always ran smooth. The only reason I stopped playing it is because I got bored of the game.

    This release of the 295 is just a PR move for Novidia, nothing more. They believe they have to be Number 1 or nothing, because if they are number 1 they can charge whatever they want for their video cards.

    I am just waiting for all those websites to start doing the reviews, & when the Novidia card wins a few they trash the AMD card because it lost by a few frames, even though the card will let you play pretty much any game at high res & maxed settings. Oh yeah I did find a use for my old 8800GT/9800GT I now use it as a slave card for PhysX - the only problem is, where are the games? So far the ones that use it are low in numbers & the gameplay does not seem to get better by having it, except now when the Novidia card kicks in & does the CPU's work my quad core can take a small nap...lol

  • SiliconDoc - Tuesday, December 30, 2008 - link

    " This release of the 295 is just a PR move for Novidia nothing more they beleive they have to be Number 1 or nothing because if they are number 1 they can charge what ever they want for their video cards. "

    Oh, so I guess if I (like tens of millions of others) only have a single 16x PCI-E slot, like on the awesome Gigabyte P45 > GA-EP45-UD3R, or GA-EP45-UD3L, or UD3LR, or ASUS P5Q SE, P5Q, P5Q SE+, or Biostar T-Force, or an endless number of P35 motherboards, or 965, G31, G33, etc... I should just buy the ATI product, and screw me if I can't stand CCC, or any number of other things - I'll just have to live with the red card for top frames... YOU DON'T WANT ME OR TEN MILLION OTHERS ALLOWED TO BUY A TOP DUAL NVIDIA?!?!?!

    " I am just waiting for all those websites to start doing the reviews & when they Novidai card wins a few they trash the AMD card because it lost by a few frames even though the card will let you play pretty much any game at high res & maxed settings."

    Well I am waiting for it too, and I can hardly wait, since those same websites compared 2 GPU cores to one core and, by total BS, declared 2 cores the winner. If you start screaming it's the best single-slot solution like dozens have for 6 months, then tell me and mine it's a waste for NVidia to do the same thing, don't mind me when I laugh in your face and consider you a liar, ok?

    " Oh yeah I did find a use for my old 8800GT/9800GT I now use it as a slave card for PhysX"

    Ahh, good. You won't be doing that with a red card anytime soon, and if you do, we see by the news, IT WILL BE NVIDIA with a single rogue developer supporting it! (not ATI supporting their own cards for it - so much for customer care and cost consciousness)
    _________________________________________________________________

    So, you think the tens of millions of users with a single 16x slot - so NO 16x/16x, or just 16x/4x PCI-E - deserve a chance at a top-end card other than ATI? OH, that's right, you already basically said "NO!" - and squawked that it's a PR crown stunt...

    How about SLI board owners who don't want to go ATI? You think they might be interested in DUAL in one slot, or QUAD?
    No, you didn't give it a single brain cell of attentiveness or thought...
    Duhh... drool out a ripping cutdown instead....

    I'm sure I'll hear that I'm the one with the "problem", from another spewing propaganda.
  • initialised - Thursday, December 18, 2008 - link

    [quote=Anand]Making such a move is definitely sensible, but it is at the highest end (2560x1600 with tons of blur (I mean AA, sorry))[/quote]

    No, according to the Geordie translation (that we coined on the way home from CyberPower in a traffic jam), AF is Deblurrin' while AA is Dejaggin'; please amend the article.
  • JarredWalton - Thursday, December 18, 2008 - link

    No, antialiasing adds some blurriness to images. It smooths out jaggies on edges by blurring those edges. It's one of the reasons I don't think 4xAA - or even AA in general - should be at the top of the list of IQ features people enable; I only turn it on when everything else is enabled, and I start at 2xAA and progress from there. I'm with Derek in that 2xAA on high-resolution displays is more than enough for all but the most demanding people.
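
    To make that point concrete, here is a minimal sketch of supersampling on a one-dimensional edge - a hypothetical Python illustration only, not any vendor's actual AA implementation. With one sample per pixel the edge is a hard step; averaging several sub-pixel samples turns the edge pixel into a blend of the colors on either side, which is exactly the softening described above.

    # Minimal sketch: why antialiasing softens edge pixels.
    # Hypothetical illustration; the sample counts and edge position are arbitrary.

    def coverage(x):
        """Scene: everything left of x = 4.3 is white (1.0), the rest is black (0.0)."""
        return 1.0 if x < 4.3 else 0.0

    def render(width, samples_per_pixel):
        """Average several evenly spaced sub-pixel samples per pixel (the 'AA' step)."""
        pixels = []
        for px in range(width):
            total = 0.0
            for s in range(samples_per_pixel):
                total += coverage(px + (s + 0.5) / samples_per_pixel)
            pixels.append(total / samples_per_pixel)
        return pixels

    print(render(8, 1))  # 1 sample per pixel: hard step  [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
    print(render(8, 4))  # 4 samples per pixel: edge pixel at x=4 becomes 0.25, a blended value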
