If there was any doubt after Saturday night as to what NVIDIA's prybar was for, this should put it to rest. FedEx just dropped off the prybar's companion, the venerable wooden crate.

Top: Caution, Weapons Grade Gaming Power

Side: 0b1010110010 [690]
BT-7.080
G08-H86-A000

Applying the prybar in a slightly more civilized manner than we would in most video games, we find the GeForce GTX 690 inside. (ed: If this were a '90s video game, then according to the Crate Review System NVIDIA is already doing very well)

That's all we can show you for now. We'll have more on Thursday.

68 Comments

  • InsaneScientist - Monday, April 30, 2012 - link

    So, I know their supply is incredibly constrained, and the likelihood is that they only sent you one, but did they, by any chance, send you two of them so we can see some quad-SLI figures?

    Granted, that level of performance is insane, and the only thing likely to need that kind of horsepower would be a large multi-monitor rig (AMD's Eyefinity or NVIDIA's Surround), but still...
  • Ryan Smith - Tuesday, May 1, 2012 - link

    Just the 1.
  • Sabresiberian - Tuesday, May 1, 2012 - link

    I get a chuckle out of the "crowbar and crate" thing. If I buy one, will it come with those things?

    :D

    This card looks so beautiful; I'm having a hard time fighting off the "coolz toyz" factor, heh. Maybe I'll be over it by the time they are available.

    Personally, I'd like a smidge more memory - I know FXAA works wonders for memory usage, but still, in a multi-monitor setup, will 2GB be enough? (The 4GB is 2GB per GPU, and it doesn't stack but is mirrored, for those who aren't familiar.)

    ;)
  • repoman27 - Tuesday, May 1, 2012 - link

    I noticed the 690 in binary on there right off the bat, but what is the meaning of the rest of the markings?
  • repoman27 - Tuesday, May 1, 2012 - link

    Never mind, I used the power of Google.

    0b1010110010 = 690
    BT-7.080 = 7.080 billion transistors
    G08-H86-A000 = 408-486-2000 (NVIDIA's phone number masked by GHA, i.e. Graphics Hardware Acceleration)

    (ed: It checks out; there's a quick verification sketch after the comments.)
  • justniz - Tuesday, May 1, 2012 - link

    When NVIDIA comes up with a new GPU architecture, they always do this:
    First, they release a single-GPU card called the x80 GTX. It costs around $500 and works perfectly fine with all games for at least the next 2-3 years.
    Then they release detuned versions called the x70 GTX, x70 GT, x60 GT, etc. for the budget market.

    Then they do a dual-GPU version with slightly lower clocks, called the x90 GTX, that nobody can really put to full use, and it costs around $1,000.

    Then it's only a very short time until the next chip revision and its x80 come out, blowing away the old x90 and totally devaluing the investment of anyone who actually bought an x90 only about two months earlier. More often than not, that's also when Microsoft releases a new version of Windows or Direct3D with 'features' that review sites say are to die for, but that supposedly can't be supported on anything older than the very latest generation of hardware.

    Furthermore, it takes several years to make a video game, and developers usually pitch the graphics at whatever the top-end hardware was when development started, not least because they know that a large share of their customers will only have a budget card (x70 GT or lower).

    Consequently, assuming you don't have a large multi-monitor setup, as far as I can tell there are literally no PC games in existence or coming anytime soon (i.e. before the next NVIDIA chip revision) where you will actually need or use all the power of a GTX 690, even with all the graphics settings fully maxed out.

    So I ask: unless you have a massive multi-monitor system, why waste the money? There are no games to use it, and by the time any game comes out that can stretch even a GTX 680, the 6xx series will be at least a year out of date and probably won't support the latest DirectX anyway.

    My GTX 580 still handles everything I can throw at it perfectly, even fully maxed out.
  • slikts - Thursday, May 3, 2012 - link

    One reason is 120Hz screens: a GTX 690 could, for instance, run BF3 at ultra settings, full HD, and 120 FPS. It's the only single card that could do this currently (I assume, anyway), and even then it might drop below 120 when the action gets heavy enough.
  • stjoker69 - Wednesday, May 2, 2012 - link

    "(ed: If this was a 90's video game, then according to the Crate Review System NVIDIA is already doing very well)"

    Ryan Smith, you got this backwards. According to the CRS, the longer it takes you to find a crate, the better. Come on, mang!
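
(ed: A few figures in this thread are easy to sanity-check. The sketch below, written for this post in Python, verifies repoman27's decode of the crate markings and puts rough numbers on the 2GB-per-GPU and 120Hz points raised above. The per-GPU transistor count is NVIDIA's published GK104 figure; the framebuffer math assumes a plain 32bpp color target and is illustrative only, not a measurement.)

    # Sanity checks for the numbers discussed in the comments above.

    # --- repoman27's crate-marking decode ---

    # Side marking 1: the model number in binary.
    assert int("1010110010", 2) == 690  # 0b1010110010 -> 690

    # Side marking 2: BT-7.080 -> 7.080 billion transistors
    # (two GK104 GPUs at 3.54 billion transistors each).
    total_transistors = 2 * 3.54e9
    assert f"BT-{total_transistors / 1e9:.3f}" == "BT-7.080"

    # Side marking 3: G08-H86-A000 -> 408-486-2000, NVIDIA's main
    # phone number, with letters standing in for some of the digits.
    digit_for = {"G": "4", "H": "4", "A": "2"}
    phone = "".join(digit_for.get(ch, ch) for ch in "G08-H86-A000")
    assert phone == "408-486-2000"

    # --- Sabresiberian's 2GB-per-GPU question ---
    # SLI mirrors memory rather than pooling it, so the "4GB" GTX 690
    # exposes 2GB of usable VRAM. One 32bpp color buffer at 3x1080p
    # surround resolution:
    width, height, bytes_per_pixel = 5760, 1080, 4
    buffer_mib = width * height * bytes_per_pixel / 2**20
    print(f"One 5760x1080 color buffer: {buffer_mib:.1f} MiB")  # ~23.7 MiB
    # Multiply by MSAA samples, depth/stencil, and G-buffers, and render
    # targets alone claim a real slice of 2GB before textures are counted.

    # --- slikts' 120Hz point ---
    # At 120Hz, every frame has to finish in:
    frame_budget_ms = 1000 / 120
    print(f"Frame budget at 120Hz: {frame_budget_ms:.2f} ms")  # ~8.33 ms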
