Update: NVIDIA has confirmed that the Twitter account is indeed theirs, so this information is official.

We always view Twitter with a skeptical eye, but a post was made on the NVIDIAGeForce account 4 hours ago announcing the names of the first two GF100 cards. As we’re largely sure this is a legitimate NVIDIA account we’re going to go ahead and post this information, but please keep in mind that we have not yet been able to confirm that this is indeed an official NVIDIA posting (it’s 10PM on the West Coast).

With that out of the way, the post reads as follows:

Fun Fact of the Week: GeForce GTX 480 and GeForce GTX 470 will be the names of the first two GPUs shipped based on our new GF100 chip!

It’s a very small piece of information, so we don’t have a lot of commentary here, but the names are a bit surprising. They’re consistent with NVIDIA’s G/GT/GTX naming scheme, but we’re not quite sure what happened with the numbers. Technically speaking, NVIDIA launched their 300 series late last year with the GeForce 310, an OEM-only rebadge of the GeForce 210. But we had expected that NVIDIA would fill out the rest of the 300 series with GF100 and GF100-derived parts, similar to how the 200 series is organized with a mix of DX10 and DX10.1 capable parts. Such an expectation is also consistent with the earlier rumors of a GTX 380 and GTX 360.

Instead they've gone ahead and started the 400 series of cards with GF100, not that we’re complaining. This is great news for developers and buyers since it prevents a repeat of the GeForce 4 Ti/MX situation, but it does make us wonder what the point was in burning off a whole series number on a single bottom-tier card. And although this is an article about the 400 series, we're left wondering what the purpose of a rebadged 300 series is, since that clearly has an impact on the naming of the 400 series.

At any rate, no further information was announced, so we still don't know what the clock speeds or performance will be like. It's a good bet, however, that the GTX 470 will have some CUDA Cores disabled, if NVIDIA's GTX 280/260 naming scheme is anything to go by.

Comments Locked

46 Comments


  • DanNeely - Wednesday, February 3, 2010 - link

    The 8800/9800 GT had 112 SPs; the GT 240 only has 96, making it comparable to the best of the several 9600 models they offered.
  • mindless1 - Wednesday, February 10, 2010 - link

    There are a GT 240 and a GTS 240. The GTS is not just an overclocked GT; it's also built on the larger 55(?)nm process, versus 40nm for the GT.

    The GT 240 is the card you're referring to; the GTS 240 is the one the earlier post refers to.

    http://www.nvidia.com/object/product_geforce_gts_2...
  • marc1000 - Tuesday, February 2, 2010 - link

    anyway, they always try their best to confuse the customer... and they did it again! haha!
  • jeromekwok - Tuesday, February 2, 2010 - link

    Is Nvidia going to rename the good old 8800GT yet again as a GT300-series part?
  • AcydRaine - Wednesday, February 3, 2010 - link

    Haha, maybe as a low-to-mid GT300-series part. I would expect the GTX 260/275 chips to become the upper GT300 parts. Though the good ole G92 chip still has some pull. LoL
  • MrSpadge - Tuesday, February 2, 2010 - link

    The 8800GT will probably skip the 300 series and go straight to the 400 series, as it's such a successful card!
  • aegisofrime - Tuesday, February 2, 2010 - link

    Perhaps a 40nm version? AFAIK the GTS 250 is still 55nm. A 40nm version could be a low-mainstream card.
  • Bob Smith - Tuesday, February 2, 2010 - link

    What a pity!

    ATi/AMD delivered the DirectX 11 experience to its fans months ago. nVidia couldn’t make it. It’s real, and it’s a tremendous defeat! That’s what happened, that’s the truth.

    It’s called “opportunity cost”. You can’t put a price on being the first to experience an ATi Radeon HD 5870 in all its glory, a single card crushing nVidia’s dual-GPU GTX 295. And we’re talking about a heavy title such as Crytek’s Crysis @ 2560×1600.

    According to Tom’s Hardware, nVidia GTX 295 simply didn’t work at that resolution. Pity again! Please, see for yourself.

    http://www.tomshardware.com/reviews/radeon-hd-5870...

    One of the paragraphs from this review says:

    “Notice the missing result for Nvidia’s GeForce GTX 295 at 2560×1600 with 8xAA? That’s due to the card not having ample on-board memory to run that configuration (and the game not knowing any better than to keep it from trying). Grand Theft Auto gets around this by simply making resolutions unavailable if a graphics card doesn’t have a large enough frame buffer. Crysis crashes instead.”

    GTX 295 not being able to run Crysis @ 2560×1600? Pity!

    Fermi has got to be better and faster than Cypress. nVidia is obliged to build it that way, since they have had, at the very least, more time to conceive it.

    And, as always, don’t be fooled: you’re going to hurt your pocket to get Fermi installed in your RIG. Be prepared to pay the price. It happened with Cypress, and it’s going to be the same with Fermi. And since nVidia cards are always much more expensive than ATi/AMD’s, one Fermi card could reach as much as 750 bucks. Wait and see.

    Take this weekend and go to your favorite retail store and grab your ATi Radeon HD 5870. It’s there, real. Just take it.

    Fermi, humpf…maybe 3Q2010 you’ll get one. It’s just an illusion…a dream (that hasn’t come true…hehehehe…)

    Cheers!

    ATi/AMD vs nVidia

    ATi/AMD offers 3-monitor output from a single card. You don’t need CrossFire (CF).
    nVidia: the new Fermi delivers only 2-monitor output per card. If you want to experience a 3-monitor gaming setup, you must buy two nVidia cards, which is a lot more expensive than one single ATi Radeon HD 5970, for instance. Although a 3-monitor rig is a bit expensive to afford, that’s what gamers are starting to look at.

    ATi/AMD is delivering the ultimate gaming experience with its latest Dx11 card.
    nVidia is moving away from the gaming industry, bringing horrible products that definitely didn’t make it.

    NVIDIA 3D Vision Surround – Expensive and A Huge Mess?
    http://en.expreview.com/2010/01/23/nvidia-3d-visio...

    We’re talking about gaming here, not working at the office with business solutions, CUDA, GPGPU, etc.

    @those_who_said_about_dx11_titles
    What the hell are CUDA and PhysX, anyway?
    If we only have a couple of titles now, it’s because this is the beginning of a new gaming generation with Windows 7 and DirectX 11; it’s a trend, it’s the future.

    Does anybody REALLY have any titles that benefit from PhysX? How many are available? Did you know that PhysX is proprietary? nVidia does not offer it as an open standard. Guess why there are so few PhysX titles, huh?

    ATi/AMD is working on a new generation of its current line of products, to be available in the second half of the year.
    nVidia: when exactly will Fermi be available? hehehehe…pity!

    ATi/AMD is completely committed to the gaming industry.
    nVidia is a big company, but it isn’t working for the gaming industry anymore.

    ATi/AMD won last year. And it will win again this year.
    nVidia was a big FIASCO last year, and it will be one again in 2010! Pity!

    Cheers!
  • wizzlewiz - Wednesday, February 3, 2010 - link

    What is this I don't even
  • osmosum - Tuesday, February 2, 2010 - link

    What a pity you are just picking and choosing results to suit your argument.

    There's actually a benchmark approach many sites use to address what you're talking about: a suite of dozens of games, up to 100, with the FPS scores from every game added together. Nvidia always wins. SLI always comes in last. Get a clue.

    Oz
