Update: NVIDIA has confirmed that the Twitter account is indeed theirs, so this information is official.

Taking Twitter with the usual grain of salt: a post went up on the NVIDIAGeForce account 4 hours ago announcing the names of the first two GF100 cards. As we’re largely sure this is a legitimate NVIDIA account we’re going to go ahead and post this information, but please keep in mind that we have not yet been able to confirm that this is indeed an official NVIDIA posting (it’s 10PM on the West Coast).

With that out of the way, the post reads as follows:

Fun Fact of the Week: GeForce GTX 480 and GeForce GTX 470 will be the names of the first two GPUs shipped based on our new GF100 chip!

It’s a very small piece of information so we don’t have a lot of commentary here, but the names are a bit surprising. They are consistent with NVIDIA’s G/GT/GTX naming scheme, but we’re not quite sure what happened with the numbers. Technically speaking, NVIDIA launched their 300 series late last year with the GeForce 310, an OEM-only rebadge of the GeForce 210. But we had expected that NVIDIA would fill out the rest of the 300 series with GF100 and GF100-derived parts, similar to how the 200 series is organized with a mix of DX10 and DX10.1 capable parts. Such an expectation is also consistent with the earlier rumors of a GTX 380 and GTX 360.
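For illustration only, the naming convention at work here (an optional G/GT/GTX performance prefix, a leading digit for the series, and the remaining digits for the card’s position within that series) can be sketched as a small parser. The function name and dictionary layout below are our own invention, not anything NVIDIA publishes:

```python
import re

def parse_geforce_name(name):
    # Match retail names like "GeForce GTX 480" or the prefix-less "GeForce 310".
    m = re.fullmatch(r"GeForce(?: (G|GT|GTX))? (\d{3})", name)
    if m is None:
        raise ValueError(f"unrecognized name: {name}")
    prefix, number = m.group(1), int(m.group(2))
    return {
        "prefix": prefix,          # G < GT < GTX performance tiers; None if absent
        "series": number // 100,   # leading digit: the product generation
        "model": number % 100,     # remaining digits: position within the lineup
    }

print(parse_geforce_name("GeForce GTX 480"))
# {'prefix': 'GTX', 'series': 4, 'model': 80}
```

Read this way, the GTX 480 and GTX 470 are simply the 4-series equivalents of the rumored GTX 380 and GTX 360 names.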

Instead they’ve gone ahead and started the 400 series of cards with GF100, not that we’re complaining. This is great news for developers and buyers alike since it prevents a repeat of the GeForce 4 Ti/MX situation, but it does make us wonder what the point was in burning off a whole series number on a single bottom-tier card. And although this is an article about the 400 series, we’re left wondering what the purpose of a rebadged 300 series is, since that clearly has an impact on the naming of the 400 series.

At any rate, no further information was announced; we still don’t know what the performance or clock speeds will be like. It’s a good bet, however, that the GTX 470 will have some CUDA cores disabled, if NVIDIA’s GTX 280/260 naming scheme is anything to go by.

46 Comments

  • Bob Smith - Tuesday, February 02, 2010 - link

    Agree with that...
  • LeftSide - Tuesday, February 02, 2010 - link

    Hmm... Just wondering... Does AMD write the check out to Robert, Bobby, or Bob???
  • Nfarce - Tuesday, February 02, 2010 - link

    Nah, the douche bag just has no life, like an AMD loving, Intel hating fangirl (and I do say "girl" because little zit-faced punks like this couldn't get a girl in the real world - or a hot one anyway).

    What the limpdix like "Bob" don't understand is that competition is good. One year ATi wins; another year nVidia wins. When both take turns winning we consumers all win. I've had GeForce and ATi, and have both in two current gaming rigs (HD 4870 in one and GTX 275 in another). And for the record, my GTX 275 overclocked and tuned with programs like nHancer smokes the HD 4870 with the best it can do from ATi's CCC. But you'll never hear an ATi fangirl bring up overclocking facts and ATi driver and CCC snafus.

    In any event, I'll be selling one of those rigs this year and building another (probably the HD 4870 rig). If ATi still has a better card than nVidia, as it does currently, then I'll buy ATi. If nVidia has better performance with the new card however - even for more bucks, like I spent on the GTX 275 over the HD 4870 - then I'll probably go nVidia. Finally, I game a lot with Microsoft FSX, and nVidia cards smoke the ATi cards in that program (one reason I have two rigs).

    Fangirls need not apply. Nothing to see here, right Bob?
  • Galidou - Wednesday, February 03, 2010 - link

    ATi owner here, not a fanboy - system builder, tester of every card on the market. On the ATi overclocking front: I've never used CCC, it sucks, but ATI Tray Tools overclocks well. Comparing an overclocked GTX 275 with a 4870? Come on, look at the prices - the 4870 fares well against the GTX 260 (which is priced higher than the 4870). Bob, Fermi will overtake ATi and that's for sure; they developed something that was technically a challenge. Sure it's going to be overpriced, but it's going to be for those who want performance over anything else, price included.

    Problem is, nowadays most games are developed for both consoles and PCs, so they run amazingly on something like a GTX 260/4870 at high/max details at 1920x1080 - except Crysis. The best games are going to come out on consoles and PC, so until the XboxCube or PlayStation 4 comes out, titles won't need an amazing video card; keep your CPU and RAM at a good level and everything will be alright. Performance-wise, these video cards are for benchers, extreme overclockers, or someone who owns a 30-inch monitor. That represents 0.01% (if that) of the video card market. Welcome!
  • coldpower27 - Friday, February 05, 2010 - link

    Agreed with the point that the GTX 275 and the 4870 are in two different price categories. I would say the 4890 and the GTX 275 are at parity.

    Fermi is designed to push technical limits and have a halo effect on Nvidia's lineup to help sell their lower offerings. It wasn't designed to be truly cost effective, like ATI's price/performance parts are. Nvidia has a different goal in mind, and that is totally alright.

    With the GPU horsepower we have now, we can continue pushing boundaries: you can run a multi-monitor setup, or get GTX 295 performance levels from a single GPU at lower power consumption.

    If you just want adequate performance, you can get something in the 9800GTX+ range that plays fine at 1680x1050.

    Take a look at Steam's survey results: even now, 1280x1024 (an older 17-inch CRT or 17-19-inch LCD resolution) and 1680x1050 (20-22-inch LCDs) represent the majority.

    These video cards were never developed with "adequate" in mind; Fermi is truly about pushing boundaries and being bold.
  • Targon - Monday, February 08, 2010 - link

    There is the fact that Fermi still isn't available, and AMD has had a good amount of time to improve things on the 5870. As a result, we may very well see a 5890 released by the time Fermi even hits the shelves. In addition to that, work is obviously going on toward the Radeon 6000 series while NVIDIA has to keep working to get Fermi out the door.

    Fermi may end up taking the performance crown by the time it is released, but we may be looking at another Radeon 9700 vs. Geforce 5800 situation here, where nothing NVIDIA does in this generation will let them catch up due to all the "extra abilities" that their products are trying to offer.

    The Radeon 5870 (and a number of other 5000 series parts) is already capable of 3-monitor output, and not just in a few titles designed to support it. Eyefinity really does let you combine multiple monitors together so applications see only one display with the higher resolution available. Fermi, no matter its horsepower, may not be able to offer that.

    That's really about it: Fermi just isn't out yet, and until it is, for most people the big question won't be how POWERFUL it is; it will be about DirectX 11 support and, going forward, OpenCL support and other standards. Yes, PhysX support will matter in a handful of applications/games, but how long will it be before it is replaced by an open standard by application and game developers?
  • Darkness Flame - Tuesday, February 02, 2010 - link

    nVidia has already announced, and possibly launched, the GeForce 300M series, along with the G 310 OEM desktop card. They could be skipping the rest of the 300 series so that consumers won't be confused as to which cards carry the new architecture and which do not. Just my guess, however.
  • Taft12 - Tuesday, February 02, 2010 - link

    There have been 8800 rebadge jokes already, but I really think they're reserving the 300 namespace for rebadges of current products, since the die size and price of Fermi will be incredibly high.
  • Mr Perfect - Tuesday, February 02, 2010 - link

    "But we had expected that NVIDIA would fill the rest of the 300 series with GF100 and GF100-derived parts, similar to how the 200 series is organized with a mix of DX10 and DX10.1 capable parts."

    I, for one, am glad they aren't pulling that this time around.
  • shabby - Tuesday, February 02, 2010 - link

    Just wait, they'll rename those low-end 300 cards into 400 ones in no time.
