A Different Sort of Launch

Fermi will support DirectX 11 and NVIDIA believes it'll be faster than the Radeon HD 5870 in 3D games. With 3 billion transistors, it had better be. But that's the extent of what NVIDIA is willing to talk about with regard to Fermi as a gaming GPU. Sorry folks, today's launch is targeted entirely at Tesla.


A GeForce GTX 280 with 4GB of memory is the foundation for the Tesla C1060 cards

Tesla is NVIDIA's High Performance Computing (HPC) business. NVIDIA takes its consumer GPUs, equips them with a ton of memory, and sells them under the Tesla brand in personal supercomputers and datacenter computing clusters. If you have an application that can run well on a GPU, the upside is tremendous.


Four of those C1060 cards in a 1U chassis make the Tesla S1070. PCIe connects the S1070 to the host server.

NVIDIA loves to cite examples where algorithms ported to GPUs run dramatically faster than they do on CPUs. One such example is a seismic processing application that Hess found ran very well on NVIDIA GPUs. Hess migrated a cluster of 2000 servers to 32 Tesla S1070s, bringing total cost down from $8M to $400K and total power from 1200kW down to 45kW.

Hess Seismic Processing Example    Tesla              CPU
Performance                        1                  1
# of Machines                      32 Tesla S1070s    2000 x86 servers
Total Cost                         ~$400K             ~$8M
Total Power                        45kW               1200kW

Obviously this doesn't include the servers needed to drive the Teslas, but presumably that's not a significant cost. The move works out to roughly a 20x reduction in cost and a 27x reduction in power. Either way the potential is there; it's just a matter of how many similar applications exist in the world.

According to NVIDIA, there are many more cases like this in the market. The table below shows what NVIDIA believes is the total available market (TAM) over the next 18 months for various HPC segments:

Segment    Seismic    Supercomputing    Universities    Defence    Finance
GPU TAM    $300M      $200M             $150M           $250M      $230M

These figures were calculated by looking at the algorithms used in each segment, the number of Hess-like Tesla installations that could be done, and the current budget for non-GPU based computing in those markets. If NVIDIA meets its goals here, the Tesla business could be bigger than the GeForce one. There's just one problem:

As you'll soon see, many of Fermi's architectural features are targeted specifically at Tesla markets. The same could be said about GT200, albeit to a lesser degree. Yet Tesla accounted for less than 1.3% of NVIDIA's total revenue last quarter.

Given these numbers it looks like NVIDIA is building GPUs for a world that doesn't exist. NVIDIA doesn't agree.

The Evolution of GPU Computing

When matched with the right algorithms and programming efforts, GPU computing can provide some real speedups. Much of Fermi's architecture is designed to improve performance in these HPC and other GPU compute applications.

Ever since G80, NVIDIA has been on a path to bring GPU computing to reality. I rarely get a non-marketing answer out of NVIDIA, but in talking to Jonah Alben (NVIDIA's VP of GPU Engineering) I had an unusually frank discussion.

From the outside, G80 looks to be a GPU architected for compute. Internally, NVIDIA viewed it as an opportunistic way to enable more general purpose computing on its GPUs. The transition to a unified shader architecture gave NVIDIA the chance to, relatively easily, turn G80 into more than just a GPU. NVIDIA viewed GPU computing as a future strength for the company, so G80 led a dual life. Awesome graphics chip by day, the foundation for CUDA by night.
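
To make the "foundation for CUDA" point concrete, here's a minimal sketch of the programming model G80 enabled: a C function (a kernel) that the hardware fans out across thousands of threads. This is an illustrative example of mine, not NVIDIA's code; the kernel and buffer names are made up.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each thread computes one output element; the hardware schedules
// thousands of these threads across the chip's shader cores.
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                  // 1M elements
    const size_t bytes = n * sizeof(float);

    // Host-side buffers
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device-side buffers: data crosses PCIe, gets processed, comes back
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    vecAdd<<<(n + threads - 1) / threads, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);           // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

Compile with nvcc (e.g. nvcc vecadd.cu); the same model scales from G80's 128 shader processors up through Fermi's 512.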

Remember that G80 was hashed out back in 2002 - 2003. NVIDIA had some ideas of where it wanted to take GPU computing, but it wasn't until G80 hit that customers started providing feedback that ultimately shaped the way GT200 and Fermi turned out.

One key example was support for double precision floating point. The feature wasn't added until GT200, and even then it was only added based on feedback from G80's compute customers. Fermi kicks double precision performance up another notch: it now executes FP64 ops at half of its FP32 rate (more on this later).
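
For a rough sense of what that rate change means, here's a hypothetical double precision kernel of my own (a DAXPY-style sketch; the names and launch parameters are illustrative, not from NVIDIA). GT200 paired each SM's eight FP32 units with a single FP64 unit, so code like this ran at roughly 1/8 the single precision rate; Fermi's half-rate FP64 design gives up far less.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Double precision a*x + y. On GT200 a kernel like this was limited
// by the single FP64 unit per SM; on Fermi the FP64 rate is half the
// FP32 rate instead.
__global__ void daxpy(int n, double a, const double *x, double *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(double);

    double *hx = (double *)malloc(bytes);
    double *hy = (double *)malloc(bytes);
    for (int i = 0; i < n; i++) { hx[i] = 1.0; hy[i] = 2.0; }

    double *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expect 3*1 + 2 = 5

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```

One era-specific caveat: double precision requires compiling for compute capability 1.3 or later (e.g. nvcc -arch=sm_13 daxpy.cu); for older targets, nvcc demotes doubles to floats with a warning.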

While G80 and GT200 were still primarily graphics chips, NVIDIA views Fermi as a processor that takes compute just as seriously as graphics. NVIDIA believes it's on a different course, at least for the short term, than AMD. And you'll see this in many of Fermi's architectural features.

Comments

  • Dobs - Thursday, October 1, 2009

    I'm with the zorro - will be setting this up for my son pretty soon - he is an extreme gamer who has mentioned multiple monitors to me a few times over the last few months. Up until now I only had a vague idea of how I could accommodate his desire... that has all changed since the introduction of Eyefinity.
  • Finally - Thursday, October 1, 2009

    ..pussy-whipped by your son?
  • the zorro - Thursday, October 1, 2009

    moron, I am going to buy two more monitors and then... Eyefinity.
  • chizow - Wednesday, September 30, 2009

    Nvidia didn't mention anything about multi-monitor support, but today's presentation wasn't really focused on the 3D gaming market and GeForce. They did spend a LOT of time on 3D Vision though, even integrating it into their presentation. They also made mention of the movie industry's heavy interest in 3D, so if I had to bet, they would go in the direction of 3D support before multi-monitor gaming.

    It wouldn't be hard for them to implement it though if they wanted to or were compelled to. It's most likely just a simple driver block or code they need to port to their desktop products. They already have multi-monitor 3D on their Quadro parts and have supported it for years; it's nothing new really, just new in the desktop space with Eyefinity. It then becomes a question of whether they're willing to cannibalize their lucrative Quadro sales to compete with AMD in this relatively low-demand segment. My guess is no, but hopefully I'm wrong.
  • Dobs - Thursday, October 1, 2009

    I think Nvidia is underestimating the desire for and affordability of multi-monitor gaming. Have you seen monitor prices lately? Have you seen the Eyefinity reviews?

    Not making any mention of it is a big mistake in my book. Sure they can do it, but it will reduce their margins even further since they obviously hadn't planned on spending the extra dollar$ this way.

    I do like the sound of the whole 3D thing in the keynote though... and everyone wearing 3D glasses (not so much). But it will be cool once the Sony vs. Panasonic vs. whoever 3D format war is finished (although it's barely started), so we mainstream consumers know which 3D product to buy. Just hope that James Cameron Avatar film is good :)
  • chizow - Thursday, October 1, 2009

    Yeah I've seen the reviews and none seemed very compelling tbh; the 3-way portrait views seemed to be the best implementation. 6-way is a complete joke, unless you enjoy playing World of Bezelcraft. There are also quite a few problems with its implementation as you alluded to; the requirement of an active DP adapter was just a short-sighted, half-assed implementation by AMD.

    As Yacoub mentioned, the market segment for people interested in or willing to invest in this technology is ridiculously small; 0.1% is probably pretty close to accurate given multi-GPU technology is estimated to be only ~1% of the GPU market. Surely interest in multi-monitor gaming is below that by a significant degree.

    Still, for a free feature it's definitely welcome, even in the 2D productivity sense, or perhaps for a day trader or broker... or anyone who wanted to play 20 flops simultaneously in online poker.
  • Dobs - Thursday, October 1, 2009

    Lol @ 20 flops simultaneously in online poker. I struggle with 4 :)

    Agree with the 6-monitor bezelcraft - the crosshair is the bezel :)
    I guess I'm lucky that my son is due for a screen upgrade anyhow, so all 3 monitors will be new. Which one to get will be the problem - I hear Samsung is bringing out small-bezel monitors specifically for this, but I probably can't wait that long. (Samsung LED looks awesome though.) I might end up opting for 3 of Dell's old (2008) 2408WFPs (my work monitor) as I know I can get a fair discount on these and I think they have DisplayPort. I'm not sure if my son will like landscape or portrait better, but I want him to have the option... and yeah, apparently the portrait drivers are limited (read: crap) atm.

    Appreciate your feedback as well as your comments on the 5850 article... I actually expected the GT prices to be $600+ not the $500-$550 you mentioned. Oops... rambling now. Cheers
  • chizow - Thursday, October 1, 2009

    Heheh I've heard of people playing more than 20 flops at a time....madness.

    Anyways, I'm in a similar holding pattern on the LCD. While I'm not interested in multi-monitor as of now, I'm holding out for LED 120Hz panels at 24+" and 1920. Tbh, I'd probably check out 3D Vision before Eyefinity/multi-monitor at this point, but even without 3D Vision you'd get the additional FPS from a 120Hz panel along with improved response times from LED.

    If you're looking to buy now for quality panels with native DP support, you should check out the Dell U2410. Ryan Smith's 5870 review used 3 of them I think in portrait and it looked pretty good. They're a bit pricey though, $600ish but they were on sale for $480 or so with a 20% coupon. If you called Dell Sm. Biz and said you wanted 3 you could probably get that price without coupon.

    As for GTX 380 price, was just a guess, Anand's article also hints Nvidia doesn't want to get caught again with a similar pricing situation as with GT200 but at the same time, relative performance ultimately dictates price. Anyways, enjoyed the convo, hope the multi-mon set-up works out! Sounds like it'll be great (especially if you like sims or racing games)!
  • RadnorHarkonnen - Thursday, October 1, 2009

    Eyefinity is screaming for DIY.

    Bezel Craft can be easily avoided. Just tear your monitors apart. A stand for 3 monitors is easily ordered/DIY made. Usually the bezel is way thicker than it needs to be.

    Unfortunately I already have a 4850 CF setup that I will keep for another year or two and let the technology mature for now.
  • wifiwolf - Wednesday, September 30, 2009

    Can we have a new feature in the comments please?
    I just get tired of reading a few comments and getting bugged by some SiliconDoc interference.
    Can we have a noise filter so the comments area gets back to normal?
    Every graphics-related article gets this noise.
    Just a button to switch the filter on. Thanks.
