The road to any new microprocessor design is by no means simple. Planning for a major GPU like NVIDIA's Kepler starts four years before the chip's debut. In a world increasingly focused on fast production and consumption of everything, it's insane to think of any project taking that long.

Chip planning involves figuring out what you want to do, what features you want, what the architecture should look like at a high level, etc. After several rounds of back and forth in the planning stage, actual architecture work begins. This phase can take a good 1 - 1.5 years depending on the complexity of the design. Add another year for layout and validation work, then a 6 - 9 month race from tape-out to products on shelves. The teams that spend years on these designs are made up of hardworking, very smart people. They all tend to believe in what they're doing, and they all show up trying to do the best job possible.

Unfortunately, picking a target that's 4 years out and trying to hit it better than your competition is extremely difficult. You can put in an amazing amount of work, push through late nights, struggle with issues, be proud of what you've done and still fall short. We've seen this happen to companies on both sides of the fence; whether we're talking CPUs or GPUs, you win some and you lose some.


Today NVIDIA unveiled Kepler, a more efficient 28nm derivative of its Fermi architecture. The GeForce GTX 680 is the first productized Kepler for the desktop and if you read our review, it did very well. As our own Ryan Smith wrote in his conclusion to the GeForce GTX 680 review:

"But in the meantime, in the here and now, this is by far the easiest recommendation we’ve been able to make for an NVIDIA flagship video card. NVIDIA’s drive for efficiency has paid off handsomely, and as a result they have once again captured the performance crown."

We've all heard stories about what happens inside a company when a chip doesn't do well. Today we have an example of what happens when years of work really pay off. A trusted source within NVIDIA forwarded us a copy of Jen-Hsun's (NVIDIA's CEO) email to all employees, congratulating them on Kepler's launch. With NVIDIA in (presumably) good spirits today, I'm sure they won't mind if we share it here.

If you ever wondered what it's like to be on the receiving end of a happy Jen-Hsun email, here's your chance:

-----Original Message-----
From: Jensen H Huang 
Sent: Thursday, March 22, 2012 9:48 AM
To: Employees
Subject: Kepler Rising
 
Today, the first Kepler - GTX 680 - is on shelves around the world!
 
Three years in the making.  The endeavor of a thousand of the world's best engineers.  One vision - build a revolutionary GPU and make a giant leap in efficient-performance.
 
Achieving efficient-performance, great performance while consuming the least possible energy, required us to change our entire design approach.  Close collaboration between architecture-design-VLSI-software-devtech-systems, intense scrutiny on where energy is spent, and inventions at every level were necessary.  The results are fantastic as you will see in the reviews. 
 
Kepler also cultivated a passion for craftsmanship - nothing wasted, everything put together with care - with a goal of creating an exquisite product that works wonderfully.  Let's continue to raise the bar and establish extraordinary craftsmanship as a hallmark of our company.
 
Today is just the beginning of Kepler.  Because of its super energy-efficient architecture, we will extend GPUs into datacenters, to super thin notebooks, to superphones.  Not to mention bring joy and delight to millions of gamers around the world.
 
I want to thank all that gave your heart and soul to create Kepler.  You've created something wonderful.
 
Congratulations everyone!
 
Jensen


31 Comments

  • jibberegg - Thursday, March 22, 2012 - link

I couldn't help but cringe at the link to the R600 2900XT review with "lose" as the anchor text. I bought that card not long before discovering AnandTech and I remember the sinking feeling I had reading that exact review. The knot of realisation in your stomach that screams "you bought the wrong card!" Sad times.

    Well done NVIDIA though :)
  • TonyB - Thursday, March 22, 2012 - link

    lol, you bought a 2900XT
  • tipoo - Thursday, March 22, 2012 - link

Wow, remember when cards were struggling to hit 60fps at 1920x1200? Lots of today's cards are into the hundreds of FPS with today's games at higher resolutions than that. I guess console dominance is a blessing and a curse in one.
  • Impulses - Thursday, March 22, 2012 - link

    The high end isn't 1920x1200 anymore tho... Sup 5760x1200, even Kepler isn't enough to drive that as a single card solution.
  • nathanddrews - Friday, March 23, 2012 - link

1080p is still a difficult target to hit for most new games - especially if you want 4xAA and all manner of post processing. One thing I wish AT would do is offer benchmarks without AA enabled. When playing a game at a 1:1 pixel ratio on an LCD or PDP, jaggies aren't as evident. IMO, AA in games is like DNR on Blu-ray titles - it just makes it blurrier. Give it to me crisp.

I also think targeting 60fps is a mistake. Sure, the current dominant display technology all connects and runs at 60Hz, but the time will come when true 120Hz displays become the norm - likely 4K as well. Also, in order to get smooth 3-D at 60fps, you need the power to push double the frames in the same amount of time. GPU makers are getting lazy. When we had CRTs, pushing above and beyond 60fps was extremely important, but that goal has been lacking in both the hardware and the expectations.

    I feel like we're being locked down by inferior display technology.
  • Dracusis - Friday, May 4, 2012 - link

    You have some strange and very misguided ideas about technology.
  • ShieTar - Friday, March 23, 2012 - link

1920x1200 hasn't been "high end" for quite a while. I bought my trusty 30'' back in 2007, and ATI introduced Eyefinity at about the same time. And then there is downsampling-AA.

Nevertheless, due to co-development for consoles and PCs, these top resolutions have basically been ignored by game developers and reviewers (not AnandTech, but many others) ever since.

This means that cards reviewed as able to "play modern games at FullHD and 60 fps" struggle to go beyond 25 FPS on my monitor, justifying CrossFire/SLI setups.
But if games include text, like MMOs, it's more often than not in almost unreadably small fonts. There's no good solution for that, since sadly my Dell isn't too good at scaling down, so playing at 1920x1200 is not an option either.

I just wish reviewers would drop the 1680x1050 resolution from their desktop card reviews. People with that kind of monitor should not invest $500 in new graphics cards. Get a decent Full-HD monitor already, or better yet a high-res 27'' or 30'', so that PC developers start taking these resolutions into consideration in their development.

    Seriously, font size scaling? It's not magic, my Amiga could do it in the early 90s.
  • setzer - Friday, March 23, 2012 - link

    I like my 1680x1050 resolution and the 22" 16:10 monitor that gives that resolution to me.
    In fact the top 3 resolutions on the steam hardware survey are:

    1920 x 1080 27.54% +1.06%
    1680 x 1050 18.04% -0.03%
    1280 x 1024 11.20% +0.06%

So as you see, let 1680x1050 stay in the reviews, as there is still a use for it.

On a side note, it's funny that all the top resolutions have different aspect ratios :/
  • SolMiester - Sunday, March 25, 2012 - link

    The high end isn't 1920x1200 anymore tho... Sup 5760x1200, even Kepler isn't enough to drive that as a single card solution.

??, try reading the HardwareHeaven review, it has 5760x1200 results, which prove your statement incorrect!
  • serrationlol - Thursday, March 22, 2012 - link

That is because most games today still use engines like Unreal Engine 3. The only games that really matter when benchmarking high-end cards are the ones with engines that utilize the many DX11 features available today - for example, BF3 and Metro 2033 (the new Crysis, IMO).
