NVIDIA's GeForce 8600, 8500 and 680SE

While NVIDIA wasn't showing off anything particularly impressive itself, we kept running into good NVIDIA news throughout CES.  Early in the show we saw a mobile GeForce 8600 and 8400, and later on we learned a bit more about these two GPUs. 

NVIDIA has two G80 derivatives designed to target the more mainstream segments: G84 and G86.  G84 will be the basis of NVIDIA's GeForce 8600, while G86 will be used in what is currently known as the GeForce 8500.  Detailed specifications aren't known, other than that the chips are supposed to be built on an 80nm process; the expected launch date is around April or May.

Also on the roster is a 320MB GeForce 8800 GTS card, which is expected to be priced at $299.  Clock speeds and shader configuration are currently expected to be no different from those of the $449 640MB version; the card will simply have less memory.  Honestly, we expect 320MB to be more than enough for most games and resolutions, which may make the 320MB card extremely attractive.  Couple it with a Core 2 Duo E4300 and you've got one fast and affordable system. 

Despite a decent amount of information about upcoming NVIDIA GPUs, we didn't hear anything about an 80nm G80.  Much of what happens with the G80's successor will probably depend on ATI's R600 release schedule. 

On the platform side, NVIDIA will be introducing an nForce 680SE chipset, which will be a less overclockable version of the 680i chipset.  The price point will be under $180, but we're still not sure how it will fit into the big picture between the 650i and 680i. 

Those looking for NVIDIA's Vista 8800 GTX driver needn't look any further than Microsoft's booth at CES.  All of the gaming machines at Microsoft's booth were running nForce 680i motherboards, each with a single GeForce 8800 GTX, under Windows Vista.  The machines were running Crysis and Halo 2, and actually ran reasonably well.  Halo 2 was choppy at times and there were some visual bugs in Crysis, but the driver was working and is apparently stable. 

We spoke to NVIDIA to understand why there isn't an 8800 Vista driver currently, and why we won't see one until Vista's launch.  NVIDIA's GPU drivers these days are made up of approximately 20 million lines of code, which, as a reference point, is about the size of Windows NT 4.0. 

Because G70 and G80 are radically different architectures, they each require a separate driver.  Combine that with the fact that Windows Vista has completely changed the driver interface, a shift similar in magnitude to what happened between Windows 3.1 and Windows 95, and you've got a "perfect storm" of conditions for driver development.  The end result is that for Windows Vista, two 20M-line drivers have to be completely rewritten (one for G80 and one for all previous architectures).  In other words, this isn't a simple port; it's a radical departure from the way things were written before. 
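To make that split a little more concrete, here's a minimal sketch of how a single driver front-end might dispatch to per-architecture backends.  To be clear, this is our own illustration with invented names, not NVIDIA's code; the point is simply that when two architectures share nothing at the hardware level, everything behind the thin dispatch table has to be written twice.

```c
/* A minimal sketch of why two radically different GPU architectures
 * effectively become two drivers. Everything here is invented for
 * illustration -- none of it is NVIDIA's code. Only this dispatch table
 * is shared; the backends behind it have almost nothing in common. */
#include <stdio.h>
#include <stddef.h>

typedef struct {
    const char *name;
    void (*init_hw)(void);                        /* program architecture-specific registers */
    void (*submit)(const void *cmds, size_t len); /* hand a command buffer to the GPU */
} gpu_backend;

/* G7x-style part: discrete vertex/pixel pipelines. */
static void g7x_init(void)                      { puts("G7x: init fixed pipeline layout"); }
static void g7x_submit(const void *c, size_t n) { (void)c; (void)n; /* G7x command format */ }

/* G8x-style part: unified shader array -- a different command stream entirely. */
static void g8x_init(void)                      { puts("G8x: init unified shader array"); }
static void g8x_submit(const void *c, size_t n) { (void)c; (void)n; /* G8x command format */ }

static const gpu_backend backends[] = {
    { "G7x", g7x_init, g7x_submit },
    { "G8x", g8x_init, g8x_submit },
};

int main(void) {
    /* A real driver would select the backend from the PCI device ID. */
    const gpu_backend *gpu = &backends[1];
    gpu->init_hw();
    gpu->submit(NULL, 0);
    return 0;
}
```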

There are other elements of Vista driver development that apparently require more work than before.  DirectX 9, DX9 SLI, DX10 and DX10 SLI support is provided through four separate binaries, which increases the complexity of testing and of the overall driver itself, whereas in the past there was only a single driver.   
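As a rough illustration of what four binaries from one codebase looks like in practice: the same source gets compiled four ways, and each added build dimension doubles the number of artifacts that have to be tested.  The macros, file names and build lines below are all invented for the example.

```c
/* One source tree, four shipping binaries, e.g.:
 *   cc                          -o nvd3d9.dll   driver.c
 *   cc -DBUILD_SLI              -o nvd3d9s.dll  driver.c
 *   cc -DBUILD_DX10             -o nvd3d10.dll  driver.c
 *   cc -DBUILD_DX10 -DBUILD_SLI -o nvd3d10s.dll driver.c
 * The DLL names and macros are hypothetical; every bug fix must then be
 * validated against four artifacts instead of one. */
#include <stdio.h>

int main(void) {
#if defined(BUILD_DX10)
    const char *api = "Direct3D 10";
#else
    const char *api = "Direct3D 9";
#endif
#if defined(BUILD_SLI)
    const char *gpus = "SLI";
#else
    const char *gpus = "single GPU";
#endif
    printf("driver binary: %s, %s\n", api, gpus);
    return 0;
}
```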

The interfaces for HD-DVD and Blu-ray video acceleration require a lot more code than before, thanks to the support for a protected path for HD video under Vista.  Supporting this protected path for HD content decode means that you can't reuse the video portion of your driver when developing a Vista version. 
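Here's a conceptual sketch, with entirely hypothetical function names (we obviously haven't seen NVIDIA's source), of what a protected decode path imposes on the driver: verify the environment before decoding and keep output in protected surfaces, requirements the old XP video code never had to meet.

```c
/* A conceptual sketch of why a protected HD video path forces new decode
 * code: before touching premium content, the driver must confirm it is
 * running in a tamper-checked environment and keep decoded frames in
 * protected memory. Every function here is a hypothetical stand-in; the
 * real Vista protected-path interfaces are far larger than this. */
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

/* Stand-ins for OS-provided attestation and protected allocations. */
static bool os_path_is_protected(void)         { return true; }
static void *alloc_protected_surface(size_t n) { return malloc(n); }

static int decode_hd_frame(const unsigned char *bitstream, size_t len) {
    if (!os_path_is_protected())
        return -1;  /* refuse to decode outside the protected path */

    void *surface = alloc_protected_surface(1920 * 1080 * 4);
    if (surface == NULL)
        return -1;

    /* ... architecture-specific decode into the protected surface ... */
    (void)bitstream; (void)len;

    free(surface);
    return 0;
}

int main(void) {
    unsigned char frame[16] = {0};
    printf("decode returned %d\n", decode_hd_frame(frame, sizeof frame));
    return 0;
}
```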

The last major difference between Windows XP and Vista driver development is that the display engine connecting monitors to the GPUs has been completely redone.

The initial investment in driver development under Vista takes up quite a bit of time, and now we understand a little more about why.  While it would be nice to have a driver today, there's always a tradeoff to be made, especially when driver work this intense is involved.  Couple that with the recent launch of NVIDIA's G80 GPU, and the decision was made to focus on DX9 and XP drivers in order to make the G80's launch as solid as possible, and to commit to delivering an 8800 driver by Vista's launch. 

When the driver is eventually available, NVIDIA expects performance to be on par with, slightly slower than, or slightly faster than the XP driver.  What we've seen thus far from other Vista drivers is that performance is slower almost entirely across the board.  As stability is currently the primary goal for both ATI and NVIDIA, many compiler optimizations and performance tweaks aren't being used, in order to get a good driver out in time for Vista's launch. 

Comments

  • Johnmcl7 - Friday, January 12, 2007 - link

    Can't say I agree with that; while LCDs are thin and light, their image quality leaves a lot to be desired, especially compared to the superior image quality of the CRTs they've effectively replaced.

    John
  • PrinceGaz - Sunday, January 14, 2007 - link

    That very much depends on the type of LCD panel used. Maybe it's because my Mitsubishi 2070SB CRT display is about four years old and isn't as good as it used to be, but the overall image quality (including colour reproduction) of my new HP LP2065, which uses an S-IPS LCD panel, is just as good. The response time is also sufficiently fast that there is no visible blurring of fast-moving images. And the 2070SB wasn't some cheapo CRT either, it was one of the best 20" viewable CRT monitors you could get.

    The fact that the LP2065 was just a little over half the price of the old 2070SB actually makes modern LCD displays seem superior to CRTs, especially when the lower power consumption is factored in. It is also slightly (ahem!) less bulky and heavy than the old CRT monitor. It makes me wish I'd switched to an LCD sooner, except of course that even a year or two ago the picture quality of the best LCD panels wasn't anywhere near what it is today.

    Give a *good* (in other words one that does not use a TN panel) LCD display a chance and you'll probably be surprised.
  • msva124 - Monday, January 15, 2007 - link

    Does it scale well to different resolutions? I.E. for gaming.
  • tumbleweed - Thursday, January 11, 2007 - link

    "the display is superb, making it very similar to reading pages in a regular book"

    Hardly. It's dark grey on light grey, thus having less than stellar contrast. No, this really isn't similar to reading a regular book; it's similar to reading an ATM receipt. Once they get it to the point of true black on something resembling white, then we can talk. Other than that, I'll admit it's nifty, but the display quality ain't there yet.
  • msva124 - Thursday, January 11, 2007 - link

    OMG! It's almost as good as one of those CRT things that Nostradamus said would be here in the year 3000!
  • GhandiInstinct - Thursday, January 11, 2007 - link

    I AM SOLD ON OLED!!!!! Come get me!
  • BladeVenom - Friday, January 12, 2007 - link

    Last time I checked, OLED displays had a very short lifespan. That may be OK if you don't use it much, or like replacing your monitor every year, but I think many will have a problem with that.
  • psychobriggsy - Friday, January 12, 2007 - link

    They've even got blues up to >30k hours now. That's a lot of TV watching, although some people sure do like to watch TV all day.

    Anyway, I'm sure I read that these Sony displays used a single colour OLED throughout, with colour filters on top. White OLEDs can have very long lives. If they're using 100k hour OLEDs, and you have the TV on for 10 hours a day because you cannot bear the idea of not having it on, then that is nearly 30 years before the display is ~half as bright as it was originally. I think that predicting television display technology in 2037 will be quite difficult.

    I'm just hoping that one day OLEDs will actually be available in large displays! Can't wait yet another 5 years...
