NVIDIA's GeForce 8600, 8500 and nForce 680SE

While NVIDIA wasn't showing off anything particularly impressive at CES, we kept running into good NVIDIA news throughout the show.  Early on we saw a mobile GeForce 8600 and 8400; later in the show we learned a bit more about these two GPUs. 

NVIDIA has two G80 derivatives designed to target the more mainstream segments: G84 and G86.  G84 will be the basis of NVIDIA's GeForce 8600, while G86 will be used in what is currently known as the GeForce 8500.  Detailed specifications aren't known, other than that the chips are supposed to be built on an 80nm process; the expected launch is around April or May.

Also on the roster is a 320MB GeForce 8800 GTS card, which is expected to be priced at $299.  Clock speeds and shader configuration are currently expected to be identical to those of the $449 640MB version; the card will simply have less memory.  Honestly, we expect 320MB to be more than enough for most games and resolutions, which may make the 320MB card extremely attractive.  Couple it with a Core 2 Duo E4300 and you've got one fast and affordable system. 

Despite a decent amount of information about upcoming NVIDIA GPUs, we didn't hear anything about an 80nm G80.  Much of what happens with the G80's successor will probably depend on ATI's R600 release schedule. 

On the platform side, NVIDIA will be introducing an nForce 680SE chipset, a less overclockable version of the 680i.  The price point will be under $180, but we're still not sure how it will fit into the big picture between the 650i and 680i. 

Those looking for NVIDIA's Vista 8800 GTX driver needn't look any further than Microsoft's booth at CES.  All of the gaming machines at Microsoft's booth were running nForce 680i motherboards with single GeForce 8800 GTXs under Windows Vista.  The machines were running Crysis and Halo 2, and actually ran reasonably well.  Halo 2 was choppy at times and there were some visual bugs in Crysis, but the driver was working and is apparently stable. 

We spoke to NVIDIA to understand why there isn't an 8800 Vista driver currently, and why we won't see one until Vista's launch.  NVIDIA's GPU drivers these days are made up of approximately 20 million lines of code, which as a reference point is about the size of Windows NT 4.0. 

Because G70 and G80 are radically different architectures, they each require a separate driver.  Combine that with the fact that Windows Vista has completely changed the driver interface, similar in magnitude to what happened between Windows 3.1 and 95, and you've got a "perfect storm" of conditions for driver development.  The end result is that for Windows Vista, two 20M line drivers have to be completely re-written (one for G80 and one for all previous architectures).  In other words, this isn't a simple port; it's a radical departure from the way things were written before. 

There are other elements of Vista driver development that apparently require more work than before.  DirectX 9, DX9 SLI, DX10 and DX10 SLI support is provided through four separate binaries, which increases the complexity of testing and of the driver overall, whereas in the past a single driver handled everything.   

Interfaces for HD-DVD and Blu-ray video acceleration require a lot more code than before, thanks to Vista's support for a protected path for HD video.  Supporting this protected path for HD content decode means that the video portion of the driver can't be re-used when developing a Vista version. 

The last major difference between Windows XP and Vista driver development is that the display engine connecting monitors to the GPUs has been completely redone.

The initial investment in driver development under Vista takes quite a bit of time, and now we understand a little more about why.  While it would be nice to have a driver today, tradeoffs have to be made, especially when driver work is this intensive.  Couple that with the recent launch of NVIDIA's G80 GPU, and the decision was made to focus on DX9 and XP drivers in order to make the G80's launch as solid as possible, while committing to deliver an 8800 driver by Vista's launch. 

When the driver eventually becomes available, NVIDIA expects performance to be on par with, slightly slower than, or slightly faster than the XP driver.  What we've seen thus far from other Vista drivers is that performance is slower almost entirely across the board.  As stability is currently the primary goal for both ATI and NVIDIA, many compiler optimizations and performance tweaks aren't being used, in order to get a solid driver out in time for Vista's launch. 


18 Comments

  • artifex - Monday, January 15, 2007 - link

    I'm getting offers in the ads from companies who claim to offer "free" stuff provided you join a lot of trial offers and buy a bunch of stuff and sucker your friends into joining, also.

    Does Anandtech approve of these ads? Don't say you have no control over them, because you do. You can complain to your provider, IndustryBrains, or switch if they continue to show these things.

    The suckier the ads are, the less credibility you have among people who see them, and the more likely everyone will use adblockers, which will kill your revenue.
    Reply
  • artifex - Monday, January 15, 2007 - link

    You guys must be too young to remember G-Force, the anime. :)
    When Nvidia announced their first GeForce product, I thought they might get sued themselves. But of course, g-force is a term that predates both.
    Reply
  • Houdani - Friday, January 12, 2007 - link

    <--- that's me being grumpy about Toshiba & Canon not displaying the SED TVs at CES'07 due to legal wranglings with Nano-Proprietary. This, of course, is only pushing out their availability that much farther, further closing the window on this tech. Hrmph!
    Reply
  • semo - Friday, January 12, 2007 - link

    quote:

    Performance of a game with PhysX enabled must not be lower than with it disabled - you should no longer have the problem of better physics but lower performance. This is a big step forward for Ageia, as it is difficult to justify spending money on getting better physics if you end up reducing overall game performance as a trade off.
    why is that such an issue? what is performance? just some numbers you couldn't care less about while playing, assuming the fps stays above a certain number. you expect performance to drop when enabling other eye candy, but when it comes to realism everyone seems to complain.

    this makes me think: are ppl buying better video cards for the increased "performance" or for the more immersive experience?
    Reply
  • Houdani - Friday, January 12, 2007 - link

    Physics doesn't necessarily have to mean that more polygons are pushed to the screen (such as when things go boom). When it does, it taxes the video card more and has a corresponding impact on performance. I think this relationship is understood and accepted.

    However, if the physics don't add more polygons but instead cause objects to interact more realistically then we're at the spot where we don't want overall performance to slow down. This is where Ageia needs to flex their strength and not disappoint their audience.

    In *software* we already have the ability to have great physics, but at a loss to performance. For Ageia to excel, they necessarily have to remove that hindrance and give us the physics without the performance hit -- otherwise they've provided us with little or no benefit, really.
    Reply
  • semo - Friday, January 12, 2007 - link

    that makes sense. how much of a performance hit are we talking about here anyway? and how much of the physics calculation is outsourced to the ppu (and are there any big overheads as a result)?
    Reply
  • LoneWolf15 - Friday, January 12, 2007 - link

    quote:

    The unit itself is extremely light and honestly is one of the first devices of this type that we could actually see being a reasonable replacement to carrying around tons of books. While the demonstration centered around reading novels, what we’d really like to see is this technology used to store textbooks for schools. Rather than having to carry around multiple books each composed of hundreds of pages, a single e-Ink based Reader like this would be a much better experience.


    It would be, if you can make sure this product is extremely difficult to damage.

    I've seen way too many students who don't care how they treat something a school gives them -- after all, in their minds it's not like they bought and paid for it with their own money (the fact that their parents' taxes did is irrelevant to them in those cases).

    I agree that the concept is brilliant on paper, and it should be perfect for higher education. In the K-12 environment though, unless there's an accountability system that works without upsetting parents, or a way of making the readers durable enough that this isn't an issue, this could be an idea that falls one tiny step short of a great finish.
    Reply
  • bokep - Friday, January 12, 2007 - link

    I've been following OLEDs since I first learned about them over half a decade ago. Nice to see the technology working that well; it should be coming out within the next few years.
    Reply
  • CSMR - Thursday, January 11, 2007 - link

    Great reviews, thanks for keeping the world updated!
    Reply
  • archcommus - Thursday, January 11, 2007 - link

    ...let's be serious here, LCD is surely getting the job done just fine.
    Reply
