Introduction

Incredibly high-priced, high-powered graphics cards are exciting. For geeks like us, learning about the newest and best hardware out there is like watching a street race between a Lambo and a Ferrari: we get to see what the highest achievable performance really offers when put to the test. Standing on the bleeding edge of technology and looking out toward the next great advancement inspires us. The volume and depth of knowledge required to build a GPU humbles us, while the hardware itself is a physical demonstration of what human engineering can achieve.

Unfortunately, this hardware, while attainable for a price, is often out of reach for most of its admirers. Sometimes it isn't owning the object of our admiration that gives us joy, but knowing it exists and relishing the fact that, some time in the not too distant future, that kind of performance will be available at a reasonable price as a mainstream staple.

While the performance of the 8800 GTX is still a ways off from making it to the mainstream, we finally have the feature set (and then some) of the GeForce 8 series in a very affordable package. While we only have the fastest of the new parts from NVIDIA today, the announcement includes these new additions to the lineup:

GeForce 8600 GTS
GeForce 8600 GT
GeForce 8500 GT
GeForce 8400 (OEM only)
GeForce 8300 (OEM only)

The OEM only parts will not be available as add-in cards but will only be included in pre-built boxes by various system builders. While the 8600 GTS should be available immediately, we are seeing a little lag time between now and when the 8600 GT and 8500 GT will be available (though we are assured both will be on shelves before May 1). While NVIDIA has been very good about sticking to a hard launch strategy for quite a while now, we were recently informed that this policy would be changing.

Rather than coordinating hardware availability with their announcements, NVIDIA will move to announcing a product with availability to follow. Our understanding is that availability will come within a couple of weeks of each announcement. There are reasons NVIDIA would prefer to do things this way, and they shared a few with us. It's difficult for all the various card manufacturers to meet the same schedule for availability, and giving them more time to get their hardware ready for shipment levels the playing field. It's also hard to keep information from leaking when parts are already moving into the channel for distribution. With some sites able to get their hands on this information without talking to NVIDIA (and thus without agreeing to abide by embargo dates on publication), taking measures to slow or stop leaks helps NVIDIA control the flow of information to the public and appease publishers who don't like getting scooped.

The bottom line, as we understand it, is that hard launches are difficult. It is our stance that anything worth doing is worth the trouble it takes to do right. NVIDIA stated that, as long as the information is accurate, there is no issue with delayed launches (or early announcements, depending on how you look at it). On the surface this is true, but the necessity of a hard launch has reared its ugly head time and time again; we wouldn't need hard launches if we had infinite trust in hardware makers. The most blatant example in recent memory is the X700 XT from ATI. This product was announced, tested, and reviewed (quite positively), but never saw the light of day. This type of misinformation can lead people to put off upgrading while waiting for hardware that never arrives or, even worse, trick people into buying hardware that does not match the performance of the products we reviewed.

Many people are confused by the fact that we still love hard launches even if only a handful of parts are available from a couple of retailers. Sure, wide availability at launch is a nice pipe dream, but the real value of a hard launch is in the simple fact that we know the hardware is available, we know the hardware has the specs the company says it will, and we know the street price of the product. Trust is terrific, but this is business. NVIDIA, AMD, Intel, and everyone else are fighting an information war. On top of that, the pace of our industry is incredible and can cause plans to change at the drop of a hat. Even if a company is completely trustworthy, no one can predict the future, and sometimes the plug needs to be pulled at the very last second.

In spite of all this, NVIDIA will do what they will do and we will continue to publish information on hardware as soon as we have it and are able. Just expect us to be very unforgiving when hardware specs don't match up exactly with what we are given to review.

For now, we have new hardware at hand, some available now and some not. While the basic architecture is the same as the 8800, there have been some tweaks and modifications. Before we get to performance testing, let's take a look at what we're working with.

Comments

  • JarredWalton - Tuesday, April 17, 2007 - link

    It's not surprising that G84 has some enhancements relative to G80. I mean, G80 was done six months ago. I'd expect VP2 is one of the areas they worked on improving a lot after comments post-8800 launch. Now, should they kill the current G80 and make a new G80 v1.1 with VP2? That's up for debate, but you can't whine that older hardware doesn't have newer features. "Why doesn't my Core 2 Duo support SSE4?" It's almost the same thing. I wouldn't be at all surprised to see a new high-end card from NVIDIA in the future with VP2, but when that will be... dunno.
  • harshw - Tuesday, April 17, 2007 - link

    quote:

    It is important to emphasize the fact that HDCP is supported over dual-link DVI, allowing 8600 and 8500 hardware to play HDCP protected content at its full resolution on any monitor capable of displaying 1920x1080


    So ... to confirm, the card *does* let you watch HDCP content on a Dell 3007WFP at 2560x1600 ? Of course, the card would probably scale the stream to the panel resolution ...
  • DerekWilson - Tuesday, April 17, 2007 - link

    The card will let you watch HDCP protected content at the content's native resolution -- 1920x1080 progressive at max ...

    Currently if you want to watch HDCP protected content on a Dell 30", you need to drop your screen resolution to 1280x800 and watch at that res -- the video is downscaled from 1920x1080. Higher resolutions on the panel require dual-link DVI, and now HDCP protected content over a dual-link connection is here.
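
A rough, back-of-the-envelope illustration of the bandwidth point above: single-link DVI tops out at a 165 MHz pixel clock, which is enough for 1920x1080 at 60Hz but not for the 30" panel's native 2560x1600. The sketch below (Python; the 1.25 blanking-overhead factor and the helper names are illustrative assumptions, not NVIDIA figures) shows why 2560x1600 needs a dual-link connection while the 1280x800 fallback and 1920x1080 video fit on a single link.

# Back-of-the-envelope sketch (assumed figures, not NVIDIA data):
# estimate the pixel clock a display mode needs and whether it fits
# on a single DVI link (165 MHz limit) or requires dual-link DVI.

SINGLE_LINK_MAX_MHZ = 165.0   # pixel clock ceiling for one DVI link
BLANKING_OVERHEAD = 1.25      # rough allowance for blanking intervals (assumed)

def required_pixel_clock_mhz(width, height, refresh_hz=60.0):
    """Approximate pixel clock for a mode, including blanking overhead."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

def links_needed(width, height, refresh_hz=60.0):
    """1 = single-link DVI is enough, 2 = dual-link DVI required."""
    return 1 if required_pixel_clock_mhz(width, height, refresh_hz) <= SINGLE_LINK_MAX_MHZ else 2

for w, h in [(1280, 800), (1920, 1080), (2560, 1600)]:
    clock = required_pixel_clock_mhz(w, h)
    print(f"{w}x{h}: ~{clock:.0f} MHz -> {'dual' if links_needed(w, h) == 2 else 'single'}-link DVI")

With these assumptions, 1280x800 and 1920x1080 land comfortably under the single-link ceiling while 2560x1600 does not, which is why HDCP support over dual-link DVI matters for 30" panels.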
  • AnnonymousCoward - Tuesday, April 17, 2007 - link

    Maybe I'm in the minority, but I don't care about this HDCP business. The players are still ultra expensive, and the resolution benefit doesn't really change how much I enjoy a movie. Also, a 30" screen is pretty small to be able to notice a difference between HD and DVD, if you're sitting at any typical movie-watching distance from the screen. Well, I would guess so at least.
  • Spoelie - Wednesday, April 18, 2007 - link

    We're talking about 30" LCD monitors with humongous resolutions, not old 30" LCD TVs at 1366x768 or something.

    Or do you really not see any difference between
    http://www.imagehosting.com/out.php/i433150_BasicR... and http://www.imagehosting.com/out.php/i433192_HDDVD....
    or
    http://www.imagehosting.com/out.php/i433157_BasicR... and http://www.imagehosting.com/out.php/i433198_HDDVD....
  • Myrandex - Tuesday, April 17, 2007 - link

    I loved how the two 8600 cards listed 256MB memory only, while the 8500 card showed 256MB / 512MB. Gotta love marketing attempting to grab the masses' attention by throwing more RAM into a situation where it doesn't really help...
    Jason
  • KhoiFather - Tuesday, April 17, 2007 - link

    Horrible, horrible performance. I'm so disappointed it's not even funny! I'm so waiting for ATI to release their mid-range cards and blow Nvidia out of the water and into space.
  • jay401 - Tuesday, April 17, 2007 - link

    quote:

    We haven't done any Windows Vista testing this time around, as we still care about maximum performance and testing in the environment most people will be using their hardware. This is not to say that we are ignoring Vista: we will be looking into DX10 benchmarks in the very near future. Right now, there is just no reason to move our testing to a new platform.


    Very true, and not only because the vast majority of gamers are still running XP, but also because no games out to this point gain anything from DX10/Vista (aside from one or two that add a few graphical tweaks here and there in DX10).

    When there are enough popular, well-reviewed DX10/Vista-focused games available that demonstrate appreciable performance improvements in that environment, such that you can build a test suite around those games, then it will be time to transition to that sort of test setup for GPUs.
  • Griswold - Tuesday, April 17, 2007 - link

    The real reason would be that nobody wants to go through the nightmare of dealing with nvidia drivers under vista. ;)
  • jay401 - Tuesday, April 17, 2007 - link

    Derek you should add the specs of the 8800GTS 320MB to the spec chart on page 2, unless of course NVidia forbids you to do that because it would make it too obvious that they've cut too many stream processors and too much bus width from these new cards.

    Now what they'll do is end the production of the 7950GTs to ensure folks can't continue to pick them up cheaper and will be forced to move to the 8600GTS that doesn't yet offer superior performance.

    gg neutering these cards so much that they lose to your own previous generation hardware, NVidia.
