The GeForce 8800 Ultra

Physically, the layout of the board is no different, but NVIDIA has put quite a bit of work into their latest effort. The first and most noticeable change is the HSF.

We have been very happy with NVIDIA's stock cooling solutions for the past few years. This HSF solution is no different, as it offers quiet and efficient cooling. Of course, this could be due to the fact that the only real changes are the position of the fan and the shape of the shroud.

Beyond cooling, NVIDIA has altered the G80 silicon. Though they could not go into specifics, NVIDIA indicated that the layout has been changed to allow for higher clocks. They have also enhanced the 90nm process they are using to fab the chips, making adjustments targeted at improving clock speed and reducing power consumption (goals that can sometimes work against each other). We certainly wish NVIDIA could have gone into more detail on this topic, but we are left to wonder exactly what is different with the new revision of G80.

As far as functionality is concerned, no features have changed between the 8800 GTX and the 8800 Ultra. What we have, for all intents and purposes, is an overclocked 8800 GTX. Here's a look at the card:



While we don't normally look at overclocking with reference hardware, NVIDIA suggested that there is much more headroom available in the 8800 Ultra than on the GTX. We decided to put the card to the test, but we will have to wait until we get our hands on retail boards to see what end users can realistically expect.

Using nTune, we were able to run completely stable at 684MHz. This is faster than any of our 8800 GTX hardware has been able to reach. Shader clock increases with core clock when set under nTune. The hardware is capable of independent clocks, but currently NVIDIA doesn't allow users to set the clocks independently without the use of a BIOS tweaking utility.

We used RivaTuner to check out where our shader clock landed when setting core clock speed in nTune. With a core clock of 684MHz, we saw 1674MHz on the shader. Pushing nTune up to 690 still gave us a core clock of 684MHz but with a shader clock of 1728MHz. The next core clock speed available is 702MHz which also pairs with 1728MHz on the shader. We could run some tests at these higher speeds, but our reference board wasn't able to handle the heat and locked up without completing our stress test.
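To make the stepping behavior we observed easier to follow, here is a small illustrative sketch. The step table below contains only the data points from our testing; the actual BIOS clock tables are not public and will differ, so treat the thresholds and the lookup rule as assumptions, not as NVIDIA's real algorithm.

```python
# Hypothetical model of the core/shader clock pairing we observed with
# nTune and RivaTuner on the 8800 Ultra. The table is illustrative only,
# built from the three data points in our testing.

# (requested core MHz, applied core MHz, applied shader MHz)
OBSERVED_STEPS = [
    (684, 684, 1674),
    (690, 684, 1728),
    (702, 702, 1728),
]

def effective_clocks(requested_mhz):
    """Return the (core, shader) pair the driver actually applies,
    using the highest observed step not exceeding the request."""
    applied = OBSERVED_STEPS[0]
    for threshold, core, shader in OBSERVED_STEPS:
        if requested_mhz >= threshold:
            applied = (threshold, core, shader)
    return applied[1], applied[2]

# Matches what we saw: asking for 690MHz still yields a 684MHz core,
# but bumps the shader domain to its next step.
print(effective_clocks(690))  # (684, 1728)
```

The point the sketch captures is that the two clock domains step independently and in coarse increments, so a small change in the requested core clock can move the shader clock without moving the core clock at all.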

It is possible we could see some hardware vendors release 8800 Ultra parts with over 100MHz higher core clocks than stock 8800 GTX parts, which could start to get interesting at the $700+ price range. It does seem that the revised G80 silicon may be able to hit 700+ MHz core clocks with 1.73GHz shader clocks with advanced (read: even more expensive) cooling solutions. That is, if our reference board is actually a good indication of retail parts. As we mentioned, we will have to wait and see.

66 Comments

  • kalrith - Wednesday, May 02, 2007 - link

    ...because you can't purchase an E6600 that's overclocked to 2.9GHz out of the box, with the warranty intact. The extreme CPUs are actually marketable to people who want the overclocked performance without doing it on their own and voiding the warranty.

    We can already do that with EVGA's overclocked 8800GTX that performs at about 2% less than the Ultra and costs 22% less. It does that right out of the box and keeps its warranty at that performance level.
  • ADDAvenger - Wednesday, May 02, 2007 - link

    quote:

    would give NVIDIA the ability to sell a card and treat it like a Ferrari. It would turn high end graphics into a status symbol rather than a commodity.


    Like they aren't already more of a status symbol than commodity!?
  • DerekWilson - Wednesday, May 02, 2007 - link

    perhaps to some ... and the Ferrari analogy isn't quite right there either -- Ferraris actually have something to offer on the road/track, and they can be a good investment as well ... perhaps I need to rework that sentence.

    the thing is, there are enthusiasts out there who will buy the 8800 GTX for its performance. but with cards more like the ultra, we will see fewer people buy the card for any quality/performance advantage. a higher ratio of status seekers will buy it as opposed to real enthusiasts.

    certainly the hardcore overclockers will be interested. and it'll be interesting to see what A3 G80 silicon can do when strapped to a phase change cooling system. but that market isn't very large.
  • sxr7171 - Thursday, May 03, 2007 - link

    Well, the market for any $830 card isn't large as it stands, but the likelihood of users adding some crazy cooling to it is pretty high among those who would pay $830 for a video card.
  • Den - Wednesday, May 02, 2007 - link

    I would like to see the power usage numbers on this card since part of the A3 revision was supposed to help reduce power consumption.

    I agree this is a big step in price for a small step in performance, but that is just like high end CPU's. The interesting question is, when EVGA and others come out with overclocked Ultra cards, how much faster will those be than their overclocked GTX's? If they can get a 10% lead for $200 more, I bet they will get some takers.
  • DerekWilson - Wednesday, May 02, 2007 - link

    we don't usually test power with reference boards. we'll certainly look at it when we get our hands on a retail product though.

    nvidia is reporting lower power usage with the 8800 Ultra that amounts to just a couple watts less than the 8800 GTX. While this is good for a higher performance part, it's nothing to write home about.
  • Chadder007 - Wednesday, May 02, 2007 - link

    Holy Not worth the price of admission Batman!! That much more for an overclocked GTX?
  • Fluppeteer - Wednesday, May 02, 2007 - link

    I completely understand this review's conclusions, but I can't help but notice...

    If the reviewers have agreed that the only point of this card is its ability to be overclocked, and given that they overclocked it (and proved that it has more headroom than the GTX), why are there no performance results for the overclocked card? Just because retail cards may behave differently? Surely they'd overclock *somewhat*, so the extra sample point (even with a "YMMV" by it) would be useful.

    Fine, overclocking ability varies on a card-by-card basis, but if the sole point of this card (whether nVidia market it as such or not) is to be ramped up from the default clock, it seems strange not to have shown how much performance this might have provided.

    Clearly the Ultra at default clock isn't economical compared with an overclocked GTX (no news there - a lot of overclocked devices are more economical than slower "higher end" parts), but if this card is really capable of running at higher speeds, that still makes it the fastest card available - and it would be nice to know by how much. Maybe nVidia will change their minds about the default clock (and remove a few Ultras from the production line) if the 2900XTX turns out to be faster than expected.

    I'll reserve judgement until the consumer cards appear.
  • sxr7171 - Thursday, May 03, 2007 - link

    Yes, that is the real question. The whole reason all the revisions were done was to enable better O/Cing. Anyway, I can't afford it, but I hope it O/Cs well for those who can.
  • ss284 - Wednesday, May 02, 2007 - link

    This is a really good point. Some OCed results would help, although the card is still overpriced.
