Original Link: http://www.anandtech.com/show/2222
New Ultra High End Price Point With GeForce 8800 Ultra
by Derek Wilson on May 2, 2007 9:00 AM EST
NVIDIA owns the high end graphics market. For the past six months, there has been no challenge to the performance leadership of the GeForce 8800 GTX. Since the emergence of Windows Vista, NVIDIA hardware has been the only platform to support DX10. And now, before AMD has come to market with any competing solution whatsoever, NVIDIA is releasing a refresh of its top of the line part.
The GeForce 8800 Ultra debuting today doesn't have any new features over the original 8800 GTX. The GPU is still manufactured using a 90nm process, and the transistor count hasn't changed. This is different silicon (A3 revision), but the GPU has only really been tweaked rather than redesigned.
Not only will NVIDIA's new part offer higher performance than the current leader, but it will introduce a new price point in the consumer graphics market moving well beyond the current $600 - $650 set by the 8800 GTX, skipping over the $700 mark to a new high of $830. That's right, this new high end graphics card will be priced $230 higher than the current performance leader. With such a big leap in price, we had hoped to see a proportional leap in performance. Unfortunately, for the 38% increase in price, we only get a ~10% increase in core and shader clock speeds, and a 20% increase in memory clock.
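As a quick sanity check on those percentages, here is the back-of-the-envelope math, using the 8800 GTX's stock 575/1350/900 clocks against the Ultra's 612/1500/1080 and the $600 GTX baseline:

```python
# Back-of-the-envelope: price premium vs. clock speed gains for the
# 8800 Ultra over a stock 8800 GTX (clocks in MHz, prices in USD).
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

print(f"Price:  +{pct_increase(600, 830):.0f}%")    # +38%
print(f"Core:   +{pct_increase(575, 612):.1f}%")    # +6.4%
print(f"Shader: +{pct_increase(1350, 1500):.1f}%")  # +11.1%
print(f"Memory: +{pct_increase(900, 1080):.0f}%")   # +20%
```

The core and shader gains average out to the ~10% figure above, while the price climbs nearly four times as fast.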
Here's a chart breaking down NVIDIA's current DX10 lineup:
| NVIDIA G8x Hardware | SPs | ROPs | Core Clock | Shader Clock | Memory Data Rate | Memory Bus Width | Memory Size | Price |
|---|---|---|---|---|---|---|---|---|
| 8800 Ultra | 128 | 24 | 612MHz | 1.5GHz | 2.16GHz | 384-bit | 768MB | $830 |
| 8800 GTX | 128 | 24 | 575MHz | 1.35GHz | 1.8GHz | 384-bit | 768MB | $600-$650 |
| 8800 GTS 640MB | 96 | 20 | 513MHz | 1.19GHz | 1.6GHz | 320-bit | 640MB | $400 |
| 8800 GTS 320MB | 96 | 20 | 513MHz | 1.19GHz | 1.6GHz | 320-bit | 320MB | $300-$350 |
We do know NVIDIA has wanted to push up towards the $1000 graphics card segment for a while. Offering the top of the line for what almost amounts to a performance tax would give NVIDIA the ability to sell a card and treat it like a Ferrari. It would turn high end graphics into a status symbol rather than a commodity. That and having a huge margin part in the mix can easily generate additional profits.
Price gaps larger than performance increases are not unprecedented. In the CPU world, we see prices rise much faster than performance, especially at the high end. It makes sense that NVIDIA would want to capitalize on this sort of model and charge an additional premium for their highest performing part. This way, they also get to introduce a new high end part without pushing down the price of the rest of their lineup.
Unfortunately, the stats on the hardware look fairly similar to an overclocked 8800 GTX priced at $650: the EVGA e-GeForce 8800 GTX KO ACS3. With core/shader/memory clock speeds at 626/1450/1000, this EVGA overclocked part poses some stiff competition both in terms of performance and especially price. NVIDIA's G80 silicon revision might need to be sprinkled with magic fairy dust to offer any sort of competition to the EVGA card.
We should also note that this part won't be available until around the 15th of May, making this the first launch in some time to abandon the standard of having product available at announcement. While we hate to see the hard launch die from a consumer standpoint, we know those in the graphics industry are thrilled to see some time reappear between announcement and availability. Hard launches may be difficult, but reintroducing that gap gives hardware designers enough rope to hang themselves. We would love to believe AMD and NVIDIA would be more responsible now, but there is no real reason to think history won't repeat itself.
But now, let's take a look at what we are working with today.
The GeForce 8800 Ultra
Physically, the layout of the board is no different, but NVIDIA has put quite a bit of work into their latest effort. The first and most noticeable change is the HSF.
We have been very happy with NVIDIA's stock cooling solutions for the past few years. This HSF solution is no different, as it offers quiet and efficient cooling. Of course, this could be due to the fact that the only real changes are the position of the fan and the shape of the shroud.
Beyond cooling, NVIDIA has altered the G80 silicon. Though they could not go into the specifics, NVIDIA indicated that layout has been changed to allow for higher clocks. They have also enhanced the 90nm process they are using to fab the chips. Adjustments targeted at improving clock speed and reducing power (which can sometimes work against each other) were made. We certainly wish NVIDIA could have gone into more detail on this topic, but we are left to wonder exactly what is different with the new revision of G80.
As far as functionality is concerned, no features have changed between the 8800 GTX and the 8800 Ultra. What we have, for all intents and purposes, is an overclocked 8800 GTX. Here's a look at the card:
While we don't normally look at overclocking with reference hardware, NVIDIA suggested that there is much more headroom available in the 8800 Ultra than on the GTX. We decided to put the card to the test, but we will have to wait until we get our hands on retail boards to see what end users can realistically expect.
Using nTune, we were able to run completely stable at 684MHz. This is faster than any of our 8800 GTX hardware has been able to reach. Shader clock increases with core clock when set under nTune. The hardware is capable of independent clocks, but currently NVIDIA doesn't allow users to set the clocks independently without the use of a BIOS tweaking utility.
We used RivaTuner to check out where our shader clock landed when setting core clock speed in nTune. With a core clock of 684MHz, we saw 1674MHz on the shader. Pushing nTune up to 690 still gave us a core clock of 684MHz but with a shader clock of 1728MHz. The next core clock speed available is 702MHz which also pairs with 1728MHz on the shader. We could run some tests at these higher speeds, but our reference board wasn't able to handle the heat and locked up without completing our stress test.
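The readings above suggest the requested clock snaps down to a discrete step rather than being applied verbatim. A minimal sketch of that behavior, assuming an 18MHz core granularity inferred from the 684MHz and 702MHz steps we observed (this is our own model, not documented NVIDIA behavior):

```python
# Hypothetical model of the clock quantization we observed with nTune:
# the requested clock appears to snap down to a discrete step. The 18MHz
# core granularity is an assumption inferred from the 684MHz and 702MHz
# steps we saw, not a documented value.
def snap_down(requested_mhz, step_mhz):
    """Snap a requested clock down to the nearest step at or below it."""
    return (requested_mhz // step_mhz) * step_mhz

# Requesting 690MHz on the core still lands on the 684MHz step:
print(snap_down(690, 18))  # 684
print(snap_down(702, 18))  # 702
```

This would explain why a 690MHz request left the core at 684MHz while the next usable setting jumped straight to 702MHz.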
It is possible we could see some hardware vendors release 8800 Ultra parts with over 100MHz higher core clocks than stock 8800 GTX parts, which could start to get interesting at the $700+ price range. It does seem that the revised G80 silicon may be able to hit 700+ MHz core clocks with 1.73GHz shader clocks with advanced (read: even more expensive) cooling solutions. That is, if our reference board is actually a good indication of retail parts. As we mentioned, we will have to wait and see.
As for our performance tests, we will be looking at a handful of games running the extreme resolutions and quality settings the 8800 Ultra is designed to enable. We will be including the stock 8800 GTX as well as the EVGA e-GeForce 8800 GTX KO ACS3. This should give us a good sense of what the new 8800 Ultra really has to offer.
We are using the same testing rig we've employed for quite some time now.
| System Test Configuration | |
|---|---|
| CPU: | Intel Core 2 Extreme X6800 (2.93GHz/4MB) |
| Motherboard: | EVGA nForce 680i SLI |
| Chipset: | NVIDIA nForce 680i SLI |
| Chipset Drivers: | NVIDIA nForce 9.35 |
| Hard Disk: | Seagate 7200.7 160GB SATA |
| Memory: | Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2) |
| Video Drivers: | ATI Catalyst 7.4 / NVIDIA ForceWare 158.19 |
| Desktop Resolution: | 1280 x 800 - 32-bit @ 60Hz |
| OS: | Windows XP Professional SP2 |
Games include staples such as BF2, Prey, Oblivion, and Rainbow Six: Vegas. We will also be testing newcomers S.T.A.L.K.E.R. and Supreme Commander. This article sees the addition of AA modes in Oblivion and Rainbow Six, as this high performance hardware needs some room to stretch its legs.
Where possible we use built-in benchmarks. FRAPS is used for Oblivion, Rainbow Six, and S.T.A.L.K.E.R. (which has demo play functionality but no demo record).
Battlefield 2 Performance
Without AA enabled, our BF2 performance numbers are very CPU limited even up to 1920x1200. Even our 4xAA numbers peg the CPU at 1600x1200, though they do show a little more separation between cards at the higher resolutions. Gamers who like the Battlefield series won't be losing anything by avoiding the 8800 Ultra.
BF2 performance in game is limited to 100fps, so anything over this will be capped. With scores compressed toward that ceiling, it's even clearer that nothing faster than an 8800 GTX is going to make a real performance difference in BF2. Our 8800 Ultra is only slightly faster than the EVGA card, and comes in at a little more than 10% faster than the 8800 GTX.
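To illustrate how a frame rate cap compresses scores, consider a toy model (the per-second fps samples below are entirely hypothetical, not our benchmark data) where two cards run the same timedemo and every sample above 100fps is clamped:

```python
# Toy illustration of how BF2's 100fps cap compresses reported averages.
# The fps samples are hypothetical, chosen only to show the effect.
def avg_fps(samples, cap=None):
    """Average fps over the run, optionally clamping each sample to a cap."""
    if cap is not None:
        samples = [min(s, cap) for s in samples]
    return sum(samples) / len(samples)

slower = [90, 120, 150, 110]
faster = [100, 135, 170, 125]

print(avg_fps(slower))           # 117.5 uncapped
print(avg_fps(faster))           # 132.5 uncapped -> ~13% faster
print(avg_fps(slower, cap=100))  # 97.5 capped
print(avg_fps(faster, cap=100))  # 100.0 capped  -> ~2.6% faster
```

A real double-digit gap between cards shrinks to a couple of percent once both spend most of the run pinned at the cap, which is exactly what our BF2 graphs show.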
The Elder Scrolls IV: Oblivion Performance
Oblivion is quite a bit more variable than our other tests, and we allow our numbers closer to a 5% margin of error here. This is important to remember as our overclocked EVGA 8800 GTX and the new 8800 Ultra trade places a couple of times in our testing. To us, this says that the cards have roughly the same performance.
In fact, because our numbers were so close, we decided to take a look at performance with 16xAF and 4xAA enabled this time around as well. Again, we see about a 10% improvement from the stock 8800 GTX to the 8800 Ultra, and our EVGA card falls just on the margin of error border with the 8800 Ultra at the high end.
Prey Performance
Our obligatory OpenGL test is Prey. While not as graphically intensive as some of the other titles we will be testing, this is an important staple in a well rounded benchmark suite. Over our three test resolutions, the 8800 Ultra again showed about a 10% advantage over the 8800 GTX. But this time performance of the 8800 Ultra matched the performance of our overclocked EVGA card almost exactly. The graphs really do speak for themselves here.
Rainbow Six: Vegas Performance
Rainbow Six: Vegas is based on Epic's Unreal Engine 3, which is sure to be popular among game developers. Our experience shows this to be a very graphically demanding game, and while our benchmark is very repeatable, in game performance is generally slightly lower than the numbers we report here.
The performance characteristics under Rainbow Six: Vegas reflect the same patterns we have already seen. The 8800 Ultra doesn't outperform the 8800 GTX by much more than 10%, and the EVGA e-GeForce 8800 GTX KO ACS3 does a very good job of keeping up with NVIDIA's new flagship.
S.T.A.L.K.E.R. Performance
S.T.A.L.K.E.R. is a relatively new benchmark for us, and we are still tweaking it. Currently we just run in a straight line through grass and trees toward some buildings and people, using FRAPS to record framerate. For this test, we've turned everything up as high as it can go (except the in game AA setting) and enabled grass shadows.
MSAA is not supported due to the deferred rendering model the game uses, but playability at extreme resolutions is already pushed to its limits anyway. In this game, edge antialiasing is not really an issue for us, as the level design is quite good at avoiding extremely high contrast edges. Thin lines are a problem, though, so some sort of real AA would be nice. The in game AA setting isn't very good quality and doesn't do anything for thin lines.
Our performance tests show another case where the 8800 Ultra is within 10% of the performance of the 8800 GTX. As with Oblivion, our other FRAPS run-through test, the EVGA card and the 8800 Ultra trade places going from 16x12 to 19x12. This test does seem to be more consistent than Oblivion, but as with anything measured through FRAPS, we give it a little more leeway. Once again our conclusion is that the overclocked EVGA 8800 GTX and the 8800 Ultra perform the same.
Supreme Commander Performance
While Supreme Commander supports antialiasing, we get some strange performance data when enabling AA in the performance test provided by Gas Powered Games. For that reason, we will be sticking with standard performance settings with all other options enabled. We did see a little more variability in this test than in other built in benchmarks. Our data more or less shows the same thing we've seen time and time again. The 8800 Ultra just doesn't consistently give us any more than the $650 EVGA e-GeForce 8800 GTX KO ACS3.
Final Words
Often, when reviewing hardware, it is difficult to draw a hard line and state with total confidence that our conclusions are the only logical ones that can be drawn from the facts. We try very hard to eliminate personal opinion from our reviews and provide readers with enough information to form their own educated opinions. We try to point out the downsides of the best products out there, as well as the niche uses for which otherwise disappointing hardware might shine. So often our job is about balance and temperance.
But not this time: The NVIDIA GeForce 8800 Ultra is an utter waste of money.
Let's review the facts. First, our performance data shows the 8800 Ultra to perform on par with our EVGA e-GeForce 8800 GTX KO ACS3. Certainly the 8800 Ultra nudges the EVGA part out of the lead, but the performance difference is minimal at best. The price difference, however, is huge. We can easily find the EVGA card for its retail price of $650, while NVIDIA expects us to pay $180 more for what amounts to a repositioned cooling fan and updated silicon. Foxconn also offers an overclocked GTX for $550 that has essentially the same clocks as the EVGA KO ACS3 (Foxconn is 630/2000 versus 626/2000 for EVGA), making $830 even more unreasonable.
Add to that the fact that we've tested over a dozen 8800 GTX parts since their launch last year, and every single card we've tested has reached higher core clock speeds than the 8800 Ultra with overclocking. We know that increasing core clock speed using nTune causes shader clock speed to increase as well. Setting an 8800 GTX core clock to 621 would give us a shader clock of ~1450MHz, coming close to the 8800 Ultra level. The extra 50MHz increase in shader clock speed won't have a very large impact on performance as we have seen in our clock scaling tests.
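The linked shader clock behaves roughly as a fixed multiple of the core clock, about 2.35x on a stock GTX where 1350MHz pairs with 575MHz. That ratio is how a 621MHz core setting lands near 1450MHz (this is our inference from observed readings, not documented behavior):

```python
# Estimating the linked shader clock from a core clock setting, assuming
# nTune roughly preserves the stock shader-to-core ratio. This ratio is
# inferred from our readings, not a documented NVIDIA mechanism.
stock_core, stock_shader = 575, 1350   # stock 8800 GTX clocks, MHz
ratio = stock_shader / stock_core      # ~2.35

print(round(621 * ratio))  # ~1458MHz, close to the ~1450MHz we observed
```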
All this leaves memory speed as the 8800 Ultra's only real advantage: none of the memory on 8800 GTX parts we've tested can reach 1080MHz from the base 900MHz. The only problem is that this doesn't give the part enough of a boost to matter in current real world performance tests.
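For reference, that memory clock advantage works out to 20% more raw bandwidth on G80's 384-bit bus (GDDR3 transfers two bits per clock per pin):

```python
# Peak memory bandwidth on G80's 384-bit bus with double data rate memory:
# clock * 2 transfers/clock * bus width in bits / 8 bits-per-byte.
def bandwidth_gb_s(mem_clock_mhz, bus_width_bits=384):
    """Peak memory bandwidth in GB/s for a DDR bus."""
    return mem_clock_mhz * 1e6 * 2 * bus_width_bits / 8 / 1e9

print(f"8800 GTX:   {bandwidth_gb_s(900):.1f} GB/s")   # 86.4
print(f"8800 Ultra: {bandwidth_gb_s(1080):.1f} GB/s")  # 103.7
```

An extra ~17GB/s of peak bandwidth is nothing to sneeze at on paper; it just doesn't translate into meaningful frame rate gains in today's games.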
With GPU revisions including layout changes, process tweaks, and an improved cooling solution, the least we would expect from the creation of a new price point in the consumer graphics market is a new level of performance. Price isn't the issue here: it's all about the value. It would be difficult even for a professional gamer to justify the purchase of an 8800 Ultra over the EVGA overclocked GTX. This incarnation of the G80 is even less justifiable than Intel's Extreme processors or AMD's FX line.
Certainly, placing some value in overclockability is fair. The problem here is that the stock speed at which the card runs offers no real added value over an already available overclocked 8800 GTX. If the overclockability of the G80 A3 silicon is its key point, why not simply offer the chips to add-in card builders at a premium and allow them to make custom overclocked boards at the speeds they choose? Let them call it an 8800 Ultra without defining a (rather low) stock speed for the new cards.
If user overclocking is where it's at, then standard 8800 GTX speeds are fine. Call it an 8800 Ultra because it features A3 silicon, market it towards overclockers, and sell it at a price premium. But don't try to sell us on 612/1500/1080 clock speeds.
With a push towards targeting overclockers we have to wonder: if there is so much headroom in the 8800 Ultra, why not offer us stock clock speeds that make a real performance difference?
We are all for higher performance, and we don't mind higher prices. But it is ridiculous to charge an exorbitant amount of money for something that doesn't offer any benefit over a product already on the market. $830 isn't the issue. In fact, we would love to see a graphics card worth $830. The 8800 Ultra just isn't it.