Half-Life 2 Performance

Half-Life 2, another graphically intensive game, should see significant improvement as well. Unlike Doom 3, Half-Life 2 achieves playable framerates at the higher resolutions with AA enabled on the 6800 Ultra, but you will still see a sizeable improvement from an upgrade to the 7800 GT.

[Half-Life 2 benchmark charts: 1600x1200 and 2048x1536, with and without 4xAA]
Without AA, we see a modest 15% gain at 16x12 and a 37% gain at 20x15 with the 7800 GT over the 6800 Ultra. With AA enabled, 16x12 gets a 42% gain and 20x15 gets 54.5%, showing that Half-Life 2 scales slightly better with AA than Doom 3 does.

Again, we see much larger gains with the 6800 Ultra in SLI configuration. The highest increase is 88.7% at 20x15 with AA enabled, about a 30 fps improvement. 16x12 with AA gets a 68% increase; without AA, there's about a 22% increase at 16x12 and a 59% increase at 20x15. This is more evidence of the 6800 Ultra SLI's performance advantage over the 7800 GT, though whether it justifies the price and the extra power demands is debatable. Also note that, other than at 20x15 4xAA, the 7800 GT SLI and 6800 Ultra SLI deliver nearly identical performance. The extra memory bandwidth of the 6800 Ultra seems to help it match the 7800 GT's additional processing power in quite a few tests.
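
As a quick sanity check on how these percentages translate into raw framerates, here is a minimal Python sketch. The fps values are not the benchmark data themselves; they are back-calculated from the quoted "88.7% gain, about a 30 fps increase" figure.

```python
def percent_gain(baseline_fps: float, improved_fps: float) -> float:
    """Relative improvement over the baseline, in percent."""
    return (improved_fps - baseline_fps) / baseline_fps * 100

# Back out the implied framerates at 20x15 4xAA: an 88.7% gain that
# amounts to "about a 30 fps increase" implies a baseline near 34 fps.
gain, delta = 0.887, 30.0
single = delta / gain      # ~33.8 fps (single 6800 Ultra, implied)
sli = single + delta       # ~63.8 fps (6800 Ultra SLI, implied)

print(f"implied single-card fps: {single:.1f}")
print(f"implied SLI fps:         {sli:.1f}")
print(f"check: {percent_gain(single, sli):.1f}% gain")   # 88.7%
```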

Comments

  • Hacp - Thursday, August 11, 2005 - link

    THANK YOU FOR MAKING ALL THE POSTS VISIBLE! IT WAS ANNOYING CLICKING ON THE DOWN BUTTON!

    sry for the caps :).
  • crimson117 - Thursday, August 11, 2005 - link

    Page 4, first two sentences:

    "One of the most demanding games we test in terms of graphics, Doom 3 shows some impressive gains here. First lets compare the 6800 Ultra and the 7800 GT."

    should be

    "One of the most demanding games we test in terms of graphics, Doom 3, shows some impressive gains here. First let's compare the 6800 Ultra and the 7800 GT."

    note: the comma after Doom 3 makes "One" the subject of the sentence. Alternatively, it could be:

    "Doom 3 is one of the most demanding games we test in terms of graphics, and it shows some impressive gains here."
  • pio!pio! - Thursday, August 11, 2005 - link

    Can the sample card you obtained be overclocked to anywhere near GTX speed? (core and memory)

    Also I know it's early, but any thoughts on softmodding it to enable more pipelines?
  • JNo - Thursday, August 11, 2005 - link

    Good question. xbitlabs.com state in their review that, "The GeForce 7800 GT reference graphics card we had in our lab proved quite overclockable. We managed to increase its working frequencies from 400MHz for the chip and 1000MHz for the memory to 470MHz for the chip and 1200MHz for the memory, which should ensure a significant performance increase during our tests." Unfortunately they did not include overclocked results in their graphs.

    Also, xbitlabs noted that the PCB is shorter than the GTX's, that the cooler is shorter, and commented on noise levels, given that, unlike the GTX, the GT is apparently unable to regulate its fan speed. It is a shame that AnandTech missed out on such details.
  • mmp121 - Thursday, August 11, 2005 - link

    I'm thinking this is a typo from when you re-made the charts for BF2 to include the NV 6800 Ultra. The ATI Radeon X800 XT is really the ATI Radeon X850 XT PE, right? Maybe I am wrong. Just wanted to point out the fluke so you guys can fix it. Good read so far!
  • DerekWilson - Friday, August 12, 2005 - link

    The BF2 numbers use the X800, not the X850.
  • Phantronius - Thursday, August 11, 2005 - link

    Waaaah!!! Why no 6800GT comparison and why the hell didn't you guys benchmark Far Cry with HDR Mode???????
  • AtaStrumf - Thursday, August 11, 2005 - link

    I really resent this comment on page 1:

    "Until performance is increased beyond the 7800 GTX, it will be hard for us to see a reason for a new desktop 7 series part."

    So you rich boys got your flashy new toys and now you see no need for a new desktop 7 part, because you're too busy thinking about how you're gonna be playing games on the go (i.e. your new laptop GPU).

    What about us stuck at the 6600 GT level? Don't we deserve an upgrade??? Like a 12-16 pipe / 256-bit 7600 GT? I guess that we are just fine, since we're stuck with 1280x1024 res monitors anyway, right? WRONG! We've been cheated out of a significant upgrade long enough. Until the 6600 GT there was only BS in the mainstream for about 2.5 years (R9500/9600/PRO/XT/GF5600/Ultra,...) and we're not going back there, no sir!

    Same sh!t when people with broadband post large uncompressed images on the web and forget about all those with dial-up, even though they themselves left that sorry bunch not too long ago. The world is a wee bit bigger than your own back yard, and someone writing for a site as big as AT should really know that.
  • DerekWilson - Friday, August 12, 2005 - link

    I just want to add my two cents as well ...

    I will agree with you that the 6600 GT was the first real solid mainstream option in a while. It's a good card.

    I'll argue that the next mainstream card you'll want to look at is the 6800 GT. There are 128MB parts and 256MB parts, all with 256-bit buses and 16 pixel pipes at good clock speeds.

    As others have said, a G70 part released with the same specs as the 6800 GT would have the same performance as well.

    We tried to explain in the article that the 6 Series comprises the rest of the lineup going forward. There are no performance gaps that the G70 needs to fill in. The only reason NVIDIA would want to release slower G70 parts would be to phase out the 6 Series part at that same speed grade.

    It also doesn't make sense for NVIDIA to immediately release a slower G70 part. The lower transistor count and more mature process used on NV4x chips will likely make it easier for NVIDIA to sell the parts at a lower price and higher profit than equivalently performing G70 parts. The economics of this are very complicated and depend quite a bit on NVIDIA and TSMC and the cost per IC for NV4x and G70 chips.

    It would almost make sense for NVIDIA to take less functional G70 chips and sell them as 6 Series parts. But maybe I'm wrong. Perhaps NVIDIA will think it can trick people into thinking a part with the same performance as a 6800 or 6800 GT and a 7 Series name is worth more money.

    It really shouldn't matter to anyone whether 6 Series parts keep falling in price or other 7 Series parts come out. I stand behind my statement. We'll see lower performing G70 parts come out if and when it becomes more financially viable for NVIDIA to sell those parts than NV4x parts. There really isn't any other factor that matters.

    Transparency AA may be an interesting thing, but moving towards the budget end of the spectrum will tend to make the performance impact of Transparency AA too high to matter. Other little tweaks and features have already been covered and aren't that compelling over the 6 Series, and seeing a G70 perform the same as an NV4x for the same price really wouldn't be that exciting no matter what my budget looks like.

    Derek Wilson
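
Derek's cost-per-IC point above lends itself to a back-of-the-envelope illustration. Below is a minimal die-cost sketch; the wafer cost, die areas, and defect densities are hypothetical placeholders, not actual TSMC or NVIDIA figures, and the yield model is the simplest (Poisson) one.

```python
import math

def gross_dies(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic approximation for dies per circular wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Y = exp(-A * D0), with die area converted to cm^2."""
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

def cost_per_good_die(wafer_cost: float, wafer_diameter_mm: float,
                      die_area_mm2: float, defects_per_cm2: float) -> float:
    good = gross_dies(wafer_diameter_mm, die_area_mm2) \
           * poisson_yield(die_area_mm2, defects_per_cm2)
    return wafer_cost / good

# Hypothetical inputs: $3000 per 300mm wafer; a smaller die on a mature,
# lower-defect process vs. a bigger die on a newer process.
nv4x_like = cost_per_good_die(3000, 300, 225, 0.3)
g70_like = cost_per_good_die(3000, 300, 330, 0.5)
print(f"smaller/mature die: ${nv4x_like:.0f} per good die")   # ~$22
print(f"bigger/newer die:   ${g70_like:.0f} per good die")    # ~$88
```

Even with identical wafer pricing, the smaller die on the mature process comes out roughly 4x cheaper per good chip in this toy model, which is exactly the economic pressure described in the comment.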
  • coldpower27 - Thursday, August 11, 2005 - link

    It would be interesting to see if the 90nm-based 7600 part has a 256-bit memory interface, as it seems that 256-bit memory interface cards usually have a minimum die size of around 200mm², just out of range for a high-volume mainstream-level card.

    I would certainly want a 7600 GT if it had a similar pipeline configuration to the 6800 or 6800 GT, because G7x does have some improvements I like, most notably the improvement in HDR performance, plus Transparency AA. Also, what about WGF 1.0 support? For the most part, it's usually better to have a new mainstream card based on newer tech than to just shift older technology into lower price points, as those cards weren't designed for those price points and are usually more expensive to produce.

    Not all of us can even afford the current $399 US price point of the 7800 GT.
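
The ~200mm² figure for 256-bit cards mentioned above can be illustrated with a crude pad-limited-die estimate. The pad pitch and pad count below are loudly hypothetical, and real GPUs of this era used area-array flip-chip bumps rather than pure perimeter pads, so treat this only as a feel for why wide memory buses push die size up.

```python
# Crude perimeter-pad model: how big must a square die be for its I/O?
PAD_PITCH_MM = 0.06        # assumed I/O pad pitch -- illustrative only

def min_die_area_mm2(pad_count: int) -> float:
    """Smallest square die whose perimeter fits pad_count pads."""
    edge_mm = pad_count * PAD_PITCH_MM / 4   # pads spread over four edges
    return edge_mm * edge_mm

# A 256-bit memory bus is 256 data pads alone; add address/control,
# power/ground, PCIe, and display I/O and ~900 total is a plausible guess.
print(f"{min_die_area_mm2(900):.0f} mm^2")   # ~182 mm^2, near the 200 mm^2 mark
```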
