It's been a long time coming, but we finally have Epic's first Unreal Engine 3-based game out on the PC. While the final version of Unreal Tournament 3 is still a little farther out, last week's beta release kept us occupied over the past several days as we benchmarked the engine behind Rainbow Six: Vegas, Gears of War, and BioShock.

Epic's Unreal Engine 3 has been behind some truly beautiful, next-generation titles, and it is significantly more demanding on the CPU and GPU than Valve's Source engine. While far from being the system-crusher that Oblivion was upon its release, UE3 is still more stressful on modern-day hardware than most of what we've seen thus far.

The Demo Beta

Although Unreal Tournament 3 is due out before the end of the year, what Epic released is a beta of the UT3 demo, so it's not as polished as a final demo would be. The demo beta can record demos but can't play them back, so conventional benchmarking is out. Thankfully, Epic left in three scripted flybys that fly a camera through each level along a set path, devoid of all characters.

Real-world UT3 performance will be more strenuous than what these flybys show, but they're the best we can muster for now. The final version of UT3 should have full demo playback functionality, which will let us provide a better performance analysis. The demo beta also ships with only medium quality textures, so the final game can be even more stressful/beautiful if you so desire.

The flybys can run for an arbitrary period of time; we standardized on 90 seconds per flyby in order to get repeatable results while still keeping the tests manageable to run. Three flyby benchmarks come bundled with the demo beta: DM-ShangriLa, DM-HeatRay and vCTF-Suspense.

As their names imply, the ShangriLa and HeatRay flybys cover the Shangri La and Heat Ray deathmatch levels, while vCTF-Suspense is a flyby of the sole vehicle CTF level included with the demo.

Our GPU tests were run at the highest quality settings and with the -compatscale=5 switch enabled, which puts all detail settings at their highest values.

Our CPU tests were run at the default settings without the compatscale switch as we're looking to measure CPU performance and not GPU performance.
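To make the two test configurations concrete, here is a small sketch of a benchmark harness that builds the launch command for each flyby run. Only the three map names, the 90-second duration, and the -compatscale=5 switch come from the text above; the executable name and the remaining command-line syntax (the ?causeevent=FlyThrough and -seconds switches) are assumptions about the UE3 command line, not something the article confirms.

```python
# Hypothetical harness for the flyby runs described above. The map names,
# the 90-second duration, and -compatscale=5 are from the article; the
# executable name and other switches are assumptions for illustration.

FLYBYS = ["DM-ShangriLa", "DM-HeatRay", "vCTF-Suspense"]

def flyby_command(map_name, seconds=90, compatscale=None):
    """Return the (assumed) UT3 command line for one flyby run."""
    cmd = ["UT3.exe", f"{map_name}?causeevent=FlyThrough", f"-seconds={seconds}"]
    if compatscale is not None:
        # -compatscale=5 forces all detail settings to their highest values
        cmd.append(f"-compatscale={compatscale}")
    return cmd

# GPU tests: highest detail via -compatscale=5; CPU tests: default settings
gpu_runs = [flyby_command(m, compatscale=5) for m in FLYBYS]
cpu_runs = [flyby_command(m) for m in FLYBYS]
```

Each list could then be fed to a process launcher that runs the three flybys back to back for a given configuration.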

The Test

Test Setup

CPUs: Intel Core 2 Extreme QX6850 (3.33GHz 4MB 1333FSB)
Intel Core 2 Quad Q6700 (2.66GHz 4MB 1066FSB)
Intel Core 2 Quad Q6600 (2.40GHz 4MB 1066FSB)
Intel Core 2 Duo E6750 (2.66GHz 4MB 1333FSB)
Intel Core 2 Duo E6550 (2.33GHz 4MB 1333FSB)
Intel Core 2 Duo E4500 (2.2GHz 2MB 800FSB)
Intel Core 2 Duo E4400 (2.0GHz 2MB 800FSB)
Intel Pentium Dual-Core E2160 (1.8GHz 1MB 800FSB)
Intel Pentium Dual-Core E2140 (1.6GHz 1MB 800FSB)
AMD Athlon 64 X2 6400+ (3.2GHz 2x1MB)
AMD Athlon 64 X2 6000+ (3.0GHz 2x1MB)
AMD Athlon 64 X2 5600+ (2.8GHz 2x1MB)
AMD Athlon X2 5000+ (2.6GHz 2x512K)
AMD Athlon X2 4200+ (2.2GHz 2x512K)
AMD Athlon X2 4000+ (2.1GHz 2x512K)

Motherboards: Intel: Gigabyte GA-P35C-DS3R
AMD: ASUS M2N32-SLI Deluxe
Video Cards: AMD Radeon HD 2900 XT
AMD Radeon X1950 XTX
NVIDIA GeForce 8800 Ultra
NVIDIA GeForce 8800 GTX
NVIDIA GeForce 8800 GTS 320MB
NVIDIA GeForce 7900 GTX
Video Drivers: AMD: Catalyst 7.10
NVIDIA: 163.75
Hard Drive: Seagate 7200.9 300GB 8MB 7200RPM
RAM: 2x1GB Corsair XMS2 PC2-6400 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit
Comments

  • Ryan Smith - Wednesday, October 17, 2007 - link

    See the post right above yours. ;-)
  • CRimer76 - Wednesday, October 17, 2007 - link

    Isn't this game supposed to use the Ageia crap to improve performance? Would love to see some benchies on that.
  • Ryan Smith - Wednesday, October 17, 2007 - link

    Yes, it can use the PhysX hardware to improve performance. However, flybys are completely useless for testing PhysX, because there is no physics work going on. Since I cover the PPU side of things, once we have the ability to play back demos we'll have an article up.
  • MadBoris - Wednesday, October 17, 2007 - link

    Great work showing all the different gradients on vid card, cache, cpu, threading.
    The game really scales impressively on hardware.
    How did Sweeney manage to make the 2900xt shine? hehe

    I've got some small reservations about using the flybys; hopefully you guys will have a demo run for next time around. Theoretically, a demo loop should be pretty similar, mainly more CPU intensive, but I'm curious to see.

    Good work showing off the games scaling.
  • Sunrise089 - Wednesday, October 17, 2007 - link

    Not only does the X1950XT absolutely destroy the 7900GTX (the ATI card REALLY ended up being the better long-term buy), but the HD 2900XT looks absolutely great here. If this sort of performance is indicative of the future (and it may not be - perhaps there is something that will favor nVidia when the release copy of the game arrives with better graphics) then ATI looks much better than it did even last month.

    PLEASE do a follow up with this game at launch/when the new midrange GPUs launch. It's going to be very interesting to see the price/performance between the new 8800GT, an overclocked HD 2900pro, and the new ATI midrange card (HD 2950pro?)
  • tmx220 - Wednesday, October 17, 2007 - link

    they used an X1950XTX, though it would be about the same
    they should have used the XT because it can still be found, and for a decent price
  • Darth Farter - Wednesday, October 17, 2007 - link

    the last 2 game reviews are nice to have 1024x768 CPU comparisons, but I think we're seeing too many pages of 1024x768, which probably only 2% of the guys planning on running these games will actually use.

    I'd rather suggest showing CPU scaling at the common 1280x1024, 1680x1050 (1600x1200), and 1920x1200 resolutions, to reflect what we're actually going to see on screen when swapping/upgrading a CPU/platform while planning on playing on an obviously high-end GFX card at 1920x1200.

    An overall CPU comparison at only 1024x768 left me severely disappointed, though I can understand the time constraints. This regrettably reminds me of Tom's with their beside-the-point tactics in articles.

    Just my .02 for making Anand's a bit better.

  • IKeelU - Wednesday, October 17, 2007 - link

    I'd really like to see 1280x720. I usually play on my 720p TV and I imagine that resolution would be CPU-bound as well, though probably less so than 1024x768.
  • Roy2001 - Wednesday, October 17, 2007 - link

    I am more interested in seeing Phenom data.
  • hubajube - Wednesday, October 17, 2007 - link

    I'm officially done with Anandtech's conclusions on benchmarks. Why are you making these blanket "AMD is not competitive" statements? A 9% difference in fps performance and now AMD is shit?! Not to mention that whopping 9% difference only works out to roughly 5 whole frames per second! ROFLMAO!!!!!

    Brad: Hey Johnny! Did you see the new UT3 review from Anandtech? They showed the Intel CPUs KILLING AMD by five frames per second!

    Johnny: Holy shit Brad!!!!! AMD is going out of business with piss poor performance like that. I'll NEVER buy AMD again.
