Rage Against the (Benchmark) Machine
by Jarred Walton on October 14, 2011 11:55 PM EST
The bigger news with Rage is that this is id’s launch title to demonstrate what their id Tech 5 engine can do. It’s also the first major engine in a long while to use OpenGL as the core rendering API, which makes it doubly interesting for us to investigate as a benchmark. And here’s where things get really weird, as id and John Carmack have basically turned the whole gaming performance question on its head. Instead of fixed quality and variable performance, Rage shoots for fixed performance and variable quality. This is perhaps the biggest issue people are going to have with the game, especially if they’re hoping to be blown away by id’s latest graphical tour de force.
Running on my gaming system (if you missed it earlier, it’s an i7-965X @ 3.6GHz, 12GB RAM, GTX 580 graphics), I get a near-constant 60FPS, even at 2560x1600 with 8xAA. But there’s the rub: I never get more than 60FPS, and certain areas look pretty blurry no matter what I do. The original version of the game offered almost no options other than resolution and antialiasing, while the latest patch has opened things up a bit by adding texture cache and anisotropic filtering settings—these can be set to either Small/Low (the pre-patch default) or Large/High. If you were hoping for a major change in image quality, however, post-patch there’s still plenty going on that limits the overall quality. For one, even with 21GB of disk space, id’s megatexturing may provide near-unique textures for the game world, but many of those textures are still low resolution. Antialiasing is also a bit odd, as it appears to have very little effect on performance (up to a certain point); the most demanding games choke at 2560x1600 with 4xAA, even on a GTX 580, but Rage chugs along happily at 8xAA. (16xAA, on the other hand, cuts frame rates almost in half.)
The net result is that both before and after the latest patch, people have been searching for ways to make Rage look better/sharper, with marginal success. I grabbed one of the custom configurations listed on the Steam forums to see if that helped at all. There appears to be a slight tweak in anisotropic filtering, but that’s about it. [Edit: removed link as the custom config appears mostly worthless—see updates.] I put together a gallery of several game locations using my native 2560x1600 resolution with 8xAA, at the default Small/Low settings (for texturing/filtering), at Large/High, and using the custom configuration (Large/High with additional tweaks). These are high quality JPEG files that are each ~1.5MB, but I have the original 5MB PNG files available if anyone wants them.
You can see that post-patch, the difference between the custom configuration and the in-game Large/High settings is negligible at best, while the pre-patch (default) Small/Low settings have some obvious blurriness in some locations. Dead City in particular looked horribly blurred before the patch; I started playing Rage last week, and I didn’t notice much in the way of texture blurriness until I hit Dead City, at which point I started looking for tweaks to improve quality. It looks better now, but there are still a lot of textures that feel like they need to be higher resolution/quality.
Something else worth discussing while we’re on the subject is Rage’s texture compression format. S3TC (also called DXTC) is the standard compressed texture format, first introduced in the late 90s. S3TC/DXTC achieves a fixed compression ratio: 6:1 for opaque textures (DXT1 vs. 24-bit RGB) or 4:1 when an alpha channel is included (DXT5 vs. 32-bit RGBA). John Carmack has stated that all of the uncompressed textures in Rage occupy around 1TB of space, so obviously that’s not something they could ship/stream to customers, as even with a 6:1 compression ratio they’d still be looking at around 170GB of textures. In order to get the final texture content down to a manageable 16GB or so, Rage uses the HD Photo/JPEG XR format to store its textures. The JPEG XR content then gets transcoded on the fly into DXTC, which is used for texturing the game world.
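To make the storage arithmetic concrete, here’s a quick back-of-the-envelope sketch in Python using only the figures quoted above (the ~1TB, 6:1, and ~16GB numbers); nothing here is measured data:

```python
# Back-of-the-envelope storage math using the figures quoted above
# (~1TB of uncompressed textures, 6:1 S3TC, ~16GB actually shipped).

TB_IN_GB = 1024

uncompressed_gb = 1 * TB_IN_GB   # Carmack's ~1TB of uncompressed texture data
s3tc_ratio = 6                   # best-case S3TC/DXTC (DXT1 vs. 24-bit RGB)
shipped_gb = 16                  # approximate size of Rage's shipped textures

s3tc_only_gb = uncompressed_gb / s3tc_ratio     # roughly 170GB: too big to ship
effective_ratio = uncompressed_gb / shipped_gb  # what JPEG XR had to deliver overall

print(f"S3TC alone: ~{s3tc_only_gb:.0f}GB")
print(f"Effective overall ratio: ~{effective_ratio:.0f}:1")
```

In other words, S3TC by itself only gets you about a sixth of the way there; storing JPEG XR on disk and transcoding to DXTC at runtime is what closes the remaining gap.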
The transcoding process is one area where NVIDIA gets to play their CUDA card once more. When Anand benchmarked the new AMD FX-8150, he ran the CPU transcoding routine in Rage as one of numerous tests.
I tried the same command post-patch, and with or without CUDA transcoding my system reported a time of 0.00 seconds (even with one thread), so that benchmark appears to be broken now as well. Anyway, I’d assume that a GTX 580 will transcode textures faster than any current CPU, but just how much faster I can’t say. AMD graphics, on the other hand, will currently have to rely on the CPU for transcoding.
Update: Sorry, I didn't realize that you had to have a game running rather than just using vt_benchmark at the main menu. Bear in mind that I'm using a different location than Anand used in his FX-8150 review; my save is in Dead City, which tends to be one of the more taxing areas. I'm using two different machines as a point of reference, one a quad-core (plus Hyper-Threading) 3.65GHz i7-965 and the other a quad-core i7-2630QM. I've also got results with and without CUDA, since both systems are equipped with NVIDIA GPUs. Here's the result, which admittedly isn't much:
This is using "vt_benchmark 8" and reporting the best score, but regardless of the number of threads it's pretty clear that CUDA is able to speed up the image transcoding process. How much this actually affects gameplay isn't so clear, as new textures are likely transcoded in small bursts once the initial level load is complete. It's also worth pointing out that GPU transcoding looks like it would be of more benefit with slower CPUs: my desktop realized a 41% improvement, while the lower-clocked notebook (even with a slower GPU) realized a 52% improvement. I also tested the GTX 580 and GTX 560M with and without CUDA transcoding and didn't notice a difference in performance, but I don't have any empirical data. That brings us to the final topic.
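For anyone wanting to compare their own vt_benchmark runs, the percentage improvement quoted above is just the reduction in best transcode time with CUDA enabled versus CPU-only. A minimal sketch; the sample times below are hypothetical, not my measured results:

```python
def transcode_improvement(cpu_time_s: float, cuda_time_s: float) -> float:
    """Percent reduction in vt_benchmark transcode time from enabling CUDA."""
    return (cpu_time_s - cuda_time_s) / cpu_time_s * 100

# Hypothetical example times (seconds), not measurements from my systems:
print(f"{transcode_improvement(10.0, 5.9):.0f}% faster with CUDA")  # prints "41% faster with CUDA"
```

Run your own numbers through that and you can see at a glance how much the GPU path is buying you relative to your CPU.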