Technical Discussion

The bigger news with Rage is that this is id’s launch title to demonstrate what their id Tech 5 engine can do. It’s also the first major engine in a long while to use OpenGL as the core rendering API, which makes it doubly interesting for us to investigate as a benchmark. And here’s where things get really weird, as id and John Carmack have basically turned the whole gaming performance question on its head. Instead of fixed quality and variable performance, Rage shoots for fixed performance and variable quality. This is perhaps the biggest issue people are going to have with the game, especially if they’re hoping to be blown away by id’s latest graphical tour de force.

Running on my gaming system (if you missed it earlier, it’s an i7-965X @ 3.6GHz, 12GB RAM, GTX 580 graphics), I get a near-constant 60FPS, even at 2560x1600 with 8xAA. But there’s the rub: I don’t ever get more than 60FPS, and certain areas look pretty blurry no matter what I do. The original version of the game offered almost no options other than resolution and antialiasing, while the latest patch has opened things up a bit by adding texture cache and anisotropic filtering settings, which can be set to either Small/Low (default pre-patch) or Large/High. If you were hoping for a major change in image quality, however, there’s still plenty holding the overall quality back post-patch. For one, even with 21GB of disk space, id’s megatexturing may provide near-unique textures for the game world, but many of those textures are still low resolution. Antialiasing is also a bit odd, as it appears to have very little effect on performance (up to a certain point); the most demanding games choke at 2560x1600 4xAA, even with a GTX 580, but Rage chugs along happily with 8xAA. (16xAA, on the other hand, cuts frame rates almost in half.)

The net result is that both before and after the latest patch, people have been searching for ways to make Rage look better/sharper, with marginal success. I grabbed one of the custom configurations listed on the Steam forums to see if that helped at all. There appears to be a slight tweak in anisotropic filtering, but that’s about it. [Edit: removed link as the custom config appears mostly worthless—see updates.] I put together a gallery of several game locations using my native 2560x1600 resolution with 8xAA, at the default Small/Low settings (for texturing/filtering), at Large/High, and using the custom configuration (Large/High with additional tweaks). These are high quality JPEG files that are each ~1.5MB, but I have the original 5MB PNG files available if anyone wants them.

You can see that post-patch, the difference between the custom configuration and the in-game Large/High settings is negligible at best, while the pre-patch (default) Small/Low settings show obvious blurriness in certain locations. Dead City in particular looked horribly blurred before the patch; I started playing Rage last week, and I didn’t notice much in the way of texture blurriness until I hit Dead City, at which point I started looking for tweaks to improve quality. It looks better now, but there are still a lot of textures that feel like they need to be higher resolution/quality.

Something else worth discussing while we’re on the subject is Rage’s texture compression format. S3TC (also called DXTC) is the standard compressed texture format, first introduced in the late 90s. S3TC/DXTC achieves a fixed 4:1 or 6:1 compression ratio, depending on the variant. John Carmack has stated that all of the uncompressed textures in Rage occupy around 1TB of space, so obviously that’s not something they could ship/stream to customers; even with a 6:1 compression ratio they’d still be looking at around 170GB of textures. In order to get the final texture content down to a manageable 16GB or so, Rage uses the HD Photo/JPEG XR format to store its textures. The JPEG XR content then gets transcoded on-the-fly into DXTC, which is used for texturing the game world.
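To put those numbers in context, here’s a quick back-of-the-envelope sketch (my own arithmetic, not anything from id’s code) showing where the standard S3TC/DXTC block ratios come from and how the ~1TB figure shrinks at 6:1:

```cpp
#include <cstdio>

int main() {
    // S3TC/DXTC compresses textures in fixed-size 4x4 texel blocks.
    const double rgb24Block  = 4 * 4 * 3;  // 48 bytes of uncompressed 24-bit RGB
    const double rgba32Block = 4 * 4 * 4;  // 64 bytes of uncompressed 32-bit RGBA
    const double dxt1Block   = 8;          // DXT1 stores each block in 8 bytes
    const double dxt5Block   = 16;         // DXT5 stores each block in 16 bytes

    printf("DXT1 vs 24-bit RGB:  %.0f:1\n", rgb24Block / dxt1Block);   // 6:1
    printf("DXT5 vs 32-bit RGBA: %.0f:1\n", rgba32Block / dxt5Block);  // 4:1

    // ~1TB of uncompressed source art at 6:1 is still ~170GB -- far too big
    // to ship, hence JPEG XR on disk and DXTC only generated at runtime.
    const double sourceGB = 1024.0;
    printf("~1TB at 6:1 = ~%.0fGB\n", sourceGB / 6.0);
    return 0;
}
```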

The transcoding process is one area where NVIDIA gets to play their CUDA card once more. When Anand benchmarked the new AMD FX-8150, he ran the CPU transcoding routine in Rage as one of numerous tests. I tried the same command post-patch, and with or without CUDA transcoding my system reported a time of 0.00 seconds (even with one thread), so that appears to be broken now as well. Anyway, I’d assume that a GTX 580 will transcode textures faster than any current CPU, but just how much faster I can’t say. AMD graphics on the other hand will currently have to rely on the CPU for transcoding.
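As a rough illustration of the split described above, here’s a minimal sketch of how a virtual-texture transcoder might parallelize the JPEG XR to DXTC work across CPU threads. This is not id’s actual code; the Page struct and the decodeJpegXr/compressToDxt stubs are stand-ins I’ve made up purely to show the shape of the work that vt_benchmark times.

```cpp
#include <cstddef>
#include <cstdint>
#include <thread>
#include <vector>

struct Page { std::vector<std::uint8_t> jpegXr, dxt; };

// Stand-in stubs -- Rage's real decoder/compressor is not public.
static std::vector<std::uint8_t> decodeJpegXr(const std::vector<std::uint8_t>& src) {
    return std::vector<std::uint8_t>(src.size() * 6, 0); // pretend: JPEG XR -> raw RGBA
}
static std::vector<std::uint8_t> compressToDxt(const std::vector<std::uint8_t>& rgba) {
    return std::vector<std::uint8_t>(rgba.size() / 4, 0); // pretend: RGBA -> DXT blocks
}

// CPU path: split the requested pages across N worker threads, which is
// roughly the work that "vt_benchmark <threads>" measures.
static void transcodePagesOnCpu(std::vector<Page>& pages, unsigned numThreads) {
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < numThreads; ++t)
        workers.emplace_back([&pages, t, numThreads] {
            for (std::size_t i = t; i < pages.size(); i += numThreads)
                pages[i].dxt = compressToDxt(decodeJpegXr(pages[i].jpegXr));
        });
    for (auto& w : workers) w.join();
}

int main() {
    // 256 fake pages of compressed data; in the game, pages stream in as needed.
    std::vector<Page> pages(256, Page{std::vector<std::uint8_t>(4096, 0), {}});
    // On NVIDIA hardware the same batch can be handed to a CUDA path instead;
    // AMD GPUs currently always fall back to a CPU loop like this one.
    transcodePagesOnCpu(pages, 8); // 8 worker threads, as with "vt_benchmark 8"
    return 0;
}
```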

Update: Sorry, I didn't realize that you had to have a game running rather than just using vt_benchmark at the main menu. Bear in mind that I'm using a different location than Anand used in his FX-8150 review; my save is in Dead City, which tends to be one of the more taxing areas. I'm using two different machines as a point of reference, one a quad-core (plus Hyper-Threading) 3.65GHz i7-965 and the other a quad-core i7-2630QM. I've also got results with and without CUDA, since both systems are equipped with NVIDIA GPUs. Here's the result, which admittedly isn't much:

[Chart: Rage Transcoding Performance]

This is using "vt_benchmark 8" and reporting the best score, but regardless of the number of threads it's pretty clear that CUDA is able to help speed up the image transcoding process. How much this actually affects gameplay isn't so clear, as new textures are likely transcoded in small bursts once the initial level load is complete. It's also worth pointing out that GPU transcoding looks like it would be of more benefit with slower CPUs: my desktop realized a 41% improvement, while the lower-clocked notebook (even with a slower GPU) realized a 52% improvement. I also tested the GTX 580 and GTX 560M with and without CUDA transcoding and didn't notice a difference in performance, but I don't have any empirical data. That brings us to the final topic.
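For reference, those improvement percentages are simply (best CPU time / best CUDA time) - 1. A trivial sketch with made-up placeholder times (not the measured numbers behind the chart) shows the arithmetic:

```cpp
#include <cstdio>

// Percent improvement from enabling CUDA transcoding, given the best
// vt_benchmark times with and without it.
static double improvementPct(double cpuBestSec, double cudaBestSec) {
    return (cpuBestSec / cudaBestSec - 1.0) * 100.0;
}

int main() {
    // Illustrative placeholder times only -- not actual results.
    printf("%.0f%% faster\n", improvementPct(1.00, 0.71)); // ~41%
    printf("%.0f%% faster\n", improvementPct(1.00, 0.66)); // ~52%
    return 0;
}
```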

Comments

  • KikassAssassin - Sunday, October 16, 2011

    All Rage really needed to do was check your driver version, and if you're using an older driver than the game was designed for, pop up a message telling you to update your driver, with a link to download it from AMD/nvidia's website. They knew perfectly well that the game was broken on all but the latest drivers that were being released alongside the game, and few people were going to have those drivers installed when they tried playing the game for the first time, but they didn't do anything to let people know that beforehand.

    I'm a big fan of id and I'm enjoying Rage, but they really dropped the ball with this game's launch.
  • lkuzmanov - Monday, October 17, 2011

    erm... it's not the chip makers' responsibility to ensure a game runs - they certainly can and do help the devs, but QA should be part of the development / release cycle. in this context what's happening with Rage is just silly - your product either runs with the currently available drivers or you don't release it - it should be that simple.

    "Until then, Rage remains an impressive technology demonstration and a game that some people will certainly enjoy; it’s just not a good benchmark for us right now."

    the game would've qualified as "an impressive technology demonstration" had it come out in a somewhat playable state. it didn't. oddly enough, however, the tone of most reviews I've seen remains positive.
  • JarredWalton - Monday, October 17, 2011

    This totally depends on your hardware. It ran fine on my system prior to the patch, albeit with the beta NVIDIA drivers. I think it's mostly on lower spec machines and/or AMD GPUs that people are having problems.
  • ProDigit - Saturday, October 15, 2011

    Their conclusions are a story by itself!
  • JarredWalton - Saturday, October 15, 2011

    Not sure exactly what you're trying to say here... that I should have posted just a two paragraph blurb? Or that you like the long articles?
  • Taft12 - Tuesday, October 18, 2011

    I think he's saying "I have a short attention span, doesn't everybody?"
  • Iketh - Saturday, October 15, 2011

    you're a minority
  • ltcommanderdata - Saturday, October 15, 2011

    I was hoping you could comment on what version of OpenGL id Tech 5 is using. I'm under the impression that it's using OpenGL 2.1 + a ton of extensions. Do you think it would have been easier for AMD and nVidia to support if id had stuck to a more standard OpenGL implementation such as stock OpenGL 3.0?

    And any word on adding OpenCL support for GPGPU transcode? In earlier conference presentations, id said they were looking at a variety of options, including CUDA and OpenCL. I'm surprised they didn't go with OpenCL, seeing as it would have supported both CPUs and AMD and nVidia GPUs instead of requiring a separate codepath for nVidia GPUs and a CPU codepath for everything else. With OpenCL 1.1 drivers now available for AMD CPUs and GPUs, nVidia GPUs, and Intel CPUs (and soon Ivy Bridge), now is definitely the time for the first OpenCL-accelerated game.
  • Ryan Smith - Saturday, October 15, 2011

    The context that Rage creates is an OpenGL 3.2 context. I couldn't tell you off the top of my head what they're using from OpenGL 3.x, though; OpenGL is nebulous in general like that. Given that they're using a 3.2 context, I wouldn't be too worried about what specific features they are or are not using. It's not like OpenGL 4.x, where there are specific headliner features (e.g. tessellation) that people are familiar with.

    As for OpenCL, the last thing I heard was the famous post a couple of weeks ago about CUDA, which noted that OpenCL was a performance issue. There hasn't been any news on that front that I'm aware of.
  • swaaye - Tuesday, October 18, 2011

    On Beyond3D in the PC Gaming forum, NV's Evan Hart (engineer behind most of the GPU transcoding) stopped by and mentioned that OpenCL was tested perhaps years ago on older hardware and it performed poorly at the time. Apparently it was not further explored. I'm not sure which company lacked the motivation to go further with it.
