A Case Study, Cont'd

Now that we've seen what can happen when we reach the 2GB barrier and how easy it can be to pass it, let's talk about what it took to remove that barrier for Supreme Commander in the first place. Supreme Commander is unfortunately not compiled as large address aware and must be modified so that Windows thinks that it is. Microsoft supplies a tool as part of the Visual Studio suite, Editbin, that can do just this by rewriting the executable's header to report to Windows that it is in fact large address aware.
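
In practice this amounts to a single command run from a Visual Studio command prompt, along the lines of the following (the executable name here is illustrative, and keeping a backup of the original file is a good idea):

    editbin /LARGEADDRESSAWARE SupremeCommander.exe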

To the best of our knowledge Supreme Commander was written using proper programming practices and can handle the larger address space, so this is merely a matter of turning it on. However, on a more pragmatic note, this can break future patches, and disturbingly it doesn't set off any sort of multiplayer cheat detection in the game, in spite of the fact that we have modified the executable in a very visible way. Of the two changes we need to make to deal with the 2GB barrier, though, this is the safer one.

Update: Gas Powered Games contacted us and let us know that the modified executable not setting off any cheat detection is intentional. The game code is all in a DLL, and the executable is just a launcher; it's left unchecked because the various Digital Rights Management systems used change the executable.

Changing Windows, on the other hand, to allocate more of the virtual address space to applications is in practice just as dangerous as we theorized earlier. We initially set our copy of Windows Vista to adjust the split to 1GB kernel mode and 3GB user mode, only for Vista to encounter a BSOD while booting. We had to settle for a 1.4GB/2.6GB split before Vista would boot, and even then Vista still periodically encounters a BSOD while booting at that or any other allocation besides the default 2GB/2GB. Exactly what problems occur, and at what values, varies greatly from system to system, which is why trying to move the barrier at all can be dangerous.
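
For reference, Vista no longer uses the old boot.ini /3GB switch for this; the user/kernel split is adjusted through the boot configuration data, roughly as follows from an elevated command prompt (the value is the user space size in megabytes, and a reboot is required for it to take effect):

    bcdedit /set IncreaseUserVa 3072       (3GB user / 1GB kernel)
    bcdedit /set IncreaseUserVa 2662       (roughly the 2.6GB/1.4GB split we settled on)
    bcdedit /deletevalue IncreaseUserVa    (restore the default 2GB/2GB split)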

Having made the above changes, we also used the chance to take a look at system performance both in and outside of Supreme Commander, taking an interest in the effect of allocating more user mode space. As we theorized before, taking space away from the kernel may impact performance, and this is something that needs to be tested. For this we ran a cut-down version of our normal system test suite with allocations of 1.4GB/2.6GB, 1.7GB/2.3GB, and the default 2GB/2GB.

Software Test Bed
Processor:                AMD Athlon 64 4600+ (2x2.4GHz, 512KB cache, Socket 939)
RAM:                      OCZ EL Platinum DDR-400 (4x512MB)
Motherboard:              ASUS A8N-SLI Premium (nForce 4 SLI)
System Platform Drivers:  NV 15.00
Hard Drive:               Maxtor MaXLine Pro 500GB SATA
Video Cards:              1 x GeForce 8800GTX
Video Drivers:            NV ForceWare 158.45
Power Supply:             OCZ GameXStream 700W
Desktop Resolution:       1600x1200
Operating System:         Windows Vista Ultimate 32-bit

Starting with Supreme Commander, the only test here that can even utilize more than 2GB of user space, we do find a minor but consistent difference in performance. Increasing the user space improved Supreme Commander's performance by about 1 frame per second, which at roughly a 3.5% improvement sits right on the edge between a meaningful result and normal run-to-run variance. We repeated the test several times and the results remained consistent, so it does not appear to be mere noise. With that said, the instability caused by adjusting the user space size does not justify such an extremely minor performance improvement.

Going down the list of benchmarks, we find no notable change in performance in any of them. Since none of these benchmarks can make use of the additional user space we weren't expecting an increase, but this puts to rest the idea of a performance decrease. Shrinking the kernel's share of the address space does not appear to hurt performance: if the system boots, it performs just as well as before.


Comments

  • titan7 - Thursday, July 12, 2007 - link

    There is nothing you can do. If you ship a game you can test it and ensure it doesn't run out of memory (e.g. the GameCube has just 24 megs of system memory and the last Zelda still looked great, quite a ways away from the 2048 megabytes PC games run into!), but what about a mod?

    When the application starts, just allocate 512 megabytes or whatever you feel is reasonable. When new throws an exception, free that memory, display a warning that you're low on memory and need to upgrade to Vista64, and continue (see the sketch below). When new fails again one microsecond later you're screwed, so display a message to the end user along the lines of "I told you so!" End users really like that type of thing. ;)

    You could get a bit fancier by replacing all units with simple cubes or something, but all that does is delay the inevitable a bit longer.
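
    A minimal sketch of this parachute approach, assuming a C++ game and std::set_new_handler; the 512MB figure, names, and messages are purely illustrative:

        #include <cstdlib>
        #include <iostream>
        #include <new>

        // "Parachute" memory reserved at startup; released on the first allocation
        // failure so there is enough headroom left to warn the player gracefully.
        static char* g_parachute = nullptr;
        static const std::size_t kParachuteSize = 512u * 1024u * 1024u; // 512MB, illustrative

        void OutOfMemoryHandler()
        {
            if (g_parachute != nullptr)
            {
                delete[] g_parachute;   // hand the reserve back to the heap
                g_parachute = nullptr;
                std::cerr << "Warning: address space nearly exhausted - "
                             "save your game and consider a 64-bit OS.\n";
                return;                 // operator new will retry the failed allocation
            }
            // The reserve is already gone; this failure is fatal.
            std::cerr << "Out of memory.\n";
            std::abort();
        }

        int main()
        {
            g_parachute = new char[kParachuteSize];
            std::set_new_handler(OutOfMemoryHandler);
            // ... run the game loop here ...
            return 0;
        }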
  • ncage - Thursday, July 12, 2007 - link

    1) The first step is to detect which OS you are running at startup. Is it a 64-bit OS or not?

    2) You SHOULD never code your application around a 2GB memory limit; it is very bad coding practice. Going through thorough examples in a short post like this isn't very practical.

    3) Some higher-level languages/constructs abstract this away from you. For example, if you are using the .NET CLR you don't really have to worry about this unless you're doing some crazy P/Invoke stuff, which in most cases you shouldn't be doing anyway. Of course if you're doing VB6 you're never going to get around it anyway, because VB6 is only 32-bit.

    4) If you are doing C++ or assembly on Windows then you can use the GlobalMemoryStatus() function to effectively see how much available address space you have.
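
    A quick sketch of that kind of check; this uses the newer GlobalMemoryStatusEx() variant of the call, where ullAvailVirtual is the amount of user-mode address space still free:

        #include <windows.h>
        #include <cstdio>

        int main()
        {
            MEMORYSTATUSEX status = {};
            status.dwLength = sizeof(status);   // must be set before calling the API
            if (GlobalMemoryStatusEx(&status))
            {
                // ullTotalVirtual: size of the user-mode address space (2GB by default on 32-bit)
                // ullAvailVirtual: how much of that space is still unreserved
                std::printf("User address space: %llu MB total, %llu MB free\n",
                            status.ullTotalVirtual >> 20,
                            status.ullAvailVirtual >> 20);
            }
            return 0;
        }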
  • Ryan Smith - Thursday, July 12, 2007 - link

    The key to any of this is monitoring how much of the virtual address pool is in use; there should be an API call to ask Windows this. The easiest thing to do would be to give a warning at 1.9GB or so and then either do nothing, trigger a crash early, or attempt to reduce detail or flush space to stay below the 2GB barrier. The warning is the easy part; the hard part is preventing the crash, and I honestly don't believe anyone can or will prevent it.
  • yacoub - Thursday, July 12, 2007 - link

    I just wish we had a better solution than Vista. Sure we can use 64bit XP but that's only going to last how much longer with full patch support from MS?
    If Vista weren't such a pile and didn't perform worse in games on equivalent hardware than the same system running XP, it wouldn't be such an unappealing alternative.

    And even so, when running 4GB of RAM, how much are you really gaining with a 64-bit OS over a LargeAddressAware-flagged game with the 3GB boot.ini switch? Not much, really. We first need motherboards that are happy running 8GB of RAM, RAM cheap enough to buy 8GB for a reasonable price (which is not too hard with DDR2 2GB DIMMs right now), and that can do so at full performance/speed settings.

    Really it's not just a move to a 64bit OS, it's also a move to 8GB of RAM.

    OR it's simply having developers who code their games to work properly within 3GB of addressable RAM.
  • instant - Saturday, July 21, 2007 - link

    How many more patches do you need for XP 64bit anyway?

    As long as the games work on it, why care whether Microsoft keeps issuing "security" hotfixes or not?
  • Tegeril - Thursday, July 12, 2007 - link

    Articles like this make me very glad that I opted to go with 64bit Vista. All my hardware is supported at this point with stable drivers (we can argue the Creative X-Fi, but it works fine). I'm just amazed that people saw this coming and yet we have games that just die because of the problem. 64 bit isn't as bad as people make it out to be regarding compatibility :D - except iTunes and Quicktime :(
  • halfeatenfish - Thursday, July 12, 2007 - link

    How do the *nix variants deal with this same issue? Do they even have it? Can someone shed some light, especially in terms of OS X... Leopard, if anyone knows anything there. But Tiger info is just as good.
  • titan7 - Thursday, July 12, 2007 - link

    They all do the same thing for 32-bit. But generally speaking *nix has been 64-bit for years (decades even), so if it is a problem just run the 64-bit version of everything. And being more cross-platform, their code tends to have fewer hacks of the kind you get in Windows apps that assume there is an extra bit available on every pointer. A single bit! Bah, we're talking about billions of bytes and elite programmers are trying to squeeze every last bit out of their application at the expense of future compatibility. LAME.
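
    For illustration, that sort of pointer hack looks something like this hypothetical snippet; it only works while every user-mode address stays below 0x80000000 on a 32-bit build, which is exactly the assumption large address awareness breaks:

        #include <cstdint>

        // Hypothetical "clever" trick: stash a boolean flag in bit 31 of a pointer,
        // assuming user-mode addresses never go above 0x7FFFFFFF (32-bit build).
        inline void* TagPointer(void* p, bool flag)
        {
            std::uintptr_t bits = reinterpret_cast<std::uintptr_t>(p);
            return reinterpret_cast<void*>(bits | (flag ? 0x80000000u : 0u));
        }

        inline void* StripTag(void* p)
        {
            // Once the process runs large address aware, legitimate pointers can
            // have bit 31 set, and masking it off silently corrupts them.
            std::uintptr_t bits = reinterpret_cast<std::uintptr_t>(p);
            return reinterpret_cast<void*>(bits & 0x7FFFFFFFu);
        }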
  • The Boston Dangler - Thursday, July 12, 2007 - link

    OSX is a 64-bit system, *nix ymmv
  • MadBoris - Thursday, July 12, 2007 - link

    Cool article Ryan. Good to see these issues getting more global attention.
    Since 32-bit seems to be here to stay for a lot longer than we want it to, and with software bloat continuing, this will hopefully keep putting pressure on driver devs to write better drivers that can handle >2GB addresses without issue, so that people can use the /3GB switch without concern. I personally have never had problems with /3GB with any of my hardware/drivers, but certainly 'less mainstream' drivers may not be handled with the care that they should be.

    I like the breakdown of games/apps that support the LargeAddressAware flag; maybe this list can grow in future articles covering more apps/games. I also enjoyed your testing on the "potential" penalty of less kernel space, something I never took the time to do on my own.

    Imagine my surprise today when making my rounds to my favorite hardware site. ;)
