Removing the 2GB Barrier

As it turns out, it's possible and actually quite easy to move the 2GB barrier by increasing the size of the user space, at the cost of reducing the size of the kernel space. Under Windows XP this is the fabled "/3GB" switch for boot.ini, and under Windows Vista it's the "IncreaseUserVa" option in BCDedit. Using these options, applications can be given more than 2GB of virtual address space (generally up to 3GB), and ideally this would be the end of the article.
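For reference, here is a minimal sketch of what enabling these options looks like. The boot.ini ARC path shown is only an example and will differ from system to system, and the IncreaseUserVa value is given in megabytes (anywhere from 2048 up to 3072):

```
; boot.ini (Windows XP): append /3GB to the OS entry you boot from
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
```

```
:: Windows Vista: run from an elevated command prompt, then reboot
bcdedit /set IncreaseUserVa 3072
```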

Unfortunately this is not the case, as there are problems on both the application and kernel sides of things. On the application side, a common poor programming practice has been to assume that an application will only ever be dealing with 2GB of user space; code that makes this assumption will likely fail if more than 2GB of user space is actually available. This is avoidable by following proper programming practices, but as a safety precaution, Windows still defaults to limiting an application to 2GB even when additional virtual address space has been allocated to user space. Only if an application explicitly indicates to Windows that it can handle more than 2GB, via the "/LARGEADDRESSAWARE" flag, is it given access to any space above 2GB.
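For Visual C++ builds the flag is simply passed to the linker (/LARGEADDRESSAWARE), and an existing executable can be marked after the fact with Microsoft's editbin tool (editbin /LARGEADDRESSAWARE game.exe). As a rough illustration of what the flag actually changes, the small Win32 sketch below (standard API calls, nothing application-specific) just reports how much user-mode address space the process was given; built large-address aware and run on a machine booted with the options above, it should report roughly 3GB instead of the usual 2GB:

```cpp
// Minimal sketch: report how much user-mode virtual address space this
// process actually has. Build with the Visual C++ compiler; link with
// /LARGEADDRESSAWARE to see the larger figure on a /3GB (or IncreaseUserVa)
// system.
#include <windows.h>
#include <cstdio>

int main()
{
    MEMORYSTATUSEX status = {};
    status.dwLength = sizeof(status);
    if (GlobalMemoryStatusEx(&status))
    {
        // ullTotalVirtual is the total user-mode address space available to
        // this process: ~2GB by default, ~3GB if the executable is marked
        // large-address aware and the boot option is enabled.
        printf("User address space: %llu MB\n",
               status.ullTotalVirtual / (1024 * 1024));
    }

    SYSTEM_INFO info = {};
    GetSystemInfo(&info);
    printf("Highest usable user address: 0x%p\n", info.lpMaximumApplicationAddress);
    return 0;
}
```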

As for the kernel, having had up to half of its space taken away, it must now find a way to live in what remains. The (in)ability of any specific system/Windows configuration to deal with this is why the /3GB switch is considered dangerous, seldom recommended, and just generally a bad idea. The biggest culprits here are drivers that run in kernel space. Like applications, they may assume that there's an entire 2GB of address space to work with, except that unlike applications, their space gets smaller instead of bigger.

Windows' own memory needs can also cause problems with the reduced kernel space. As we mentioned before, the kernel requires space for a multitude of things; if a lot of space is needed - video cards with a lot of memory are a particular offender here - then everything needing space may not fit in the reduced kernel space. Because there are no strong safeguards against these conditions, the result can be a failure to boot or general system instability, especially if the culprit is a driver that behaves just well enough to let the system boot before causing problems. Many modern drivers from vendors of enterprise-level hardware are capable of handling this; many more consumer hardware drivers are not. Stability concerns are the number one reason that breaking the 2GB barrier on a 32-bit version of Windows is not recommended.

There is also a second concern, however: performance. While an individual application may benefit from having more user space in which to work, the kernel now has less space in which to cache data (as non-obvious as this may seem, given that all the addresses are virtual), and this can in theory hurt performance. This is something we will take a look at in a bit.

Comments

  • brink - Thursday, July 12, 2007 - link

    It's a two-prong solution for WinXP: you have to set the /3GB flag in your BOOT.INI file for the instance of Windows you're booting. Additionally, since SupCom doesn't have the LARGEADDRESSAWARE flag, you have to patch the EXE using the tool mentioned in the article (someone created a batch file that uses the tool to patch the SupCom EXE, easily found on the official SupCom forums).

    I mention this since the article doesn't really say how they did it (and they used Vista, which is another box to contend with). Since HardOCP did a comparison a while back between SupCom performance in XP and Vista, I've really only installed/used SupCom in XP still. With our fix on a 4GB machine (the machine I regularly use still has 2GB; I just stay away from 81KM maps), XP has remained stable, but we did have one crash in a 40KM map game on Gentleman's Reef.

    I don't like the article's focus on FPS in SupCom, mainly because I don't look at SupCom as an FPS-centric game at all. If you've played, you know that when a slow computer enters the game (or you have 7 computers each with 1,000 units on an 81KM map), the in-game timer will start to crawl: 1 second of game time will take 2 seconds, or much, much worse. It would have been approximately 100x cooler if the bench was "it took this much before the timer started to skew".
  • jay401 - Thursday, July 12, 2007 - link

    lol funny, now that I am paging through the article I see you mention this very issue. Good!
  • jay401 - Thursday, July 12, 2007 - link

    Well you didn't address the 'WHY' - why the game uses so much memory. Hopefully I provided a little light on that subject.

    Also, on Page 5 none of your graphs are labeled as to what they are measuring. Please note whether they're measuring fps, which is my gut feeling, but I'm not sure because they are unlabeled.

    Thanks.
  • MadBoris - Thursday, July 12, 2007 - link

    quote:

    Well you didn't address the 'WHY' - why the game uses so much memory. Hopefully I provided a little light on that subject.


    Although you are right in part, the lead engineer did mention that the units are one reason for the large memory consumption. (BTW, I had heard that they are all being/were re-rendered for November.) There is another issue beyond that, though, that becomes obvious: the difference in initial virtual address space at the beginning of a game between a 20k map and an 81k map is only about 150MB. But as the unit count climbs, the gap on the larger map grows somewhat exponentially compared to the smaller one. So something else is askew.

    As Ryan mentioned, the whys and wherefores aren't really the point; this issue is a global one. 2GB is a real hard limit for games now that we have the horsepower (CPU & GPU) for larger, more memory-consuming texture maps and larger resolutions, yet the 2GB memory limit for a game is a definitive roadblock to forward progress, so I am glad the issue is coming to the forefront.

    As much confusion and fear as there is on this /3GB subject for laymen, it's still a great rabbit in the hat for those of us on 32-bit OSes; if more driver writers get on the ball, the fears can subside. Hopefully devs like Crytek can continue to push demand for 64-bit with a nice 64-bit Crysis patch too, and we can start making the transition, leaving 32-bit behind as drivers/apps also make the move.

    I think articles like these help the cause.
  • Ryan Smith - Thursday, July 12, 2007 - link

    To be honest, we didn't address why because it really isn't relevant. Even if Supreme Commander was done perfectly in every way, the result would have been the same once it reached the 2GB barrier.
  • gigahertz20 - Thursday, July 12, 2007 - link

    They should have made Vista 64-bit only to put pressure on the transition.
  • johnsonx - Friday, July 13, 2007 - link

    The problem with going 64-bit only this time around is that there are too many 32-bit programs around that simply won't run on 64-bit Windows. I have several myself that I depend on daily. They are slightly older programs that the developer doesn't intend to upgrade to 64-bit, but that doesn't change the fact that I need them. If these apps didn't work with Vista because it was released in 64-bit-only form, then I wouldn't be running Vista. Millions of others are (or would be) in this same situation, which would significantly harm Vista sales.

    If the next version of Windows were made 64-bit only, around the 2010 time frame, I think that would be quite reasonable. By then most 32-bit only programs will have been replaced or rendered obsolete.

    I think Microsoft has handled the 32/64-bit issue correctly so far, for the most part. XP64 should have been ready sooner and should have been better supported though.

    Related question for anyone who knows: I know retail Vistas include 32-bit and 64-bit on the same disc, and the user is free to install either. I also know that OEM Vistas include only the one version on the disc. What about the OEM keycodes though? Can you install a 64-bit Vista using the keycode that came with a 32-bit disc? Or has MS limited the keycode as well?
  • StygianAgenda - Thursday, June 5, 2008 - link

    To answer your question about the Windows Vista retail package: I have 2 copies of Vista Ultimate retail, and each was packed with 2 DVDs, one the 32-bit build and the other the 64-bit build.

    The set comes with a single key, and the key is bound to the 64-bit version, so if you opt to use the 32-bit version instead, you'll most likely have to call in to Microsoft's activation center and manually activate your copy. I've had to do this 4 times now due to hardware changes, because Vista detects system changes; if you remove 1 or 2 boards and boot into Vista, the system will automatically de-activate. Now, granted, the call to MS was fairly painless, but it's annoying all the same.

    Out of the 4 Vista systems I own currently (3 of which are laptops), I've had great success with the OS itself. Unfortunately for me, the motherboard I've been using in my custom-built workstation is flaky... I've done my research though (tonight), and might have a fix in the works, if it works, that is. Otherwise, I'll be ordering a new motherboard and calling Microsoft yet again to transfer my license to the new configuration. By the way, they always ask "Is this copy running on more than one PC?". In light of all the hoopla over the licensing scheme in Vista, I would hope that no one is stupid enough to try to use a Vista retail license on multiple PCs, because it'll cause all of them to be blacklisted.

    Oh, and the new Vista Enterprise edition is only available in lots of 25 licenses or higher, and requires a licensing server on the LAN with the deployed workstation licenses. It's either that, or expect a couple hundred extra MB or so of net traffic from all of the Vista Enterprise workstations checking in with MS every time the systems are booted.

    Makes me glad that I also work with Linux very heavily. All things considered, if Linux + WINE can run all of my critical Win32 apps, then these will be the last Vista licenses that I buy. I'll still keep Vista on my laptops, and I'll continue to run my XP workstations and 2K3 servers, but MS is going to have to do some really impressive work to convince me to migrate to their next platform... such as maybe... a *real* 3D desktop... which is already available, stable, and totally badass on Linux (check out Kubuntu + compiz).

    (btw: Sorry if I seem like I'm on a rant here... no offense intended toward the readers at all... it's just that when you work with OSes at the level that I do, after a while stupid mistakes made by OS vendors start to get beyond aggravating.)
  • instant - Saturday, July 21, 2007 - link

    And when we are talking about GAMES, how, if at all, is this relevant to the current discussion?

    x64 has been the way to go ever since it was released.
  • miahallen - Saturday, July 14, 2007 - link

    That is incorrect; Vista x64 will run x86 apps without problem (so will XP x64), and that's the nice thing about it. I ran x64 for quite a while, and ran almost nothing but x86 apps on it.

    The keycodes for x86 do not change for x64 installs... they use the same key. And the retail versions I bought did not have both versions on one disc; I had to order an x64 disc online (and pay $10 for S&H).
