How much memory is "enough"?

Bill Gates is often misquoted as having said something to the effect that "no one will ever need more than 640K of memory!" Happy to poke fun at himself, Gates suggested that anyone who actually believes this legendary quote probably also thinks that Microsoft is working on an email tracking system.



While the actual specifics of what was said may be lost to time and fading memories, the basic idea is that, at some point, even unimaginable amounts of memory are likely to be exhausted. With the arrival of 64-bit computing - and obviously XP-64 is not the first to the party, although we'll leave discussions of Linux and other 64-bit OS solutions out for now - we now have the potential to address up to 2^64 bytes of memory (16 EiB, if you prefer). Gates quipped, "I'll be very careful not to say that 2 to the 64th will be enough memory for anyone. I will say that it might last us for a little while; it's quite a bit of memory, but some day somebody will write code that wants to go even beyond that."

In reality, our current x86-64 systems can't actually address that much memory - and with the largest readily available DIMMs currently coming in at 2 GB, it would require over two billion such DIMMs to provide 2^64 bytes of memory! For the present, hardware is limited to 40-bit or 48-bit physical address spaces, depending on the implementation, which would still take anywhere from hundreds to over a hundred thousand 2 GB DIMMs to fill. As the hardware limits are approached, things can be modified to stretch the physical address space until it eventually reaches 64 bits. When will this occur? Given that it took nearly two decades to exceed the constraints of the 32-bit address space, 64 bits could very well last for several decades (at least on the desktop). But that's just speculation for now.
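To put those limits in perspective, here is a quick back-of-the-envelope illustration of our own (assuming 2 GB modules, i.e. 2^31 bytes each) of how many DIMMs it would take to fully populate a 40-bit, 48-bit, and 64-bit physical address space:

```c
#include <stdio.h>

int main(void)
{
    /* One 2 GB DIMM holds 2^31 bytes, so filling an N-bit physical
       address space takes 2^(N - 31) DIMMs. */
    const unsigned dimm_bits = 31;

    unsigned long long dimms_40 = 1ULL << (40 - dimm_bits); /* 1 TiB   ->         512 DIMMs */
    unsigned long long dimms_48 = 1ULL << (48 - dimm_bits); /* 256 TiB ->     131,072 DIMMs */
    unsigned long long dimms_64 = 1ULL << (64 - dimm_bits); /* 16 EiB  -> ~8.6 billion DIMMs */

    printf("40-bit: %llu DIMMs\n", dimms_40);
    printf("48-bit: %llu DIMMs\n", dimms_48);
    printf("64-bit: %llu DIMMs\n", dimms_64);
    return 0;
}
```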



One of the areas that we really need to talk about is who specifically needs more than a 32-bit memory space. While everyone stands to benefit in some ways from additional memory, the truth is that we will not all benefit equally. Servers and high-end workstations have already been available in 64-bit designs for a while, and they remain the primary examples of usage models that require more memory. Some examples of the server uses for 64-bit computing are outlined above. A further example given was the MSN Instant Messenger servers: MS was able to reduce the number of servers - and thus the space required - while improving performance by shifting to a 64-bit environment.

On the desktop front, the vast majority of people aren't waiting with bated breath for Word 64 and Excel 64; instead, it's the content creation crowd - people working with large 3D models, movies, and images - who are beginning to run into the memory wall. 3D gaming may hit that wall next, although it may still be a couple more years. After conversations with several vendors following the keynote, we feel safe in stating that a major need for 64-bit Windows will only arise if you're already running at least 2 GB of RAM. If you're not running that much memory, it doesn't necessarily mean you should avoid upgrading to XP-64, but you'll certainly see diminishing returns. On the other hand, if you're running 4 GB of RAM in your system and still running into memory limitations, 64-bit Windows has the potential to bring vast performance improvements.

Comments

  • JarredWalton - Saturday, April 30, 2005 - link

    KHysiek - part of the bonus of the Hybrid HDDs is that apparently Longhorn will be a lot more intelligent about memory management. (I'm keeping my fingers crossed.) XP is certainly better than NT4 or the 9x world, but it's still not perfect. Part of the problem is that RAM isn't always returned to the OS when it's deallocated.

    Case in point: working on one of these WinHEC articles, I opened about 40 images in Photoshop. With 1 GB of RAM, things got a little sluggish after about half the images were open, but it still worked. (I wasn't dealing with layers or anything like that.) After I finished editing/cleaning each image, I saved it and closed it.

    Once I was done, total memory used had dropped from around 2 GB max down to 600 MB. Oddly, Photoshop was showing only 60 MB of RAM in use. I exited PS and suddenly 400 MB of RAM freed up. Who was to blame: Adobe or MS? I don't know for sure. Either way, I hope MS can help eliminate such occurrences.
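    For illustration only - this is not how Photoshop or Windows is actually implemented - here's a minimal C sketch of the general "freed but not returned" effect: allocate a big block from the C runtime heap, free it, and watch the process in Task Manager. Depending on the heap manager, the memory often isn't handed back to the OS until the process exits.

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* Grab ~512 MB from the heap and touch every byte so the
           pages are actually committed, not just reserved. */
        size_t size = 512u * 1024 * 1024;
        char *block = malloc(size);
        if (!block)
            return 1;
        memset(block, 1, size);

        printf("Allocated %zu MB - check the process size now.\n", size / (1024 * 1024));
        getchar();

        /* free() returns the block to the heap, but the heap manager may
           keep the pages around for reuse instead of giving them back to
           the OS, so the reported memory use often stays high until exit. */
        free(block);

        printf("Freed - the process size may not drop much.\n");
        getchar();
        return 0;
    }
    ```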
  • KHysiek - Friday, April 29, 2005 - link

    PS. In this case, making hybrid hard drives with just 128MB of cache is laughable. Windows' massive memory swapping will quickly ruin the cache's effectiveness.
  • KHysiek - Friday, April 29, 2005 - link

    Windows memory management is one of the worst moments in the history of software development. It made all Windows software bad in terms of memory management and overall OS performance. You actually need at least 2x the memory needed by applications for things to work flawlessly. I see the saga continues.
  • CSMR - Thursday, April 28, 2005 - link

    A good article!

    Regarding the "historical" question:
    Strictly speaking, if you say "an historical" you should not pronounce the h, but many people use "an historical" and pronounce the h regardless.
  • Pete - Thursday, April 28, 2005 - link

    That's about how I reasoned it, Jarred. The fisherman waits with (a?)bated breath, while the fish dangles from baited hook. Poor bastard(s).

    'Course, when I went fishing as a kid, I usually ended up bothering the tree frogs more than the fish. :)
  • patrick0 - Wednesday, April 27, 2005 - link

    If Microsoft manages graphics memory, it will surely be a lot easier to read this memory. This can make it easier to use the GPU as a co-processor for non-graphics tasks. Image-processing comes to mind, but that doesn't sound like a non-graphics task, does it? Anyways, it is a CPU task. Neither ATI nor nVidia has made it easy so far to use the GPU as a co-processor. So I think MS managing this memory will be an improvement.
  • JarredWalton - Wednesday, April 27, 2005 - link

    Haven't you ever been fishing, Pete? There you sit, with a baited hook waiting for a fish to bite. It's a very tense, anxious time. That's what baited breath means.... Or something. LOL. I never really thought about what "bated breath" actually meant. Suspended breath? Yeah, sure... whatever. :)
  • Pete - Wednesday, April 27, 2005 - link

    Good read so far, Derek and Jarred. Just wanted to note one mistake at the bottom of page three: bated, not baited, breath.

    Unless, of course, they ordered their pie with anchovies....
  • Calin - Wednesday, April 27, 2005 - link

    "that Itanium is not dead and it's not going anywhere"

    I would say "that Itanium is not dead but it's not going anywhere"
  • JarredWalton - Tuesday, April 26, 2005 - link

    26 - Either way, we're still talking about a factor of 2. 1+ billion vs. 2+ billion DIMMs isn't really important in the grand scheme of things. :)

    23 - As far as the "long long" vs. "long" question goes, I'm not entirely sure what you're talking about. AMD initially made their default integer size - even in 64-bit mode - a 32-bit integer (I think). Very few applications really need 64-bit integers, plus fetching a 64-bit int from RAM requires twice as much bandwidth as a 32-bit int. That's part one of my thoughts.

    Part 2 is that MS may have done this for code compatibility. While in 99% of cases an application really wouldn't care if its 32-bit integers suddenly became 64-bit integers, that's still a potential problem. Making the user explicitly request 64-bit integers gets around this problem, and things stay backwards compatible.

    Anyway, none of that is official, and I'm only partly sure I understand what you're asking in the first place. If you've got some links for us to check out, we can look into it further.
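    To illustrate what I think the question is about - and again, this is just a rough sketch, nothing official - 64-bit Windows uses the LLP64 data model, where "long" stays 32 bits and you need "long long" (or a pointer-sized type) to get 64 bits, while typical 64-bit Linux/Unix compilers use LP64 and make "long" 64 bits. Something like this shows the difference:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* LLP64 (64-bit Windows):    int = 4, long = 4, long long = 8, pointer = 8.
           LP64  (64-bit Linux/Unix): int = 4, long = 8, long long = 8, pointer = 8.
           Code that assumed sizeof(long) == 4 keeps working under LLP64,
           which is the compatibility argument above. */
        printf("int       : %zu bytes\n", sizeof(int));
        printf("long      : %zu bytes\n", sizeof(long));
        printf("long long : %zu bytes\n", sizeof(long long));
        printf("void *    : %zu bytes\n", sizeof(void *));
        return 0;
    }
    ```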
