What Took So Long?

One of the questions that many of you probably have is: what took so long? The Opteron and later the Athlon 64 have been available for quite some time - roughly two years now. AMD has talked of 64-bit Windows for at least that long, and only now are we finally seeing the fruits of Microsoft's labor.

The conspiracy theorists are undoubtedly going to talk about an alliance between MS and Intel. It's difficult to say for certain whether that played a role, but remember that Xeons with 64-bit capabilities have been available for nearly a year now. Microsoft stands to benefit - in terms of increased sales of its OS and applications - from the release of XP-64, and we would like to think that they have simply been spending the extra time to make sure the release is as smooth as possible.

One of the other key factors in the delays is drivers. While MS has control over the source code and APIs for Windows, those are not the only critical parts of the OS. Drivers are an integral part of any OS, and porting them and optimizing them properly takes a lot of time and effort. While XP-64 is capable of running 32-bit applications through its WOW64 compatibility layer, drivers must be native 64-bit code.
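
To make that 32-bit compatibility path a little more concrete, here is a minimal sketch of our own (assuming a standard Win32 build environment; it is not code taken from XP-64 itself) showing how a 32-bit program can ask whether it is running under the WOW64 layer of a 64-bit Windows installation. The IsWow64Process call is resolved at run time because older 32-bit versions of Windows do not export it.

    #include <windows.h>
    #include <stdio.h>

    /* IsWow64Process is missing from older 32-bit Windows releases,
       so look it up dynamically instead of linking to it directly. */
    typedef BOOL (WINAPI *LPFN_ISWOW64PROCESS)(HANDLE, PBOOL);

    int main(void)
    {
        BOOL isWow64 = FALSE;
        LPFN_ISWOW64PROCESS fnIsWow64Process = (LPFN_ISWOW64PROCESS)
            GetProcAddress(GetModuleHandle(TEXT("kernel32")), "IsWow64Process");

        if (fnIsWow64Process &&
            fnIsWow64Process(GetCurrentProcess(), &isWow64) && isWow64)
            printf("32-bit process running under WOW64 on 64-bit Windows.\n");
        else
            printf("Native process, or WOW64 detection not available.\n");

        return 0;
    }

This sort of check only matters for user-mode software; kernel-mode drivers get no equivalent compatibility layer, which is exactly why every driver has to be rebuilt as native 64-bit code.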

Whatever the cause of the delays, we feel relatively confident in stating that there wasn't any major conspiracy to hurt AMD or any other company. Microsoft has seen quite a lot of groups shift to Linux simply to gain earlier support for x86-64, and that can't be something they're happy about. In the end, getting a new OS release done well is more important than getting it done fast, and hopefully the release of XP-64 will be one of the less painful upgrades for early adopters.

One last item that we want to quickly point out: many people have also assumed that the launch of XP-64 and the embrace of the x86-64 architecture by Intel somehow signify an end to Itanium and IA64. It was reiterated on several occasions that Itanium is not dead and that it isn't going away. XP-64 will also have a version for the IA64 platform, and Itanium will continue to compete primarily in the high-end server space against the likes of IBM's POWER5 and Sun's UltraSPARC processors. The chances of any home user running an Itanium system anytime soon are pretty remote, but the platform lives on.

Comments

  • JarredWalton - Saturday, April 30, 2005 - link

    KHysiek - part of the bonus of the Hybrid HDDs is that apparently Longhorn will be a lot more intelligent about memory management. (I'm keeping my fingers crossed.) XP is certainly better than NT4 or the 9x world, but it's still not perfect. Part of the problem is that RAM isn't always returned to the OS when it's deallocated.

    Case in point: Working on one of these WinHEC articles, I opened about 40 images in Photoshop. With 1GB of RAM, things got a little sluggish after about half the images were open, but it still worked. (I wasn't dealing with layers or anything like that.) After I finished editing/cleaning each image, I saved it and closed it.

    Once I was done, total memory used had dropped from around 2 GB max down to 600 MB. Oddly, Photoshop was showing only 60 MB of RAM in use. I exited PS and suddenly 400 MB of RAM freed up. Who was to blame: Adobe or MS? I don't know for sure. Either way, I hope MS can help eliminate such occurrences. (A quick sketch at the end of the comments illustrates how freed memory can linger in a process's working set.)
  • KHysiek - Friday, April 29, 2005 - link

    PS. In this case, making hybrid hard drives with just 128MB of cache is laughable. Windows' massive memory swapping will ruin cache effectiveness quickly.
  • KHysiek - Friday, April 29, 2005 - link

    Windows memory management is one of the worst moments in the history of software development. It made all Windows software bad in terms of memory management and overall OS performance. You actually need at least 2x the memory needed by your applications for everything to work flawlessly. I see that the saga continues.
  • CSMR - Thursday, April 28, 2005 - link

    A good article!

    Regarding the "historical" question:
    Strictly speaking, if you say "an historical" you should not pronounce the h, but many people use "an historical" and pronounce the h regardless.
  • Pete - Thursday, April 28, 2005 - link

    That's about how I reasoned it, Jarred. The fisherman waits with (a?)bated breath, while the fish dangles from a baited hook. Poor bastard(s).

    'Course, when I went fishing as a kid, I usually ended up bothering the tree frogs more than the fish. :)
  • patrick0 - Wednesday, April 27, 2005 - link

    If Microsoft manages graphics memory, it will surely be a lot easier to read this memory. This can make it easier to use the GPU as a co-processor for non-graphics tasks. Now I could manage image processing, but that doesn't sound like a non-graphics task, does it? Anyway, it is a CPU task. Neither ATI nor nVidia has made it easy so far to use the GPU as a co-processor. So I think MS managing this memory will be an improvement.
  • JarredWalton - Wednesday, April 27, 2005 - link

    Haven't you ever been fishing, Pete? There you sit, with a baited hook waiting for a fish to bite. It's a very tense, anxious time. That's what baited breath means.... Or something. LOL. I never really thought about what "bated breath" actually meant. Suspended breath? Yeah, sure... whatever. :)
  • Pete - Wednesday, April 27, 2005 - link

    Good read so far, Derek and Jarred. Just wanted to note one mistake at the bottom of page three: bated, not baited, breath.

    Unless, of course, they ordered their pie with anchovies....
  • Calin - Wednesday, April 27, 2005 - link

    "that Itanium is not dead and it's not going anywhere"

    I would say "that Itanium is not dead but it's not going anywhere"
  • JarredWalton - Tuesday, April 26, 2005 - link

    26 - Either way, we're still talking about a factor of 2. 1+ billion vs. 2+ billion DIMMs isn't really important in the grand scheme of things. :)

    23 - As far as the "long long" vs. "long" question goes, I'm not entirely sure what you're talking about. AMD initially made their default integer size - even in 64-bit mode - a 32-bit integer (I think). Very few applications really need 64-bit integers, plus fetching a 64-bit int from RAM requires twice as much bandwidth as a 32-bit int. That's part one of my thoughts.

    Part two is that MS may have done this for code compatibility. While in 99% of cases an application really wouldn't care if its 32-bit integers suddenly became 64-bit integers, that's still a potential problem. Making the programmer explicitly request 64-bit integers gets around it, and things stay backwards compatible. (The sketch after the comments shows the resulting type sizes in practice.)

    Anyway, none of that is official, and I'm only partly sure I understand what you're asking in the first place. If you've got some links for us to check out, we can look into it further.
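
Following up on the "long" vs. "long long" exchange above, here is a short illustration of our own (not anything official from Microsoft or AMD, and assuming a compiler that supports the C99 long long type) of the data-model difference in question. 64-bit Windows uses the LLP64 model, where int and long both stay 32 bits and only long long and pointers grow to 64 bits; most 64-bit Linux systems use LP64, where long itself becomes 64 bits. That choice is a big part of why existing code that assumes long is 32 bits keeps working unchanged when recompiled for Win64.

    #include <stdio.h>

    /* Prints the fundamental type sizes. A Win64 (LLP64) build prints
       4 / 4 / 8 / 8; a typical 64-bit Linux (LP64) build prints 4 / 8 / 8 / 8. */
    int main(void)
    {
        printf("sizeof(int)       = %u\n", (unsigned)sizeof(int));
        printf("sizeof(long)      = %u\n", (unsigned)sizeof(long));
        printf("sizeof(long long) = %u\n", (unsigned)sizeof(long long));
        printf("sizeof(void *)    = %u\n", (unsigned)sizeof(void *));
        return 0;
    }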
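
And to put a number on the earlier comment about RAM not always being returned to the OS when it is deallocated, the rough sketch below (ours, not Adobe's or Microsoft's; it uses the documented GetProcessMemoryInfo call and needs to be linked against psapi.lib) allocates and then frees a pile of small blocks while printing the process working set at each stage. Depending on the allocator, the "freed" figure often stays well above the starting point, because the heap holds on to the pages rather than handing them back to Windows - the same effect, on a much smaller scale, as the Photoshop example above.

    #include <windows.h>
    #include <psapi.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Print the current working set size, roughly what Task Manager reports. */
    static void report(const char *label)
    {
        PROCESS_MEMORY_COUNTERS pmc;
        if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
            printf("%-10s working set: %lu KB\n",
                   label, (unsigned long)(pmc.WorkingSetSize / 1024));
    }

    int main(void)
    {
        enum { COUNT = 100000, BLOCK = 1024 };   /* roughly 100 MB in 1 KB pieces */
        static char *blocks[COUNT];
        int i;

        report("start");

        for (i = 0; i < COUNT; i++) {
            blocks[i] = (char *)malloc(BLOCK);
            if (blocks[i])
                memset(blocks[i], 0xAA, BLOCK);  /* touch the block so it is committed */
        }
        report("allocated");

        for (i = 0; i < COUNT; i++)
            free(blocks[i]);
        report("freed");   /* often still far above "start" */

        return 0;
    }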
