It’s hard to say the last year has been anything but rough for Microsoft’s Windows division. Although we found Windows Vista favorable upon its launch last year, after watching it go through an unusually drawn-out development process, that sentiment hasn’t been shared by Windows users as a whole. Windows XP proved to be every bit the competition for Vista that Microsoft could ever fear it would be, at the same time as Vista was suffering through its own post-launch pains. It was a bad combination, making for a bad year for Microsoft’s efforts in pushing its first new desktop OS in 5 years. Microsoft was looking to make a solid case for why Vista is a worthwhile successor to XP in a market notorious for its resistance to change, and they failed to do this thanks to immature technology and an inability to get a consistent, convincing message out.

In the year since then you could argue that Microsoft’s marketing efforts still haven’t improved, but you would be hard-pressed to make the same argument about Vista itself. Since its release an unfortunately large number of bugs and quirks have been discovered in Vista, which has kept Microsoft busy patching them over the year while, to their chagrin, many consumers sit on the sidelines watching. To Microsoft’s credit, they’ve done a lot with Vista well before the first service pack: various patches, including the reliability & compatibility packs released over the last year, have solved many of the earliest complaints about Vista, and it already performs better and is less quirky across the board than when it launched. But it goes without saying that this hasn’t been enough to solve all of Vista’s problems, putting a lot of watchful eyes on Service Pack 1.

There is a saying among software development circles that businesses as a whole won’t touch a Microsoft product until the first service pack; they would prefer to wait until a product has been widely used and the biggest problems identified & solved. It’s cold but effective logic that also puts a great deal of pressure on Microsoft. No matter how good (or bad) a product is, half of their customers won’t bat an eye until there’s a service pack, making the first such pack just as important as the product launch itself in some ways. Complicating matters further with the Vista launch in particular is that Microsoft has tied Windows Server 2008 to the Vista kernel; getting Windows Server 2008 out the door means any and all Vista problems that would hinder server operation need to be eliminated. The result is that Service Pack 1 is a big deal for Microsoft: they need to show consumers that they can fix what still ails the OS, show businesses that it’s now ready for them to use, and show server administrators that the core technology is good enough that a reliable server can be built off of it.

Furthermore, with the progression of technology in the last year the timing couldn’t be more critical. The 4GB address space barrier of 32-bit x86 is finally beginning to rear its head for more average computer users; RAM prices have nosedived, with 8GB of RAM going for as little as $160, resulting in a wide and very real need for a 64-bit operating system (and XP x64 is a poor fit for consumers). Meanwhile PC OEMs are finally warming up to the Extensible Firmware Interface (EFI) and are ready to start building systems with it, meaning they too must move beyond XP. Even governments are finding they need to move to Vista as of late, as new encryption standards come into play which only Vista supports.
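To put the 4GB barrier in concrete terms, here’s a quick back-of-the-envelope calculation. This is only a sketch: the 1GB figure for device reservations below is a hypothetical example, and the exact amount of address space lost to hardware varies from system to system.

```python
# A 32-bit pointer can address 2^32 distinct bytes.
address_space = 2 ** 32
print(address_space)              # 4294967296 bytes
print(address_space / 2 ** 30)    # 4.0 GiB total addressable

# Part of that space is reserved for memory-mapped hardware
# (video memory, PCI devices, firmware), so with 4GB of RAM
# installed, a 32-bit OS typically exposes only ~3-3.5GB to software.
# Assuming a hypothetical 1GB of device reservations:
reserved = 1 * 2 ** 30
usable = (address_space - reserved) / 2 ** 30
print(usable)                     # 3.0 GiB visible to the OS
```

This is why users with 4GB installed see only around 3GB reported under 32-bit Windows, and why the only real escape is a 64-bit OS.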

The result of this is that many different groups have been watching SP1 far more intently than past service packs. With the final version of SP1 in hand, today we’ll be looking at what Microsoft is bringing to the table with Vista’s first service pack. With a combination of new features, bug fixes, and performance improvements, there’s a great deal to this service pack that we’ll be covering, so let’s get started.

What’s Fixed In SP1
  • 7Enigma - Thursday, February 28, 2008 - link

    It's probably to prevent all the tech calls saying, "Gee golly, I have 4 Giga-bites of RAM, but the screen only says I have 3! Where'd the other one go!".

    I don't see a problem with doing this, as if you care enough to know how much is actually addressable you probably already know the rest of the story anyway.
    Reply
  • sprockkets - Thursday, February 28, 2008 - link

    Any word on whether you can use a non-SP1 key with an SP1 disc?

    Most of us hate Vista for some reasons and like it for others; it just happens that the reasons we hate it outnumber the reasons we like it. For example: the way you have to pick how audio is sent to outputs (i.e. analog or digital, but not both at the same time as in previous versions, plus that darn CD-ROM audio input I need); UAC; the fact that after your initial install of Vista those 80 updates will cause it to spend time processing upon shutdown and then even more time processing upon restart (albeit it only happens once, and SP1 will do it for you, for now that is); and of course, everyone's favorite: why the hell did they have to change things like how to remove programs?

    Did SP1 restore the CD-ROM audio input? Probably not. I use it for my TV tuner card, which I can still buy, but which Microsoft apparently feels I no longer need to use.

    And for UEFI, perhaps now we can take control of features that should be in the BIOS, yeah, right (I'm looking at you Intel for not putting in S3 standby support on your D201GLY v1 and 2 boards!).
    Reply
  • 7Enigma - Thursday, February 28, 2008 - link

    Many people use nLite or other programs to slipstream a service pack (among other programs and hotfixes). You can then burn this as a new .iso cd and use that in place of your original disk. Short story:

    I have a 3-year-old system that I never reformatted (I know, the shame). When I built the system SP2 had just come out, so my XP disc already has SP2 included. But obviously in 3 years quite a few (hundreds?) of updates and hotfixes have come out, and it would be a real pain to reformat and then have to go through all the reboots and installs. Not to mention the chance that during this my system could become compromised by a previous exploit.

    So I started hearing about "slipstreaming" and spent some time reading about it. These programs basically create a custom installation disk with exactly what you want (and excluding what you don't want), so that you can reformat the drive, install the slipstreamed version of XP or Vista, and have only a handful of updates (I think I had 8 updates total, and required 2 reboots). This is a fantastic way to start from scratch without spending a lot of time.

    RyanVM (at least for XP) keeps an infrequent update pack that I used with my SP2 disk to enable the reformat I described above. Once SP3 for XP comes out you could slipstream it to your existing disk in 20min and be ready to reinstall. Really I'll never go back to the old-fashioned way of hosing a system.

    HTH (and I apologize if you already knew this)
    Reply
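For readers curious about the mechanics of the slipstreaming described above, Microsoft’s XP service pack installers support an `/integrate` switch that merges the pack into a copy of the installation files. The sketch below uses XP SP3 as the example; the folder paths are arbitrary, and you should substitute the exact filename of the service pack installer you actually download.

```shell
REM 1. Copy the contents of the original XP install CD to a working folder.
mkdir C:\XPCD
xcopy D:\ C:\XPCD /E /I

REM 2. Integrate the service pack into those installation files.
REM    (Example filename for XP SP3; use the installer you downloaded.)
WindowsXP-KB936929-SP3-x86-ENU.exe /integrate:C:\XPCD

REM 3. Make C:\XPCD into a bootable ISO and burn it. Tools like nLite
REM    can handle this step and also bundle hotfixes and drivers.
```

The result is an install disc that already contains the service pack, so a fresh install starts out needing only the post-SP updates.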
  • kdog03 - Wednesday, February 27, 2008 - link

    Well? Reply
  • Ryan Smith - Thursday, February 28, 2008 - link

    No. That's a fundamental limit of 32-bit operation; it can't be fixed. Reply
  • mcnabney - Wednesday, February 27, 2008 - link

    In regard to the network throttling to make room for audio: a gigabit connection will be trimmed back to 80Mbps, 8% of capacity. AnandTech writes:

    "We’ll fully admit the problem will only affect a small number of users (those with gigabit networks who need high network performance while using multimedia applications), but then we’re exactly that kind of user."

    Hmmm. Let's think. What might need more than 80Mbps while using audio? Hmmm. How about playing HD video off of a local media server?
    Reply
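The numbers in the comment above can be sanity-checked with a little arithmetic. This is only a sketch using the figures quoted in the comment; the 40Mbps HD stream is a hypothetical example bitrate, and Vista's actual throttle is expressed in packets per millisecond, so real-world throughput will vary.

```python
# Figures quoted above: a gigabit link throttled to ~80Mbps while audio plays.
link_capacity_mbps = 1000
throttled_mbps = 80
print(throttled_mbps / link_capacity_mbps)  # 0.08, i.e. 8% of capacity

# A hypothetical high-bitrate HD stream pulled from a media server:
hd_stream_mbps = 40
print(throttled_mbps - hd_stream_mbps)      # 40 Mbps of headroom left
```

In other words, a single heavy HD stream alone eats half the throttled budget, so adding any concurrent transfer (a file copy, a second stream) quickly runs into the cap.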
  • HaZaRd2K6 - Wednesday, February 27, 2008 - link

    I just recently bought an eSATA enclosure for backups (FAST!) and in the article you mention "The only significant loser here are file operations over high-latency high-bandwidth links". Would eSATA be included in this list? Any chance of some numbers to check eSATA performance? Reply
  • trparky - Sunday, March 02, 2008 - link

    The only difference between eSATA and SATA is the connector. Everything else is the same, even the data encoding used to transmit it across the wire. Reply
  • ellis - Wednesday, February 27, 2008 - link

    Does anybody know of any good benchmarks measuring multitasking performance on Vista (vs. XP)?

    And I'm not talking about the ability of multi-core-optimized applications to distribute their load across several cores, but the ability to run several demanding single-threaded applications in parallel, nicely distributed across the available cores.

    So does anyone know of any test that has focused on this aspect of utilizing multi-core CPUs?
    Perhaps comparing dual-core with quad-core, on XP and Vista?
    Reply
  • Griswold - Wednesday, February 27, 2008 - link

    I can't think of a single benchmark for this other than the "benchmark suite" they use here at AT sometimes (a bunch of apps doing certain things, creating a lot of load, and timing it). But I wouldn't really need that, as Vista has always felt more responsive with multiple heavy-load tasks than XP - and it even does so on this almost 1-year-old install.

    I would expect MS to be able to improve on the far-from-perfect task scheduler that comes with XP for their new flagship OS. As far as I'm concerned, they succeeded.
    Reply
