CPU Performance and Threading in Vista

While we found memory requirements in Vista to be greater than in XP, CPU requirements aren't nearly as demanding. Parts of Vista will obviously benefit from faster CPUs, but if you have anything in the Core 2 or Athlon 64 X2 class you should be just fine. The argument for dual and quad core processors remains relatively unchanged with Vista. For multitaskers and many CPU-intensive workloads, a dual core CPU makes perfect sense. In our opinion, Vista doesn't make the case for dual or quad core any more compelling than XP did; the additional background tasks in Vista that weren't present in XP don't eat up much CPU time to begin with, so dedicating an entire core to them isn't necessary.

The new applications in Vista don't appear to be any more threaded than their XP counterparts, despite Vista being heralded as the beginning of a highly threaded future. Microsoft Word remains single threaded, although Excel can now take advantage of multiple cores when performing calculations. Windows Movie Maker appears to be optimized for two cores, while importing and attaching files in Photo Gallery is surprisingly single threaded. Dual core still makes a big difference in the overall experience, while quad core still isn't necessary but remains useful in a handful of situations.
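
If you want to spot-check how threaded an application is on your own system, a quick look at its thread count is a reasonable starting point. The sketch below is a minimal example, assuming the psutil package is installed; the process names listed (WINWORD.EXE, EXCEL.EXE, MOVIEMK.EXE) are illustrative placeholders, and this is not how our own observations were gathered.

```python
# Minimal sketch: list thread counts for a few applications of interest.
# Assumes psutil is installed; the process names below are examples only.
import psutil

TARGETS = {"WINWORD.EXE", "EXCEL.EXE", "MOVIEMK.EXE"}

for proc in psutil.process_iter(["name", "num_threads"]):
    if proc.info["name"] in TARGETS:
        print(f"{proc.info['name']}: {proc.info['num_threads']} threads")
```

Keep in mind that a raw thread count overstates real parallelism; many Windows applications keep idle worker threads around, so watching per-core CPU usage during an actual workload is a better indicator of whether extra cores are being put to use.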

With high definition video playback and encoding being two of the biggest drivers of CPU performance and core counts, Vista will be the OS under which new highly threaded applications really start to appear, but there's no reason to feel like four cores are necessary to run Vista today. An interesting bit of trivia: on a Core 2 Duo E6300, simply opening a new Explorer window in Vista eats up about 19% of your total CPU time while the window opens and animates, and turning Aero Glass off doesn't change the CPU usage either. Maybe four cores are necessary...
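
For anyone curious about reproducing that kind of spot check, the sketch below samples total CPU load for a couple of seconds while a new Explorer window is launched. It's a rough illustration only, assuming a Windows system with the psutil package installed; the folder path and sampling window are arbitrary choices, not part of our test methodology.

```python
# Rough sketch: sample overall CPU usage while a new Explorer window opens.
# Assumes Windows with psutil installed; path and timings are arbitrary.
import subprocess
import psutil

psutil.cpu_percent(interval=None)                  # prime the counter
subprocess.Popen(["explorer.exe", r"C:\Windows"])  # open a new Explorer window
samples = [psutil.cpu_percent(interval=0.1) for _ in range(20)]  # ~2 seconds
print(f"peak: {max(samples):.0f}%  average: {sum(samples)/len(samples):.0f}%")
```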

Vista Search for Fast Drives Only?

In our opinion, the two biggest reasons to migrate from XP to Vista are its Search and SuperFetch technologies, as both dramatically improve productivity. When Mac OS X introduced system-wide indexed search, we wondered whether disk performance would dramatically affect how responsive the search was. More specifically, would notebooks running OS X have significantly slower search times than desktops with faster 3.5" drives? Under OS X, while we noticed a difference between desktop and notebook drives, it wasn't large enough to cripple the feature on a slower drive. Thankfully, the same can be said about Windows Vista.

We noticed absolutely no difference in how long it took indexed search results to appear whether we used a 150GB 10,000RPM Western Digital Raptor, our 500GB WD test drive, or even a 5-year old 100GB drive - the results were always near-instantaneous. In fact, the amount of memory in the system had a much larger impact on search performance. The less system memory you have, the more disk I/O there will be due to swapping data in and out of the pagefile, and that I/O reduces search performance tremendously. We saw a much bigger search performance improvement going from 512MB to 1GB of memory than going from a 5-year old drive to a modern, high-end 10,000RPM Raptor.

While performing searches showed no difference between the various hard drives, there is a noticeable difference between drives in how long it takes to build the index itself, should you ever have to rebuild it. The chart below shows the amount of time it took to rebuild Vista's search index on the three drives we've been using in this review:

[Chart: Vista Search Index Performance]

Obviously, the more data there is to index, the greater the impact drive speed will have, but this should give you a reference point. Of course all of the normal benefits of moving to a faster drive still apply (applications launch faster, documents open more quickly, games load faster, etc.), but the point we're trying to make is that if you've already got a reasonably fast drive, don't feel like you have to replace it in order to keep up with Vista.

Comments

  • Ryan Smith - Saturday, February 3, 2007 - link

    That's up to Vista; it benchmarks a flash drive to make sure it's fast enough to be effectively used as a ReadyBoost cache. If ReadyBoost won't engage, then your drive isn't passing one (or more) of those tests.
  • mlambert890 - Friday, February 2, 2007 - link

    How is the PC hemorrhaging market share to the Mac? You've got to be kidding. Their market share in '06 rose from a pathetic 4.4 to a somewhat less pathetic 4.8. That's with ALL of their ridiculous hype, ALL of the asskissing from the press (including you guys now I guess?) and ALL of the armies of lunatic "Mac priests" that pollute every forum.

    It's hilarious that you would position this tiny growth in a share that declined steadily for 22 years until it hit rock bottom at like 3% in 2003 as a "hemorrhage". I have to wonder why you would characterize it that way. To be honest, it reeks of bias.
  • quanta - Friday, February 2, 2007 - link

    Think about it: ReadyBoost is treated by Vista as random access memory, used to store temporary contents that can change very often. Considering a typical USB flash drive only has a 100k write-cycle rating, you will need to replace it very soon. Worse yet, when the flash drive is gone, so will your critical data, at the worst possible time. With the hardware requirements of Vista, no amount of wear levelling is going to help.
  • Ryan Smith - Friday, February 2, 2007 - link

    No, this is wrong.

    ReadyBoost is a write-through data cache handled by the SuperFetch system; when enabled, SuperFetch uses it as another cache location optimized for small files. Based on the information we've seen, it's used primarily to store DLLs and other static and semi-static data that is needed an intermediate amount of the time (not important enough to spend valuable RAM on, but important enough to cache), with highly dynamic data sent to SuperFetch or the hard disk to avoid unnecessary wear. It will most certainly put wear on flash memory, but it seems unlikely that it will see 51TB of write-wear (the amount of data that needs to be written on a 512MB flash card to write over all bits 100k times; see the back-of-the-envelope sketch after the comments) until several years out.

    Of course, this is according to Microsoft. We don't really know exactly what is being stored on a ReadyBoost drive at any given moment; however, we have no reason to believe that Microsoft isn't making a real effort to minimize writes. We'll find out if/when flash memory starts wearing out.
  • mlambert890 - Friday, February 2, 2007 - link

    We'll see. Please remember that the 100k write-cycle figure is an average, that the flash is used as a small cache only, and that wear levelling of COURSE will help, before making assumptions.

    I've been beating up flash for YEARS and it's still going. There are moves to literally put OSes on flash-based hard drives. Hybrid drives already use the same concept as ReadyBoost (and are also supported in Vista).

    Using flash as a cache for magnetic media is not some untested concept that is going to lead to global data destruction.

    MS must have really destroyed their mindshare for so many armchair scientists to be fully willing to believe that they've figured out ALL the stuff that the "idiots" in Redmond don't realize. Give a little credit to the armies of PhDs that work on at least the basic concept for this crap. Maybe the implementation gets flawed by the realities of release cycles and budgets, but the BASIC CONCEPT is typically sound.
  • dugbug - Friday, February 2, 2007 - link

    UAC is like a firewall -- chatty at first (during installs and configurations), but once you have set up your system you will hardly ever hear from it. This should be obvious to the authors.

    And for that matter, the 6-operation file delete they discuss in the beginning was for deleting a file on a shared desktop (meaning the delete applied to all users). This is commonplace for enterprise and workplace users, and it should be no surprise that a file used by others would require permissions to delete. Though I'm glad the number of operations was greatly reduced.

    As to the comments about Vista being sluggish? Perhaps it is RAM. I have 2GB and Vista runs without any slowdown at all. Once you use it for a while you won't go back to XP.

    -d
  • LoneWolf15 - Friday, February 2, 2007 - link

    Untrue. Enthusiasts use lots of things like the Control Panel, MMC console, etc., and these all require UAC every time. Currently, I also have startup programs on my beta-test box that UAC blocks. This would be fine if UAC had an option to say "Yes, I know what this program is, let it run every time" and be done with it. But UAC doesn't have this option, so a user has to allow the program to run every single freaking time they boot their machine.

    I've tried changing the program properties so that it runs as Administrator; that hasn't solved the problem. I turned off UAC, which gives me a lovely annoying red-X shield in the system tray that every so often decides to warn me with a popup balloon that UAC is turned off and I could be in danger, so it's annoying even when turned off, and there's no easy way around it. Enthusiasts do a lot with their computers, and what they do is likely to increase their number of UAC prompts. Bottom line: Unlike OS X's methods, Vista's UAC happens far more often, and is far more annoying. And because it doesn't require a password (like OS X) and is just a click-through, I'll put money down that within a year, it will be worthless, as the average user will learn to click through it without reading a single bit of info.
  • funk3y - Sunday, February 4, 2007 - link

    The red cross can also be disabled, for sure; on my computer, which is a member of a domain, I receive no error message at all, even if UAC & co are disabled.
  • haplo602 - Friday, February 2, 2007 - link

    Really. What's the hype all about?

    SuperFetch - a trivial change to caching mechanisms. Anybody that required it would have already implemented it on *NIX systems. This is purely a desktop user feature to hide some processing overhead. There's nothing new about this that would have prevented implementation in Win2k already, except MS incompetence ...

    ReadyBoost - So the new standard is to have a permanently attached USB stick to get some performance?

    Compound TCP, Receive Window Auto-Tuning - I laughed like mad. So they finally made a proper implementation of something network related? And even then Vista is SLOWER. I'd suggest taking a stand-alone NIC that Vista and XP both have drivers for and testing it. That should rule out driver bugs.

    I/O improvements - so I make an app that issues a high priority, high capacity I/O operation (say 1GB) and you can go for lunch before the system is usable again. Seriously. I/O in small chunks makes perfect sense in multitasking environments since you have more entry points and can adjust the stream at the OS level and tune performance. That XP or Vista are stupid enough to do this is their fault. I guess MS will hype this as the next best thing in a future OS?

    All in all, none of the features hyped in the article deserve a Marketing Name(tm), because they are ordinary concepts. So we have a shiny new, bigger and slower OS that is hiding this behind hyped features. For example, memory compression could very much improve system performance without relying on external devices (ReadyBoost).
  • mlambert890 - Friday, February 2, 2007 - link

    Just admit your bias, man. There is NOTHING MS could do that would cause you to give them kudos. I spend my days arguing with guys like you for a living (unfortunately) and it's just exhausting.

    I could point you to REAMS of documentation of all the crap that has been rewritten and overhauled in Vista, but what's the point? You want to hate it, so hate it.

    It's sad that technology debates are STILL religion for so many after all this time :(
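
As a quick sanity check on the 51TB write-endurance figure discussed in the ReadyBoost comments above, here is a minimal back-of-the-envelope calculation. The 512MB capacity, the 100,000 write-cycle rating, and the assumption of perfect wear levelling are taken straight from that discussion rather than measured; real-world endurance will vary with the drive and workload.

```python
# Back-of-the-envelope flash endurance estimate for the ReadyBoost discussion.
# Assumptions: 512MB drive, 100,000 write cycles per cell, perfect wear
# levelling (writes spread evenly over every cell). Figures are illustrative.
capacity_gb = 0.512              # 512MB ReadyBoost drive
cycles_per_cell = 100_000        # quoted write-cycle rating
total_writes_tb = capacity_gb * cycles_per_cell / 1000
print(f"~{total_writes_tb:.1f}TB of writes before the cells wear out")  # ~51.2TB
```

At the low write rates a read-mostly cache sees, working through that much data would plausibly take years, which is the point made in the comment above.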
