Multitasking Scenario 1: DVD Shrink

If you've ever tried to back up a DVD, you know the process can take a long time. Just ripping the disc to your hard drive will eat up a good 20 minutes, and then there's the encoding, which can easily take another 20 to 45 minutes depending on the speed of your CPU. Once you start doing other tasks in the background, you can expect those times to grow even longer.

For this test, we used DVD Shrink, one of the simplest applications available for compressing and re-encoding a DVD to fit on a single 4.7GB disc. We ran DVD Decrypter on the "Star Wars Episode VI" DVD so that we had a local copy of the DVD on our test bed hard drive (in a future version of the test, we may try to include DVD Decrypter performance in our benchmark as well). All of the DVD Shrink settings were left at their defaults, including the option to run at low priority, a setting many users enable in order to be able to do other things while DVD Shrink is working.

We did the following:

1) Open Firefox and, using the ScrapBook plugin, load locally archived copies of 13 web pages; we kept the browser on the AT front page.
2) Open iTunes and start playing a playlist on repeat all.
3) Open Newsleecher.
4) Open DVD Shrink.
5) Log in to our news server and start downloading headers for our subscribed newsgroups.
6) Start backup of "Star Wars Episode VI - Return of the Jedi". All default settings, including low priority.
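
The figure we care about is simply how long step 6 takes from launch to completion while steps 1 through 5 keep the machine busy. For illustration only, a harness along these lines could time such a run on Windows; this is a minimal sketch, and "app.exe" is a hypothetical stand-in, since DVD Shrink itself is driven through its GUI rather than the command line.

    /* Minimal sketch: launch a workload and time it to completion.
       "app.exe" is a placeholder, not an actual DVD Shrink invocation. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        STARTUPINFOA si = { sizeof(si) };
        PROCESS_INFORMATION pi;
        char cmd[] = "app.exe";        /* hypothetical CPU-heavy task */

        DWORD start = GetTickCount(); /* millisecond tick count; fine for runs this long */

        if (!CreateProcessA(NULL, cmd, NULL, NULL, FALSE, 0,
                            NULL, NULL, &si, &pi)) {
            fprintf(stderr, "CreateProcess failed: %lu\n", GetLastError());
            return 1;
        }

        /* Block until the workload exits, just as we waited for DVD Shrink. */
        WaitForSingleObject(pi.hProcess, INFINITE);

        DWORD elapsed = GetTickCount() - start;
        printf("Completed in %lu.%03lu seconds\n", elapsed / 1000, elapsed % 1000);

        CloseHandle(pi.hProcess);
        CloseHandle(pi.hThread);
        return 0;
    }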

This test is a bit different from the one we ran in the Intel dual core articles, mainly in that we used more web pages with more varied content. In the first review, our stored web pages were very heavy on Flash. This time around, we had a much wider variety of web content open in Firefox while we conducted the test. There is still quite a bit of Flash, but the load is much more realistic now.

DVD Shrink was the application in focus. This matters because by default, Windows gives special scheduling priority to the application currently in the foreground. We waited until the DVD Shrink operation was complete and recorded its completion time. Below are the results:

DVD Shrink + Multitasking Environment

As we showed in the first set of dual core articles, tests like these are perfect examples of why dual core matters. The performance of the single core Athlon 64 FX-55 is dismal compared to any of the dual core offerings. You'll also note that the Athlon 64 X2 4400+ completes the DVD Shrink task in less than half the time of the higher clocked single core FX-55. The reason behind this is mostly the Windows scheduler. The problem in situations like these is that the scheduler won't always preempt one task in order to give another its share of the CPU's time. On a CPU that can only run one thread at a time, that means that certain tasks will take much longer to complete simply because the scheduler isn't giving them a chance to run. With a dual core or otherwise multi-threaded CPU, the scheduler can dispatch more threads to the CPU at once, and is thus less likely to end up in a situation where it has to preempt a CPU intensive task.
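
To put the scheduling discussion in concrete terms, here is a minimal Win32 sketch of what DVD Shrink's low priority option effectively requests. This is an illustration of the API involved, not DVD Shrink's actual code.

    /* Minimal sketch of the priority mechanics discussed above;
       an illustration only, not DVD Shrink's actual source. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* The "low priority" option amounts to dropping the process's
           priority class, so threads of normal-priority processes
           (including the foreground application that Windows favors)
           run first whenever they are ready. */
        if (!SetPriorityClass(GetCurrentProcess(), BELOW_NORMAL_PRIORITY_CLASS)) {
            fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());
            return 1;
        }

        /* On a single core CPU, this busy loop now only gets time slices
           when no higher-priority thread is runnable; on a dual core CPU,
           the scheduler can simply run it on the other core. */
        volatile unsigned long long acc = 0;
        unsigned long long i;
        for (i = 0; i < 1000000000ULL; ++i)
            acc += i;

        printf("acc = %llu\n", acc);
        return 0;
    }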

In this test, the Athlon 64 X2 4400+ does better than the Pentium D 840, but the Extreme Edition manages to offer slightly better performance. A faster X2 shouldn't have much of a problem remaining competitive, however.
144 Comments

  • liebremx - Thursday, April 21, 2005

    Anand, great reading as always.

    I have an observation:

    In the 'Development Performance - Compiling Firefox' section, you write
    "This particular test is only single threaded, ..."

    Why not launch a multithreaded build?

    "make -j3 -f client.mk build_all"
  • Jalf - Thursday, April 21, 2005

    Makes good sense for AMD to keep their (server) dualcore chips pricey. AMD has limited manufacturing capacity, and they have the best singlecore solution. In other words, they might as well keep the dualcore prices high, to a) make more money in cases where people are willing to fork over lots of money, and b) keep people who are on a budget interested in their singlecore offerings, at least until their new fab goes online.
  • GentleStream - Thursday, April 21, 2005

    I have some comments about the Firefox compile test. First, thanks a lot for including it. Now, on to the comments: you are using GNU make, and it supports parallel compiles. So, you should be able to replace the line:

    make -f client.mk build_all

    with the line:

    make -j 2 -f client.mk build_all

    to perform a parallel compile using 2 processors. The -j option specifies how many jobs to run in parallel. You can do parallel compiles on a single processor machine as well as on multi-processor or multi-core machines. Using -j 2 or -j 3 on a single processor machine will often give the best results because it allows CPU computation to overlap with I/O.

    You don't say whether you did a debug or optimized build. I would recommend doing both the debug and optimized builds and reporting the results of both. When doing parallel optimized compiles, you may want to make sure that you are not swapping, although for the server tests, it looks like you have plenty of memory - 4 GBytes. I did not see immediately how much memory you were using for the X2 tests. Anyway, I would recommend doing both debug and optimized compiles with -j n, where n is 1, 2, 3, and 4, or perhaps just 1, 2, and 4. Since compiles are essential to development work and also embarrassingly parallel, this should provide a really good comparison of the multitasking capabilities of these systems.

    Hope you can do this, or at least some of it, and thanks a lot for adding a really good compile test to your test suite.

    Dave
  • michaelpatrick33 - Thursday, April 21, 2005

    The server market is where AMD is headed to get large margins on their chips. With Supermicro joining the AMD camp (they must have seen the performance of the dualcore Opteron, blinked their eyes and said, "we're in"), Dell is left alone holding Intel-only product lines. Intel will not have a response on the server front until Q1 2006. That is troubling for Intel because it gives AMD six months of market buildup and gives Fab36 time to come online and increase volume tremendously. It should be interesting.

    Imagine a 4800+ on a 939 DFI board running at 2-2-2-8 1T timings versus the dualcore P4 Extreme Edition. Drooling just thinking about having either processor, but especially the AMD.
  • erwos - Thursday, April 21, 2005

    "AMD would probably have problems delivering a lower cost dual core in quantities."

    This is exactly it. Why should AMD let demand outstrip supply? Just jack up the price until you've got just enough demand to consume your supply.

    I mean, yes, I'd love an Athlon64 X2 5000+ with 1MB of cache for ~$250, but that's life. AMD stockholders should be pleased with this decision.

    There's also the impending move to socket M2 to consider... the Athlon64 X2 makes sense for people with very low-end A64's, but M2 is going to be the better upgrade path for FX and/or 3800+ users. I would be surprised to see any 939 Athlon64's past 5200+.
  • eetnoyer - Thursday, April 21, 2005

    While our desires as desktop users are for high volumes of X2s at low prices, we have to balance that with what AMD as a company needs to survive: money. AMD is currently capacity constrained with regard to dual-core CPUs, with only Fab30. They have entered into agreements with both IBM and Chartered for additional capacity (probably on the lower end chips), but that won't come online until late this year, just before production starts to ramp at Fab36.

    In the meantime, AMD has stated that their order of priority goes Server -> Mobile -> Desktop with the profitability motive in mind. For most users who will be heavily into the multi-tasking benefits of dual-core CPUs, spending $5xx for the low-end X2 vs $1000 for the PEE 840 will be a no-brainer. Seeing how that is a small minority of users, AMD can reasonably supply the demand for them while still maintaining the highest level of availability of dual-core Opterons at much better ASPs. Remember that AMD wants to capture as much market share in the server market as possible while Intel has no response.

    As a shareholder, I hope that the demand for dual-core Opteron is deafening based on the incredible price/performance ratio (thus limiting their ability to produce X2s in high quantity). As a middle-of-the-road desktop user, I'm quite content with my mildly OC'd A64 for the next year or two.
  • ksherman - Thursday, April 21, 2005

    w00t! I'll have to read it later tho...
  • MrHaze - Thursday, April 21, 2005

    Certainly impressive.

    I think it is important to remember that the "Athlon64 X2" was actually an Opteron running ECC RAM at 2T on a less-than-stable motherboard. I think it is best to think of this as a comparison of Intel's dual cores, AMD's single cores, and a hog-tied Athlon64 X2.
    Makes you wonder how an actual X2 with fast memory on a fast motherboard will perform.

    Regardless, I'm really excited about the upgrade potential, and I hope that AMD sticks with socket 939 for a long while.

    Mr.Haze
  • kirbalo - Thursday, April 21, 2005

    Great review Anand...Thanks for fixing your gaming bar charts...they were wacked before!

  • Tapout1511 - Thursday, April 21, 2005

    Sure would have been nice if they had included a single core A64 at 2.2GHz w/ 1MB cache (3500+ right?) to illustrate instances where the extra core was useful and when it wasn't.

    Oh well.
