X-Men: The Last Stand CPU Overhead

The first benchmark we will see compares the CPU utilization of our X6800 when paired with each of our graphics cards. While we didn't test multiple variations of each card this time, we did test each type at its reference clock speed. Based on our initial HDCP roundup, we can say that overclocked versions of these NVIDIA cards will see lower CPU utilization. ATI hardware doesn't seem to benefit from higher clock speeds. We have also included CPU utilization for the X6800 without any help from the GPU for reference.

[Chart: X-Men III Playback (H.264) - CPU utilization by graphics card]


The leaders of the pack are the NVIDIA GeForce 8800 series cards. While the 7 Series hardware doesn't do as well, we can see that clock speed does affect video decode acceleration with these cards. It is unclear whether this will continue to be a factor with the 8 Series, as the results for the 8800 GTX and GTS don't show a difference.

ATI hardware is very consistent, but it just doesn't reduce CPU load as much as NVIDIA hardware does. This is different from what our MPEG-2 tests indicated. We do still see a marked improvement over our unassisted decode performance test, which is good news for ATI hardware owners.

The second test we ran explores how different CPUs perform with X-Men 3 decoding. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine a best and worst case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This will give us an indication of whether or not any frames have been dropped: if CPU utilization never hits 100%, we should always have smooth video. The analog of max CPU utilization in game testing is minimum framerate; both tell us the worst case scenario.
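
As an aside, here is a minimal sketch of how this kind of measurement could be scripted. It is not the tooling used for our tests; it assumes the third-party Python psutil package, and the sampling duration and interval are arbitrary choices.

    # Sketch: sample system-wide CPU utilization during playback and
    # report the average and the maximum. A peak at 100% suggests the
    # decoder may have been starved and frames may have been dropped.
    import psutil

    def measure_cpu(duration_s=60.0, interval_s=0.5):
        samples = []
        elapsed = 0.0
        while elapsed < duration_s:
            # cpu_percent() blocks for interval_s and returns the
            # system-wide utilization over that window
            samples.append(psutil.cpu_percent(interval=interval_s))
            elapsed += interval_s
        return sum(samples) / len(samples), max(samples)

    if __name__ == "__main__":
        avg, peak = measure_cpu()
        print("Average CPU: %.1f%%  Max CPU: %.1f%%" % (avg, peak))
        if peak >= 100.0:
            print("CPU hit 100%; frames may have been dropped")

Start the script, begin playback, and read off the two numbers when it finishes; the average maps to our first set of charts, the peak to the second.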

[Chart: X-Men III Playback (H.264) - maximum CPU utilization by processor]


While only the E6700 and X6800 are capable of decoding our H.264 movie without help, we can confirm that GPU decode acceleration will allow us to use a slower CPU in order to watch HD content on our PC. The X1950 XTX clearly doesn't help as much as the 8800 GTX, but both make a big difference.

Comments

  • Chucko - Monday, December 11, 2006 - link

    So is the output over HDCP DVI really not able to run at high resolution? I heard a few rumors that when HDCP was enabled the output resolution of these cards was less than 1080p. Is there any truth to this?
  • harijan - Monday, December 11, 2006 - link

    I guess that no integrated graphics solutions will be able to decode these without dropping frames?
  • Tujan - Monday, December 11, 2006 - link

    " Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames. "


    "Thus we chose the Core 2 Duo X6800 for our tests."

    "we haven't found a feature in PowerDVD or another utility that will allow us to count dropped frames"

    "The second test we ran explores different CPUs performance with X-Men 3 decoding. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine a best and worse case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This will give us an indication of whether or not any frames have been dropped. If CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate: both tell us the worst case scenario"

    "All video cards that have an HDMI connection on them should support HDCP, but the story is different with DVI. Only recently have manufacturers started including the encryption keys required for HDCP. Licensing these keys costs hardware makers money, and the inclusion of HDCP functionality hasn't been seen as a good investment until recently (as Blu-ray and HDDVD players are finally available for the PC). While NVIDIA and ATI are both saying that most (if not all) of the cards available based on products released within the last few months will include the required hardware support, the final decision is still in the hands of the graphics card maker. "

    "It is important to make it clear that HDCP graphics cards are only required to watch protected HD content over a digital connection. Until movie studios decide to enable the ICT (Image Constraint Token), HD movies will be watchable at full resolution over an analog connection. While analog video will work for many current users, it won't be a long term solution."

    This here:

    "Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user."<-lol


    So tell me: what kind of processor and graphics processor does a Blu-ray box have?

    Somehow I don't buy the analog as a premise for the test results... yet you are probably correct in the details.

    Seems the person on the computer made all the sacrifices here. Those are thousand-dollar rigs! With a Blu-ray player in there?

    I would be happy with 30 gigs of MPEG-2 in this matter. Maybe use a Fresnel lens or something for the display, or put an opaque projector up in some way.

    Desktop Resolution: 1920x1080 - 32-bit @ 60Hz

    Doesn't your lab have enough money for a high-resolution HDTV with DVI connectors?
    I'm not understanding all the fat in the fire here. Studios.

    __________________

    Saw your last comment there, Mr. Wilson. Noticed that some .avis have higher data rates than others. Why should I pay (in dollars and technology) for that kind of pony?



  • DerekWilson - Tuesday, December 12, 2006 - link

    You misquoted me --

    **Not even an** Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames.
  • Tujan - Thursday, December 14, 2006 - link

    Thanks for the reply. Good to see you are on top of things. Yeah, I simply didn't include the first portion of the sentence in quoting you there.
    The article has so many baselines in it. For now I guess that X-Men BR disc is the trait holder that set the performance criteria. Venturing to say, then, that all BR discs are not created alike; that is, other BR discs will not have the same characteristics as would have necessitated an X6800 CPU.
    That is something of a rig that costs 2 grand, including the BR player itself. Now a BR shelf box does not have an X6800 CPU, nor NVIDIA or ATI high-end graphics chips (speaking comparatively). And a monitor capable of doing the resolution will certainly do so using only that set of components.
    So I think that perhaps it would be question enough to say that BR 'penalizes' a computer using it as an accessory.
    I don't know if this is true. Still, if it was necessary to have BR accessories as you had listed them, BR would have had to have them listed in the BR patent itself. BR is a fully capable technology without the computer.
    So frankly the penalty here must be the DRM involved, since BR does that on-the-fly encryption. I'll just speculate.
    Look at the power consumption there! True, the Sony notebook I showed does only 1080i (check it if I'm wrong). But the graphics there will run on a freakin' battery! AND the notebook (with Sony licensing power no doubt) can be hooked up to the family television setup, maybe in HD.
    Let's face it though: I don't see the reason to be so timid about conducting comparisons to HDTV sets. A Dell 30" monitor, or something such as this? Run the comparisons with the HDMI out to them from these bottle-cork (the notebook I showed) technologies.

    In the same light, if an HDTV can display 1080p from an OTA signal, you've got to suppose that the 'bus' conducting it to the set may or may not have anything in common with what the computer is doing. True, you may laugh, as there may be no such thing as an OTA 1080p broadcast, and I don't know if there is.

    Yet with HDMI, HDCP, etc., it's a fancy gamble: I either spend my performance-per-dollar where it does me some good, or waste it considering which technologies can even participate, to no avail.

    If the whole industry is waiting on a freaking HDMI cable, WTF is wrong with you people?
    Once I get a light on the players you'll hear some more. Why the timidity to put the architecture on the block for testing!!! Computers and more. And I don't give a fart's rot for what the RIAA or MPAA says. They don't have that life to lead.

  • Renoir - Thursday, December 14, 2006 - link

    Tujan, I don't mean to appear rude, but I find your posts very hard to understand, and judging from one of the posts above I'm not the only one. Am I right in thinking English isn't your first language? If so, perhaps you could make your posts a bit simpler and more concise. Again, I don't mean to cause offence, as I think you may have some interesting points to make; I just can't quite understand them.
  • Tujan - Thursday, December 14, 2006 - link

    There are several topics covered in each paragraph. Put a [ ] at the end of each paragraph.

    The article said a lot more than just its conduction to BR. For the most part I am darn teed off because you just don't take and burn 700 watts to run a BR player. Now I don't know what the specs are for the on-the-shelf BR players. But as I explained, they do not have any components of the computer, and they sure don't take that much power to run. A screen simply does its stuff and that is it. The screen should be capable of doing exactly what the BR disc was mentioned to be in HD WITHOUT A GRAPHICS CARD!!!

    Now I don't know what the parameters for this discretion are. I know that Walmart has a $4 HDMI cable, and that HDCP graphics cards do not use HDMI.

    Your last post consisted of differences between running quad core with the BR. Well, you can do dual-server quad-core with the same graphics, but if the connection and testing cannot be done with HD screens, there is not much to be said about them.

    Especially the detail in the power utilization with the CPUs. So where's the discrepancy?

    I am happy with HDMI and battery power on a notebook attached to an HDTV screen. Just how happy remains to be assessed, by both the graphics card vendors and the authors of these not-so-confounding computer articles.

    P.S. If I could have edited out the last paragraph from the last post I would have done so. It does not lack substance, though, since there must be a penalty to run BR on the computer. BR does not need this type of technology to conduct a standard session, as could be seen if the testing were done. Then we could reason about the problem.


  • Tujan - Thursday, December 14, 2006 - link

    BTW, I cannot figure out why the tested BR player was not listed in the test setup: the brand name and type, and whether it was using a SATA connection (which it probably was). It should not be long before anyone can conduct data transfer tests between different computer-based players.

    I don't know if I can wait or not, since we are still dealing with certain media, the criteria specific to it, and the performance of them. So without that, a computer-capable BR player should be the least of considerations.

    Plextor's got a Blu-ray player. Yeah, I think so. The specs of the drive should have been announced.

    See, I just don't get the confoundedness of HDCP and the kissy feel-good about the light Hollywood puts out. For me, I have other considerations beyond them for the space and the conduction of the technology.

    Happy holidays.
  • johnsonx - Monday, December 11, 2006 - link

    I had a lot less trouble following the article than I did that post. What?
  • Tujan - Monday, December 11, 2006 - link

    Has anybody done a comparison of the bandwidth difference between HDMI and DVI? I've only seen a lowly X1600 (Sapphire ATI) with HDMI, although behind the scenes there are others. Had no idea HD-DVD or Blu-ray were such system-eating bandwidth hogs.

    Either way, it's a knuckle sandwich for the studios.

    Hope everybody runs out and grabs all the DVDs that can fit in a shopping cart.

    So I guess the hi-def resolutions are a studio spec too?
