Final Words

We've been hearing for quite some time now that Blu-ray and HDDVD movies could prove to be too much for today's desktop microprocessors; today we finally have the proof. X-Men: The Last Stand encoded using the H.264/MPEG-4 AVC High Profile at 1080p requires more processing power to decode than affordable dual core CPUs can handle. We are at a point where GPU decode acceleration is essentially required with all but the highest end processors in order to achieve an acceptable level of quality while watching HD content on the PC.
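The claim that H.264 1080p decode outruns affordable dual-core CPUs comes down to a per-frame time budget: at 24 fps the decoder has roughly 42 ms per frame, and if average decode time exceeds that, frames must be dropped. A minimal sketch of this arithmetic (the 45 ms figure is a hypothetical example, not a measured value from our testing):

```python
# Sketch: sustained decode time above the per-frame budget means dropped
# frames. The 45 ms input below is hypothetical, for illustration only.

def decode_headroom(avg_decode_ms_per_frame: float, fps: float = 24.0) -> float:
    """Return fractional CPU headroom; negative means frames will drop."""
    frame_budget_ms = 1000.0 / fps  # time available per frame (~41.7 ms at 24 fps)
    return (frame_budget_ms - avg_decode_ms_per_frame) / frame_budget_ms

# Example: a dual core averaging 45 ms per 1080p H.264 frame at 24 fps
print(f"headroom: {decode_headroom(45.0):.1%}")  # prints "headroom: -8.0%"
```

This is also why maximum (rather than average) CPU utilization is the number to watch: a player that averages 80% but spikes to 100% during complex scenes will still stutter.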

NVIDIA hardware performs better under our current set of drivers and the beta build of PowerDVD we are using, but exactly how well GeForce 7 Series hardware handles the decode process is more dependent on the type of card being used than it is with ATI. In general, higher performance NVIDIA cards do better at decoding our H.264 Blu-ray content. The 7950 GX2 doesn't perform on par with the rest of the high end NVIDIA cards because SLI doesn't help with video decode. With the exception of the X1600 Pro, each of the ATI cards we tested affected performance almost identically.

While there isn't much more to say about performance right now, we do need to consider that we are working with an early release of our player software, and ATI and NVIDIA are always improving their driver support for video decode acceleration. While we can't count on seeing improved performance in the future on current hardware, it is always nice to know that the possibility exists. We will continue to track performance with future player and driver updates.

But no matter what we see in the future, NVIDIA has done an excellent job with the 8800 series. G80 based cards will definitely lead the way in HD video decode performance, making it possible to stick with a cheaper CPU and still get a good experience. Of course, nothing about playing HD content on the PC is cheap right now, especially if we are talking about using an 8800 in conjunction with our Blu-ray drive.

For those who don't have the money to build a computer around Blu-ray or HDDVD, a standalone player is the other option. We tested our Samsung player with X-Men: The Last Stand to see if it could handle the demands of an H.264 movie (as any good CE player should). We were happy to see that the Samsung box didn't seem to have any problems playing our movie.

As for recommendations, based on our testing, we would not suggest anything less than an Intel Core 2 Duo E6600 for use in a system designed to play HD content. The E6400 may work well enough, but not even the 8800 GTX can guarantee zero dropped frames on the E6300. ATI owners will want to lean more towards an E6700 processor, but can get away with the E6600 in a pinch. Keep in mind, though, that X-Men: The Last Stand is only one of the first H.264 movies to come out. We may see content that is more difficult to decode in the future, and a faster processor is a good way to build in the headroom needed to ensure a quality HD experience on the PC.

Comments (86)

  • Chucko - Monday, December 11, 2006 - link

So is the output resolution via HDCP DVI really not able to output at high res? I heard a few rumors that when HDCP was enabled, the output resolution of these cards was less than 1080p. Is there any truth to this?
  • harijan - Monday, December 11, 2006 - link

    I guess that no integrated graphic solutions will be able to decode them without dropping frames?
  • Tujan - Monday, December 11, 2006 - link

    " Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames. "


    "Thus we chose the Core 2 Duo X6800 for our tests."

    "we haven't found a feature in PowerDVD or another utility that will allow us to count dropped frames"

    "The second test we ran explores different CPUs performance with X-Men 3 decoding. We used NVIDIA's 8800 GTX and ATI's X1950 XTX in order to determine a best and worse case scenario for each processor. The following data isn't based on average CPU utilization, but on maximum CPU utilization. This will give us an indication of whether or not any frames have been dropped. If CPU utilization never hits 100%, we should always have smooth video. The analog to max CPU utilization in game testing is minimum framerate: both tell us the worst case scenario"

    "All video cards that have an HDMI connection on them should support HDCP, but the story is different with DVI. Only recently have manufacturers started including the encryption keys required for HDCP. Licensing these keys costs hardware makers money, and the inclusion of HDCP functionality hasn't been seen as a good investment until recently (as Blu-ray and HDDVD players are finally available for the PC). While NVIDIA and ATI are both saying that most (if not all) of the cards available based on products released within the last few months will include the required hardware support, the final decision is still in the hands of the graphics card maker. "

    "It is important to make it clear that HDCP graphics cards are only required to watch protected HD content over a digital connection. Until movie studios decide to enable the ICT (Image Constraint Token), HD movies will be watchable at full resolution over an analog connection. While analog video will work for many current users, it won't be a long term solution."

    This here:

    "Fortunately, it seems that studios are making the sacrifices they need to make in order to bring a better experience to the end user."<-lol


    ..so tell me. What kind of processor and video graphics processor does a blue-ray 'box have ?

    Somehow I dont buy the analog,as a premise to the test results...yet still you are probably correct in its detail.

    Seems the person on the computer made all the sacrifices here.Those are thousand dollar rigs.! 'With a Blue-ray player there ?

    I would be happy with 30 gigs of MPEG2 in this matter.Maybe use a fresnel lense or something for the Display.Or put a opaque projector up in some way.

    Desktop Resolution:1920x1080 - 32-bit @ 60Hz

    Doesn't your lab got enough money for a High resolution HDTV wDVI connectors ?
    Im not understanding all the fat in the fire here.Studios.

    __________________

    Saw your last comment there Mr.Wilson. Noticed that some .avis have higher data rate than others. Why should 'I pay (in dollars,and technology) for that kind of pony ?



  • DerekWilson - Tuesday, December 12, 2006 - link

    you mis quoted me --

    **Not even an** Intel Core 2 Duo E6600 can play X-Men: The Last Stand without dropping frames.
  • Tujan - Thursday, December 14, 2006 - link

    Thanks for reply.Good to see you are on top of things. Yeah I simply didn't include the first portion of the sentence in quoting you there.
    The article has so many baselines in it. For now I guess that Xmen BR disk is the trait holder of wich did a performance consideration criteria then. Venturing to say then,that all BR are not created alike. That is other BR disks will not have the same characteristics as 'would have necesitated a 6800 CPU.
    That is something of a rig that costs 2 grand.To include as well the BR player itself. Now a BR shelf box,it does not have a X6800 CPU,nor Nvidia,or ATI high-end graphics chips.(speaking comparately). And a Monitor capable of doing the resolution will certainly do so using only that set of components.
    So I think that perhaps it would be question enough to say that BR 'penalizes'a computer using it as an accessory.
    I dont know if this is true. Still if it was necesary to have BR accesories as you had listed them,BR would have had to have had them listed in the BR patent itself.BR is a fully capable technology w/o the computer.
    So frankly the penalty here 'must be the DRM involved.Since BR does that on-the-fly incryption.'I'll just speculate.
    Look at the power consumption there ! True the Sony Notebook I showed does only 1080i(check it if Im wrong). But the graphics there will run on a freekin battery! AND the notebook (with Sony liscencing power no doubt)can be hooked up to the family television setup - maybe in HD.
    Lets face it though.I dont see the reason so timid to conduct comparisions to HDTV sets.? A Dell 30" monitor ,or something such as this ? Run the comparisons with the HDMI out to them from these bottle cork (the notebook I showed) technologies.

    As in the same light,if an HDTV can display 1080P with an OTA,you've got to suppose that the 'bus'in wich is being conductive to it may or may have nothing or something in common to what the computer is doing.True you may laugh as there is nothing such as an HD OTA 1080P,and I dont know if there is.

    Yet HDMI,HDCP etc thats a really fancy chance I either spend performance per dollar where It does me some good,or waste away to consider whom and of what technology can be participated to no avail.

    If the whole industry is waiting on a freaking HDMI cable.WTF is wrong with you people.
    I get a light on the players you'll hear some more.Why the timidity to put the archetecture on the block for testing !!!.Computers and more. And I dont give a farts rott for what RIAA,or MPAA sais. They dont have that life to lead.

  • Renoir - Thursday, December 14, 2006 - link

    Tujan, I don't mean to appear rude but I find your posts very hard to understand and from one of the posts above I'm not the only one. Am I right in thinking english isn't your first language? If so perhaps you could make your posts a bit simpler and more concise. Again I don't mean to cause offence as I think you may have some interesting points to make I just can't quite understand them.
  • Tujan - Thursday, December 14, 2006 - link

    There is several topics covered in each paragraph.Put a [ ] at end of each paragraph.

    The article said a lot more than just conduction to the BR. For the most part I am darn tee'd off because you just dont take and burn 700 watts to run a BR. Now I dont know what the specs.are for the on-the-shelf BR players. But as I explained they do not have any components of the computer,and they sure dont take up that much power to run using them. A screen simply does its stuff and that is it.The screen should be capable of doing exactly what the BR disk was mentioned to be in HD WITHOUT A GRAPHICS CARD !!!.

    Now I dont know what the parameters for this descretion is. I know that Walmart has a 4$ HDMI cable. And that HDCP Graphics cards do not use HDMI.

    Your last post consisted of differences between running Quad Core with the BR. Well you can do Dual-server Quad-Core with the same graphics but if the connection,and testing cannot be done with HD screens,there is not much to have been said about them.

    Especially the detail in the power utilization with the CPUs.So where's the descreprency ?

    I am happy with an HDMI and Battery power on a notebook attached to a HDTV screen. Just how happy remains to be accessed. By both the graphics card vendors,and the authors of these not so confounding computer articles.

    P.S.If I could have edited out the last paragraph from the last post I would have done so.It does not lack its substance though,since there must be a penalty to run BR on the computer.BR does not need this type of technology to conduct a standard session- as could be seen if the testing would be done.Then we could reason the problem.


  • Tujan - Thursday, December 14, 2006 - link

    BTW. I cannot figure out exactly the reason the tested BR player was not listed in the test setup.The brand name,and type. And wether it was using a Sata connection (wich it probably was). It should not be long before anyone should be able to conduct data transfer tests between different computer based players.

    I dont know if I can wait or not. Since still we are dealing with certain media.And criteria specific to it. As well as the performance of them. So without that,a computer capable BR player should be the least of considerations.

    Plextors got a blue ray player. Yeah think so. The specs. of the drive should have been announced.

    See I just dont get the confoundedness of the HDCP and kissy feel good about the light hollywood puts out.For me,i have other considerations beyond them for the space,and the conduction of the technology.

    happy holidays.
  • johnsonx - Monday, December 11, 2006 - link

    I had a lot less trouble following the article than I did that post. What?
  • Tujan - Monday, December 11, 2006 - link

    Has anybody done a comparison of the bandwidth difference between HDMI,and DVI ? Ive only seen a lowly 1600 (Saphire-ATI) with HDMI. Although behind the scenes there are others. Had no idea of HD-DVD,or Blue-Ray were such 'system eating bandwidth hogs.

    Either way.Its a knuckle sandwich for the studios.

    Hope everybody runs out an grabs all the DVDs that can fit in a shopping cart.

    So I guess the 'hi-def resolutions is a 'studio spec. too ?
