Frame Rate Conversion and You

There are two basic types of content stored on most DVDs: content that came from a 24 fps source and content that came from a 30 fps source. The more common of the two is 24 fps content, because virtually all movies are shot at 24 fps, and since the majority of DVDs out there are movies, this is the type we'll talk about first.

Although most motion pictures are recorded at (approximately) 24 frames per second, no consumer television can display that frame rate directly. To reach the screen sizes that we all know and love (at affordable prices), TVs are not as flexible as computer monitors - they are fixed frequency displays, so displaying content at arbitrary frame rates simply isn't possible. The DVD production houses know this, so they have to convert their 24 fps source into something that can be displayed on the majority of TVs out there.

In the North American market, the majority of TVs are interlaced NTSC TVs that display 60 fields per second. As we've just explained, a single interlaced field has half the resolution of a full frame in order to save bandwidth; by displaying 60 of those interlaced fields per second, the human eye is tricked into thinking that each frame is complete. But how can you convert 24 non-interlaced (aka progressive) film frames into 60 interlaced fields?

The first step is to convert the progressive film frames into interlaced fields, which is pretty simple: divide each frame into odd and even lines, then send all the odd lines to one field and all the even lines to another.
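The splitting step can be sketched in a few lines of code. This is a minimal illustration, assuming a frame is just a list of scanlines (the function name and representation are our own, not anything from a real decoder):

```python
def split_into_fields(frame):
    """Split a progressive frame into its two interlaced fields:
    the even-numbered lines and the odd-numbered lines."""
    top = frame[0::2]     # lines 0, 2, 4, ...
    bottom = frame[1::2]  # lines 1, 3, 5, ...
    return top, bottom

frame = ["line0", "line1", "line2", "line3"]
top, bottom = split_into_fields(frame)
# top    -> ["line0", "line2"]
# bottom -> ["line1", "line3"]
```

Each field carries only half the lines, which is exactly the bandwidth saving interlacing was designed for.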

Now we have 48 interlaced fields per second, but we are still short of that 60 fields per second target. We can't just add 12 more fields, as that would make our video look like we hit the fast forward button, so the only remaining option is to display some of the 48 fields longer. It turns out that if we perform what is known as a 3-2 pulldown, we get a rather clean conversion.

Here's how it works:

We take the first progressive frame and, instead of just splitting it into two interlaced fields, we split it into three, with the third being a copy of the first. So frame 1 becomes field1a, field1b and field1a again. Then, we take the next progressive frame and split it into the usual two interlaced fields, field2a and field2b, with no repetition. Repeating this 3-2 pattern over and over turns 24 frames into exactly 60 fields each second, which is how 24 fps film source is properly displayed on an interlaced NTSC TV.
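The 3-2 pattern described above can be sketched as follows. This is a toy model (fields are just labels like "1a" for frame 1's top field), not how a real telecine machine represents data:

```python
def telecine_32(frames):
    """Expand 24 fps progressive frames into a 60 field/s sequence
    via 3-2 pulldown. Each frame is a (top_field, bottom_field) pair."""
    fields = []
    for i, (top, bottom) in enumerate(frames):
        if i % 2 == 0:
            # every other frame contributes three fields: top, bottom, top again
            fields.extend([top, bottom, top])
        else:
            # the frame in between contributes the usual two fields
            fields.extend([top, bottom])
    return fields

frames = [("1a", "1b"), ("2a", "2b"), ("3a", "3b"), ("4a", "4b")]
print(telecine_32(frames))
# ['1a', '1b', '1a', '2a', '2b', '3a', '3b', '3a', '4a', '4b']
```

Note the arithmetic works out exactly: every pair of frames yields 3 + 2 = 5 fields, so 24 frames yield 60 fields per second.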

There are a few movies and some TV shows that are recorded at a different frame rate: 30 fps. The 30 fps to 60 fields per second conversion is a lot easier since there's no alternating pattern: we still create interlaced fields for the sake of NTSC compatibility, but each frame simply contributes its two fields, a 2-2 pulldown instead of the 3-2 pulldown used for film. One of the most popular 30 fps sources is Friends (note: it turns out that Friends is incorrectly flagged as a 30 fps source but is actually a 24 fps source), but other material is sometimes recorded at 30 fps, including some bonus features on DVDs. Because of this, while 24 fps sources are usually categorized as "film", 30 fps sources are usually called "video" (these names will have significance later on).
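For comparison, the 2-2 pulldown is trivial in the same toy model (again, labels standing in for real field data):

```python
def telecine_22(frames):
    """Expand 30 fps progressive frames into 60 fields/s: every frame
    contributes exactly two fields, with no alternating pattern."""
    fields = []
    for top, bottom in frames:
        fields.extend([top, bottom])
    return fields

print(telecine_22([("1a", "1b"), ("2a", "2b")]))
# ['1a', '1b', '2a', '2b']
```

30 frames times 2 fields each gives the 60 fields per second NTSC expects, with no repeated fields to track.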

Remember that the whole point for performing these conversions is that until recently, all televisions have been these low bandwidth interlaced displays. Recently however, televisions have become more advanced and one of the first major features to come their way was the ability to display non-interlaced video. A non-interlaced TV is useless without non-interlaced content, thus manufacturers produced affordable non-interlaced DVD players, otherwise known as progressive scan DVD players.

But having a progressive scan DVD player doesn't mean buying special progressive scan DVDs; the player instead does its best to reassemble the original progressive frames from the interlaced content stored on the DVD. Given the two conversion algorithms we just described, reconstructing the original progressive frames from the interlaced data on the DVD shouldn't be a difficult task. Once the DVD player knows whether it is dealing with 24 fps or 30 fps content, it simply needs to stitch together the appropriate fields and send them out as progressive frames. The DVD spec makes things even easier by allowing flags to be set per field that tell the DVD player how to recover the original progressive source frames. No problems, right? Wrong.
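For a clean 3-2 stream, the stitching (often called inverse telecine) is just the pulldown run in reverse: weave each top/bottom pair back into a frame and drop the duplicated field. A sketch in the same toy label model (real players work from the per-field flags and actual pixel data, not labels):

```python
def inverse_telecine_32(fields):
    """Recover progressive frames from a clean 3-2 pulldown field
    sequence: weave each top/bottom pair and skip the repeated
    third field in every 3-field group."""
    frames = []
    i = 0
    group = 3  # group sizes alternate 3, 2, 3, 2, ...
    while i + 1 < len(fields):
        frames.append((fields[i], fields[i + 1]))  # weave top + bottom
        i += group  # a 3-field group skips its duplicated field
        group = 2 if group == 3 else 3
    return frames

fields = ['1a', '1b', '1a', '2a', '2b', '3a', '3b', '3a', '4a', '4b']
print(inverse_telecine_32(fields))
# [('1a', '1b'), ('2a', '2b'), ('3a', '3b'), ('4a', '4b')]
```

The catch, as the next section explains, is knowing where the 3-2 groups actually start and whether the cadence is intact.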

It turns out that the flags on these DVDs aren't always reliable and can sometimes tell the DVD player to do the wrong thing, which can result in some pretty nasty image quality. Since DVD players can't just rely on the flags, algorithms were created to detect the type of source the player is dealing with. If the decoder chip detected a 3-2 pattern, it would switch into "film" mode; if it detected a 2-2 pattern, it would switch into "video" mode. The problem here is that due to a variety of factors, including errors introduced during editing, transitions between chapters on a disc, and just poorly encoded DVDs, these algorithms sometimes make the wrong call (e.g. treat 24 fps content as 30 fps content). These hiccups in the 3-2 pattern don't (usually) last long, but the results can be quite annoying. For example, if the DVD decoder chip tries to combine two fields that belong to different frames, the end result is a frame that obviously doesn't look right. While it may only happen in a few frames out of thousands on a single DVD, those few frames are sometimes enough to cause a ruffled brow while watching your multi-thousand-dollar home theater setup.
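A grossly simplified sketch of cadence detection follows. This is our own illustration of the idea, not how any particular decoder chip works; real hardware compares actual pixel data between fields (and copes with noise), whereas here identical labels stand in for "these two fields match":

```python
def detect_cadence(fields):
    """Guess the source type from a field sequence. In a 3-2 stream the
    third field of each group duplicates the first, so matching fields
    appear two positions apart; a 2-2 stream never repeats a field."""
    for i in range(len(fields) - 2):
        if fields[i] == fields[i + 2]:
            return "film"   # repeated field found: 3-2 pulldown
    return "video"          # no repeats: assume 2-2 pulldown

print(detect_cadence(['1a', '1b', '1a', '2a', '2b']))  # film
print(detect_cadence(['1a', '1b', '2a', '2b']))        # video
```

It's easy to see how editing cuts or chapter transitions that interrupt the repeat pattern can fool a detector like this into the wrong mode.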

So what does all of this have to do with NVIDIA's PureVideo? Although it's not in a set-top box, PureVideo is just as much a DVD decoder as the one sitting underneath your TV; it just lives in your computer. And to measure its effectiveness, we have to look at how it handles these trouble cases. Remember that your PC is inherently a "progressive scan" device (there's no interlacing here), so the quality of your videos depends directly on NVIDIA's algorithms.

62 Comments

  • mcveigh - Tuesday, December 21, 2004 - link

    nserra: where do you get your info on mpg4 from?
  • tfranzese - Tuesday, December 21, 2004 - link

    I guess giving you the hardware features now means you should be entitled to the software to take advantage of it as well, huh?

    Those of you who think you're entitled to nVidia's DVD decoder are living in la la land. So now they should also give you the games that take advantage of their technology/features too?
  • Koing - Tuesday, December 21, 2004 - link

    Just don't buy an Nvidia card next time if you feel ripped off.

    You paid 'expecting' a feature to work fully, not half-assed and 8 months late.

    Protest with your WALLETS.

    <-- never an early graphics adopter!

    Still with my Nvidia GF4Ti4200 :P

    Koing
  • nserra - Tuesday, December 21, 2004 - link

    http://www.ati.com/vortal/videoinnovations/flash/i...

    #29 VIRGE
    ATI has had MPEG4 encoding and decoding since 18/07/2002, so what do you want?

    S3 Omni is better at video than ATI or NVIDIA; the problem is the rest... even so, too bad there isn't an S3 in the tests.

    I don't know why NVIDIA doesn't charge a buck or two more for each chip/card it makes, so the feature gets paid for.

    Unless this encoder software they're charging for works on any card on the market; if that's the case, they're right to do so.
  • LoneWolf15 - Tuesday, December 21, 2004 - link

    Anand,

    I think you would do your readers a great service by retesting all of this on an AMD processor. Your Pentium 4 rig had Hyperthreading on, making this issue look far less serious than it is. Hyperthreading P4 chips report far lower CPU utilization numbers than even high-end Athlon 64 processors. So those of us with a GeForce 6800 who are running high-end AMD hardware are getting 70-100% CPU utilization in WMV9. It explains why a lot of us are having trouble trusting nVidia now, and in the future.
  • atssaym - Tuesday, December 21, 2004 - link

    NVDVD =/= NVIDIA DVD Decoder, they are TWO DIFFERENT THINGS

    link
    http://nvidia.com/object/decoder_faq.html#othernv

    Will this product work with NVDVD and ForceWare Multimedia?
    The new software will upgrade the decoders in ForceWare Multimedia 3.0 to access the new features in the NVIDIA DVD Decoder. The NVIDIA DVD Decoder will not upgrade NVDVD 2.0 as it was a separate application based upon an old architecture.

    Decoder Homepage
    http://nvidia.com/object/dvd_decoder.html
    NVDVD Homepage
    http://nvidia.com/page/nvdvd.html
  • pataykhan - Tuesday, December 21, 2004 - link

    Nice comparison of ATI and NVIDIA

    But still one thing is missing. I use Cyberlink PowerDVD for watching DVDs. My question is: given that I've got the NVIDIA or ATI DVD decoder, should I use the hardware accelerated decoder or should I stick with the Cyberlink decoder?

    Cyberlink has implemented some smart software deinterlacer and image quality enhancer (CLEV-2). So i would like to see the image quality differences between hardware/software decoders available.
  • bigpow - Tuesday, December 21, 2004 - link

    This sucks...

    ATI doesn't charge $20 for the claimed decoder/accelerator...

    And they have the ATI HD component dongle too!
  • segagenesis - Tuesday, December 21, 2004 - link

    Bah, I would be kind of pissed if I had a high end $500 card only to find out now it's worse than a sub $200 card at decoding video. A bit of shame on nVidia.
  • Guspaz - Monday, December 20, 2004 - link

    Oh, as a clarification of my previous comment (#41) I meant that I was under the impression that normally the DVD decoding software handled the de-interlacing, obviously PureVideo overrides that. But what I mean is on ATI cards, without any type of hardware de-interlacing, wouldn't the deinterlacing differ from decoder to decoder?
