Custom Refresh Rates with AMD 7750

One of the drawbacks of the GPUs built into Intel CPUs has been the lack of a 23.976 Hz refresh rate to match the source frame rate of many videos. Combined with unreliable support for open source software, this has often pushed users to opt for a discrete HTPC GPU. Ideally, a GPU should be capable of at least the following refresh rates:

  1. 23.976 Hz
  2. 24 Hz
  3. 25 Hz
  4. 29.97 Hz
  5. 30 Hz
  6. 50 Hz
  7. 59.94 Hz
  8. 60 Hz

Some users demand integral multiples of 23.976 / 24 Hz because they result in a smoother desktop experience while still keeping the source and display refresh rates matched, so no frames are repeated or dropped.
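
To make the 'integral multiple' requirement concrete, the short Python sketch below (an illustration, not part of our testing) treats the fractional rates as exact rationals - 23.976 Hz is really 24000/1001 Hz - and checks whether a given display mode can service a source frame rate without repeats or drops:

    from fractions import Fraction

    # Exact value for the fractional film rate: 23.976 Hz is really 24000/1001 Hz.
    FILM_23976 = Fraction(24000, 1001)

    def services_without_glitches(display_hz: Fraction, source_fps: Fraction) -> bool:
        # A display mode avoids repeats/drops only if its refresh rate is an exact
        # integral multiple (1x, 2x, 3x, ...) of the source frame rate.
        ratio = display_hz / source_fps
        return ratio.denominator == 1 and ratio >= 1

    # A 47.952 Hz desktop mode (2 x 24000/1001) still matches 23.976 fps material...
    print(services_without_glitches(2 * FILM_23976, FILM_23976))   # True
    # ...while the common 60 Hz desktop mode does not.
    print(services_without_glitches(Fraction(60), FILM_23976))     # False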

With the Sony Bravia KDL46EX720 (the new display in our HTPC testbed) being capable of PAL refresh rates (despite not exposing them in its EDID), we were able to test almost all of the above mentioned refresh rates. The only exception was 60 Hz, which definitely works well in the usual desktop mode, but for which we were unable to obtain a sample file with a matching frame rate.

Using madVR, we were able to judge how often a frame would get repeated or dropped in the renderer due to the mismatch between the refresh rate and the frame rate. With the 23.976 fps sample, we were looking at more than 15 minutes before a dropped frame, while for the 24 fps sample, we were looking at more than an hour before a frame repeat.
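
For reference, the interval madVR reports can be approximated from the measured display refresh rate and the source frame rate: the mismatch accumulates at |display - source| frames per second. A minimal Python sketch follows (the numbers in it are illustrative, not our measured values):

    def seconds_between_glitches(measured_display_hz: float, source_fps: float) -> float:
        # The mismatch accumulates at |display - source| frames per second, so one
        # frame must be repeated (display too fast) or dropped (display too slow)
        # roughly every 1 / |display - source| seconds.
        mismatch = abs(measured_display_hz - source_fps)
        return float("inf") if mismatch == 0 else 1.0 / mismatch

    # Hypothetical example: a mode measured at 23.9750 Hz playing 24000/1001 fps
    # material drops roughly one frame every ~16 minutes.
    print(seconds_between_glitches(23.9750, 24000 / 1001) / 60)   # ~16.3 (minutes)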

By default, modes not reported in the EDID were not available in the Catalyst Control Center (CCC) settings. However, it was quite straightforward to make the refresh rates not reported by the monitor visible and selectable in the Windows display settings. Some of the other 'native refresh rates' we tested are reproduced below:

AMD's refresh rate matching feature works pretty well most of the time, but it could be made better by giving the user more control over the various timing aspects (maybe in a hidden 'Advanced' menu similar to what NVIDIA does). This would be useful for consumers who don't want to put up with frame drops / repeats even once every 20 minutes or so.
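
For readers who like to experiment with custom timings themselves, the underlying relationship is simple: pixel clock = horizontal total x vertical total x vertical refresh. The Python sketch below uses the standard CEA-861 blanking totals for 1080p24 as an assumed starting point; the timings a driver actually programs may differ:

    # Assumed CEA-861 totals for 1080p24 (1920x1080 active within a 2750 x 1125 raster).
    H_TOTAL = 2750
    V_TOTAL = 1125

    def pixel_clock_mhz(refresh_hz: float) -> float:
        # pixel clock = horizontal total * vertical total * vertical refresh
        return H_TOTAL * V_TOTAL * refresh_hz / 1e6

    for hz in (24.0, 24000 / 1001):
        print(f"{hz:.3f} Hz -> {pixel_clock_mhz(hz):.4f} MHz")
    # 24.000 Hz -> 74.2500 MHz
    # 23.976 Hz -> 74.1758 MHz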

Comments

  • Oxford Guy - Wednesday, February 15, 2012 - link

    What I'd like to know is why the 7950 is shown in the Idle temp charts and then vanishes from the Load temp charts.
  • Oxford Guy - Wednesday, February 15, 2012 - link

    Sorry.. Power consumption charts.

    I'd like to see the 7950/7970 load power consumption. The idle consumption is less interesting and that's where they're shown.
  • dananski - Wednesday, February 15, 2012 - link

    Me too, the 6850 was worse than the 5850, so I'd expected it to be easily beaten by this generation's 770. Then again, the 6770 was just a 5770, so I suppose I should've learned that the mid-range is barely moving.
  • designerfx - Thursday, February 16, 2012 - link

If you think about its price today, it will probably be down substantially in a month - at which point it'd be quite competitive.
  • ce12373 - Wednesday, February 15, 2012 - link

    Hmmm. This AMD story sounds like a lot of other AMD stories (cough*cough "BULLDOZER"). Maybe no one has piledriven the point home to AMD yet. Oh well.
  • medi01 - Wednesday, February 15, 2012 - link

    Right, and it's been a long time since nVidia stopped producing overpriced "you can fry pancakes with these" GPUs and even held the performance crown? Oh, it's still producing them and AMD still holds the performance crown? What a pity.

    Oh, but AMD went the nVidia route with a "confuse the consumer more" naming scheme? How shameless; do they pay royalties for this to nVidia, the inventor of this rubbish?
  • aguilpa1 - Wednesday, February 15, 2012 - link

    AMD has been doing the name bait-and-switch just as long as Nvidia, but since you're such a fanboy, apparently you haven't noticed. It is obvious from your overheated GPU remarks that you are stuck on some ancient review of a past Nvidia product. And again, AMD has done the same, also in the past - 2900XT, anyone?
  • CeriseCogburn - Wednesday, March 21, 2012 - link

    The GTX 590 still holds the single-card crown.
    The very strange situation that has occurred is AMD holding the single-GPU crown with the 7970, finally passing the 580 after a year or closer to two - I don't remember how long.
    That single-GPU crown is already gone with the GTX 680 benches leaked a few days early.
    So AMD finally did hold a crown for once in a very long time, for a very short time - 2.5 months or so... with most of that time in very weak or absent stock on retail shelves.

  • Spunjji - Thursday, June 21, 2012 - link

    BLAH BLAH BLAH BLAH BLAH
  • Reticence - Monday, June 24, 2013 - link

    You realize you're kinda the laughing stock of AnandTech, right? You wait with bated breath for every post remotely including AMD somewhere in the article to give you the opportunity to suck off nvidia and intel. I literally think you might need psychiatric help with that raging superiority complex; then again, I've always thought there should be specially designed concentration camps for people like you who never grew out of acting like a blowhard high-school kid.

    Oh well, down to business.

    Two 7970's (and I mean two 7970's, NOT a 7990) perform better than one Titan. EVERY benchmark has shown it; you cannot deny this. And I already know what you're going to say: "BUT THAT'S 2 CARDS VS. 1, NOT FAIR ;[" But see, this is the main point that proves you're a major fucking moron. That.does.not.matter.at.all.

    The Titan is $1000.
    Two 7970's are $800.

    And while yes, it's impressive that a single card can hold its own against two, it doesn't matter - who is going to spend $200 more for LESS performance? I'm sure you'll also say "you're just an AMD fanboy." Wrong, I just don't like to waste my money. And I really, really don't like you.

    Nuff said.

    Oh and by the way, read it and weep, pussy.

    http://www.semiaccurate.com/forums/showpost.php?p=...
