What about Performance?

If you can find a source file that Badaboom will accept and transcode without issue, the process is pretty quick.

The first test I ran was to convert Chapter 3 of Bad Boys on DVD to a 5Mbps VBR .mp4 file using the Xbox 360 profile. I upscaled the video to 1280 x 720.

The Core 2 Quad Q6600 completed the test in 245 seconds using the x264 codec, outputting a file that was similar in size and quality to what Badaboom managed (the file was a bit smaller, 109MB vs. 116MB, and the quality a bit better).
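File size, average bitrate, and clip length are tied together by size = bitrate × duration / 8. As a rough sanity check on those outputs (a sketch only: VBR variation and audio overhead are ignored, and the clip length is implied, not stated in the article):

```python
# Rough relationship between file size, average bitrate, and duration.
# The 5Mbps target and the 109MB/116MB outputs come from the test above;
# the implied clip length is an estimate (VBR and audio overhead ignored).

def implied_duration_s(size_mb: float, bitrate_mbps: float) -> float:
    """Seconds of video a file of size_mb holds at a given average bitrate."""
    return size_mb * 8 / bitrate_mbps

x264_clip = implied_duration_s(109, 5.0)      # x264's output
badaboom_clip = implied_duration_s(116, 5.0)  # Badaboom's output

print(f"x264: ~{x264_clip:.0f}s, Badaboom: ~{badaboom_clip:.0f}s of video at 5Mbps")
```

Both land at roughly three minutes of video, which is about what a single DVD chapter runs.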

[Chart: Bad Boys Chapter 3 to Xbox 360 profile (5Mbps) — transcode time in seconds, lower is better]

The entry-level and midrange 8/9 series GPUs couldn't do much better. The GeForce 9500 GT was actually slower, as were the 8500 GT and the 8600 GTS. The GeForce 8800 GT changed things though: at 103 seconds, it completed the encode in less than half the time. NVIDIA's fastest, the GeForce GTX 280, managed it in just over 60 seconds.

Next I tried outputting a lower resolution file for use on an iPhone, encoded at 1.5Mbps. Despite the default resolution being 480 x 320, the actual output resolution was 480 x 272:

[Chart: Chapter 3 to iPhone profile (1.5Mbps) — transcode time in seconds, lower is better]
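One plausible explanation for the 480 x 272 output, not confirmed in the article: the encoder preserves the film's widescreen aspect ratio and rounds the height to a multiple of 16, the macroblock dimension H.264 encoders work in. A sketch of that arithmetic:

```python
# Hypothetical reconstruction: derive output height from width and source
# aspect ratio, rounding to a multiple of 16 (the H.264 macroblock size).

def output_height(width: int, aspect: float, mb: int = 16) -> int:
    ideal = width / aspect
    return round(ideal / mb) * mb

print(output_height(480, 16 / 9))  # a 16:9 source -> 272, matching what we saw
print(output_height(480, 3 / 2))   # a 3:2 source would keep the full 320
```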

The output file was obviously smaller at 35MB, and the time to transcode went down significantly. Now our Q6600 took 36.5 seconds, and the 8800 GT's advantage was cut down: it ended up being only about 7 seconds faster (or about 28%). The GTX 280 still pulled ahead, completing the encode in just under 19 seconds.
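Those times work out to the following speedup factors (a quick sketch; the GPU times are approximations of "about 7 seconds faster" and "just under 19 seconds"):

```python
# Speedup = CPU time / GPU time, using the approximate times reported above.
q6600 = 36.5          # Core 2 Quad Q6600, seconds
gt8800 = 36.5 - 7.0   # GeForce 8800 GT: "about 7 seconds faster"
gtx280 = 19.0         # GeForce GTX 280: "just under 19 seconds"

for name, t in [("8800 GT", gt8800), ("GTX 280", gtx280)]:
    print(f"{name}: {q6600 / t:.2f}x faster than the Q6600")
```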

What this chart shows is that the load on the GPU varies, much as it does in 3D games, depending on what we're doing. Just as higher resolutions tend to be more GPU bound than CPU bound, it would seem that smaller, simpler content at lower transcoding bitrates doesn't show as big an advantage. The benefit of GPU accelerated transcoding is clear, but the performance gains will vary depending on the load.

For the final test I repeated the iPhone conversion but instead of only converting Chapter 3 of the DVD I selected the first ten chapters:

[Chart: first ten chapters to iPhone profile — transcode time in seconds, lower is better]

In the 5Mbps Xbox 360 test the GeForce GTX 280 ended up being around 3.5x faster than the Core 2 Quad Q9450; in the single-chapter iPhone test the advantage was reduced to 2.1x; and here we find that the gap grows slightly to 2.2x, still not as high as in the original test. It looks like 2 - 4x the speed of a reasonably fast quad-core CPU is what we can expect from Badaboom with NVIDIA's fastest GPU.

If you look at a more reasonably priced GPU, the 9800 GTX ends up being around 2 - 3x faster than the same quad-core CPU. The value of the entry-level GPUs isn't that great unless you've got a dual-core CPU; otherwise, quad-core chips will be able to encode faster and with better quality.

Next up I wanted to see how fast a CPU we needed to keep the GeForce GTX 280 fed in its most CPU-bound test, the single-chapter iPhone conversion:

[Chart: GeForce GTX 280 iPhone transcode, paired with various CPUs — time in seconds]

This graph should make NVIDIA pretty happy: you only really need a Core 2 Duo E4500 to keep the GTX 280 fed, resulting in performance better than any quad-core Intel CPU can offer. The upside to GPU-accelerated video transcoding is huge; we just need a better app to deliver it.

38 Comments


  • JarredWalton - Monday, August 18, 2008 - link

    While what you say is true to an extent, we're testing the value of a specific piece of hardware to perform certain work. Using your logic, gaming benchmarks are worthless as well, because it's not like you're going to play games all the time.

    We can look at the power question in a lot of ways. It appears an E4500 would do just about as well as the Q6600 used in testing, so for power should we compare Q6600 with IGP to E4500 with GTX 280 (or 9800)? That's certainly one valid comparison point, but if you go that route you quickly get to the stage where you have so many valid points of comparison that the project becomes unmanageable.

    Personally, I assume most users understand that this is a look at energy efficiency for a specific task, and not a holistic look at PC power use. What it tells us is that in heavily bottlenecked situations, GPU encoding is far more efficient than CPU encoding. That's useful information. Now we just need a good codec and application to back it up.
  • Inkjammer - Monday, August 18, 2008 - link

    Since this is still a beta version, I have to wonder how much could change by the final release? Were you able to talk to Elemental to address the issues with the beta and the disappointment in the "advanced" settings?

    The Pro edition seems disappointing, but if they ironed out the kinks in the end... I'd be interested in picking it up. Will there be a follow-up review for the release version?
  • Anand Lal Shimpi - Tuesday, August 19, 2008 - link

    I've kept Elemental aware of all of the issues I've had. I gave them some suggestions back after my first preview of the software. Every single problem I've encountered Elemental has added to their list of things to QA for; I'm hoping we'll see some significant improvements in the next major release.

    I will keep an open dialogue with Elemental and definitely look at any significant changes in the future.

    Take care,
    Anand
  • GotDiesel - Monday, August 18, 2008 - link

    Oh jeez.. are these guys retarded or what??? baseline only.. wake up guys.. everyone uses HIGH at least level 4.1..
    this is a typical example of windows software. all GUI and no go..

    what we need here is an open source version.. x264 is a perfect example of superior quality software surpassing closed source .. now if only you "professionals" could do the same..

  • michal1980 - Monday, August 18, 2008 - link

    given that most blu-ray content is already a variant of the efficient mp4 family (avc, vc-1, etc.),

    to compress it just for the sake of saving file space seems foolish.

    IMHO, in most cases, the file on the blu-ray has been encoded to give you the best possible picture in that file size. No automagic program is going to somehow make the file size smaller, and maintain the same quality.


    Now if converting to a smaller resolution, there's a point, but then data loss is a given.


    IMHO, this solution would be ideal for a gamer that wants to work with video, since in a lot of cases more cores don't make a difference in gaming... yet they do make sense for video compression. you could have the best of both worlds: buy a higher speed dual core, and use the money saved on a faster video card....

    if only the software worked.
  • gamerk2 - Monday, August 18, 2008 - link

    They said the same things with the .mpeg (and later, .mp3) formats: Why convert from .WAV and lose data and quality?
  • michal1980 - Monday, August 18, 2008 - link

    at least with a wav to mp3, there's a compression conversion.

    starting with a blu-ray just to run x264 on it

    is like taking an mp3 and converting it to mp3 again, just with more compression.

    you're stacking detail loss.
  • JarredWalton - Monday, August 18, 2008 - link

    True, but at 20-40 GB per BRD even a 1TB HDD runs out of space with only 20-50 movies. A 35 Mbps AVC stream may look "best", but outside of still captures I bet most users wouldn't notice a difference between 35 Mbps AVC and 20 Mbps AVC... or possibly even 10 to 15 Mbps.
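The space argument above is easy to put numbers on. A quick sketch, assuming a two-hour runtime (a round figure, not from the comment) and treating 1TB as 1000 GB:

```python
# How many movies fit on a 1TB drive at various average video bitrates.
# Two-hour runtime is an assumption; 1TB is treated as 1000 GB (decimal).

def movie_size_gb(bitrate_mbps: float, hours: float = 2.0) -> float:
    return bitrate_mbps * hours * 3600 / 8 / 1000  # Mbit -> GB

for mbps in (35, 20, 10):
    size = movie_size_gb(mbps)
    print(f"{mbps} Mbps: {size:.1f} GB/movie, ~{int(1000 // size)} per 1TB drive")
```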
  • michal1980 - Tuesday, August 19, 2008 - link

    if i'm buying a blu-ray, and paying for that 30-35Mbps. Why would I kill it?

    it just baffles me.
  • Lonyo - Monday, August 18, 2008 - link

    Since the 9600 GT isn't too far off the 8800 GT in gaming, but has a large difference in the number of SPs (IIRC), it would be interesting to see how the two compare, rather than looking at even lower end cards like the 9500 and 8600s.

    Any chance of some additional numbers (even only one benchmark) using the 9600?
