Badaboom: A Full Test of Elemental's GPU Accelerated H.264 Transcoder
by Anand Lal Shimpi on August 18, 2008 12:00 AM EST - Posted in GPUs
Energy Efficiency
As we've already seen, the Badaboom transcoding process taxes the CPU as well as the GPU, so it's not too surprising that power consumption with both the GPU and CPU working is actually greater than when the task runs on the CPU alone.
The factor you have to take into account is not only how much power the system consumes, but for how long. While the Core 2 Quad Q9450 system only drew about 160W, the entire encode task took 211 seconds. The same system with a GeForce GTX 280 performing the transcode finished the task in 61.9 seconds despite drawing 210W at the wall. Multiply the two out and you get total energy consumed in Joules:
Test System | Instantaneous Power Consumption | Energy Use over Benchmark Duration
Intel Core 2 Quad Q9450 | 160W | 33,760J
NVIDIA GeForce GTX 280 | 210W | 12,999J
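If you want to sanity check the table, the joule figures fall straight out of multiplying each system's wall power by its transcode time; here's a minimal sketch of that math (the labels are just for illustration):

```python
# Energy (joules) = average wall power (watts) x benchmark duration (seconds).
systems = {
    "Intel Core 2 Quad Q9450 (CPU encode)": (160, 211.0),
    "NVIDIA GeForce GTX 280 (GPU-assisted encode)": (210, 61.9),
}

for name, (watts, seconds) in systems.items():
    joules = watts * seconds
    print(f"{name}: {watts}W x {seconds:g}s = {joules:,.0f}J")
```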
While offloading the transcode task to the GeForce GTX 280 takes more power, it uses less than 40% of the energy since it can complete the transcode so much faster. GPU accelerated video transcode appears to be, as we first suspected, the more efficient way of doing things.
38 Comments
JarredWalton - Monday, August 18, 2008 - link
While what you say is true to an extent, we're testing the value of a specific piece of hardware to perform certain work. Using your logic, gaming benchmarks are worthless as well, because it's not like you're going to play games all the time.
We can look at the power question in a lot of ways. It appears an E4500 would do about as well as the Q6600 used in testing, so for power should we compare Q6600 with IGP to E4500 with GTX 280 (or 9800)? That's certainly one valid comparison point, but if you go that route you quickly get to the stage where you have so many valid points of comparison that the project becomes unmanageable.
Personally, I assume most users understand that this is a look at energy efficiency for a specific task, and not a holistic look at PC power use. What it tells us is that in heavily bottlenecked situations, GPU encoding is far more efficient than CPU encoding. That's useful information. Now we just need a good codec and application to back it up.
Inkjammer - Monday, August 18, 2008 - link
Since this is still a beta version, I have to wonder how much could change by the final release. Were you able to talk to Elemental about addressing the issues with the beta and the disappointment in the "advanced" settings?
The Pro edition seems disappointing, but if they iron out the kinks in the end... I'd be interested in picking it up. Will there be a follow-up review for the release version?
Anand Lal Shimpi - Tuesday, August 19, 2008 - link
I've kept Elemental aware of all of the issues I've had. I gave them some suggestions after my first preview of the software. Every single problem I've encountered, Elemental has added to their list of things to QA for; I'm hoping we'll see some significant improvements in the next major release.
I will keep an open dialogue with Elemental and definitely look at any significant changes in the future.
Take care,
Anand
GotDiesel - Monday, August 18, 2008 - link
Oh jeez.. are these guys retarded or what??? Baseline only.. wake up guys.. everyone uses High profile, at least Level 4.1.. this is a typical example of Windows software: all GUI and no go..
What we need here is an open source version.. x264 is a perfect example of superior quality software surpassing closed source.. now if only you "professionals" could do the same..
michal1980 - Monday, August 18, 2008 - link
Given that most Blu-ray content is already a variant of an efficient MP4-style codec (AVC, VC-1, x264, etc.), compressing it just for the sake of saving file space seems foolish.
IMHO, in most cases the file on the Blu-ray has been encoded to give you the best possible picture at that file size. No automagic program is going to somehow make the file smaller and maintain the same quality.
Now, if you're converting to a smaller resolution, there's a point, but then data loss is a given.
IMHO, this solution would be ideal for a gamer who wants to work with video, since in a lot of cases more cores don't make a difference in gaming yet do make sense for data compression. You could have the best of both worlds: buy a higher-speed dual core and put the money saved toward a faster video card....
if only the software worked.
gamerk2 - Monday, August 18, 2008 - link
They said the same things with the .mpeg (and later .mp3) formats: why convert from .WAV and lose data and quality?
michal1980 - Monday, August 18, 2008 - link
At least with a WAV to MP3, there's a compression conversion. Starting with a Blu-ray and just running x264 on it is like taking an MP3 and converting it to MP3 again, just with more compression. You're stacking detail loss.
JarredWalton - Monday, August 18, 2008 - link
True, but at 20-40 GB per BRD even a 1TB HDD runs out of space with only 20-50 movies. A 35 Mbps AVC stream may look "best", but outside of still captures I bet most users wouldn't notice a difference between 35 Mbps AVC and 20 Mbps AVC... or possibly even 10 to 15 Mbps.
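To put rough numbers on that, here's a quick back-of-the-envelope sketch; the two-hour runtime is just an assumption for illustration:

```python
# File size ~= bitrate (bits per second) x runtime (seconds) / 8 bits per byte.
runtime_s = 2 * 60 * 60  # assume a two-hour movie

for mbps in (35, 20, 15, 10):
    size_gb = mbps * 1_000_000 * runtime_s / 8 / 1_000_000_000
    print(f"{mbps} Mbps for two hours ~= {size_gb:.1f} GB")
```

At 35 Mbps a 1TB drive fills up after roughly 30 movies; drop to 20 Mbps and you're back up to about 50.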
michal1980 - Tuesday, August 19, 2008 - link
If I'm buying a Blu-ray, and paying for that 30-35 Mbps, why would I kill it? It just baffles me.
Lonyo - Monday, August 18, 2008 - link
Since the 9600GT isn't too far off the 8800GT in gaming, but has a large difference in the number of SPs (IIRC), it would be interesting to see how the two compare, rather than looking at even lower end cards like the 9500 and 8600 series.
Any chance of some additional numbers (even only one benchmark) using the 9600?