Power Consumption

Intel has a full process node advantage over AMD when you compare Ivy Bridge and Trinity. Combine that with an architectural efficiency advantage, and the Core i3 simply delivers much better power consumption than Trinity. Idle power is very good, but under heavy CPU load Trinity consumes considerably more power. You're basically looking at quad-core Ivy Bridge levels of power usage under load, but with performance closer to that of a dual-core Ivy Bridge. AMD needs significant design-level efficiency improvements to get power consumption under control. Compared to Llano, Trinity does appear to be a bit more efficient, so there is real progress here.
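
To make the efficiency argument concrete, here is a minimal sketch of the kind of performance-per-watt arithmetic being described. The chip names and all figures below are purely hypothetical placeholders for illustration, not measured results from this review.

```python
# Minimal sketch of a performance-per-watt comparison.
# All figures are hypothetical placeholders, NOT measured results.

def perf_per_watt(benchmark_score, load_power_w, idle_power_w):
    """Score per watt of power actually added by the CPU load
    (load minus idle approximates the package's contribution)."""
    return benchmark_score / (load_power_w - idle_power_w)

# Hypothetical chips: (x264 2nd-pass fps, system load power W, system idle power W)
chips = {
    "Dual-core Ivy Bridge (hypothetical)": (20.0, 80.0, 40.0),
    "Trinity (hypothetical)":              (20.0, 110.0, 42.0),
}

for name, (score, load_w, idle_w) in chips.items():
    print(f"{name}: {perf_per_watt(score, load_w, idle_w):.2f} fps/W")
```

With similar benchmark scores but a higher load delta, the second chip's fps/W comes out lower, which is exactly the "quad-core power, dual-core performance" complaint in prose form.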

[Chart: Power Consumption - Idle]

[Chart: Power Consumption - Load (x264 HD 5.0.1 2nd Pass)]

On the processor graphics side, of course, the story is much closer: Trinity is a bit more power hungry than Ivy Bridge, but not nearly by the same margin.

178 Comments

  • Spunjji - Tuesday, October 2, 2012 - link

    That seems to be a big if rather than a matter of when, though, given the patchy support that's been forthcoming for QuickSync so far! So possibly a valid avenue of investigation anyway. :)
  • eBombzor - Tuesday, October 2, 2012 - link

    Am I missing something in the benchmarks? Tom's did a CPU comparison with the 2100 and the 8120 (which isn't a whole lot different from the 8150), and the 8120 is near the Phenom CPU gaming-wise.
    http://www.tomshardware.com/reviews/gaming-fx-pent...
    Something is not right here; the 2100 dominated the 8120 in Tom's benchies, so the 3220 should be better.
  • Ryan Smith - Tuesday, October 2, 2012 - link

    Just thumbing through Tom's article, it looks like they're using 1920x1080 with high quality settings (GPU-limited settings) while we're mostly using 1024 and 1680 in order to ensure we're CPU-limited.
  • Rezurecta - Wednesday, October 3, 2012 - link

    Who cares about CPU limiting? You're not going to play a game @ 1024. 1680 might be valid, but why not show benchmarks at 1920? It just doesn't make sense to show a benchmark that isn't at a major demographic point.

    It could be a very misleading benchmark for a substantial amount of readers.
  • CeriseCogburn - Tuesday, October 9, 2012 - link

    But it makes amd look better, so it's awesome, and irresistible.
  • Rand - Tuesday, October 2, 2012 - link

    Why was the overclocking test done on Windows 8 (Image shows Win8), while the performance testing was done on Windows 7 (Test setup lists Win7)?
  • nofumble62 - Tuesday, October 2, 2012 - link

    Trinity's performance didn't beat the i3, let alone the i5.

    AMD's statement "i5 performance at an i3 price" is a total lie.
  • ac2 - Tuesday, October 2, 2012 - link

    It's only true for heavily threaded integer work and AES...

    But yeah, disappointing...
  • Taft12 - Tuesday, October 2, 2012 - link

    ... and gaming on the integrated GPU
  • MySchizoBuddy - Tuesday, October 2, 2012 - link

    Legit Reviews states that AMD advised them to disable turbo mode, else it will throttle the overclock. They were able to overclock it to 4.6GHz with full stability using a larger cooler.

    http://www.legitreviews.com/article/2047/18/
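
On the CPU-limited vs. GPU-limited question raised in the comments above, a minimal sketch of the usual reasoning (not code from the review, and with purely hypothetical per-frame timings) is to model frame time as the slower of the CPU's and GPU's work per frame; shrinking the resolution shrinks the GPU term, which is what exposes differences between CPUs.

```python
# Minimal model of CPU- vs GPU-limited frame rates.
# All timings are hypothetical placeholders, not benchmark data.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate is limited by whichever side takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical CPUs: milliseconds of CPU work per frame.
cpus = {"faster CPU": 8.0, "slower CPU": 14.0}

# Hypothetical GPU cost per frame, which grows with resolution and settings.
gpu_ms = {"1024 low": 6.0, "1680 medium": 12.0, "1920 high": 25.0}

for res, g in gpu_ms.items():
    results = ", ".join(f"{name}: {fps(c, g):.0f} fps" for name, c in cpus.items())
    print(f"{res} -> {results}")
# At the 1920 high setting both CPUs land at the same GPU-bound frame rate;
# at the 1024 low setting the gap between the two CPUs becomes visible.
```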
