Ivy Bridge Architecture Recap

At IDF, Intel disclosed much of Ivy Bridge's CPU architecture; below is a quick summary:

- 4-wide front end with µOp cache from Sandy Bridge
- OoO execution engine from Sandy Bridge
- Data structures that were previously statically partitioned between threads can now be dynamically shared (e.g. the DSB queue), which improves single-threaded performance
- FP/integer divider delivers 2x throughput compared to Sandy Bridge
- MOV instructions no longer occupy an execution port, offering the potential for improved ILP in MOV-heavy code
- Power gated DDR3 interface
- DDR3L support
- Max supported DDR3 frequency is now 2800MHz (up from 2133MHz), and memory speed can now be adjusted in 200MHz increments
- Lower system agent voltage options, lower voltages at intermediate turbo frequencies, power aware interrupt routing
- Power efficiency improvements related to 22nm
- Configurable TDP
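The memory-clock bullet above implies a simple frequency ladder. As a rough sketch (assuming the 200MHz steps run from the common 1600MHz point up to the new 2800MHz ceiling; the article states only the ceiling and the step size, so the lower bound here is a guess):

```python
# Hypothetical DDR3 speed ladder for Ivy Bridge's 200MHz memory steps.
# The 1600MHz starting point is an assumption, not stated in the article.
ddr3_speeds = list(range(1600, 2801, 200))
print(ddr3_speeds)  # [1600, 1800, 2000, 2200, 2400, 2600, 2800]
```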

I've highlighted the three big items from a CPU performance standpoint. Many of the gains you'll see will come from those areas, coupled with more aggressive turbo frequencies.

On the GPU, the improvements are more significant. Some of the major changes are below:

- DirectX 11 Support
- More execution units (16 vs 12) for GT2 graphics (Intel HD 4000)
- 2x MADs per clock
- EUs can now co-issue more operations
- GPU specific on-die L3 cache
- Faster QuickSync performance
- Lower power consumption due to 22nm
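The EU-count and MAD-rate bullets together imply a better-than-2x jump in peak shader throughput. A back-of-the-envelope sketch (the SIMD-4 EU width and the clock speeds are assumptions based on commonly cited specs for HD 3000 and HD 4000, not figures from this article):

```python
def peak_gflops(eus, simd_width, flops_per_lane_per_clock, clock_mhz):
    """Theoretical peak = EUs x SIMD lanes x FLOPS per lane per clock x clock."""
    return eus * simd_width * flops_per_lane_per_clock * clock_mhz * 1e6 / 1e9

# Sandy Bridge GT2 (HD 3000): 12 EUs, one MAD (2 FLOPS) per lane per clock
hd3000 = peak_gflops(eus=12, simd_width=4, flops_per_lane_per_clock=2, clock_mhz=1350)
# Ivy Bridge GT2 (HD 4000): 16 EUs, 2x MAD rate (4 FLOPS per lane per clock)
hd4000 = peak_gflops(eus=16, simd_width=4, flops_per_lane_per_clock=4, clock_mhz=1150)

print(round(hd3000, 1), round(hd4000, 1))  # 129.6 294.4
```

Even at a lower peak clock, doubling the per-lane MAD rate and adding four EUs more than doubles theoretical throughput, which is consistent with the sizable GPU gains discussed in the comments below.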


195 Comments


  • jjj - Tuesday, March 06, 2012 - link

    CPU perf pretty much as expected; GPU perf somewhat disappointing. I thought they'd at least aim to match Llano, but I guess it is OK for 1MP laptop screens if the mobile parts perform close enough (and there are a couple of big ifs when it comes to image quality and drivers).
    Any opinions yet about QuickSync encoding quality?
  • wifiwolf - Tuesday, March 06, 2012 - link

    And we should note that this is comparing the 2600 with the 3700, which have different CPUs too. Other benchmarks showed significantly better results on the 3700 than the 2600.

    So Anand, how do you know that difference is attributable to a GPU improvement and not to the CPU?
  • IntelUser2000 - Tuesday, March 06, 2012 - link

    You are not being serious, are you? The CPU gains 10% in CPU-sensitive benchmarks while the GPU gained 40-60%. Even taking out that 10%, it's still 30-50%, and even that subtraction is too generous, since games aren't as sensitive to CPU changes as applications are.
  • wifiwolf - Wednesday, March 07, 2012 - link

    Look at the Crysis or Metro benchmarks and tell me where you find that improvement, at least beyond the CPU difference.
  • mosu - Wednesday, March 07, 2012 - link

    I've tried it on some HD clips at a local TV station, and on a big screen it really sucked. It's way behind AMD. We used an HP EliteBook 8460P laptop.
  • Articuno - Tuesday, March 06, 2012 - link

    At least AMD's products are HD capable.
  • dr/owned - Thursday, March 08, 2012 - link

    My 5 year old laptop with a shared-RAM GPU is "HD capable". GTFO noob.
  • Articuno - Tuesday, March 06, 2012 - link

    Billions in R&D, double the MSRP, half the power, and yet it still can't play Crysis better than Llano, which will be replaced by Trinity in a few weeks. What a crying shame.
  • travbrad - Tuesday, March 20, 2012 - link

    Not playing Crysis sounds like a good thing to me.
  • tipoo - Tuesday, March 06, 2012 - link

    Source or GTFO. Apple got the stock HD 3000; why would this be different?
