Anisotropic Filtering Quality

At IDF last year Intel promised an improvement in its anisotropic filtering quality compared to Sandy Bridge. Personally I didn't believe SNB's GPU was fast enough to warrant turning on AF in most modern titles, but as Intel's GPU performance improves it must take image quality more seriously.

I wouldn't put a ton of faith in these early results as things can change, but AF quality does appear to be much better than Sandy Bridge:

The peculiar radial lines that were present in SNB's algorithm remain here, although they are more muted. Again it's too early to tell if we're looking at final image quality or something that will improve over time. If we are to judge based on this result alone, I'd say it mirrors what we saw in our performance investigation: Ivy is a step towards AMD in the GPU department, but not a step ahead.

DirectX 11 Compute Performance

As Ivy Bridge is Intel's first DirectX 11 GPU architecture, we're actually able to run some DX11 workloads on it without having them fall back to DX10. We'll do a much more significant investigation into GPU compute performance in our full Ivy Bridge review, but as a teaser we've got our standard DirectX 11 Compute Shader Fluid Simulation test from the DX11 SDK:

DirectX 11 Compute Shader Fluid Simulation - Nearest Neighbor (benchmark chart)

Ivy Bridge does extremely well here, likely due in no small part to its excellent last level cache. The Fluid Simulation test stresses shared memory performance, and Ivy delivers over 3.2x the performance of Sandy Bridge, along with a slight advantage over Llano.
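
The shared memory point is worth unpacking. The SDK sample itself is an HLSL compute shader that stages neighboring particle data in groupshared memory; the CUDA-style kernel below is only a rough, hypothetical analogue of that tiled nearest-neighbor pass (the kernel name, block size, and smoothing function are our own placeholders, not the SDK's code), but it illustrates why fast on-chip storage, backed on Ivy Bridge by the shared last level cache, dominates this kind of workload: each position fetched from memory gets reused by every thread in a group.

```cuda
// Hypothetical sketch of a tiled nearest-neighbor density pass as used by
// SPH-style fluid simulations. The real test is an HLSL compute shader
// (groupshared memory); __shared__ is the CUDA equivalent. Each block
// stages a tile of particle positions once and every thread reuses it,
// which is why on-chip memory (and the cache behind it) matters so much.
#include <cstdio>
#include <cuda_runtime.h>

#define BLOCK 256  // threads per block == particles staged per tile

__global__ void densityPass(const float4* pos, float* density, int n, float h2)
{
    __shared__ float4 tile[BLOCK];                 // staged positions for this tile

    int i = blockIdx.x * blockDim.x + threadIdx.x;
    float4 pi = (i < n) ? pos[i] : make_float4(0.f, 0.f, 0.f, 0.f);
    float rho = 0.0f;

    for (int base = 0; base < n; base += BLOCK) {  // walk all particles, tile by tile
        int j = base + threadIdx.x;
        tile[threadIdx.x] = (j < n) ? pos[j] : make_float4(0.f, 0.f, 0.f, 0.f);
        __syncthreads();

        int limit = min(BLOCK, n - base);
        for (int k = 0; k < limit; ++k) {          // all reads here hit shared memory
            float dx = pi.x - tile[k].x;
            float dy = pi.y - tile[k].y;
            float dz = pi.z - tile[k].z;
            float r2 = dx * dx + dy * dy + dz * dz;
            if (r2 < h2)                           // neighbor within smoothing radius
                rho += (h2 - r2) * (h2 - r2) * (h2 - r2);  // un-normalized poly6 kernel
        }
        __syncthreads();
    }

    if (i < n)
        density[i] = rho;
}

int main()
{
    const int n = 4096;
    float4* dPos = nullptr;
    float* dRho = nullptr;
    cudaMalloc(&dPos, n * sizeof(float4));
    cudaMalloc(&dRho, n * sizeof(float));
    cudaMemset(dPos, 0, n * sizeof(float4));       // placeholder particle data
    densityPass<<<(n + BLOCK - 1) / BLOCK, BLOCK>>>(dPos, dRho, n, 0.01f);
    cudaDeviceSynchronize();
    printf("densityPass: %s\n", cudaGetErrorString(cudaGetLastError()));
    cudaFree(dPos);
    cudaFree(dRho);
    return 0;
}
```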


195 Comments


  • krumme - Wednesday, March 7, 2012 - link

    Well the dilemma for Anand is apparent. If he stops writing those previews that are nice to Intel, someone else will get the opportunity and all the info. He can write two bad previews and the info and early chips just stop coming. Intel and Anand have a business to run, and there is a reason Intel gives Anand the chips (indirectly).

    He has a "deal" with Intel, the same way we have a deal with Anand when we read the review. We get the info - bent/biased - and then we can think for ourselves. I think it's a fair deal :) - we get a lot of good info from this preview. The uninformed get raped, but it's always like that. Someone has to pay for the show.
  • chemist1 - Wednesday, March 7, 2012 - link

    The MacBook Pro, for instance, has a discrete GPU, yet can switch to the chip-based GPU to save power when on battery. So having a better chip-based GPU makes sense in this context.
  • Sabresiberian - Wednesday, March 7, 2012 - link

    I'd like to see the discrete graphics card industry make the kind of progress, relatively speaking, Intel has made in the last 2 years.

    Ivy Bridge is a ways from competing with a high-end discrete solution, but if the relative rates of progress don't change, Intel will catch up soon.
  • sixtyfivedays - Wednesday, March 7, 2012 - link

    I use the iGPU on my build for my second monitor and it is quite nice.

    I can watch HD videos on it and it doesn't take away from my dedicated GPU at all.
  • mlkmade - Thursday, March 8, 2012 - link

    Is that even possible? Special hack or software?

    When you install a discrete graphics card, the integrated GPU gets disabled.

    Would love to know how you accomplished this. Is it a desktop or laptop?
  • mathew7 - Thursday, March 8, 2012 - link

    "When you install a discrete graphics card, the integrated gpu gets disabled."

    It was exclusive in northbridge-IGP units (Core 2 Duo/Quad and older). With Core i CPUs it's disabled by default, but it can be enabled through the BIOS (provided, of course, you don't have a P5x/6x chipset).
  • AnnonymousCoward - Wednesday, March 7, 2012 - link

    1. How much faster is Ivy Bridge at single thread versus my Conroe@3GHz?
    2. How much faster is my GTX560Ti than HD4000?
  • dr/owned - Thursday, March 8, 2012 - link

    1) Your 65nm CPU would get the shit blown out of it by IB at the same clock speed in single-threaded applications. Assuming 15% improvements in each of the tick-tocks since Conroe, a 1.8GHz IB would probably be about the same as a 3GHz Conroe.
    2) Discrete graphics vs. integrated graphics. Intel isn't trying to compete here, so it's a stupid comparison.
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Your "get the shit blown out" is worthless. All I'm looking for is a number, and your effective answer is +67%.

    2. It's not a stupid comparison, because:
    a) I'm interested.
    b) HD4000 is designed for games.
    c) They benchmarked with modern games.
    d) Games are designed around people's performance.
  • AnnonymousCoward - Friday, March 9, 2012 - link

    1. Another website shows the i7 3770K scoring 2643 on the Fritz Chess Benchmark with 1 processor. My machine does 2093. That's only a 26% difference.

    2. I very roughly estimate the GTX560Ti might be 5-6x faster than the HD4000.

    It'd be useful to see a real comparison of these though.
