Dirt 2

Dirt 2 came to the PC in December 2009, developed by Codemasters on the EGO engine, and was met with favorable reviews. We use Dirt 2's built-in benchmark under DirectX 11 to test the hardware: two resolutions at two quality settings with discrete GPUs, plus an appropriate setting for the integrated GPU. (Since the game only runs in DX9 or DX11 modes and the HD 3000 lacks DX11 support, we test in DX9 mode on the iGPU.)

Dirt 2, Integrated GPU, 1024x768, Medium Quality

Dirt 2—1680x1050; Single GPU

Dirt 2—1680x1050; Dual GPU

Dirt 2—1680x1050; Triple GPU

Dirt 2—1920x1080; Single GPU

Dirt 2—1920x1080; Dual GPU

Dirt 2—1920x1080; Triple GPU

Our ASUS board does better in the 1680x1050 results than at 1920x1080, topping the single and dual GPU charts at the lower resolution.

Metro 2033

Metro 2033 is the Crysis of the DirectX 11 world (at least until Crysis 2 is updated with DX11 support), challenging any system that tries to run it at high-end settings. The game was developed by 4A Games and released in March 2010; we use its built-in DirectX 11 Frontline benchmark to test the hardware. For the iGPU, we run in DX10 mode.

Metro 2033, Integrated GPU, 1024x768, Medium Quality

Metro 2033—1920x1080; Single GPU

Metro 2033—1920x1080; Dual GPU

Metro 2033—1920x1080; Triple GPU

Metro 2033—1680x1050; Single GPU

Metro 2033—1680x1050; Dual GPU

Metro 2033—1680x1050; Triple GPU

95 Comments


  • fr500 - Thursday, May 12, 2011 - link

    Not any new crashes, but there are crashes now and then with certain NVIDIA driver versions in TF2, for instance. With an additional layer on top I guess it makes it worse...

    Also, I said that about control panels because of TechReport's review, but it seems they didn't know about d-mode.
  • AnnihilatorX - Thursday, May 12, 2011 - link

    Anand noted that Intel said (back before Virtu existed) that to use QuickSync on Z68, one needs to use two monitors.

    So basically you can have two separate monitors, one connected to the discrete card and one to the onboard graphics.
  • L. - Thursday, May 12, 2011 - link

    I'm pretty sure you can come up with more complex and useless examples of using relatively bad technology, namely Lucid.

    As long as you have a real GPU, you should not be using the IGP at all, and that is not going to change anytime soon, as 7% of a real GPU >>> anything Intel ever made.

    Not to worry though, Virtu will be gone very soon, like Hydra.

    "Normally", you should be able to send the transcoding to your GPU and have 3d / screen input at the same time.

    In other words: Intel IGP bad except if you don't have anything else; Virtu bad always unless your GPU is really bad too (wtf?).

    This will become more interesting when AMD starts selling Llano, as Llano's GPU will be much stronger than Intel's IGP, and using both resources (discrete + IGP) efficiently will make a real difference.
  • fr500 - Thursday, May 12, 2011 - link

    The whole deal with this is that QuickSync is really fast for transcoding.

    I don't like the idea of Hydra either, so I think using a spare input on your monitor could work. It would still be detected even if it is not the currently active input, and even if it doesn't get detected you could just select the other input, do your transcoding, and be done with it.
  • cbgoding - Wednesday, May 11, 2011 - link

    So 1.42 V was your safe limit, but under load it jumped to 1.544 V?

    What the hell?
  • AnnihilatorX - Thursday, May 12, 2011 - link

    He wrote 'but'. He's suggesting a bug in the TurboV software.
  • cbgoding - Friday, May 13, 2011 - link

    Alright, that makes sense. Guess I'll never use TurboV; I'd shit a brick if I had a 0.124 V spike.
  • L. - Thursday, May 12, 2011 - link

    My mobo does that... although I went a bit hard on the pencil mod on purpose, which gives me Vload > Vidle (although very close, it's about 0.025 V more).
  • DBissett - Thursday, May 12, 2011 - link

    Sorry for grammar policing, but some grammar is so bad it makes reading otherwise good articles impossible. "Asus" and "Intel" are companies, singular nouns, and require singular verbs. To say "Asus have" and "Intel have" is not only technically incorrect but just plain reads very badly. "Asus has..." and "Intel has..." are the correct grammatical forms. Or if you have to use plural verbs then try "People at Asus have..." for example, and now you've got a plural subject.

    Dave
  • IanCutress - Friday, May 13, 2011 - link

    Hi Dave,

    This is one of the (many) differences between British and American English. I attempt to write in an American style for AnandTech, but as I am British, a few things scrape through the net.

    All the best,
    Ian
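As an aside, the TurboV overshoot discussed in the thread above is easy to quantify from the figures the commenters quote (1.42 V set, 1.544 V observed under load). This minimal sketch just reproduces that arithmetic; it is not new measurement data:

```python
# Quantify the voltage overshoot described in the comments above.
# Figures come from the thread: 1.42 V set point, 1.544 V under load.
set_voltage = 1.42    # V, the user's intended safe limit
load_voltage = 1.544  # V, the reading observed under load

spike = load_voltage - set_voltage          # absolute overshoot in volts
percent = spike / set_voltage * 100         # overshoot relative to set point

print(f"Spike: {spike:.3f} V ({percent:.1f}% above the set point)")
```

That works out to the 0.124 V spike cbgoding mentions, i.e. roughly 8.7% above the configured voltage.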
