The problem with Sandy Bridge was simple: if you wanted to use Intel's integrated graphics, you had to buy a motherboard based on an H-series chipset. Unfortunately, Intel's H-series chipsets don't let you overclock the CPU or memory—only the integrated GPU. If you want to overclock the CPU and/or memory, you need a P-series chipset—which doesn't support Sandy Bridge's on-die GPU. Intel effectively forced overclockers to buy discrete GPUs from AMD or NVIDIA, even if they didn't need the added GPU power.

The situation got more complicated from there. Quick Sync was one of the best features of the Sandy Bridge platform; however, it was only available when you used the CPU's on-die GPU, which once again meant you needed an H-series chipset with no support for overclocking. You could have either Quick Sync or overclocking, but not both (at least initially).

Finally, Intel did very little to actually move chipsets forward with its 6-series Sandy Bridge platform. Native USB 3.0 support was absent and wouldn't arrive until Ivy Bridge; we got a pair of 6Gbps SATA ports and PCIe 2.0 slots, but not much else. I can't help but feel that Intel was purposefully very conservative with its chipset design. Despite all of that, the seemingly conservative chipset design was plagued by the single largest bug Intel has ever faced publicly.

As strong as the Sandy Bridge launch was, the 6-series chipset did little to help it.

Addressing the Problems: Z68

In our Sandy Bridge review I mentioned a chipset that would come out in Q2 that would solve most of Sandy Bridge's platform issues. A quick look at the calendar reveals that it's indeed the second quarter of the year, and a quick look at the photo below reveals the first motherboard to hit our labs based on Intel's new Z68 chipset:


ASUS' P8Z68-V Pro

Architecturally, Intel's Z68 chipset is no different from the H67. It supports video output from any Sandy Bridge CPU and has the same number of USB, SATA and PCIe lanes. What the Z68 chipset adds, however, is full overclocking support for the CPU, memory and integrated graphics, giving you the choice to do pretty much anything you'd want.

Pricing should be similar to P67, with motherboards selling for a $5-$10 premium. Not all Z68 motherboards will come with video out; those that do may carry an additional $5 premium on top of that to cover the licensing fees for Lucid's Virtu software, which will likely be bundled with most if not all Z68 motherboards that have iGPU out. Lucid's software excluded, any price premium is a little ridiculous here given that the functionality offered by Z68 should've been there from the start. I'm hoping Intel will come to its senses over time, but for now Z68 will still be sold at a slight premium over P67.

Overclocking: It Works

Ian will have more on overclocking in his article on ASUS' first Z68 motherboard, but in short it works as expected. You can use Sandy Bridge's integrated graphics and still overclock your CPU. Of course the Sandy Bridge overclocking limits still apply—if you don't have a CPU that supports Turbo (e.g. Core i3 2100), your chip is entirely clock locked.

Ian found that overclocking behavior on Z68 was pretty similar to P67. You can obviously also overclock the on-die GPU on Z68 boards with video out.

The Quick Sync Problem

Back in February we previewed Lucid's Virtu software, which allows you to have a discrete GPU but still use Sandy Bridge's on-die GPU for Quick Sync, video decoding and basic 2D/3D acceleration.

Virtu works by intercepting the command stream directed at your GPU. Depending on their source, commands are routed to either your discrete GPU (dGPU) or the on-die GPU (iGPU).

There are two physical approaches to setting up Virtu: you can connect your display to either the iGPU or the dGPU. If you do the former (i-mode), the iGPU handles all display duties and any rendering done on the dGPU has to be copied over to the iGPU's frame buffer before being output to your display. Note that you can run one application in a window that requires the dGPU while running another that uses the iGPU (e.g. Quick Sync).

As you can guess, there is some amount of overhead in the process, which we've measured to varying degrees. When it works well the overhead is typically limited to around 10%; however, we've seen situations where a native dGPU setup is over 40% faster.

Lucid Virtu i-mode Performance Comparison (1920 x 1200, Highest Quality Settings)

                               Metro 2033   Mafia II   World of Warcraft   Starcraft 2   DiRT 2
AMD Radeon HD 6970             35.2 fps     61.5 fps   81.3 fps            115.6 fps     137.7 fps
AMD Radeon HD 6970 (Virtu)     24.3 fps     58.7 fps   74.8 fps            116.6 fps     117.9 fps
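
For those keeping score, here's a quick sanity check of the deltas in that table. It's just a throwaway Python snippet using the frame rates above; it isn't part of our test suite, it simply puts the "around 10%, sometimes over 40%" claim into numbers:

```python
# Compare the native HD 6970 results against the same card running through
# Virtu's i-mode frame buffer copy, using the numbers from the table above.
native = {"Metro 2033": 35.2, "Mafia II": 61.5, "World of Warcraft": 81.3,
          "Starcraft 2": 115.6, "DiRT 2": 137.7}
virtu = {"Metro 2033": 24.3, "Mafia II": 58.7, "World of Warcraft": 74.8,
         "Starcraft 2": 116.6, "DiRT 2": 117.9}

for game, fps in native.items():
    delta = (fps / virtu[game] - 1) * 100
    print(f"{game}: native dGPU is {delta:+.1f}% faster")

# Metro 2033 is the worst case at roughly +45%, Starcraft 2 is a wash,
# and the rest fall between the mid single digits and the high teens.
```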

The dGPU doesn't completely turn off when it's not in use in this situation; however, it will sit in its lowest possible idle state.

The second approach (d-mode) requires that you connect your display directly to the dGPU. This is the preferred route for the absolute best 3D performance since there's no copying of frame buffers. The downside is that you will likely see higher idle power, as Sandy Bridge's on-die GPU is probably more power efficient outside of 3D gaming than any high-end discrete GPU.

With a display connected to the dGPU and with Virtu running you can still access Quick Sync. CrossFire and SLI are both supported in d-mode only.

As I mentioned before, Lucid determines where to send commands based on their source. In i-mode all commands go to the iGPU by default, and in d-mode everything goes to the dGPU. The only exceptions come from application profiles defined within the Virtu software: in i-mode that means a list of games/apps that should run on the dGPU, and in d-mode a smaller list of apps that use Quick Sync (since everything else should run on the dGPU).
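
To make those routing rules concrete, here's a rough sketch of the decision in Python. This is purely illustrative and is not Lucid's actual software or API; the process names are made-up examples. The point is simply that the default target flips with the mode, and the profile list only ever holds the exceptions:

```python
# Illustrative sketch only -- not Lucid's implementation. Each mode has a
# default GPU plus a per-application exception list, exactly as described
# in the text above. Application names are hypothetical examples.

IGPU, DGPU = "iGPU (Sandy Bridge)", "dGPU (discrete)"

PROFILES = {
    # i-mode: everything defaults to the iGPU; profiled 3D games go to the dGPU
    "i-mode": {"default": IGPU, "exceptions": {"metro2033.exe": DGPU,
                                               "mafia2.exe": DGPU}},
    # d-mode: everything defaults to the dGPU; profiled Quick Sync
    # transcoding apps go to the iGPU
    "d-mode": {"default": DGPU, "exceptions": {"mediaespresso.exe": IGPU}},
}

def route(mode: str, process_name: str) -> str:
    """Return which GPU receives the command stream from a given process."""
    profile = PROFILES[mode]
    return profile["exceptions"].get(process_name.lower(), profile["default"])

if __name__ == "__main__":
    print(route("i-mode", "metro2033.exe"))      # dGPU (profiled game)
    print(route("i-mode", "notepad.exe"))        # iGPU (default in i-mode)
    print(route("d-mode", "mediaespresso.exe"))  # iGPU (Quick Sync app)
    print(route("d-mode", "metro2033.exe"))      # dGPU (default in d-mode)
```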

Virtu works, although there are still some random issues when running in i-mode. Your best bet for keeping Quick Sync functionality while maintaining the best overall 3D performance is to hook your display up to the dGPU and only use Sandy Bridge's GPU for transcoding. Ultimately I'd like to see Intel enable this functionality without the need for third-party software utilities.

Comments

  • JarredWalton - Wednesday, May 11, 2011 - link

    Obviously, I missed changing the pasted text above. That's Bold, Italics, and underlined text. (And highlighted text is now gone, thankfully, so people talking about [H]OCP don't look weird. LOL)
  • Mr Perfect - Thursday, May 12, 2011 - link

    I hadn't thought to try BBCode[\b[\i][\u].

    Thanks, Jarred.
  • Mr Perfect - Thursday, May 12, 2011 - link

    Much less use it correctly...
  • FlameDeer - Thursday, May 12, 2011 - link

    Hi Jarred, about the option to do links, I have tried before, by using the below codes:

    [L=text]/[/L] = [L=AnandTech]http://www.anandtech.com/[/L]

    The codes I put are
    <L=AnandTech>http://www.anandtech.com/</L>
    just replace the < > symbols to [ ] will do. :)

    Hopefully Intel will be more concerned about what users really need & not just apply their own set of rules by limiting certain functions as they like.

    Good job on the review & take care, guys! :)
  • FlameDeer - Thursday, May 12, 2011 - link

    Oops, not working. Anyway, I'll try a few more codes here; if they still don't work then just abandon it.

    [ L ]/[ /L ] = [L]Text[/L]
    [ A ]/[ /A ] = [A]Text[/A]
    [ B ]/[ /B ] = Text
    [ I ]/[ /I ] = Text
    [ U ]/[ /U ] = Text
    [ H ]/[ /H ] = [H]Text[/H]
  • therealnickdanger - Wednesday, May 11, 2011 - link

    Obviously, a lot of time goes into these reviews, but I would really like to see an update using a 64GB Vertex 3 or other fast 64GB drive as the cache. I suppose that the only real improvement would be how many apps/files are cached before eviction. But the Vertex 3 is a LOT faster than the new Intel 311 or whatever it is...
  • y2kBug - Wednesday, May 11, 2011 - link

    Take this with a huge grain of salt. The following quote from the review makes me shiver “In my use I've only noticed two reliability issues with Intel's SRT. The first issue was with an early BIOS/driver combination where I rebooted my system (SSD cache was set to maximized) and my bootloader had disappeared. The other issue was a corrupt portion of my Portal 2 install, which only appeared after I disabled by SSD cache.”

    Don’t get me wrong, I’m not trolling. I was really looking forward to SSD caching. But my previous experience, when I randomly lost all the data on an Intel RAID 1 array without any signs of hard-drive failure, made me skeptical of Intel’s RAID software.
  • NCM - Wednesday, May 11, 2011 - link

    Anand writes: "Paired with a 20GB SLC SSD cache, I could turn a 4-year-old 1TB hard drive into something that was 41% faster than a VelociRaptor."

    That's an assertion that really needs some heavy qualification, for instance by appending "at least sometimes and for certain things."

    SRT is an intriguing approach on the part of Intel, but ultimately it comes across to me as insufficient and unfinished. I have little confidence in its ability to gauge what's important to cache as opposed to what's used more often. Those aren't the same things at all.

    I'd like to see a drive approach where a limited capacity boot/application SSD is combined with a conventional HD within a single standard drive enclosure. This hybrid would present itself to the host as a single drive, but include a software toggle to allow selective access to each drive for setup purposes. You'd install the OS and programs on the SSD for rapid boot/launch, while user mass file storage would be on the HD. In normal use you wouldn't know, other than in performance terms, that two devices were involved.

    Yes, I know that we can achieve much of that today by using separate SSD and HD devices. I have two such setups, one a server and the other a workstation. However they both require some technical attention on the part of the user, and it's not an approach that works in a laptop, at least not without big compromises.
  • LancerVI - Wednesday, May 11, 2011 - link

    Can I install the OS on one 60GB SSD, for example, and then use SRT with a second 60GB SSD for a 2TB RAID 0 array?

    I've got two 60s in a RAID 0 now, but obviously most of my programs are on separate HDDs. If my question above is possible, maybe this is a way to split the difference, as it were.

    Any insight would be appreciated.
  • djgandy - Wednesday, May 11, 2011 - link

    Considering you can pick up a 30GB SSD in the UK for £45, this seems like an easy way to get some performance increase for desktop productivity.

    http://www.overclockers.co.uk/showproduct.php?prod...
