39 Comments

  • lukechip - Monday, February 28, 2011 - link

    I'm curious to understand how discrete GPU power consumption behaves when Virtu is not installed.

If I have an H67 mobo and a discrete GPU, and no display connected to the discrete GPU, does the discrete GPU consume any power? If it does, is it the rated 'idle' power, or something even less, since it may just be spinning its fan(s)?

    From the power consumption table in the article, it seems that under Virtu, the 6970 was using nearly 90 W at idle (126 - 34.7), whereas the 6970 when reviewed by AT was clocked at about 20 W idle.

    All in all though, Virtu does sound promising for me, as I plan to have an H67- or Z68-based HTPC with a discrete GPU for gaming.
    Reply
  • Ryan Smith - Monday, February 28, 2011 - link

    My best guess (I have not seen Virtu myself) is that the AMD GPU didn't properly go idle. Otherwise, AMD has 2 idle power states - one for normal idling and an even lower one for the slave GPUs in Crossfire - and either one of them should have consumed much less power.

    It's worth noting that at CES, Lucid told us that they can completely shut off the dGPU. One of the benefits of working with mobo manufacturers is that they can ensure the boards properly support the necessary sleep states. So it's supposed to be possible to completely power down the dGPU, just as Optimus, etc. power down the dGPU in a laptop.
    Reply
  • mianmian - Tuesday, March 01, 2011 - link

    According to Tom's Hardware, the idle power when Virtu is enabled seems to be IGP + dGPU. That is a little bit more than when the dGPU is at idle without Virtu. Reply
  • jimmyzaas - Monday, February 28, 2011 - link

    This technology reminds me of the 3dfx Voodoo 2 card I used to love! The add-on card that added 3D goodness to your games when you needed it, via a loopback cable that hooked up to your normal 2D video card. I'm kinda shocked Nvidia didn't invent this.

    Really an excellent idea, this Virtu chip.. looking forward more stuff from Lucid.
    Reply
  • aegisofrime - Tuesday, March 01, 2011 - link

    The best thing is that, as I understand it, Virtu isn't even a chip: it's software. It will be great to see all motherboards that can support it do so. Reply
  • kilkennycat - Monday, February 28, 2011 - link

    Planning any tests with SLI/Crossfire and Virtu ?? Reply
  • jcandle - Tuesday, March 01, 2011 - link

    It's great that Lucid found another niche, and not to rain on their parade... but does anyone else think that, with the release of Quick Sync on SB, ATI and nVIDIA will pay up to MPEG-LA and drop encoders into their next GPU releases? If so, Virtu's usefulness could become even more niche, since, as mentioned in the article, notebooks have been shipping with mechanisms to swap between the IGP and discrete graphics. Furthermore, there's really nothing stopping nVIDIA or ATI from integrating the piping of data to the IGP for Quick Sync via an update to their Optimus or Switchable Graphics software.

    This is not meant as flamebait; I'm just looking for a poll of opinions on whether or not you'd drop dollars on this if either graphics manufacturer came up with their own solution in a refresh.
    Reply
  • Stahn Aileron - Tuesday, March 01, 2011 - link

    Well, the nice thing about Lucid's tech is the fact that it's purely software. As far as I recall, all the other switchable graphics solutions are hardware-based to some extent.

    If what Anand is saying about Virtu is true, it'll be a more seamless system that's completely vendor- and hardware-agnostic. The other switchable schemes are proprietary and locked into whatever platform they support - that works for laptops (a user usually replaces the whole system in one go, and not that many laptops are GPU-upgradeable anyway...)

    In a desktop, where the possible combinations of hardware are near infinite, having something that's completely neutral to what you have installed could be very useful.
    Reply
  • jcandle - Tuesday, March 01, 2011 - link

    That certainly is a good point. The question remains: would you buy a board just for this, over an undoubtedly cheaper competitor or model, if, as I mentioned, both AMD and nVIDIA start including Quick Sync-type encoders in the next refresh of their respective graphics boards? Their previous tech, Hydra, seemed more useful for the reasons (not usage) you mentioned.

    And just to clarify: if Quick Sync does catch on, you can expect AMD to include it either in their graphics boards or with their CPU releases. If it becomes a feature of the graphics board, you can expect similar offerings from nVIDIA.
    Reply
  • sean.crees@gmail.com - Tuesday, March 01, 2011 - link

    I'm sorry if I missed this in the article, but what exactly is the point? We're going to pay money for software that makes our games run at a lower frame rate so that we can have software-driven switchable graphics on our desktops? For what purpose? I just don't get it. Reply
  • PhoenixEnigma - Tuesday, March 01, 2011 - link

    The point, in a nutshell, is to be able to have both a dGPU and Quick Sync work on one system. Without this, it's one or the other, which is a shame as both provide significant benefits. Reply
  • synaesthetic - Tuesday, March 01, 2011 - link

    To save power, to reduce heat and fan noise.

    (This will probably be more useful on laptops than desktops, however...)
    Reply
  • JumpingJack - Tuesday, March 01, 2011 - link

    SB enables a key feature that some, maybe even many, people will find useful: video transcoding. If you do little or no video transcoding, then all this is sorta pointless. But if you are one to move lots of media to your portable devices, then this is a good solution to an otherwise bizarre design of the SB platform.

    QuickSync uses the iGPU to work much of its magic, a combination of fixed-function encode HW and use of the EUs in part of the transcoding stream. Intel, in a fit of non-brilliance, did not account for (or did not take time to think through) the fact that some people would want to use a dGPU. As a result, using a dGPU renders the SB HW transcode non-operational. This is a big deal, since SB transcode can outperform most GPU-accelerated transcode by a wide margin: against a high-end Radeon, SB can be as much as 2x faster, and against Fermi as much as 60-80% faster.
    Reply
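JumpingJack's speedup claims above can be turned into rough wall-clock numbers with simple arithmetic. A minimal sketch: the 10-minute baseline job is a made-up figure for illustration; only the 2x and ~60-80% factors come from the comment.

```python
# Convert a claimed speedup factor into time taken for the same job.
# The 10-minute baseline is hypothetical, purely for illustration.

def faster_time(baseline_minutes: float, speedup: float) -> float:
    """Minutes the faster encoder needs, given a speedup factor."""
    return baseline_minutes / speedup

baseline = 10.0                      # hypothetical GPU-accelerated transcode
print(faster_time(baseline, 2.0))    # Quick Sync vs. high-end Radeon (2x)
print(faster_time(baseline, 1.7))    # vs. Fermi (midpoint of 60-80% faster)
```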
  • mbf - Tuesday, March 01, 2011 - link

    ...thinking it would have been a good idea to have iGPU data sent to the dGPU instead of vice versa? Or perhaps even make this configurable.

    I know that you wouldn't be able to cut down on power or noise, but I'm sure you'd lose less performance in gaming.
    Reply
  • QuantumTR - Tuesday, March 01, 2011 - link

    lol exactly... Reply
  • Breit - Tuesday, March 01, 2011 - link

    ...just wanted to ask the same question. ;)

    As far as I understand it, Intel shuts down its iGPU if no monitor is connected, which is why you have to route the display output through the iGPU. Maybe Lucid should talk to Intel first before releasing this... 8)

    Btw: How do you connect multiple monitors to such a setup?
    Reply
  • QuantumTR - Tuesday, March 01, 2011 - link

    I think it would be better if Virtu actually copied the frame buffer from the integrated GPU to the discrete GPU in 2D mode or during encode/decode, so that the overhead incurred in 3D rendering would be minimized. I'm pretty sure most people would rather finish a QuickSync encode a few seconds later than take a 5-10% performance hit in their games. Or, better still, let the user select the output graphics card. Reply
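For a sense of why routing frames through the iGPU costs performance, here is a back-of-the-envelope model of the per-frame copy cost discussed in the comments above. The resolution, pixel depth, and effective copy bandwidth are all assumed numbers, not measurements from the article.

```python
# Model: each rendered frame must also be copied to the iGPU's
# frame buffer, so the copy time is added to every frame's budget.

def copy_ms(width: int, height: int, bytes_per_px: int, gb_per_s: float) -> float:
    """Milliseconds to move one frame at the given effective bandwidth."""
    return width * height * bytes_per_px / (gb_per_s * 1e9) * 1e3

def effective_fps(base_fps: float, extra_ms: float) -> float:
    """Frame rate once each frame also pays the extra copy cost."""
    return 1000.0 / (1000.0 / base_fps + extra_ms)

cost = copy_ms(1920, 1080, 4, 8.0)         # 1080p, 32 bpp, ~8 GB/s assumed
print(round(cost, 2))                      # ~1 ms per frame
print(round(effective_fps(60.0, cost), 1)) # roughly a 6% hit at 60 fps
```

Under these assumed numbers a raw copy alone accounts for only part of the slowdown measured in the article, which suggests the rest comes from synchronization and driver overhead.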
  • jcompagner - Tuesday, March 01, 2011 - link

    http://www.nvidia.com/object/optimus_technology.ht...

    I am searching for an SB notebook, but I want that enabled, because I almost never use the discrete GPU anyway. And yes, I have to buy one, because in 17" high-end laptops you don't have a choice..
    Reply
  • mutantmagnet - Tuesday, March 01, 2011 - link

    ...or Lucid eventually revises their technology so we can choose the output; otherwise I wouldn't be able to use Eyefinity without constantly rearranging the cables.

    I really like the idea behind this, but I'm wondering how flexible it can be at allocating multiple GPUs to more than 2 displays. Having an agnostic GPU virtualization platform instead of being locked into either AMD or Nvidia would be beneficial, especially since they currently believe only people in the enterprise market are considering GPU virtualization.
    Reply
  • DooDoo22 - Tuesday, March 01, 2011 - link

    What is Apple doing to utilize both Quick Sync and discrete AMD graphics switching? Is it something similar to this? Reply
  • DanNeely - Tuesday, March 01, 2011 - link

    This is being pitched as something to be bundled by the mobo vendor. Does this mean people who bought SNB prior to this coming out will be sunk unless they buy a new board? Reply
  • haplo602 - Tuesday, March 01, 2011 - link

    So finally somebody has reimplemented what the Voodoo Graphics chips (1 and 2) were doing back then (and TV tuner cards): copying the framebuffer to a specified region where it is displayed by the primary card.

    What's the big deal? I mean, Microsoft should have this figured out already and implemented it in the OS. Why does it take an ISV to kick the big players' asses to get usable technologies implemented?

    Intel made a big mistake with their QuickSync implementation (only usable on laptops and very low end desktops). But given the SB issues on Linux, I guess Intel did not think things through.

    I am waiting on a homogeneous implementation of various computation units in a NUMA-like architecture. I hope AMD will make this possible with their Llano parts, with the integrated GPU simply becoming another coprocessor when not hooked to a display.
    Reply
  • Stargozer - Tuesday, March 01, 2011 - link

    In addition to the reduced frame rates, is there any latency added by using Virtu?

    How are the minimum frame rates affected (it seems possible that all the shuffling might be more of a limitation in a high-stress scenario)?
    Reply
  • Figaro56 - Tuesday, March 01, 2011 - link

    If Intel is so great, why don't they fix this ridiculous situation? For the marginal performance benefit you get from Sandy Bridge, I think I'll just stick with my AMD chipset and motherboard. I don't have to mess around with nonsense like this to benefit. Reply
  • fic2 - Tuesday, March 01, 2011 - link

    Wouldn't it be easier just to make a dongle that plugged into the iGPU video port and acted like it was a monitor? Reply
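fic2's dongle idea is a real technique: "dummy plug" adapters fool a video port by answering DDC reads with EDID data, making the port think a monitor is attached. A minimal sketch of the EDID framing involved; the 8-byte header and the checksum rule come from the EDID format, while the zeroed body is a placeholder standing in for a real monitor's timing data.

```python
# An EDID base block is 128 bytes: a fixed 8-byte header, 119 bytes
# of vendor/timing data (zeroed placeholder here), and a final
# checksum byte chosen so the whole block sums to 0 modulo 256.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def checksum(first_127: bytes) -> int:
    """The final byte that makes the 128-byte block sum to 0 (mod 256)."""
    return (-sum(first_127)) % 256

block = bytearray(EDID_HEADER) + bytes(119)   # placeholder body
block.append(checksum(block))

assert len(block) == 128 and sum(block) % 256 == 0
```

A hardware dummy plug just wires resistors and a small EEPROM holding such a block to the connector's DDC pins.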
  • hechacker1 - Tuesday, March 01, 2011 - link

    What happens if you are interested in DXVA? Which GPU does the decoding?

    I'm also curious whether there are any latency issues for gaming and video playback, with the frame buffer being copied from the discrete GPU to the Intel GPU.
    Reply
  • billythefisherman - Tuesday, March 01, 2011 - link

    If Virtu is simply copying the frame buffer over, what exactly is turning off the iGPU normally? If it's simply the port, then surely there's a simple way to turn the SB iGPU on to aid in DirectCompute or OpenCL acceleration. Surely this could be controlled by a very simple firmware update provided by a mobo manufacturer? Reply
  • jmunjr - Tuesday, March 01, 2011 - link

    So Virtu's only real purpose is to allow Quicksync on systems with discrete GPUs? This is moronic... Reply
  • fic2 - Wednesday, March 02, 2011 - link

    Hence my previous suggestion that it would be easier just to create a "monitor" dongle that plugged into the iGPU port and acted like a monitor. This would enable the QuickSync feature without actually having a monitor.

    Of course, the obvious solution is for Intel to fix their d@mn implementation so that there doesn't have to be a monitor plugged in.
    Reply
  • strikeback03 - Wednesday, March 02, 2011 - link

    Supposedly it could also power down the dGPU when not needed, saving energy. That said, it seems Intel should be able to enable the iGPU portion of the chip when desired for transcoding, maybe a firmware update. Reply
  • Voldenuit - Wednesday, March 02, 2011 - link

    While I applaud the job that Lucid has done in ameliorating some of the annoying flaws in the Sandy Bridge platform, I do wonder who this solution is targeted at.

    A 10% performance penalty on all your games, on the off chance that you want to use the HD 3000 to transcode video? That's a lot. 10% is enough to make some gamers upgrade their GPU.

    On the bright side, you can continue to play games while transcoding, with a hit on Quicksync performance, so that may be of interest to anyone who games and has a PMP.

    Right now I'm using Handbrake instead of ATI's AVIVO Video Converter, as AVIVO is simply useless: no 2-pass encoding, no deinterlacing (very useful on those nasty 1080i broadcasts) or inverse telecine, no batch processing, etc., and the quality is not as good as the CPU with the high profile. Hopefully MediaEspresso and MediaConverter are a lot more powerful and configurable than AMD's free tool, but they're also asking $40 a pop, and they're competing with free: in this case Handbrake (among other tools), which is very full-featured for the asking price ($0).
    Reply
  • MrSpadge - Wednesday, March 02, 2011 - link

    I hope this works not only with DX but also with GP-GPU, be it CUDA, CAL or OpenCL. It could help keep your desktop snappy while the GPU is doing heavy number crunching.

    MrS
    Reply
  • slyck - Wednesday, March 02, 2011 - link

    For everybody wondering why Intel made these mistakes or accidentally messed up on the platform: come on, this is Intel, you should know better. These restrictions were put in place intentionally to maximize profit. If anybody wants to argue "but they would make more if it was done right, blah blah": no. Large, corrupt, monopolistic corps like Intel know how to make money better than you or I do. SANDY BRIDGE IS BROKEN ON PURPOSE. Thank their greedy execs and beancounters. Reply
  • kenour - Thursday, March 10, 2011 - link

    Don't be stupid... It's not like Intel Capital is one of Lucidlogix’s investors! Oh wait... Reply
  • mpowell - Saturday, April 02, 2011 - link

    And yet another Windows-only Windows-centric item...

    Which does nothing for someone who runs a Linux desktop with dual monitors. However, I did note the comment about the Intel workaround of connecting one monitor to each video output, e.g. one to the SNB motherboard output and the second to a discrete add-in video card.

    I would simply love to know whether this workaround would also apply to a Linux desktop with dual monitors, given that Lucid's software will probably never be released for Linux.

    Thanks
    Reply
  • senderj - Wednesday, May 04, 2011 - link

    Does Virtu work on all H67 mobos, or just those "registered" with Lucid? Reply
  • swindmill - Monday, May 14, 2012 - link

    The LogMeIn Mirror driver seems to break Lucidlogix's Virtu software as detailed in this blog post:

    http://blog.ampx.net/2012/05/lucid [...] gmein.html

    Has anyone else experienced this issue?
    Reply
  • swindmill - Monday, May 14, 2012 - link

    Whoops, let's try that again!

    http://blog.ampx.net/2012/05/lucidlogix-virtu-and-...
    Reply
  • bodark - Friday, August 10, 2012 - link

    I don't know if there is a more recent post about Lucid Virtu. This review is dated 2/28/2011, and it's now 2012, whoever is reading this. So this is no longer the full story.

    I use Lucid Virtu and I can vouch for its power. I use an
    i7-3770 (HD 4000) with an Asus Z77 motherboard (with the Lucid Virtu MVP feature, of course) and GTX 550 Ti SLI.

    I tell you what: I never would have believed it, but it's 2x the power.

    P/S: Sorry for my bad English.
    Reply
