
34 Comments


  • fluxtatic - Wednesday, April 27, 2011 - link

    I've been waiting for this for years...although I will continue to wait, apparently, as I'm an AMD man, proc-wise, and will continue to be. Bring me Bulldozer with switchable graphics, and then kill me, as I'm unlikely to ever be that happy again. Reply
  • Zendax - Wednesday, April 27, 2011 - link

    At first I was sad that this apparently won't work with GTX 200 series GPUs, but then I remembered my i7-860 doesn't come with integrated graphics anyway.

    This is a good change for those of us who like graphics cards that are as powerful as they are power-hungry, but don't want them chewing away at power when they're not needed. Especially with the powerful IGP on the high-end SNB processors, this makes a lot of sense.
    Reply
  • slyck - Friday, April 29, 2011 - link

    Sandy's IGP powerful? Guess I wouldn't stretch it quite that far. If it's so "powerful", why have a dedicated card? Reply
  • dananski - Saturday, April 30, 2011 - link

    Powerful enough for accelerating things like Flash playback and other minor tasks you used to need a dedicated card for. You don't really want your gaming card, with its high idle power, running at those times if you can help it. Reply
  • iGo - Wednesday, April 27, 2011 - link

    Back in 2009, when I met Anand sir, I discussed this with him. At the time, NVIDIA and ATI were working on a related technology called Hybrid SLI/CrossFire: using their own IGPs alongside dGPUs in SLI or CrossFire mode. The question I asked was: instead of running them in CrossFire or SLI, why didn't NVIDIA/ATI make switchable graphics? Running the IGP in normal operation and switching to the dGPU for demanding tasks like gaming and HD decoding would save power in day-to-day use. Anand sir did mention it might come to that, but also that if GPU chips started using power gating like CPUs, idle power consumption wouldn't be an issue.

    Thanks to QuickSync, there is now a very compelling reason for switchable graphics on the desktop. Plus, current-generation IGPs are powerful enough to handle HD decoding, negating the need for a powerful dGPU just for HD video. I hope ATI comes up with their own solution quickly too. Too bad for Lucid Virtu; the inability to access the dGPU's native control panel was kind of a bummer with Virtu (when the IGP was in primary mode and the dGPU ran virtualized). The other way around allowed control panel access (NVIDIA Control Panel or CCC), but kept the dGPU always active, scrapping the power-saving benefits.
    Reply
  • Shadowmaster625 - Wednesday, April 27, 2011 - link

    The trouble with QuickSync is that it will be useless when a new codec comes along that delivers twice the compression (along with eight times the CPU usage, but with eight cores, who cares?). It has been a long time since we've had a major breakthrough in encoding technology, but that could change in an instant. Reply
  • shawngmc - Monday, May 02, 2011 - link

    Not necessarily... By then, firmware may have advanced to the point that QuickSync is usable alongside a discrete GPU. That enables interesting solutions, such as HW-to-HW transcoding. For example, say you've downloaded or recorded video in a newer codec. The flexible GPU could decode it and hand off to the CPU, which could re-encode it in H.264 for devices that don't support the new format (like an iPod). Plus, even as better formats appear, not all existing video gets instantly converted to them.

    It's true that a new format likely won't work with QuickSync, but any new codec might also be optimized for more efficient computation. I'm not a compression software expert, but I am a software developer, and if I were designing a compression format today, bandwidth would honestly be the smallest priority in my mind. I'd focus on processor usage (since we're moving toward mobile devices with ever-increasing bandwidth but few battery life gains) and better quality (artifact reduction and/or handling dropped keyframes better).

    I'm not saying your point is wrong, per se, just that it's not the be-all and end-all.
    Reply
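The HW-to-HW handoff idea above can be sketched as a simple dispatch: decode with whichever backend supports the source codec, then hand off to the fixed-function encoder for the target format. All names and codec tables here are hypothetical, purely for illustration; this is not a real driver API.

```python
# Hypothetical capability tables; "newcodec" stands in for a future format
# that a driver/firmware update has taught the flexible GPU to decode.
HW_DECODERS = {
    "h264": "gpu",      # formats the programmable GPU can decode today
    "newcodec": "gpu",  # a future driver update could add entries here
}
HW_ENCODERS = {"h264": "quicksync"}  # fixed-function encode is format-locked

def plan_transcode(src_codec, dst_codec):
    """Pick a decode and an encode backend, falling back to software."""
    decode = HW_DECODERS.get(src_codec, "cpu")
    encode = HW_ENCODERS.get(dst_codec, "cpu")
    return decode, encode

# A file in the hypothetical future codec, re-encoded to H.264 for an
# iPod-class device: GPU decode feeds the fixed-function encoder.
print(plan_transcode("newcodec", "h264"))  # ('gpu', 'quicksync')
```

The point of the sketch is that only the decode table needs updating when a new format appears; the fixed-function encode side stays useful as long as H.264 remains the common delivery target.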
  • Hrel - Wednesday, April 27, 2011 - link

    "You also don’t have to worry about new GPU drivers breaking support with Virtu, as NVIDIA handles all of that in their own drivers." I think you meant with Synergy.

    As far as the article goes, at most I can say "eh". I'm still running LGA775 with a GTX 460. I have intention of changing that till USB 3.0 is natively supported, as well as full 6 Gbps SATA. I'm thinking/hoping summer/fall 2012.

    As far as Optimus goes, I'm FAR FAR more interested in seeing it implemented on laptops with the GTX 460 and up. Mostly just the GTX 460, personally, because that's the GPU I want; but there's no reason it can't be on EVERY GPU.

    I see the advantage of switchable graphics on the desktop, especially as a casual video editor. But really, on my wish list that's pretty far down. NVIDIA GPUs do a pretty darn good job of speeding up rendering times all on their own; Intel's way isn't much (if at all) better. And really, with their motherboard failures, constant socket switching, apparent inability to adopt technology that feels old at this point (USB 3.0, SATA 6 Gbps), and price gouging of their customers, I really am about to jump ship and become an AMD-only guy. Intel is just pissing me off lately. My Penryn kicks butt; however, if they don't stop being so greedy and stupid with the sockets and SB USB support, I will buy AMD only out of spite.

    I do still prefer Nvidia GPU's though when I can get them at fair prices. For several reasons.
    Reply
  • Hrel - Wednesday, April 27, 2011 - link

    I have no intention* Reply
  • JarredWalton - Wednesday, April 27, 2011 - link

    No, I meant GPU drivers breaking support with Virtu -- as in, the latest version of Virtu only lists support for the 260.89 and 260.99 drivers from NVIDIA, which are now about four months old. Maybe it still works -- I don't have hardware to test it on -- but the fact that their release notes explicitly state that they added support for the 260.99 drivers makes me nervous.

    PS -- Yes, I edited and wrote some of this article. :-)
    Reply
  • Kristian Vättö - Wednesday, April 27, 2011 - link

    If you look at the Sandy Bridge article, you will see that QuickSync boosted transcoding performance by up to 100%. It was tested against an NVIDIA GTX 460, though, so I don't know how much faster e.g. a GTX 580 would be. If I recall correctly, NVIDIA transcoding also suffered from some quality issues (it's been a while since I read that article, so it may be that it's fixed or I remember incorrectly).

    We are all entitled to make our own choices, though, so if you find AMD better, don't hesitate to buy one. Bulldozer looks very promising :)
    Reply
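For reference, here is what an "up to 100%" boost means in wall-clock terms. The 20-minute baseline below is an assumed example for illustration, not a figure from the review.

```python
# A 100% boost means twice the throughput, i.e. half the transcode time.
baseline_minutes = 20.0   # hypothetical GPU-accelerated transcode time
speedup_percent = 100     # "up to 100%" boost from QuickSync

quicksync_minutes = baseline_minutes / (1 + speedup_percent / 100)
print(quicksync_minutes)  # 10.0
```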
  • ratana - Wednesday, April 27, 2011 - link

    Yawn... I keep all my CPUs and GPUs at 100% Folding, so who cares? Besides, I am in the demographic that doesn't even think about electricity bills. Reply
  • lowenz - Wednesday, April 27, 2011 - link

    A question: is a "reverse" approach possible, like what Lucid Virtu can do?

    Connecting the output to the discrete VGA and virtualizing the IGP for Quick Sync...
    Reply
  • pubjoe - Wednesday, April 27, 2011 - link

    "besides, I am in the demographic that doesn't even think about electricity bills".

    Haha. What demographic would that be? The one who pops to the shops in a hummer or a Lambo?

    Efficiency is ALWAYS a good thing and it's about much more than money (if you're able to stop thinking about it!).

    For one, this technology can dramatically increase the life of high-clocked multiGPU configurations as well as the system they run in.

    But then perhaps the demographic that doesn't care about electricity usage also happily tosses burnt out computers in their local river.

    Well done for folding, but surely you can see that any step toward less wastage of energy and hardware is a very good thing for everybody.
    Reply
  • erple2 - Wednesday, April 27, 2011 - link

    Curiously, if you're not running your desktop at 100%, it's wasting electricity as it is. You should probably have instead bought an extremely low-power machine (one that runs fine on a 150W PSU) plus a serious power machine that has the muscle when you need it. Buying for an outside-chance possibility means cutting too many corners when it comes to efficiency.

    It's a similar argument to the one made about servers: if a server isn't at near-100% utilization, you've wasted money overbuying for a load you don't have.
    Reply
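Back-of-envelope arithmetic for the two-machine argument above. Every wattage and usage figure here is an assumption picked for illustration, not a measurement.

```python
HOURS_PER_DAY = 24
GAMING_HOURS = 3  # assumed daily hours of heavy use

# One machine for everything: the gaming box's high idle power runs all day.
one_box_idle_w, one_box_load_w = 150, 350
one_box_kwh = (one_box_idle_w * (HOURS_PER_DAY - GAMING_HOURS)
               + one_box_load_w * GAMING_HOURS) / 1000

# Two machines: a low-power box for daily use, the gaming box only when needed.
low_power_idle_w = 40
two_box_kwh = (low_power_idle_w * (HOURS_PER_DAY - GAMING_HOURS)
               + one_box_load_w * GAMING_HOURS) / 1000

# Daily energy use in kWh for each setup.
print(one_box_kwh, two_box_kwh)  # 4.2 1.89
```

Under these assumed numbers the split setup uses less than half the energy per day, which is the same saving switchable graphics aims for without the second machine.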
  • pubjoe - Wednesday, April 27, 2011 - link

    I agree.

    Nothing's ever 100% efficient, but steps like this are at least in the right direction.

    For the record, I don't have a beefy gpu by the way.
    Reply
  • ratana - Wednesday, April 27, 2011 - link

    Well, I suppose it is my own demographic, since I don't have any bills to pay and I have some 200 Tesla 2070s (6GB) running 24/7 in my lab, doing DNA matching with a custom C++/CUDA app we built to run BLAST and Smith-Waterman. This is a government lab, so I don't pay the bills; the US taxpayer graciously does. Useful stuff, too: when I can match a DNA sequence between Arabidopsis and mouse in less than a couple of days, I can tell the vaccine guys whether they're on target or not, saving the US money so we can accumulate more adipose cells on the backside whilst filling the vacuum between the ears with re-runs of Serenity and Red Dwarf. Reply
  • pubjoe - Thursday, April 28, 2011 - link

    Righty-ho. Reply
  • bunga28 - Friday, April 29, 2011 - link

    Society is paying for it. Humanity is paying for it. Your children and grandchildren are going to pay for it. Hell, I'm paying taxes; therefore, I am paying for it. Your mentality is like that of Congress: if they don't directly pay for things, then they can be wasteful. When are we, as a society, going to take responsibility (and castrate people like you)? As a famous person used to say, "some people should pro-create." And I think that you are a perfect example of that person. Reply
  • bunga28 - Friday, April 29, 2011 - link

    "some people shouldn't pro-create." Reply
  • chaoticlusts - Wednesday, April 27, 2011 - link

    I've been wanting this for ages... hell, if this had been announced before I bought my HD 6950 it may well have been the tipping point to go NVIDIA instead (as it was, the tipping point was being able to unlock the 6950).

    Also, as a random note, the new Synergy name is kinda hilarious for a power-saving feature... the power company in my state (Western Australia) is named Synergy, so if I get Synergy on my next GPU upgrade, then Synergy will be making Synergy lose money :P
    Reply
  • Neoarun - Wednesday, April 27, 2011 - link

    Well, Optimus really is an awesome implementation; to this day I've never noticed when it switches graphics from the IGP to the dGPU, and it would be great if Synergy works the same way. NVIDIA, as its logo suggests, is going GREEN :) even while nowadays providing room heaters with their graphics cards :) Reply
  • pelov - Wednesday, April 27, 2011 - link

    Yeah, I hope they tackle the issue of massive power consumption by NVIDIA cards under full load as well. They always seem to chew through more power than comparable AMD cards (whereas idle power consumption is about the same).

    Using the IGP (and maybe even an APU?) for non-graphics-intensive tasks is fantastic and a great power-saving measure, but currently the most widely available modern class of NVIDIA graphics cards in mobile devices like laptops is the 400 series, and those things are the pickup trucks of GPUs. The 500 series isn't much better off and still trails behind AMD in wattage. So I guess what I'm saying is: great, but where's the rest?

    Furthermore, Llano is almost here, and we're only going to see better and better graphics offerings in APUs from both AMD and Intel. This just seems like a way for NVIDIA to hang on for dear life in the laptop market without addressing the real issue. It's a great idea, but it's limited in its market and only partially goes after the real problem.
    Reply
  • Den - Wednesday, April 27, 2011 - link

    I'd been planning to replace my original C2D E6600 with SNB when x67 came out but I had not decided yet between AMD and Nvidia graphics. I'd really like to be able to do this though, so that means Nvidia for me.

    Now I wonder though, would it work with SLI?
    Reply
  • Kristian Vättö - Wednesday, April 27, 2011 - link

    If you buy a mobo with Z68 chipset, then Synergy should work with SLI as well (at least according to VR-Zone's slide). Reply
  • hechacker1 - Wednesday, April 27, 2011 - link

    Besides QuickSync (which is nice if you care about speed more than quality), new GPUs idle at around 20-40W.

    I'm guessing activating and switching between integrated and discrete video isn't going to save that much for a desktop user with a modern GPU. Plus, you have to deal with any driver issues that brings, and you have to connect your monitor to the integrated video output.
    Reply
  • JarredWalton - Wednesday, April 27, 2011 - link

    NVIDIA has been doing this with Optimus for over a year, so they've gotten pretty good at it. Is it perfect? No. But 20 to 40W is pretty steep if you're only consuming 100W. My particular gaming system idles at around 185W, and a large chunk of that is the GPU (HD 5870, so this wouldn't help). I'd guess it's at least 60W at idle, considering when I had two 5850 cards I was sitting at 150W idle. (The second CF card can go into a very low power state, apparently.) Reply
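A rough sketch of what parking a ~60 W idle dGPU could be worth over a year, using the idle estimate from the comment above. The daily idle hours and electricity price are assumptions for illustration.

```python
GPU_IDLE_W = 60          # estimated dGPU share of the ~185 W system idle
IDLE_HOURS_PER_DAY = 12  # assumed desktop-on-but-not-gaming time
PRICE_PER_KWH = 0.12     # assumed electricity price, USD per kWh

kwh_per_year = GPU_IDLE_W * IDLE_HOURS_PER_DAY * 365 / 1000
dollars_per_year = round(kwh_per_year * PRICE_PER_KWH, 2)

# Annual energy and cost attributable to the idling dGPU alone.
print(kwh_per_year, dollars_per_year)  # 262.8 31.54
```

Small per-hour wattage adds up to a few hundred kWh a year, which is the desktop case for Optimus-style switching even when the absolute dollar figure looks modest.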
  • Shining Arcanine - Wednesday, April 27, 2011 - link

    When will this be available on Linux? Does Nvidia intend to cede the entire market to ATI and Intel? Reply
  • Slaimus - Thursday, April 28, 2011 - link

    Why does video encoding even use the GPU, like QuickSync does? Why not run it through the much-touted AVX units? Each core should have its own 256-bit AVX unit. Reply
  • DesktopMan - Sunday, May 01, 2011 - link

    Sandy Bridge CPUs have dedicated hardware for video encoding/decoding. Strictly speaking, it's not "using the GPU" the way NVIDIA and AMD do for encoding, even if the functionality is exposed through the GPU drivers. Reply
  • Wolfpup - Saturday, April 30, 2011 - link

    Greaaaaaaat. I don't want Sloptimus on a notebook, and actively avoid notebooks using any switchable graphics (at least with Intel CPUs...I assume AMD can pull it off better with a Phenom II and their integrated graphics).

    Just...swell...that it and Intel's Flaphics are going to infect desktops too.
    Reply
