ASUS and Intel are putting together a webcast that they've invited me to attend. The topic of discussion? Sandy Bridge. The webcast will air after Intel's official announcement of Sandy Bridge at 9AM PST on January 5, 2011 at CES.

The discussion will be a conversation between myself, Gary Key (former AT Motherboard Editor, current ASUS Technical Marketing Manager), and Michael Lavacot, an Intel Consumer Field Application Engineer. 

If you have any questions you'd like to see me answer on air or that you'd like me to grill ASUS and Intel on, leave them in the comments to this post and I'll do my best to get them addressed.

Of course we will also have our full review of Sandy Bridge around the same time. 

Update: Intel posted some of the videos from this webcast on its YouTube channel. I tried to answer as many of the big questions you asked as I could, either in the video or in our Sandy Bridge review.

I'll add links here for more videos as they get posted:

223 Comments

  • Stuka87 - Wednesday, December 08, 2010 - link

    I second both. How much power will the GPU use when it's not in use, and do they have plans for a standard for switching between their HD Graphics and a discrete GPU? Reply
  • Full Ctrl - Thursday, December 09, 2010 - link

    Wouldn't the switching be a bit more complicated on desktops? I'm thinking specifically of cabling: Would you need to have your monitor plugged into both the on-board (Intel) video and separately to the discrete card?

    Not that I don't think it's a great idea from a power consumption standpoint, but it sounds like it would require extra cabling.
    Reply
  • Nataku - Thursday, December 09, 2010 - link

    I think they can just bypass the discrete card's GPU and route output straight to its ports, eliminating the need for two cables to be plugged in... then again, I'm just guessing.

    My question is whether this is the last socket change they'll make for a long time... the current socket seems to have died a little too quickly, causing some uproar.
    Reply
  • davmat787 - Sunday, December 19, 2010 - link

    This is where Microsoft comes in, to help the other companies work together to ratify a new mutually beneficial spec. An official spec and API for GPU switching would be great. Reply
  • MrSpadge - Thursday, December 09, 2010 - link

    Seconded!

    If we can get Optimus-style switching, ideally with both ATI and nVidia cards, that would make the IGPU really valuable in terms of power savings, even for power users who need a real GPU anyway.

    And there's another point, closely linked to this one, which would be in Intel's interest to push but is not directly in their hands: my main rig feels choppy because the GPU is crunching BOINC in the background. What's that got to do with SNB? Simply this: if I could use the IGPU to drive my display, it wouldn't matter what my crunching ATIs or nVidias were doing in the background; the desktop, videos, etc. would still be smooth. Given the increasing focus on GP-GPU, such scenarios are likely to become common, and an IGPU for free would be a nice solution, independent of when GPU scheduling, time slicing, and partitioning eventually arrive.

    The discrete GPU might even use a device driver (like the Teslas) instead of a graphics driver, enabling faster access to it as co-processor. If Intel pushed MS and AMD/NV to enable such solutions more people would be inclined to upgrade to a GPU-enabled CPU.
    Reply
  • jiffylube1024 - Friday, December 17, 2010 - link

    Building on that, how much power does the integrated GPU use out of the typical Sandy Bridge thermal envelope of 95W (TDP)?

    How much will power consumption/TDP go down (if at all) with integrated graphics disabled and a discrete PCI-e video card installed?
    Reply
  • mlavacot - Thursday, January 20, 2011 - link

    It is difficult to put a number on how much of the TDP is reserved for the processor graphics and how much is for the CPU, since both change frequency and load depending on what they are doing. If you use a discrete card and are not using the processor graphics at all (examples: a desktop with an add-in card and nothing plugged into the processor graphics connector, or a laptop in a non-switchable configuration), the processor graphics is power-gated off.

    If the processor graphics is power-gated off, all of the processor's TDP headroom goes to the CPU, so the TDP does not change. The overall processor average power will drop when the graphics is power-gated off; I just don't have a number for you. But I can tell you that adding a discrete card will use much more power than the processor graphics.
    Reply
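The TDP-sharing behavior Intel's engineer describes above can be sketched as a toy budget model. This is purely illustrative: the 95W figure comes from the thread, but the 12W graphics draw and the `cpu_power_budget` helper are hypothetical, not Intel specifications.

```python
# Toy model of the TDP-sharing behavior described above: the package TDP
# is a fixed budget shared by the CPU cores and the processor graphics;
# when the graphics is power-gated off, the CPU may use the full budget.

PACKAGE_TDP_W = 95.0  # Sandy Bridge desktop TDP mentioned in the thread


def cpu_power_budget(gpu_active: bool, gpu_draw_w: float = 0.0) -> float:
    """Return the power headroom available to the CPU cores.

    gpu_draw_w is an illustrative figure, not a published number.
    """
    if not gpu_active:
        # Processor graphics power-gated off: the whole budget goes
        # to the CPU, so the package TDP itself does not change.
        return PACKAGE_TDP_W
    return PACKAGE_TDP_W - gpu_draw_w


print(cpu_power_budget(gpu_active=False))                   # 95.0
print(cpu_power_budget(gpu_active=True, gpu_draw_w=12.0))   # 83.0
```

The point of the model is the first branch: gating the graphics off doesn't lower the rated TDP, it just hands the entire envelope to the cores, which is why average power (not TDP) is what drops.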
  • Filiprino - Wednesday, December 08, 2010 - link

    What will the encoding capabilities be? I have read a discussion on the doom10 forums between an x264 developer and an Intel developer, and I would like to know what we can expect from Sandy Bridge on this topic.
    Will there be only H.264 encode acceleration, or will it be capable of more generic acceleration via API calls?
    Reply
  • Nehemoth - Wednesday, December 08, 2010 - link

    Maybe on another occasion you could ask ASUS why they don't release an ADSL modem with wireless and four gigabit ports.

    A convergence router/switch/modem of this category is really desirable.

    I don't want to have a modem/wireless router and a gigabit switch for these things.

    Thank you
    Reply
  • tntomek - Wednesday, December 08, 2010 - link

    The N series (e.g., the N53JQ-A1) are some of the prettiest notebooks out there, and I'll be pulling the trigger and voting with my $ come January and SB. I hope ASUS will offer a mid-to-high-end product that goes beyond an expensive CPU. If one of your most expensive units offers only a 15.6" HD (1366x768) LED screen, you have mental issues. Reply
