ASUS and Intel are putting together a webcast that they've invited me to attend. The topic of discussion? Sandy Bridge. The webcast will air after Intel's official announcement of Sandy Bridge at 9AM PST on January 5, 2011 at CES.

The discussion will be a conversation between myself, Gary Key (former AT Motherboard Editor, current ASUS Technical Marketing Manager), and Michael Lavacot, an Intel Consumer Field Application Engineer. 

If you have any questions you'd like to see me answer on air or that you'd like me to grill ASUS and Intel on, leave them in the comments to this post and I'll do my best to get them addressed.

Of course we will also have our full review of Sandy Bridge around the same time. 

Update: Intel posted some of the videos from this webcast on its YouTube channel. I tried to answer as many of the big questions you asked as I could in the video and in our Sandy Bridge review.

I'll add links here for more videos as they get posted:

  • Hrel - Wednesday, December 08, 2010 - link

    The Asus N53JF-XE1 is an extremely attractive laptop at $999. When can we expect to see a similar laptop except with Sandy Bridge for under a thousand bucks? Seriously, don't change anything except the CPU, maybe add some more USB 3.0 and up the GPU a little if it's in the budget. Everything about the laptop itself seems amazing for the money.

    I like that it doesn't FORCE me to buy an expensive quad-core i7 just to get a halfway decent GPU. I personally have very little interest in quad core; it helps little, if at all, in games. A hyperthreaded dual core is more than enough, and while a quad would be nice to have, it's not worth the price premium. Not even close.
  • DanNeely - Wednesday, December 08, 2010 - link

    A lot of the dual- vs. quad-core question will come down to available clock speeds and prices. SB is supposed to push quad core farther into the mainstream, so some reasonably priced quads seem likely, which makes it a question of what the clock-speed differences will be. A quad with two cores gated shouldn't use any more power than a dual with both cores running, but Arrandale had a fairly large penalty there on the reasonably priced chips; if SB does better, there might not be much reason to go dual unless you're getting a very budget system. The fact that dual-core models only outnumber quads 2:1 in the leaked mobile lineup, versus 6:1 for Clarkdale/Arrandale, seems to imply that quads will occupy most of the slots currently taken by faster dual-core chips.
  • K1rkl4nd - Wednesday, December 08, 2010 - link

    The HTPC market has been waiting for 3D-capable HDMI 1.4, but now we're finding our current receivers don't like a 1.4 signal. High-end Blu-ray players offer an HDMI 1.4 connection to your TV for video and an HDMI 1.3 connection to carry audio to your standalone receiver. Is there going to be an easy way to implement this with Sandy Bridge setups, or are we going to get stuck with measly 5.1 performance through our optical cables, locking us out of DTS-HD Master Audio unless we buy this year's flavor of receiver as well?
  • Wiggy McShades - Wednesday, December 08, 2010 - link

    In a scenario where the GPU is running full tilt and you want to multitask with another memory-bandwidth-intensive task, how much of an impact can we expect from using the GPU? Basically, are memory access requests from the CPU and GPU balanced when the combined bandwidth required is larger than what is available, or does one get precedence over the other?
  • Catalina588 - Wednesday, January 05, 2011 - link

    The CPU and GPU share (compete for) the level 3 cache. That's good when the CPU hands physics off to the GPU, but it creates contention when the two are doing something completely different.

    My understanding is that the OS, as usual, is the traffic cop, not the chip. However, you can upclock and downclock the GPU in the BIOS. (Yes, I know that's crude.)
  • GeorgeH - Wednesday, December 08, 2010 - link

    1) How long will LGA-1155 last?

    2) Why did Intel need to go with a new socket? ASRock has made an LGA-1156 P67 motherboard; are their engineers smarter than Intel's?

    3) If Intel is going to release a new socket every 12 months, why should I spend lots of money on a fancy motherboard? Especially with overclocking locked down the way it is, wouldn't it make much more sense to buy the cheapest, most disposable motherboard-and-CPU combo that'll still get the job done, and spend the savings on quality network, audio, USB 3.0, and whatever other add-on PCIe cards have traditionally helped differentiate high-end motherboards?

    4) Will there ever be more than 4 cores on 1155, or will it live and die as nothing more than a Lynnfield/1156 speed bump?

    5) Why should I bother with LGA-1155 instead of waiting for Sandy Bridge on LGA-2011? With next-gen SSDs already rumored to be pushing the limits of SATA 6Gbps, will an LGA-1155 motherboard's PCIe lanes be completely saturated with PCIe SSDs, Light Peak cards (which will exist, right?), and GPU traffic long before the performance of the CPU itself is an issue?

    6) Why should I buy Sandy Bridge now, before we know what Bulldozer is capable of?
  • DanNeely - Wednesday, December 08, 2010 - link

    #4 is probably no. Current Intel CPUs need half a channel of DDR3 per core to avoid bottlenecking, and I don't see Sandy Bridge changing that calculation.

    My gut feeling is that Ivy Bridge will either bring mainstream-ish hex-core chips via a resurrected LGA-1356 socket, or a new DDR4 socket (LGA-1154?). The 2012 ETA for DDR4 makes that possible, although it seems questionable that Intel would launch DDR4 on a lower-end platform first, since the initial supply will almost certainly be tight and pricey.
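    For readers wondering where the "half a channel per core" rule of thumb leads, here's a minimal back-of-envelope sketch in Python. The DDR3-1333 data rate and dual-channel configuration are illustrative assumptions, not figures taken from the comment above:

    ```python
    # Back-of-envelope DDR memory bandwidth per core (illustrative figures).
    def channel_bandwidth_gbs(transfers_mt_per_sec, bus_width_bits=64):
        """Peak bandwidth of one DDR channel in GB/s (MT/s * bytes per transfer)."""
        return transfers_mt_per_sec * 1e6 * (bus_width_bits / 8) / 1e9

    ddr3_1333 = channel_bandwidth_gbs(1333)   # ~10.7 GB/s per channel
    channels = 2                              # assuming a dual-channel platform
    cores = 4                                 # half a channel per core -> 4 cores
    per_core = channels * ddr3_1333 / cores   # ~5.3 GB/s per core

    print(f"{ddr3_1333:.1f} GB/s per channel, {per_core:.1f} GB/s per core")
    ```

    By this arithmetic, a dual-channel platform tops out around four cores before the per-core share of bandwidth starts shrinking, which is the constraint being invoked against a six-core part on LGA-1155.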
  • Agamemnon_71 - Thursday, December 09, 2010 - link

    #2 Greed.

    #3 I don't buy high-end mobos anymore. A total waste of money.
  • anactoraaron - Wednesday, December 08, 2010 - link

    Will the chip be able to "power gate" the GPU when it's not being used, essentially powering off the GPU when a dedicated card is in use? Is that in the works for Ivy Bridge? It would be great for allowing additional "turbo" headroom on desktops and as a battery saver in the notebook space.

    Oh, and will the GPU have different operating frequencies for notebook versions? I.e., clock down when only viewing web content and go full power when watching HD movies?
  • dlwilliams21 - Wednesday, December 08, 2010 - link

    The new socket, LGA-1155, will support Sandy Bridge. Will that socket stick around for Ivy Bridge?

    How soon can we expect to see the UEFI interface on ASUS motherboards?
