Intel X38: Where's the Beef?

by Gary Key on 10/10/2007 3:15 PM EST
14 Comments

  • Lord Evermore - Sunday, October 14, 2007 - link

    quote:

    Real bandwidth per lane will be up to 4Gb/s (estimated to be around 500MB/s per lane/pin on average in current testing) in each direction


    Estimated? In testing? 4Gbps is exactly 500MBps, given the encoding used. If you mean that actual performance and throughput get close to the theoretical maximum, that's an odd way to put it. You could also say that SATA's real burst bandwidth has been estimated and tested to be "around" 300MBps.
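    The arithmetic behind this point is straightforward: PCIe 2.0 signals at 5 GT/s per lane, and 8b/10b encoding means only 8 of every 10 bits carry data, so 4 Gb/s of payload, i.e. exactly 500 MB/s per direction. A minimal sketch in Python (the function name is illustrative, not from the article):

    ```python
    # Per-lane PCIe bandwidth with 8b/10b encoding (80% efficiency).
    def lane_bandwidth_mbps(line_rate_gtps, encoding_efficiency=0.8):
        """Usable bandwidth per lane, per direction, in MB/s."""
        data_rate_gbps = line_rate_gtps * encoding_efficiency  # strip 8b/10b overhead
        return data_rate_gbps * 1000 / 8                       # Gb/s -> MB/s

    # PCIe 1.x: 2.5 GT/s -> 2 Gb/s payload -> 250 MB/s per direction
    # PCIe 2.0: 5.0 GT/s -> 4 Gb/s payload -> 500 MB/s per direction
    ```

    So "around 500MB/s" is not an estimate at all; it is the exact per-lane figure the encoding dictates.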
    Reply
  • IntelUser2000 - Thursday, October 11, 2007 - link

    Now there are other reviews of the X38 on the web, and here are two that I find interesting:

    http://www.firingsquad.com/hardware/gigabyte_x38_d...
    http://www.ocworkbench.com/2007/gigabyte/GA-X38-DQ...

    The Crossfire benchmarks seem to show a significant advantage for X38 over P35.
    Reply
  • DigitalFreak - Thursday, October 11, 2007 - link

    quote:

    The Crossfire benchmarks seem to show a significant advantage for X38 over P35.


    Of course they do. P35 boards run Crossfire in an x16 + x4 config, while X38 is x16 + x16. The only exception is an Asus P35 board hacked to run x8 + x8 via a PCI-E switch.
    Reply
  • IntelUser2000 - Friday, October 12, 2007 - link

    Now I am thinking Intel shouldn't have bothered to put a faster memory controller on the X38 at all.

    - Average performance increase has so far been less than 2%. In some tests that's within the margin of error!
    - X38 adds 10 or so more watts over P35.
    - X38 is also rumored to have a 140mm² die.

    A theoretical 9% improvement, for a motherboard that costs $100 more in some cases. X38 adds 10 or so wasted watts and a new die, for nothing. All they had to do was take a P35 chipset and add the ability to do 2x PCI-E x16.

    In fact, chipset manufacturers, not just Intel, shouldn't bother with it. There used to be a time when high-end chipsets were actually faster: the 875P with PAT offered 3-5% real-world performance increases over the 865 chipsets.

    Luckily, by the time Nehalem is out with an integrated memory controller (hopefully at least on all the mainstream versions), this stupidity should be over. An IMC will do far more than any futile advancements on external chipsets ever will.
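    The value argument above boils down to performance per dollar. A rough sketch, using the commenter's own figures (~2% average gain, roughly $150 vs $250 boards; these are the thread's estimates, not measured data):

    ```python
    # Performance-per-dollar comparison using the rough figures from the comment.
    def value_ratio(price_usd, relative_perf):
        """Normalized performance per dollar."""
        return relative_perf / price_usd

    p35 = value_ratio(150.0, 1.00)  # assumed ~$150 board, baseline performance
    x38 = value_ratio(250.0, 1.02)  # assumed ~$250 board, ~2% faster on average

    # X38 delivers only ~61% of the performance-per-dollar of P35
    # under these assumptions.
    ratio = x38 / p35
    ```

    On those numbers, the X38 premium buys a small single-digit gain at a steep relative cost, which is the crux of the complaint.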
    Reply
  • avaughan - Wednesday, October 10, 2007 - link

    On page one you mention improved "4x1GB compatibility". Do you test 4x2GB? 2x1GB+2x2GB? Reply
  • wingless - Wednesday, October 10, 2007 - link

    Under the list of features these chipsets have, it states that Intel chipsets ONLY support Crossfire and not SLI. If they supported both, I would imagine they would list both as supported, not just Crossfire. If Intel only supports AMD's multi-card setup, then that's a big win for ATI. Jeez, Nvidia had better whore out their SLI technology, because I know a lot of people that run Nvidia only. Reply
  • mongo lloyd - Wednesday, October 10, 2007 - link

    Any Intel board sold is an AMD CPU/platform not purchased. Reply
  • microAmp - Wednesday, October 10, 2007 - link

    I can't get to page 2; the link looks OK, but it just dead-ends for me and takes me to search.anandtech.com.

    I also tried the print page, but it came up empty. :(
    Reply
  • microAmp - Wednesday, October 10, 2007 - link

    Ignore me, worky now. Reply
  • 8steve8 - Wednesday, October 10, 2007 - link

    I have a few questions:

    1. Will the Intel X38 desktop board be overclockable in any usable sense?

    2. Is it true the G35 can only talk to the ICH8 (and not the ICH9)? This seems odd...

    3. I guess if you were going to buy an uber-high-end P35, this will be a great chipset to look forward to... but who's dying to spend $250+ on a motherboard for a few percent improvement in the real-world user experience, when you can get perfectly fine P35/G33 motherboards for ~$120 that overclock to around 425MHz? Especially when Nehalem, with its integrated memory controller (which won't work in the X38), will likely wipe the floor with anything you buy today from Intel.

    For me the choice is clear: save $150 and go with a decent G33/P35/G35 board; then I'll be better able to afford Nehalem next year, which will make this system irrelevant anyway... Penryn is a nice improvement, but we are talking ~5% per-clock improvements and a smaller power envelope; don't expect anything huge.


    Who's spending $300 on a motherboard? And who, in their right mind, would build a system with DDR3 right now?

    We are all eagerly awaiting G35 in the channel.
    Reply
  • Owls - Wednesday, October 10, 2007 - link

    The odd thing is why CF is supported but not SLI. I'm guessing Nvidia is not giving this up to Intel? Reply
  • JarredWalton - Wednesday, October 10, 2007 - link

    SLI is an NVIDIA technology, and it is currently limited to only running on NVIDIA chipsets (by the SLI drivers - not by anything else). There are some exceptions - SLI notebooks for example use 945GM chipsets from Intel - but what it basically amounts to is that no one has thrown enough money at NVIDIA to get them to open the standard.

    Now, there are a few things to consider. First, CrossFire often appears to require additional driver tweaks to run on new chipsets - X38 today, P35 previously, etc. If NVIDIA opened support, we might see additional compatibility problems on other chipsets. That would require more effort from their driver team, presumably, so perhaps they are just trying to keep from overextending.

    More likely, however, is that NVIDIA likes being able to push their own chipsets with their graphics hardware. Probably the chipset division doesn't want to become marginalized by opening SLI support to others. This is a bit odd, though, as the profit margins on $300+ graphics cards are much higher (for NVIDIA) than on $100-$200 motherboards. Given how many people have P965 and P35 chipset boards (and even some AMD chipset boards), NVIDIA could probably sell a reasonable number of additional GPUs if they would open up SLI support for these platforms. Who cares about the money from Intel if it means you can gain a bigger advantage over AMD/ATI and still make $100 per GPU sold (or whatever it is they get paid per GPU chipset)?

    Well, obviously some higher-ups at NVIDIA care, but I can't help but wonder if they're wearing blinders. Nothing like cutting off your nose to spite your face.
    Reply
  • lopri - Thursday, October 11, 2007 - link

    Disagreed. SLI has been, and will remain, the thing for NV for the foreseeable future. I would go as far as to say that one of the biggest factors that drove ATI out of the business was NV's platformization strategy based on SLI and ATI's struggle to catch up.

    Although the number of people who actually run dual GPUs in tandem might be small, SLI has more meaning to it (just like brand-name value or trademarks). Now that Intel is getting ready for the high-end discrete GPU market, NV won't give up SLI without a significant return.
    Reply
  • 8steve8 - Wednesday, October 10, 2007 - link

    Oh yeah, and we are awaiting the triple play of AMD products being announced in November... Is it still thought that only an extreme version of Phenom will be available in the channel before Xmas, or will there be a Q6600-class product from AMD before Xmas? Reply
