AMD's Quad FX: Technically Quad Core

by Anand Lal Shimpi on 11/30/2006 1:16 PM EST
POST A COMMENT

87 Comments

Back to Article

  • VooDooAddict - Saturday, December 02, 2006 - link

    It strikes me as odd that you haven't reviewed a Dual Xeon 5300 system.

    While the motherboards aren't geared toward the enthusiast, it would seem anyone who could really benefit from two dual-core Athlon FXs would get incredibly more bang for the $$ out of a Dual Xeon 5300 system, giving them 8 cores on an established and sure-to-be-supported platform.

    I only see two major drawbacks to the Dual Xeon as opposed to 4x4.

    You don't have access to SLI. That's unimportant though as people who want SLI would be better served with an Intel QX6700 on an NVIDIA 680i.

    You need FB-DIMMs. But considering the expense you are already at with any dual-CPU system, and the types of activities you'd need to be doing to get real benefit, ECC probably wouldn't be a bad thing to have.

    I'd love to see a review of a Xeon 5300 system with an 8800 GTX.
    Reply
  • yyrkoon - Friday, December 08, 2006 - link

    To the best of my knowledge, ECC memory is usually slower than non-ECC memory. Also, if I didn't read it here, I read it somewhere else: not only is FB-DIMM memory more expensive, it also does not perform as well as standard DDR2 memory.

    I think, for now, if you're wanting the best of both PC/server worlds, you would have to settle for a motherboard similar to Asus' Workstation or Pro line. However, you would still be stuck with a single CPU with multiple cores, but you would have SLI, PCI-X, and 2-3 PCI 2.3 slots :) As for memory quantity, anything more than 4 GB on a 32-bit OS is a waste (unless you use the right OS with the equivalent of the /PAE boot option, which I understand can cause some system instabilities).
    Reply
  • yyrkoon - Friday, December 01, 2006 - link

    As far as I can remember, using multiple CPUs (as in socket, not core), has always shown diminishing returns. In fact, I think I remember reading something to that effect here on this web site (or maybe it was TH?). Regardless, having several dual-CPU boxen around, and having had the chance to play around with a few quad-CPU systems, I've seen this with my own two eyes.

    I did not start reading this article with the belief that AMD's solution would beat an Intel quad setup, but I did have hopes that AMD's system would be at least feasible. The way I see it, from these results, AMD should have never even bothered with this design, and would have better spent their time working on future technology.

    A few more mistakes like this, and pretty soon the only choice we're going to have is Intel, no matter which company you buy from. We should all be hoping this never happens. My current system is an AM2 system, and I have to say I'm a bit miffed at AMD for leaving customers like myself out in the cold when it comes to performance CPUs. So, since I spent less than $200 on my current motherboard/CPU, I think I'll leave AMD in the dust (before they get the chance to do the same to me) and opt for an Intel system board and CPU for my next upgrade.
    Reply
  • stepz - Sunday, December 03, 2006 - link

    quote:

    The way I see it, from these results, AMD should have never even bothered with this design, and would have better spent their time working on future technology.

    To be fair, the engineering effort was close to zero. They sell almost exactly the same stuff as Opterons, only in shinier boxes and surrounded by a lot more nonsense buzzwords. This product is brought to you mainly by the marketing department.
    Reply
  • fitten - Friday, December 01, 2006 - link

    quote:

    As far as I can remember, using multiple CPUs (as in socket, not core), has always shown diminishing returns.


    Amdahl's Law.
    Reply
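The diminishing returns the commenters describe follow directly from Amdahl's Law: if any fraction of a workload is serial, extra cores only speed up the rest. A minimal Python sketch (the parallel fractions below are illustrative assumptions, not benchmark data):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only part of a workload parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A workload that is 80% parallel tops out well short of 4x on four cores,
# which is the "diminishing returns" effect described above.
print(round(amdahl_speedup(0.8, 4), 2))     # 2.5
print(round(amdahl_speedup(0.8, 1000), 2))  # approaches 5.0, never reaches it
```

So even with perfect hardware scaling, a second socket buys less than the first pair of cores did.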
  • T Rush - Thursday, November 30, 2006 - link

    I'm really not that thrilled to see a brand new Intel CPU barely squeak by a similar config from AMD that is using a CPU architecture over 4 years old.

    I just gave Anand Lal Shimpi's article a closer inspection. We do see an advantage for the new 65nm Intel Core 2 Kentsfield, but it's not as large as I think it should be compared to the 90nm AMD64 K8/Hammer dual cores times two... and the pricing also doesn't reflect the large advantage you would expect to find with a newer, more efficient design: as tested, the two FX-74s are priced the same as the Extreme QX6700; the two FX-72s underprice the coming quad Q6600 by a noticeable amount; and there is no Core 2 quad to compete with the two FX-70s at the lowest $599 for the pair... I would expect the newer tech to not only be a much better performer, but also less expensive.
    ....but to my surprise, Shimpi almost caused me to fall out of my chair when I got to the last page and one of his biggest complaints was the higher price of the competition's motherboard!!!! This to him seems to be a problem for AMD now, but on the Intel systems he has tested before he was willing to overlook it.... here he has a problem with the total system cost (mobo + CPUs) being higher due to the motherboard being over one third more expensive as tested: $370 (which he later even 'rounds up' to $400) for the AMD vs. $260 for the Intel, a $110 difference.
    ....oddly, even in his own widely revered "Tremendous Value Through Overclocking" article, where the focus was in fact "value" (not something I would think is as big an issue when looking at the highest priced/performing/featured systems you could possibly buy today), he was completely unbothered by using a $250 motherboard for the "value" Intel CPUs, as much as $120 more than the $130 AMD board he used, in a case where value was supposed to be a factor... what!!!
    Reply
  • fitten - Friday, December 01, 2006 - link

    Well... the parts that "barely squeak by" are also running at only 80% of the clock frequency... 2.4GHz Q6600 compared to 3.0GHz AMD FX... and Intel's Core2 architecture seems to have plenty of headroom to go to higher clockspeeds on Intel's whim... 3.6GHz overclocks are pretty common with the Core2Duo parts (and the Quad parts are the same exact cores).

    There's no doubt that the Core2 architecture from Intel is the largest CPU performance bump in a number of years... previous years were almost stagnant as we saw 200MHz to 400MHz clock frequency increases on the same cores as the previous year. Core2 was a performance jump that was similar to the days of the Pentium3 timeframe.
    Reply
  • T Rush - Thursday, November 30, 2006 - link

    Was anyone really expecting AMD's "4X4" system, using two CPUs built on the 4- or 5-year-old K8/Hammer architecture, to out-perform Intel's latest release using its new Core2 architecture?... I would hope not.
    ...if anything, Intel should have been able to pull off a much larger victory in price and performance... the only clear major advantage showing the large difference you would expect when comparing very new tech to much older tech was power consumption, which I wouldn't think is that great a concern in any system running four cores and most likely dual high-end video cards (or more) in an "all out" highest-performing/benchmarking gaming desktop system.
    The overall win does go to Intel in this dual dual-core battle... but personally I would have liked to have seen a much broader margin before putting all my eggs in one basket, as many will do.
    Reply
  • cryptonomicon - Friday, December 01, 2006 - link

    That's an incredible amount of power... 450W!! Now that's what I need to keep my room warm this winter :D Reply
  • JarredWalton - Friday, December 01, 2006 - link

    You can overclock an E6300 even higher on a lot of P965 boards than on the $225 ASUS P5W DH. The motherboard chosen was used for consistency in benchmarking, not because it is the recommended "budget overclocking" motherboard for Socket 775. As for the QX6700 barely outperforming the FX-74, there are a couple of things to keep in mind. First, you can't buy the AMD chips yet, whereas the QX6700 is actually available (albeit a bit expensive). Second, the past indicates that 3.0 GHz is pretty clearly the limit of AMD's current chips on 90nm.

    Meanwhile you have QX6700 systems shipping factory overclocked at 3.33 GHz and beyond. Sure, with that level of overclocking, the power draw might actually begin to be similar between the two platforms. However, a 25% overclock of a Core 2 Quad certainly isn't going to close the margins. And given that these are indigenous platforms we are talking about, overclocking is going to be far more common than not I would say.

    I'm not sure on Q6600 pricing either (since it's not available), though I've heard at least one rumor suggest the price has dropped to around $500. Guess we'll wait and see there as well. Basically, we are see Core 2 Quad beat AMD's latest and greatest in almost every test, without even pushing the clock speed limits. In seven months, we can evaluate again, and maybe AMD will have the lead.

    Basically, we should expect the margin of victory to be slightly lower with a single socket quad core going up against dual socket dual core than it was with Core 2 Duo going up against Athlon X2. The dual sockets give more memory bandwidth, so particularly when running a NUMA aware operating system, things should be pretty close when comparing FX-74 against QX6700. AMD was basically able to close the performance gap somewhat by throwing more power and hardware at the problem, sort of similar to what Intel did with Pentium D. It wasn't a good solution when Intel did it, and it's not a good solution now. It is "OK" at best.
    Reply
  • JarredWalton - Friday, December 01, 2006 - link

    Yay for speech recognition!

    indigenous = enthusiast
    we are see = we are seeing
    Scratch a couple "basically" from the last paragraph.
    Reply
  • yyrkoon - Friday, December 01, 2006 - link

    I thought Gary was the "one" with the Texas "Twang" ;) Reply
  • laok - Thursday, November 30, 2006 - link

    The current 4x4 does not look like a mature system to me. Wait until the 65nm 4x4 comes out; hopefully a better chipset will be available by then. 65W x 2 is reasonable; 130W x 2 is kinda too much.

    I'd also like to know how 4x4 compares to a dual dual-core Opteron at the same frequency: performance, power consumption, etc.

    Reply
  • DigitalFreak - Thursday, November 30, 2006 - link

    It's the same chip as the Opteron, with the memory controller changed to work with unbuffered memory. Reply
  • JarredWalton - Friday, December 01, 2006 - link

    The first word of his subject is the best advice: WAIT! Even if the future might get better, does anyone want to spend $1000+ on what may or may not turn out to be the better platform? When the new CPUs are available, then we can evaluate and decide. Of course, once AMD launches their quad core processors, I'm almost certain that our advice will be that most people only need a single socket motherboard and CPU anyway -- if that. Many people still get by with a single core CPU, and the number of people that actually need more than dual CPUs is very small, at least in the desktop space. Reply
  • DigitalFreak - Thursday, November 30, 2006 - link

    This is the best AMD can do against Kentsfield? They get stomped in every benchmark, cost more, and draw nearly twice as much power. No one in their right mind would buy this over a Core 2 Quad. Whoever came up with this product should be fired.

    The last time I laughed this hard at a CPU/platform launch was when Intel rolled out the P4 dual core CPUs, and at least they came out on time in some of the multimedia benchmarks.

    For the record, my current system is running an Opteron 165, so I ain't no Intel fanboy.
    Reply
  • photoguy99 - Friday, December 01, 2006 - link

    You're right, it's actually embarrassing, isn't it?

    If AMD's next-gen architecture improves performance by 30% at same clock, which is huge, they still won't take the lead.

    And it seems Intel is done sitting on their hands, they are working like hell to dominate again by the time K8L ramps up big.

    People hate hearing this, but I think it's over for AMD.

    And like you, for the record, my current system is an FX-60 so I'm also no Intel fanboy.
    Reply
  • DigitalFreak - Thursday, November 30, 2006 - link

    ...came out on top... Reply
  • photoguy99 - Thursday, November 30, 2006 - link

    Seriously, I'd like to know who is selling them... Reply
  • Furen - Thursday, November 30, 2006 - link

    That such a badly engineered product was rushed out to reviewers just to have a paper launch. Did AMD believe that no one would make a big deal about the power draw? Or maybe it expected no one to even look at power draw. I was actually impressed by what AMD had accomplished with 4x4, after all, the 3.0GHz Quad FX parts were close to the QX6700, until I saw the insane power draw. Two loaded FX-62 systems (whole systems, mind you) draw about the same power as 4x4 does IDLE! Reply
  • mino - Friday, December 01, 2006 - link

    If you would bother to read, you would see those IDLE numbers are Without C'n'C.

    Witch C'n'C the IDLE number is be more like 250W than 380W.
    Reply
  • mino - Sunday, December 03, 2006 - link

    Hell, I should REALLY read after myself more thoroughly... Reply
  • JKing76 - Thursday, November 30, 2006 - link

    What have you got against pickup trucks? Reply
  • Genx87 - Thursday, November 30, 2006 - link

    I think it is safe to say Intel has caught AMD with its pants down this round with their Core 2 Duo line of products. Intel's product line is much more compelling, and performance/watt is scary good for Intel.

    Hell Intel's offering must be good, it got me to buy their product for the first time in nearly a decade! ;)
    Reply
  • mino - Friday, December 01, 2006 - link

    Actually, no.

    AMD caught Intel with its pants down in 2003. It took Intel 3(!) years to get back in the game.
    For those 3 years Intel was NOT price competitive.

    Intel has only just caught up, in the middle of 2006; this was to be expected and WAS expected by AMD.

    AMD is about to catch up to Intel after 1 year.
    For this 1 year AMD IS price competitive, hence it is still in the game.

    The 2008 Intel CSI may catch AMD with its pants down. May.

    Actually, by 2008 AMD will have some 30-35% market share and be so well entrenched in the corporate market that some mild performance hiccup (as now) is not gonna hurt them in any serious manner.
    Reply
  • Roy2001 - Thursday, November 30, 2006 - link

    If you need a quad-core/CPU system, Kentsfield is a much better choice, no questions asked. Reply
  • sprockkets - Thursday, November 30, 2006 - link

    Why is it that just putting the other 2 cores on the same package reduces power consumption so much?

    Anyhow, yeah, Intel is ahead, though this would be good for servers, not for desktops. Even so, Intel for now is still better.

    But I've found that for perhaps 90% of all people, an old s754 board with a $45 Sempron works fast enough. I wish Anand would check out the new C7 mini-ITX boards to see how well they work for so little power consumption.
    Reply
  • Furen - Thursday, November 30, 2006 - link

    The QX6700 pretty much draws twice as much power as the E6700; the big benefit of going quad-core in a single system is that you only have one motherboard, one hard drive, one set of RAM sticks, one video card, etc. The 4x4 is horribly engineered; I think even 400W at load is too much for two Opterons at 3GHz. Reply
  • mino - Friday, December 01, 2006 - link

    Two Opterons DO NOT usually drive an 8800 GTX ...
    Two Opterons have a 95W TDP (lower voltage) ... compared to 125W for the FXs
    Two Opterons are available at 68W TDP ...
    Two Opterons are NOT available in a 3GHz flavour ....

    Two Opterons are twice as expensive ....
    Reply
  • Furen - Thursday, November 30, 2006 - link

    The 4x4 motherboard, that is... Reply
  • JarredWalton - Thursday, November 30, 2006 - link

    quote:

    Why is it that just putting the other 2 cores on the same package reduces power consumption so much?

    It doesn't. Core 2 Duo uses less power than the Athlon FX-62, so two of them are going to use less than two FX-62 (or whatever) chips. Now, the second socket also adds extra voltage regulation circuitry, so it will increase the power load, but I don't think the second socket accounts for more than a 20W increase, and probably more like 10W.
    Reply
  • Slaimus - Thursday, November 30, 2006 - link

    The odd thing about this platform is that a single CPU is actually really cheap versus comparable products. If only server boards could take these CPUs. Reply
  • Beachboy - Thursday, November 30, 2006 - link

    I wonder how many diehard AMD enthusiasts will want to split a set of these "quads". Reply
  • mino - Friday, December 01, 2006 - link

    Count me in!

    IMHO enthusiast forums will be full of guys sharing the CPU purchase... :)
    Reply
  • peternelson - Friday, December 01, 2006 - link

    Very likely; I would, and had thought of that, knowing the guys on the forums I frequent ;-)

    The other option is just buy two motherboard/systems and put each of the paired cpus into each one.
    Reply
  • rqle - Thursday, November 30, 2006 - link

    Best case scenario:
    100% price reduction in mainboard
    Assume these FX CPUs perform as well as a Core 2 Quad
    Price it similar to performance
    Major power reduction
    Assume it's a Windows error =/ , no clue why you would run server software and e-commerce instead of software/games on this platform

    I still have a very hard time recommending this setup to an enthusiast. Already having a hard time reaching 3.0GHz, it's going to have a very hard time going even 10% beyond that. The upper limit of AMD CPUs doesn't impress me right now. A cheapo Intel Core 2, with an overclocker in mind, seems to have more potential.
    Reply
  • photoguy99 - Thursday, November 30, 2006 - link

    I generally agree with your logic -

    But even your best case scenario is impossible, because two 90nm-process CPUs have never come close to the power consumption of a single 65nm-process CPU at the same performance.
    Reply
  • mino - Friday, December 01, 2006 - link

    That depends. EE X2s are more efficient than C2Ds, even performance-wise.

    And that's before even comparing idle power with C'n'Q and EIST enabled ....
    Reply
  • Anonymous Freak - Thursday, November 30, 2006 - link

    Of course they'll sell more FX processors now than before. There was literally nothing to differentiate them before, other than clock speed. That, plus now they'll sell two for every computer built with them.

    But, I have a feeling that the FX processors are going to be even more niche than they were before. Before, it was at least a high end normal processor. Someone could buy a midrange system, and upgrade to an FX later. Now, you have to decide up front that you're going to pay a fortune for the computer. Presently, I have an el-cheapo $99 motherboard that I put my old Pentium 4 in. If I want, I can slap a Quad-Core Core 2 Extreme in there. I can't do that with AMD's setup.

    I'm not an Intel fanboy, either. The only reason I even have the Pentium 4 is because a friend gave it to me free when he upgraded his system. I was perfectly happy with my laptop and my AthlonXP 1700+. But a free 3.8 GHz processor is a free 3.8 GHz processor. I went and bought the cheapest motherboard and memory I could find. Spent about $200, and I can upgrade to quad core anytime I want. (Although I'll probably upgrade from the onboard video to a decent PCI-E card first.)
    Reply
  • photoguy99 - Thursday, November 30, 2006 - link

    quote:

    Of course they'll sell more FX processors now than before

    I don't know man, why would they sell any more?

    To sell more someone would have to buy this "Ford Excursion" of a system. But who is going to buy this?

    What boutique shop is going to even sell it?

    Is there one single person here who is planning to get one?
    Reply
  • peternelson - Thursday, November 30, 2006 - link


    I imagine www.Scan.co.uk in the UK will stock both boards and CPUs at realistic street pricing.

    And I am seriously thinking of getting one, to put lots of PCIe cards into, and for use WITH A NUMA AWARE OPERATING SYSTEM.

    Who knows, Asus may even release a bios upgrade for it!

    And also note that there is a similar Asus "Deluxe" board, not just the WS workstation variant, so it may be a little cheaper.

    My other options are a dual socket Opteron board (with expensive memory) or a Core2Quad, using Nvidia 680i chipset which gives less I/O capacity.
    Reply
  • Anonymous Freak - Friday, December 01, 2006 - link

    Or, you could go for a dual socket Xeon system. It would cost near the same, and you'd have the option of two quad-core processors RIGHT NOW. (Rather than late next year.)

    Of course, then you run into more expensive (and more power-hungry) memory. But if you are actually looking at such a system, a Xeon might be a better fit.
    Reply
  • mino - Friday, December 01, 2006 - link

    The problem IS the memory and I/O; these are the strengths of this solution.

    Prices of cheap 4C WS solutions:

    HIGHEST PERFORMANCE 4C: -> systems have comparable performance (Vista/XPx64 in account)
    -------------------------------------------------
    ***Quad FX:
    1pc FX-74 $1000
    1pc MB $350
    4pcs 1G RAM $400
    1pc Fortron PSU $100
    0pcs 8-port SATA controller $0
    ----------------
    Total: $1850

    ***Core 2 Quad:
    1pc QX6700 $1000
    1pc MB $250
    4pcs 1G RAM $400
    1pc Fortron PSU $60
    1pc 8-port SATA controller $150
    ---------------
    Total: $1860

    BEST VALUE 4C: -> systems have comparable performance[except QX6700] (Vista/XPx64 in account)
    -------------------------------------------------
    ***Quad FX:
    1pc FX-70 $600
    1pc MB $350
    4pcs 1G RAM $400
    1pc Fortron PSU $100
    0pcs 8-port SATA controller $0
    ----------------
    Total: $1450

    ***Core 2 Quad:
    1pc QX6700 $1000
    1pc MB $250
    4pcs 1G RAM $400
    1pc Fortron PSU $60
    1pc 8-port SATA controller $150
    ---------------
    Total: $1860

    ***Opteron 2000:
    2pcs 2216 $1200
    1pc MB $400
    4pcs 1G RAM $600
    1pc Fortron PSU $100
    1pc 8-port SATA controller $150
    ---------------
    Total: $2450

    ***Xeon 5100:
    2pcs 5150 $1200
    1pc MB $400
    4pcs 1G RAM $700
    1pc Fortron PSU $100
    1pc 8-port SATA controller $150
    ---------------
    Total: $2550

    IDEAL WS/PC Solution: (best value proposition, upgrade possible when necessary)
    ----------------------------------------------
    1/2pcs FX-70 $300
    1pc MB $350
    2pcs 1G RAM $200
    1pc Fortron PSU $60
    0pcs 8-port SATA controller $0
    ---------------
    Total: $910

    The funny thing being, the biggest value of Quad FX is in the exceptional motherboard. Except for those 10 unrouted USB ports, that board is a dream come true.

    And yes, those 12 SATA3G ports would be a blessing for my storage needs...
    Reply
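The parts lists above are simple sums; a quick Python sketch tallying them (prices exactly as listed in the comment, which are one commenter's estimates, not official pricing):

```python
# Bill-of-materials sums for the builds quoted above (listed prices, USD)
# Order per build: CPU(s), motherboard, RAM, PSU, SATA controller
builds = {
    "Quad FX (FX-74)": [1000, 350, 400, 100, 0],
    "Core 2 Quad":     [1000, 250, 400, 60, 150],
    "Quad FX (FX-70)": [600, 350, 400, 100, 0],
    "Opteron 2000":    [1200, 400, 600, 100, 150],
    "Xeon 5100":       [1200, 400, 700, 100, 150],
    "Ideal WS/PC":     [300, 350, 200, 60, 0],
}
for name, parts in builds.items():
    print(f"{name}: ${sum(parts)}")
```

Whatever the exact street prices turn out to be, the ordering is the point: the dual-socket Opteron and Xeon workstations cost several hundred dollars more than either Quad FX build.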
  • peternelson - Friday, December 01, 2006 - link


    Nice costings, I'm thinking the same here.

    That board has lots of I/O bandwidth.

    Since the Gbit ether ports support teaming (2 muxed together), I would have thought it worthwhile for Asus to implement all four rather than two.

    Four used separately would make for nice routing between LANs, and four used as two teamed pairs would give >1Gbps performance in and out, e.g. as a firewall, as a server with a redundant connection, or to talk to NAS boxes etc. Two extra NICs wouldn't push the power consumption up by much.

    Hopefully Asus (or other) will produce a rev2 board with the extra ethernets.
    Reply
  • Lonyo - Thursday, November 30, 2006 - link

    quote:

    You also get all twelve SATA ports, but there's only support for ten USB ports and two GigE ports. Obviously the number of people that will complain about not having all twenty USB ports and four GigE ports are limited, but with AMD expecting the L1N64-SLI WS to retail for around $370, we wanted all of the bells and whistles.


    There's support for 14 USB ports AFAIK, 4 on the back, and 5 headers on the mobo, which makes 14 I think, which still isn't 20, but it's more than enough, and it's more than 10 :P
    Reply
  • mino - Friday, December 01, 2006 - link

    No, those 2 headers are for IEEE1394... Reply
  • SLIM - Thursday, November 30, 2006 - link

    Is this just a prelim review only talking about the processors, with the real review still to come? I mean, the only reason for this platform to exist is the 4 GPUs, not the CPUs. Let's see 4 8800 GTXs with the FX-74s playing at insane resolutions in games, something Intel simply can't match due to the lack of support for 4 GPUs (at least as far as I know).

    The 4 GPUs are the point of this platform, not power consumption, not 3dsmax... GAMES WITH 4 GPUS! I don't mean to be too abrupt, but the board's got 4 graphics slots for a reason.
    Reply
  • Spoelie - Friday, December 01, 2006 - link

    A single 8800 GTX is already being bottlenecked by a Core 2, let alone two of them in SLI.

    Quad SLI will not have any performance advantage over SLI, as the processing power to feed the cards just isn't there.
    Reply
  • PrinceGaz - Thursday, November 30, 2006 - link

    I dread to think what the power consumption of a system with the FX-74s and four 8800 GTX cards would be. It would probably be close to 1kW, possibly even around 1100W or so. That would be insane! Reply
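A back-of-the-envelope tally supports that guess. All wattages below are assumed round-number TDPs for illustration, not measurements from the review:

```python
# Rough load-power estimate for a hypothetical Quad FX + quad 8800 GTX box.
# Every wattage here is an assumption, not a measured value.
components = {
    "2 x FX-74 (125 W TDP each)":  2 * 125,
    "4 x 8800 GTX (~145 W each)":  4 * 145,
    "motherboard/chipset (~50 W)": 50,
    "RAM, drives, fans (~60 W)":   60,
}
dc_draw = sum(components.values())
print(f"~{dc_draw} W DC, ~{dc_draw / 0.8:.0f} W at the wall with an 80%-efficient PSU")
```

With PSU losses included, the estimate does indeed land in the 1kW-plus range.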
  • JarredWalton - Thursday, November 30, 2006 - link

    Given that this "4x4" initiative is apparently going to use NVIDIA graphics cards, and NVIDIA makes both AMD and Intel chipsets, if they decide to do anything with quad G80 chips you can pretty much guarantee that it will be for both platforms. Thus, 4x4 is really 2x2 right now, whereas the 775 platform is 1x4 and Xeon is 2x4. In the future, AMD will have 2x4, and perhaps all the platforms will support some silly quad GPU configuration.

    Basically, quad SLI was all about the pissing contest. "Oh yeah, well I have four GPUs and you only have two!" Then there was the completely bogus marketing material that they sent out with quad SLI talking about how great the extra GPUs would be for accelerating physics calculations. I suppose it's possible that in the future such applications will become useful, but almost a year after the initial talk of NVIDIA physics acceleration, we have yet to see any actual demonstration of this capability. Anyway, this Quad FX is just the same thing as quad SLI: potentially good marketing, but lackluster final performance and terrible heat and power requirements.
    Reply
  • Viditor - Thursday, November 30, 2006 - link

    quote:

    if they decide to do anything with quad G80 chips you can pretty much guarantee that it will be for both platforms

    If they can...
    The 680a chipset has a direct HT link to each MCP, the 680i obviously can't do that and must bridge through the SPP.

    quote:

    Anyway, this Quad FX is just the same thing as Quad SLI: potentially good marketing, but lackluster final performance and terrible heat and power requirements


    Now if only we could find a review that actually showed that...;)
    Seriously, the one major benefit of Quad FX is that it can run 4 GPUs. While I appreciate all of the conjecture and speculation, it isn't really a test of the facts, is it?
    Reply
  • defter - Friday, December 01, 2006 - link

    <quote>Seriously, the one major benefit of Quad FX is that it can run 4 GPUs.</quote>

    How is that a benefit? You can have 8 GPUs in the same system (AMD or Intel based, it doesn't matter) with a couple of NVIDIA Quadro Plex 1000 Model IIs if money isn't an issue:
    http://www.nvidia.com/page/quadroplex_comparison_c...

    Reply
  • JarredWalton - Friday, December 01, 2006 - link

    Fact: Quad SLI (7950 GX2) works on 590 SLI and 680i.
    Fact: Quad SLI (8800 GTX) does not exist.

    Until the second item changes, we only have the first to go on, which is that current quad SLI works - at least as much as it works anywhere - on both platforms. And the QSLI drivers are still largely broken - you can run benchmarks, but as soon as you start playing lots of games rather than just benching, problems crop up. Neverwinter Nights 2 for example doesn't even run properly with CrossFire or SLI, so let's not even worry about getting QSLI support for now.
    Reply
  • JackPack - Thursday, November 30, 2006 - link

    The 8800 GTX requires two slots, which means it won't fit in the 4x4 motherboard. Quad-SLI performance has already been shown to be poor using two 7950 GX2 cards. Finally, how do you bridge four 8800 cards together? Reply
  • Viditor - Thursday, November 30, 2006 - link

    quote:

    8800 GTX requies two slots, which means it won't fit in the 4x4 motherboard


    Huh?
    Single slot 8800 GTX: http://www.bit-tech.net/hardware/2006/11/08/nvidia...

    quote:

    Quad-SLI performance has already shown to be poor using two 7950 GX2 cards


    This is only when using a single MCP, the 680a uses dual MCPs.
    The 680i uses one MCP and one SPP.

    quote:

    Finally, how do you bridge four 8800 cards together?

    By having 2 sets of bridges (one bridge per MCP).
    Reply
  • JarredWalton - Friday, December 01, 2006 - link

    Quad SLI has problems whether or not you have dual MCPs. It's driver and software related - basically the drivers don't do AFR on a lot of titles and so you end up with lower than 7900 GTX SLI performance.

    As for two slots, they're talking about the width of the cards. They only plug into one slot, but they fill the adjacent slot, so quad 8800 GTX would require eight expansion slots right now. Given that Vista 8800 drivers aren't even out yet, I think NVIDIA has other things to do before worrying about moving beyond SLI'ed 8800 cards.
    Reply
  • PrinceGaz - Thursday, November 30, 2006 - link

    I suppose you could replace the HSF with something smaller which would fit in a single-slot, which would have to mean water-cooling.

    Quad-SLI performance (or lack of) is probably a driver-issue.

    Don't 8800 cards have two SLI sockets therefore allowing you to chain together as many as you like (in theory)?
    Reply
  • casket - Thursday, November 30, 2006 - link

    It appears that with WinXP SP2, this Quad FX stinks. How about Win 2003 or Vista Ultimate? That might change things drastically. Reply
  • Neosis - Thursday, November 30, 2006 - link

    I don't think the problems in the benchmarks are not an opperating system issue. Two processors having totally four cores are not the same as a processor having the same number of cores. Additional latencies will slow down the performance. Reply
  • Viditor - Thursday, November 30, 2006 - link

    quote:

    I don't think the problems in the benchmarks are not an opperating system issue


    Actually, they probably are...Windows XP is not NUMA aware, while Vista is.

    quote:

    Two processors having totally four cores are not the same as a processor having the same number of cores. Additional latencies will slow down the performance


    In this case there is no difference...the Kentsfield has exactly the same latency as a 2 socket dual core because the 2 dual cores on-board don't talk directly with each other.
    Reply
  • Neosis - Friday, December 01, 2006 - link

    quote:

    the Kentsfield has exactly the same latency as a 2 socket dual core because the 2 dual cores on-board don't talk directly with each other.


    However (in my opinion), since all four cores share the same 8MB L2 cache and Intel's memory disambiguation pushes the cores to use the L2 cache more, the additional latencies are not as significant as on AMD's 4x4 platform. But you are right again that connecting the dies through the FSB forces all die-to-die communication to go back through the northbridge and into system memory. That could be a serious performance issue once AMD has competing processors.
    Reply
  • mino - Friday, December 01, 2006 - link

    Kentsfield == 2 Conroes stuck on 1 FSB. They have _separate_ 4MB L2 caches. No 8MB L2 on the horizon. Reply
  • Neosis - Thursday, November 30, 2006 - link

    Where is the edit button?

    The first sentence should be "I think ..."
    Reply
  • Neosis - Thursday, November 30, 2006 - link

    I don't think AMD can compete with Kentsfield even with this platform. Enthusiasts usually don't care about power consumption and heat problems; a water cooling system (with a large radiator and a strong pump) will do just fine. The main concern is neither the power consumption nor the heat. When you install two dual core processors, performance drops due to the increased latency. In nearly all benchmarks Intel is leading. No surprise that only one motherboard manufacturer got on board.

    Even though I'm an AMD user, I don't see any particular reason people will buy this. But I can say why not:
    - no one knows how long AMD will support this platform. In past years AMD has been changing sockets almost every year and a half, and we know Socket AM3 will use DDR3.
    - pricing
    Reply
  • Griswold - Saturday, December 02, 2006 - link

    quote:

    - no one knows how long AMD will support this platform. In past years AMD has been changing sockets almost every year and a half, and we know Socket AM3 will use DDR3.


    Well, the first part isn't quite true, or very precise; as for the second part, we also know that AM3 CPUs will run in AM2 sockets (but not vice versa). On top of that, we're talking about Socket F here, not AMx.

    If you want to name a good reason not to buy this: the other option is just that much better. End of story. If you want quad-core AMD, wait 6 months.
    Reply
  • Gigahertz19 - Thursday, November 30, 2006 - link

    On Black Friday I was at Circuit City, and a store employee near me was telling a woman who was looking for a computer to make sure she buys one with an AMD processor because they're faster and all-around better. I couldn't stand there and let him mislead her, so I went over and told her she should buy a machine with a Core 2 Duo and gave my reasons. Then the Circuit City guy went into this rant about AMD and the 5000+ processor and how it's the best; haha, apparently he hasn't updated his knowledge in quite some time. I could have stood there and argued, but I just said okay and walked away; I didn't want to make a scene... plus, how geeky would that be, arguing over processors in the middle of a store full of customers?

    Anyway, it looks like Intel Core 2 Duo tech is the thing to get. I'm still running an old XP-M overclocked on a DFI Socket A mobo. I want to upgrade to Core 2 Duo sometime soon, probably the Core 2 E6600, only because it has 4MB of cache and the slower models don't. Overclock that baby to 3GHz, which should be a given with the right mobo like the eVGA one, and I'll have an awesome system; I'll probably buy an X1950 XT or Pro for around $250, then upgrade to DX10 when it gets cheaper.
    Reply
  • madnod - Thursday, November 30, 2006 - link

    I am really into AMD and have been buying AMD for the last 4 years, but this time Intel is really pushing ahead.
    There is a major thing Intel is doing these days, and it's really funny to see the way AMD is responding to it; it kind of reminds me of the 3DFX approach: start stacking more of what you already have and hope that things will get better.
    AMD should expedite their transition to the newer CPU design; the current K8 architecture can't keep up with the Core architecture.
    Reply
  • THX - Thursday, November 30, 2006 - link

    Very nice tests. I can't believe the power draw AMD is dealing with here. Reply
  • Ecmaster76 - Thursday, November 30, 2006 - link

    The pin count of AM2 probably isn't an issue. It has as many pins as Socket 940, which can handle multiple HT links and dual channel memory.

    AMD just moved it to the other socket to keep people from buying the bundled CPUs and selling them individually for a profit. The 2.6 GHz model, for example, runs about $100 less per chip than the standard X2 does.
    Reply
  • punjabiplaya - Thursday, November 30, 2006 - link

    Are we going to see updated benchmarks with 64-bit performance and/or Vista, and once there is a BIOS fix for the NUMA issues on the board (not the WinXP shortfalls as far as NUMA is concerned; Vista should take care of those)?
    Just curious.
    Reply
  • Nighteye2 - Thursday, November 30, 2006 - link

    I'm interested in that as well. NUMA will be an important part of 4x4 performance, so why isn't NUMA used in the benchmark, or at least mentioned? NUMA is the advantage of having 2 sockets; having NUMA disabled in this benchmark by using an OS that does not support it unfairly cripples the 4x4's performance.
    Reply
  • Viditor - Thursday, November 30, 2006 - link

    quote:

    NUMA will be an important part of 4x4 performance - so why isn't NUMA used in the benchmark, or at least mentioned

    Agreed...I think that one of the reasons that AMD delayed release of this so long is that they wanted to show it on Vista instead of WinXP. It seems to me that there would be a substantial difference between the 2...
    Reply
  • Viditor - Thursday, November 30, 2006 - link

    As a follow-up on just how important NUMA is for 4x4, check out this review (http://babelfish.altavista.com/babelfish/trurl_pag...), which actually compares the 2...
    There is a DRASTIC difference between performance on XP and Vista!
    Reply
  • Accord99 - Friday, December 01, 2006 - link

    Most of the difference comes from running in 64-bit mode. The extra bandwidth didn't help the FX-74 in the megatasking bench. They didn't do any game benchmarks, but based on past reviews of NUMA, the FX-74 will probably keep on losing to the FX-62 in games. Reply
  • Viditor - Friday, December 01, 2006 - link

    quote:

    Most of the difference is running in 64-bit mode

    I'm not sure I agree...there's a 22.5% increase in performance there, and I haven't seen anything like that from the 64-bit version of 3DS Max before...
    Not to mention that Vista isn't known as a real speed demon (quite the opposite) for these apps...
    What the 64-bit version does is allow for larger scenes and better stability, not so much faster rendering.
    Reply
  • photoguy99 - Friday, December 01, 2006 - link

    quote:

    I'm not sure I agree...there's a 22.5% increase in performance there, and I haven't seen anything like that on the 64 bit version of 3DS Max before...


    Sorry, totally wrong -

    64-bit can make a big difference in performance depending on the app. Remember, you can process 64 bits of data in a typical instruction instead of 32, so theoretically twice as much pixel data at a time for rendering.

    Some apps may not show the full benefit; it depends on how they are coded and compiled, but it's definitely a real potential speedup.

    Bottom line: 64-bit could easily account for a bigger performance increase than NUMA.
    Reply
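    [Ed.: The "twice as much pixel data per instruction" idea above can be made concrete with a SWAR (SIMD-within-a-register) toy sketch. This is purely illustrative, not how 3DS Max or any real renderer is coded; `halve_pair` is a hypothetical helper.]

```python
# SWAR sketch: pack two 32-bit RGBA pixels into one 64-bit word and
# halve the brightness of both with a single shift-and-mask pass.
# The 0x7F bytes in the mask clear any bit that would otherwise leak
# across a byte (channel) boundary during the shift.
def halve_pair(p0: int, p1: int) -> tuple[int, int]:
    packed = (p1 << 32) | p0                      # two pixels, one 64-bit word
    halved = (packed >> 1) & 0x7F7F7F7F7F7F7F7F   # per-byte divide by two
    return halved & 0xFFFFFFFF, halved >> 32

print(hex(halve_pair(0x20406080, 0x10305070)[0]))  # 0x10203040
```

    That said, most of x86-64's real-world rendering gains arguably come from the extra general-purpose and SSE registers rather than wider integer tricks, so the speedup varies a lot by application.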
  • Kiijibari - Friday, December 01, 2006 - link

    quote:

    64-bit can make a big difference in performance depending on the app. Remember you can process 64 bits of data in a typical instruction instead of 32, so theoretically twice as much pixel data at a time for rendering.


    quote:

    I'm not sure I agree...there's a 22.5% increase in performance there, and I haven't seen anything like that on the 64 bit version of 3DS Max before...


    You see that he already refers to 3DS MAX... I have not investigated this, but if he refers to it, then I trust him on that one...

    Furthermore, I miss synthetic Sandra memory bandwidth benches... those should easily show what is going on there...

    Anyway, a 4x4 review without mentioning the XP NUMA problem is just not worth reading... Sorry Anand...

    cheers

    Kiijibari
    Reply
  • Anand Lal Shimpi - Friday, December 01, 2006 - link

    The performance deficit seen when running latency sensitive single and dual threaded applications exists even in a NUMA-aware OS (I've confirmed this under Vista). I'm still running tests under Vista but as far as I see, running in a NUMA-aware OS doesn't seem to change the performance picture at all.

    Take care,
    Anand
    Reply
  • Kiijibari - Saturday, December 02, 2006 - link

    Hi Anand,

    first of all, thanks for your reply.

    Then, if there is really no performance difference, I would double-check in the BIOS that node interleave is really disabled.

    Furthermore, there seems to be a BIOS bug with the SRAT ACPI tables, which are necessary for NUMA. It would be nice if you could dig up some more information on that topic.

    Clearly, that would not be your fault, but AMD's.

    cheers

    Kiijibari
    Reply
  • Anand Lal Shimpi - Saturday, December 02, 2006 - link

    From what I can tell, the Node Interleave option in the BIOS is doing something. Disabling it (enabling NUMA) results in lower latencies than leaving it enabled, but still not as fast as running with a single socket.

    CPU-Z offers the following latencies for the three configurations:

    2S, NUMA On: 168 cycles
    2S, NUMA Off: 205 cycles
    1S: 131 cycles

    From my discussions with AMD last week, this behavior is expected. I will do some more digging to see if there's anything else I'm missing though.

    Take care,
    Anand
    Reply
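    [Ed.: To put those cycle counts in time units: assuming CPU-Z reports latency in core clock cycles, the conversion at the FX-74's 3.0 GHz clock is just cycles divided by GHz. A quick worked sketch:]

```python
def cycles_to_ns(cycles: int, clock_ghz: float) -> float:
    """Convert a latency measured in core clock cycles to nanoseconds."""
    return cycles / clock_ghz

# Anand's CPU-Z readings, at the FX-74's 3.0 GHz core clock:
for label, cycles in [("2S, NUMA on", 168), ("2S, NUMA off", 205), ("1S", 131)]:
    print(f"{label}: {cycles_to_ns(cycles, 3.0):.1f} ns")
# 2S, NUMA on: 56.0 ns; 2S, NUMA off: 68.3 ns; 1S: 43.7 ns
```

    So even with NUMA enabled, the two-socket configuration pays roughly 12 ns extra per memory access over a single socket, which is consistent with latency-sensitive apps slowing down.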
  • BikeDude - Tuesday, December 05, 2006 - link

    Could the reason be that 1GB per memory node is simply too little?

    On a configuration like this, you'll easily see one of the nodes with only 256MB or so left...

    So, put in some more memory! At this point 32-bit XP will be limiting, even for 32-bit apps. (XP won't address more than 2^32 bytes, some of which will be masked by PCI and PCIe devices, and additionally each process only gets a 2GB address space for code & data unless you upgrade to 64-bit Windows.) Also be aware that nVidia ForceWare 80.00 and newer lost PAE support; you'll experience crashes and non-working games if it's combined with a PAE-aware 32-bit OS (such as Win2003). ForceWare 79.11 works fine though.

    (BTW: MSFT added NUMA support in XP SP2)
    Reply
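    [Ed.: The 32-bit ceiling described above is easy to work out. The 512 MB PCI/PCIe aperture below is an illustrative assumption; the actual hole varies by motherboard and video card.]

```python
GiB = 2**30
address_space = 2**32             # everything 32-bit XP can address: 4 GiB
pci_aperture = 512 * 2**20        # assumed space claimed by PCI/PCIe devices
visible_ram = address_space - pci_aperture
user_va_per_process = 2 * GiB     # default 2 GB user space (code & data) per process

print(visible_ram / GiB)          # 3.5  -> RAM actually visible to the OS
print(user_va_per_process / GiB)  # 2.0  -> ceiling for any single 32-bit app
```

    With four 1 GB NUMA nodes, that means a chunk of one node's memory is effectively lost to the aperture before any application even starts.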
  • Kiijibari - Saturday, December 02, 2006 - link

    Hi Anand,

    sounds credible, because there is some extra cache snooping traffic going on. Anyway, please keep us posted if a new BIOS version becomes available, and whether it actually "does" something :)

    Windows scheduler differences between XP and Vista would be interesting, too.
    So far there have only been Win32 XP vs. Vista x64 comparisons; it's not possible to draw a fair conclusion from that data.

    Thanks a lot

    Kiijibari
    Reply
  • mino - Friday, December 01, 2006 - link

    One important question:

    Are those new FX-7x CPUs identical, or is there some differentiation employed?
    Reply
  • Kiijibari - Saturday, December 02, 2006 - link

    identical to what ?

    If you mean Socket F Opterons, then yes, they are identical; if the BIOS allows it, normal Opterons should be able to run in 4x4 boards, too.

    cheers

    Kiijibari
    Reply
  • mino - Sunday, December 03, 2006 - link

    Thanks, that info (if correct) pretty much clears up the FUD. Reply
  • Griswold - Saturday, December 02, 2006 - link

    How so? The 2P+ Opteron IMC wants buffered RAM, while these FX parts do not. I don't think a simple BIOS hack can circumvent that. Reply
  • Kiijibari - Saturday, December 02, 2006 - link

    *gasp*

    Do you really think AMD would engineer, test, validate, etc. a CPU for a niche market??
    There are maybe only a few thousand 4x4 CPUs sold worldwide per month... it would be economically ridiculous.

    But if you don't know anything about business, maybe this will convince you:

    http://www.aceshardware.com/forums/read_post.jsp?i...

    cheers

    Kiijibari

    Reply
