Single Channel vs. Dual Channel Memory
aka Why Apple uses 2x4GB as a base option for the iMac

As a result of this review, Lenovo and I have spoken back and forth regarding the DRAM situation on the P300. In sampling the media with the P300 series, their goal would seem to be to show off their default Haswell Xeon workstation and gather feedback. Due to the nature of the review process, we only test the one configuration that is sent to us, so conclusions about performance could plausibly have been quite different if a separate configuration had ended up on our doorstep.

The issue at hand is that the system, as per Lenovo’s system configurator, ships with only one module of DRAM by default. This happened to be the configuration that was sent out to the media for review. For anyone outside the technology industry this might not sound like an issue, but anyone who has delved into how a computer works should see it as an immediate yellow flag, especially when paired with a dual channel processor.

To cut a long explanation short, a single memory module in a system with a dual channel processor means that workloads involving a lot of memory accesses run below the peak potential of the processor. A dual channel processor can issue a command that reads or writes memory, and the system can access each memory channel independently and at the same time, thereby halving the time to read or write a large block of data. Haswell processors also support up to two memory modules per channel, which gives four DRAM slots in total.
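As a rough back-of-the-envelope illustration (assuming the DDR3-1600 memory this system ships with and ignoring real-world efficiency losses), the peak theoretical numbers work out as follows:

```latex
% Peak theoretical bandwidth for DDR3-1600 (64-bit channel, 8 bytes per transfer)
\[
1600~\text{MT/s} \times 8~\text{bytes} = 12.8~\text{GB/s per channel}
\]
\[
\text{Dual channel: } 2 \times 12.8~\text{GB/s} = 25.6~\text{GB/s}
\qquad
\text{Single module: } 12.8~\text{GB/s}
\]
```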

Lenovo’s initial response to my concerns was a little terse. Rather than acknowledging a potential problem with performance, the issue was redirected towards a potential benefit: by offering a single 8GB DRAM module by default, rather than a 2 x 4GB arrangement, users could upgrade the system to a maximum of 32GB at a later date by adding more 8GB modules.

There was another disagreement regarding price. Again, I suggested that a 2 x 4GB option should be considered, although there was no way for me to select this in the configurator.

Should a user want to take advantage of dual channel operation, using two modules of the same size to keep performance optimal, it would require an extra $395 outlay. This is despite the fact that the module itself costs as little as $70 over at Newegg, making for a 464% markup over the retail cost. One response on this issue was that Lenovo also acts as an OEM with resellers and did not want to compete with them on price. The reality for consumers is that reducing that $395 to $125 would still be a bit expensive, but more in line with what a prosumer might expect.
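For reference, the 464% figure comes from comparing the upgrade price against the retail price of the module itself:

```latex
\[
\frac{\$395 - \$70}{\$70} \approx 4.64
\;\Rightarrow\;
\text{roughly a 464\% markup over the \$70 retail price}
\]
```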

If Lenovo wants to offer a 1x8GB configuration with this system, then by all means do so, but please adjust the default option to 2x4GB. That way, if a prosumer or a company’s IT department wants to purchase a number of systems and has to manually select 1x8GB, they are more likely to know what they are doing performance wise. Even better would be a disclaimer on the website notifying the user of the potential loss in peak performance that comes with that choice.

As most of our readers know, here at AnandTech we do not tend to make suggestions without some hard data to back them up. To this end, we ran the P300 through our benchmark suite twice – once in the default 1x8GB configuration, and again in a 2x4GB configuration using some Kingston memory we had in from a previous review. Both sets of modules were verified at the same speed (DDR3-1600 CAS 11), and the results compare as follows, with percentage improvements above 5% listed below.

A number of benchmarks show complete indifference to the change from single channel memory to dual channel memory. However, there are a number of key prosumer applications that are heavily affected by the dual channel arrangement:

x264 4K encoding: +30.5%
Fluid Dynamics: +30.0%
OpenSSL: +27.6% / +25.4%
PCMark8 Storage: +19.2%
SYSMark Media: +10.3%
Pro/Engineer: +9.0%
Compression: +8.7%

While it is true that not everyone will be using this sort of software, Lenovo could easily remove a potential negative by offering a dual channel arrangement by default.
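For readers who want a quick sanity check on their own systems, a minimal STREAM-style copy test is enough to expose the gap in sustained bandwidth between single and dual channel configurations. The sketch below is not the benchmark suite used in this review, and the array size, repeat count, and file name are arbitrary choices; it simply keeps the working set well outside the CPU caches and times large copies.

```c
/* memtest.c - a minimal STREAM-style copy test (rough sketch, not our benchmark suite).
 * Build: gcc -O2 memtest.c -o memtest   (older toolchains may also need -lrt) */
#define _POSIX_C_SOURCE 199309L   /* for clock_gettime */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define N    (64 * 1024 * 1024)   /* 64M doubles = 512 MB per array, far larger than any cache */
#define REPS 10

int main(void)
{
    double *a = malloc(N * sizeof(double));
    double *b = malloc(N * sizeof(double));
    if (!a || !b) { fprintf(stderr, "allocation failed\n"); return 1; }

    /* Touch both arrays once so the pages are actually mapped before timing. */
    memset(a, 0, N * sizeof(double));
    memset(b, 1, N * sizeof(double));

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int r = 0; r < REPS; r++)
        memcpy(a, b, N * sizeof(double));           /* each pass reads b and writes a */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs  = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double bytes = 2.0 * N * sizeof(double) * REPS; /* read + write traffic per pass */
    printf("Sustained copy bandwidth: %.2f GB/s (checksum %g)\n",
           bytes / secs / 1e9, a[0]);               /* printing a[0] keeps the copies from being optimized away */

    free(a);
    free(b);
    return 0;
}
```

Run once with a single module installed and again with a matched pair in the correct slots; the dual channel result should show a noticeably higher sustained figure, which is the effect driving the benchmark gains listed above.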

The title of this page mentions Apple’s newest iMac, the model that comes with the 5K screen. I bring it into the picture because it shows a level of attention to DRAM selection and perhaps a different mindset when it comes to hardware choices. Navigating to Apple’s configurator gives the following options for DRAM:

Apple gives three options with their Haswell processors: 2x4GB, 2x8GB or 4x8GB. This means that Apple will only sell you an iMac in a dual channel configuration, rather than any lopsided arrangement. By limiting the options offered to its user base, Apple ensures a level of performance that maximizes the potential of the system regardless of what it is used for.

Ultimately, using a single module configuration might invoke a little of the Dunning-Kruger effect – if users do not know their system is not running at its full potential, there won’t be any complaints. Unfortunately this skews the reality of the situation; I’m sure that if up to 30% extra performance were on offer, most users would take it, depending on the price.

Offering a 2x4GB option by default is something Lenovo could do very quickly, and it would cost them very little in the process, although it might mean missing out on potential sales of 8GB DRAM modules at $395 each.

Comments

  • Gigaplex - Wednesday, November 05, 2014 - link

    With what GPU? That's where the bulk of the costs went.
  • SuperVeloce - Tuesday, November 04, 2014 - link

    What the hell is this with non-standard 24pin power motherboard connector? Are they out of their minds?
  • DanNeely - Tuesday, November 04, 2014 - link

    Dunno; OTOH the target market for something like this would never service it except via warranty so it wouldn't matter; and the 24pin ATX connector is really out of sync with modern system needs. Specifically, the 5x 3.3V and 5x 5V lines are way over what a modern system needs, and unless you're doing RS232 the -12V is useless too. 3.3V is going the way of the dodo since it's only used by legacy PCI now, and USB doesn't need anywhere near as much 5V as a P1 system does. Splitting the connector would theoretically help with cable management by making the bundles slimmer, and adding more spread-out connection points on the board makes it easier to maintain stable voltages everywhere.
  • Gigaplex - Wednesday, November 05, 2014 - link

    Dell have started doing this again recently too. My previous Dell workstation is far more flexible than my current one. I'm also not happy with the lack of air flow over the hard drives. I've had several drive failures that might be attributed to overheating. This Lenovo looks like it has similar issues but at least it has a fan on the front.
  • edzieba - Tuesday, November 04, 2014 - link

    I recognise that Al heatsink! Lenovo plonk them on the secondary CPU in the C20 and C30 too.

    I really wish Lenovo would invest in backplanes for their drives, but at least the sideways mount with connectors facing you is better than the horrific mess at the bottom of the C20/30.
  • TETRONG - Wednesday, November 05, 2014 - link

    Completely pointless system especially if it's non ECC memory.

    The truth is that you could build a system that would crush this with an overclocked i5 and a 970 for half the price + it would be upgradeable and running DDR4.

    There's nothing magical about Xeons and Quadros..total bullshit unless you absolutely need DP.
    Anandtech should build the aforementioned system to embarrass all these clowns.

    They could even hackintosh it to piss on the Mac Pro.
  • TETRONG - Wednesday, November 05, 2014 - link

    Sorry, DDR3 with i5 or DDR4 with a 5820K
  • nwai2208 - Wednesday, November 05, 2014 - link

    A Xeon E3 system with non-ECC memory means it is just an i7 machine with a Xeon label on it.
  • NanoTube1 - Wednesday, November 05, 2014 - link

    To sum it up: a poor, ugly, cheap build...
  • Dr.Neale - Wednesday, November 05, 2014 - link

    I would use 2 or 4 sticks of Samsung 8GB DDR3L-1600 1.35V ECC UDIMM, model M391B1G73BH0-YK0, which go for $90 at oemPCworld.

    But then again, I would also roll my own using an ASUS P9D WS motherboard (Intel C226 chipset, ATX, supports ECC, unlike ASUS Z97 WS) and an AMD FirePro W7100 (K4200 level: 256-bit 4GB) or W8100 (K5200 level: 512-bit 8GB) GPU. Although the recently-released W7100 isn't listed on NewEgg just yet, right now you could get the W8100 instead, at roughly the same cost, by taking advantage of AMD's current half-price FirePro promo (which ends Jan. 15, 2015).

    Also, I'd use a SeaSonic SS-520FL2 fanless 520W 80+Platinum PSU, and put AeroCool DS Dead Silence Case Fans (available at FrozenCPU) in a Fractal Designs Arc Midi R2 mid-tower ATX case (which has a tinted window).

    I'd stick with the Intel Xeon e3-1276 v3 CPU, but cool it with a ThermalRight Archon IB-E X2 single-tower cooler (also available at FrozenCPU). Using the double-tower ThermalRight Silver Arrow IB-E instead would run maybe 2° C cooler, but 2 dBA louder, according to reviews I've read, but using the Archon IB-E X2 guarantees zero clearance issues on the motherboard.

    For an SSD, the pro-sumer Samsung 850 Pro (used in a UPS-backed system) or the enterprise Samsung 845DC Pro are both viable options. Both use next-generation MLC V-NAND, with all its advantages.

    But all this is only IF you happen to need an entry-to-mid-level Work Station RIGHT NOW. Broadwell 14nm Xeon e3-1200 v4 series Socket 1150 CPUs are about 6 months away (everything else could stay the same), and Skylake 14nm Xeon e3-1200 v5 series Socket 1151 CPUs are about 12 months away (but they would need a next-generation motherboard with an Intel C236 Greenlow chipset, which requires DDR4 2133 1.20V ECC RDIMM memory). However, this setup could use PCIe NVMe SSDs, and could (probably, assuming LGA isn't supplanted by BGA) be later upgraded with a Cannonlake 10nm Xeon e3-1200 v6 series CPU.

    Also, by waiting, you could buy Windows 10 instead of Windows 7 for your OS.

    Anyways, just my thoughts on a decent bang-for-the-buck, near-silent Work Station build.

    P.S. A WASD Code backlit mechanical keyboard might be a nice cherry-on-top touch.
