Other (Windows 2003 64-bit)

Render servers are only a small part of the server market. We show you the typical render tests we have performed so many times before.

3DSMax 2008 32-bit - architecture scene

Cinebench 10 64-bit

This is one of the applications where the Xeons can still flex their muscles without being slowed down by the platform. The new AMD Shanghai Opteron does much better than its older brother, but the Xeons remain the best choice: a dual Xeon 3.3GHz offers 81% to 88% of the rendering power of the quad Opteron at a much lower price point.

Comments

  • Bruce Herndon - Tuesday, December 23, 2008 - link

    I'm surprised by your comments. You claim that VMmark is a CPU/memory-centric benchmark. If I look at the raw data in the VMmark disclosure for Dell's R905 score of 20.35 @ 14 tiles, I see that the benchmark is driving 250-300 MB/s of disk IO across several HBAs and storage LUNs. This characteristic scales with the various systems mentioned in the article.

    As a designer of VMmark, I happen to know that both storage bandwidth (for the fileserver) and latency (for the mail and database workloads) are critical to achieving good VMmark scores. Furthermore, the webserver drives substantial network IO. The only purely CPU-centric component of VMmark is the javaserver. Overall, the benchmark does exercise the entire virtualization solution - hypervisor, CPU, memory, disk, and network (see the score-aggregation sketch after the comments).
  • cdillon - Tuesday, December 23, 2008 - link

    While SAS and Infiniband share some connectors and achieve similar data rates, they are incompatible technologies with two different purposes. Infiniband can be used for disk shelf connections, but that is less common and definitely not the case here. You should not call the connection between the Adaptec 5805 controller and the disk shelf an "Infiniband connection"; even if it uses Infiniband connectors and cables, it is simply a SAS connection.

  • JohanAnandtech - Tuesday, December 23, 2008 - link

    Well, the physical layer is Infiniband; the protocol used is SCSI. I can understand that calling it an "Infiniband connection" may be confusing, but the cable is an Infiniband cable.
  • shank15217 - Friday, December 26, 2008 - link

    Anand, I think the above poster is right. The Adaptec RAID 5805 uses SFF-8087 connectors but the protocol is SSP (Serial SCSI Protocol). Infiniband is a physical layer protocol that shares the same connector as SAS but they are not the same. Nothing in the Adaptec RAID 5805 spec mentions Infiniband as a supported protocol.

    http://www.adaptec.com/en-US/products/Controllers/...
  • niva - Tuesday, December 23, 2008 - link

    I'm not sure you can run your same ol' benchmark for rendering, and I'd really like more insight into what you guys are rendering and whether it's indeed using all 16, 24 (six-core, quad-socket system), or 32 (with Hyper-Threading) cores on the system.

    What renderer, what scene, details details...

    These chips get gobbled up by render farms and this is indeed where they can really flex their muscles to the fullest.
  • JohanAnandtech - Tuesday, December 23, 2008 - link

    Just click on the link under "we have performed so many times before" :-)
  • akinneyww - Tuesday, December 23, 2008 - link

    I read DailyTech and anandtech.com to keep up with the latest in IT. I appreciate the thought that has gone into putting together this article. I would like to see more articles like this one.
  • Jammrock - Tuesday, December 23, 2008 - link

    The VMware results shocked me the most. I know AMD has been working hard on the virtualization sector and it looks like their work has paid off.
  • classy - Tuesday, December 23, 2008 - link

    With the rapid growth of virtualization, AMD is looking really strong. We have begun using VMware 3.5 and are expanding our use of it. Virtualization is truly becoming a big factor in server choice.
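
As a rough illustration of the VMmark scoring that Bruce Herndon describes above, the sketch below aggregates per-tile results into an overall score. It assumes that each tile's score is the geometric mean of its workloads' throughputs normalized to a reference system, and that the overall score is the sum across tiles; the workload names follow his comment, while the reference throughputs and the 1.45x scaling factor are purely illustrative assumptions, not VMware's published parameters.

    from math import prod

    # Illustrative sketch of VMmark-style score aggregation. The workload names
    # mirror the components listed in Bruce Herndon's comment; the reference
    # throughputs below and the exact aggregation rule are assumptions made for
    # this example, not VMware's published reference numbers.
    REFERENCE = {
        "fileserver": 100.0,   # assumed MB/s served
        "mailserver": 200.0,   # assumed actions/min
        "database":   150.0,   # assumed transactions/min
        "webserver":  300.0,   # assumed accesses/min
        "javaserver": 250.0,   # assumed operations/min
    }

    def tile_score(measured):
        """Geometric mean of each workload's throughput normalized to the reference."""
        ratios = [measured[w] / REFERENCE[w] for w in REFERENCE]
        return prod(ratios) ** (1.0 / len(ratios))

    def overall_score(tiles):
        """Sum of per-tile scores: running more tiles at good throughput raises the score."""
        return sum(tile_score(t) for t in tiles)

    if __name__ == "__main__":
        # 14 identical tiles, each running ~45% above the reference throughputs,
        # lands close to the "20.35 @ 14 tiles" figure quoted for the Dell R905 above.
        tile = {w: 1.45 * ref for w, ref in REFERENCE.items()}
        print(f"{overall_score([tile] * 14):.2f} @ 14 tiles")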
