Benchmarking Performance: CPU Rendering Tests

Rendering tests are a long-time favorite of reviewers and benchmarkers, as the code used by rendering packages is usually highly optimized to squeeze out every last bit of performance. Rendering programs can also end up heavily memory dependent: with that many threads in flight, each working on a lot of data, low-latency memory can be key to everything. Here we run a few of the usual rendering packages under Windows 10, as well as a few new and interesting benchmarks.

Corona 1.3

Corona is a standalone package designed to assist software like 3ds Max and Maya with photorealism via ray tracing. It's simple: shoot rays, get pixels. OK, it's more complicated than that, but the benchmark renders a fixed scene six times and reports results in terms of both time and rays per second. The official benchmark tables list user-submitted results in terms of time; however, rays per second is arguably the better metric (in general, scores where higher is better are easier to explain anyway). Corona likes to pile on the threads, so the results end up heavily staggered by thread count.
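To make the "shoot rays, get pixels" idea concrete, here is a toy sketch (not Corona's actual algorithm, just the core concept): one ray is fired per pixel, and a pixel lights up if its ray intersects a sphere. All names and scene values here are illustrative.

```python
import math

def hit_sphere(origin, direction, center, radius):
    # Solve |o + t*d - c|^2 = r^2 for t; the ray hits the sphere
    # if the resulting quadratic has a real root.
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * a * c >= 0

def render(width, height):
    # Shoot one ray per pixel from the origin through a viewport at z = -1.
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            u = (x + 0.5) / width * 2 - 1
            v = (y + 0.5) / height * 2 - 1
            ray_dir = (u, v, -1.0)
            hit = hit_sphere((0.0, 0.0, 0.0), ray_dir,
                             sphere_center, sphere_radius)
            row.append(1 if hit else 0)
        image.append(row)
    return image

img = render(16, 16)  # a 16x16 grid with a filled circle in the middle
```

A real renderer adds bounces, materials, and lighting per ray, which is why throughput is naturally reported in rays per second.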

Rendering: Corona Photorealism

Blender 2.78

For a renderer that has been around for what seems like ages, Blender is still a highly popular tool. We wrapped a standard workload into the February 5 nightly build of Blender and measure the time it takes to render the first frame of the scene. As one of the bigger open-source tools out there, both AMD and Intel actively work to help improve the codebase, for better or worse on their own (and each other's) microarchitectures.
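A test like this can be reproduced with a simple wall-clock wrapper around Blender's headless mode. This is a sketch of the approach, not our harness; `scene.blend` is a placeholder for whatever workload you wrap up.

```python
import subprocess
import time

def time_render(cmd):
    """Run a render command and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Hypothetical invocation: "-b" runs Blender without its UI,
# "-f 1" renders only the first frame of the scene.
#   elapsed = time_render(["blender", "-b", "scene.blend", "-f", "1"])
```

Averaging several runs of the same frame smooths out OS scheduling noise.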

Rendering: Blender 2.78

POV-Ray 3.7.1

Another regular in most benchmark suites, POV-Ray is another ray tracer, but one that has been around for many years. It just so happens that in the run-up to AMD's Ryzen launch, the codebase became active again, with developers making changes and pushing out updates. Our version and benchmark were settled just before that happened, but given time we will see where the POV-Ray code ends up and adjust in due course.

Rendering: POV-Ray 3.7

Cinebench R15

The latest version of Cinebench has also become one of those 'used everywhere' benchmarks, particularly as an indicator of single-thread performance. High IPC and high frequency give performance in the ST test, whereas good scaling and many cores are what win the MT test.
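Cinebench does not publish a scaling model, but Amdahl's law sketches why the ST and MT tests reward different designs: the serial fraction of a workload caps how much extra cores can help, while ST performance depends only on per-core speed. The parallel fractions below are illustrative, not measured.

```python
def amdahl_speedup(parallel_fraction, n_threads):
    # Amdahl's law: speedup = 1 / (serial + parallel / n).
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# A workload that is 95% parallel still scales well at 8 threads...
print(round(amdahl_speedup(0.95, 8), 2))   # ~5.93x
# ...but even infinite threads cannot push a 50%-parallel task past 2x,
# which is why high IPC and frequency still matter for the ST score.
```

Renderers sit near the fully-parallel end of this curve, which is why core-heavy CPUs dominate the MT charts.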

Rendering: CineBench 15 SingleThreaded

Rendering: CineBench 15 MultiThreaded


  • Maleorderbride - Tuesday, April 11, 2017 - link

    Read more than eight words and you will see that he refers to DX9 and DX11 specifically, which of course benefit far less from more CPU cores. DX12 is generally a win for AMD. What's the problem?
  • farmergann - Tuesday, April 11, 2017 - link

    The problem is clearly laid out in the OP. Pitiful that an i5 can be so thoroughly trounced yet moronic shills such as this author still go out of their way to make laughable attempts at rationalizing the defunct intel product.
  • Icehawk - Tuesday, April 11, 2017 - link

    Yay, we finally are at a point where AMD is a viable choice. It will be interesting to see what/if Intel fires back. If I was buying a new PC right now it would be a tough choice because I do a fair amount of HEVC encoding but am primarily a gamer.
  • psychobriggsy - Wednesday, April 12, 2017 - link

    If you do both at the same time, then the 1600's additional two cores and SMT will really help hide the effect of the encoding on gaming.
  • Falck - Tuesday, April 11, 2017 - link

    Great review! Just another typo on page 3:

    "As the first consumer GPU to use HDM, the R9 Fury is a key moment in graphics..."

    I think it's HBM?
  • Maleorderbride - Tuesday, April 11, 2017 - link

    Why did the i5-7600K get dropped from the majority of the benchmarks (or their results)? It seems rather odd to not report the data with the same set of CPUs for every benchmark.

    Minor typo, but I believe in the Conclusion you mean to say " Looking at the results, it’s hard NOT to notice "
  • Outlander_04 - Tuesday, April 11, 2017 - link

    Is there going to be a follow-up article where you compare Ryzen performance with 3200 MHz RAM?
    It does make a difference
  • psychobriggsy - Wednesday, April 12, 2017 - link

    What's the cost differential of such RAM versus a more reasonable (when considering CPUs in this price range) option?
  • trivor - Tuesday, April 11, 2017 - link

    If you're going to be doing anything other than gaming (and not only 1080p gaming), then Ryzen is a very good pick. When you're talking about video transcoding (one of my primary uses for my higher-end computers), the Ryzen 5 takes the i5 to town.
  • Joe Shmoe - Tuesday, April 11, 2017 - link

    Nice to see these chips tested with sensible GPU solutions.
    The GTX 1080 and above Nvidia cards (though AMD has yet to release anything as powerful) have been used by every site on the planet to test Ryzen chips;
    it took Jim on the AdoredTV YouTube channel to actually show that the lack of asynchronous compute hardware (which is not built in to Nvidia cards) and/or the Nvidia drivers are kneecapping Ryzen chips in 1080p game benchmarking, in DX12, versus Kaby Lake i7s.
    Nvidia are just rubbish at DX12 for the money, and this will not improve no matter how many transistors they throw at it without async compute hardware.
    Most experienced users I know are going to buy an R5 1600 (non-X),
    clock it to 3.8 GHz on all 6 cores, slap in an RX 580 when they drop to £200 or so, and not actually worry about benchmarks.
    It will game fine at 1080p compared to what they are running now.
    The whole i7 'gaming chip' argument is moot.
    Until ~20 months ago, Intel marketed i5s as gaming chips and the extra price on i7s was for a productivity edge.
    (5* consumer chips at a massive price hike, but they are a lot more pro-work capable)
    I don't know anybody who uses a 7700K for anything, frankly.
    The whole system-price thing has gotten beyond a joke.
