Test Bed and Setup

As per our processor testing policy, we take a premium motherboard suitable for the socket and equip the system with a suitable amount of memory running at the manufacturer's maximum supported frequency, typically at JEDEC subtimings where possible. We note that some users are not keen on this policy, stating that the maximum supported frequency is sometimes quite low, that faster memory is available at a similar price, or that JEDEC speeds can be prohibitive for performance. While these comments make sense, ultimately very few users apply memory profiles (XMP or otherwise), as they require interaction with the BIOS, and most users fall back on JEDEC-supported speeds. This includes home users as well as industry customers who want to shave a cent or two from the cost or stay within the margins set by the manufacturer. Where possible, we will extend our testing to include faster memory modules, either at the same time as the review or at a later date.
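The practical difference between such profiles can be expressed as absolute CAS latency, which depends on both the data rate and the CL value. A quick sketch of the arithmetic (illustrative only, not part of our test suite):

```python
def cas_latency_ns(data_rate_mt_s, cl):
    """Absolute CAS latency in nanoseconds.

    DDR memory transfers twice per I/O clock, so the clock in MHz
    is half the data rate in MT/s; CL is counted in those clocks.
    """
    clock_mhz = data_rate_mt_s / 2
    return cl / clock_mhz * 1000

# Our test setting, DDR4-2400 C15: 12.5 ns
print(cas_latency_ns(2400, 15))
# The kit's rated XMP profile, DDR4-3000 C15: 10.0 ns
print(cas_latency_ns(3000, 15))
```

This is why a higher data rate at the same CL is strictly faster: the same number of CAS cycles elapses in less time.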

Test Setup
Processor: AMD Ryzen 3 1300X (4C/4T, 3.4 GHz, 65W)
           AMD Ryzen 3 1200 (4C/4T, 3.1 GHz, 65W)
Motherboard: ASUS Crosshair VI Hero
Cooling: Noctua NH-U12S SE-AM4
Power Supply: Corsair AX860i
Memory: Corsair Vengeance DDR4-3000 C15 2x8GB
Memory Settings: DDR4-2400 C15
Video Cards: MSI GTX 1080 Gaming X 8GB
             ASUS GTX 1060 Strix 6GB
             Sapphire Nitro R9 Fury 4GB
             Sapphire Nitro RX 480 8GB
             Sapphire Nitro RX 460 4GB (CPU Tests)
Hard Drive: Crucial MX200 1TB
Optical Drive: LG GH22NS50
Case: Open Test Bed
Operating System: Windows 10 Pro 64-bit

Many thanks to...

We must thank the following companies for kindly providing hardware for our multiple test beds. Some of this hardware is not in this test bed specifically, but is used in other testing.

Thank you to Sapphire for providing us with several of their AMD GPUs. We met with Sapphire back at Computex 2016 and discussed a platform for our future testing on AMD GPUs with their hardware for several upcoming projects. As a result, they were able to sample us the latest silicon that AMD has to offer. At the top of the list was a pair of Sapphire Nitro R9 Fury 4GB GPUs, based on the first generation of HBM technology and AMD’s Fiji platform. As the first consumer GPU to use HBM, the R9 Fury marks a key moment in graphics history, and these Nitro cards come with 3584 SPs running at 1050 MHz on the GPU with 4GB of 4096-bit HBM memory at 1000 MHz.
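For context on why that 4096-bit interface matters: peak theoretical memory bandwidth is simply bus width times effective data rate. A back-of-the-envelope sketch (our own arithmetic, not a figure supplied by Sapphire):

```python
def peak_bandwidth_gb_s(bus_width_bits, effective_mhz):
    """Peak theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

# R9 Fury: 4096-bit HBM at an effective 1000 MHz -> 512 GB/s
print(peak_bandwidth_gb_s(4096, 1000))
```

The extreme width is what lets first-generation HBM hit that figure at very modest clocks, which is also where its power savings over GDDR5 come from.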

Further Reading: AnandTech’s Sapphire Nitro R9 Fury Review

Following the Fury, Sapphire also supplied a pair of their latest Nitro RX 480 8GB cards to represent AMD’s current performance silicon on 14nm (as of March 2017). The move to 14nm yielded significant power consumption improvements for AMD, which combined with the latest version of GCN helped bring the target of a VR-ready graphics card as close to $200 as possible. The Sapphire Nitro RX 480 8GB OC graphics card is designed to be a premium member of the RX 480 family, having a full set of 8GB of GDDR5 memory at 8 Gbps with 2304 SPs at 1208/1342 MHz engine clocks.

Further Reading: AnandTech’s AMD RX 480 Review

With the R9 Fury and RX 480 assigned to our gaming tests, Sapphire also passed on a pair of RX 460s to be used as our CPU testing cards. The amount of GPU power available can have a direct effect on CPU performance, especially if the CPU has to spend all its time dealing with the GPU display. The RX 460 is a nice card to have here, as it is powerful yet low on power consumption and does not require any additional power connectors. The Sapphire Nitro RX 460 follows the Nitro philosophy, and in this case is designed to provide performance at a low price point. Its 896 SPs run at 1090/1216 MHz frequencies, and it is paired with 2GB of GDDR5 at an effective 7000 MHz.

We must also say thank you to MSI for providing us with their GTX 1080 Gaming X 8GB GPUs. Despite the size of AnandTech, securing high-end graphics cards for CPU gaming tests is rather difficult. MSI stepped up to the plate in good fashion and high spirits with a pair of their high-end graphics cards. The MSI GTX 1080 Gaming X 8GB graphics card is their premium air-cooled product, sitting below the water-cooled Seahawk but above the Aero and Armor versions. The card is large with twin Torx fans, a custom PCB design, Zero-Frozr technology, enhanced PWM, and a big backplate to assist with cooling. The card uses a GP104-400 silicon die from a 16nm TSMC process, contains 2560 CUDA cores, and can run up to 1847 MHz in OC mode (or 1607-1733 MHz in Silent mode). The memory interface is 8GB of GDDR5X running at 10010 MHz. For a good amount of time, the GTX 1080 was king of the hill.

Further Reading: AnandTech’s NVIDIA GTX 1080 Founders Edition Review

Thank you to ASUS for providing us with their GTX 1060 6GB Strix GPU. To complete the high/low cases for both AMD and NVIDIA GPUs, we looked towards the GTX 1060 6GB cards to balance price and performance while giving a hefty crack at >1080p gaming in a single graphics card. ASUS offered a hand here, supplying a Strix variant of the GTX 1060. This card is even longer than our GTX 1080, with three fans and LEDs crammed under the hood. STRIX is now ASUS’ lower cost gaming brand behind ROG, and the Strix 1060 sits at nearly half a 1080, with 1280 CUDA cores but running at 1506 MHz base frequency up to 1746 MHz in OC mode. The 6 GB of GDDR5 runs at a healthy 8008 MHz across a 192-bit memory interface.

Further Reading: AnandTech’s ASUS GTX 1060 6GB STRIX Review

Thank you to Crucial for providing us with MX200 SSDs. Crucial stepped up to the plate as our benchmark list grows larger with newer benchmarks and titles, and the 1TB MX200 units are strong performers. Based on Marvell's 88SS9189 controller and using Micron's 16nm 128Gbit MLC flash, these are 7mm-high, 2.5-inch drives rated for 100K random read IOPS and 555/500 MB/s sequential read and write speeds. The 1TB models we are using here support TCG Opal 2.0 and IEEE-1667 (eDrive) encryption and have a 320TB rated endurance with a three-year warranty.
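To put that endurance rating in perspective, spreading 320TB of writes evenly over the three-year warranty gives a generous daily write budget. A rough sketch (our own arithmetic, assuming decimal terabytes):

```python
def daily_write_budget_gb(endurance_tb, warranty_years):
    """Average writes per day, in GB, that would exhaust the rated endurance."""
    return endurance_tb * 1000 / (warranty_years * 365)

# MX200 1TB: 320 TB over 3 years -> roughly 292 GB per day
print(round(daily_write_budget_gb(320, 3)))
```

That is far more than a benchmark suite writes in a day, so endurance is not a practical concern for this test bed.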

Further Reading: AnandTech's Crucial MX200 (250 GB, 500 GB & 1TB) Review

Thank you to Corsair for providing us with an AX1200i PSU. The AX1200i was the first power supply to offer digital control and management via Corsair's Link system, but under the hood it commands a 1200W rating at 50C with 80 PLUS Platinum certification. This allows for a minimum 89-92% efficiency at 115V and 90-94% at 230V. The AX1200i is completely modular, using the larger 200mm design, with a dual ball-bearing 140mm fan to assist high-performance use. The AX1200i is designed to be a workhorse, with up to 8 PCIe connectors suitable for four-way GPU setups. The AX1200i also comes with a Zero RPM mode for the fan, which due to the design allows the fan to be switched off when the power supply is under 30% load.
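Those efficiency figures map directly to wall draw: the AC power pulled from the socket is the DC load divided by efficiency. An illustrative calculation (not a measurement of our unit):

```python
def wall_draw_w(dc_load_w, efficiency):
    """AC power drawn at the wall for a given DC load and efficiency (0-1)."""
    return dc_load_w / efficiency

# Full 1200W DC load at the 115V Platinum floor of 89% -> ~1348W at the wall
print(round(wall_draw_w(1200, 0.89)))
```

The difference between the two figures is dissipated as heat inside the PSU, which is why efficiency matters for both the power bill and the fan noise.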

Further Reading: AnandTech's Corsair AX1500i Power Supply Review

Thank you to G.Skill for providing us with memory. G.Skill has been a long-time supporter of AnandTech over the years, for testing beyond our CPU and motherboard memory reviews. We've reported on their high capacity and high-frequency kits, and every year at Computex G.Skill holds a world overclocking tournament with liquid nitrogen right on the show floor.

Further Reading: AnandTech's Memory Scaling on Haswell Review, with G.Skill DDR3-3000


The AMD Ryzen 3 1300X and Ryzen 3 1200 CPU Review Benchmark Overview
140 Comments

  • Gavin Bonshor - Thursday, July 27, 2017 - link

    One of the hardest working men in the industry! :D
  • edlee - Thursday, July 27, 2017 - link

    I don't understand the point of making a $100 CPU without an integrated GPU if you want to attract the lower-end market; this is a really silly mistake. Sort of like Intel including an integrated GPU with the i7-7700K, it doesn't make sense. 95% of those with a 7700K will buy a GPU, but someone who is looking for a low-end CPU is not going to buy a discrete graphics card. It's just silly
  • phoenix_rizzen - Thursday, July 27, 2017 - link

    It really depends on the use case.

    For example, are there any integrated GPUs that support 3 monitors? I know a lot of them support dual monitors, but I haven't come across any that support 3 (although I haven't looked that hard). My work PC is a low-profile desktop running an AMD Athlon-II X4 CPU and an Nvidia GT 730 GPU for a tri-monitor setup. Upgrading the CPU/motherboard/RAM to a Ryzen 3 1300X would be a huge upgrade for this system.

    90-odd % of the desktops in the schools here use AMD Athlon-II CPUs (graphics integrated into the chipset), with the rest using Intel Pentium CPUs (graphics integrated into the CPU). And we add Nvidia 210 or 730 GPUs to those that need better multi-monitor support or better 3D performance. Why do we do it that way? Cost. We try to keep the complete desktop system (case, motherboard, CPU, at least 2 GB RAM, no storage of any kind) to under $200 CDN (they're diskless Linux stations). We have just shy of 5000 of those in the district right now.

    We've avoided the Bulldozer-based APUs so far as the price/performance just wasn't there compared to the Pentium line (from our suppliers). But the Ryzen 3 looks like a decent upgrade. Will be interesting to see what the prices are like for it from our suppliers this winter/spring. Will also be interesting to see what the GPU side of the Zen-based APUs will be like next year.

    The other important bit is driver support. We are a mostly Linux-using school district, so we tend to use hardware that's at least 2 steps back from the bleeding edge. That way, we get better prices, and better driver support.
  • edlee - Thursday, July 27, 2017 - link

    I understand upgrading from integrated to a discrete GPU like you stated in your use case, but from the low-end price standpoint, an i3-7100 is cheaper because it doesn't need an added GPU like the Ryzen 3 does, so the Ryzen isn't competing on performance or on price once you add the cost of the cheapest GPU
  • Outlander_04 - Friday, July 28, 2017 - link

    Using an integrated GPU is usually a poor choice. Intel's drivers are so dumbed down they are worse than hopeless.
    Factor in that using integrated graphics means less system RAM is available as well, so performance can be reduced
  • Ratman6161 - Tuesday, August 01, 2017 - link

    Many people may be starting out from the position of knowing that the integrated graphics on any of the Intel CPUs in the test are not good enough for them. If you know that from the start, then the argument that AMD doesn't have an IGPU is meaningless. I'm also somewhat interested in seeing overclocking tests with the R3, as that is one thing you just don't get with Intel at this level short of the 7350K. I sort of suspect that an OC'd 1200 could be just as fast or faster than a 1300X (though at only a $20 difference I'm not sure how much it matters).
    Also, in more computationally intense tasks, the 1300X really doesn't do badly against the i5 that costs $53 more, so once again, if you don't care about integrated graphics it could be a good choice for some people.

    On the other hand, for someone for whom MS Office, email, and web browsing are their main uses, then something like the i3-7100 suddenly looks very attractive - or even the Pentium G.
    In this segment, AMD really needs to get a Ryzen-based APU on the market. If they did a single-CCX, 4-core part and used the empty space vacated by the second CCX for a decent IGPU, they could definitely have an i3 killer.
  • renw0rp - Thursday, July 27, 2017 - link

    I had an HP Folio 9470m with a Core i5-3437U and it was driving 3 × 1920x1200 screens without an issue. And it's a ~2013 processor...

    The 3rd gen of Core processors was the first to support 3 displays. The 2nd gen supported just 2.
  • stuartlew - Thursday, July 27, 2017 - link

    AMD Kaveri does three monitors
  • serendip - Friday, July 28, 2017 - link

    Are there motherboards with integrated chipset graphics for Ryzen?

    I understand the good thing about adding a discrete GPU only to PCs that need one, but not having an integrated GPU is nuts, for the mass market at least.
  • silverblue - Friday, July 28, 2017 - link

    No, but Bristol Ridge launched yesterday, so there are now APUs that use AM4.
