CPU Benchmark Performance: Power, Office, and Science

Our previous set of ‘office’ benchmarks has often been a mix of science and synthetics, so this time we wanted to keep our office section purely on real-world performance.

For the remainder of the testing in this review of the Ryzen 7 5800X3D, we are using DDR4 memory at the following settings:

  • DDR4-3200

Power

(0-0) Peak Power

Looking at the power draw of the Ryzen 7 5800X3D against the other chips tested, it is more power-efficient than the original Ryzen 7 5800X. This could be down to a lower load VID core voltage, as the 5800X3D is also clocked 200 MHz lower at its peak turbo frequency.
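As a rough illustration of why a small drop in voltage and clocks can yield a noticeable power saving, dynamic CPU power scales approximately with V²·f to first order. The voltage and frequency figures below are illustrative assumptions, not measured values for either chip:

```python
def relative_dynamic_power(v, f, v_ref, f_ref):
    """Dynamic power relative to a reference operating point,
    using the first-order P ~ V^2 * f approximation."""
    return (v / v_ref) ** 2 * (f / f_ref)

# Hypothetical example: a chip boosting 200 MHz lower (4.5 vs 4.7 GHz)
# at a 50 mV lower core voltage (1.30 V vs 1.35 V).
ratio = relative_dynamic_power(v=1.30, f=4.5, v_ref=1.35, f_ref=4.7)
print(f"~{ratio:.0%} of reference dynamic power")  # ~89%
```

Because voltage enters squared, even a modest VID reduction does more for power draw than the frequency cut alone; this is consistent with the 5800X3D drawing less power despite an otherwise identical core configuration.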

Office

(1-1) Agisoft Photoscan 1.3, Complex Test

In our office benchmark, the newer Ryzen 7 5800X3D performs similarly to the previous Ryzen 7 5800X processor.

Science

(2-1) 3D Particle Movement v2.1 (non-AVX)

(2-2) 3D Particle Movement v2.1 (Peak AVX)

(2-3) yCruncher 0.78.9506 ST (250m Pi)

(2-4) yCruncher 0.78.9506 MT (2.5b Pi)

(2-4b) yCruncher 0.78.9506 MT (250m Pi)

(2-5) NAMD ApoA1 Simulation

(2-6) AI Benchmark 0.1.2 Total

(2-6a) AI Benchmark 0.1.2 Inference

(2-6b) AI Benchmark 0.1.2 Training

Our science-based benchmarks, for the most part, show the Ryzen 7 5800X computationally slightly ahead of the Ryzen 7 5800X3D. This is primarily because the 5800X is clocked 200 MHz higher at turbo than the newer 5800X3D.

Where the extra L3 cache can benefit performance, it does, AI Benchmark being one example, but overall performance is very similar between the two chips.
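For a purely compute-bound workload that fits in cache on both chips, the first-order performance difference simply tracks the clock ratio. A minimal sketch, using the rated boost clocks (real sustained clocks under all-core load will vary):

```python
def clock_bound_speedup(f_a, f_b):
    """Expected throughput of chip A relative to chip B, assuming the
    workload scales linearly with core frequency (no cache sensitivity)."""
    return f_a / f_b

# Rated boost clocks: 4.7 GHz (5800X) vs 4.5 GHz (5800X3D)
edge = clock_bound_speedup(4.7, 4.5) - 1
print(f"clock-bound edge for the faster chip: {edge:.1%}")  # 4.4%
```

A low-single-digit gap of this size is in line with the near-identical results above; only workloads whose working set spills out of the 5800X's 32 MB L3 but fits in the 5800X3D's 96 MB stand to reverse it.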

Comments

  • Qasar - Thursday, June 30, 2022 - link

    Makaveli, he won't. According to him, the m1 is the best thing since sliced bread.
  • GeoffreyA - Thursday, June 30, 2022 - link

    Lor', the Apple Brigade is already out in full force.
  • at_clucks - Saturday, July 2, 2022 - link

    Look, if we're being honest the M line punches above its weight so to speak and yes, it does manage to embarrass traditional (x86) rivals on more than one occasion.

    This being said, I see no reason to review it here and compare it to most x86 CPUs. The reason is simple: nobody buys an M CPU, they buy a package. So comparing M2 against R7 5800X3D is pretty useless. And even if you compare "system to system" you'll immediately run into major discrepancies, starting with the obvious OS choice, or the less obvious "what's an equivalent x86 system?".

    With Intel vs. AMD it's easy, they serve the same target and are more or less a drop-in replacement for each other. Not so with Apple. The only useful review in that case is "workflow to workflow", even with different software on different platforms. Not that interesting for the audience here.
  • TheMode - Tuesday, July 5, 2022 - link

    I never understood this argument. Sure, some people will decide never to buy any Apple product, but I wouldn't say that this is the majority. If M3 were, say, 500% faster than the competition at 5% of the power, I am convinced some people would switch over no matter the package.
  • GeoffreyA - Wednesday, July 6, 2022 - link

    I'd say it's interesting to know where the M series stands in relation to Intel and AMD, purely out of curiosity. But, even if it were orders of magnitude faster, I would have no desire to go over to Apple.
  • mode_13h - Thursday, July 7, 2022 - link

    Yes, we want to follow the state of the art in tech. And when Apple is a leading player, that means reviewing and examining their latest, cutting edge products.
  • Jp7188 - Friday, July 8, 2022 - link

    Perhaps that could make sense in a separate piece, but M1 doesn't really have a place in a gaming-focused review. M1 gaming is still in its infancy as far as natively supported titles go.
  • Skree! - Friday, July 8, 2022 - link

    Skree!
  • mode_13h - Sunday, July 10, 2022 - link

    I'm going to call spam on this. Whatever it's about, I don't see it adding to the discussion.
  • noobmaster69 - Thursday, June 30, 2022 - link

    Better late than never I guess.

    Am I the only one who found it puzzling that Gavin recommends DDR4-3600 and then immediately tests with a much slower kit? And ran gaming benchmarks with a 4-year-old GPU?
