Random Read/Write Speed

The four corners of SSD performance are as follows: random read, random write, sequential read and sequential write speed. Random accesses are generally small in size, while sequential accesses tend to be larger, hence the four Iometer tests we use in all of our reviews. Our first test writes 4KB in a completely random pattern over an 8GB space of the drive to simulate the sort of random access you'd see on an OS drive (even this is more stressful than what a normal desktop user would see).

We perform three concurrent IOs and run the test for 3 minutes. The results reported are the average MB/s over the entire run. We use both standard pseudo-randomly generated data for each write as well as fully random data to show you both the maximum and minimum performance offered by SandForce-based drives in these tests. The average performance of SF drives will likely fall somewhere between the two values you see for each drive in the graphs. For an understanding of why this matters, read our original SandForce article.
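
For readers who want a feel for what this access pattern looks like, below is a minimal Python sketch of a 4KB random-write pass over an 8GB region. It is only an approximation of the Iometer workload: the file path is a placeholder, it issues one buffered write at a time instead of keeping three IOs outstanding against the raw drive, and it does not reproduce Iometer's compressible "pseudo random" data fill.

    import os, random, time

    PATH = "iometer_sketch.bin"     # placeholder target file; the real test runs against the drive itself
    REGION = 8 * 1024**3            # 8GB LBA span, matching the review's test
    BLOCK = 4096                    # 4KB transfer size
    DURATION = 180                  # 3 minutes, matching the review's run time

    buf = os.urandom(BLOCK)         # fully random (incompressible) data for each write
    fd = os.open(PATH, os.O_RDWR | os.O_CREAT)
    os.ftruncate(fd, REGION)

    written = 0
    start = time.time()
    while time.time() - start < DURATION:
        # pick a random 4KB-aligned offset inside the 8GB span
        offset = random.randrange(REGION // BLOCK) * BLOCK
        os.pwrite(fd, buf, offset)  # one buffered IO at a time; the real test keeps 3 IOs in flight
        written += BLOCK
    os.close(fd)

    elapsed = time.time() - start
    print(f"Average: {written / elapsed / 1024**2:.1f} MB/s")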

Desktop Iometer - 4KB Random Read (4K Aligned)

Random read and write performance has not changed at all from the previous generation. This was expected because Plextor did not claim increased random read/write performance, and its own specifications are almost identical: the 256GB M3 Pro is rated at 75K IOPS random read and 68K IOPS random write, while the 256GB M5S is rated at 73K IOPS read and 70K IOPS write.

Desktop Iometer - 4KB Random Write (4K Aligned) - 8GB LBA Space

Desktop Iometer - 4KB Random Write (8GB LBA Space QD=32)

Sequential Read/Write Speed

To measure sequential performance we run a one-minute 128KB sequential test over the entire span of the drive at a queue depth of 1. The results reported are the average MB/s over the entire test length.
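
As a rough sketch of the access pattern (not a replacement for Iometer), here is a Python approximation of a QD=1 128KB sequential read pass. The file path and span are placeholder assumptions; the actual test runs against the whole drive rather than through the filesystem cache.

    import os, time

    PATH = "iometer_sketch.bin"     # placeholder file; the real test spans the entire drive
    SPAN = 8 * 1024**3              # stand-in span for this sketch
    BLOCK = 128 * 1024              # 128KB transfer size
    DURATION = 60                   # one-minute run, matching the review

    fd = os.open(PATH, os.O_RDWR | os.O_CREAT)
    os.ftruncate(fd, SPAN)

    read_bytes = 0
    offset = 0
    start = time.time()
    while time.time() - start < DURATION:
        # queue depth 1: each 128KB read completes before the next is issued
        data = os.pread(fd, BLOCK, offset)
        read_bytes += len(data)
        offset = (offset + BLOCK) % SPAN   # advance sequentially, wrapping at the end of the span
    os.close(fd)

    elapsed = time.time() - start
    print(f"Average: {read_bytes / elapsed / 1024**2:.1f} MB/s")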

Desktop Iometer - 128KB Sequential Read (4K Aligned)

Desktop Iometer - 128KB Sequential Write (4K Aligned)

Sequential read and write performance has not changed dramatically either. Sequential read performance falls right in between the M3 and M3 Pro, while sequential write performance is similar to the M3 Pro's. As we discovered in our M3 Pro review, the standard M3 actually performed better in the sequential write test with compressible data, although the M3 Pro was much faster when tested with incompressible data.

Comments

  • StevoLincolnite - Wednesday, July 18, 2012 - link

    Another thing to consider with Newegg reviews: will all happy users post a review? Disgruntled customers are much more likely to do so.
  • themossie - Wednesday, July 18, 2012 - link

    There's a strong selection bias, but this bias should be similar for all SSDs. If you compare the percentage of (dis)satisfied reviews, it's a useful way to compare different SSDs - as long as you don't take the numbers too seriously on their own.

    The Plextor M3 256GB and 128GB SSDs rate 88% and 90% five eggs respectively, which is exceptionally high. Compare this with the OCZ Vertex 3 120GB (one of the most popular and highest ranking Sandforce drives) at 35% one and two egg reviews and 62% five eggs.

    I won't speak for the statistical significance of any of this (especially with the <100 review sample size for the Plextors) but it looks like very few people regretted buying a Plextor, something I like to hear about any product :-)
  • Zak - Wednesday, July 18, 2012 - link

    I've just ordered 128GB and 256GB M3 Pros, so I was a little upset when I saw this M5 review, but it looks like, other than the higher price and a 30MB/s increase, I didn't miss much. But I wonder how fast the M5 Pro will be.

    BTW, I don't believe that current SSDs are significantly more reliable than hard drives (which is a bummer), so the 5-year warranty was the deciding factor for me. Plus, I was always a fan of Plextor products. Two of my older OCZ SSDs died in their second year, after the warranty was over, so I'm more mindful of warranties when buying stuff these days. The recent trend of shortening hard drive warranties is regrettable.
  • karasaj - Wednesday, July 18, 2012 - link

    Statistically speaking, any sample size above thirty is generally considered "relevant, fairly reliable information."

    Granted, that might not be entirely true due to the insane selection bias, but since that's also present on all drives it might not matter.
  • Zak - Wednesday, July 18, 2012 - link

    "I checked NewEgg reviews for Plextor's M3 and M3 Pro and only 4.2% of the reviews (189 user reviews in total) were one or two eggs, which usually indicates a serious problem with the drive." -- or serious problem with the reviewer. For example I've see people giving SSDs poor reviews because they didn't run at the advertised speeds over 500MB/s on their SATA 3.0Gbps interfaces, etc.
  • justaviking - Wednesday, July 18, 2012 - link

    And then there are people who get the ratings backwards.

    How many times have you read a glowing review ("I love my new drive!!!") but it has a rating of 1? Either they thought "1" meant excellent, as in "first place," or they forgot to enter a rating when they did their review.

    I've seen that on more than one site. Maybe the online retailers should use "3" as their default value.
  • Kristian Vättö - Wednesday, July 18, 2012 - link

    "or serious problem with the reviewer"

    I first thought you meant me and wondered why the attitude. It took me a while to figure out you meant NewEgg reviewers, not me - or at least that's the way I hope it is :-)

    I definitely agree with you though.
  • TrackSmart - Wednesday, July 18, 2012 - link

    I strongly disagree that Newegg reviews "mean squat". For items with similar buyers and *hundreds* of reviews, it quickly becomes clear when there is an unacceptably high failure rate for an SSD. Check out OCZ's Petrol series of SSDs for instance: http://www.newegg.com/Product/Product.aspx?Item=N8... Or the Vertex 2: http://www.newegg.com/Product/Product.aspx?Item=20...

    Certainly, a bad review (1 - 2 eggs) does not equal failure in a 1:1 relationship, but you can bet that the correlation will be high. And highly statistically significant if there are enough reviews, even with self-selection bias.

    Would you really buy one of those OCZ Petrol drives to save $20, despite the preponderance of bad reviews? 72% are 1-2 eggs! That's a correlation.
  • yyrkoon - Saturday, July 21, 2012 - link

    I agree with Kristian on this.

    Personally, I sometimes take Newegg reviews more seriously on products like these, simply because AnandTech reviews are controlled and limited by the number of units they are given.

    However, you also have to be able to ascertain a given reviewer's (on Newegg) understanding of technology, which thankfully is not too hard. You just need to read. Often, you will find that reviewers have very little understanding of what they are buying when negative reviews are given. Past that, ignoring the rating system of a given review and understanding the product yourself is a must.

    Sometimes, you will find that a negative review has merit. Then all you have to figure out is whether the problem is something you can live with or not. Simple.
  • Nickel020 - Wednesday, July 18, 2012 - link

    Thank you very much for reviewing the drive. I only skimmed the review (I'll read it properly later), but I noticed that the prices in the table on the last page are completely different from what you get when you click on the links.

    It would be nice if you included European prices as well; I think geizhals.de is a very good indicator of what drives actually sell for in Europe.
