Performance Consistency

Performance consistency tells us a lot about the architecture of these SSDs and how they handle internal fragmentation. The reason IO latency is not consistent on SSDs is that every controller inevitably has to do some amount of defragmentation or garbage collection in order to keep operating at high speed. When and how an SSD decides to run its defrag or cleanup routines directly impacts the user experience, because inconsistent performance shows up as application slowdowns.

To test IO consistency, we fill a secure-erased SSD with sequential data to ensure that all user-accessible LBAs (Logical Block Addresses) have data associated with them. Next we kick off a 4KB random write workload across all LBAs at a queue depth of 32 using incompressible data. The test runs for just over half an hour and we record instantaneous IOPS every second.
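
For those who want to approximate this workload themselves, the sketch below shows one way to drive it with fio from Python. This is not the tooling we use for our testing; the device path, the libaio engine, the 128KB preconditioning block size and the exact runtime are assumptions, and the commands destroy all data on the target drive.

```python
# Rough approximation of the consistency workload, not the tooling used here.
# Step 1 fills every user-accessible LBA sequentially; step 2 runs 4KB random
# writes at QD32 with incompressible data and logs average IOPS once a second.
# WARNING: destroys all data on DEV.
import subprocess

DEV = "/dev/sdX"      # assumption: raw device under test (replace before use)
RUNTIME = 2000        # seconds, i.e. "just over half an hour"

# Sequential 128KB fill of the whole drive.
subprocess.run([
    "fio", "--name=precondition", f"--filename={DEV}",
    "--rw=write", "--bs=128k", "--iodepth=32",
    "--ioengine=libaio", "--direct=1", "--refill_buffers",
], check=True)

# 4KB random writes across all LBAs at QD32; average IOPS is logged every
# 1000 ms to consistency_iops.*.log via fio's write_iops_log option.
subprocess.run([
    "fio", "--name=consistency", f"--filename={DEV}",
    "--rw=randwrite", "--bs=4k", "--iodepth=32",
    "--ioengine=libaio", "--direct=1", "--refill_buffers",
    "--time_based", f"--runtime={RUNTIME}",
    "--log_avg_msec=1000", "--write_iops_log=consistency",
], check=True)
```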

We are also testing drives with added over-provisioning by limiting the LBA range. This gives us a look into the drive’s behavior with varying levels of empty space, which is frankly a more realistic approach for client workloads.
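
To make the arithmetic concrete, the helper below (hypothetical, not part of our test suite) shows how capping the highest LBA a workload may touch translates an over-provisioning percentage into a usable LBA range; the sector count is the typical figure for a nominal 1TB drive.

```python
def lba_cap(total_lbas: int, op_pct: float) -> int:
    """Highest LBA (exclusive) the workload is allowed to touch so that
    op_pct percent of the user capacity stays untouched and acts as
    additional spare area for the controller."""
    return int(total_lbas * (1 - op_pct / 100))

# A nominal 1TB drive exposes 1,953,525,168 512-byte sectors.
total = 1_953_525_168
print(lba_cap(total, 25))   # ~1.47 billion LBAs, roughly 750GB of writable span
```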

Each of the three graphs has its own purpose. The first covers the whole duration of the test in log scale. The second and third zoom into the beginning of steady-state operation (t=1400s) but on different scales: the second uses a log scale for easy comparison, whereas the third uses a linear scale to better visualize the differences between drives. Click the dropdown selections below each graph to switch the source data.

For a more detailed description of the test and why performance consistency matters, read our original Intel SSD DC S3700 article.

[Graph: IO consistency over the full test duration, log scale. Dropdown: Mushkin Reactor 1TB, Default / 25% Over-Provisioning]

Despite the use of newer and slightly lower-performing 16nm NAND, the Reactor's performance consistency is actually marginally better than that of the other SM2246EN based SSDs we have tested. It's still worse than most other drives, but at least the increase in capacity didn't hurt consistency, which does happen with some drives.

[Graph: IO consistency zoomed into steady-state operation (from t=1400s), log scale. Dropdown: drive selection, Default / 25% Over-Provisioning]

[Graph: IO consistency zoomed into steady-state operation (from t=1400s), linear scale. Dropdown: drive selection, Default / 25% Over-Provisioning]


TRIM Validation

To test TRIM, I filled the drive with sequential 128KB data and then ran a 30-minute random 4KB write (QD32) workload to put the drive into steady state. After that I TRIM'ed the drive by issuing a quick format in Windows and ran HD Tach to produce the graph below.

And TRIM works as expected.
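
As a side note, before reproducing a run like this it is worth confirming that the OS is actually issuing TRIM commands at all. On Windows the built-in fsutil utility reports this; the small wrapper below is just a convenience sketch and not part of the test procedure.

```python
# Hypothetical helper: check whether Windows delete notifications (TRIM) are on.
# fsutil reports "DisableDeleteNotify = 0" when TRIM commands are being sent
# (newer Windows builds list NTFS and ReFS on separate lines).
import subprocess

out = subprocess.run(
    ["fsutil", "behavior", "query", "DisableDeleteNotify"],
    capture_output=True, text=True, check=True,
).stdout
print(out)
print("TRIM enabled" if "= 0" in out else "TRIM disabled")
```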

Comments

  • nathanddrews - Monday, February 9, 2015 - link

    That didn't stop them from releasing 1TB SSDs for $800, which have conveniently come down...
  • rahuldesai1987 - Monday, February 9, 2015 - link

    Samsung 850 Evo is likely to be 2TB.
  • Christopher1 - Sunday, February 15, 2015 - link

    Who would need all that space on an SSD save an uber-gamer? I personally put an SSD in my laptop but I download and store all my stuff to an external hard drive that is a USB 3 spinny disk.
  • Greg100 - Tuesday, February 10, 2015 - link

    What??? I am still waiting for Samsung PM863 - 3.84TB capacity in 2.5" form factor! The price doesn't matter. I will buy two or three of them.
  • Levish - Thursday, January 14, 2016 - link

    Mostly it comes down to supply vs. demand. Imagine if, for example, Samsung released a hypothetical 1TB EVO SATA 6Gbps drive at $50 - $100 and a Pro at $125 - $150; pretty much no one else would sell a consumer/OEM SSD.

    What kills me is the pricing difference of M.2 NVMe disks; other than the initial R&D, they should be way cheaper to produce than the SATA varieties.
  • Tom Womack - Tuesday, February 10, 2015 - link

    1TB is enough for most laptops, and few cases are short enough of space or SATA ports that you can't strap together two 1TB drives in RAID0, so there's very little pressure to produce 2TB drives at less than twice the price of 1TB drives; if you want 4TB of SSD tomorrow you can buy four drives and fiddle around a little with Molex-to-SATA-passthrough power adaptors.
  • Cpt. Obvious - Tuesday, February 10, 2015 - link

    There's also a big push for cloud services. Local storage is often seen as unreliable and inconvenient, especially when the user is supposed to be using several platforms to access the same data. And let's be honest, the cloud services are very convenient, when they work.

    The big problem with cloud storage, in my opinion at least, is bandwidth. For most people it's simply not efficient to work with large volumes of data over the internet. Even many popular games are running into several gigabytes. Recently I reinstalled a game that's a few years old, Borderlands, and the download from Steam was over 12GB. That's not something I'd like to run from cloud storage. But even then I could fit 80 games of this size on a 1TB SSD.

    Movies are another subject. A 1080p movie stored at good quality can be about 20 - 25 GB, so that 1TB drive could house about 40 of them. However, movies are generally read sequentially and don't need a very high transfer rate, so they are prime candidates for storing on cheap HDDs in a NAS and/or using cloud storage.

    So where is the multi-TB SSD demand in the consumer market today? I think 4K video editing is one of the few cases where consumers may need multi-TB SSDs. Note that I say "need" not "want", because I for one sure "want" as large an SSD as I can get for a reasonable price. I might not fill it up, but I still want it...

    When it comes to professional use, things are a lot different. If you work with huge sets of data you will need both large and fast storage. But then again, that is already available, though at prices that are out of range for most end users.
  • cm2187 - Friday, February 13, 2015 - link

    Particularly upload bandwidth. In my country most optic fibre providers only have upload speeds a tenth of the download speed. Most pictures today are 2-6MB out of the camera (mobile to DSLR). A photo roll from a birthday or a trip can be pretty long to upload. And we are not even talking audio or video.

    Plus there is the trust issue. Do you really want to upload all your private life to the internet?
  • Christopher1 - Sunday, February 15, 2015 - link

    Actually, a 1080p movie in H.265 format (the one that people are switching to) should be only 1GB tops for a two-hour movie at 23.976-24 fps and a pretty high bitrate.
    Yes, movies are usually read sequentially, but the problem is that many drives do not store data sequentially.
    Every single time I download a TV show, I have to defrag the hard drive or the file itself to get it stored sequentially on a spinny-disk hard drive so it plays at its best.
  • Christopher1 - Sunday, February 15, 2015 - link

    Local storage is one of the most reliable things in the world today, especially since a lot of cloud hosters are now doing the insanity of removing anything that is 'flagged' by their systems as 'possibly pirated'. I just would not trust them with my data.
