Prices and New Competitors

It's been a while since I've published on the SSD landscape, and not much has changed. SandForce's popularity has skyrocketed, easily making it the target to beat, while we patiently await Intel's 3rd generation SSDs. Once virtually an OCZ-exclusive supplier, SandForce now has design wins with nearly everyone. Capacities have also changed. While the original drives allocated nearly 30% of their NAND to spare area, newer extended versions have since appeared that drop spare area to 12 - 22% depending on the SKU (40/80/160GB drives allocate 22% while 60/120/240GB drives allocate 12%). The performance impact of the reduced spare area is nonexistent, as we've shown in the past.
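The spare-area percentages quoted above fall out of simple arithmetic: raw NAND comes in binary gibibytes while the marketed user capacity is decimal gigabytes. A minimal sketch, using the capacities from the table below:

```python
# Spare area = (raw NAND - user-visible capacity) / raw NAND.
# Raw NAND is binary (GiB); marketed user capacity is decimal (GB).
def spare_pct(raw_gib, user_gb):
    user_gib = user_gb * 1e9 / 2**30  # decimal GB -> binary GiB
    return 100 * (raw_gib - user_gib) / raw_gib

print(round(spare_pct(48, 40), 1))    # "40GB" drive on 48GiB of NAND -> ~22%
print(round(spare_pct(128, 120), 1))  # "120GB" drive on 128GiB of NAND -> ~13%
```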

Indilinx is still around but undesirable at this point. Its performance is no longer competitive and its write amplification is much higher than what you get from SandForce at the same cost. Crucial's RealSSD C300 is still trucking, though you do pay a premium over SandForce. Whether or not that premium is justified depends on your workload.

SSD Price Comparison - November 11, 2010
SSD | NAND Capacity | User Capacity | Price | Cost per GB of NAND
Corsair Force F40 40GB | 48GB | 37.3GB | $124.99 | $2.603
Corsair Force F120 120GB | 128GB | 111.8GB | $229.99 | $1.797
Corsair Nova V128 128GB | 128GB | 119.2GB | $219.99 | $1.719
Crucial RealSSD C300 64GB | 64GB | 59.6GB | $134.99 | $2.109
Crucial RealSSD C300 128GB | 128GB | 119.2GB | $269.99 | $2.109
Intel X25-M G2 160GB | 160GB | 149.0GB | $409.00 | $2.556
Intel X25-V 40GB | 40GB | 37.3GB | $94.99 | $2.375
Kingston SSDNow V Series 30GB | 30GB | 27.9GB | $82.99 | $2.766
Kingston SSDNow V Series 128GB | 128GB | 119.2GB | $224.99 | $1.758
Kingston SSDNow V+ Series 128GB | 128GB | 119.2GB | $277.00 | $2.164
Kingston SSDNow V+ 100 128GB | 128GB | 119.2GB | $278.99 | $2.180
OCZ Agility 2 120GB | 128GB | 111.8GB | $229.99 | $1.797
OCZ Vertex 2 120GB | 128GB | 111.8GB | $234.99 | $1.836
Patriot Inferno 60GB | 64GB | 55.9GB | $149.00 | $2.328
Western Digital SiliconEdge Blue | 128GB | 119.2GB | $214.99 | $1.680

We broke the $2/GB barrier a while ago, and prices continue to fall as NAND manufacturers transition to 2xnm processes and existing 3xnm supply becomes cheaper as a result. Surprisingly enough, the most affordable drives actually come from companies that don't own NAND foundries. SandForce's partners, who have to hand a big chunk of their margins to SandForce as well as the NAND vendor, are actually delivering the best value in SSDs. Kingston and Western Digital also deliver a great value. Not Crucial/Micron and not Intel, which is not only disappointing but inexcusable: these companies actually own the fabs where the NAND is made, and in Intel's case the company even produces the controller itself.

Within the SandForce camp prices seem pretty consistent. I grabbed data from three different SF partners: Corsair, OCZ and Patriot. At 128GB of NAND both Corsair and OCZ are competitive on pricing. As you look at the smaller capacity drives, however, cost per GB goes up dramatically. A 40GB Corsair Force will cost you 44.8% more per GB than a 120GB drive. The same is true when you look at the 60GB Patriot Inferno at $2.328 per GB.
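The cost-per-GB column and the small-drive premium are straight division (prices as listed in the table above; the 44.8% figure in the text comes from the rounded per-GB values, and the exact ratio is closer to 45%):

```python
# Cost per GB of NAND = list price / raw NAND capacity.
def cost_per_gb(price, nand_gb):
    return price / nand_gb

f40 = cost_per_gb(124.99, 48)     # Corsair Force F40  -> ~$2.60/GB
f120 = cost_per_gb(229.99, 128)   # Corsair Force F120 -> ~$1.80/GB
premium = (f40 / f120 - 1) * 100  # small-drive premium -> ~45%
print(round(f40, 2), round(f120, 2), round(premium, 1))
```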

If you're trying to keep total cost down, the best bang for your buck from a capacity standpoint is the 64GB Crucial RealSSD C300. It's more expensive per GB than the larger SandForce drives, but at $134.99 it's a cheap way to get into a decent SSD.

The new Kingston SSDNow V+ 100 is actually more expensive than the Crucial drives from a cost-per-GB standpoint. Traditionally the V series has been the value line while the V+ series has been Kingston's more performance oriented offering. In the past, however, the V+ never seemed to have the performance to back up its price. Perhaps the V+ 100 can change that.

The Test

CPU: Intel Core i7 965 running at 3.2GHz (Turbo & EIST Disabled)
Motherboard: Intel DX58SO (Intel X58)
Chipset: Intel X58 + Marvell SATA 6Gbps PCIe
Chipset Drivers: Intel + Intel IMSM 8.9
Memory: Qimonda DDR3-1333 4 x 1GB (7-7-7-20)
Video Card: eVGA GeForce GTX 285
Video Drivers: NVIDIA ForceWare 190.38 64-bit
Desktop Resolution: 1920 x 1200
OS: Windows 7 x64
Comments

  • Dorin Nicolaescu-Musteață - Thursday, November 11, 2010 - link

    Anand, what about the Samsung 470 Series?

    It's been out since August and looks like a very nice drive. Why in the world have reviews only started to appear online this week?
  • Nickel020 - Thursday, November 11, 2010 - link

    Thanks for the review! I've got a few suggestions/questions though.

    I've been out of the loop for a while and am just now looking at SandForce and other newer drives. So some SandForce drives have the I/O limitations that were intended for the SF-1200 and some have SF-1500 like performance?

    I'm surprised the Corsair F40 does so well. I thought the lower capacity drives performed worse than the 120GB versions, but it holds up really well. Or is this just a special case with the 40GB one, and the 60GB is worse than the 40GB one? The 60GB SandForce drives are also much better value than the 40GB ones: 50% more capacity at >20% more price. I find it strange that you didn't include them and instead called the 64GB C300 the value drive at that price point.

    I'm pretty sure the Indilinx 60GB, the unlocked SF-1200 60GB and the X25-M 80GB are the most popular drives out there, which makes them great reference points, but they're not in the review. The former two are unfortunately not in Bench either. Do you have some around and could you test them?

    You tested the Crucial drives on the ICH10R, right?

    Also, I would appreciate some blog posts or small articles about developments with newer FWs. I remember the FW development improving the Indilinx drives significantly, and am always wondering how accurate your older reviews still are given there are newer FWs out now. It would also be nice if you could list the tested FW version in Bench.

    It would also be great if you could look at SSD performance in MacBooks. I want to put one in my MacBook Pro (Late 2008), but all the talk of freezing has me hesitating, and I haven't seen an in-depth look at this issue. Is it related to what kind of SSD you use, and does it make a difference whether you have a late 2008 or mid 2009 model? It would also be interesting to see how the lack of TRIM actually affects different drives under OS X.

    That's all for now, thanks again!
  • retnuh - Thursday, November 11, 2010 - link

    I've had an OWC Mercury Extreme Pro 240GB in my late 2008 MBP since May, not one issue or freeze. Best upgrade you can do.
  • iwod - Thursday, November 11, 2010 - link

    I posted and asked on many forums and never found an answer to my MBA (MacBook Air) question.

    Why does the MBA do so well in its tests while its performance data is below that of the king of SSD controllers, SandForce?

    No one could answer. There were a number of reviews pointing out that their MBA actually feels snappier than their MacBook with a SandForce or Intel SSD. Although this sounds impossible when first heard, numerous other review sites seem to confirm similar findings. Of course there is no way to test it directly since the MBA does not have a regular SATA slot.

    Now this article actually bears out the truth. The same Toshiba controller used in the MBA's SSD is top of the chart in BOTH synthetic benchmarks and real world usage (AnandTech Storage Bench) benchmarks. What we have been treating as the holy grail of SSD performance, 4K random read/write, didn't matter, even though Toshiba was literally at the bottom of the chart in those tests.

    There is a reason why Apple chose an inferior part (to us, at the time) instead of SandForce. The argument for choosing it because of always-on GC doesn't make sense, since SandForce has the same capability within its firmware.

    One reason would be that Toshiba is a NAND manufacturer itself, so buying NAND and controller directly from Toshiba would be cheaper. The other reason is that Toshiba (probably with SanDisk involved as well, given their JV) had a controller chip which is very fast.

    There has to be a missing piece in our performance tests, something that these companies know and we don't.
  • Chloiber - Thursday, November 11, 2010 - link

    I'd like to see more real world tests - and I don't consider the AnandTech Storage Bench to be "real world" - it's still a bench, like PCMark.

    But yes, you are right: synthetic tests tell us little about the performance you actually get from an SSD. There are more unknown variables than we think.
    You may see big differences in benches like SYSMark or PCMark - and even bigger differences in synthetic tests like AS SSD or even Iometer. But these scores tell us little about REAL world performance - and by REAL I mean things like:
    - "How long does it take to start Photoshop while running Virus Scan?"
    - "How long does it take to start iTunes while unzipping a not-so-much-compressed zip-file?"

    Those are the things I care about. And interestingly, you often get COMPLETELY different results than what you would expect when looking at synthetic tests or "half-synthetic" tests like PCMark or AnandTech Storage Bench.
  • Anand Lal Shimpi - Thursday, November 11, 2010 - link

    I used to run a lot of those types of tests; however, I quickly found that if you choose your Iometer and other benchmarks appropriately they don't add any new data. And oftentimes they are so limited in their scope (e.g. launch an application with virus scan in the background) that you don't see any appreciable differences between drives. Most high end SSDs are fast enough to complete most of these types of tasks just as quickly as one another. It's when you string a bunch of operations together and look for cumulative differences in response time or performance that you can really begin to see which one is faster. These types of scenarios are virtually impossible to perform with consistency by hand; that's where our test suite comes in.

    AnandTech Storage Bench, PCMark and even SYSMark do what is necessary: they measure performance of a more complex use case. PCMark Vantage is a great showcase of truly light workload I/O performance, while SYSMark is more CPU bound and shows you how small the differences can be. Our own benchmark offers a more modern set of usage models (we actually do run Photoshop while virus scan is active and actually edit images in Photoshop, all while doing other things as well).

    All of these tests are application based; they are simply scripted, or isolate the I/O component specifically. They give us a look into bursts of activity that are, again, near impossible to reproduce by hand with a stopwatch.

    Benchmarking a specific task usually just repeats some information we've already presented, fails to present the bigger picture or shows no repeatable difference between drives. I can absolutely add those types of benchmarks back in, however I originally pulled them out because I believed they didn't add anything appreciable to the reviews.

    Of course this is your site, if you guys would like me to present some of that data I definitely can :)

    Take care,
  • Nickel020 - Thursday, November 11, 2010 - link

    The problem is that the synthetic tests you do are hard to interpret for just about anyone. "What drive is the best for this usage profile?" is still really hard to answer after reading your reviews (not that anyone else does a better job).

    And even if there is little difference between today's drives in the level loading time tests you used to do, we readers don't know that even if you do. Right now the average AT reader reads this review and doesn't know that the more expensive drives won't load his games noticeably faster or perform better when doing video editing.

    Maybe you should give recommendations for certain usage profiles, like video editing, photo editing, gaming etc. Even if you're just saying that there's not gonna be a noticeable difference.
  • wumpus - Thursday, November 11, 2010 - link

    It might help if you included statements like "you don't see any appreciable differences between drives. Most high end SSDs are fast enough to do most of these types of tasks just as quickly as one another." a bit more often in the articles. While we might be interested in the technical data, it would usually be foolish to buy SSDs by things other than size, price, and reliability.
  • Chloiber - Saturday, November 13, 2010 - link

    But that's the thing. You don't see any difference in an application a "normal" home user would use. We see huge differences in those synthetic tests, but in reality you don't get any faster loading times.

    Of course you can test it like this and say:
    "You don't see much difference between these three SSDs in "real world" application tests. Get the cheapest SSD (or most robust, whatever)."

    Or another position (I think the one you are currently in) is:
    "You don't see much difference between these three SSDs in "real world" application tests so let's stress them some more and base our verdict on those stresstests."

    The thing is:
    a) You don't know how the SSDs would REALLY react if you stressed them like this in reality. They are still synthetic tests, and unless you can prove that there are scenarios where differences appear (without any influence from some kind of bench program) they don't tell us that much.

    b) I think we have to begin to widen our horizons a little bit. Why exactly is it that you don't see any benefit using, let's say, a 50k IOPS drive over a 15k IOPS drive? Shouldn't you see significantly faster load times?

    I'm telling you this because of future SSDs. We get 30k IOPS, soon 60k IOPS, and in one year maybe over 100k IOPS. The score in your benches gets bigger and bigger...and bigger...
    And what exactly does the user get? NOTHING, because everything else in his computer is limiting his SSD (which is already happening right now)!
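    (To put rough numbers on the 15k vs. 50k IOPS question above; the burst size is an assumed example value, not a measured workload.)

```python
# Time to service a fixed burst of small random I/Os at a given IOPS rate.
# A 2000-op burst is an assumed example value, not a measured workload.
def burst_ms(ops, iops):
    return 1000 * ops / iops

for iops in (15_000, 50_000, 100_000):
    print(iops, round(burst_ms(2000, iops), 1))  # 133.3 -> 40.0 -> 20.0 ms
```

    Past a point the whole burst completes faster than a user can perceive, which is exactly the diminishing return described above.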

    I agree that you have to test hardware in scenarios where nothing else is limiting your subject. That's why you use a 4GHz i7 when testing GPUs, and that's why you test CPUs' gaming performance at a very low resolution.
    But I think it's really important that you also test scenarios a user experiences in reality. And that means in this case: "real world" benches. And yes, there will be nearly no difference there. But isn't that exactly what I want to know? If I spend $600 on a fking RevoDrive and nothing loads faster, I WANNA KNOW ABOUT IT!

    I hope you see my point :)
  • Out of Box Experience - Thursday, November 11, 2010 - link

    Real World testing of SSDs should be done in a worst case scenario on the lowest common denominator

    They should be plug and play on XP machines without any tweaks on the slowest computer you use to amplify the differences between drives

    I use a copy/paste test on ATOM CPUs to gauge the Real World differences between platter drives and SSDs

    Using 200MB of data (900+ files in 80 or so folders), I simply time a copy/paste of that data on the ATOM computer

    Using a faster computer WILL reduce the "Relative" speed gap between drives to the point where it becomes hard to tell which of two drives is actually the fastest

    Using Windows 7 with its funky caching scheme will make ALL the drives appear to copy and paste at the same speed on the ATOM core and therefore cannot be used for this test

    A 40GB Vertex 2 can copy and paste this data in 55 seconds (3.6MB/sec)
    A 5400RPM Western Digital Laptop drive does it in 54 seconds
    A 7200RPM Western Digital Desktop Drive takes 17 seconds

    ALL testing was done under XP-SP2 without ANY tweaks!
    All tests were repeated for accuracy
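    (For reference, the throughput implied by those timings: 200MB of data over each quoted time.)

```python
# Effective copy throughput = data size / elapsed time,
# using the 200MB payload and the timings quoted above.
def mb_per_sec(mb, seconds):
    return mb / seconds

print(round(mb_per_sec(200, 55), 1))  # 40GB Vertex 2         -> ~3.6 MB/s
print(round(mb_per_sec(200, 54), 1))  # 5400RPM laptop drive  -> ~3.7 MB/s
print(round(mb_per_sec(200, 17), 1))  # 7200RPM desktop drive -> ~11.8 MB/s
```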

    SandForce SSDs are HORRIBLE at handling data that is NOT compressible or that is already on the drive in compressed form

    Any drive that requires Windows 7 or multiple tweaks just to give you "Synthetic" numbers that have no bearing on the Real World is worthless

    Show us how they compare in a worst case scenario on the least common denominator for results we can use please

    I'm tired of hearing how great Sandforce drives are when they can't even beat a 5400RPM laptop drive in a Real World test such as the one I've just described
