The Samsung SSD 840 EVO read performance bug has been on the table for over six months now. Samsung initially acknowledged the issue fairly quickly and provided a fix only a month after the news hit the mainstream tech media, but reports of read performance degradation surfaced again a few weeks after the fix had been released, making it clear that the first fix didn't solve the issue for all users. Two months ago Samsung announced that a new fix was in the works, and last week Samsung sent us the new firmware along with Magician 4.6 for testing; both will be available to the public later this month.

I covered the root cause in one of our earlier articles, but in short: the read performance degradation is the result of cell charge decaying over time, which forces extensive read-retry cycles to retrieve the correct data. The new firmware fixes this by periodically refreshing (i.e. rewriting) old data, which restores the cell charge to its original state and ensures that no performance-degrading read-retry or ECC passes are needed. Samsung says the refresh operation does not impact user performance, suggesting that it is a relatively low-priority process that runs while the drive is idle.
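The idle-time refresh policy described above can be sketched roughly as follows. This is purely illustrative and NOT Samsung's actual firmware logic: the block representation, the 30-day staleness threshold, and the idle check are all assumptions made for the sake of the example.

```python
import time

# Illustrative sketch of an idle-time data-refresh policy -- NOT Samsung's
# actual firmware. The staleness threshold and idle check are assumptions.

REFRESH_AGE_SECONDS = 30 * 24 * 3600  # assumed: refresh data older than 30 days

class Block:
    """A NAND block that tracks when its data was last (re)written."""
    def __init__(self, data):
        self.data = data
        self.written_at = time.time()

def refresh_stale_blocks(blocks, drive_is_idle):
    """Rewrite any block whose data is older than the threshold,
    but only while the drive is idle, so host I/O is never delayed."""
    for block in blocks:
        if not drive_is_idle():
            return  # host activity detected: back off immediately
        if time.time() - block.written_at > REFRESH_AGE_SECONDS:
            block.data = bytes(block.data)   # copy the data to fresh cells
            block.written_at = time.time()   # cell charge is now full again
```

The key property is that old data never accumulates: every block is rewritten before its charge decays far enough to need read-retries, which is why the fix should be permanent rather than a one-off restoration.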

The new Magician 4.6 also includes an Advanced Performance Optimization feature, which is similar to the performance restoration tool that Samsung released earlier. Basically, it's a command that tells the SSD to rewrite all of its internal data, which resets all cell charges and hence recovers performance. It's merely a supplementary tool as the firmware upgrade itself should be enough to restore performance, but in case the performance isn't fully restored after the firmware upgrade (and some idle time to let the drive refresh the cells), the tool can be used to force a cell charge refresh. 

I haven't run any tests of my own because I don't have any 840 EVOs deployed in my systems (I only have one 2.5" EVO anyway), but Allyn Malventano from PC Perspective managed to run some tests on a degraded drive to show the impact of the new firmware.

[Screenshots: benchmark results before the update, after the update, and after "Advanced Performance Optimization"]

Allyn's tests indicate that the new firmware mostly fixes the issue even without running the optimization tool. Note that Allyn didn't give the drive any idle time after the firmware update, so the update appears to be very effective; with some idle time the performance would likely have recovered on its own.

Obviously, the big question is whether the performance will stay high, because freshly written data was never the problem. We won't know for sure for another couple of months, but given the way the new firmware handles old data, the approach sounds promising: no data should ever get old enough to become slow to read.
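Users who want to check whether their own drive has recovered can simply time a sequential read of a file that was written months ago. A minimal sketch (the file path is a placeholder; note that the OS page cache can inflate results, so test a file larger than RAM or drop caches first, e.g. `echo 3 > /proc/sys/vm/drop_caches` on Linux):

```python
import os
import time

def read_throughput_mb_s(path, chunk_size=1 << 20):
    """Sequentially read `path` in 1 MiB chunks and return MB/s.
    Caveat: results are only meaningful for uncached data -- use a file
    larger than RAM or drop the OS page cache before measuring."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk_size):
            pass
    elapsed = time.perf_counter() - start
    return size / (1024 * 1024) / elapsed
```

A degraded 840 EVO shows old files reading at a fraction of the drive's rated speed, while freshly written files read at full speed; after the fix the two should match.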

Some of you are likely skeptical about the effect on endurance, since rewriting the data consumes P/E cycles, but I find this to be a non-issue. Samsung's 19nm TLC NAND is rated at 1,000 P/E cycles, so even if the drive were to refresh all cells once a week, that would only consume 52 cycles per year. In five years' time the total would be 260 cycles, which leaves 740 cycles for user data writes (for the record, that's 52GB of NAND writes per day for five years with the 120GB 840 EVO).
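The endurance arithmetic above can be checked in a few lines. The 1,000-cycle rating and weekly refresh cadence come from the article; the 128GB of raw NAND behind the 120GB model (which is what makes the daily figure come out to ~52GB of NAND writes rather than user writes) is an assumption about overprovisioning.

```python
# Back-of-the-envelope endurance math for a weekly full-drive refresh.
# Assumptions: 1,000 rated P/E cycles (19nm TLC), weekly refresh, and
# 128GB of raw NAND behind the 120GB 840 EVO (overprovisioning assumed).

RATED_PE_CYCLES = 1000
WEEKS_PER_YEAR = 52
YEARS = 5
RAW_NAND_GB = 128  # assumed raw capacity of the 120GB model

refresh_cycles = WEEKS_PER_YEAR * YEARS            # cycles spent on refreshes
user_cycles = RATED_PE_CYCLES - refresh_cycles     # cycles left for user data
gb_per_day = user_cycles * RAW_NAND_GB / (YEARS * 365)

print(refresh_cycles, user_cycles, round(gb_per_day, 1))
```

Even under this worst-case weekly cadence, the refresh overhead eats only about a quarter of the rated endurance over five years, which supports the article's conclusion that it's a non-issue for typical consumer workloads.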

All in all, I hope this fix will finally put an end to the performance degradation. The issue has been bugging many users for months, and it's critical that users get what they originally paid for. On one hand I'm confident enough to say that this fix is permanent given the way it works, but on the other hand I don't want to be too optimistic this time around because the first fix didn't turn out so great. Either way, I think this is Samsung's last chance to provide a permanent solution: they have already failed once, and it would no longer be fair to ask customers to wait months for a fix that may or may not work. For now all we can do is wait for user reports and hope for the best, but at least in theory the new firmware should be a permanent fix.

Comments Locked

78 Comments


  • rexbass24 - Wednesday, April 15, 2015 - link

    I am at the end of my rope with my brand new Samsung evo 640 500gb ssd. I get MAX 100MB/s on Novabench no matter what settings in samsung magician I use. I am using AHCI mode, not ide, so this should be blazing fast but since the day I got it, no love.

    Sabertooth r2.0
    8gb ram
    FX-4100
    GTX n760 hawk
    Newest stable bios on Sabertooth board and newest firmware on ssd.

    My Adata 900 series 128gb was 3x as fast as this drive
  • rexbass24 - Wednesday, April 15, 2015 - link

    oops, I meant evo 840 haha
  • BlobTheCop - Wednesday, April 15, 2015 - link

    Now seriously, I have about 300 of these drives deployed in my users desktops.
    How do I deploy this fix? Where is the command line tool?
  • Coup27 - Wednesday, April 15, 2015 - link

    Pretty sure there isn't one, after all these are consumer level drives.
  • leexgx - Friday, April 17, 2015 - link

    walk to each system and press a button reboot job done :) (just take you a week to do it)
  • Penti - Friday, April 17, 2015 - link

    Corporate systems might have Intel AMT available and with that you could do all the flashing remotely using the DOS tool for example. If you got consumer stuff just walk around starting the update. Pretty sure Magician doesn't allow automation / unattended firmware flashing. The server tool doesn't work with Evo's.
  • Pajos - Thursday, April 16, 2015 - link

    please, how many days can the disk sit unpowered before files degrade?
    I'm thinking of buying an 840 EVO and I don't know if it's a good choice; the price is 40 USD
  • chucky2 - Sunday, April 19, 2015 - link

    So how does this affect power consumption? 840 EVO was a good drive for lower power usage in a notebook (at least from what I remember), now with this new firmware, does it remain so?
  • BenjiSt - Sunday, April 19, 2015 - link

    Correct me if I'm wrong: All these tweaks to keep the aging cells of your drive alive seem to depend on that you actually use your drive regularly. But that doesn't help if you leave your drive with your precious data without power for a couple of months, does it?

    That's actually my usage pattern, and I'm in the market for a new notebook and new SSD as well. I use my work notebook for private stuff, then boot my private PC very rarely (say 1-2 a year) to move private stuff from the work notebook to the private notebook.

    So, should I fear data loss if I use a TLC drive?
  • andrewbaggins - Sunday, April 19, 2015 - link

    maybe go with intel, sandisk, or crucial .....
