Testing Optane Memory

For this review, Intel provided a fully assembled desktop system with Windows 10 pre-installed and Optane Memory caching configured and enabled. The system was put together by Intel's Demo Depot Build Center as the equivalent of a typical low- to mid-range retail desktop, with a Core i5-7400 processor, a B250 motherboard, and 16GB of RAM. Storage is a 1TB 7200RPM WD Black hard drive plus the 32GB Optane Memory module.

Intel Optane Memory Review System
CPU: Intel Core i5-7400
Motherboard: ASUS B250-PLUS
Chipset: Intel B250
Memory: 2x 8GB Kingston DDR4-2400 CL17
Storage: 1TB 7200RPM WD Black + 32GB Optane Memory module
Case: In Win C583
Power Supply: Cooler Master G550M
OS: Windows 10 64-bit, version 1607
Drivers: Intel Optane Memory version 15.5.0.1051

In addition, we tested the Optane Memory's performance and power consumption as a standalone SSD using our own testbed. This allowed us to compare against the Optane SSD DC P4800X and to verify Intel's performance specifications for the Optane Memory.
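Block-level SSD caching of this kind generally keeps recently and frequently accessed sectors on the fast device and falls back to the hard drive on a miss. Intel's actual caching heuristics are proprietary, so as a rough illustration of the concept only, here is a minimal LRU block cache sketch in Python; all names and the eviction policy are hypothetical, not Intel's implementation:

```python
from collections import OrderedDict

class BlockCache:
    """Minimal LRU block cache sketch. Serves hot blocks from the
    fast (Optane-class) tier and falls back to the slow backing
    device on a miss. Purely illustrative."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.cache = OrderedDict()  # LBA -> cached data, in LRU order

    def read(self, lba, backing_read):
        if lba in self.cache:
            # Hit: serve from the cache and mark as most recently used.
            self.cache.move_to_end(lba)
            return self.cache[lba]
        # Miss: fetch from the hard drive, then cache the block.
        data = backing_read(lba)
        self._insert(lba, data)
        return data

    def _insert(self, lba, data):
        self.cache[lba] = data
        self.cache.move_to_end(lba)
        if len(self.cache) > self.capacity:
            # Evict the least recently used block.
            self.cache.popitem(last=False)
```

With a 32GB module in front of a 1TB drive, only a small fraction of the disk fits in the cache at once, which is why real caching software has to be selective about what it promotes.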

Unfortunately, this review includes only an abbreviated set of benchmarks, for two reasons: the Optane Memory review system arrived less than a week ago, while I was still finishing up the P4800X review, and the Optane Memory module did not survive testing. After about a day of benchmarking, the Optane Memory review system locked up, and after rebooting, the Optane Memory module was not detected and the OS installation was corrupted beyond repair. The drive is not completely dead: Linux can detect it as an NVMe device but cannot use it for storage or even retrieve the drive's error log. In communicating with Intel over the weekend, we were not able to determine what went wrong, and a replacement module could not be delivered before the publication of this review.

The fact that the Optane Memory module died should not be taken as serious evidence against the product's reliability. I kill review units once every few months in the course of ordinary testing, and I was due for another failure (ed: it's a bona fide AnandTech tradition). What we call ordinary testing is, of course, far more punishing than the product's intended use, and no SSD brand has been entirely free of this kind of problem. The fact remains, however, that we don't have as much data to present as we would like, and we don't have enough experience with the product to draw final conclusions about it.

For comparison with the Optane Memory caching configuration, we selected the Crucial MX300 525GB and the Samsung 960 EVO 250GB. Both of these are available at retail for slightly less than the price of the Optane Memory 32GB module and the 1TB hard drive. They represent different capacity/performance tradeoffs within the same overall storage budget and are reasonable alternatives to consider when building a system like this Optane Memory review system.

For testing of the Optane Memory caching performance and power consumption, we have SYSmark 2014 SE results. Our synthetic tests of the Optane Memory as a standalone SSD are abbreviated forms of the tests we used for the Optane SSD DC P4800X, with only queue depths up to 16 considered here. Since those tests were originally for an enterprise review, the drives are preconditioned to steady state by filling them twice over with random writes. Our follow-up testing will consider the consumer drives in more ordinary workloads consisting of short bursts of I/O on drives that are not full.
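Our exact preconditioning scripts aren't reproduced here, but a fio job file along these lines would implement the preconditioning described above, i.e. two full-capacity passes of random writes to bring the drive to steady state. The device path and parameter values are assumptions for illustration, not our actual settings:

```ini
; precondition.fio - hypothetical job file, not our actual script
; /dev/nvme0n1 is assumed to be the Optane Memory module under test
[global]
ioengine=libaio
direct=1
filename=/dev/nvme0n1

[precondition]
rw=randwrite
bs=4k
iodepth=32
; two passes over the device, writing 2x its capacity in random writes
loops=2
```

Running the job with `fio precondition.fio` before measurement ensures the results reflect steady-state rather than fresh-out-of-box behavior.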

110 Comments

  • ddriver - Tuesday, April 25, 2017 - link

    Yeah, daring intel, the pioneer, taking mankind to better places.

    Oh wait, that's right, it is actually a greedy monopoly that has mercilessly milked people while making nothing aside from barely incremental stuff for years and through its anti-competitive practices has actually held progress back tremendously.

    As I already mentioned above, the last time "intel dared to innovate" that resulted in netburst. Which was so bad that in order to save the day intel had to... do what? Innovate once again? Nope, god forbid, what they did was go back and improve on the good design they had and scrapped in their futile attempts to innovate.

    And as I already mentioned above, all the secrecy behind xpoint might be exactly because it is NOTHING innovative, but something old and forgotten, just slightly improved.
  • Reflex - Tuesday, April 25, 2017 - link

    Axe is looking pretty worn down from all that grinding....
  • ddriver - Wednesday, April 26, 2017 - link

    Also, unlike you, I don't let personal preferences cloud my objectivity. If a product is good, even if made by the most wretched corporation out there, it is not a bad product just because of who makes it, it is still a good product, still made by a wretched corporation.

    Even if intel wasn't a lousy bloated lazy greedy monopolist, hypetane would still suck, because it isn't anywhere near the "1000x" improvements they promised. It would suck even if intel was a charity that fed the starving in the 3rd world.

    I would have had ZERO objections to hypetane, and also wouldn't call it hypetane to begin with, if intel, the spoiled greedy monopolist was still decent enough to not SHAMELESSLY LIE ABOUT IT.

    Had they just said "10x better latency, 4x better low depth queue performance" and stuff like that, I'd be like "well, it's ok, it is faster than nand, you delivered what you promised."

    But they didn't. They lied, and lied, and now that it is clear that they lied, they keep on lying and smearing with biased reviews in unrealistic workloads.

    What kind of an idiot would ever approve of that?
  • fallaha56 - Tuesday, April 25, 2017 - link

    OMG, when our product wasn't as good as we said it was, we didn't own up about it

    and maybe you test against HDD (like Intel) but the rest of us are already packing SSDs
  • philehidiot - Saturday, April 29, 2017 - link

    This is what companies do. Your technology is useless unless you can market it. And you don't market anything by saying it's mediocre. Look at BP's high-octane fuel, which supposedly cleans your engine and gets better fuel efficiency. The ONLY thing that higher-octane fuel does is resist auto-ignition under compression better, and thus certain high-performance engines require it. As for cleaning your engine - you're telling me you've got a solvent which is better at cutting through crap than petrol AND can survive the massive temperatures and pressures inside the combustion chamber? It's the petrol which scrubs off the crap, so yes, it's technically true. They might throw an additive or two in there, but that will only help pre-combustion chamber and if you actually have a problem. And yes, in certain newer cars with certain sensors you will get SLIGHTLY higher MPG, and therefore they advertise the maximum you'll get under ideal conditions, because no one will buy into it if you're realistic about the gains. The gains will never offset the extra cost of the fuel, however.

    PC marketing is exactly the same, and it's why the JMicron controller was such a disaster so many years ago. They went for advertised sequential throughput numbers being as high as possible and destroyed the random performance; Anand spotted it and OCZ threw a wobbler. But that experience led to drives being advertised on random performance as well as sequential.

    So what's the lesson here? We should always take manufacturers' claims with a mouthful of salt and buy based on objective criteria and independent measurements. Manufacturers will always state what is achievable in basically a lab setup with conditions controlled to perfection. Why? Because for one, you can't quote numbers based on real-life performance, because everyone's experience will differ and you can't account for the different variables they'll experience. And for two, if everyone else is quoting the maximum theoretical potential, you're immediately putting yourself at a disadvantage by not doing so yourself. It's not about your product, it's about how well you can sell it to a customer - see: the stupidly expensive Dyson hairdryer. Provides no real performance benefit over a cheap hairdryer but cost a lot in R&D and is mostly advertising wank for rich people with small brains.

    As for Intel being a greedy monopoly... welcome to capitalism. If you don't want that side effect of the system then bugger off to Cuba. Capitalism has brought society to the highest standard of living ever seen on this planet. No other form of economic operation has allowed so many to have so much. But the result is big companies like Intel, Google, Apple, etc, etc.

    Advertising wank is just that. Figures to masturbate over. If they didn't do it then sites like Anandtech wouldn't need to exist as products would always be accurately described by the manufacturer and placed honestly within the market and so reviews wouldn't be required.

    I doubt they lied completely - they will be going on the theoretical limits of their technology when all engineering limitations are removed. This will never happen in practice and will certainly never happen in a gen 1 product. Also, whilst I see this product as being pointless, it's obviously just a toe-dipping exercise like the enterprise model. Small scale, very controlled use cases, and therefore good real-world use data to be returned for gen 2/3.

    Personally, whilst I'm wowed by the figures, I don't see how they're going to improve things for me. So what's the point in a different technology when SLC can probably perform just as well? It's a different development path which will encounter different limitations and as a result will provide different advantages further down the road. Why do they continue to build coal fired power stations when we have CCGTs, wind, solar, nukes, etc? Because each technology has its strengths and weaknesses and encounters different engineering limitations in development. Plus a plurality of different, competing technologies is always better as it creates progress. You can't whinge about monopolies and then when someone starts doing something different and competing with the established norm start whinging about that.
  • fallaha56 - Tuesday, April 25, 2017 - link

    hi @sarah i find that a dead hard drive also plays into responsiveness and boot times(!)

    this technology is clearly not anywhere near as good as Intel implied it was
  • CaedenV - Monday, April 24, 2017 - link

    I have never once had an SSD fail because it has overused its flash memory... but controllers die all the time. It seems that this will remain true for this as well.
  • Ryan Smith - Tuesday, April 25, 2017 - link

    And that's exactly what we're suspecting here. We've likely managed to hit a bug in the controller's firmware. Which, to be sure, isn't fantastic, but it can be fixed.

    Prior to the P3700's launch, Intel sent us 4 samples specifically for stress testing. We managed to disable every last one of them. However, Intel learned from our abuse, and now those same P3700s are rock-solid thanks to better firmware and drivers.
  • jimjamjamie - Tuesday, April 25, 2017 - link

    Interesting that an ad-supported website can stress-test better than a multi-billion dollar company...
  • testbug00 - Tuesday, April 25, 2017 - link

    based on what? Have they sent you another model?

    A sample dying on day one, and only allowing testing via a remote server, doesn't build confidence.
