Kingston has announced that their latest and fastest datacenter SSD is now available. The new DC1000M is a 2.5" U.2 NVMe SSD, taking over the top spot in Kingston's datacenter SSD lineup. Last year, Kingston introduced the DC500M, DC500R and DC450R SSDs for the SATA market, but they have had few NVMe offerings. Back in 2017, Kingston partnered with Liqid on multi-controller NVMe add-in cards, but the DC1000M is Kingston's first foray into the more mainstream datacenter U.2 market segment.

Kingston's DC1000M uses Silicon Motion's SM2270 controller, a 16-channel design that is SMI's first large enterprise SSD controller. The internal architecture is a bit unusual: it consists more or less of two 8-channel controller back-ends (derived from their consumer-grade controllers) behind a unified front-end with support for PCIe 3.0 x8 (though the DC1000M's U.2 connector limits it to an x4 link). That layout means the SM2270 has three pairs of ARM Cortex-R5 CPU cores in total: one pair for the front-end that handles the NVMe protocol, and one pair for each half of the back-end handling lower-level NAND management. Each half of the back-end also has its own 32-bit DRAM controller. The SM2270 features the same third-generation LDPC encoder as SMI's upcoming PCIe 4.0 controllers, a step forward from what's in the current SM2262(EN) consumer controllers.

The flash memory used by the DC1000M is Kioxia's 64-layer BiCS3 3D TLC, so performance and power efficiency may trail slightly behind drives using the newer 96-layer NAND. Capacities range from 960 GB up to 7.68 TB, all with endurance ratings of 1 drive write per day (DWPD). That's a bit lower than the 1.3 DWPD rating of Kingston's DC500M, but still twice the rating of their read-oriented -R models.
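For context, a 1 DWPD rating over the five-year warranty implies total-bytes-written figures along these lines (a back-of-the-envelope sketch using the capacities from this article; Kingston's official TBW numbers may be rounded differently):

```python
# Terabytes written implied by a DWPD endurance rating over the
# warranty period. Figures here are derived from this article's
# specs, not Kingston's official TBW ratings.
def rated_tbw(capacity_gb: float, dwpd: float, years: float = 5) -> float:
    """Total terabytes written: one full drive write per day, every day."""
    return capacity_gb * dwpd * 365 * years / 1000  # GB -> TB

for cap_gb in (960, 1920, 3840, 7680):
    print(f"{cap_gb} GB @ 1 DWPD: ~{rated_tbw(cap_gb, 1.0):.0f} TBW")
```

So even the smallest 960 GB model is implicitly rated for about 1.75 PB of writes over its warranty period.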

Kingston DC1000M SSD Specifications

Capacity               960 GB     1.92 TB    3.84 TB    7.68 TB
Controller             Silicon Motion SM2270
Form Factor            2.5" 15mm U.2
Interface, Protocol    PCIe 3.0 x4, NVMe
NAND Flash             Kioxia (Toshiba) BiCS3 64L 3D TLC
Sequential Read        3100 MB/s
Sequential Write       1330 MB/s  2600 MB/s  2700 MB/s  2800 MB/s
Random Read IOPS       400k       540k       525k       485k
Random Write IOPS      125k       205k       210k       210k
Power (W): Idle        5.14       5.22       5.54       5.74
Power (W): Avg Read    5.25       5.31       5.31       5.99
Power (W): Max Write   9.80       13.92      15.50      17.88
Write Endurance        1 DWPD
Warranty               5 years
MSRP                   $356.20    $552.50    $975.00    TBD
                       (37¢/GB)   (29¢/GB)   (25¢/GB)
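The cents-per-GB figures in the table follow directly from the MSRPs (a quick sketch using the prices listed above):

```python
# Verify the table's cents-per-GB figures from the listed MSRPs.
prices_usd = {960: 356.20, 1920: 552.50, 3840: 975.00}  # capacity (GB) -> price
for gb, usd in prices_usd.items():
    print(f"{gb} GB: {usd / gb * 100:.0f}\u00a2/GB")
```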

The sequential and random read performance ratings for the DC1000M both fall a bit short of saturating the PCIe 3.0 x4 interface, even for the larger capacity models. Sequential write performance is also a bit slow for a 16-channel drive, but random write speeds are competitive for the 1 DWPD market segment. Power consumption is fairly low for a drive with a 16-channel controller, with typical read power draw in the 5-6 W range and maximum power draw varying from about 10 W up to 18 W depending on capacity—all figures only slightly higher than we usually see on U.2 drives using 8-channel controllers.
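For reference, the PCIe 3.0 x4 ceiling those read ratings fall short of can be worked out from the link's 8 GT/s per-lane rate and 128b/130b encoding (a rough sketch that ignores NVMe/TLP protocol overhead, which typically costs another 10% or so in practice):

```python
# Raw bandwidth ceiling of a PCIe 3.0 x4 link, before protocol overhead.
GT_PER_S = 8.0        # PCIe 3.0 transfer rate per lane (GT/s)
ENCODING = 128 / 130  # 128b/130b line encoding efficiency
LANES = 4

# GT/s -> effective Gb/s -> MB/s (x1000/8), summed across lanes
ceiling_mb_s = GT_PER_S * ENCODING * LANES * 1000 / 8
print(f"PCIe 3.0 x4 ceiling: ~{ceiling_mb_s:.0f} MB/s")  # ~3938 MB/s
```

Against that ~3.9 GB/s raw ceiling (closer to ~3.5 GB/s after overhead), the DC1000M's 3100 MB/s rating clearly leaves some headroom on the link.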

The DC1000M supports the typical features expected of a datacenter U.2 drive, including hot-plug, end-to-end data path protection, and power loss protection. The smaller capacities of the DC1000M are currently available for purchase direct from Kingston or through distributors like CDW. The 7.68 TB model will be available soon.

Source: Kingston

3 Comments
  • phoenix_rizzen - Monday, March 2, 2020 - link

    EPYC has 128 PCIe lanes, although most motherboards only have around 64 lanes available to PCIe slots. With the right HBAs and cables, one could get at least 16 of these into a server, which would give about 123 TB of NVMe SSD storage (direct attached, no expanders).

    Now things are getting interesting. :D Although very expensive!
  • antonkochubey - Tuesday, March 3, 2020 - link

    The capacity itself is nothing interesting, Micron has had the 9300 MAX with up to 15.36 TB in U.2 format out for quite some time now. And it's a good bit faster, too.
  • bcronce - Thursday, March 5, 2020 - link

    Imagine the bandwidth
