Now that the Intel Optane Memory M.2 SSDs are readily available on the open market, anyone with an electron microscope and the skills to use it can begin to probe the secrets of 3D XPoint memory that Intel and Micron have been keeping tightly under wraps since announcing the new technology in July 2015. The reverse engineering experts at TechInsights have been doing just that, and they recently published their initial findings.

Die Size

With some of the first high-resolution die photographs of 3D XPoint, TechInsights has provided precise measurements of the die size and memory density. The 128Gb 3D XPoint die measures 206.5 mm², much larger than is typical for modern NAND flash or DRAM but comparable to Intel's 128Gb 20nm planar MLC NAND. A large total die size is typical for Intel and Micron: they have historically not catered to the mobile market with their NAND flash, while competitors like Samsung and Toshiba have striven to ensure their flash will physically fit in devices like smartphones. (That trend is changing with this year's introduction of 64-layer 3D NAND, for which Intel and Micron are producing both a larger 512Gb TLC part and a smaller 256Gb TLC part.)

Bit Density

The Intel Optane SSD DC P4800X uses memory of similar density to the Intel SSD DC P3700 that it is displacing as the flagship of Intel's SSD product line. When comparing similar chips, die size is a strong predictor of manufacturing cost, but 3D XPoint memory is quite different from NAND flash, whether older planar NAND or newer 3D NAND. Still, it is worth noting that the P4800X is arriving with a price tag about 25% higher than the P3700 initially carried. This suggests either that the 3D XPoint manufacturing process is more expensive than planar NAND's or that 3D XPoint yields are not yet mature, though much of the markup can also be explained by the lack of high-performance competition for Optane SSDs.
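As a quick sanity check on that density comparison, the die measurements above pin down the 3D XPoint bit density directly. The short sketch below is plain arithmetic with the 128Gb capacity and 206.5 mm² die size from TechInsights as its only inputs; since Intel's 20nm 128Gb MLC NAND die is of comparable size, its bit density lands in the same ballpark.

```python
# Bit density implied by the TechInsights measurements quoted above.
# Inputs: 128Gb die capacity and 206.5 mm^2 die area; nothing else is assumed.
die_capacity_gbit = 128       # 128Gb 3D XPoint die
die_area_mm2 = 206.5          # measured die size

bit_density = die_capacity_gbit / die_area_mm2
print(f"3D XPoint bit density: ~{bit_density:.2f} Gb/mm^2")   # ~0.62 Gb/mm^2
```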

TechInsights calculates that 91.4% of the 3D XPoint die area is occupied by the memory array itself. This is a much higher figure than for NAND flash, where the record is 84.9% for Intel/Micron 3D NAND with its "CMOS under the array" design that puts a large portion of the peripheral circuitry underneath the memory array instead of alongside it. Samsung's current 48-layer 3D V-NAND manages an array efficiency of just 70%, and 3D NAND from Toshiba and SK Hynix has been comparable. This means that once Intel gets around to increasing the layer count in future generations of 3D XPoint memory, they should be able to get much closer to ideal capacity scaling than 3D NAND memory can currently achieve.
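To make that scaling argument concrete, here is a minimal model rather than anything from the TechInsights report: assume die area and peripheral circuitry stay fixed as layers are added, so achievable bit density is roughly array efficiency × layer count × per-layer cell density. Under that assumption, a design's array efficiency is simply the fraction of ideal (all-array) scaling it captures.

```python
# Illustrative model only (an assumption, not a TechInsights result):
# if the periphery's die area stays fixed as layers are added, then
#   bit density ~ array_efficiency * layers * per_layer_cell_density,
# so array efficiency is the share of ideal scaling a design keeps.
ARRAY_EFFICIENCY = {
    "3D XPoint": 0.914,
    "IMFT 3D NAND (CMOS under array)": 0.849,
    "Samsung 48-layer V-NAND": 0.70,
}

for layers in (2, 4):
    print(f"-- {layers}x the layer count --")
    for name, eff in ARRAY_EFFICIENCY.items():
        achieved = eff * layers   # multiple of one ideal, periphery-free layer
        print(f"{name}: {achieved:.2f}x achieved vs {layers}x ideal")
```

The absolute multiples are not meaningful on their own; the point is only that a 91.4% efficient design leaves far less density on the table than a 70% one as layer counts climb.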

The analysis from TechInsights confirms that 3D XPoint memory is manufactured on a 20nm process, with the same pitch in both the bitline and wordline directions of the memory array. The DRAM market is only just moving beyond this milestone, so comparing the density of 3D XPoint to current DRAM highlights the fundamental capacity advantage 3D XPoint enjoys: around 4.5 times the density of typical 20nm DRAM, and about 3.3 times that of the most advanced 1Xnm DDR4 on the market. This gap is likely to widen with future generations of 3D XPoint.
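Working backwards from those ratios and the ~0.62 Gb/mm² computed earlier gives the implied DRAM densities. This is just arithmetic on figures already quoted, not additional measurement data.

```python
# Implied DRAM bit densities, derived from the ratios quoted above and the
# 128Gb / 206.5 mm^2 3D XPoint die; a cross-check, not new measurements.
xpoint_density = 128 / 206.5              # ~0.62 Gb/mm^2

implied_20nm_dram = xpoint_density / 4.5  # "around 4.5 times" advantage
implied_1xnm_ddr4 = xpoint_density / 3.3  # "about 3.3 times" advantage

print(f"Typical 20nm DRAM: ~{implied_20nm_dram:.2f} Gb/mm^2")
print(f"1Xnm DDR4:         ~{implied_1xnm_ddr4:.2f} Gb/mm^2")
```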

The materials and construction of an individual 3D XPoint memory cell have not been fully analyzed, but the cell appears to be a phase change memory element paired with a doped chalcogenide selector switch. The 3D XPoint memory array is constructed between the fourth and fifth metal interconnect layers above the silicon die.

Source: TechInsights

Comments

  • lefty2 - Sunday, May 28, 2017 - link

    So, is it or isn't it PCM?
    There's quite a funny article here: https://www.theregister.co.uk/2016/01/19/xpoint_in...

    "My guess is that the companies have found a way to make the same materials behave in a slightly different way," he continued. "Chalcogenide glasses are the basis of numerous emerging memory technologies such as oxygen vacancy and silver dendrites, so they don’t necessarily have to use the phase-change mechanism to achieve bit storage."
  • rolfaalto - Monday, May 29, 2017 - link

    Currently rocking a 32GB Optane as a very fast drive on my 4.7 GHz Devil's Canyon that I use for running intensive GHz-limited simulations. It's working brilliantly as a ludicrously fast pagefile, as I regularly bump into the memory limits of the machine. My code also uses a few GB of space to hand off large data to other workers, and the Optane is brilliant for that as well.

    Can't wait for the next generation, but this one's a giant leap above the M.2 SSD it replaced.
