HBM2

Back in March at their annual GPU Technology Conference, NVIDIA announced the long-anticipated 32GB version of their flagship Tesla V100 accelerator. By using newer 8-Hi HBM2 memory stacks, NVIDIA was able to double the accelerator’s previous 16GB of VRAM to a class-leading 32GB. At the time, company representatives told us that the launch of the 32GB model would be a wholesale replacement of the 16GB model, with the smaller version to be phased out and all future cards shipping as the 32GB model. However, this week NVIDIA has reached out to inform us that this will not be the case, and that the 16GB model is being continued after all. In a somewhat odd exchange, the official line from the company is that the...

NVIDIA’s DGX-2: Sixteen Tesla V100s, 30 TB of NVMe, only $400K

Ever wondered why the consumer GPU market is not getting much love from NVIDIA’s Volta architecture yet? This is a minefield of a question, nuanced by many different viewpoints...

28 by Ian Cutress on 3/27/2018

NVIDIA Bumps All Tesla V100 Models to 32GB, Effective Immediately

Update 05/24: NVIDIA has since reached out to us, informing us that their previous statement about 32GB cards replacing 16GB cards was in error, and that the 16GB V100...

7 by Ryan Smith on 3/27/2018

AMD to Ramp up GPU Production, But RAM a Limiting Factor

One of the more tricky issues revolving around the GPU shortages of the past several months has been the matter of how to address the problem on the GPU...

34 by Ryan Smith on 1/31/2018

Samsung Starts Production of HBM2 “Aquabolt” Memory: 8 GB, 2.4 Gbps

Samsung this week announced that it had started mass production of its second-generation HBM2 memory code-named “Aquabolt”. The new memory devices have 8 GB capacity and operate at 2.4...

17 by Anton Shilov on 1/11/2018

SK Hynix: Customers Willing to Pay 2.5 Times More for HBM2 Memory

SK Hynix was the first DRAM manufacturer to start producing HBM Gen 1 memory in high volume back in 2015. However, the company is somewhat behind its rival Samsung...

23 by Anton Shilov on 8/4/2017

Samsung Increases Production Volumes of 8 GB HBM2 Chips Due to Growing Demand

Samsung on Tuesday announced that it is increasing production volumes of its 8 GB, 8-Hi HBM2 DRAM stacks due to growing demand. In the coming months the company’s 8...

34 by Anton Shilov on 7/19/2017

SK Hynix Advances Graphics DRAM: GDDR6 Added to Catalogue, GDDR5 Gets Faster

SK Hynix has added GDDR6 memory chips to its product catalogue, revealing their general specifications and a launch timeframe sometime in Q4 2017. As expected, the new GDDR6 ICs will...

17 by Anton Shilov on 5/20/2017

SK Hynix Adds HBM2 to Catalog: 4 GB Stacks Set to Be Available in Q3

SK Hynix quietly added its HBM Gen 2 memory stacks to its public product catalog earlier this month, which means that the start of mass production should be...

43 by Anton Shilov on 8/1/2016

NVIDIA Unveils the DGX-1 HPC Server: 8 Teslas, 3U, Q2 2016

For a few years now, NVIDIA has been flirting with the server business as a means of driving the growth of datacenter sales of their products. A combination of...

30 by Ryan Smith & Ian Cutress on 4/6/2016

AMD Unveils GPU Architecture Roadmap: After Polaris Comes Vega

Although AMD’s GDC 2016 “Capsaicin” event was primarily focused on game development – it is the Game Developers Conference, after all – AMD did spend a brief moment discussing...

54 by Ryan Smith on 3/15/2016

JEDEC Publishes HBM2 Specification as Samsung Begins Mass Production of Chips

High-bandwidth memory (HBM) technology solves two key problems related to modern DRAM: it substantially increases the bandwidth available to computing devices (e.g., GPUs) and reduces power consumption. The first-generation...

42 by Anton Shilov on 1/20/2016
