Samsung Electronics and SK Hynix are competing over the development of HBM2 (High Bandwidth Memory 2) DRAM technology, and both plan to present new papers at the 2016 International Solid-State Circuits Conference (ISSCC).
HBM is a standard for fast DRAM that uses TSV (Through-Silicon Via) technology to greatly increase bandwidth. HBM DRAM is expected to be used in next-generation GPUs, supercomputers, servers, and network equipment, where it should improve performance by sharply reducing bottlenecks in data processing.
Both Samsung Electronics and SK Hynix are finishing development of their HBM2 DRAMs, which are expected to enter mass production in the first half of 2016 at the earliest.
HBM stacks DRAM silicon dies using TSV technology to widen the memory bandwidth. First-generation HBM operated at 1.2 V and transferred data at 1 Gbps per pin across a 1,024-bit I/O interface. That works out to 128 GB of data per second, more than four times the speed of GDDR5 (28 GB per second). HBM2 will be even faster: by transferring data at 2 Gbps per pin, it can process 256 GB of data per second.
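The bandwidth figures above follow directly from the interface width and the per-pin data rate; a minimal sketch of the arithmetic (the function name is illustrative, not from any spec):

```python
def peak_bandwidth_gb_s(io_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak transfer rate in GB/s:
    (I/O width in bits * Gbit/s per pin) / 8 bits per byte."""
    return io_width_bits * pin_rate_gbps / 8

# First-generation HBM: 1,024 I/Os at 1 Gbps per pin
hbm1 = peak_bandwidth_gb_s(1024, 1.0)
print(hbm1)  # 128.0 GB/s

# HBM2: same 1,024-bit interface at 2 Gbps per pin
hbm2 = peak_bandwidth_gb_s(1024, 2.0)
print(hbm2)  # 256.0 GB/s
```

The same formula shows why the wide-but-slow HBM interface beats GDDR5's narrow-but-fast one: a 32-bit GDDR5 chip at 7 Gbps per pin delivers only 28 GB/s.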
Standardization of HBM DRAM was led by SK Hynix and AMD, and HBM mass-produced by SK Hynix was first used in AMD's R9 Fury GPU. For HBM2, additional companies, including Samsung Electronics and NVIDIA, have participated in standardization.
Samsung Electronics and SK Hynix will announce their HBM2 DRAM development results at the 2016 International Solid-State Circuits Conference (ISSCC), which opens in San Francisco on January 31. SK Hynix will introduce a 64 GB HBM2 product, while Samsung Electronics will introduce a product that can transfer 307 GB of data per second.