Showing results for tags 'ai memory'.


Samsung has revealed it expects to nearly triple its HBM chip production this year. “Following the third-generation HBM2E and fourth-generation HBM3, which are already in mass production, we plan to produce the 12-layer fifth-generation HBM and 32-gigabit-based 128GB DDR5 products in large quantities in the first half of the year,” said SangJoon Hwang, EVP and Head of DRAM Product and Technology Team at Samsung, during a speech at Memcon 2024. “With these products, we expect to enhance our presence in high-performance, high-capacity memory in the AI era.”

Samsung plans a 2.9-fold increase in HBM chip production volume this year, up from the 2.5-fold projection previously announced at CES 2024. The company also shared a roadmap detailing its future HBM production, projecting a 13.8-fold surge in HBM shipments by 2026 compared to 2023.

Samsung used Memcon 2024 to showcase its HBM3E 12H chip, the industry’s first 12-stack HBM3E DRAM, which is currently being sampled with customers. It will follow Micron’s 24GB 8H HBM3E into mass production in the coming months.

According to The Korea Economic Daily, Samsung also spoke of its plans for HBM4 and its sixth-generation HBM chip, which the company has named “Snowbolt”. Samsung says it intends to apply the buffer die, a control device, to the bottom layer of the stacked memory for enhanced efficiency. It didn’t provide any information on when that future generation of HBM will see the light of day, however.

Despite being the world’s largest memory chipmaker, Samsung has lagged behind archrival SK Hynix in the HBM segment, forcing it to invest heavily to boost production of what has become a crucial component in the escalating AI race thanks to its superior processing speed.

SK Hynix isn’t going to make things easy for Samsung, however. The world’s second-largest memory chip maker recently announced plans to build the largest chip production facility ever seen at the Yongin Semiconductor Cluster in Gyeonggi Province, South Korea.