Showing results for tags 'memory'.
-
Five years after LPDDR5 was first introduced, and a matter of months before JEDEC finalizes the LPDDR6 standard, Samsung has announced a new, faster version of its LPDDR5X DRAM. When the South Korean tech giant debuted LPDDR5X back in October 2022, its natural successor to LPDDR5 ran at a nippy 8.5Gbps. The new chip runs at 10.7Gbps, over 11% faster than the 9.6Gbps LPDDR5T variant offered by its archrival, SK Hynix. Samsung is building the new chips on a 12nm-class process, which means the new DRAM isn't only faster but much smaller too (the smallest chip size of any LPDDR, in fact), making it ideal for on-device AI applications.

Improved power efficiency

“As demand for low-power, high-performance memory increases, LPDDR DRAM is expected to expand its applications from mainly mobile to other areas that traditionally require higher performance and reliability such as PCs, accelerators, servers and automobiles,” said YongCheol Bae, Executive Vice President of Memory Product Planning of the Memory Business at Samsung Electronics. “Samsung will continue to innovate and deliver optimized products for the upcoming on-device AI era through close collaboration with customers.”

Samsung's 10.7Gbps LPDDR5X boosts performance by over 25% and increases capacity by upward of 30% compared to LPDDR5. Samsung says it also raises the single-package capacity of mobile DRAM to 32GB. LPDDR5X offers several power-saving technologies, which improve power efficiency by 25% and allow the chip to stay in low-power mode for extended periods.

Samsung intends to begin mass production of the 10.7Gbps LPDDR5X DRAM in the second half of this year, following successful verification with mobile application processor (AP) and mobile device providers.
-
Everything you need to know about random access memory
TechRadar posted a topic in General Discussion
At its simplest, RAM (Random Access Memory) is a type of computer memory, often referred to as short-term memory because it is volatile, meaning that the data is not saved when the power is turned off. When business users switch on the computer, the operating system and applications are loaded into RAM, which is directly connected to the CPU, making the data quickly accessible for processing.

In corporate settings, RAM (memory modules) comes in different shapes and sizes. DIMMs (Dual In-Line Memory Modules) are found in desktops, workstations and servers, while laptops require the physically smaller SODIMM (Small Outline DIMM). A memory module contains several DRAM (Dynamic RAM) chips, a type of semiconductor memory. Dynamic simply means that the data held in the chips is constantly refreshed. The number of DRAM chips found on a memory module varies depending on its capacity (8GB, 16GB, 32GB).

The lithography of DRAM chips has been revised and improved many times over recent decades, which has led not only to reductions in cost-per-bit, but also to smaller components and higher clock rates. Overall, DRAM now delivers faster performance and higher capacities while using less power, which cuts energy costs, controls heat and extends battery life.

DRAM operates in one of two modes, synchronous or asynchronous. Asynchronous was the common DRAM technology used up until the end of the 1990s. Synchronous mode means that read, write and refresh operations are controlled by a system clock, synchronous with the clock speed of the computer's CPU. Today's computers use synchronous mode, or SDRAM (Synchronous Dynamic Random Access Memory), which connects to the system board via a memory module.

New generations of DRAM

The latest version of SDRAM is DDR5 (Double Data Rate, 5th generation), which comes in a range of standard speeds starting at 4800MT/s (megatransfers per second), an indicator of the speed at which data is transferred on and off the memory module. Approximately every seven years a new memory generation is introduced, designed to accommodate the ever-increasing demand for speed, density and configurations in business computing environments. DDR5, for example, debuted in 2021 and is designed with new features that provide higher performance, lower power and more robust data integrity for the next decade of computing.

IT decision makers who are considering purchasing memory must be aware that memory modules are not backwards compatible across generations: DDR5 memory will not physically slot into a DDR4 or DDR3 memory socket. Within a memory generation, faster speeds are backwards compatible. For example, if a user buys a standard DDR5-5600MT/s module and uses it with a 12th Generation Intel processor, the memory will automatically 'clock down' to operate at 4800MT/s, the speed supported by the host system, or lower depending on the model of the CPU and the number of memory modules installed in the system.

It's essential to know which processor and motherboard are already installed in the computer when planning a memory upgrade, but there are some other considerations too. Most PCs have four RAM sockets; some, such as workstations, have as many as eight, but laptops are likely to have only two accessible memory sockets, and in thin models there may be only one.
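To make the MT/s figures and the 'clock down' behaviour above concrete, here is a minimal Python sketch (our own illustration, not vendor tooling; the 4800MT/s platform limit is the 12th Gen Intel example from the text):

```python
# Minimal sketch of two ideas from the section above:
# 1) "Double Data Rate" means two transfers per clock, so DDR5-4800
#    corresponds to a 2400MHz I/O clock.
# 2) A module rated faster than the host platform supports simply
#    clocks down to the platform's maximum speed.

def io_clock_mhz(transfer_rate_mts: int) -> float:
    """DDR performs two transfers per clock cycle."""
    return transfer_rate_mts / 2

def effective_speed_mts(module_mts: int, platform_max_mts: int) -> int:
    """The memory runs at whichever rate is lower: module or platform."""
    return min(module_mts, platform_max_mts)

print(io_clock_mhz(4800))               # 2400.0 (MHz I/O clock)
print(effective_speed_mts(5600, 4800))  # a DDR5-5600 module clocks down to 4800
```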
Different types of RAM

Even though they may look similar and perform the same function, the type of memory module found in HEDT (High-End Desktop) systems and servers is different from the one found in PCs. Intel Xeon and AMD Epyc server CPUs come with a higher number of CPU cores and more memory channels than Intel Core and AMD Ryzen desktop CPUs, so the specifications and features of server RAM differ from those of PC RAM.

Server CPUs require Registered DIMMs (RDIMMs), which support ECC (Error Correcting Code), allowing bit errors that occur on the memory bus (between the memory controller and the DRAM chips) to be corrected and ensuring the integrity of the data. The RCD (Registering Clock Driver) is an additional component found on RDIMMs but not on Unbuffered DIMMs (UDIMMs); it ensures that all components on the memory module operate on the same clock cycle, allowing the system to remain stable when a high number of modules is installed.

The type of memory module made for desktops and laptops is generally non-ECC Unbuffered DIMM. The data processed on these systems is considered less critical than the data processed by servers, which may be hosting websites or handling online transaction processing, for example, and need to respect specific SLAs (Service-Level Agreements) and uptimes of 99.9999%, 24/7. Non-ECC UDIMMs contain fewer components and features than RDIMMs and are therefore more affordable while remaining a reliable memory solution. Unbuffered RAM exists in both DIMM and SODIMM form factors.

Boosting performance

RAM is primarily sold in single modules, but it is also available in kits of two, four or eight, ranging in capacity from 4GB for DDR3 to 96GB for DDR5 (in single modules) and up to 256GB in kits (256GB is offered only as a kit of eight in DDR4 and DDR5). The configurations match the memory channel architecture and, when installed correctly, can deliver a major boost in performance. To provide an example of the performance potential, upgrading from a single DDR5-4800MT/s module with a peak bandwidth of 38.4GB/s to a dual-channel setup instantly doubles the bandwidth to 76.8GB/s, as the sketch below illustrates.
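The 38.4GB/s and 76.8GB/s figures quoted above follow from simple arithmetic: transfer rate multiplied by eight bytes per 64-bit channel, multiplied by the number of channels. A minimal Python sketch, assuming the standard 64-bit-per-channel bus width (real-world throughput will be somewhat lower than these theoretical peaks):

```python
# Peak theoretical bandwidth of DDR memory, as used in the example above.
# Assumes a 64-bit (8-byte) data path per channel.

def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 1) -> float:
    bytes_per_transfer = 8  # 64 bits per channel
    return transfer_rate_mts * bytes_per_transfer * channels / 1000

print(peak_bandwidth_gbs(4800, channels=1))  # 38.4 GB/s - single DDR5-4800 module
print(peak_bandwidth_gbs(4800, channels=2))  # 76.8 GB/s - dual-channel configuration
```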
Accelerating speed

Users running industry-standard speeds are limited to what their computer's processor and motherboard will support, particularly if the platform won't allow modules to be installed in a second memory bank. On a dual-channel motherboard with four sockets, the sockets are arranged in two memory banks, with two sockets per memory channel. If a DDR5 user can install modules into a second bank, in most cases the memory may be forced to clock down to a slower speed to allow for limitations inside the processor.

Users looking for a considerable boost, such as gamers, can opt for overclockable memory. This can be done safely using Intel XMP and AMD EXPO profiles; however, professional help is advisable. Selecting the right gaming memory for overclocking a system means weighing price versus speed versus capacity, the potential limitations of motherboards and processors, and RGB versus non-RGB (to bring in the benefits of lighting).

Useful glossary of terms

Apart from the acronyms we've already explained above, here are some additional terms that are useful to know:

CPU – The Central Processing Unit is the core of the computer.

PMIC – Power Management Integrated Circuits help to regulate the power required by the components of the memory module. For server-class modules the PMIC uses 12V; for PC-class modules it uses 5V.

SPD hub – DDR5 uses a new device that integrates the Serial Presence Detect EEPROM with additional features, manages access to the external controller, and decouples the memory load on the internal bus from the external bus.

On-die ECC – Error Correction Code that mitigates the risk of data leakage by correcting errors within the chip, increasing reliability and reducing defect rates.

2CH, 4CH, 8CH – Shorthand for dual-channel, quad-channel and octal-channel memory configurations.

MHz – An abbreviation of megahertz, meaning one million cycles per second. This unit of frequency is used to denote the speed at which data moves within and between components.

MT/s – Short for megatransfers (or million transfers) per second; a more accurate measurement of the effective data rate (speed) of DDR SDRAM in computing.

Non-binary memory – The density of DRAM chips usually doubles with each generation, but with DDR5 an intermediary density of 24Gbit was introduced, providing more flexibility (it is what makes module capacities such as 24GB, 48GB and 96GB possible); this is called non-binary memory.

GB/s – Gigabytes per second. A gigabyte is a unit of data storage capacity that is approximately one billion bytes; it has been a common unit of capacity measurement for data storage products since the mid-1980s.

This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
-
Samsung has revealed it expects to triple its HBM chip production this year. “Following the third-generation HBM2E and fourth-generation HBM3, which are already in mass production, we plan to produce the 12-layer fifth-generation HBM and 32 gigabit-based 128GB DDR5 products in large quantities in the first half of the year,” said SangJoon Hwang, EVP and Head of the DRAM Product and Technology Team at Samsung, during a speech at Memcon 2024. “With these products, we expect to enhance our presence in high-performance, high-capacity memory in the AI era.”

Snowbolt

Samsung plans a 2.9-fold increase in HBM chip production volume this year, up from the 2.5-fold projection previously announced at CES 2024. The company also shared a roadmap detailing its future HBM production, projecting a 13.8-fold surge in HBM shipments by 2026 compared to 2023.

Samsung used Memcon 2024 to showcase its HBM3E 12H chip, the industry's first 12-stack HBM3E DRAM, which is currently being sampled with customers. It will follow Micron's 24GB 8H HBM3E into mass production in the coming months.

According to The Korea Economic Daily, Samsung also spoke of its plans for HBM4, its sixth-generation HBM chip, which the company has named “Snowbolt”. Samsung says it intends to apply the buffer die, a control device, to the bottom layer of the stacked memory for enhanced efficiency. It didn't provide any information on when that future generation of HBM will see the light of day, however.

Despite being the world's largest memory chipmaker, Samsung has lagged behind archrival SK Hynix in the HBM segment, forcing it to invest heavily to boost production of what has become a crucial component in the escalating AI race thanks to its superior processing speed.

SK Hynix isn't going to make things easy for Samsung, however. The world's second-largest memory chipmaker recently announced plans to build the largest chip production facility ever seen at the Yongin Semiconductor Cluster in Gyeonggi Province, South Korea.
-
Micron showcased its colossal 256GB DDR5-8800 MCRDIMM memory modules at the recent Nvidia GTC 2024 conference. The high-capacity, double-height, 20-watt modules are tailored for next-generation AI servers, such as those based on Intel's Xeon Scalable 'Granite Rapids' processors, which require substantial memory for training. Tom's Hardware, which saw the memory module first hand, says the company displayed a 'Tall' version of the module at GTC, but it also intends to offer standard-height MCRDIMMs suitable for 1U servers.

Multiplexer Combined Ranks DIMMs

Both versions of the 256GB MCRDIMM are constructed using monolithic 32Gb DDR5 ICs. The Tall module houses 80 DRAM chips on each side, while the Standard module employs 2Hi stacked packages and will run slightly hotter as a result.

MCRDIMMs, or Multiplexer Combined Ranks DIMMs, are dual-rank memory modules that employ a specialized buffer to allow both ranks to operate concurrently. As Tom's Hardware explains, “The buffer allows the two physical ranks to act as if they were two separate modules working in parallel, thereby doubling performance by enabling the simultaneous retrieval of 128 bytes of data from both ranks per clock, effectively doubling the performance of a single module. Meanwhile, the buffer works with its host memory controller using the DDR5 protocol, albeit at speeds beyond those specified by the standard, at 8800 MT/s in this case.”

Customers keen to get their hands on the new memory modules won't have long to wait. In prepared remarks for the company's earnings call last week, Sanjay Mehrotra, chief executive of Micron, said: “We [have] started sampling our 256GB MCRDIMM module, which further enhances performance and increases DRAM content per server.”

Micron hasn't announced pricing yet, but the cost per module is likely to exceed $10,000.
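To put the quoted 8800MT/s figure in context, here is a back-of-envelope Python sketch (our own illustration, not Micron's data; the 4400MT/s per-rank rate and the 64-bit module bus width are assumptions) of how multiplexing two ranks doubles the host-side data rate:

```python
# Rough sketch of the MCRDIMM idea described above: the buffer reads both
# physical ranks in parallel and multiplexes them onto the host bus at
# twice the assumed per-rank transfer rate.

PER_RANK_MTS = 4400        # assumed per-rank DDR5 transfer rate
RANKS_COMBINED = 2         # both ranks are accessed simultaneously
BYTES_PER_TRANSFER = 8     # assumed 64-bit module data bus

host_rate_mts = PER_RANK_MTS * RANKS_COMBINED               # 8800 MT/s, as quoted
peak_bandwidth_gbs = host_rate_mts * BYTES_PER_TRANSFER / 1000

print(host_rate_mts)       # 8800
print(peak_bandwidth_gbs)  # 70.4 GB/s theoretical peak per module
```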
-