
Cloud computing to boost global memory IC market

14 August 2012

Despite suffering from significant oversupply problems, the global memory integrated circuit (IC) market will be driven forward by data centers and the advent of cloud computing, according to a new report from international business analyst firm GBI Research.

According to the company's latest report, the growing adoption of cloud computing services by enterprises and consumers is driving the design and development of memory ICs suited to shared servers and storage devices.

In-memory cloud computing offers large amounts of memory storage through a new storage tier and is currently in high demand, with applications such as Apple’s iCloud leading the way. Continuing development will expand the market for memory ICs, which will in turn help cloud computing infrastructure and data centers deal with storage and reliability issues.

The main challenge for such cloud computing infrastructure services is to maintain high system performance at low infrastructure cost, even as data volumes grow continuously. Advances in flash memory, specifically NAND flash, are of key importance, as higher-density parts enable more applications per server.

Dynamic random-access memory (DRAM) is also a key memory component in large-scale computer systems. Samsung’s green DDR3 DRAM is currently impressing tech-heads around the world with its high density, low power consumption and high speed, making it well suited to the demands of cloud computing. Its energy efficiency has also earned it a place in the recently unveiled SuperMUC supercomputer at the Leibniz Supercomputing Centre in Garching, Germany.

In 2011, global sales revenue for memory ICs was just over USD 62 billion, and GBI Research forecasts the market to grow at a CAGR of 6.3% during 2012-2016 to reach USD 85 billion. The memory IC market was damaged by the global economic crisis, but growing demand for memory-dependent devices will drive the industry for the foreseeable future.
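
For readers who want to check the arithmetic, the forecast above is consistent with straightforward compound growth. The short Python sketch below is an illustration only, not GBI Research's model; it assumes five compounding periods from the 2011 base of roughly USD 62 billion at the stated 6.3% CAGR:

    # Illustrative sanity check of the forecast quoted above (not GBI Research's model).
    # Assumption: the 2011 figure of "just over USD 62 billion" compounds at 6.3% per year
    # for five years, from the 2011 base through 2016.
    base_2011 = 62.0   # global memory IC revenue in 2011, USD billion
    cagr = 0.063       # forecast compound annual growth rate (6.3%)
    years = 5          # compounding periods from the 2011 base to 2016

    forecast_2016 = base_2011 * (1 + cagr) ** years
    print(f"Projected 2016 revenue: about {forecast_2016:.1f} billion USD")
    # Prints roughly 84.2, in line with the article's "USD 85 billion" figure.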

 

Source: EE Times
