Startup Cerebras Systems develops largest-ever silicon chip, with 1.2 trillion transistors, for faster artificial intelligence (AI) computing and training

Briefing

August 20, 2019

  • Largest Chip – Cerebras Systems announced the Cerebras Wafer Scale Engine (WSE), the largest silicon chip ever built, with 1.2 trillion transistors and an area of 46,225 square millimeters, 56.7 times larger than NVIDIA's largest graphics processing unit (GPU)
  • Components – Include 400,000 AI-optimized compute cores, 18 gigabytes of local static random access memory (SRAM), and an on-chip mesh-connected communication network, delivering 3,000 times more high-speed on-chip memory and 10,000 times more memory bandwidth than the largest GPU
  • Advantages – Include high-speed calculations and communication at hundreds or thousands of times the performance of existing solutions, with lower latency, greater efficiency, and lower power consumption, enabling faster AI computation and training
  • Manufacturing – Partnered with Taiwan Semiconductor Manufacturing Company (TSMC), which will manufacture the chip using its advanced 16-nanometer process technology
  • First Customers – Include the U.S. Department of Energy, particularly Argonne National Laboratory and Lawrence Livermore National Laboratory, which will integrate the chip into supercomputers for deep-learning experiments in science, engineering, and health
  • Future Plans – Include expanding to other labs and selling computers based on the chip in Q4 2019
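The size comparison in the first bullet can be checked with quick arithmetic. A minimal sketch, assuming the comparison point is the die area of NVIDIA's largest GPU at the time (roughly 815 square millimeters, an assumption consistent with the stated 56.7x ratio):

```python
# Sanity check of the briefing's size comparison.
wse_area_mm2 = 46_225          # WSE area as stated in the briefing
ratio = 56.7                   # "56.7 times larger" as stated

# Implied area of the reference GPU die (assumed to be NVIDIA's
# largest GPU at the time, ~815 mm^2).
implied_gpu_area_mm2 = wse_area_mm2 / ratio
print(round(implied_gpu_area_mm2))  # prints 815

# Implied transistor density of the WSE, in millions per mm^2.
transistors = 1.2e12
density_m_per_mm2 = transistors / wse_area_mm2 / 1e6
print(round(density_m_per_mm2))  # prints 26
```

The numbers are internally consistent: dividing the WSE area by the stated ratio recovers a die size in line with the largest monolithic GPUs of that generation.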

Accelerator

Sector

Energy, Government (excluding military), Healthcare/Health Sciences, Information Technology

Function

IT Infrastructure, Research and Development

Organization

Cerebras Systems

Source

Original Publication Date

August 19, 2019
