Data reduction among techniques proposed to speed up computers


Future computer systems will need to be significantly faster than today's supercomputers, scientists believe. One reason is that properly analyzing complex problems, such as climate modeling, takes ever more work. The fresh insights and discoveries expected down the road will require massive quantities of calculations performed at high speed and delivered as error-free data analysis.

Current storage, processing, and software, among other components, impose limits, though.

The U.S. Department of Energy's four-year, $48 million Exascale Computing Project (ECP), launched at the end of last year for science and national security purposes, aims to overcome those challenges. On its Argonne National Laboratory website, the project explains some of the potential hiccups it expects to run into; part of the work is being carried out at the lab.

The most radical of the ECP's thrusts rests on the assumption that memory is, and will continue to be, too expensive to be used the way it is today, in which its capacity keeps pace with processing power.
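The project has not published its specific reduction methods here, but a minimal sketch of the general idea, trading stored data volume for a little extra on-the-fly computation, might look like the block-averaging routine below. The function name reduce_block_means and the 1024x1024 sample field are illustrative assumptions, not details of the ECP's work.

import numpy as np

def reduce_block_means(field, factor=4):
    """Shrink a 2-D simulation field by averaging non-overlapping
    factor x factor blocks, cutting the data volume by factor**2."""
    h, w = field.shape
    h, w = h - h % factor, w - w % factor          # trim to a multiple of factor
    trimmed = field[:h, :w]
    return trimmed.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Hypothetical usage: rather than writing the full-resolution field to storage
# at every time step, write only the reduced version, keeping storage traffic
# well below the rate at which the processors generate data.
field = np.random.rand(1024, 1024)                  # stand-in for one simulation snapshot
reduced = reduce_block_means(field, factor=8)       # 1024x1024 -> 128x128
print(field.nbytes, "bytes ->", reduced.nbytes, "bytes")

The trade-off is deliberate: recomputing or summarizing data on the fly is assumed to be cheaper than storing it all, which is exactly the imbalance between processing power and memory cost that the thrust anticipates.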
