What Role Do Computers Play in Weather Forecasting?

Because of the amount of computation required, meteorologists have used the fastest computers available to do their numerical modeling. NWP has advanced greatly in six decades, in large part due to the spectacular growth in the speed and capacity of digital computers. To give you a sense of scale, one of the first commercial computers used for atmospheric research was the IBM 1620, which could perform about a thousand (10³) operations per second.

The majority of supercomputers today are designed as computer clusters, using thousands of processors (usually off-the-shelf) programmed to work together in parallel on increasingly complex problems. (Currently, since actual testing of nuclear weapons is no longer allowed, about half of the fastest of the world's supercomputers are used for simulations in atomic research. Maybe that's a good thing.)

The very fastest of today's massively parallel supercomputers clip along in petaflops (10¹⁵ floating-point operations per second). As of October 2012, the likely high-performance computing (HPC) leader is a Cray XK7 called Titan at Oak Ridge National Laboratory. In addition to 16-core Opteron CPUs, each of its 18,688 nodes uses an NVIDIA Graphics Processing Unit (GPU) accelerator to help keep down power consumption. Titan is capable of over 20 petaflops and has more than 700 terabytes of memory.
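To appreciate how far the hardware has come, we can compare the two machines mentioned above directly. This is just back-of-envelope arithmetic using the figures from the text; the IBM 1620's rate of roughly a thousand operations per second is an approximate historical value.

```python
# Compare the IBM 1620 (~10^3 operations/s) with Titan (over 20 petaflops),
# using the round figures quoted in the text.
ibm_1620_ops_per_sec = 1e3    # ~a thousand operations per second
titan_flops = 20e15           # 20 petaflops = 20 x 10^15 operations per second

speedup = titan_flops / ibm_1620_ops_per_sec
print(f"Titan is roughly {speedup:.0e} times faster than the IBM 1620")
```

That is a factor of about 2 × 10¹³, or twenty trillion, in roughly half a century.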

Atmospheric scientists and climate researchers (not just NWP people) use and generate huge amounts of data. The National Center for Atmospheric Research (NCAR) estimated that in 1997 their Mass Storage System maintained computer files totaling 30 terabytes (30 × 10¹² bytes) of data. By late 2000 the data stored had grown to over 200 terabytes; by 2003 it had grown exponentially to over a petabyte (10¹⁵ bytes); and by 2008, NCAR's mass storage surpassed 5 petabytes of data, with a net growth rate of 80 to 100 terabytes per month.
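The growth from 30 terabytes in 1997 to roughly 5 petabytes in 2008 really is exponential, and we can estimate the implied doubling time from those two endpoints. This is a rough sketch using decimal units (1 petabyte = 1000 terabytes), not an official NCAR figure.

```python
import math

# Estimate the doubling time of NCAR's mass storage from the text's
# endpoints: 30 TB in 1997 growing to ~5 PB (5000 TB, decimal) by 2008.
tb_1997 = 30.0
tb_2008 = 5000.0
years = 2008 - 1997

# For exponential growth, doubling time = elapsed_time * ln(2) / ln(growth_factor).
doubling_time = years * math.log(2) / math.log(tb_2008 / tb_1997)
print(f"Implied doubling time: ~{doubling_time:.1f} years")  # ~1.5 years
```

In other words, NCAR's archive was doubling roughly every year and a half over that decade.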

In the middle of 2006, Google's largest computer cluster was estimated to have 4 petabytes of random access memory! Its mass storage is far larger. And when fully operational, CERN's Large Hadron Collider is expected to generate 15 petabytes of data each year in particle physics experiments.

But there are inherent limits to numerical weather prediction that even the fastest computers can't overcome.