Gung-Ho about Atmospheric Modeling

by CMS Director
Mike Rezny

We live in interesting times. Something particularly interesting to me is a major project being undertaken by the UK Met Office, in collaboration with the UK academic community, to have a new atmospheric model, Gung-Ho, in operation around 2020. The project is being funded by the UK Research Councils: the Natural Environment Research Council (NERC) and the Science and Technology Facilities Council (STFC).

Given the complexity of developing a new model, and the need for it to outperform the existing world-class model, this is going to be an exciting project. Even more so since, for the first time, the Met Office is not developing a dynamical core in-house but is working with a number of external partners. Hence the project name, Gung-Ho, which originally meant something along the lines of “working together harmoniously”.

The CMS team was delighted to have been invited by the Met Office to participate in the Gung-Ho project and we are working in two areas:

  • to provide guidance on the underlying software frameworks for the model, and
  • to understand and improve the processes of generating input datasets, making models more user-friendly.

The current Met Office atmospheric model, the Unified Model, has been around for some years; parts of it are perhaps older than some of our readers. There was a major revamp some years ago, named New Dynamics, and another refresh, aptly named End Game, is due out next year.

So, why would an organisation build a new model from scratch when it already has one of the best atmospheric and climate models in the world? The short answer is scalability.

Future supercomputers will most likely gain increased performance only by utilising larger numbers of commodity processors. The high-performance computer currently at the National Computational Infrastructure has 11,936 cores. Its replacement, due early next year, will have 57,472 cores.

The supercomputer occupying the number one slot on the June 2012 Top 500 list, an IBM BlueGene/Q at Lawrence Livermore National Laboratory, has 1,572,864 cores, although these are not considered to be commodity processors. Other machines on the list use up to 300,000 commodity cores. Supercomputers using over a million commodity cores are not going to be far away.

Scientists and operational weather centres are continually striving for increased resolution, and to achieve it their models will need to scale to increasingly large core counts. Unfortunately, some of the underlying model fundamentals will not scale, and a complete redesign will be necessary.

One of the bottlenecks to scalability of the Unified Model, and of most other atmospheric models, is the rectangular latitude-longitude (lat-lon) grid. Whilst this is a good choice of grid for simulating small areas of the earth, it has a major limitation near the poles: the meridians converge, so grid cells cluster ever more tightly (resolution clustering), and the poles themselves are coordinate singularities. Oh, how much easier would atmospheric modeling be in Discworld, or with a flat earth!
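
To see how severe the clustering is, here is a minimal sketch (my own illustration, not from the project) computing the east-west width of a one-degree grid cell at various latitudes. The width shrinks with the cosine of latitude, and explicit time-stepping schemes must shrink their timestep with the smallest cell (the CFL constraint), which is one reason the clustering hurts scalability.

```python
import math

R_EARTH_KM = 6371.0   # mean Earth radius (km)
DLON_DEG = 1.0        # zonal spacing of a 1-degree lat-lon grid

# East-west cell width = R * dlon * cos(latitude): cells cluster
# towards the poles, and an explicit scheme's stable timestep
# (the CFL condition) is limited by the smallest cell on the grid.
for lat_deg in (0.0, 45.0, 80.0, 89.0, 89.9):
    width_km = R_EARTH_KM * math.radians(DLON_DEG) * math.cos(math.radians(lat_deg))
    print(f"latitude {lat_deg:5.1f} deg: cell width {width_km:8.3f} km")
```

On a one-degree grid the cell width collapses from roughly 111 km at the equator to under 200 m at 89.9 degrees latitude.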

The heart of an atmospheric model is the dynamical core, the part of the model that solves the equations governing large-scale adiabatic processes. These equations are solved numerically by discretising them on the underlying grid.
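
As a toy illustration of discretisation (deliberately far simpler than anything in a real dynamical core), the sketch below advances the one-dimensional linear advection equation, du/dt + c du/dx = 0, on a periodic grid using a first-order upwind finite-difference scheme.

```python
import numpy as np

# Toy discretisation: 1-D linear advection  du/dt + c*du/dx = 0
# on a periodic grid, stepped with first-order upwind differences.
nx = 100
c = 1.0                               # advection speed
dx = 1.0 / nx
dt = 0.8 * dx / c                     # CFL number of 0.8 keeps the scheme stable

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.5) ** 2)   # initial Gaussian pulse

for _ in range(int(1.0 / dt)):        # advect the pulse once around the domain
    # upwind update (valid for c > 0): u_i <- u_i - c*dt/dx * (u_i - u_{i-1})
    u = u - c * dt / dx * (u - np.roll(u, 1))
```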

The Gung-Ho project is still in the evaluation stage, and significant effort is currently focused on finding a grid structure that will allow a dynamical core to scale up to a million cores. There are a number of contenders: cubed-sphere, Yin-Yang and its variants, triangular, and pentagonal-hexagonal grids are all being considered (see the figures below).

Cubed sphere: (a) conformally projected, with isolines equally spaced in map coordinates, and (b) gnomonically projected, with equal angles at an axis of the sphere. A figure from the paper Horizontal grids for global weather and climate prediction models: a review. Images reproduced with the permission of John Thuburn.

Icosahedral grid construction: (c) after decomposition into 42 subtriangles, and (d) the dual pentagonal-hexagonal grid of (c). A figure from the paper Horizontal grids for global weather and climate prediction models: a review. Images reproduced with the permission of John Thuburn.

All these contenders solve the pole clustering problem and promise superior scalability to a lat-lon grid.
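
To make one contender concrete, here is a minimal sketch (my own illustration, not project code) of the gnomonic cubed-sphere construction: an equiangular grid on a cube face is projected radially onto the unit sphere, giving quasi-uniform cells with no polar singularity.

```python
import numpy as np

def gnomonic_cubed_sphere_face(n):
    """Grid points for one face of an equiangular gnomonic cubed sphere."""
    # equal angles across the face, spanning -45 to +45 degrees
    angles = np.linspace(-np.pi / 4.0, np.pi / 4.0, n)
    xi, eta = np.meshgrid(np.tan(angles), np.tan(angles))
    # points on the +z face of the unit cube ...
    x, y, z = xi, eta, np.ones_like(xi)
    # ... projected radially (gnomonically) onto the unit sphere
    r = np.sqrt(x * x + y * y + z * z)
    return x / r, y / r, z / r

x, y, z = gnomonic_cubed_sphere_face(33)   # one of six identical faces
```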

Unfortunately, at present, none of these alternatives matches the numerical properties of a lat-lon grid. The choice may well come down to picking a scalable grid and accepting the one whose numerical degradation can best be tolerated.

This is an exciting research area, and if you are interested I recommend reading the paper by Andrew Staniforth and John Thuburn: Horizontal grids for global weather and climate prediction models: a review, Q. J. R. Meteorol. Soc., 138:1-26, 2011.

Another major area being investigated is the choice of numerical solvers, but I will leave that for a future article. However, for those who cannot wait, I recommend reading The Evolution of Dynamical Cores for Global Atmospheric Models by David L. Williamson, J. Meteor. Soc. Japan, 85B:241-269, 2007.

Mike Rezny,
Manager, Computational Modeling Systems team,
ARC Centre of Excellence in Climate Systems Science
Michael.Rezny@monash.edu
