
Finding the Right Tools for Energy Research

Thu, 06/23/2011 - 9:52am
Aaron Knoll
Visual analysis in this fast-growing, diverse field demands new ways of leveraging HPC resources

Figure 1: Ball-and-stick ray tracing of a zeolite structure. Even high-quality rendering with depth cues has difficulty illustrating the overlapping channel structure. Data courtesy of Lei Cheng and Larry Curtiss (PI), Argonne Materials Science Division.
The explosion of nanoscale research driven by computational chemistry is changing how materials are modeled, validated and synthesized. More than ever, research in energy storage, catalysis and alternative fuels is being conducted in a computational sandbox before it is validated experimentally and, ultimately, manufactured. This fast-growing and diverse field demands sophisticated and flexible means of visualization and analysis, and calls for new ways of leveraging high-performance computing resources.

Materials research poses distinct challenges to the conventional pipelines used in scientific visualization and analysis; chief among these is the enormous range in scale of the phenomena examined, from electrostatics to large-scale mechanics. Even problems at the same scale can exhibit different behavior and require different analysis. Nanomaterials research often straddles theory, model and reality; in this field in particular, it is imperative to distinguish illustration from visualization. Choice of representation is important in understanding these problems: in many cases, standard analyses conducted on particles or individual surfaces are less helpful than measurements of electron density in a 3-D volume. Consequently, useful visualization of these phenomena calls for volume rendering techniques in addition to the standard particle, ball-and-stick and isosurface modalities.

The scope and diversity of materials problems, as well as the multidisciplinary nature of applied chemistry and physics research, have led to a large but fragmented set of computation and visualization and analysis tools. The challenge is to identify which tools are appropriate for each given problem, and to leverage computational resources for simulation, visualization and analysis as efficiently as possible.

Picking the right scale
Typically, computational chemistry for nanoscale materials consists of molecular dynamics (MD) simulations of systems of dozens to thousands of atoms. This level of detail is generally sufficient for medium-scale diffusion processes. To better understand smaller-scale phenomena, such as adsorption or diffusion of single atoms, chemists rely on more expensive density functional theory (DFT) computations operating on individual electrons. MD and DFT both yield particle and volume geometry to be visualized, but often with different semantics, so choosing the appropriate visualization and analysis is not always straightforward.
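
As a concrete, drastically simplified illustration of what an MD code does at the particle level, the following Python sketch advances a small Lennard-Jones system by one velocity-Verlet timestep. It is a toy, not any production method: real materials codes (LAMMPS, for example) add proper force fields, neighbor lists, cutoffs, periodic boundaries and thermostats.

```python
# Toy sketch of one velocity-Verlet MD step for a Lennard-Jones system.
# Illustrative only; no cutoff, periodicity, or thermostat.
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces, O(N^2) for clarity."""
    n = len(pos)
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            s6 = (sigma**2 / d2) ** 3
            # force magnitude divided by distance, from dU/dr
            fmag = 24.0 * eps * (2.0 * s6**2 - s6) / d2
            f[i] += fmag * r
            f[j] -= fmag * r
    return f

def verlet_step(pos, vel, dt=1e-3, mass=1.0):
    """Advance positions and velocities by one timestep."""
    vel_half = vel + 0.5 * dt * lj_forces(pos) / mass
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * lj_forces(pos_new) / mass
    return pos_new, vel_new
```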

Figure 2: Direct volume rendering of the charge density field of the zeolite structure, colored by depth. Data courtesy of Lei Cheng and Larry Curtiss (PI), Argonne Materials Science Division.
Illustration vs. visualization
Prior to simulation or experimental analysis, chemists create ball-and-stick structures in modeling software, then use offline rendering, such as ray tracing, for high-quality illustration. Animated illustrations move particles explicitly, as opposed to simulating diffusion or bonding. An illustration shows an ideal model of how particles should behave, which differs both from the visualization of a computational model and from ground truth. The goal of visualization is not better visual quality, but better understanding of the simulated phenomena.

Common modalities
Several visualization modalities are in common use for materials; the three most common are:

  • ball-and-stick rendering, often using an offline ray tracer for improved depth perception and visual quality
  • isosurfaces, which generate a single surface within the electrostatic volume
  • direct volume rendering, a more powerful but expensive way of interpreting volume data

Ball-and-stick rendering is excellent for modeling small structures, but has difficulty conveying depth and material boundaries. Isosurfaces are ideal for such boundaries, but are limited to a single value, which may not accurately identify the boundary surface. Most molecular visualization software handles both representations, but leaves users to perform volumetric rendering and analysis separately. However, materials scientists often are interested in making measurements guided by the visualization, particularly concerning boundaries or empty space in the electrostatic volume. Direct volume rendering paired with a volumetric analysis pipeline is a powerful means of doing this: a volume representation allows multiple measurements to be made from a single geometric representation, reducing conversion-related uncertainty.
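
To make the volume rendering modality concrete, here is a minimal Python sketch of the core of direct volume rendering: front-to-back emission-absorption compositing along a single ray through a scalar field, with a transfer function mapping density to color and opacity. The transfer function and sampling here are purely illustrative; a real renderer adds trilinear interpolation, gradient-based shading and image-parallel traversal.

```python
# Minimal sketch of direct volume rendering along one ray.
import numpy as np

def transfer(density):
    """Illustrative transfer function: density -> (rgb, opacity)."""
    a = np.clip((density - 0.2) * 2.0, 0.0, 1.0)      # opacity ramp
    color = np.array([density, 0.4, 1.0 - density])    # density -> hue
    return color, a

def ray_march(volume, origin, direction, step=0.5, n_steps=256):
    """Composite samples front to back; returns accumulated RGB."""
    rgb, alpha = np.zeros(3), 0.0
    p = np.asarray(origin, float)
    d = np.asarray(direction, float) / np.linalg.norm(direction)
    for _ in range(n_steps):
        i, j, k = np.floor(p).astype(int)              # nearest voxel
        if not (0 <= i < volume.shape[0] and 0 <= j < volume.shape[1]
                and 0 <= k < volume.shape[2]):
            break
        color, a = transfer(volume[i, j, k])
        rgb += (1.0 - alpha) * a * color   # emission, weighted by
        alpha += (1.0 - alpha) * a         # remaining transparency
        if alpha > 0.99:                   # early ray termination
            break
        p += step * d
    return rgb
```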

A molecular dynamics sandbox
Nanomaterial simulation aims to determine under what conditions a structure is stable or unstable. MD simulation data is relatively small by today's computational standards, consisting of hundreds or thousands of particles and megabytes per timestep. However, the number of timesteps per simulation run is high, numbering in the hundreds of thousands or millions. It is usually impractical to view a run as a continuous time sequence, so simulations are frequently viewed as a sequence of snapshots. Complexity is compounded by the variety of possible parameters in each run, such as temperature, pressure and initial geometry. The high dimensionality of the problem space, more than the size of the geometry, is the main obstacle to efficient computation and analysis.
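
A short sketch of why the dimensionality dominates: even a hypothetical sweep over a handful of values per parameter multiplies into dozens of runs, each with its own long timestep sequence. The parameter values below are invented for illustration.

```python
# Combinatorial growth of a small, hypothetical parameter sweep.
from itertools import product

temperatures = [250, 300, 350, 400]   # K (illustrative values)
pressures = [1, 10, 100]              # atm
geometries = ["slab", "pore", "bulk"]

runs = list(product(temperatures, pressures, geometries))
print(len(runs))  # 4 * 3 * 3 = 36 runs, before repeats or refinement
```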

In this context, batch analysis is unlikely to vanish from materials science anytime soon. However, there is a great need for faster and more automated analysis that collects results as a simulation progresses and terminates the run early when possible (a minimal sketch of such a monitoring loop follows the list below). Such analysis could include:

  • automatic registration of features of interest
  • computation of geometric (surface area, volume, curvature) and statistical (error, uncertainty) quantities
  • correlation of derived quantities from simulation and analysis
  • determination of structural integrity
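
As a hedged sketch of this kind of in-situ monitoring, the loop below checks a root-mean-square-displacement metric every few steps and stops early once a stability criterion fails. The "timestep" is a random-walk stand-in for a real MD engine, and the threshold is illustrative, not a physical criterion.

```python
# Sketch of in-situ analysis with early termination.
import numpy as np

def rmsd(pos, pos0):
    """Root-mean-square displacement from the initial geometry."""
    return np.sqrt(np.mean(np.sum((pos - pos0) ** 2, axis=1)))

def run_with_monitoring(pos0, n_steps=100_000, check_every=1000,
                        threshold=2.0):
    pos = pos0.copy()
    history = []                                   # (step, metric) pairs
    for step in range(1, n_steps + 1):
        pos += 0.01 * np.random.randn(*pos.shape)  # stand-in MD step
        if step % check_every == 0:
            metric = rmsd(pos, pos0)
            history.append((step, metric))
            if metric > threshold:                 # deemed unstable
                break                              # terminate early
    return pos, history

positions = np.random.rand(500, 3)                 # 500 atoms, toy units
final, history = run_with_monitoring(positions)
```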

Volumetric analysis allows for estimation of multiple quantities (surface area and volume) from the same geometric representation within the same framework. Transfer functions let users conveniently select subregions of the volume on which to conduct analysis. Pairing ensemble computation with comparative visualization and a volumetric analysis toolset could prove a powerful framework for these types of materials problems and could greatly accelerate the process of scientific discovery with high-performance computing resources.
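
A minimal sketch of such paired measurements, assuming a charge-density-like field on a regular grid: the region selected by an opacity-style threshold yields a volume estimate by voxel counting and a surface-area estimate from a marching-cubes mesh. scikit-image is used here as one library that provides these routines; it is not named in the article.

```python
# Volume and surface area from a single volumetric representation.
import numpy as np
from skimage import measure

def measure_region(field, level, spacing=(1.0, 1.0, 1.0)):
    """Estimate the volume and bounding-surface area of the region
    where `field` exceeds `level` (a transfer-function-style cut)."""
    voxel_vol = np.prod(spacing)
    volume = np.count_nonzero(field > level) * voxel_vol
    verts, faces, _, _ = measure.marching_cubes(field, level=level,
                                                spacing=spacing)
    area = measure.mesh_surface_area(verts, faces)
    return volume, area

# Toy field: a Gaussian blob standing in for a charge-density grid.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
rho = np.exp(-(x**2 + y**2 + z**2) / 0.1)
print(measure_region(rho, level=0.5))
```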

The goal is to enable scientific discoveries not only on today’s machines, but also on the extreme-scale supercomputers expected to be available in the next decade.

Aaron Knoll is an American Recovery and Reinvestment Act-funded Computational Postdoctoral Fellow within the Mathematics and Computer Science Division at Argonne National Laboratory. He may be reached at editor@ScientificComputing.com.
