
HPC Challenges for the Next Decade and Beyond: From Discovery to Applications at the Nano-Bio-Med Frontier

Connecting Industry, Government, and Academia: Invited Lecture at the CMSI Industry-Government-Academia Collaboration Symposium

Michael L. Klein
Professor, Institute for Computational Molecular Science,
Temple University, Philadelphia PA 19122 USA

[Photo: Michael L. Klein]

As George Whitesides said recently in Nature, scientists should try to solve problems that are important and recognizable to society. Practical problems can be more challenging than those that academics typically tackle; catalysis and polymers, for example, started in industry before becoming exotic fields within theoretical physics and synthetic chemistry.
Things in biology are not so simple. Consider, for example, a hybrid system of a DNA strand and a carbon nanotube.
Such a device is interesting because it can pick up the characteristics of functionalized DNA, which is becoming more and more important in biological contexts. This is the intersection of biology, physics, and nanotechnology. Each time a DNA strand binds to a carbon nanotube it takes on a different structure, so to discover the ideal packing we need to repeat the simulation hundreds of times; doing so required an IBM Blue Gene using 2048 processors for one month. It is not only heroic calculations using hundreds of thousands of processors that we need. We also need capacity computing to improve time to solution.
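The capacity-computing pattern described here is simple to sketch: many independent replicas of the same simulation, launched with different random seeds and compared afterwards. In this minimal Python sketch, run_replica is a hypothetical stand-in for a real MD engine; nothing here comes from the actual Blue Gene workflow.

```python
import multiprocessing as mp
import random

def run_replica(seed):
    """One independent run of the DNA-nanotube simulation (hypothetical).

    A real replica would invoke an MD engine with this random seed; here a
    mock 'binding score' keeps the workflow runnable end to end.
    """
    rng = random.Random(seed)
    return rng.gauss(0.0, 1.0)  # stand-in for a per-run observable

if __name__ == "__main__":
    # Capacity computing: hundreds of independent runs, trivially parallel,
    # where the goal is time to solution rather than single-job scale.
    seeds = range(200)
    with mp.Pool() as pool:
        scores = pool.map(run_replica, seeds)
    best = max(scores)  # e.g., the replica with the most favorable packing
    print(f"best score over {len(scores)} replicas: {best:.3f}")
```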
The K computer will be a significant step on the path to establishing computation as a tool that complements experiment. But the real breakthroughs of the last 30 years were algorithmic: the Feynman path-integral representation of quantum mechanics, the Nosé thermostat for simulating the canonical ensemble, and the Car-Parrinello method, which gave us the ability to simulate silicon both as a semiconductor and as a liquid metal. The latter was a truly significant breakthrough. It took a decade before the method was used widely, but since then it has become routine. So algorithms have played a key role, and in the latter case we had to wait for big machines to catch up.
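Of the algorithms just mentioned, the Nosé thermostat idea is compact enough to sketch in its common Nosé-Hoover formulation: an auxiliary friction variable is coupled to the kinetic energy so that time averages sample the canonical ensemble. Below is a minimal sketch for independent harmonic oscillators in reduced units; all parameter values are illustrative, not from the talk.

```python
import random

def nose_hoover_demo(n=64, steps=50_000, dt=0.002, T=0.5, Q=10.0, seed=1):
    """Nosé-Hoover thermostat on n independent 1D harmonic oscillators
    (reduced units, m = k = k_B = 1). The auxiliary friction xi obeys

        v_i' = -x_i - xi * v_i
        xi'  = (sum_i v_i**2 - n * T) / Q

    feeding kinetic energy back toward the target so that time averages
    sample the canonical ensemble at temperature T.
    """
    rng = random.Random(seed)
    x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    v = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    xi = 0.0
    v2_avg = 0.0
    for _ in range(steps):
        for i in range(n):
            v[i] += (-x[i] - xi * v[i]) * dt  # force plus thermostat friction
            x[i] += v[i] * dt                 # symplectic-Euler position step
        ke2 = sum(vi * vi for vi in v)        # twice the kinetic energy
        xi += dt * (ke2 - n * T) / Q          # thermostat feedback
        v2_avg += ke2 / n
    return v2_avg / steps  # equipartition: should approach T

if __name__ == "__main__":
    print(f"measured <v^2> = {nose_hoover_demo():.3f} (target T = 0.5)")
```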
In biology, the interesting things are micron sized. They are assemblies of nanoscale machines, and in the cell wall these machines are membrane proteins that transport ions or particles into and out of the cell. By understanding how these machines work, we can use nature's design principles to do things that nature didn't do. This is a frontier area of materials science.

But the length scales involved are beyond the reach of density functional theory. Even heroic calculations on current supercomputers can handle only about ten thousand lipids, still far short of an interesting size. That has led to the idea of using different methods at different scales: the bridging of length scales. The trouble comes where we want to join a quantum method to a classical atomistic method. There are as many methodologies for doing this as there are groups doing calculations, and it is an intrinsically tough problem to deal with (one widely used coupling scheme is sketched below).

In summary, we need more than just big hardware. We need algorithms. We do need heroic calculations that could only be done on a machine of the K computer class, but we also need capacity computing, and the latter is especially important for solving real-world problems.
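To make the quantum-classical coupling problem concrete: one widely used family of schemes is subtractive (ONIOM-style) QM/MM, in which the whole system is treated classically and the classical contribution of the embedded region is swapped out for a quantum one. This is a minimal sketch under that assumption, not the method the talk advocates; the energy functions and system names are placeholders.

```python
def oniom_energy(system, qm_region, e_qm, e_mm):
    """Subtractive QM/MM (ONIOM-style) total energy:

        E = E_MM(whole system) + E_QM(QM region) - E_MM(QM region)

    The cheap classical description of the embedded region is subtracted
    and replaced by the quantum one; e_qm and e_mm are placeholders for
    real electronic-structure and force-field codes.
    """
    return e_mm(system) + e_qm(qm_region) - e_mm(qm_region)

def mm_energy(atoms):
    return -1.0 * len(atoms)  # toy force field: -1.0 per atom

def qm_energy(atoms):
    return -1.2 * len(atoms)  # toy quantum model: -1.2 per atom

# Hypothetical membrane-protein system whose small reactive site
# is treated quantum mechanically.
full_system = list(range(10_000))
active_site = full_system[:50]
print(oniom_energy(full_system, active_site, qm_energy, mm_energy))
```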

Photo: Shuichi Yuri