                           TESTIMONY BY
 

                        DR. MALVIN H. KALOS
                  DIRECTOR, CORNELL THEORY CENTER
 
 
          TO THE SENATE SUBCOMMITTEE ON SCIENCE, TECHNOLOGY,
                            AND SPACE
 
 
             HEARINGS ON S. 272, THE HIGH-PERFORMANCE
                       COMPUTING ACT OF 1991
 
                      TUESDAY, MARCH 5, 1991
 
 Mr. Chairman, it is a privilege to be invited to comment on the "High-
 Performance Computing Act of 1991" in the company of such a
 distinguished group of representatives of government, industry, and
 academia.
 
 I am Malvin H. Kalos, Director of the Cornell Theory Center, and a
 professor of physics at Cornell University.  The Theory Center is an
 interdisciplinary research unit of Cornell University, dedicated to the
 advancement and exploitation of high-performance computing and
 networking for science, engineering, and industrial productivity. As
 you know, the Theory Center is one of the National Supercomputer
 Centers supported by the National Science Foundation. The Center
 also receives support from the State of New York, and from industry.
 
 My career spans 40 years of work with computers as a tool in
 physics and engineering. I have worked in universities, industry, and
 as a consultant to the Los Alamos, Livermore, and Oak Ridge national
 laboratories in research devoted to the application of high-
 performance computing to further their missions.
 
 We are witnessing a profound transformation of our scientific and
 engineering cultures brought about by the advent and adoption of
 high-performance computing and communications as part of our
 technological society. The changes, some of which we see now, some
 of which we easily surmise, and some of which we can only guess at,
 have had and will continue to have far-reaching benefits. Our
 economic well-being and the quality of our lives will be
 immeasurably improved. I salute the foresight and leadership of the
 authors and cosponsors of this Bill, and the Administration. Senators
 Gore and Hollings, Congressman Brown, and the President all
 understand the deep and positive implications for our future. We are
 also grateful for the support of Congressmen Boehlert and McHugh,
 whose backing of our efforts at Cornell and of the entire program
 has been very strong.
 
 The Director of the Office of Science and Technology Policy, Dr.
 Bromley, has done essential work in translating the ideas into
 effective policy.  The Federal Coordinating Council for Science,
 Engineering, and Technology (FCCSET) has, for the first time, brought
 unity into the Federal approach to high-performance computing. This
 is a well-designed, well-integrated program that shows good balance
 between the need to exploit advancing supercomputing technology,
 the need for very high performance networking, and the need to
 bring these new tools to the widest possible community through
 research and education.
 
 I will begin with some historical and philosophical remarks about
 science, using the history of physics, which I know best. Science is
 not a dry collection of disconnected facts, however interesting. The
 essence of science is the dynamic network of interconnections
 between facts. For a scientist, making a connection never perceived
 before can be the highlight of a career; the more distant the
 connection, the more it is valued. Our aim is to connect all we know
 in a seamless web of understanding. Historically, the greatest
 contributions of the greatest scientists have been such connections:
 Newton's between the fall of an apple and the motion of the Moon
 and planets; Maxwell's between the phenomena of electricity,
 magnetism, and the propagation of light; Einstein's leap of
 understanding connecting quanta of light and the photoelectric effect.
 These connections must be, to the greatest extent possible,
 mathematical and quantitative, not merely verbal or qualitative.
 Making these connections in a quantitative way remains at the heart
 of pure science today, but it has become harder as we try to probe
 into more and more complex phenomena, phenomena that cannot be
 analyzed by the mathematical tools at our disposal.  There are many
 important examples in science that shed light on this paradigm.
 
 Chemistry is one of our most important sciences, one that contributes
 enormously to our grasp of the physical world and one whose
 applications lie at the core of our understanding of materials we use,
 wear, and eat, and of our health. The fundamental understanding of
 chemistry lies in quantum mechanics and electricity, well understood
 since the 1930s. Yet the translation of that scientific understanding
 into quantitative knowledge about chemical materials and processes
 -- polymers, chemical catalysis, drugs both harmful and healing -- is
 very far from complete. Quite properly, chemistry is still largely an
 experimental science. But the power of modern supercomputers is
 transforming the face of chemistry at every level.  We are coming to
 understand how electrons cooperate to bind atoms into molecules and
 molecules into larger structures, and to elucidate their structural,
 dynamic, and biological effects.  However, extraordinary numerical
 precision, which can only be attained by very powerful
 supercomputers, is required for this vital work.
 
 Many other areas of science involve this kind of systematic
 connection among different phenomena at different scales of length
 or energy, including biology and medicine, the physics of materials,
 and astrophysics.
 
 The role of computation in linking disparate scientific fields is not a
 recent development. The early evolution of modern
 computers was dominated in the 1940s and 1950s by John von
 Neumann, who was also a great mathematician. He designed
 computers so that the very difficult questions that underlie such
 scientific and engineering problems as fluid flow could be explored
 and understood. Only later was it recognized that computers were
 also important business tools. The essential role of computers in
 science and engineering was well appreciated by many groups in
 the United States, including the national laboratories, and their use
 contributed very much to the development of nuclear weapons,
 fusion technology, and the design of aircraft.
 
 The use of computers in academic science and engineering evolved
 more slowly, partly because of the failure of many to see the
 possibilities, partly because the policies of the Federal government at
 the time discouraged scientists from participating fully. My own
 career was adversely affected by these policies. It was the
 leadership of a few scientists, notably Dr. Kenneth Wilson, that
 created the modern climate of respect for the accomplishments and
 possibilities of computational science in the future of our country.
 The constructive contributions of the Congress and the National
 Science Foundation in creating the National Supercomputer Centers
 are noteworthy. That creation marked, in a profound sense, the entry
 of mainstream American research into the era of computational
 science at the heart of science and engineering.
 
 It is also important to note that computational science is now an
 essential tool in experimental science as it is currently practiced. The
 most advanced scientific instruments -- optical and radio telescopes,
 particle accelerators, and computers themselves -- are studied,
 designed, optimized, and verified with computer simulation. Data
 collection is usually automated with the help of computers, and the
 reduction to comprehensible data sets and pictures may involve
 enormous computations. Exchange of large data sets and the
 cooperative work in understanding them will require very large
 computations and very heavy use of future high-capacity data
 networks. Finally, in many cases, even reduced data are
 incomprehensible except when studied in the light of complex
 theories that can be understood only by simulation.
 
 Now the entire scientific and engineering community of the country
 has the opportunity to exploit these new tools. Many researchers
 already are.
 Important new scientific discoveries are being made. New ideas and
 connections are seen everywhere. More important, students and
 young scientists, who are always the very heart of any important
 scientific change, are involved. They are coming to understand the
 techniques, the promise, and the limitations of computational science.
 Their knowledge and its applications are the most important
 products of our efforts, and they will carry the message to the rest of
 our society and to the future. It is they who will have the most direct
 impact upon industry in the United States.
 
 The science made possible throughout the nation by the resources of
 the Theory Center spans all scales of length and energy, from the
 galactic through the planetary and the earth's crust, through the
 behavior of man-made structures and of materials at the microscopic
 level, to the physics of elementary particles. From another
 perspective, it spans
 the traditional disciplines of physics, chemistry, mathematics,
 biology, medicine, all fields of engineering, and agriculture and
 veterinary medicine.
 
 Although I describe research at or made possible by the Theory
 Center, the other National Centers, at San Diego, Champaign-Urbana,
 and Pittsburgh, can easily list an equally impressive set of
 accomplishments in pure and multidisciplinary science.
 
 It is perhaps unfair to cite a few at the expense of so many others,
 but the work of Stuart Shapiro and Saul Teukolsky on fluids and
 fields in general relativity is outstanding and has been recognized by
 a significant prize, the Forefronts of Large-Scale Computation Award.
 Their research comprises both the development of mathematical and
 numerical methods for the exploration of astrophysical and
 cosmological phenomena and the use of these methods to develop
 quantitative understanding of the formation of black holes and the
 characteristics of gravitational radiation.
 
 John Dawson of UCLA uses the Theory Center resources to study the
 unexpected results of the Active Magnetospheric Particle Tracer
 Explorers (AMPTE) experiments.  In these, barium and lithium were
 injected into the
 earth's magnetosphere, creating, in effect, an artificial comet.  The
 observations contradicted existing theories and simulations.  Dawson
 and Ross Bollens constructed a hybrid theory and simulation that
 models the observed effect.
 
 Henry Krakauer of the College of William and Mary uses a modern
 "density functional" theory of electronic structure to examine the
 nature of the electron-phonon interaction, known to be responsible
 for low-temperature superconductivity. The aim is to determine its
 role in high-temperature superconductivity. Work like this is being
 carried out throughout the world and will require the fastest parallel
 supercomputers of the future.  Having them available to American
 researchers, including those who are not at major research
 universities, gives them and American industry a competitive edge.
 
 The research of Harold Scheraga and his group at Cornell into the
 three-dimensional structure of proteins shows an equally broad
 range of activity: the investigation of the fundamental interactions of
 the amino acid units with each other and with solvent atoms, the
 basic computational techniques needed to find the optimal structure,
 and the biochemistry of proteins. This is research that is particularly
 well suited to highly parallel computing, and will require, in the long
 run, the full use of future teraflops machines.
 
 Understanding the properties of the earth's crust is the subject of the
 research of Larry Brown and the Consortium for Continental
 Reflection Profiling (COCORP). This national group uses the
 supercomputers to reduce, display, and interpret the huge data set
 that is gathered by seismic probing (to 30 km or more) of the
 continental crust.
 
 I cited earlier the fundamental importance of scientific computing in
 enabling the connections among different phenomena within
 scientific disciplines. Even more important is its role in permitting
 quantitative connections among different disciplines, that is, in
 supporting multidisciplinary research. Every one of the large
 problems that confront our society, and to whose solutions we expect
 science to contribute, is in some sense a multidisciplinary problem.
 For example, issues of the environment involve many sciences --
 chemistry, physics, engineering, fluid dynamics, biology, and
 materials science.
 
 Medicine is equally demanding in its call upon diverse sciences.  As
 we have indicated, biochemistry and its relations to chemistry and
 physics play a central role in medicine. But other areas are
 important as well. As part of my oral presentation, I will show a
 video of a supercomputing study of the uses of ultrasound in the
 treatment of eye tumors. The building of modern prosthetic devices
 uses many resources of computation, from the reduction of CAT scans
 to the computational optimization of the mechanical properties of the
 devices. Understanding blood flow in the heart requires a mastery of
 the fluid dynamics of viscous media together with knowledge of the
 elastic
 properties of the heart and its valves.
 
 Bringing the knowledge from these fields together to make
 quantitative predictions about the effects of some technological or
 regulatory proposal is a difficult undertaking, one that is utterly
 impossible without the use of computational modeling on high-
 performance computers. Computational modeling is the indispensable
 natural language of quantitative multidisciplinary research.
 
 An outstanding example of such work is that by Greg McRae of
 Carnegie Mellon University. He uses supercomputers and
 supercomputer-based visualization to explain from basic chemistry,
 fluid mechanics, meteorology, and engineering the scientific effects
 that underlie the development of air pollution in the Los Angeles
 Basin, and the probable effects of fuel changes and regulatory
 procedures.  His results have been used to influence regulatory
 policy constructively.
 
 The Global Basins Research Network (GBRN), a consortium directed
 by Larry Cathles of the Geology Department of Cornell University and
 by Roger Anderson of Columbia University's Lamont-Doherty
 Geological Observatory, and which includes eight academic and
 eleven industrial
 partners, has as its goal the multidisciplinary understanding of the
 chemical, physical, and mechanical processes that occur in a
 sedimentary basin such as the one in the Gulf of Mexico below
 Louisiana. They have assembled a composite database of the
 observations of the basin and are using computational modeling to
 explain the data. But even the simple collection and display of the
 data in a coherent visual way has led to new and deeper
 understanding of the geology.
 The outcome of this understanding is very likely to improve oil
 recovery world-wide. I will also show a video clip of a visualization
 of the data set that was prepared jointly by the Theory Center and
 the GBRN.
 
 It is important to note that this research involves a wide range of
 geographically dispersed partners, and that the medium of
 information exchange is usually visual.  High-performance
 networking is essential to the GBRN and to similar scientific
 enterprises.
 
 Another important development is the establishment at Cornell of
 the Xerox Design Research Institute, with the participation of the
 Theory Center, the Computer Science Department, and the School of
 Engineering. Directed by Gregory Zack of Xerox, and involving
 researchers from Xerox centers nationwide, the aim of the Institute,
 quite simply, is to improve Xerox's ability to bring better products
 more quickly to market. The techniques are those of computational
 and computer science. A vital aspect of the research is the
 development of methods whereby the geographically separate
 centers can effectively collaborate.  Again, high-performance
 networking is key.
 
 As our reach extends, the partners required to carry out
 important collaborative research will rarely be found at one
 institution or even in one part of the country. Essential experimental
 devices or databases may exist anywhere. Rapid, concurrent access
 is essential, and it places ever higher demands on bandwidth.  The
 NREN is
 necessary for the full growth and exploitation of the scientific,
 technological, and educational implications of computational science.
 The GBRN and Xerox examples indicate that the greatest potential
 lies in industrial use.
 
 The supercomputing community will soon find itself at a major
 crossroads -- where the increases in performance needed for the
 fulfillment of our scientific mandate will demand parallel
 architectures. To exploit these new machines, a major retooling of
 software and algorithms will have to take place. This is not a trivial
 undertaking, yet it must be started very soon if we are to make
 progress on the Grand Challenge problems in the mid-1990s.
 
 The High-Performance Computing and Communications program will
 offer us an essential opportunity to bridge the gap between today's
 high-performance vector machines and tomorrow's highly parallel
 systems.
 
 I have emphasized how science and its application to societal
 problems are communal activities, activities that involve, more or
 less directly, the entire scientific community. Bringing to bear the
 transformation made possible by computational science in the most
 complete and positive way requires that its techniques and strategies
 be learned, used, and shared by the widest possible group of
 researchers and educators. That means advancing the art, acquiring
 the best and most powerful tools of hardware, software, and
 algorithms, and coupling the community in the tightest possible
 ways.
 
 The "High-Performance Computing Act of 1991" is a vital step in that
 direction.
 
