PhiloComp.net

Computer Models in the Physical Sciences

In the physical sciences – and related parts of the biological and medical sciences – the use of computer models has become ubiquitous through sheer practical necessity. Even capturing the output of modern measuring equipment (from X-ray diffraction and tomography, through Nuclear Magnetic Resonance spectroscopy and Magnetic Resonance Imaging, to detectors in particle accelerators) requires the computerised interpretation of patterns that are far too complex for humans to process unaided. In other areas, measurement can be relatively straightforward (at least in principle), but the systems involved are so large and complex that working out the consequences of theories – and hence testing them empirically – is utterly infeasible without computational assistance. Perhaps the most familiar example here comes from the science of Global Warming, where computer models play an essential role.
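
To give a concrete (if drastically simplified) illustration of what "working out the consequences of a theory" computationally can mean, the sketch below steps a zero-dimensional energy-balance model of the Earth forward in time, balancing absorbed sunlight against emitted thermal radiation. It is a textbook toy, not anything used in actual climate research, and the albedo, effective emissivity and heat-capacity values are illustrative assumptions only.

```python
# A minimal sketch (not production climate code): a zero-dimensional
# energy-balance model of the Earth, stepped forward in time to find the
# equilibrium surface temperature implied by the chosen parameter values.
SOLAR_CONSTANT = 1361.0   # incoming solar flux, W/m^2
ALBEDO = 0.3              # fraction of sunlight reflected (assumed)
EMISSIVITY = 0.61         # effective emissivity standing in for the greenhouse effect (assumed)
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)
HEAT_CAPACITY = 4.0e8     # effective heat capacity per unit area, J/(m^2 K) (assumed)

def step(temperature_k: float, dt_seconds: float) -> float:
    """Advance the global mean temperature by one explicit Euler step."""
    absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4           # averaged over the sphere
    emitted = EMISSIVITY * SIGMA * temperature_k ** 4      # outgoing thermal radiation
    return temperature_k + dt_seconds * (absorbed - emitted) / HEAT_CAPACITY

temperature = 255.0                  # arbitrary starting guess, in kelvin
for _ in range(20000):               # roughly fifty years of daily steps
    temperature = step(temperature, dt_seconds=86400.0)
print(f"Equilibrium surface temperature: {temperature:.1f} K")   # approaches ~288 K
```

Even this one-equation model has to be iterated numerically to reveal what the assumptions imply; the climate models used in practice couple vast numbers of such equations across a three-dimensional grid, which is precisely why supercomputers are indispensable.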

Other high-profile examples – combining extreme complexity of both measurement and prediction – come from Astrophysics, notably The Millennium Simulation Project. This "used more than 10 billion particles to trace the evolution of the matter distribution in a cubic region of the Universe over 2 billion light-years on a side. It kept busy the principal supercomputer at the Max Planck Society's Supercomputing Centre in Garching, Germany for more than a month [in 2005]. By applying sophisticated modelling techniques, scientists recreated evolutionary histories both for 20 million galaxies and for supermassive black holes. By comparing such simulated data to large observational surveys, one can clarify the physical processes underlying the buildup of real galaxies and black holes."
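
The computational core of such simulations is the repeated evaluation of gravitational interactions among enormous numbers of particles. The fragment below is a minimal, purely illustrative sketch of that idea using direct summation and a leapfrog integrator; production cosmological codes (including the one behind the Millennium Simulation) rely on far more sophisticated tree and mesh algorithms to make billions of particles tractable, and every numerical value here is an arbitrary assumption for demonstration.

```python
# A toy gravitational N-body step: direct O(N^2) summation plus a
# kick-drift-kick leapfrog integrator. Illustrative only.
import numpy as np

G = 6.674e-11        # gravitational constant, SI units
SOFTENING = 1.0e7    # softening length (m) to avoid singular forces (assumed)

def accelerations(positions: np.ndarray, masses: np.ndarray) -> np.ndarray:
    """Pairwise gravitational accelerations for N particles (direct sum)."""
    diff = positions[np.newaxis, :, :] - positions[:, np.newaxis, :]   # r_j - r_i
    dist2 = np.sum(diff ** 2, axis=-1) + SOFTENING ** 2
    inv_dist3 = dist2 ** -1.5
    np.fill_diagonal(inv_dist3, 0.0)                                   # no self-force
    return G * np.einsum("ij,ijk->ik", masses[np.newaxis, :] * inv_dist3, diff)

def leapfrog_step(pos, vel, masses, dt):
    """Advance positions and velocities by one kick-drift-kick step."""
    vel_half = vel + 0.5 * dt * accelerations(pos, masses)
    pos_new = pos + dt * vel_half
    vel_new = vel_half + 0.5 * dt * accelerations(pos_new, masses)
    return pos_new, vel_new

# Example: evolve a small random cloud of equal-mass particles.
rng = np.random.default_rng(0)
pos = rng.normal(scale=1.0e9, size=(100, 3))    # positions in metres
vel = np.zeros_like(pos)                        # start from rest
masses = np.full(100, 1.0e24)                   # kilograms
for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, masses, dt=1.0e3)
```

Scaling this basic computation from a hundred particles to ten billion, over cosmological timescales, is what occupied the Garching supercomputer for more than a month, and it is only by comparing the resulting simulated catalogues with observational surveys that the underlying theories can be tested at all.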