Musings on chemistry and the history and philosophy of science
The Curious Wavefunction
Theories, models and the future of science
September 5, 2012
Last year’s Nobel Prize for physics was awarded to Saul Perlmutter, Brian Schmidt and Adam Riess for their discovery of an accelerating universe, a finding leading to the startling postulate that about 75% of our universe consists of a hitherto unknown entity called dark energy. This is an important discovery preceded by brilliant minds and an exciting history. It continues a grand narrative that starts with Henrietta Swan Leavitt (who established a standard reference for calculating astronomical distances), runs through Albert Einstein (whose despised cosmological constant was resurrected by these findings) and Edwin Hubble, continues through Georges Lemaître and George Gamow (with their ideas about the Big Bang), and culminates in our current sophisticated understanding of the expanding universe.
But what is equally interesting is the ignorance that the prizewinning discovery reveals. The prize was awarded for the observation of an accelerating universe, not the explanation. Nobody really knows why the universe is accelerating. The current explanation for the acceleration consists of a set of different models incorporating entities like dark energy, none of which has been definitively shown to explain the facts well enough. And this makes me wonder whether such a proliferation of models without accompanying concrete theories is going to characterize science in the future.
The twentieth century saw theoretical advances in physics that agreed with experiment to an astonishing degree of accuracy. This progress culminated in the development of quantum electrodynamics, whose accuracy, in Richard Feynman’s words, is equivalent to measuring the distance between New York and Los Angeles to within a hairsbreadth. Since then we have had some successes in quantitatively correlating theory with experiment, most notably in the work validating the Big Bang and the development of the standard model of particle physics. But as for dark energy, there’s no theory for it as of now that remotely approaches the rigor of QED when it comes to comparison with experiment.
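To get a feel for the numbers behind Feynman’s analogy, here is a back-of-the-envelope check; the distance and hair-width values below are assumed round figures, and the QED comparison is only the commonly quoted order of magnitude for the electron’s magnetic moment, not an exact number:

```python
# Back-of-the-envelope check of Feynman's analogy.
# All figures are rough, assumed round numbers, not precise measurements.
ny_to_la_m = 3.9e6   # ~3,900 km between New York and Los Angeles
hair_m = 1e-4        # ~0.1 mm, a typical width of a human hair

ratio = hair_m / ny_to_la_m
print(f"A hairsbreadth over the NY-LA distance: ~1 part in {1 / ratio:.0e}")
# -> roughly 1 part in 4e10, the same order of relative precision as the
#    often-cited agreement between QED and precision experiments on the
#    electron's magnetic moment.
```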
Of course it’s unfair to criticize dark energy since we are just getting started on tackling its mysteries. Maybe someday a comprehensive theory will be found, but given the complexity of what we are trying to achieve (essentially, explaining the nature of all the matter and energy in the universe) it seems likely that we may always be stuck with models, not actual theories. And this may be the case not just with cosmology but with other sciences. The fact is that the kinds of phenomena that science has been dealing with recently are multifactorial, complex and emergent. The kind of mechanical, reductionist approaches that worked so well for atomic physics and molecular biology may turn out to be too impoverished for taking these phenomena apart. Take biology for instance. Do you think we could have a complete “theory” of the human brain that can quantitatively calculate all brain states leading to consciousness and our reaction to the external world? How about trying to build a “theory” of signal transduction that would allow us not just to predict but to truly understand (in a holistic way) all the interactions with drugs and biomolecules that living organisms undergo? And then there are other complex phenomena like the economy, the weather and social networks. It seems safe to say that we shouldn’t anticipate real overarching theories for these phenomena anytime soon.
On the other hand, I think it’s a sign of things to come that most of these fields are rife with explanatory models of varying accuracy and validity. Most importantly, modeling and simulation are starting to be considered a respectable “third leg” of science, in addition to theory and experiment. One simple reason for this is the recognition that many of science’s greatest current challenges may not be amenable to rigorous theorizing, and we may have to treat models of phenomena as independent, authoritative explanatory entities in their own right. We are already seeing this happen in chemistry, biology, climate science and social science, and I have been told that even cosmologists are now relying extensively on computational models of the universe. My own field of drug discovery is a great example of both the success and failure of models. Here models are used not just in computationally simulating the interactions of drugs with disease-related proteins at a molecular level but in fitting pharmacological and X-ray diffraction data, in constructing gene and protein networks, and even in running and analyzing clinical trials. Models permeate drug discovery and development at every stage, and it’s hard to imagine a time when we will have an overarching “theory” encompassing the various stages of the process.
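As a toy illustration of one of these uses (fitting pharmacological data), here is a minimal sketch that fits a four-parameter Hill dose-response model to invented assay numbers with scipy; the doses, responses and resulting EC50 are all made up for illustration:

```python
# Minimal sketch: fitting a four-parameter Hill (dose-response) model to
# synthetic assay data. All numbers are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** n)

# Invented % inhibition readings at eight doses (molar)
doses = np.array([1e-9, 3e-9, 1e-8, 3e-8, 1e-7, 3e-7, 1e-6, 3e-6])
resp  = np.array([2.0, 5.0, 14.0, 33.0, 58.0, 79.0, 92.0, 97.0])

# Initial guesses: 0% floor, 100% ceiling, EC50 near the middle dose, slope 1
params, _ = curve_fit(hill, doses, resp, p0=[0.0, 100.0, 1e-7, 1.0])
bottom, top, ec50, n = params
print(f"EC50 ~ {ec50:.2e} M, Hill slope ~ {n:.2f}")
```

The same pattern, a parameterized model plus noisy data plus an optimizer, recurs at many stages of the pipeline, from binding assays to pharmacokinetics.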
Admittedly these and other models are still far behind theory and experiment, which have had head starts of about a thousand years. But there can be little doubt that such models can only become more accurate with increasing computational firepower and more comprehensive inclusion of data. How accurate remains to be seen, but it’s worth noting that there are already books that make a case for an independent, study-worthy philosophy of modeling and simulation; a recent book by the University of South Florida philosopher Eric Winsberg, for instance, exhorts philosophers of science to treat models not just as convenient applications and representations of theories (as if theories were the only fundamental things worth studying) but as independent explanatory devices in themselves that deserve separate philosophical consideration.
Could this then be at least part of the future of science? A future where robust experimental observations are encompassed not by beautifully rigorous and complete theories like general relativity or QED but only by different models which are patched together through a combination of rigor, empirical data, fudge factors and plain old intuition? This would be a new kind of science, as useful in its applications as its old counterpart but rooting itself only in models and not in complete theories. Given the history of theoretical science, such a future may seem dark and depressing. That is because, as the statistician George Box famously quipped, all models are wrong, but some are useful. What Box meant was that models often feature unrealistic assumptions about the details of a system, and yet allow us to reproduce the essential features of reality. They are subject to fudge factors and to the whims of their creators. Thus they can never provide the certain connection to “reality” that theories seem to. This is especially a problem when disparate models give the same answer to a question. In the absence of discriminating ideas, which model is then the “correct” one? The usual, convenient answer is “none of them”, since they all do an equally good job of explaining the facts. But this view of science, where models that can be judged only on the basis of their utility are the ultimate arbiters of reality and where there is thus no sense of a unified theoretical framework, feels deeply unsettling. In this universe the “real” theory will always remain hidden behind a facade of models, much as reality is always hidden behind the event horizon of a black hole. Such a universe can hardly warm the cockles of the heart of those who are used to crafting grand narratives for life and the cosmos. However, it may be the price we pay for more comprehensive understanding. In the future, Nobel Prizes may be frequently awarded for important observations for which there are no real theories, only models. The discovery of dark matter and energy and our current attempts to understand the brain and signal transduction could well be the harbingers of this new kind of science.
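The underdetermination behind Box’s quip is easy to demonstrate: over a limited range of data, structurally different models can fit the same observations about equally well, and nothing in the data alone singles out the “correct” one. Here is a toy sketch; the data and both models are invented for illustration:

```python
# Toy demonstration of underdetermination: two structurally different models
# fit the same invented data about equally well over a narrow range.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
x = np.linspace(0.1, 1.5, 20)
y = np.exp(0.8 * x) + rng.normal(0.0, 0.05, x.size)  # noisy 'observations'

def exponential(x, a, b):    # model 1: exponential growth
    return a * np.exp(b * x)

def quadratic(x, a, b, c):   # model 2: quadratic polynomial
    return a + b * x + c * x ** 2

for name, model, p0 in [("exponential", exponential, [1.0, 1.0]),
                        ("quadratic", quadratic, [1.0, 1.0, 1.0])]:
    params, _ = curve_fit(model, x, y, p0=p0)
    rmse = np.sqrt(np.mean((model(x, *params) - y) ** 2))
    print(f"{name}: RMSE = {rmse:.3f}")
# Both errors land near the noise level; only data well outside the fitted
# range (or an underlying theory) could discriminate between the models.
```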
Should we worry about such a world rife with models and devoid of theories? Not necessarily. If there’s one thing we know about science, it’s that it evolves. Grand explanatory theories have traditionally been supposed to be a key part- probably the key part- of the scientific enterprise. But this is mostly because of historical precedent, as well as a psychological urge for seeking elegance and unification. And even historically, sciences have progressed greatly without complete theories, as chemistry did for hundreds of years before the emergence of the atomic and structural theories. The belief that a grand theory is essential for the true development of a discipline has been resoundingly validated in the past, but its utility may well have plateaued. I am not advocating some “end of science” scenario here – far from it – but as the recent history of string theory and theoretical physics in general demonstrates, even the most mathematically elegant and psychologically pleasing theories may have scant connection to reality. Because of the sheer scale and complexity of what we are currently trying to explain, we may have hit a roadblock in the application of the largely reductionist traditional scientific thinking that has served us so well for half a millennium.
Ultimately what matters, though, is whether our constructs- theories, models, rules of thumb or heuristic pattern recognition- are up to the task of building consistent explanations of complex phenomena. The business of science is explanation; whether that explanation comes through unified narratives or piecemeal models is secondary. Although the former sounds more psychologically satisfying, science does not really care about stoking our egos. What is out there exists, and we do whatever’s necessary and sufficient to unravel it.
This is a revised version of a past post.