Science and the 'Routes to Truth'
Since Western technocratic culture has globalized so effectively, it is worth reflecting on what that culture considers the “routes to truth,” and how they have evolved over the centuries. This is especially pertinent if one has a sneaking suspicion that the grounds are shifting, perhaps even being re-negotiated, in ways that have serious implications for environmentalism, sustainability, and indeed, science generally.
At a very high level, Western culture exhibits a historical interplay between two different models of truth. In medieval Europe, validity was determined in large part by how an observation aligned with revealed authority – the Church Fathers, and to some extent Plato and Aristotle: faith based on authority trumped observation.
It was not that observations of nature weren’t understood as meaningful, but, rather, that nature was understood to be an expression of God, and the essence of God was approachable not through nature, but through theology. With the coming of the Enlightenment and the writings of scholars such as Francis Bacon, however, observation and experimentation became first an independent, and then a dominant, source of validity. Science shifted from being subservient to religion to a belief system that is now regarded by most people as the authoritative discourse of modernity.
But this world view faces two broad challenges. The explicit one, posed in different guises by radical Islam and fundamentalist Christianity (cf. Intelligent Design), amounts to an attempted renegotiation of the Enlightenment elevation of observation over faith. The reasons for this challenge lie not in theology, which has considered, and by and large rejected, such unsophisticated framings of faith for many centuries, but in accelerating rates of change, the culturally disruptive effects of emerging technologies, and other factors, including in some ways a general assault on modernity.
The implicit, and frequently unrecognized, challenge is more profound, however, for it cuts to the question of whether scientific methods can still be regarded as avenues toward truth. Moreover, it arises not from reactions against change, which can to some extent be managed, but from the essence of the anthropogenic world that the Enlightenment has enabled. This world is characterized by increasingly complex and transdisciplinary systems that integrate the human, natural, and built domains in ways that give rise to emergent behaviors which, although increasingly engaged by science and technology, cannot be either studied or understood through the reductionist methods common to science since the Enlightenment.
As a result, observation and experiment are no longer the foundations of much cutting-edge science and technology. The complex systems characteristic of the anthropogenic Earth – climate change; the tension between environmental protection and development; food versus biofuels; emergent technology systems in the information and communication technology, nanotechnology, biotechnology, robotics, and cognitive science domains; and water systems in globalized market structures – cannot be understood merely by studying more and more about smaller and smaller subsystems, for it is precisely the emergent behavior of these systems qua systems that is the issue.
The only way to study these systems is through models that simplify their complexity in principled ways – that is, models that adopt general rules, founded on particular world views, determining what data are dropped, and what data retained, in constructing the models. Thus, for example, global circulation models focus on environmental rather than sociological data.
But models are neither observation nor experimentation (although they may contain both). Rather, they are a tool for exploring probability spaces of complex systems. This is especially true because when complex systems integrate built, natural and human domains they also integrate very different ontological perspectives, necessarily generating a complexity that cannot be understood by any single approach alone, no matter how sophisticated.
In short, while many scientific questions remain approachable through observation and experimentation, such simple methods are inapplicable to the complex systems which increasingly are our major challenges, and the tools we must use in such instances – computer techniques and modeling – raise far different validation and completeness issues than those contemplated by the current scientific culture.
Environmentalism, sustainability, and science itself are increasingly trapped in a dysfunctional time warp, vesting the certainty of reductionist scientific method in new tools and frameworks that cannot justify such confidence. The problem is not inherent in the models, nor can we wish away complexity. Rather, the problem is with ourselves, in that we fail to recognize the growing mismatch between the putative authority we grant our tools, and their inevitable partiality and contingency given the complexity of the systems we attempt to parse.
By Brad Allenby