Modern civilisation has set a trap for itself, as ever more complex technologies are deployed at an accelerating rate. Every second, billions of devices, protocols, ideas, traditions and people interact around the world. The resulting increase in complexity poses a huge and possibly unmanageable challenge.
Experts understand parts of the system, but the whole is far beyond the comprehension of any scientist, citizen or political leader. To address the big global problems of the next decade, we need a paradigm shift in societal regulatory systems to break us out of the complexity trap.
While humanity arrived at this point gradually, there have been foreshocks at earlier stages of technological development. Over the past several hundred years, science and technology, guided by reason and knowledge, have clearly improved daily life for most of humanity. But progress is not linear. Each advance produces some kind of disruption and side effects that society then struggles to address.
For example, the Haber-Bosch process for artificial fixation of nitrogen increased agricultural yields but has led to waterways being polluted with runoff from excessive use of some fertilisers. Chlorofluorocarbons, used as refrigerants, caused the ozone hole, but efforts to replace them gave rise to hydrofluorocarbons, which are potent greenhouse gases. And although antibiotics have saved hundreds of millions of lives, they are now used so widely that drug-resistant strains have become a new risk to human health. There are many more such examples across all areas of science and technology.
Such problems arise because of system-level effects that are not obvious when new technologies are first introduced. Unanticipated consequences can occur at almost any level — chemical, biological, computational, economic, financial and sociopolitical. But emergent complexity (moving beyond any prospect of direct human comprehension) becomes an increasingly serious problem with the rise of computers, as individual components of the system become smarter, interact more rapidly and connect on a global scale.
All of these challenges are intertwined with broader issues concerning science and society. As a scientist, I had studied the structure and design of DNA-binding proteins, but I resigned a tenured faculty position at the Massachusetts Institute of Technology's biology department to look at the larger challenges of human thought and humanity's future. I studied finance, cognitive neuroscience, governance, climate change, the risks of environmental degradation and the dangers posed by the rise of artificial intelligence. One thing became clear: the limits of human cognitive capacity leave us struggling to grasp the complexity of the problems now facing the planet.
So, what are we to do? It is not reasonable to ask scientists or other experts to anticipate the full effects of their work. Instead, a new approach to handling emerging complexity should begin by recognising that this complexity engenders two kinds of external costs paid by society. Some involve direct damage, such as when Facebook was used to incite hatred and disrupt the 2016 United States presidential election. Others are less direct, such as the time and attention society must spend sorting through new problems, like those associated with fossil fuels, and developing effective plans to address them.
Society faces a fundamental challenge in allocating complexity’s costs and benefits in a fair, reliable and well-structured way that ensures that those developing and selling new technologies repay society for the external costs.
For starters, we need better methods for evaluating potential problems. Companies developing new technologies, for example, should evaluate and mitigate risks at key points in the research, development and implementation stages. These evaluations should aim to anticipate a range of outcomes and weigh their respective costs and benefits to society.
Such initial assessments do not solve the complexity problem, but they frame it well enough to serve as a call for advice and comment. Open-ended discussions could be funded by governments, tech companies or philanthropists to preserve democracy and guarantee a livable future.
Democracy and capitalism, coupled with science, have given rise to a flourishing of thought, creativity, expression and invention, which has entrenched the longstanding assumption that knowledge — and prospects for human control of our fate — would steadily increase. But we have now entered a phase in which increasing complexity is creating a world that no one understands in detail.
Escaping this trap will require more than a technical fix involving a clever new program, device or brain implant. The discussion should begin with gating mechanisms and new types of regulatory schema that can serve as precautionary tools when technology is first introduced.
Ultimately, we will need to upgrade our methods of thought. This is a call to global action, worthy of our brightest minds. — © Project Syndicate
Carl O Pabo is the founder and president of Humanity 2050