Allenby and Sarewitz
The Techno-Human Condition
Prepared by Michael Marien
June 2011


The Techno-Human Condition. Braden R. Allenby   (Director, Center for Earth Systems Engineering and Management and Prof of Environmental Engineering and Ethics, Arizona State U) and Daniel Sarewitz (Prof of Science and Society, ASU). Cambridge MA: The MIT Press, May 2011, 222p, $24.95.

Very few books can freshly illuminate an entire sector or issue area. This important and provocative volume, on the role of technology in society at today’s critical point, provides valuable insight into at least three GFB generic categories, notably science/technology, security, and methods. It also offers a solid critique of transhumanism and human enhancement—the movement that favors a new stage of species development enabled by technology—to illustrate the limits of current ways of understanding technological change.
Transhumanism Questioned
The first two chapters are devoted to transhumanism, which can also be seen as a variety of the technological hyper-optimism that is often conspicuous in Western culture, and especially American culture. Transhumanists see many possible avenues of technological development that will continue to drive changes in human capabilities. But they radically oversimplify both the challenges that transhumanism claims to address, and the institutional and social frameworks in which people are defined and function. They assume that the “human” will only be improved and enhanced, not transcended, rendered obsolete, or even degraded. The language used to promote transhumanism (by Hans Moravec, Ray Kurzweil, Gregory Stock, and others) is “an agenda for human betterment that in other contexts marks the domain of faith and spiritual practice.” Enhancing cognitive abilities and reducing pain and suffering are of course desirable, but the technologies that can achieve such benefits may also have less happy effects.
Enhancement at the individual level need not lead to an enhanced individual or to an enhanced society. Moreover, in a world dominated by large and competing institutions, “we can make two predictions with considerable confidence: 1) the beneficiaries of enhancement will generally not be individuals, but institutions; drivers for enhancement will be economic efficiency and competition for military and cultural dominance, not quality of life or ‘better humaneness,’ even if we knew (or could agree on) what the latter was; 2) particular enhancements cannot be viewed in isolation: they are changes in highly complex and adaptive systems.” Future dilemmas about human enhancement will be much like the current ones: we will not suddenly find ourselves in a world where we can buy computer-brain interfaces that boost IQ by 100 points, or genetic modifications that make one impervious to aging. Such technologies will be approached slowly and unevenly, “with front-page claims of amazing advances one day and page-seven revelations of disappointed expectations a year later.”
Three Levels of Technology
A simple taxonomy showing levels of technological function is provided, to allow a clearer understanding of the differences between toasters and nuclear weapons. A Level I technology such as a jet airplane may be hugely sophisticated, yet physically discrete, tangible, and recognizable, and it very effectively meets our requirements. Level II technology, such as airline corporations and the government security apparatus, is less bounded, and includes complex sociotechnical subsystems that, acting together, create behaviors that cannot be predicted. Level II system complexity that accompanies a reliable Level I technology raises the likelihood of unintended consequences. A vaccine is an exemplary, bounded, Level I transhumanist technology. But contrast its effectiveness with the chaos of the Level II U.S. health-care system—an emblem of inefficiency, dysfunction, and inequity—that administers vaccines.
A third level that we are not so familiar with—a level at which technology is best understood as an Earth system (a complex, constantly changing and adapting system in which human, built, and natural elements interact)—reflects our Anthropocene era as a world increasingly dominated by one species. “Any meaningful discussion of technology in the age of the anthropogenic Earth must emphasize the transformative role of technology at Level III, the level of Earth systems. At this level, technology is always coupled to other Earth systems, including other technologies.”
Technological change is always potent. Today we have five enabling technologies undergoing rapid evolution: nanotechnology, biotechnology, robotics, information and communication technology (ICT), and applied cognitive science. The most radical prediction for most people is probably that of “functional human immortality” within 50 years, due either to biotechnology or to ICT. The pervasive implications of such change, especially if rapid, are difficult to overstate, e.g.: “the idea of sustainability…would have to be entirely rethought. And this is just one element of a wildly multifaceted wave of technological change.” Virtual immortality is a “Level I framing of a Level III problem, even at the individual level,” illustrating how the concept of transhumanism, and the yea-or-nay debate surrounding it, can now be seen as “desperately impoverished.”
“How can human intentionality and rationality—those paragons of the Enlightenment project—be meaningfully expressed when accelerating technological evolution and complexity make a mockery of conventional notions of comprehensibility?” Despite all the effort aimed at better understanding, there is a “remarkable absence of effective practice” in broad areas of human affairs such as ecosystems management, weapons non-proliferation, immigration policy, etc. A new measure of rationality in this world—one that suits the complexity we are creating—will need new concepts, new tools, new arrangements, and perhaps even new gods.
On the contrary, we may be at the beginning of the end of the great Enlightenment project of democratic power. Life in Level III is confusing and challenging, provoking among some groups a turn toward fundamentalist certainties that can offer the balm of stability amid spiraling complexity. “Fundamentalism is on the rise in virtually all major religions, as well as in certain belief systems—e.g., environmentalism, neoconservatism.” Even as technological transformation is a central component of cultural and economic dominance, it provokes opposition to itself. Opposition to technology is an honorable historical tradition, but “what may be unprecedented now is the rapidity, the cultural reach, and the global scale of the cultural transformations themselves, and, in turn, the magnitude and ferocity of social response.”
In sum, Level I technology deals with a simple system. We lose this simplicity with the networked technologies of Level II, where we begin to experience complexity that is often surprising and unpredictable, but it is complexity we can understand. When complexity becomes “wicked” at Level III, operating at Earth systems scale, all bets are off. Any solution to a wicked problem (a term apparently first introduced in the early 1970s), should be expected to create unanticipated but equally difficult new problems. “The techno-human condition embeds us irretrievably in wicked complexity.” There is no correct policy or resolution to a wicked problem, nor is there optimality. There is only muddling through, which is the best we can do, along with avoiding Level I and Level II thinking in a Level III world.
National Security as a Level III System
Human enhancement and technological complexity are at the very heart of the most powerful driver of innovation and social transformation: the rapidly evolving interplay among emerging technologies, military operations, and national security. This intimate relation appears to be central to the techno-human condition. A chart on page 138 illustrates the dramatic complexity of the techno-security challenge by identifying four major realms of coupled change: 1) Revolutions in Military Technologies (Level I technologies such as lethal autonomous robots; cyborg insects for surveillance and sabotage, whether by implanting electronics in real insects or by creating insect-sized robots; cyberspace conflict; telepathic helmets with a computer-brain interface; and CBI-controlled weapons platforms); 2) Revolutions in the Nature of Conflict (asymmetric warfare, the “responsibility to protect” doctrine, democratization of WMDs, weaponization of Earth systems, conflict unbounded in cyberspace); 3) Revolutions in Civilian Systems (power structure shifts, civil blowback on military operations); 4) Revolutions in Military Operations and Culture (privatization of military functions, loss or fragmentation of military culture, warrior vs. gamer, etc.).
Each of these realms is changing, each is contingent, each is unstable, and each is coupled to the others. Taken together, they form a potent Level III system into which rapidly emerging military technologies are injected. Cyborg insects, for example, will be a powerful Level I military tool in a counterinsurgency environment, but at Level II they could be a threat to personal privacy if deployed in civil society. As this technology spreads, Level III implications will kick in, e.g. as regards the ongoing struggle between totalitarian and open government.
Methods for Engaging the Level III World
The essential attributes of a society that can wisely address the ever-complexifying turbulence are right in front of our noses. But our Enlightenment instincts send us in the wrong direction, seeking knowledge and certainty, when “what is needed most is the courage and wisdom to embrace contradiction, celebrate ignorance, and muddle forward (but intelligently).” We inhabit Level III, but we act as if we live on Level II, and we work with Level I tools.
Basic principles for engaging with the Level III world:
  • Eschew the quest for “solutions” (what is needed is adaptability in the face of change, not stability in response to problems);
  • Focus on option spaces (have both technological and social options available when our planned paths go off in wildly suboptimal directions);
  • Pluralism is smarter than expertise (this is the social-system equivalent of option spaces: the more perspectives and voices contributing to social perception of challenges, the more likely that alternative paths can be developed);
  • Play with scenarios (another way to develop social options for adjusting to unpredictable and rapidly changing situations);
  • Lower the amplitude and increase the frequency of decision-making (many small decisions allow much more attention to be paid to complex systems as they evolve; thus gaps between policy and reality don’t grow dangerously large);
  • Always question predictions (“efforts to predict the future of Level III technologies are always wrong”);
  • Evaluate major shifts in technological systems before implementing policies and initiatives designed to encourage them (people and economies tend to fall in love with some technologies and not to question their potential for serious consequences, e.g. the US adopting corn-based ethanol as a biofuel);
  • Ensure continual learning (in view of unpredictability and complexity, learning at the personal and institutional level must be built into any governance process);
  • Do not confuse economic efficiency with social efficiency (the former can be measured, and Level I technologies often enhance it; social efficiency is Level III because of its wicked complexity—conditions that are accepted and wisely managed, rather than “problems” that are solved);
  • Intervene early and often (the best time to start talking about alternatives is when ignorance is great and the horizon is fuzzy);
  • Accept and nourish productive conflict (humans are most adaptive and creative in periods of bounded conflict—not too much, which brings chaos or destruction, or too little which results in stasis).
“Our point is not to encourage ill-informed decisions and discussions, but to encourage, welcome, and embrace a capacity to reflect, at the early stages of technological decision-making.” Upstream, “ignorance-based reflection” improves the capacity of people to grapple meaningfully with the techno-human condition, enabling technological change to move toward more socially desirable directions. Ethical uncertainty looks much like factual uncertainty when it comes to the techno-human condition. “Moral dialogue with a continually evolving and uncertain system means that different, even mutually exclusive worldviews are aspects of effective action.” Muddling is an important ethical process, in that “ethics itself is an evolving system in a rapidly changing world.” That Level III macroethics shares traits of uncertainty and complexity with Level III technology evolution in the Anthropocene does not mean that one gives up on ethics; rather, it imposes different kinds of obligations.
Essential reading for anyone interested in Methods to shape the future, World Futures, Security, Sustainability, and Science/technology in general. Also, more specifically, Transhumanism, complexity, wicked problems, ethics, human enhancement, and more. Interestingly, Technology Assessment, quite fashionable in the 1980s and 1990s but now a seeming historical relic, is not mentioned per se at all. Yet this stimulating and timely book is very much about tech assessment, more appropriately seen at three levels of complexity.