Has Technology Assessment Kept Pace with Globalization?

bridges vol. 18, July 2008 / Pielke's Perspective

by Roger Pielke, Jr.





Right now, in the area of energy policy, decision makers are debating a number of important but complicated questions.  For example, various perspectives are offered on the effects of biofuels mandates on global food prices; OPEC and the United States are presenting vastly different projections of gasoline demand in the 2020s, based on differing views about the future adoption of renewable energy technologies; and the US Congress is considering legislation to regulate speculation on future commodity prices in global financial markets, amid sharply differing views on how such speculation affects current energy prices.

A characteristic common to each of these policy debates is uncertainty or, perhaps more accurately, competing claims to certainty, usually offered by those with a stake in the outcome.  In today's ever-globalizing world, the effects of technologies - such as biofuels, renewable energy technologies, and financial instruments - are far-reaching and hard to see.  Unintended consequences are to be expected.  In this context, decision makers would benefit from an authoritative, independent perspective on technology assessment.  Unfortunately, such capabilities do not appear to have kept pace with globalization and its consequences.

Of course, the challenges of globalization are not new.  Not long ago in this space I discussed a very interesting paper by M. Scott Taylor of the University of Calgary, who argued that a European innovation in the tanning of animal hides led directly to the slaughter and near extermination of the American bison in the latter part of the 19th century.  One can follow a similar approach, tracing technological reverberations through governments and markets, to ask whether early 1980s policy responses to technological advances in crop production led directly to the emergence of Mad Cow Disease in the United Kingdom.

During the 1970s, farm productivity increased dramatically around the world, due primarily to technological innovations in agriculture.  In the United States, rising productivity coupled with government production subsidies resulted in a supply of commodities that exceeded demand.  The outcome was lower food prices and corresponding financial hardship for many farmers.  Of course, no government likes to see its farmers suffer financial hardship, so in January 1983 President Ronald Reagan announced a new farm policy that paid farmers to take certain crops out of production, in order to stimulate higher commodity prices and thereby boost the incomes of US farmers.

The effects of the policy were large and immediate.  By the end of 1983, the US Department of Agriculture estimated that US production of corn would drop by 49 percent from the year before, with rice dropping by 33 percent and wheat by 20 percent.  Although soybeans were not covered by the payment program, their production also fell by 33 percent, because many farmers shifted their plantings to the now higher-priced corn and wheat.  The reductions in crop production were exacerbated by a widespread drought during the 1983 growing season.

In the global marketplace, the decrease in US soybean production raised the cost not just of soy meal but also of fish meal, both of which serve as animal feed.  One result of the higher cost of imported soya and fish meal in the United Kingdom was an immediate increase in the proportion of lower-cost meat and bone meal used in cattle feed - from 1 percent to 12 percent of the total (the weakness of the UK pound in international currency markets also contributed to the higher cost of imports).  Much later, after Mad Cow Disease became a matter of wide concern and led to a scandal in the UK government, it was learned that the epidemic had its origins in the meat and bone meal used in cattle feed.

There is a general pattern here.  In the case of the near extermination of the American bison, European wars created a demand for leather, the necessity that mothered the invention of techniques for tanning bison hides.  These technological advances and their deployment stimulated a market demand with effects that were immediate and merciless.  Similarly, technological advances in agricultural production in the 1970s, coupled with generous domestic farm subsidies, led to the production of crops at a rate that exceeded demand.  The US policy response was to reduce the supply of crops in an effort to boost prices and benefit domestic farmers.  This worked in the short term, but it also set loose a domino effect of consequences through the global economy, creating an economic incentive for a large shift in the content of cattle feed in the United Kingdom, which led directly to the conditions that caused an epidemic of Mad Cow Disease.

Of course, the fate of the bison was far from the minds of 19th-century European military leaders, and cattle feeding practices in the United Kingdom were of no concern when Ronald Reagan sought to boost the incomes of US farmers.  But given the profound changes that technology wreaks on society - and the potentially far-reaching effects of policies that respond to those changes - what can decision makers do to better manage the consequences of technological change?  Whose responsibility should it be to assess the unintended consequences of technologies in a globalized world?

As we watch the dramatic effects of technology-related decisions reverberate around the world today, it may be time to develop an independent and authoritative international technology assessment capability to inform current debates.  The alternative is for policy makers to rely on competing claims to certainty or, worse, simply to take actions with little understanding of either causes or consequences.

***


Roger Pielke, Jr. is the former director of the Center for Science and Technology Policy Research (2001-2007). He has been on the faculty of the University of Colorado since 2001 and is a professor in the Environmental Studies Program and a fellow of the Cooperative Institute for Research in the Environmental Sciences (CIRES).

