
Sunday, January 26, 2014

Science Policies Can't Just Rest on Facts


Hurricane Sandy, shown here a day before landfall on Oct. 28, 2012, was one of the largest storms ever to hit the East Coast. Scientists pointed to the damage caused by the hurricane as one example of the problems climate change could exacerbate. NASA Earth Observatory image by Robert Simmon with data courtesy of the NASA/NOAA Geostationary Operational Environmental Satellite Project Science team.


Science and technology policies are based on some facts, but facts only account for a small proportion of the considerations that shape these policies. Reputation, money, politics and cultural perceptions all play major roles in determining which facts are pursued by scientists, as well as how those facts, once uncovered, are used by other groups.

Among scientists, reputations act as heuristic signals for colleagues. Reputations are built primarily through publications, awards, funding and other mechanisms that originate outside the laboratory.

Individual scientists may not always value these heuristic mechanisms as much as their peers do. Percy W. Bridgman, for instance, ignored an interview request from the Associated Press about the announcement that he had won the Nobel Prize so that he could continue his experiments.[1]

But avoiding participation in the policy sphere is nearly impossible for most scientists. This was true even a century ago, when scientists and engineers in the electrical, aeronautical, and agricultural sciences wrestled with the effects of patents and industrial funding that threatened idealized notions of “pure science.” World Wars I and II ramped up state support, giving many scientists what they took to be independence.[2]

State patronage also comes with strings, however. David Edgerton points out that even during the heyday of science’s growth in the U.S. at the end of World War II, the vast majority of government sponsorship was for applied research with military goals.[3]

Edgerton says that if one “follows the money,” one quickly discovers that most science for most of history actually occurred in applied contexts, and that applied research almost always outstripped basic research in funding, even if popular conceptions of science have not recognized this.[4]

Even where goals remained purely scientific, government-sponsored science faced political concerns that shaped agendas. This became apparent to members of the meteorological community as they attempted to build a cooperative global infrastructure for exchanging weather data in the early twentieth century. The first incarnation of this effort, the International Meteorological Organization (IMO), was purely voluntary. To keep up with technological changes and to keep control of the standard-setting process, the IMO petitioned to become part of the U.N. Its establishment as an intergovernmental organization ultimately allowed it to expand its scope and become a truly globalized entity. But that status also brought political and definitional challenges. The organization was forced, for instance, to create “observer” statuses for large states like China that were not at the time members of the U.N., instead of allowing them full inclusion in the coordination process.[5]

Politics and money can have profound influences on the arrangement of policies governing the practice of science. Despite these forces, scientists have been largely successful at uncovering facts within those research programs that have been funded, even when those facts have disagreed with the interests of industry or government. The question remains: how are these facts incorporated into new policies, if at all?

Global warming is perhaps the most striking recent example of the failure of facts to effectively inform political debate and policymaking in the U.S. Since the issue first exploded into public consciousness in the late 1980s, industry-backed denial campaigns have sought to undermine public confidence in the science itself. Front groups for oil, gas and other fossil fuel companies have employed a small number of contrarian scientists to attack peer review and prestigious science organizations that recognize global warming, such as the National Academy of Sciences.[6]

Politicizing the issue has polarized the public. Gallup surveys showed that 49 percent of self-identified Republicans in 2001 believed the effects of global warming had already begun. By 2010, that had dropped to 29 percent. In the same period, the percentage of Democrats who believed global warming had begun rose from 60 percent to 70 percent.[7]

Public relations campaigns have also had a direct influence on perceptions of science among policymakers. Former Texas Republican Representative Tom DeLay in 1995 dismissed the Intergovernmental Panel on Climate Change’s report without having read it, saying, “But it’s been my experience that . . . the conclusion is usually written before the study is even done.” Former California Republican Representative John Doolittle relied on a single think-tank-sponsored scientist, S. Fred Singer, to justify his denial of the peer-reviewed science supporting climate change.[8]

Public officials generally have neither the time nor the expertise to properly weigh scientific issues, or to judge the credibility of a given institution’s work. They rely a great deal on lobbyists and think tanks to provide them with a bottom line. They also have cultural and political biases that influence their actions.

The Smithsonian's National Air and Space Museum is a prime example of the role that cultural concerns play when they intersect with scientific facts. Curators at the museum are constrained in what exhibits they can display by dominant historical narratives. Roger D. Launius offers 10 exhibits that he believes could be compelling but would never be approved because of cultural sensitivities. The deal-breakers he cites include images of crashes and death, as well as exhibits that evoke Cold War fears.[9]

Science and technology policies must use facts in order to be effective. However, no policy can sustain itself without taking into account influences such as money, politics, and culture. The difference between effective policies and poor ones lies not in whether they are free of these influences, but in how those influences are ultimately reconciled with the facts.


[1] Gerald Holton, “Candor and Integrity in Science,” Synthese 145 (2005), 277-294, http://www.jstor.org/stable/20118593, 11-13-13;

[2] Christine MacLeod, “Reluctant Entrepreneurs: Patents and State Patronage in New Technosciences, circa 1870-1930,” Isis 103 (2012), 328-339, http://www.jstor.org/stable/10.1086/666359, 11-13-13;

[3] David Edgerton, “Time, Money, and History,” Isis 103 (2012), 316-327, http://www.jstor.org/stable/10.1086/666358, 11-13-13;

[4] Ibid.;

[5] Paul N. Edwards, “Meteorology as Infrastructural Globalism,” Osiris 21 (2006), 229-250, http://www.jstor.org/stable/10.1086/507143, 11-26-13;

[6] Riley E. Dunlap and Aaron M. McCright, “The Climate Change Denial Campaign,” Scholars Strategy Network, January 2013, http://www.scholarsstrategynetwork.org/sites/default/files/ssn_key_findings_dunlap_and_mccright_on_climate_change_denial.pdf, 11-26-13;

[7] Aaron M. McCright and Riley E. Dunlap, “The Polarization of U.S. Public Opinion on Climate Change,” Scholars Strategy Network, January 2013, http://www.scholarsstrategynetwork.org/sites/default/files/ssn_key_findings_mccright_and_dunlap_on_political_polarization_on_climate_change.pdf, 11-26-13;

[8] Myanna Lahsen, “Technocracy, Democracy, and U.S. Climate Politics: The Need for Demarcations,” Science, Technology, & Human Values 30 (2005), 137-169, http://www.jstor.org/stable/1558016, 11-26-13;

[9] Roger D. Launius, “American Memory, Culture Wars, and the Challenge of Presenting Science and Technology in a National Museum,” The Public Historian, 29, 1 (Winter 2007), 13-30, http://www.jstor.org/stable/10.1525/tph.2007.29.1.13, 11-22-13.

Tuesday, October 15, 2013

Cold War Politics and the Paradigm of Militarized Science



The MIKE EVENT, part of Operation Ivy, was the first successful test detonation of an experimental thermonuclear device. It was exploded on Oct. 31, 1952 at the Pacific atoll Enewetak.
Public domain photo courtesy of National Nuclear Security Administration / Nevada Site Office.


Thomas Kuhn’s seminal 1962 book, The Structure of Scientific Revolutions, argued that normal scientific practice is periodically interrupted by paradigm shifts: wholesale changes in thinking that transform the nature of the entire practice. Paradigms are the structures that form the cognitive environments in which science operates on a day-to-day basis.[i]

The Cold War was not in itself a scientific discovery or endeavor. However, the ideology and competition of this perpetual standoff so enveloped the American mindset, and especially that of scientists, during the latter half of the twentieth century that it became a paradigm of its own. Cold War military strategy became the primary impetus for nearly all major scientific projects, first in the physical sciences and soon after in the social sciences as well.

The building of the first atomic bomb had little to do with U.S. worries about the Soviets. But nuclear technology quickly became the dominant tool of early Cold War policy. In 1950, U.S. foreign policy shifted strongly toward the threat of military force as a means of preventing Communist hegemony. President Truman announced that the country would pursue thermonuclear weapons, and despite some high-profile scientific opposition, the Soviets’ own development of such bombs kept the U.S. in an arms race to build bigger and more destructive weaponry.[ii]

An obsession with military-related research and development in the early Cold War led to an imbalance in science funding. In 1952, for instance, the budget of the fledgling, civilian-oriented National Science Foundation was $3.5 million. The Navy alone, by contrast, spent nearly $600 million between 1946 and 1950, an average of $120 million per year.[iii] The security mindset permeated well into scientific circles. Because many scientific documents in specialties such as physics, electronics and oceanography were classified, most scientists needed security clearances, tying them inexorably to national defense interests and Cold War politics.[iv]

The securitization and politicization of scientific information also affected close U.S. allies in Western Europe. When the U.S. State Department decided to pursue policies in favor of a “United States of Europe,” one of the main impetuses was to provide a strong front against the Soviet Bloc next door and take some pressure off the U.S. in defending against Communism. To attain this goal, the U.S. threw its weight behind a plan to unite six continental nations under a single peaceful nuclear energy regime called Euratom, and withheld scientific information (and other nuclear resources) from potential member states unless they signed on. Though Euratom became a reality in 1958, it failed to create a united Europe and injured U.S. relations with such allies as France, Britain and Germany.[v]

Euratom was part of the prevailing paradigm of the Cold War: that science and technology had the power to resolve all problems, no matter how intractable. Yet Euratom did not achieve the goals of the U.S., nor was it the only scientifically infused foreign policy project that ended in a debacle.

By the end of the 1950s, the social sciences were being graced with the state’s largesse, too. Development theorist W. W. Rostow laid out his theory of economic takeoff, which suggested among other things that poor countries could be modernized through a combination of technology, foreign investment and the insertion of Western liberal values. The federal government seized on this, with Rostow’s support, as a way to win third-world countries over from Communist influence.[vi] Some of the resulting development programs met with mixed success, while others ended in utter failure.[vii]

Rostow’s theory and others like it fall into what Richard Feynman dubbed “Cargo Cult science”: they have all the trappings of science, but they don’t work because their proponents fail to consider other possible explanations.[viii] At the height of the Cold War, ideological presumptions trumped attempts to ask fundamental questions about the uses of science and technology, for both good and ill. The flawed reasoning that science could fix foreign nations was also applied domestically in what became known as the “War on Poverty,” which was likewise largely abandoned after producing tepid results.[ix]

Although the Cold War would not end for more than another decade, the militarized science paradigm unraveled precipitously in the 1970s. A low return on investment, growing social opposition, and health and environmental concerns combined to produce what Kuhn would have called a crisis period, and then a revolution: a large-scale decoupling of the security and scientific establishments.[x]

Cold War tensions would flare up once more before the Soviet Union collapsed.[xi] But never again would science and the military be so closely and unquestioningly wed.


[i] Sergio Sismondo, “Fifty years of The Structure of Scientific Revolutions, twenty-five of Science in Action,” Social Studies of Science 42 (2012), 415-41, http://sss.sagepub.com/content/42/3/415, 9-1-13;

[ii] Audra J. Wolfe, Competing with the Soviets: Science, Technology, and the State in Cold War America (Baltimore: The Johns Hopkins University Press, 2013), 20-2;

[iii] Wolfe, Competing with the Soviets, 25;

[iv] Ibid., 33-35;

[v] John Krige, “The Peaceful Atom as Political Weapon: Euratom and American Foreign Policy in the Late 1950s,” Historical Studies in the Natural Sciences, 38, 1 (Winter 2008), 5-44, http://www.jstor.org/stable/10.1525/hsns.2008.38.1.5, 9-12-13;

[vi] Wolfe, Competing with the Soviets, 60-65;

[vii] Ibid., 72-73;

[viii] Richard Feynman, “Cargo Cult Science,” http://www.lhup.edu/~dsimanek/cargocul.htm, 8-26-13;

[ix] Wolfe, Competing with the Soviets, 85-88;

[x] Ibid., 106-115;

[xi] Ibid., 121-124.

Wednesday, September 4, 2013

A dream deferred, but not denied

Crowds gather for the 1963 March on Washington for Jobs and Freedom, where Martin Luther King, Jr. made his famous "I Have a Dream" speech. Public domain image.

Dr. Martin Luther King, Jr. was a frequent visitor to Bridgeport, Conn. He spoke at the city's Klein Memorial Auditorium on three occasions in the 1960s, a time when the industrial might of the city was drawing many African Americans to work at factories there.

Yet Bridgeport's original street named after the iconic civil rights leader no longer exists. It had run through Father Panik Village, one of the most dangerous housing projects in the U.S. during the 1980s and 1990s. Drugs, violence, and prostitution were so endemic that officials decided their best recourse was to raze the entire area.

After Father Panik was demolished in 1993, Stratford Avenue, one of the city's main arteries, was graced with the honorific title of Dr. Martin Luther King Jr. Boulevard. Aside from those who live there, however, few people know it by that name. It's as if King has disappeared from the city's memory.

Nor would King be any prouder to have his name attached to the new street. Dilapidated storefronts and crumbling Victorian-style houses line it, along with the husks of those once-vibrant factories, now barely standing. This area, too, is a hotbed of crime and gang activity. People avoid going there at night.

Most of the city today appears poised on the brink of economic renaissance, but the largely black and immigrant neighborhoods of the East End - through which Stratford Avenue runs - remain mostly neglected by revitalization efforts.

Bridgeport's Martin Luther King Jr. Boulevard is emblematic of streets of the same name in other cities across the nation. For most of the people who live in the shadow of King, his dream of equality remains just that – a dream.

Were he alive today, King would be appalled at the conditions under which too many blacks still live and the significant barriers that remain.

More than a third of students in the Bridgeport public schools fail to graduate each year. Those who do rarely have the wherewithal to pursue higher education. Families are blighted by absentee parents, malnutrition, and the ever-gnawing call of street life. Lacking the money or knowledge to make better decisions for themselves, many black people struggle to forge a better life against obstacles that few whites ever have to face.

The vestiges of discrimination still exist as well. Police do not usually advocate the targeting of a particular group. But police go where the crime is, and develop through those experiences presumptions about suspicious behaviors and lifestyles. An officer may not have any conscious ill-will toward a given racial or ethnic group, but may be more keenly aware of the activities of one group over another. This is one reason why, though drug use may be equally prevalent among whites, blacks are more liable to be stopped, searched and prosecuted.

Still, 50 years after King made his famous speech on the steps of the Lincoln Memorial, there has been significant progress. Voting is no longer as daunting; indeed, at least in Connecticut, the legislature recently moved to expand voting opportunities. Black residents can move freely without restrictions. Some have held positions in municipal or state government. Others have found success, leaving the projects for more affluent outlying districts.

King would be proud of the many millions of African Americans who have climbed the ranks of society. He'd recognize that the barrier of overt oppression has largely been supplanted by the inertia of history.

And just as he did when he met with gang leaders in Chicago to talk them out of their violent habits, King would encourage today's blacks to focus their energies on continuing the hard task of improving their communities, including those on the streets that bear his name.

Saturday, July 7, 2012

World Artist Network Celebrates Jackson Pollock's 100th Birthday with Community Painting


One of the contributors to the World Artist Network's Community Pollock-style Painting, created Saturday by visitors to the Bridgeport Arts Fest.
Photograph by Brandon T. Bisceglia.

Passersby splattered and dripped reds, yellows, blues and blacks onto a canvas laid out on State Street beside McLevy Green in Bridgeport Saturday afternoon. With each new flick of the wrist, the Jackson Pollock-style painting took on greater dimension.

The collaborators on this masterpiece, however, were not professional artists. They were visitors to the World Artist Network's booth at the Bridgeport Arts Fest.

The community painting project was the brainchild of WAN Director Valeria Garrido, who said she wanted a unique way to celebrate the centennial of the famous painter's birth in 1912.

Pollock was a pioneer of abstract expressionism in the middle of the twentieth century, and is best known for the seemingly paint-splashed pieces created during his “drip period.”

“People forget that Pollock was revolutionary in the 40's and 50's, including by making New York City, rather than Paris, the center of the art world,” said Garrido.

After adding their marks, contributors to the community painting were encouraged to sign a list of participants. Garrido said the list would be used to give credit to all the artists involved in the painting's creation.

WAN plans to show the painting, along with the contributors' credits, during the Bridgeport Art Trail in November. It may also be displayed at other upcoming events.

WAN participated in the Bridgeport Arts Fest for the second year in a row, providing arts activities for children and adults as part of its “Artist for a Day” series of events. Also on display were works of art from around the world, offered for sale by artists who are members of the nonprofit.

WAN's canopies along State Street.
Photograph by Brandon T. Bisceglia.

Friday, July 6, 2012

Tips on Arguing: Inductive Reasoning



In ancient Greece, philosophers and thinkers invented a process for arriving at truths about the world that we know today as deduction. This process relied on taking general statements about the world and applying various logical rules to them in order to answer particular questions.

Deduction was enormously useful, especially in mathematics. It is deductive reasoning that led to the Pythagorean theorem, a general statement that works with every right triangle you will ever encounter.
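
To make that concrete, here is the theorem written out as the general statement deduction delivers (a standard formulation, not specific to any one source): for a right triangle with legs of lengths $a$ and $b$ and hypotenuse of length $c$,

$$a^2 + b^2 = c^2.$$

The rule is proved once, from geometric axioms, and then applies to every particular right triangle without any further measurement.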

Deduction had weaknesses, however, because it could give you answers that did not fit your observations. That was what drove the seventeenth-century English philosopher Francis Bacon to popularize a new way of organizing thought: induction.

Inductive reasoning takes deduction and flips it around. Instead of inventing axioms and applying them to specific observations, induction works by collecting numerous observations and then deriving general principles from them.

By giving precedence to observations over theories, Bacon's empirical approach provided a springboard for a great leap forward in the study of the natural sciences. If you collected 100 observations and 99 of them could not be explained with your current theory, the theory would have to be changed.
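
As a rough sketch of that test, consider this minimal Python example. Both the observations and the candidate theory are invented here purely for illustration:

```python
# Toy version of Bacon's empirical test: a general principle survives
# only if it explains the observations that have been amassed.

# Hypothetical observations: pairs of (number, whether it is even).
observations = [(n, n % 2 == 0) for n in range(100)]

def candidate_theory(n: int) -> bool:
    """Invented general principle: 'every number is even.'"""
    return True

# Collect the observations the theory fails to explain.
unexplained = [n for n, is_even in observations
               if candidate_theory(n) != is_even]

if unexplained:
    print(f"{len(unexplained)} of {len(observations)} observations "
          f"contradict the theory, so the theory must be revised.")
else:
    print("The theory explains every observation collected so far.")
```

Running this reports that 50 of the 100 observations contradict the theory. The point mirrors Bacon's: the theory carries no special authority of its own; it stands or falls with the observations.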

The modern scientific method depends on inductive reasoning. However, Bacon himself warned against equating the two: induction is only half of science. Eventually, enough observations accumulate to support a strong theory.

At that point, the theory becomes the standard for future observations, thus allowing us to once again use deduction. Our observation of a heavier-than-air jet does not lead us to conclude that gravity is being violated – we know that the plane is in fact operating according to the rules of gravity. We also assume those rules are consistent, or else we could never be sure if the next jet would get off the ground.

Induction is incredibly versatile, and it encourages a healthy skepticism about statements that can't be verified by facts. If used properly, this form of reasoning is one of the best ways to align your ideas and beliefs with empirical reality.

Thursday, July 5, 2012

Important Books: Understanding Media


Marshall McLuhan holding a mirror in 1967.
Photograph by John Reeves through the Library and Archives Canada. Reference: 1980-194, PA-165118, MIKAN 4170003. Some rights reserved.
Marshall McLuhan changed the trajectory of media studies.

During the mid-twentieth century, McLuhan was a professor of English literature at the University of Toronto. While there, he wrote a number of books that advanced several key concepts critical to modern theories of media, though some of his ideas would not become widely accepted until the spread and adoption of Internet technologies.

In 1964, McLuhan's seminal book, Understanding Media: The Extensions of Man, launched the professor into the international spotlight. A younger generation thirsty for radical new ideas embraced his work as prophetic. The established media and academic cultures, however, were more critical, and both frequently misinterpreted his writing.

The main thesis of Understanding Media was that media can be assessed independent of content – that the form a communications medium takes is what determines how it will be used. The first chapter, titled “The Medium is the Message,” opened with a summary of this process: “This is merely to say that the personal and social consequences of any medium – that is, of any extension of ourselves – result from the new scale that is introduced into our affairs by each extension of ourselves, or by any new technology.”

Concurrent with this theme was another, more prophetic notion: that humanity was in transition between two major periods, the print era and the electric age. McLuhan saw the ability of electricity to deliver instantaneous information as the beginning of a paradigm shift that would cause people to slough off the fragmentation and mechanization engendered by the printed word while reconnecting with their tribal roots in a wider “global village” (a term McLuhan coined).

The second half of McLuhan's book tackled individual forms of media and their different impacts on life and culture. He pointed out, for instance, that literary-minded people who are accustomed to linear arguments and cohesive storytelling misunderstand the press, which, he noted, has tended “not to the book form, but to the mosaic or participational form...not a detached 'point of view,' but participation in process.”

In addition to books, television, and other formats that are traditionally thought of as media, McLuhan included chapters on cars, clocks, and electric light.

Ironically, Understanding Media can be difficult to understand, with passages that are at times dense and esoteric. Rather than approach the field in a systematic fashion, McLuhan often embedded his ideas in analogies and anecdotes. As Lewis H. Lapham describes in his introduction to the 1994 MIT Press edition, “quite a few of the notions to which he [McLuhan] off-handedly refers in the early pages, as if everybody already knew what he meant, he doesn't bother to explain until the later pages, often by way of an afterthought or an aside.”

Predictions about the coming electric age were among the most controversial aspects of Understanding Media. McLuhan, who identified himself as a product of the literary world, nevertheless insisted that the power of print would soon be overtaken by fundamentally different electric technologies.

Some predictions came true, such as his suggestion that televisions would become common in the classroom and shift the emphasis of education to reflect new expectations about becoming involved with processes, rather than simply learning about them. Other prognostications were slightly off the mark, such as the assertion that the car would be replaced by “electrical successors” within ten years.

McLuhan's predictions for the electric age were unpalatable to some for at least two major reasons. First of all, the dominant media and academic institutions of the time had built their legacies through print media, which McLuhan seemed to claim were doomed.

The second reason for skepticism is more understandable: many of the changes he foresaw were as yet barely imaginable. The first desktop computer was a decade away, and commercial Internet Service Providers wouldn't appear for more than twenty years. Cell phones, video games and Facebook were the stuff of science fiction.

Despite McLuhan's initial fame, much of his work lost traction during the 1970s. It would not be until a generation later that his books got a fresh look – and, amid a burgeoning online culture, made a lot more sense.

In fact, McLuhan's once-radical insight into how the forms of media shape our lives is a widely accepted, much-discussed topic today, and is obvious to anyone who has witnessed the changes wrought by Microsoft, Apple or Google. McLuhan, it turns out, truly was ahead of his time.