Saturday, November 16, 2013

The Blindness of Science


Left: Sir John Kendrew assembles a molecular model of myoglobin. Right: A computer-rendered 3-D model of myoglobin. Kendrew photo courtesy of MRC Laboratory of Molecular Biology. Used under a CC BY 2.5 license. Myoglobin image by Aza Toth. Public domain image.

What we do not see determines what we know at least as much as what we do see. Science is no exception: it is as subject to the vagaries of social, political, and other contingent forces as any other human endeavor.

One of the classic examples of how expectations shape what we perceive is the “invisible gorilla” experiment, in which subjects are asked to count basketball passes among a group of players. During the passes, a person in a gorilla suit walks onscreen, looks at the camera, and pounds her chest before exiting the scene.[i] Only about half the people who watch the video notice the gorilla. The demonstration became the best-known illustration of “inattentional blindness” – that is, people see what they attend to and expect to see, often at the cost of missing more compelling information right in front of them.[ii]

The study of science’s history and institutions is replete with examples of a given viewpoint resulting in a particular set of practices or interpretations. Both material and social factors play a role in shaping these viewpoints.

Materiality has a profound effect on science. Christoph Meinel points out that three-dimensional ball-and-stick models, which were ubiquitous in molecular research before the advent of sophisticated computer programs, were a translation of the chemist's vision as a “builder of a new world out of man-made materials.” Eventually, the models came to carry a greater sense of the “reality” of molecular structure for these researchers than the chemicals themselves.[iii]

The predominance of physical molecular models had a major impact on the graphics programs that replaced them. X-ray crystallographers demanded the ability to manipulate the structures they were working with in real time, and computer developers took pains to build this sense of physical manipulation into their programs. Now, as then, crystallographers incorporate a strong sense of embodied ownership into the work they do on molecular structures. No one, they feel, can know their molecules the way they do. The tacit knowledge they gain from their projects is something to which other scientists, who eventually come to work with these same molecules, are blind.[iv]

Blindness finds its way into the scientific process through social structures in many forms. Any student working in a lab toward a Ph.D. soon discovers that, throughout her undergraduate years, she has been presented with experiments that reinforce the notion that science is straightforward work with a high success rate. These impressions are dashed when she begins doing independent work and finds out that the majority of day-to-day science fails.[v]

Science historian Robert E. Kohler argues that the cultural spaces of science laboratories themselves actively shape what goes on inside them, and can be broken down into distinct early modern, modern, and postmodern styles that broadly reflect the elite social sensibilities of the times in which they were built and used.[vi]

Language and the social milieu very much inform the impressions people have about seemingly scientific phenomena, and definitions have practical implications. The term “child abuse,” for instance, was not invented until the early 1960s, eventually winning out over “battered child syndrome,” a term that did not cover actions commonly recognized as abuse today, such as sexual touching or neglect. The meaning of “child abuse” has therefore been able to expand to encompass many more types of activity than earlier terms, and since its inception it has significantly shifted the moral, judicial, and medical reactions used to deal with it.[vii]

Material and social characteristics often shape the practice of science simultaneously. As in the transition from moveable molecular models to manipulable computer graphics programs, social judgments about how a procedure “should feel” can introduce path dependency into new technologies.

Early music synthesizer technology demonstrated this phenomenon particularly well. Two rival inventors, Robert Moog and Don Buchla, created machines to reproduce musical sound. Buchla did not standardize his synthesizers, seeing them as a means of exploring the avant-garde. Moog made his inventions easy to use, even building them to be played from the familiar piano keyboard. Moog's more recognizable device succeeded, whereas Buchla's faded.[viii] Moog's success had nothing to do with technical superiority; he simply paid more attention to what other people wanted and allowed those social forces to shape his instrument.

Even historical judgments about the practice of science change depending on which aspects one pays attention to. The “distortionist” camp of science historians, for instance, tends to portray the militarized science of the Cold War period as fundamentally perverting the scientific process. Yet this was not the case for seismology, as science policy expert Kai-Henrik Barth points out. While military programs invested heavily in the field, the research agenda for seismology remained largely unchanged before, during and after this influx. As Barth notes, the distortionist view assumes a normative position based on unknowable speculation about how science would have progressed without military patronage.[ix]

With the myriad opportunities for science to be blinded, should we therefore lament that we cannot be absolutely sure of anything we know? No. The foundation of science is provisional truth; its success rests on the constant reevaluation of seemingly resolved questions. This is where new vistas open, where discoveries challenge former dogmas. In those moments, the gorilla suddenly becomes visible.


[i] Christopher Chabris and Daniel Simons, The Invisible Gorilla: How Our Intuitions Deceive Us (New York: Broadway Paperbacks, 2009); http://www.theinvisiblegorilla.com/, accessed 8-23-13; http://www.theinvisiblegorilla.com/videos.html, accessed 10-5-13.

[ii] Manohla Dargis, “What You See Is What You Get,” The New York Times (July 10, 2011), AR13, http://www.nytimes.com/2011/07/10/movies/why-difficult-movies-are-more-um-difficult.html?pagewanted=all, accessed 10-5-13.

[iii] Anna Maerker, “Review: Why Do They Look Like That? Three-dimensional Models in Science,” Social Studies of Science 37 (2007), 961-965, http://sss.sagepub.com/content/37/6/961.full.pdf+html, accessed 10-23-13.

[iv] Natasha Myers, “Molecular Embodiments and the Body-work of Modeling in Protein Crystallography,” Social Studies of Science 38 (2008), 163-199, http://www.jstor.org/stable/25474573, accessed 10-23-13.

[v] Sara Delamont and Paul Atkinson, “Doctoring Uncertainty: Mastering Craft Knowledge,” Social Studies of Science 31 (2001), 87-107, http://www.jstor.org/stable/285819, accessed 10-24-13.

[vi] Robert E. Kohler, “Lab History: Reflections,” Isis 99 (2008), 761-768, http://www.jstor.org/stable/10.1086/595769, accessed 10-5-13.

[vii] Ian Hacking, “The Making and Molding of Child Abuse,” Critical Inquiry 17 (1991), 253-288, http://www.jstor.org/stable/pdfplus/1343837.pdf, accessed 10-18-13.

[viii] Trevor Pinch, “Technology and Institutions: Living in a Material World,” Theory and Society 37 (2008), 461-483, http://www.jstor.org/stable/40345597, accessed 10-18-13.

[ix] Kai-Henrik Barth, “The Politics of Seismology: Nuclear Testing, Arms Control, and the Transformation of a Discipline,” Social Studies of Science 33, no. 5 (2003), 743-781, http://www.jstor.org/stable/3183067, accessed 10-11-13.
