Wednesday, November 30, 2011

Undermining Journalism

How the Media's Attempt to Adapt Could Kill It Instead

The heart of the New Haven Register's newsroom remains a place of active collaboration, for now. Photograph by Brandon T. Bisceglia.

CNN laid off approximately 50 editors, librarians and photojournalists in a surprise announcement on Nov. 11 as part of a three-year restructuring effort seeking to bring the company in line with changes in “technology and workflow.”

“Technology investments in our newsrooms now allow more desk-top editing and publishing for broadcast and online,” Jack Womack, the company's SVP of domestic news operations, wrote in a note to staff explaining the layoffs. “This evolution allows more people in more places to edit and publish than ever before.”

Womack added that CNN management had also weighed the impact of user-generated content, social media and CNN iReporters (who create content based on prompts that the news agency provides) as part of its assessment.

CNN's layoffs are hardly shocking. They are only the latest in a decade-long series of changes to the American media landscape that have squeezed professional journalists into smaller and smaller corners. The closing of foreign bureaus, the shift to 24-hour online news cycles, and the increased reliance on “citizen journalists”: these are all symptoms of a systemic problem with the structure of news gathering and reporting.

The near-consensus is that under the current business model, the expense of producing news is too high to be sustainable. The solution for many agencies has been to trim - or sometimes gut - the newsroom while filling the gap with content created by nonprofessionals.

In that context, CNN's move to outsource part of its news function to unpaid users makes a certain sense. Video and audio recording technologies are now widespread and familiar to millions of people. Although the production quality of a video shot on an iPhone may not be polished enough for an advertiser's needs, it is perfectly suited to the two-minute throwaway story that a cable news channel can use to keep itself fresh on a slow day. Moreover, it is cheaper and faster than sending a reporter into the field to cover an event that may be over by the time she or he arrives.

The layoffs prompted a satirical reaction from “The Colbert Report” host Stephen Colbert, who pointed out that CNN iReporters do not get paid.

“They get something even better: badges. Which I assume are redeemable for food and rent,” he said. He went on to promote his own user-generated video feature, which included footage of a colonoscopy, a goat, and a man waiting for a bus.

Colbert's comedic commentary revealed a darker side to this kind of corner-cutting. Volunteers, as eager as they may be, have different incentives guiding them, some of which may be suspect. Even those who mean well often lack the legal, ethical and technical knowledge of paid professionals. And with fewer trained reporters and editors responsible for curating content that they have not generated themselves, the depth, accuracy, and credibility of news can more easily be undermined.

The danger in cutting corners is that it can undercut the very relevance of journalism altogether.

The Shrinking Newsroom

Al Santangelo, news editor at the New Haven Register, has witnessed first-hand the slow collapse of the traditional newsroom.

Half of the offices of the Register are empty, like miniature ghost towns built out of cubicles. The business department is gone. The former design office is an empty room. The award-winning sports department used to stretch across one end of the newsroom, but now consists of four desks and a television.

Santangelo says that the newspaper makes a profit of millions of dollars each year. Yet it could still face further consolidation. There is talk of ditching the newsroom. Reporters would file stories remotely. Instead of printing the paper in-house, production of the actual paper would be outsourced to another location.

His workplace is being stripped in part because the paper's parent, the Journal Register Company, filed for bankruptcy in 2009. Two years later, the Journal Register Company is trying to rebuild its profitability under what CEO John Paton, who took over after the bankruptcy, calls a “digital first, print last” model.

When the Register was under family ownership, which lasted until the 1980s, the money the paper makes today would have been considered a healthy profit. As part of a conglomerate driven by shareholder interests, however, the question shifts from absolute profitability to whether profits are rising or falling. It also matters less how an individual publication is doing; profitability for the company is determined by the sum of its parts. In the Journal Register Company's case, that's more than 350 multi-platform products in 992 communities.

“Not all of those publications make money,” says Santangelo. “Unfortunately, the ones that do end up subsidizing the others.”

The Journal Register Company released a chart Nov. 28 with its vision for the reorganization of the New Haven Register. It's not all bad news: a dedicated investigative reporting team will be created, and several new beats are being added.

Much of the reorganization, though, is reminiscent of the shift that CNN and other news agencies have taken. It calls for aggregation of statewide content, linking out to other content providers, audience-contributed content, and partnerships with local outlets.

It is not likely that any of these steps will lead to much hiring. The investigative and beat reporting teams will be made up of long-standing employees who used to serve other functions in the newsroom, according to the Journal Register's press release.

For instance, the Register's former Business Editor, Cara Baruzzi, will be shifted to head up a new “breaking news team.” On top of covering the area's news, this team will also have responsibility for delivering “a Connecticut-wide curated breaking news report by linking out to other information sources – including The New Haven Independent, members of The Register’s Community Media Lab and sources traditionally viewed as competitors.”

The idea behind sharing content from competitors, says Santangelo, is to create a one-stop shop for the online reader, who would no longer have to jump around from source to source to find out what is going on in the state.

Santangelo is skeptical about the wisdom of that approach, however. He freely admits that such a sharing scheme allows other organizations to have the exact same breadth of content on their own sites. When asked what would prevent people from going to any of those other sources, he shrugs.

“Nothing. There's no loyalty on the Internet,” he says.

Just around the corner from the bustling newsroom at the New Haven Register, emptied cubicles sit in darkness. Photograph by Brandon T. Bisceglia.

Duplication and Verification

The practice of sharing news stories began long before the Internet. The Associated Press wire service is, in essence, a mechanism to allow papers to share news. It has become a staple of the newspaper industry because individual papers rarely have the resources to send reporters to faraway sites to cover every major breaking story.

The AP saves news agencies costs that might otherwise be prohibitive, and this is the key to its long-term success. However, it and other sharing schemes open news agencies to a potential worry: they cannot independently verify the content that they receive.

This trade-off was not much of a problem for newspapers with a local focus through most of the twentieth century. Many communities had multiple papers competing for the same audience, and they did not share with one another. This competition led to incentives for each paper to protect its reputation by being both fair and accurate. The wire services were mainly reserved for stories outside of the paper's coverage area.

The culture began to shift toward the end of the century. Cable news channels running 24 hours a day had more airtime to fill than they could afford to produce content for. Media mergers turned former competitors into colleagues and changed the profit motive.

Then the Internet put the pace of demand for continuous updates into high gear. For large media conglomerates, it simply made no business sense to send three or four reporters to cover the same story when one could run it across multiple platforms.

What makes business sense, though, is based on a monthly or yearly calculation often made by people who live out of state (or out of the country). The reputations of the old journalistic institutions were built on presumptions that they were integral to the communities in which they existed and would be around for decades to come.

In that old world, the unique observational capacities, background knowledge, and community ties of each reporter were valuable commodities. If a reporter from newspaper A wrote about the same event as a reporter from newspaper B, each paper could tout the different creative angle that its reporter would bring to bear on the story. The reader could compare different versions and learn things from one story that the other might miss.

More importantly, the reader could compare the facts in each story. If there was a contradiction, it would hurt the reputation of the paper that failed to properly vet its product. This created an incentive for both parties to be honest and careful about what they published.

In the brave new world of digital media, these incentives have largely disappeared. The encouragement of sharing to prevent story duplication leads news organizations to cite one another as a stand-in for independent verification, which can exacerbate the spread of misinformation.

This game of “telephone” blew up in the face of media professionals in June, when an unverified tip from police about a mass grave in Liberty County, Texas, produced a rash of reporting by major media outlets the world over. The mass grave did not exist.

An investigation by WNYC’s On the Media found that the original story came from KPRC, Houston's Channel 2. Liberty County police had called the station about a tip they received from a psychic and said they were planning to check it out. Someone in the newsroom posted the following message to Twitter: “Dozens of bodies have been found in Liberty County. Join us for KPRC at 5 p.m. for the latest information.”

The Twitter post did not mention a source. Nor was it vetted; the news team had not yet visited the supposed grave site to verify the information.

From Twitter, Reuters picked up the story, citing KPRC as the source. The New York Times cited Reuters as its source. London’s The Guardian ran the Reuters version as well. SkyNews, the BBC and others also passed it around.

One newspaper that did not rely on the media’s rumor mill was the Houston Chronicle, which never said that the story was anything more than an unconfirmed report.

“I don't know how anyone in their right mind or with an iota of professionalism in their veins could have reported such a thing, absent any confirmation from anybody,” said Chronicle reporter Mike Tolson in an interview with OTM host Bob Garfield.

If the worldwide reporting debacle is any indication, Tolson and his colleagues represent a dying breed. The pressures are strong to get a story out now, without first placing a call to a local source for confirmation or sending someone down to check things out.

Many long-time reporters are all too aware of how the change of pace hampers their ability to vet stories. Hartford Courant reporter Christine Dempsey, who has been in the business for 25 years, said in a recent panel discussion with journalism students at the University of New Haven that she often feels uncomfortable with the quality of her fact-checking these days.

She said she had not made any major blunders she knew of. But, she added, “I’ve felt like I was walking a tightrope sometimes.”

Message Manipulation

Dempsey has a legitimate cause for concern. Time and resources are becoming increasingly scarce for smaller teams of editors and reporters, even as the amount of information they have to contend with is growing at an exponential rate.

At its heart, journalism is about selecting the most relevant information. The flipside is that, by necessity, some information is discarded or ignored.

Journalists have developed a number of ethical standards and rules of thumb to make the selection process easier. The system is not perfect, of course. The “equal time” rule may give people on different sides of a conflict a chance to have their views aired, but it can also create false balance. When mainstream and fringe views receive the same space, the audience may come away with the impression that both have similar factual weight or popular support.

These heuristic issues are troublesome, and journalists spend a lot of time debating how best to resolve them. All of the possible solutions involve spending more time on stories.

Meanwhile, the newsroom is moving in the opposite direction. It is helped along by individuals and groups with their own agendas, who submit material designed to fit the mold of news production.

Public relations offices are notorious for writing stories “for” reporters, even though it violates the ethical standards of traditional journalism to reprint a press release verbatim. But overtaxed journalists can easily be lulled into believing that getting a slanted story out is better than not producing anything.

The UK charity Media Standards Trust launched a website called Churnalism.com in 2011 to combat the practice of reprinting press releases. Its “churn engine” lets readers paste in a story and find out how much of its text is lifted from press releases.
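
The site's exact matching method isn't described here, but the underlying idea (checking how much of an article's text appears verbatim in a press release) can be sketched in a few lines. The function names, the five-word shingle size, and the sample strings below are illustrative assumptions, not Churnalism's actual code:

```python
# A minimal sketch of a "churn" check: what fraction of an article's
# five-word phrases appear verbatim in a press release? This illustrates
# the general technique only; it is not Churnalism.com's engine.
import re

def shingles(text, n=5):
    """Return the set of n-word sequences (shingles) in a text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def churn_score(article, press_release, n=5):
    """Fraction of the article's shingles copied from the release."""
    art = shingles(article, n)
    return len(art & shingles(press_release, n)) / len(art) if art else 0.0

# Hypothetical example: a story that reprints most of a release.
release = "Acme today announced a revolutionary garter fitted with a microchip"
story = "Acme today announced a revolutionary garter fitted with a microchip, reporters were told"
print(f"{churn_score(story, release):.0%} of the story matches the release")
```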

Independent filmmaker Chris Atkins developed some fake news releases of his own and sent them out to the press after speaking with Martin Moore, director of the trust. One story described a new “chastity garter” containing a microchip that would send text messages to a woman's partner if she was being unfaithful. The story became “most read” on the website of the Daily Mail and made headlines across the US and the UK before the hoax was revealed.

Slanted reporting need not come from outside sources. In the US, FOX News and MSNBC are well-known as mouthpieces for the political right and left, respectively. Though their audiences are smaller than the controversies surrounding them would suggest, a growing share of the population turns to them for news that fits its views. The Internet is also a great boon for the echo chamber, in which people can seek out information that confirms their preconceived notions without having their biases challenged.

Media fragmentation has further eroded adherence to facts and fairness. The widespread adoption of these ideologically driven approaches to reporting is relatively new, and determining their influence is a daunting task. Older people still overwhelmingly consume middle-of-the-road media from network television and newspapers. They may not agree with everything printed in their local papers, but they are used to formats that stress accuracy over assertion.

Veteran journalists like Dempsey and Santangelo are also loath to violate the news gathering values they were taught. For the moment, they act as a bulwark against a tide of editorializing.

New Normal?

What about younger people, many of whom are growing up in a world where selective exposure is the norm? Can they distinguish between objectivity and spin, and do they care?

The relative prosperity of left- and right-leaning blogs and online news sites is one discouraging indication of a trend toward greater polarization. The Huffington Post is unabashedly liberal, while Andrew Breitbart and James O'Keefe have made their marks by promoting a conservative agenda.

Though politically partisan media has always lurked on the edges of society, its newfound prominence has been accompanied by a remarkable willingness to dispense with standards of objectivity for the sake of rocking the proverbial boat. In 2009, O'Keefe sparked a national debate over what it means to be a journalist when he released a video purporting to show him and college student Hannah Giles getting advice from workers at the Association of Community Organizations for Reform Now (ACORN) on ways to hide sex trafficking and avoid taxes.

The corruption that O'Keefe and Giles uncovered may have been real. However, their methods of “news gathering” were disingenuous, unethical, and broke laws in several states.

Attorneys general in California and Massachusetts, the District Attorney's office in Brooklyn, and the US Government Accountability Office all conducted investigations into ACORN's actions. In the process, they reviewed the unedited versions of O'Keefe's and Giles's videos. In every case, they found the videos to be heavily doctored, and they absolved ACORN of any wrongdoing.

The truth did not matter to O'Keefe or Giles. They were out to attack ACORN for what they perceived to be a left-wing agenda. They succeeded in devastating the organization, causing it to file for Chapter 7 liquidation in 2010. Most of its offices were closed.

O'Keefe was the mastermind behind the project, and he convinced Giles to pursue it with him to further her own career. When the story became a national sensation, she appeared on FOX News's “The O'Reilly Factor,” where host Bill O'Reilly called her an “undercover reporter.” He characterized ACORN's lawsuit against Giles as a “revenge play,” without ever offering the agency's rationale for its actions.

During the interview, a clearly excited Giles said she had always “wanted to be an investigative journalist.” Now, with the nationally recognized figure of O'Reilly validating her actions, she was surely convinced she had done what every good reporter should in the pursuit of a story.

And perhaps, with the new norm allowing newsmakers to violate every journalistic ethic, she will find a place to thrive. Such a norm would benefit the business owners, who know that sex and scandal pad the profit margins, regardless of how the content is created. It would benefit public relations firms, which can more easily infiltrate a media culture that doesn't concern itself with telling the whole story. And it would benefit political ideologues, who prefer propaganda to balance.

The only question, then, will be: who will serve the public once the old guard dies off?

Sunday, November 27, 2011

Greenhouse Gases Reach All-Time High

This NASA image shows the nearly ice-free McClure Strait in the northern route of the Northwest Passage in August 2010. The famed passage was almost completely clear, with the exception of a band of ice in the strait (far left).
Image: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team. Some rights reserved.

Atmospheric concentrations of carbon dioxide reached an all-time high of 389 parts per million in 2010 and rose at a faster pace than in previous years, according to a report issued Nov. 21 by the World Meteorological Organization, the U.N.’s weather agency.

The WMO's Greenhouse Gas Bulletin says that global CO2 levels are now 39 percent higher than they were at the start of the industrial revolution in 1750, when levels stood at approximately 280 ppm. Concentrations had remained relatively stable for the previous 10,000 years, according to climate researchers.

Carbon dioxide levels rose 2.3 ppm between 2009 and 2010. That was faster than the average rate during the previous decade of about 2.0 ppm per year, and a significant acceleration compared to the 1990s, when concentrations rose about 1.5 ppm per year.

The annual WMO report assessed the burdens and growth rates of several other greenhouse gases released by human activity, including methane and nitrous oxide. Methane, considered the second-most potent contributor to global warming, has increased 158 percent since 1750, from 700 parts per billion to 1,808 ppb in 2010. Nitrous oxide increased 20 percent over the same period, from 270 ppb to 323.2 ppb.
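
The bulletin's percentages follow directly from the concentration figures quoted above. A quick back-of-the-envelope check, purely illustrative:

```python
# Back-of-the-envelope check of the WMO percentages quoted above.
def pct_increase(preindustrial, current):
    """Percentage increase from the 1750 level to the 2010 level."""
    return (current - preindustrial) / preindustrial * 100

print(f"CO2: {pct_increase(280, 389):.0f}%")    # prints ~39%
print(f"CH4: {pct_increase(700, 1808):.0f}%")   # prints ~158%
print(f"N2O: {pct_increase(270, 323.2):.0f}%")  # prints ~20%
```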

“The three primary greenhouse gases are not only closely linked to anthropogenic activities, but they also have strong interactions with the biosphere and the oceans,” the report said.

WMO Deputy Secretary-General Jeremiah Lengoasa said in an interview with the Associated Press that although human emissions of greenhouse gases are directly related to increasing temperatures, there is a time lag between the two.

“With this picture in mind, even if emissions were stopped overnight globally, the atmospheric concentrations would continue for decades because of the long lifetime of these greenhouse gases in the atmosphere," he said.

And at least a portion of that carbon will not be locked back into the earth for hundreds of thousands of years.
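
Lengoasa's lag can be illustrated with a toy model. The real carbon cycle involves several reservoirs with very different timescales, so the single 50-year lifetime below is an assumed, illustrative value chosen only to show the shape of the decay, not a WMO figure:

```python
# Toy single-lifetime decay: if emissions stopped today, how slowly
# would CO2 relax back toward its pre-industrial level? TAU = 50 years
# is an assumption for illustration; the real carbon cycle has multiple
# overlapping timescales, some spanning millennia.
import math

C_NOW, C_PRE, TAU = 389.0, 280.0, 50.0  # ppm today, ppm in 1750, years

for t in (0, 10, 25, 50, 100):
    c = C_PRE + (C_NOW - C_PRE) * math.exp(-t / TAU)
    print(f"year {t:3d}: {c:5.1f} ppm")  # still ~295 ppm after a century
```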

The WMO report comes on the heels of a summary report on risk assessment issued Nov. 18 by the Intergovernmental Panel on Climate Change, which warned that, under the group's “high emissions scenario,” the frequency of hot days will increase by a factor of 10 in most regions of the world.

“Likewise, heavy precipitation will occur more often, and the wind speed of tropical cyclones will increase while their number will likely remain constant or decrease,” said Thomas Stocker, co-chair of Working Group I, in the summary.

Another study, released in the Nov. 24 issue of the journal Nature, provided the first evidence that the duration and magnitude of the current decline in Arctic sea ice appear to be unprecedented in the past 1,450 years. Previously, the extent of ice loss was known only for the last four to five decades, and questions remained about how much of the loss was due to natural variability. The researchers used land-based core samples to develop climate proxies, allowing them to estimate the extent of the ice over a much longer period. The results suggest that Arctic ice loss is indeed being driven by manmade warming.

It remains to be seen whether the slew of new studies will make a difference in the stalled international negotiations to develop a comprehensive strategy to replace the Kyoto Protocol, which expires next year. Governments meet beginning Tuesday in Durban, South Africa, for the seventeenth Conference of the Parties to the United Nations Framework Convention on Climate Change.

Countries met in Copenhagen in 2009 and in Cancún, Mexico, in 2010 to hammer out a new agreement, but made little progress toward a comprehensive treaty anything like Kyoto. The U.S., one of the highest per-capita emitters in the world, was the only one of the 192 parties never to ratify the treaty.

Sunday, November 20, 2011

The Birds


A humongous flock of birds made a racket in the trees at the edge of Ash Creek in Fairfield on a cloudy afternoon this week.

When my wife and I heard the clamor from our kitchen, we went to the backyard to see what was happening. There we witnessed hundreds of birds perched on practically every branch of three or four trees, chattering with one another.

The chirping lasted for almost half an hour. Then, as suddenly as they had arrived, the flock took flight in near-unison, leaving only silence in their wake.

You can listen to a portion of the chirping and the moment of flight by clicking on the player below.

Photographs by Valeria Garrido-Bisceglia.

Wednesday, November 16, 2011

Important Books: The Pilgrim’s Progress


Engraving from The Pilgrim's Progress, published in London, 1778. 
Pilgrim enters the wicket gate, opened by Good-Will. Public domain image.

John Bunyan’s seventeenth-century book is the allegorical tale of Christian, a humble pilgrim, on his journey from the town of Destruction to the Coelestial City, where God resides. In its day, this classic sold more copies than any other book except the Bible. It was particularly popular among the settlers of the colonies in New England, who commonly referred to themselves as "pilgrims."

Bunyan's book was meant to be a defense of his religious beliefs, and was written for the most part while he was in prison for refusing to conform to the mandated Anglican practices of the time. His Calvinist/Lutheran brand of religiosity assumed that it was the privilege of an elect group to enter into God's court - a group primarily composed of the poor and oppressed.

Many of his characters were meant to represent other sects of Christianity. He took frequent jabs at the Quakers and the Catholics. Over and over, The Pilgrim's Progress refutes the values of the elite culture of Bunyan's time, which was swiftly moving towards the naturalistic/materialist worldview that put England in a prime position for the Industrial Revolution.

The first book, which follows Christian exclusively, was so popular that imitations and fake sequels sprang up all over. There were some criticisms, though, concerning Christian's leaving his family behind (there were many women in Bunyan's congregation), as well as the esoteric nature of some of the symbolism used. In response to all these pressures, Bunyan wrote a second part, wherein Christiana and Christian's four children traverse the Way. This volume is typically included along with the first book in modern printings.

The second part attempts to explain the meanings behind Christian's travails while demonstrating the roles that women, children and others could play according to Bunyan’s theology. It suffers from certain faults, particularly in its allegorical style, which becomes strained and even nonsensical at times. Nevertheless, the two tracts are integral to one another. The first is captivating, and the second is necessary.

Monday, November 14, 2011

The Proper Role for Skepticism on Climate Change


Multiple groups of scientists have tracked the rise of temperatures on Earth over time. Working independently, they end up with results that are largely in agreement with one another. Prominent physicist and skeptic Richard Muller has now added his own analysis to the mix and found that it conforms to the warming trend seen here. Image courtesy of NASA Earth Observatory/Robert Simmon.

Another high-profile skeptic of global warming has changed his mind.

Physicist Richard Muller announced at the end of October that his team at the University of California at Berkeley had completed an analysis of climate data reaching back over 200 years. Their conclusion? The Earth has warmed about 1 degree Celsius since 1950. Their findings, which are available online in a draft paper, confirm what the National Academy of Sciences and other scientific groups have been saying for years.
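
The core of any such analysis is unglamorous: combine many station records into annual average anomalies, then estimate the trend. Here is a minimal sketch using synthetic stand-in data; the Berkeley Earth dataset and methods are far more elaborate than this:

```python
# Minimal sketch of trend estimation on annual temperature anomalies.
# The anomalies are synthetic placeholders, not Berkeley Earth data;
# the assumed 0.016 C/yr drift roughly mimics ~1 C over six decades.
import random

random.seed(0)
years = list(range(1950, 2011))
anoms = [0.016 * (y - 1950) + random.gauss(0, 0.1) for y in years]

# Ordinary least-squares slope of anomaly against year.
n = len(years)
x_mean, y_mean = sum(years) / n, sum(anoms) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(years, anoms))
         / sum((x - x_mean) ** 2 for x in years))

print(f"trend: {slope:.4f} C/yr, about {slope * 60:.2f} C since 1950")
```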

Muller's findings are nothing new to climatologists, and they may not convince the entrenched climate change deniers who are more interested in ideology than observation. For the majority of non-expert fence-sitters, though, Muller's work is extremely important, precisely because it shows the proper role for skepticism in climate research.

Muller is a respected scientist. Although his skepticism about the evidence for climate change had long been used by climate change deniers to bludgeon their opponents, he was more interested in pursuing the facts. He had worried that some weather stations (where the raw data on climate is collected) were far more sophisticated and accurate than others, and that the unevenness skewed the record. He had worried that the rise of massive cities in the past century could have distorted some of the results.

These were legitimate questions for a scientist to look at. Even though other teams of climatologists claimed they had accounted for such factors, there is nothing wrong with replicating research. That is one of the cornerstones of the scientific method. It's the reason science is more useful than simple belief: no matter who does the experiment, they will get the same result.

Muller was prompted to embark on this study after the 2009 “Climategate” fiasco, in which an international team of climatologists was accused of changing data to fit its understanding that the world was warming. Investigations into the team's activities by multiple groups, including the British government, revealed that its data was solid. Nevertheless, politicians and ordinary people, especially in the United States, became increasingly convinced that the world might not be warming. Muller's study puts that scandal to rest.

Deniers still have some wiggle room. Muller says that he still isn't sure whether or not the planet's warming is being driven by humans, and his study did not answer that question.

There will still be plenty of people who claim that global warming is entirely natural and that human emissions of carbon dioxide don't matter. It's sad that such a belief persists despite the fact that scientists have understood since the nineteenth century that greenhouse gases change atmospheric temperatures. It was all the way back in 1896 that Swedish scientist Svante Arrhenius first pointed out that human emissions of carbon dioxide were warming the Earth.

Mounds of evidence have built on Arrhenius's findings over the last hundred years. The basic mechanism that causes warming has never been contradicted, only refined.
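
That refinement can even be written down compactly. A widely used approximation in climate science relates extra CO2 to the additional energy it traps: a forcing of about 5.35 times the natural logarithm of the concentration ratio, in watts per square meter. Plugging in the figures reported by the WMO above gives a sense of scale (a sketch only; translating forcing into temperature requires a separate climate-sensitivity estimate):

```python
# Widely used simplified expression for CO2 radiative forcing:
#   delta_F = 5.35 * ln(C / C0)  watts per square meter.
# This gives a sense of scale only; converting forcing to temperature
# change requires a climate-sensitivity estimate not attempted here.
import math

C0 = 280.0  # pre-industrial CO2, ppm
C = 389.0   # 2010 CO2, ppm (WMO figure)

print(f"Forcing so far:      {5.35 * math.log(C / C0):.2f} W/m^2")  # ~1.8
print(f"Forcing at doubling: {5.35 * math.log(2):.2f} W/m^2")       # ~3.7
```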

There are still real scientific questions to pursue. Will warming lead to increased clouds that offset some of the heat on the ground? Will reservoirs of methane (an even more potent greenhouse gas) trapped in frozen lakes push warming past some “point of no return” as the lakes thaw? How much carbon dioxide can the oceans absorb, and how will that impact sea life?

These problems are being worked on by scientists now. They may overturn some of our current understanding of the details, but they won't change the fundamental role of human emissions.

In the meantime, the rest of us need to stop fighting about whether or not global warming is real. The whole time that we've been embroiled in an unproductive ideological battle, the planet has been changing. We need to start applying skepticism to the right questions about how we will deal with a reality that does not care if we believe in it.

Friday, November 4, 2011

Though Aristotle Supported Slavery, His Philosophy Would Abolish It

The hands of the marble statue “The Greek Slave,” carved in 1844 by Hiram Powers. Photograph by Zack Lee. Some rights reserved.


Aristotle allows for slavery in his classic account of virtue. Some philosophers criticize Aristotle’s ethics as inconsistent for this reason. However, Michael J. Sandel, the Anne T. and Robert M. Bass Professor of Government at Harvard University, argues that Aristotle’s criteria are strong enough to reject slavery on their own grounds. Sandel is correct: applied rigorously, Aristotle’s specifications would not permit slavery to persist.

Aristotle’s position on slavery hinges on two requirements: necessity and suitability to nature. Slavery is necessary in that society needs a division of labor to function. It is suitable to some people’s natures in that such a person is “capable of becoming…the property of another, and if he participates in reason to the extent of apprehending it in another, though destitute of it himself (1).”

The functioning of society does demand a division of labor, but that labor can be divided without the institution of slavery. Many countries operate without slaves today under democratic regimes. Men share child-rearing responsibilities with their spouses. Labor-saving technologies have reduced the time it takes to perform many household activities. Low-wage jobs offer compensation for even the bottom tiers of workers (though other inequalities persist in this system).

Aristotle could not have known of the societal institutions and technologies that have eroded slavery. They had not yet been developed in his time. Examples of advanced slave-less cultures were hard to find. As Sandel notes, “It’s worth recalling that these injustices persisted for more than two thousand years after Aristotle wrote (2).”

Aristotle’s second requirement is teleological; it posits that at least some people are naturally suited to slavery. But it also acknowledges that not all enslaved people are suited for that kind of a life. “Not all those who are actually slaves, or actually freemen, are natural slaves or natural freemen,” he writes (3).

To sort the natural from the unnatural slaves, Aristotle proposes a practical test. Sandel describes this as seeing “who chafes in the role or tries to flee (4).”

Few groups throughout history have actually offered slaves the ability to choose their position. Part of the perceived “suitability” to slavery was instilled via a systematic oppression by the ruling classes that went beyond mere physical abuse. In post-Civil War America, “freed” slaves faced a lack of education, a lack of money, and a forced segregation that blocked access to opportunities – what sociologist W.E.B. Du Bois called “the problem of the color line” in 1903 (5).

Du Bois found that many freed slaves were destitute in their new condition. He also pointed out, however, that a small proportion of African Americans were already excelling. He expected the number of blacks who succeeded to rise as American institutions and attitudes continued to reform (6).

Today, enough former slaves have demonstrated abilities to reason and participate in the polis to convincingly suggest that no human is naturally suited to slavery. If any twenty-first-century American were placed in slavery, he or she would no doubt “chafe in the role.” Coercion, which Aristotle says is a sign of injustice, would have to be applied to create a new slave class. It would fail the test of justice under his own rubric.

Sandel points out that teleological reasoning like Aristotle’s might in fact lead to a more powerful indictment of slavery than that of the modern liberal ethic in that it would also claim some freely chosen jobs to be unjust, because they are “so dangerous, repetitive, and deadening as to be unfit for human beings (7).”

Sandel’s example shows why teleology, at least when applied to certain aspects of human life, remains a powerful tool for moral philosophy. Its emphasis on purposes is open to revision based upon the introduction of new empirical evidence, which can shape our understanding of the inherent natures of peoples and institutions.

Aristotle’s philosophical shortcomings are less the result of Aristotelian inconsistency and more the result of limits to the perspectives available in ancient Greece. Only now that societies have tried living without slavery is the evidence available that it is neither necessary nor natural.

References:

1. Aristotle. The Politics. Translated by David Ross. (New York: Oxford University Press, 1925) Book I, chap. v, 1254b
2. Sandel, Michael J. Justice: What’s the Right Thing To Do? (New York: Farrar, Straus and Giroux, 2009) 200
3. Aristotle. The Politics. Translated by David Ross. (New York: Oxford University Press, 1925) Book I, chap. vi, 1255b
4. Sandel, Michael J. Justice: What’s the Right Thing To Do? (New York: Farrar, Straus and Giroux, 2009) 202
5. Du Bois, William Edward Burghardt. The Souls of Black Folk. (Chicago: A.C. McClurg & Co.; [Cambridge]: University Press John Wilson and Son, Cambridge, U.S.A., 1903) XXXI
6. Du Bois, William Edward Burghardt. The Souls of Black Folk. (Chicago: A.C. McClurg & Co.; [Cambridge]: University Press John Wilson and Son, Cambridge, U.S.A., 1903) 101-102
7. Sandel, Michael J. Justice: What’s the Right Thing To Do? (New York: Farrar, Straus and Giroux, 2009) 203