Data alert! A bit about graphs, maps, images, risk, statistics and uncertainty


Graphics and visuals like maps, charts, and timelines make information easy to understand and process. What might take paragraphs to explain can be summarized in a single image. Online graphics can now be interactive, allowing readers to explore the data themselves. But graphics can also mislead.

Is the time scale appropriate for the trend being presented? Does the graph show all of the data, or only a narrow window that conveys a skewed picture? There is always more evidence than what is presented or published; the key issue is whether the selection of evidence has compromised the true account of the underlying data (Tufte 2006).

In maps, “large scale” means zoomed-in, detailed. “Small scale” means zoomed out, general. Is the type of map appropriate for the data being presented?

Look at the categories and the legend. Maps can be manipulated to show what you want.

Photos are easily mismatched to the text and the headline.

Risk is the possibility that something might happen or bring about some result.

High probability = high predictability: the event is more likely to happen.

Low probability = low predictability: the event is less likely to happen.

People (and sometimes the media) tend to overestimate the danger of rare events yet underestimate dangers of more common events. People tend to misjudge the relative risks from food safety issues, for example ranking pesticide residues as posing a much greater threat to human health than harmful microorganisms or an unhealthy lifestyle (lack of exercise, poor diet). Yet the statistics show that people are far more likely to die from lifestyle-related diseases such as coronary heart disease and cancers.

In fact, the top causes of death in the US, according to the Centers for Disease Control and Prevention, are (1) heart disease, (2) cancer, and (3) stroke.

Perceptions and knowledge of risk depend on whether the risk is individual, community, or societal. People tend to overestimate the role of forces inside the individual, such as personality, ability, disposition, and motivation, as causes for human behavior and to underestimate the role of environmental or situational factors, such as the varied opportunities and obstacles that exist for people in different social classes. When applied to whole groups, these attribution errors become the basis for stereotypes.

People tend to assume that if they can control a situation, they are safer. We fear dying in a plane crash more than in a car crash, yet the number of traffic fatalities is far higher. Perhaps this is why we fear man-made disasters (radiation) more than natural disasters (tsunamis). Trace amounts of radioactive iodine are being detected in rain over the US (California and Vermont); each news story is quick to point out that the levels are low and pose no risk, but few offer any comparison to everyday risks.

People are more worried by dramatic but infrequent events than by “boring” risks like slipping on a wet floor, and alarmist, dramatic media coverage contributes to false risk perception. Take, for example, the shark attack. Conditioned by Jaws and now Shark Week, we fear sharks, yet bees, wasps, and snakes are responsible for far more fatalities each year. In the United States the annual risk of death from lightning is 30 times greater than that from shark attack. For most people, any shark-human interaction is likely to occur while swimming or surfing in nearshore waters. From a statistical standpoint, the chances of dying in this area are markedly higher from many other causes (such as drowning and cardiac arrest) than from shark attack. Many more people are injured and killed on land while driving to and from the beach than by sharks in the water. Shark attack trauma is also less common than such beach-related injuries as spinal damage, dehydration, jellyfish and stingray stings, and sunburn. Indeed, many more sutures are expended on seashell lacerations of the feet than on shark bites! (International Shark Attack File)

A second example: avian flu caused about 200 deaths over five years; an unlikely but possible mutation (from the guts of birds to the lungs of humans) could result in a horrendous pandemic, hence the alarmist media coverage. Yet as many as 40,000 people die each year from common seasonal flu (Wulf 2010).

Risk is the result of events, conditions and situations, called “risk factors.” Where a risk factor has been consistently linked to an event or situation, the factor is said to “cause” death or illness: HIV causes AIDS, asbestos causes mesothelioma, cigarette smoking causes lung cancer.

Beyond these well-proven exceptions, it is difficult to show that any one thing “causes” cancer, because cancer does not appear immediately after exposure, leaving time for other factors to come into play. Without a direct cause-and-effect relationship, there are only associations: strong relationships between a result/disease and an agent/situation, or risk factor. A risk factor is not a guarantee and not a cause, just an association, like that between high cholesterol and heart disease.

An association does not, by itself, indicate causation. Additional evidence is needed: the event must come before the result, and other explanations must be considered and ruled out. As humans, we seem wired to look for patterns and to want to explain things, hence our tendency to assume causation. But remember: correlation does not equal causation.
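
The point is easy to demonstrate with a toy simulation. In the sketch below (Python, with invented numbers), two series that merely share an upward trend, say ice cream sales and drowning deaths, both driven by a third factor such as summer weather, correlate almost perfectly even though neither causes the other:

```python
import random

random.seed(42)

# Two series that both trend upward over time but are causally
# unrelated. The numbers are made up purely for illustration.
n = 20
trend = list(range(n))
ice_cream = [10 + 2.0 * t + random.gauss(0, 1) for t in trend]
drownings = [5 + 1.5 * t + random.gauss(0, 1) for t in trend]

def pearson_r(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(ice_cream, drownings)
print(f"r = {r:.2f}")  # very close to 1, yet neither causes the other
```

A correlation near 1.0 here reflects only the shared trend: almost any two quantities that grow over time will correlate, which is exactly why an association alone proves nothing about cause.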

Statistics attempt to quantify risk, but statistics are frequently misused and abused. All research involves choosing what to study and how to study it. Statistics, when applied to data, measure the strength of relationships and how likely those relationships are to have arisen by chance. The greater the statistical significance, the less likely it is that chance alone, or some other factor, explains the relationship.
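
One way the distinction gets abused: significance depends on sample size as well as on the strength of a relationship. The sketch below (Python, illustrative numbers only) uses the standard t statistic for a Pearson correlation to show that a weak relationship can still be “highly significant” if the sample is large enough:

```python
import math

def t_statistic(r, n):
    """t statistic for testing whether a Pearson correlation r,
    measured on n samples, differs from zero."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# The same weak relationship (r = 0.1) is "significant"
# (|t| > ~1.96) on a huge sample but not on a small one.
print(t_statistic(0.1, 10_000))  # ≈ 10.0 -> significant
print(t_statistic(0.1, 30))      # ≈ 0.53 -> not significant
```

So a headline claiming a “significant” link may describe an effect too small to matter, while a strong effect in a small study may fail the significance test.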

Where we have considerable knowledge of outcomes, we have an objective probability for a given outcome. In a coin toss, we do not know which face will turn up, but we have objective probabilities for the result. In complex systems with many interconnected parts, scientists are often uncertain about the extent and magnitude of the connections; as a result, they have to make judgments about their strength, which is a subjective probability (Stephen Schneider, in Friedman et al.).
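
The coin toss can be simulated to show what an objective probability means in practice: no single toss is predictable, but the long-run frequency is. A minimal Python sketch:

```python
import random

random.seed(0)

# Objective probability: we cannot predict any single coin toss,
# but over many tosses the frequency of heads converges to 0.5.
for n in (10, 1_000, 100_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)  # frequencies approach 0.5 as n grows
```

Subjective probabilities have no such simulation to fall back on; they are expert judgments about systems whose workings are only partly known.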

The most believable results will have certain characteristics (Cohn):

Replication: They have been successfully repeated

Reevaluation: They have been tested by more than one method (mathematical technique)

Common attacks on statistics create the impression of numerous errors. Something is wrong with every sample, and pointing this out can begin the unraveling of any argument: the data are outdated, unrepresentative, missing, or full of outliers. A model with an r-squared of x percent “leaves 100 − x percent of the variance unexplained.” The scientist chose the wrong model (linear, nonlinear, random, etc.). When additional variables are included, the results become insignificant. Other factors could produce the same effect. Any inconsistency or complication in the data was deliberately obscured or omitted. Each of these attacks casts the perception of doubt. (Murray)
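
The r-squared attack works because the same number cuts both ways. A small self-contained sketch (Python, with invented, roughly linear data) computes r-squared for a least-squares line; the identical fit can be described as “explaining most of the variance” or as “leaving 30 percent of the data unexplained”:

```python
def r_squared(x, y):
    """Coefficient of determination for a least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
    a0 = my - b * mx
    ss_res = sum((c - (a0 + b * a)) ** 2 for a, c in zip(x, y))
    ss_tot = sum((c - my) ** 2 for c in y)
    return 1 - ss_res / ss_tot

x = [1, 2, 3, 4, 5, 6]
y = [2.5, 2.0, 5.5, 4.0, 8.0, 6.5]  # invented, noisy but trending data
r2 = r_squared(x, y)
# ≈ 70% of the variance explained, ≈ 30% "unexplained"
print(f"explained: {r2:.0%}, unexplained: {1 - r2:.0%}")
```

Neither framing is false; the rhetorical move is choosing the one that suits the argument.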

With our minds and our worlds filled with uncertainties, and our days filled with only 24 hours, we often fall back on judgmental shortcuts, called heuristics, to make sense of things. People reconcile what they see and hear with what they already know from personal experience, friends and family, religious beliefs, political orientation, values, etc.

If someone tells us that things are uncertain, we think that means that the science is muddled. Uncertainty is everywhere, and leads to errors in interpretation. All too often, health benefit and risk statements are presented as if they were authoritative, definitive, and based on clear and compelling evidence. The result? An Illusion of Certainty.

Scientists do not just reduce uncertainty; they actively construct it. They look for problems in their own work by asking questions and probing for gaps and alternative explanations. Uncertainty is different from indeterminacy (when all the parameters of a system and their interactions are not known) and from ignorance (when it is not known what is not known). Uncertainty means that the parameters are sufficiently known to make a qualitative judgment or attempt a conclusion; there is no such thing as absolute proof. Doubt (or curiosity, or skepticism) is crucial to science (to a scientist, claiming or acknowledging uncertainty maintains an appearance of objectivity), but it also makes science vulnerable to misrepresentation. Uncertainty can appear as controversy, because it is easy to take uncertainties out of context, create the impression that everything is unresolved, and thus plant seeds of doubt in the reader’s mind (Oreskes and Conway).

Another contributor to the illusion, as we’ve seen, is the news media’s habit of reporting research as “news,” presenting findings out of historical and scientific context as new, very preliminary, and potentially groundbreaking. Reports can celebrate the finding and downplay uncertainty. The accounts of each new project make it appear to readers that scientists are much more uncertain than they actually are: today’s news is easily contradicted by tomorrow’s reports. Other reports may emphasize early differences of opinion among scientists, highlighting uncertainty. Science is portrayed as a triumphant quest for certainty: the answer to a question, the solution to a puzzle, keys to unlock the door to knowledge, clues to a mystery. Often, the public is offered a view of the future in which scientific certainty returns: “Researchers hope to be able to predict the behavior of hurricanes more precisely”; “By improving their understanding of X, researchers will solve problem Y.” (Zehr and Stocking, both in Friedman et al.)

Watch out for these phrases, or at least think about them before you use them. This is the challenge: how do you communicate the “so what” without claiming future certainty?

– Think about the outlet and the audience, and select your topic carefully. If the “so what” is a stretch, maybe don’t write the story.

– Interview others. A caution: the presence of multiple voices in a media story about emergent science allows the reader to glimpse the degree of consensus, yet it may be difficult for readers to evaluate. Are the uncertainties so great that reasonable people cannot come to a resolution? Is the finding so novel that other scientists simply have no useful expertise? With the Internet, readers can assemble meaning themselves by cobbling together stories about the same topic from a variety of places and times. If you cannot tell who is telling the truth or where the consensus lies, then the best you can do is accurately capture the message and attribute it. Or, you can present an array of viewpoints and let the reader decide (or feel overwhelmed): “This focus on the journalist as a passive transmitter allows us to make accuracy the most important characteristic of a story and often to bypass issues of validity altogether…the objectivity norm urges journalists to leave their own analytical skills at home and to concentrate, instead, on conveying what they see and hear…if journalists are normatively limited to reporting rather than interpreting, then audiences are left to sift through the dueling representations of uncertainty themselves” (Friedman et al.).

– Explain changes in certainty or consensus. This requires historical context and knowledge of particular fields, and may be harder for a science generalist than for someone who specializes in certain subjects.

– Look at why people may be promoting or challenging uncertainty. We will look at this issue in more detail in a few weeks. If you say, ‘There is no evidence’, do you mean, ‘There are no studies done on X’, or, ‘There are lots of studies out there, and they show no risk of X causing Y’?

– Watch the use of anecdotes and false “trendsetting.” Anecdotes can be fine examples, but they are usually poor evidence. To a social scientist, what seems like a great interview with printable quotes is a convenience survey of an unrepresentative sample. Vivid anecdotes can interfere with a person’s judgment of risks (Griffin, in Friedman et al.). Make sure your examples are representative.

References

Best, Joel. 2001. Damned Lies and Statistics. Berkeley: University of California Press.

Best, Joel. 2004. More Damned Lies and Statistics. Berkeley: University of California Press.

Best, Joel. 2005. Lies, calculations and constructions: beyond How to Lie with Statistics. Statistical Science 20 (3):210-214.

Cohn, V. 1989. News and Numbers. Ames, IA: Iowa State University Press.

Cope, Lewis. 2006. Understanding and using statistics, pp. 18-25 in A Field Guide for Science Writers, 2nd edition.

Drum, Kevin. 2010. Statistical Zombies. MotherJones.com

Friedman, S.M., S. Dunwoody, and C.L. Rogers. 1999. Communicating Uncertainty. Mahwah, NJ: Lawrence Erlbaum Associates.

Gould, Stephen Jay. The Median Isn’t the Message.

Huff, Darrell. 1954. How to Lie with Statistics. New York: W.W. Norton.

Monmonier, Mark. 1996. How to Lie with Maps (2nd Ed.) Chicago: The University of Chicago Press.

Monmonier, Mark. 2005. Lying with maps. Statistical Science 20(3):215-222.

Murray, C. 2005. How to accuse the other guy of lying with statistics. Statistical Science 20(3): 239-241.

Niles, Robert. www.robertniles.com/stats/

Oreskes, Naomi, and Erik M. Conway. 2010. Merchants of Doubt. New York: Bloomsbury Press.

Rifkin, Erik, and Edward Bouwer. 2007. The Illusion of Certainty. New York: Springer.

Tufte, Edward R. 1983. The Visual Display of Quantitative Information. Cheshire, CT: Graphics Press.

Tufte, Edward R. 1997. Visual Explanations. Cheshire, CT: Graphics Press.

Tufte, Edward R. 2006. Beautiful Evidence. Cheshire, CT: Graphics Press.

 


Science Stereotypes


“Scientific discoveries are made by people; they don’t just happen,” wrote Ruth Levy Guyer in our textbook. Who are these people?

The scientist is often portrayed as an isolated man in a laboratory, driven by insanity, greed, or selfishness. He’s the awkward nerd spewing physics jargon in the face of a pretty girl. He’s the scatterbrained teacher who accidentally creates a monster or invents a miracle. She’s the hysterical woman whose message will not be heard. How can writers make scientist characters more accurate, diverse, interesting, and effective? Why should writers care about the truthfulness of their real or imagined scientists? One of the ways to improve the coverage of science in the news media is to focus stories on how science works. And one of the best ways to illustrate how science works is to show scientists doing science. This approach also appeals to the human desire for narrative structure, for a protagonist, for action in stories.

The Harris Poll consistently finds scientists near the top of the “most prestigious occupations,” after firefighters and above doctors, nurses, teachers, and military officers. The percentage of Americans who say scientists are ‘odd and peculiar’ has dropped, although one-quarter still agree (Losh 2010).

So why do scientists have such a bad image?

According to Chris Mooney and Sheril Kirshenbaum in their book Unscientific America, there’s something about scientists that triggers a particular kind of stereotyping, one that reflects our society’s uneasiness with the power they can sometimes wield.

As Steven Shapin recently noted, “the modern American scientist is held in some esteem, valued as a useful sort of person, but there is little understanding of what it might be to engage in scientific inquiry for its own sake and little evident approval of such a thing.”

Roslynn Haynes, a professor of English in Australia, identified seven primary stereotypes of scientists. According to Haynes, “the master narrative of the scientist is of an evil maniac and a dangerous man.” These stereotypes provide a useful framework for thinking about humanizing science writing.

The evil alchemist. Alchemy began with metalworkers in Egypt. When it was translated from Arabic writings to medieval Europe, it was associated with heresy and the black arts: the sinister magician, the devil’s worker, illegal, proud, arrogant, secretive, power-hungry. Dr. Faustus and Victor Frankenstein continue to provide metaphors for modern, cutting-edge research. The alchemist appears in plots that depend on the supernatural and the paranormal—stories in which the credulous believer is always right and the scientist-skeptic is always wrong.

The noble scientist. The first literary work to depict scientists in a positive light was Sir Francis Bacon’s utopian vision, New Atlantis (1627), which depicted the scientist as an altruistic idealist. Star Trek’s rational Mr. Spock brings order that often saves the Enterprise. Dennis Quaid’s character in The Day After Tomorrow advocates on behalf of all the ignorant people and risks his own life to save those caught in an unprecedented, climate change-driven storm. Bad Science author Ben Goldacre is critical of such portraits of scientists: “The media work around their inability to deliver scientific evidence by using authority figures, the very antithesis of what science is about, as if they were priests, or politicians, or parent figures. ‘Scientists today said…scientists revealed…scientists warned.’ And if they want balance, you’ll get two scientists disagreeing, although with no explanation of why (scientists are ‘divided’). One scientist will ‘reveal’ something, and then another will ‘challenge’ it. A bit like Jedi knights. The danger of authority figure coverage, in the absence of real evidence, is that it leaves the field wide open for questionable authority figures to waltz in.”

The foolish scientist (a.k.a. the absent-minded professor). Satires depict scientists as foolish, cultish, comic. The laughable, lovable absent-minded professor clumsily creates flubber and wins hearts with his eccentricity. “Scientists are unusually insightful and intuitive people who have a strong need for organization and application of concepts. They might have difficulty expressing their ideas, because they don’t think linearly, and their life of the mind could lead others to regard them as aloof,” wrote Steve Bunk. Gary Larson loved the foolish scientist. I like to think his mocking comes from a place of affection and respect.

The inhuman researcher. As science and society evolved, so did science stereotypes. Dr. Frankenstein fits here, too, as do the atomic scientists working on the bomb during World War II. The Cold War gave added weight to this stereotype, with scientists’ documented declarations of unconcern about the human cost of their inventions. What does it mean when we apply this metaphor to others, as in the “Frankenstein economy” of the 2008 financial meltdown on Wall Street?

The scientist as adventurer (e.g., Indiana Jones), as brave, optimistic explorer, traveler of space and time. Jon Palfreman (Nieman Reports 2002) sees television science documentaries as being drawn from a small handful of approved genres, one of which is “archaeology and legends: expeditions, lost treasures, mummies, dinosaur bones, mammoths, the use of forensic methods to uncover the past.” The other genres he identified are “forces of nature,” “modern history,” and “boys and their toys.”

The mad, bad, dangerous scientist. As science increased in power, so did the stereotypes, evolving from the alchemist tradition to the cataclysmic. In Unscientific America, Chris Mooney and Sheril Kirshenbaum wrote, “The uncaring scientist, unconcerned about consequences, pursuing knowledge at all costs—this is the ugliest scientist stereotype, and also the most deeply rooted. It hails from a long literary tradition, before Frankenstein to Greek stories that depict the search for knowledge as forbidden and dangerous, and leading to disastrous consequences. In this narrative, knowledge leads the scientist to play God, interfere with nature, and attempt to thwart fate by determining who lives and who dies.” The mad scientist can be easily exposed as a wannabe, a fraud, a rogue.

The helpless scientist. The seventh and last stereotype identified by Haynes is the scientist who becomes a victim of his or her own discovery. This is science out of control.

How pervasive are these stereotypes?

In The Big Bang Theory, scientists are unattractive, socially inept, indifferent, uncool, but smart: distant, long-winded, incomprehensible. The lead scientist in Bones, while female and attractive, adheres to the stereotypes of rationality (to a fault), atheism, and a cold lack of emotion. In fact, the cracks that emerge in this façade are a major plotline running through the series.

Movies and television portray scientists who are absorbed in the details of their work, ‘wedded to the job.’ A recent analysis of scientist portrayals on TV (Dudo 2010) found that of 2,868 characters, one percent were portrayed as scientists, mostly white males. Scientists are more likely to be characterized as good, although science as an activity is portrayed as dangerous and violent.

As David Kirby described in the 2011 book Lab Coats in Hollywood, filmmakers are not unaware of these stereotypes. Science consultants are frequently brought in to comment on scientific matters involving the script, the actors, the sets, the props, and any other relevant factor during production. Concerns about science in the movies have led several scientific advocacy organizations to develop programs to facilitate more scientific involvement in the production of television programs and films, including the National Academy of Sciences’ Science & Entertainment Exchange, the Creative Science Studio, and the Sloan Foundation’s Film Development program.

Why should any of this matter? As Kirby wrote, “Popular films impact scientific culture by effecting public controversies, enhancing funding opportunities, promoting research agendas, and stimulating the public into political action…Moreover, entertainment texts can influence scientific thought by foregrounding specific scientific ideas and providing narrative reasons to accept them as representing reality.”

As science writers, we have an obligation to write truthfully about science, and to portray scientists not as stereotypes, but as real people, just like you and me.

REFERENCES

Bunk, Steve. 2003. “The Natural History of Science Personalities.” Science Writers, Spring 2003.

Dudo, Anthony. 2010. Science on television in the 21st century: recent trends in portrayals and their contributions to public attitudes toward science. Communication Research.

Goldacre, Ben. www.badscience.net

Haynes, Roslynn. 2003. From alchemy to artificial intelligence: stereotypes of the scientist in Western literature. Public Understanding of Science 12:243-253.

Kirby, David A. 2011. Lab Coats in Hollywood. Cambridge, MA: MIT Press.

Losh, Susan Carol. 2010. Stereotypes about scientists over time among US adults: 1983 and 2001. Public Understanding of Science 19:372-382.

Mooney, Chris, and Sheril Kirshenbaum. 2009. Unscientific America (see Chapter 7).

Palfreman, Jon. 2002. “Bringing science to a television audience.” Nieman Reports.

Writing “How To” and “How It Works” Stories


Science writers live in a world somewhere between scientists and public audiences. Much of a science writer’s skill involves explaining complex ideas and navigating complicated language and concepts. This assignment is designed to give you practice in explanatory reporting (“How It Works”) and in understanding the scientific process (“How To”). Be creative! Either of the options could be applied to almost any topic. Be original: this is your chance to explain something in a way that no one else has done before, or to provide a unique set of instructions for a surprising activity.

Option 1: “How It Works”

Choose a scientific or natural concept, process, design, phenomenon, etc. and explain how it works in terms appropriate for a magazine or newspaper audience. You might want to look at the history of scientific discoveries related to your topic, analogies, graphics, etc. Examples include:

§  Drinking water

§  Suspension bridges

§  Ocean acidification

§  Cave formation

§  Mountain formation

§  Black holes

§  Global warming

§  etc.

Option 2: “How To”

Choose an activity, occupation, lesson, etc. and describe how to accomplish it. Write as if your audience knows nothing about the task. How does one start? What are things to keep in mind? Warnings or cautions? This does not have to be about a scientific subject, just something involving action: how to catch a fish, how to ski/climb/camp/etc., how to build a snowman. This is a literary angle used in both nonfiction and fiction.

Suggested reading:

  • William Zinsser, “Science and Technology” chapter, On Writing Well (30th Anniversary ed.)
  • L. Rust Hills, “How to Eat an Ice Cream Cone,” Fierce Pajamas: The New Yorker Anthology of Humor Writing
  • “Explanatory Writing,” A Field Guide for Science Writers (2nd ed.)

Please share your favorite examples of How-To and How-It-Works stories in the comments!

What’s your relationship with science?


Most people do not go to school to become scientists, and most Americans don’t have science education beyond high school. So when I ask, “What is your relationship with science?” you might think of using microscopes to study pond water in seventh grade, or your weird chemistry teacher in high school who liked to make Bunsen burner jokes, or a science fair for which you made a solar system out of Styrofoam balls. Try to recall your involvement with science throughout your lifetime. Did you love it, or hate it? Were you kept in a classroom or allowed to go outside to explore? When you hear the word “science,” what comes to mind?

ASSIGNMENT: Write 500 words on “My Relationship with Science.” Please submit to me via email or in hard copy on January 26.