Of Drunks, Lampposts, and P-values

When it comes to discussion, data rules, right? If it's data, then it has to be fact. And if it is a fact, it has to be accepted. End of discussion!

I've never really been comfortable with that position. My discomfort principally revolves around two concerns. First, are we really sure about these so-called facts? And second, have we critically examined the facts, or are we using them more, as they say, for support than illumination?

I have always loved that quip about facts and illumination. QuoteInvestigator attributes the earliest version of the allegory of drunks and lampposts to A. E. Housman's critique of his colleagues' scholarship:

"And critics who treat manuscript evidence as rational men treat all evidence, and test it by reason and by the knowledge which they have acquired, these are blamed for rashness and capriciousness by gentlemen who use manuscripts as drunkards use lamp-posts,—not to light them on their way but to dissimulate their instability.”

The more current and colloquial version goes like this: some people use statistics as a drunk uses lampposts, more for support than illumination. When this adage holds true, the search for meaning can go astray, particularly if the supporting facts are, well, not so factual.

I was reminded of this problem when I read a recent article at bloomberg.com, titled The Value of that P-Value. Let's assume a researcher wants to establish a hypothesized relationship between two variables, say weight and exercise. The researcher might compare the weights of participants who exercised with those of participants who did not. A standard approach starts from a null hypothesis, the assumption that there is no relationship between the variables, and then asks how likely results at least as extreme as the observed ones would be if that assumption were true; that probability is the p-value. If the calculated p-value falls below a predetermined significance threshold, generally 5%, the researcher rejects the null hypothesis (exercise has no effect on weight) and may comfortably infer a significant relationship between the two variables, i.e., that exercise does have an effect on weight. A data-driven "fact" has now been established, ready to be used as an argumentative lamppost.
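
To make that procedure concrete, here is a minimal sketch in Python using SciPy's two-sample t-test. The weights are simulated, and the group sizes, means, and spreads are purely illustrative assumptions, not figures from any real study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical weights (kg) for 50 participants in each group (illustrative only).
exercisers = rng.normal(loc=75, scale=10, size=50)
non_exercisers = rng.normal(loc=80, scale=10, size=50)

# Null hypothesis: the two groups have the same mean weight.
t_stat, p_value = stats.ttest_ind(exercisers, non_exercisers)

alpha = 0.05  # the conventional 5% significance threshold
print(f"p-value = {p_value:.4f}")
if p_value < alpha:
    print("Reject the null hypothesis: the weight difference is statistically significant.")
else:
    print("Fail to reject the null hypothesis: no significant difference detected.")
```

Change the seed or the assumed group means and the p-value shifts accordingly, a useful reminder that a single threshold-crossing result is a fragile thing.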

But what if that lamppost were built with faulty materials? What if the researcher adopted faulty research methodologies? Or, as alleged in the article, what if researchers knowingly adjusted the data to ensure a hypothesis-supporting p-value … cooking the methodological books, so to speak? Academic integrity issues notwithstanding, such practices leave us leaning on shoddily built lampposts, building arguments upon questionable support.
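
As a rough illustration of how that kind of book-cooking plays out statistically (my own sketch, not anything described in the Bloomberg article), the snippet below runs repeated comparisons where no real effect exists; given enough attempts, chance alone will produce a p-value under 0.05 that a motivated researcher could report as a "finding."

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05
n_attempts = 20
significant_hits = 0

for _ in range(n_attempts):
    # Both groups come from the SAME distribution, so the null hypothesis
    # is true by construction; there is nothing real to find.
    group_a = rng.normal(loc=0.0, scale=1.0, size=30)
    group_b = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < alpha:
        significant_hits += 1

print(f"{significant_hits} of {n_attempts} no-effect comparisons looked 'significant'")
# On average about 1 in 20 null comparisons will slip under 0.05 by chance alone,
# which is exactly why reporting only the one that did is so misleading.
```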

Shoddy research poses a serious problem for critical thinking in the classroom. Facts offer a wonderful anchor in classroom discussions; they keep us from drifting onto argumentative shoals and cognitive reefs. So students are often encouraged to base their conclusions and decisions on fact, not supposition. That is good advice, but not sufficient advice. Critical thinking does not start with the acceptance of facts; rather, it starts with gathering facts as one part of a knowledge structure. Before we build upon those facts, they need to be analyzed and verified: Are they supported by research? Is the research sound? Has it been verified or replicated by others? If not, we may be leaning on a weak lamppost.

Another issue rears its head when students not only rely on facts they have not analyzed, but select those facts to bolster a predetermined position. This is known as confirmation bias: the tendency to rely on information that supports a preconceived idea or decision, and to screen out any data or evidence that counters that predisposition. When your students manifest a confirmation bias by leaning on shoddy lampposts, well … they're likely to end up on the argumentative ground alongside our allegorical drunk.

Failing to think critically is more than an academic exercise, though; it can have profound real-world impact. Consider a recent article on recreational drinking from The Huffington Post. Doctors have been advising patients that recreational drinking may actually be good for their health, relying upon research that linked moderate drinking to longer life spans. It seems, though, that the research supporting that idea (the lamppost) is flawed. The original studies did not control for the fact that many non-drinkers avoided drinking because of serious pre-existing health issues. Thus the non-drinkers' higher death rate reflected those pre-existing health issues, not their abstinence. The upshot: many people may have imbibed more alcohol than they should have as a result of shoddy facts (a regrettable lack of illumination).
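
The flaw in that lamppost is a classic confounding problem, and a toy simulation makes it visible. Everything below is invented for illustration; the percentages are arbitrary assumptions, not figures from the actual research.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assume 30% of simulated people have a serious pre-existing condition.
sick = rng.random(n) < 0.30

# Assume sick people are far more likely to abstain from alcohol.
drinker = np.where(sick, rng.random(n) < 0.2, rng.random(n) < 0.7)

# Mortality depends ONLY on illness; drinking has no effect at all here.
death_prob = np.where(sick, 0.20, 0.05)
died = rng.random(n) < death_prob

print(f"Death rate among drinkers:     {died[drinker].mean():.3f}")
print(f"Death rate among non-drinkers: {died[~drinker].mean():.3f}")
# Non-drinkers die more often, not because abstaining is harmful but because
# the higher-risk (sick) people are concentrated in the non-drinking group.
```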

So, what can we do about these issues in our classrooms? In the middle-school to high-school arena, the emphasis should be on reminding students that not all facts are created equal. Since these students lack the statistical and methodological background of college and graduate students, it would be counterproductive to have them try to spot research or methodological flaws. They could, however, be asked to consider a so-called factual position that was later refuted by further research and discuss how the "facts" could have changed (e.g., the "world is flat" argument, or the supposed physical limitations of African-American pilots refuted by the Tuskegee Airmen).

Second, illustrate confirmation bias by asking students to develop a position on an issue ("How do you feel about global warming?") and support it with two pieces of evidence. After discussing the issue in a group or class setting, have each student go back and identify two pieces of evidence that counter their original position. Meet again, discuss again, and then ask the students to reflect on how there could be contrary evidence to their original position, and why they did not originally find or accept that counter-evidence. This helps initiate a meta-cognitive process by having them think about how they thought about their position.

Critical thinking is hard work, and it is almost impossible to improve if learners can't identify the shaky foundations of their own analysis. Your efforts to help learners assess flaws in the evidence at hand and recognize biases in their own analysis will go a long way toward meta-cognitive improvement, and will help them develop knowledge structures that stand on solid critical foundations … no lampposts needed.