Misleading statistics are easy to find in everyday life, as well as in our course resources. For instance, http://www.econoclass.com devotes an entire website to examples and explanations of the misleading statistics the average person typically runs across (Alden, 2005-7). Most problems with statistics fall into two categories: (1) misleading data and/or (2) selection bias. Within each category there are many variations. Data, for instance, can be manipulated in many ways: it can be completely false, compared without adjusting for baseline differences, or framed through arbitrary or misleading comparisons (Korn, 2011; Gelman & Nolan, 2002). Two of my favorite examples demonstrate how statistics can go wrong in both categories.
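Selection bias in particular is easy to demonstrate with a few lines of simulation. The sketch below uses entirely made-up numbers (a hypothetical population of commute times) purely to show the mechanism: surveying only a convenient, non-random slice of a population shifts the estimate away from the truth.

```python
import random

random.seed(0)

# Hypothetical population: commute times (minutes) for 10,000 workers.
population = [random.gauss(30, 10) for _ in range(10_000)]

# Unbiased estimate: a simple random sample.
random_sample = random.sample(population, 500)

# Biased "sample": surveying only people with long commutes,
# e.g., polling riders stuck on a delayed train.
biased_sample = [t for t in population if t > 35][:500]

mean = lambda xs: sum(xs) / len(xs)
print(f"population mean: {mean(population):.1f}")
print(f"random sample:   {mean(random_sample):.1f}")
print(f"biased sample:   {mean(biased_sample):.1f}")
```

The random sample lands close to the population mean, while the biased one overshoots it badly, and nothing in the biased numbers themselves reveals the problem. That is what makes selection bias so hard to spot from a reported statistic alone.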
First, consider the claim that “a woman over age 40 has a better chance of being killed by a terrorist than of getting married” (http://www.snopes.com). This claim is false in several ways. First, the original study rested on specific contextual circumstances that were never publicized, and it suffered from severe sampling bias. Second, a woman’s chance of marrying at any age has nothing to do with her chance of being killed by a terrorist. Third, the terrorism comparison was not part of the study at all; it was tacked on in a “Newsweek” article. Fourth, no one ever validated that the number of women killed by terrorists approximated, up or down, the 2.6% who marry after 40 (Mikkelson & Mikkelson, 2008).
Second, there is that wonderful myth that “people remember 10% of what they read.” Oh my gosh. That was a big one. When I investigated the background and research behind this claim, it made complete sense that it was faulty; until that point, however, I had never questioned the data. In fact, many people still believe it. Dr. Thalheimer’s research on this particular myth began before 2002, when he first debunked it (Thalheimer, 2006). When he contacted the authors of the study the figure was attributed to, they recognized neither the graph nor the statistics attached to it, making the citation erroneous. Further evidence of fabrication includes alteration of the study’s title, misspelling of one author’s name, an inaccurate publication date, and the misleading listing of a publishing house as a possible author. Even if these issues were resolved, the categories themselves invite inappropriate comparisons. As Dr. Thalheimer asks, “don’t you have to ‘see’ to ‘read’?” Clearly, seeing, hearing, reading, writing, and collaborating are all interwoven, and it would be difficult, if not impossible, to control for all the confounding variables.
As for my own hypothetical study, it would be easy to mislead the general population with regard to studies involving ADHD learners. For instance, many laypersons are unaware that ADHD has a high incidence of comorbidity with other disorders, such as depressive disorders, anxiety disorders, and Oppositional Defiant Disorder (ODD) (Aupperlee, Swank, Lien, & Ripinski, n.d.; Brown, 2002). Should the statistics from a hypothetical study show that a comorbid disorder, rather than the attention network being studied, had a significant effect on the outcome, that information could simply be left out of the report. Further, symptomology differs greatly among ADHD individuals, and there are in fact three classifications of the disorder. Without mentioning such specifics, a researcher could overlook the type of ADHD each participant has as a lurking variable. In retrospect, although the studies I read for my literature review allege that the researchers “matched” the ADHD learners with non-ADHD learners “as closely as possible,” who determines how close is close enough? There is also the possibility of sampling bias from region to region: some medical centers may diagnose more stringently than others, or cultural variables may be involved.
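The lurking-variable problem described above can be made concrete with a toy simulation. Everything here is assumed for illustration, not drawn from any real ADHD data: test scores are invented, and the comorbid condition is deliberately wired to be the true cause of lower scores while being far more common in the simulated ADHD group.

```python
import random
from statistics import mean

random.seed(1)

# Assumed mechanism (hypothetical): the comorbid condition, not ADHD
# itself, lowers scores by 15 points on average.
def score(has_comorbidity):
    return random.gauss(75, 5) - (15 if has_comorbidity else 0)

def person(p_comorbid):
    has_it = random.random() < p_comorbid
    return has_it, score(has_it)

adhd    = [person(0.50) for _ in range(1000)]  # assumed 50% comorbidity
control = [person(0.05) for _ in range(1000)]  # assumed 5% comorbidity

# Naive comparison: the ADHD group appears to score much lower...
naive_gap = mean(s for _, s in control) - mean(s for _, s in adhd)

# ...but within the no-comorbidity stratum the gap largely vanishes,
# exposing the comorbid condition as the lurking variable.
adhd_clean    = mean(s for c, s in adhd if not c)
control_clean = mean(s for c, s in control if not c)

print(f"naive gap: {naive_gap:.1f} points")
print(f"stratified gap: {control_clean - adhd_clean:.1f} points")
```

A report that presents only the naive comparison would look perfectly legitimate, which is exactly how a comorbidity effect could be passed off as an attention-network effect.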
All in all, the general rule of thumb would be to read the original research, read additional reputable research, and use some common sense.
From a more personal perspective:
As I was driving home from work today, I found myself reflecting on my life and on the many examples from it that seem to fly in the face of “statistics.” For instance, I am only 42, but I have been married three times, twice for more than ten years. I am only 42, but I have been in school for well over half my life. I am only 42, but I also spent over 15 years enduring infertility treatments, none of which were successful; yet I did eventually get pregnant and have two sons. My youngest son died when he was 3 years old in a fluke household accident with less than a 1% chance of occurrence. My sister-in-law, who lives and breathes a healthy lifestyle of sunscreen, diet, and exercise, was recently diagnosed with breast cancer. This same healthy, beautiful woman was never able to conceive. I am quite sure I could find statistics somewhere in complete opposition to most of these occurrences.
My point is not that I am all that strange, although I may be. The point is that statistics are just numbers, and they only have as much power as we give them. It is up to us to do our due diligence. When my son goes to the doctor and she grouses that he has not gained weight and is “statistically small for his age,” it is up to me to consider his genetic history, his ADHD medication, his eating habits, and the fact that he has grown 2 inches in 2 months and been swimming every day. Statistics, at best, describe a trend, a pattern, or a likelihood. Even knowing that statistics are accurate and reliable is not always enough to alter our decision making. A great example comes from when I was teaching high school Health. My students thought the rhythm method was a very feasible choice for protection, despite the 20-25% chance of getting pregnant. This particular statistic, which is fairly accurate, held no weight with 10th graders who could not believe it would happen to them. In stark contrast, you cannot live your life afraid of every statistic either. How many people struggle with weight, depression, anxiety, and so on? Most of them are well aware that a balanced diet and exercise are the best course of action, and yet they are unable to alter their lifestyle successfully over the long term. Statistics are just a tiny piece of the puzzle. We need all of the pieces for a complete picture.
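The rhythm-method figure above is worth a quick arithmetic aside: failure rates like 20-25% are typically quoted per year of use, so the risk compounds over repeated years. The sketch below assumes, purely for illustration, that each year is independent with the same failure probability; real couples' year-to-year risks are not truly independent, so treat this as a rough upper-bound intuition rather than a prediction.

```python
# Illustrative compounding only: probability of at least one failure
# over n years, assuming an independent per-year failure probability p.
for p in (0.20, 0.25):
    for years in (1, 3, 5):
        at_least_once = 1 - (1 - p) ** years
        print(f"p = {p:.0%}, {years} year(s): {at_least_once:.0%}")
```

Under these assumptions, a 25% per-year rate climbs past a 75% chance of at least one pregnancy within five years, which makes the 10th graders' "it won't happen to me" reaction all the more striking.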
Alden, L. (2005-7). Statistics can be misleading. Retrieved July 30, 2011, from http://www.econoclass.com/misleadingstats.html
Aupperlee, J., Swank, M. G., Lien, M., & Ripinski, A. (n.d.). Attention deficit hyperactivity disorder. Retrieved July 20, 2011, from https://www.msu.edu/course/cep/888/ADHD%20files/Home.htm
Gelman, A., & Nolan, D. (2002). Lying with statistics. In Teaching statistics: A bag of tricks. New York: Oxford University Press. Retrieved from http://www.stat.columbia.edu/~gelman/bag-of-tricks/chap10.pdf
Korn, B. (2011). Responsible thinking. Retrieved July 30, 2011, from http://www.truthpizza.org/logic/stats.htm
Mikkelson, B., & Mikkelson, D. (2008). Marry go round. Retrieved July 30, 2011, from http://www.snopes.com/science/stats/terrorist.asp
Thalheimer, W. (2006, May 1). People remember 10%, 20%…Oh really? [Web log message]. Retrieved from http://www.willatworklearning.com/2006/05/people_remember.html