By Leela Velautham
This was a statistic heralded by President-elect Donald Trump in a speech during the election campaign to illustrate the apparently huge number of unemployed Americans and thus to expose the perilous state of the American economy (Appelbaum, 2016).
However, considered critically, it is also a statistic that is incredibly misleading.
Trump may be correct that fewer Americans, as a percentage of the total population, are engaged in traditional employment today than in previous decades. However, the statistic above does not serve as proof that more Americans are unemployed; it is more indicative of the fact that 20% of American households are headed by retirees. In citing this statistic, Trump tacitly classified retirees, 16- to 17-year-olds, and stay-at-home moms as being within the ranks of the unemployed. Although this classification may be technically accurate, it is misleading with respect to informing the public about the general state of the economy.
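To make the distinction concrete, here is a minimal sketch, using hypothetical round numbers rather than the actual figures at issue, of how counting everyone without a job yields a far larger (and more alarming) percentage than the conventional unemployment rate:

```python
# Hypothetical, illustrative population counts (in millions) -- not actual data.
total_adults = 250
retirees = 50
students = 15
stay_at_home = 10
actively_seeking = 8  # only this group counts as "unemployed" by convention
employed = total_adults - (retirees + students + stay_at_home + actively_seeking)

# Counting everyone without a job conflates very different groups:
not_employed = retirees + students + stay_at_home + actively_seeking
print(f"Not employed: {not_employed} million ({not_employed / total_adults:.0%})")

# The conventional unemployment rate counts only those seeking work,
# as a share of the labor force (employed + actively seeking):
labor_force = employed + actively_seeking
print(f"Unemployment rate: {actively_seeking / labor_force:.1%}")
```

With these invented numbers, the "not employed" share is several times the unemployment rate, even though both figures are arithmetically accurate; the rhetorical work is done entirely by the choice of numerator and denominator.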
The impact of statistics
A prevailing view in social psychology has been that of ‘cultural cognition’ – the idea that people form risk perceptions, and thus make decisions and form worldviews, that cohere strongly with their cultural and political values (Kahan et al., 2011). The theory has been used to explain why certain groups do not believe in climate change or the effectiveness of vaccines, despite overwhelming scientific evidence to the contrary: groups have a tendency to view empirical evidence in a biased manner, accepting evidence that fits with their beliefs at face value while holding disconfirming evidence to higher critical standards (Lord et al., 1979). People are thus held to be uninfluenced by facts that do not fit within their existing views, discarding any information contrary to their closely held beliefs. However, cultural cognition theory has been repeatedly discounted by experiments carried out by the Reasoning group, which have empirically demonstrated the catalyzing effect of even a single, critical statistic in changing a citizen’s view on a social issue or policy, regardless of political or group identification (Ranney et al., 2016).
The power of germane numbers can be illustrated by a study carried out by Michael Ranney and colleagues in 2001 (Ranney et al., 2001). In this study, US-based participants were asked to estimate the current legal immigration rate and to state their preference for what they thought this rate should ideally be. The median estimate was a rate of 10%, with the median initial preference being to keep the status quo (a rate of 10%). The participants were then shown the actual legal immigration rate (0.3%). After receiving this feedback, the median participant switched from their status-quo preference to wanting immigration to become thrice its current rate (i.e., roughly 1%) – a belief revision and change in policy preference prompted by a single, salient number.
One’s beliefs regarding a topic such as immigration, global warming, or nationalism are understood to be based on a series of connected ideas that include personal experiences, media information, religious opinion, and more general epistemic and experiential understandings of the topic. When a person is exposed to a particularly surprising or shocking number, these understandings are challenged, inducing a cognitive conflict between previous beliefs and the new information. This conflict is usually resolved by revising and/or reorganizing previous beliefs to incorporate the new, striking piece of information via the Piagetian process of accommodation (Piaget, 1964) or the mechanism of conceptual change (Chi, 2008), shifting one’s position to a qualitatively different view of the issue than that previously held. Such a process is captured by Thagard and Ranney’s Theory of Explanatory Coherence, which characterizes how people change their beliefs in ways driven by considerations of explanatory coherence, and how belief networks are modified to maintain coherence with new information (Ranney & Thagard, 1988). According to the theory’s data priority principle, evidence that is critical, germane, repeatable, and credible carries maximal weight in our belief systems – indicating that numerical information can carry notable weight with respect to prompting accommodative belief revision (Ranney & Schank, 1998).
Such conceptual restructuring upon receiving new or surprising information is not necessarily a bad thing; indeed, it is what we characterize as ‘learning’, or what the Gestalt psychologist Wertheimer would term ‘productive thinking’ – the process by which becoming aware of a gap or knowledge void prompts a person to increase global coherence amongst their beliefs (Wertheimer, 1959). Perception of this knowledge void is heightened by surprise – and thus it is the most surprising numbers and stories that have the most potential to spawn considerable cognitive change and belief revision (Ranney et al., 2016). It has also been found that the more surprised people are by numbers, the less knowledgeable they report feeling about an issue – and thus the more open they are to changing their beliefs in line with the number (a phenomenon known in the media as ‘the establishing effect’) (Yarnall & Ranney, 2017). This is illustrated by the fact that participants who were surprised by the immigration rate in the 2001 experiment, for instance, were four times more likely to significantly change their positions on the issue than participants who were less surprised (Ranney et al., 2001).
Defining Critical Thinking
Numerical data, however, are only as helpful as they are reliable and accurate. Media sources and elected officials risk misinforming people with incorrect, misleading, or unrepresentative data that are often not critically vetted by the press or by the public themselves.
How can we give people the tools to resist being misled by such deceptive statistics and figures?
One possible avenue would be to encourage the development of ‘critical thinking’ in the general populace. ‘Critical thinking’ is often defined as a Gestalt-like process of learning through becoming aware of one’s own ignorance. Ranney and Schank extended this definition by hypothesizing that critical thinking means thinking more like a scientist – forming opinions using ‘scientific’ as opposed to ‘plain old’ reasoning (Ranney & Schank, 1998). This kind of thinking employs more formal tools, such as deduction and alternative-hypothesis generation; it is more likely to involve a vigilant search for disconfirmation, and it makes the reasoner more selective about which new information is accommodated. Such scientific reasoning is understood to involve more empiricism, objectivity, rigor, and accountability, and less emotion, than what is commonly understood as social reasoning.
An important component of such critical thinking – and, some would argue, a distinguishing feature of it – is an awareness of the thought process itself: for instance, an awareness of how new information may fit with prior beliefs, and a conscious assessment of whether a statistic or figure offers strong evidence for what it claims. Such regulatory thought processes, which assess the act of learning as it takes place, are commonly defined in the education literature as ‘metacognition’ – a necessary prerequisite for the development of expertise in a subject (Sternberg, 1998). Metacognition has been shown to be fostered in the classroom through several techniques, such as encouraging students to brainstorm and generate their own responses, and to learn actively rather than simply being shown the right answer (Schoenfeld, 1987). Argumentation in the classroom is another way metacognition has been fostered, and it has been shown that students who articulate, interrelate, and revise their own arguments are more resistant to the biasing influences of extraneous information (Kuhn et al., 2013).
Developing the Intervention
Our aim was to develop a short, text-based intervention that would promote critical thinking about statistics, drawing people’s attention to common features of uninformative and misleading statistics and thus enabling them to more easily differentiate between misleading and representative statistics.
To start with, we drew on a numeracy curriculum developed and piloted for journalism students at UC Berkeley by Michael Ranney in 2008 (Ranney et al., 2008). This curriculum was created in response to journalists’ reported tendency to avoid backing up stories with relevant quantitative information. In one exercise in the ‘Numbers, News and Evidence’ module of the curriculum, a fictional colleague called ‘Pat’ offered a series of alleged statistics, one-third of which were correct, with the remaining two-thirds being higher or lower than the true values. Journalism students exposed to the curriculum viewed Pat’s statistics increasingly critically the more statistics they were exposed to, indicating that exposure to a mixed set of statistics in itself promotes increased skepticism with respect to quantitative information.
We extended this exercise for our training, providing a number of statistics that we asked participants to rate on a -4 to +4 scale, depending on how misleading, revealing, and/or pointless they found them. In some of the example statistics, we left a blank where the numerical portion of the statistic would be (i.e., __% as opposed to 42%), in order to ascertain whether this would have any effect on how the participants thought about and/or rated the statistic. Here, we were drawing on previous work of the Reasoning group, which has suggested that the practice of NDI (numerically driven inferencing – being asked to estimate unknown quantities related to important policy issues before receiving the true values as feedback) fosters critical thinking. Not seeing or being told the number directly means that participants have to go through the step of estimating what the quantity would be, activating a network of facts, set relationships, and causal beliefs about an issue. Such an activation mirrors the eliciting of prior knowledge in a classroom, where students are often encouraged to voice the misconceptions or prior beliefs about a subject that they bring into the classroom (Hewson & Hewson, 1983). The justification is that if students do not perceive a conflict between their prior knowledge and new information, they are likely to simply assimilate the new information, forming a flawed and inconsistent mental model. When the learner does perceive such a conflict, however, the process of belief revision occurs, leading to conceptual change and ‘productive’ learning. It is this process of active accommodation that we wish to activate in participants, because it is through this process that the evidential quality of new information is most likely to be critically assessed.
Another aspect of the training was providing space for participants to self-explain, or ‘think aloud’, their ratings of the statistics. This was informed by research carried out by Chi, who has shown that self-explaining leads to a deeper understanding of the material covered and to the improved acquisition of problem-solving skills (Chi et al., 1994).
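The estimate-then-feedback cycle at the heart of NDI can be sketched in a few lines of code. This is a hypothetical illustration only: the prompt wording, the function names, and the log-ratio surprise measure are our own illustrative choices, not the materials or scoring actually used in the studies cited above:

```python
import math

def ndi_trial(prompt, true_value, get_estimate):
    """Hypothetical sketch of one NDI item: elicit an estimate of a
    policy-relevant quantity BEFORE revealing the true value, then
    quantify how far off (and so, roughly, how surprised) the
    participant was, in orders of magnitude."""
    estimate = get_estimate(prompt)                    # activates prior beliefs
    surprise = abs(math.log10(estimate / true_value))  # orders-of-magnitude gap
    return estimate, surprise

# Example using the 2001 immigration item: median estimate 10%, actual 0.3%
est, surprise = ndi_trial(
    "What is the annual US legal immigration rate (% of population)?",
    true_value=0.3,
    get_estimate=lambda prompt: 10.0,
)
print(f"Estimated {est}%, actual 0.3%: off by {surprise:.1f} orders of magnitude")
```

The key design point is the ordering: the estimate is committed before the feedback arrives, so the true value lands against an explicit prior rather than being passively read.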
Alongside example statistics to rate, we also gave participants textual instruction that drew explicit attention to potentially misleading and/or non-representative aspects of statistics, such as quantities lacking temporal or spatial breadth and quantities lacking measurement precision. We also encouraged the examination of causality and of the source of statistics – thereby encouraging the activation of mechanistic as well as numerical reasoning, which we hope will lead to misleading or misrepresentative information being more readily discounted.
The intervention was specifically designed to focus on building metacognitive and quantitative critical reasoning skills – an aspect missing from the majority of college statistics curricula. Indeed, a study by Sorto (2006) found that only 1.3-2.6% of statistics curricula dealt explicitly with statistical reasoning (i.e., how to form inferences and generalize from statistics), with the majority over-emphasizing the procedural, overtly ‘mathematical’ end of statistics. Specifically mathematical training was something we wanted to steer away from in this intervention, knowing that, for a large portion of the population, any form of explicit mathematical and/or scientific training is likely to generate high levels of fear and anxiety, with a knock-on effect on an individual’s motivation, persistence, and even reasoning ability (Birenbaum & Eylath, 1994).
The use of facts and figures in contemporary politics is a double-edged sword: on the one hand, numbers and statistics ground statements in a much-needed objective reality; on the other, the ‘authority’ of such numbers and statistics can be very easily exploited. It is even arguable that repeated references to shocking and misleading numerical information by journalists and policymakers have led to an increased distrust of experts, facts, and data by the American public, and have thus ushered in an era of ‘post-truth’ politics – a politics in which debates are conducted via highly emotive appeals rather than on the basis of verifiable facts. All this has led to a reality in which truth is hard to distinguish from fiction: a recent Stanford University survey showed that more than 80% of supposedly digital-savvy students could not tell the difference between a real news story and a fake piece of sponsored content (Donald, 2016).
The reality of having a president unconstrained by facts, together with a media landscape polluted with fake and unverifiable clickbait, poses an urgent challenge for educators, particularly those of math and science. Our challenge is to equip students and the general population with the tools to question and refute outright lies while appreciating the value of facts, statistics, and verifiable truths in public debate. Testing the effectiveness of the intervention described above will shed light on the cognitive mechanisms at play when people engage with facts and statistics, and is thus a step in the right direction with respect to empowering future citizens to behave intelligently in an increasingly complex and uncertain future.
Appelbaum, B. (2016, August 8). Fact-checking Donald Trump's economic speech. Retrieved from New York Times: https://www.nytimes.com/2016/08/09/us/politics/donald-trump-fact-check.html?_r=0
Birenbaum, M., & Eylath, S. (1994). Who is afraid of statistics? Correlates of statistics anxiety among students of educational sciences. Educational Research, 36(1), 93-98. doi:10.1080/0013188940360110
Chi, M. T. (2008). Three types of conceptual change: Belief revision, mental model transformation, and categorical shift. International handbook of research on conceptual change, 61-82.
Chi, M. T., De Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive science, 18(3), 439-477. doi:10.1207/s15516709cog1803_3
Donald, B. (2016, November 22). Stanford researchers find students have trouble judging the credibility of information online. Retrieved from Stanford Graduate School of Education: https://ed.stanford.edu/news/stanford-researchers-find-students-have-trouble-judging-credibility-information-online
Hewson, M. G., & Hewson, P. W. (1983). Effect of instruction using students' prior knowledge and conceptual change strategies on science learning. Journal of Research in Science Teaching, 20(8), 731-743. doi:10.1002/tea.3660200804
Holan, A. D. (2016, December 13). 2016 Lie of the Year: Fake news. Retrieved from Politifact: http://www.politifact.com/truth-o-meter/article/2016/dec/13/2016-lie-year-fake-news/
Kahan, D. M., Jenkins‐Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14(2), 147-174. doi:10.2139/ssrn.1549444
Kuhn, D., Zillmer, N., Crowell, A., & Zavala, J. (2013). Developing norms of argumentation: Metacognitive, epistemological, and social dimensions of developing argumentive competence. Cognition and Instruction, 31(4), 456-496.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098. doi:10.1037/0022-3514.37.11.2098
Piaget, J. (1964). Part I: Cognitive development in children: Piaget development and learning. Journal of research in science teaching, 2(3), 176-186. doi:10.1002/tea.3660020306
Ranney, M., & Thagard, P. (1988). Explanatory coherence and belief revision in naïve physics. Proceedings of the Tenth Annual Conference of the Cognitive Science Society (pp. 426-432). Hillsdale, NJ: Erlbaum.
Ranney, M. & Schank, P. (1998). Toward an integration of the social and the scientific: observing, modeling and promoting the explanatory coherence of reasoning. In S. Read & L. Miller (Eds.), Connectionist models of social reasoning and social behavior (pp. 245-274). Mahwah, NJ: Lawrence Erlbaum.
Ranney, M. A., & Clark, D. (2016). Climate change conceptual change: Scientific information can transform attitudes. Topics in Cognitive Science, 8(1), 49-75. doi:10.1111/tops.12187
Ranney, M., Cheng, F., Nelson, J., & Garcia de Osuna, J. (2001). Numerically driven inferencing: A new paradigm for examining judgments, decisions and policies involving base rates. Paper presented at the Annual Meeting of the Society for Judgment and Decision Making.
Ranney, M. A., Munnich, E. L., & Lamprey, L. N. (2016). Increased wisdom from the ashes of ignorance and surprise: Numerically-driven inferencing, global warming, and other exemplar realms. Psychology of Learning and Motivation, 65, 129-182.
Ranney, M. A., Rinne, L. F., Yarnall, L., Munnich, E., Miratrix, L., & Schank, P. (2008). Designing and assessing numeracy training for journalists: Toward improving quantitative reasoning among media consumers. In P. A. Kirschner, F. Prins, V. Jonker, & G. Kanselaar (Eds.), International Perspectives in the Learning Sciences: Proceedings of the Eighth International Conference for the Learning Sciences, Volume 2 (pp. 2-246-2-253). International Society of the Learning Sciences, Inc.
Schoenfeld, A. H. (1987). What's all the fuss about metacognition? Cognitive science and mathematics education, 189.
Sorto, M. A. (2006). Identifying content knowledge for teaching statistics. In Working cooperatively in statistics education: Proceedings of the Seventh International Conference on Teaching Statistics, Salvador, Brazil.
Sternberg, R. J. (1998). Metacognition, abilities, and developing expertise: What makes an expert student? Instructional science, 26(1-2), 127-140.
Wang, A. B. (2016, November 16). 'Post-truth' named 2016 word of the year by Oxford Dictionaries. Retrieved from The Washington Post: https://www.washingtonpost.com/news/the-fix/wp/2016/11/16/post-truth-named-2016-word-of-the-year-by-oxford-dictionaries/?utm_term=.c9b26a04b4a1
Wertheimer, M. (1959). Productive thinking. M. Wertheimer (Ed.). New York: Harper.
Yarnall, L., & Ranney, M. A. (2017). Fostering scientific and numerate practices in journalism to support rapid public learning. Numeracy, 10(1). Retrieved from http://morenumerate.org/downloads/YarnallAndRanney-Numeracy2017.pdf
Leela Velautham is currently a Master's student in the E.M.S.T. program (Education in Math, Science and Technology) at the Graduate School of Education, UC Berkeley. Her research interests include scientific and statistical literacy in the general public, and climate change mitigation through education- and psychology-based interventions. She can be reached at email@example.com