21st Century Literacy
(a free textbook)
An Introduction to 21st Century Literacy
Literacy has always meant much more than simply reading and writing.
For most of human history, traditional literacy programs have focused on venerating cultural authorities by reading and replicating sacred literature, which promoted cultural reproduction.
By the late 19th century, with the secularization of education in Western Europe and the United States, literacy programs gradually turned to the reading and venerating of fictional literature, especially novels and short stories, which promoted introspection and subjective opinions.
Unlike in the past, 21st century literacy entails critical thinking and metacognition. To communicate successfully in school, on the job, in politics, and in life, students now need advanced cognitive and emotional skills. They also need to be able to research and judge reliable information, to read objectively, and to write clearly.
Our organization seeks to redefine literacy skills for the 21st century. Human beings need to be able to think critically and to self-monitor their mental processes so they can actively construct, evaluate, debate, and use their knowledge in a global, multicultural world.
We teach the practice of literacy with a focus on science, evidence, and critical thinking. We also teach how to be a successful college student, a short history of literacy and democracy, and an introduction to literature with a focus on understanding human psychology and culture.
We are trying to bring more science to the teaching of writing, more writing to the teaching of science, and more critical thinking to the teaching of reading.
Our materials and workshops are based on highly acclaimed books by Josh M. Beach, including How Do You Know? The Epistemological Foundations of 21st Century Literacy (2018), Can We Measure What Matters Most? Why Educational Accountability Metrics Lower Student Learning and Demoralize Teachers (Forthcoming, 2021), and The Myths of Measurement and Meritocracy: Why Accountability Metrics in Higher Education are Unfair and Increase Inequality (Forthcoming, 2021). Beach has been an educator and educational administrator for over 20 years, from kindergarten to the university, across the U.S., South Korea, and China.
21st Century Literacy is more than just reading and writing. It is knowing how to learn and know. Utilizing scientific research on cognition and meta-cognition, students need to understand how the brain creates and uses subjective knowledge, and the different processes that create objective knowledge. Students need to know how concepts work to define and categorize knowledge, and how concepts can be organized into conceptual frameworks that interconnect facts into larger fields of knowledge.
Students need to be able to understand concepts as tools, which can be used to solve real-world problems. Most importantly, students need to recognize threshold concepts, which enable new ways to see and know the world. Two of the most important threshold concepts involve learning to see writing as two separate tools. First, writing is a tool for thinking and knowing. And second, it is a tool for communicating knowledge and persuading people to see the truth.
Students need to understand the theoretical purposes and the concrete practices of research, thinking, and writing. Psychologists call this holistic understanding "meta-cognition," which means "thinking-about-thinking" and "thinking-about-doing." Such higher order thinking enables us to better understand ourselves (both our strengths and limitations), which then enables us to know better and perform better. Students need to be able to do, not just know.
This free web textbook will utilize these learning tools. Threshold concepts will be explained as concrete writing and thinking practices, and these concepts will be interconnected into the following conceptual frameworks: (1) the history of literacy, (2) how knowledge is created and how different forms of knowledge are used as tools to know, and (3) how knowledge is communicated through writing.
These core concepts will be combined into a single concrete process, which is set within a specific social context. This book is about constructing and debating knowledge in 21st Century multicultural societies. This focus on process, rather than products, is based on the concept of social interaction through language as the fundamental basis for learning and knowledge creation.
And as the specific social context of multiculturalism implies, 21st century literacy must also include political literacy. Students need background knowledge and training to become engaged citizens capable of fostering the public good. This important form of literacy will not be fully covered by this book, but the links between literacy, public schooling, democracy, and political freedom will be introduced and explained, especially in the first part of the book focused on the history of literacy.
21st century literacy is a collection of many higher order skills. Students need to be able to critically evaluate the reliability of diverse sources of knowledge in order to construct knowledge with scientific methods. Literacy also entails openly arguing with diverse groups of people in order to explain and prove the truth.
But we cannot forget that these 21st century skills are built on the foundation of traditional literacy: reading, writing, and basic mathematics. Knowledge is the essential first step to good communication and effective action.
Truth has to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. And this truth needs to be effectively communicated to diverse audiences through arguments in order to direct collective action to solve real-world problems.
Orality & Literacy (Part 1) The Origins of Human Communication
1. Understanding literature and literary genres has been a cornerstone of the traditional liberal arts curriculum. But rarely is this important subject discussed within its broader evolutionary context. It is essential for 21st century students to understand not only the origins of human communication, but also how the act of communication changes in relation to various social technologies. Social media is not a new phenomenon. It refers to any technology, or tool, that facilitates communication and social interaction. Many different types of social technologies have been developed over the last 10,000 years of human history. It is important to understand how our ability to communicate is shaped by technology. Humans have used many different communication tools over the centuries, such as oral discussion, the book, the newspaper, and the internet. Each tool enables and constrains our ability to create and debate knowledge. Let us explore the earliest social media.
2. Many people do not realize that reading and writing are relatively new human skills in our evolutionary history. Humans reached their present state of biological evolution around 40,000 years ago (Diamond, 1992, p. 47). The earliest forms of human communication were oral languages (Ong, 2002, pp. 28-29) and artistic expression (Diamond, 1992, p. 170). As scientists discovered in the 20th century, the form of communication we use affects our ability to think about the world and to create knowledge (Goody, 1977, pp. 10, 37). Thus, understanding how oral communication works is doubly important. We should understand the structure of oral communication because it is still a vital medium of social exchange. And we need to understand how modern forms of communication, like print literacy or computer literacy, work differently.
1.1 Orality & Literacy: Origins of Human Communication
3. Oral cultures use language and art to create meaningful stories called myths, which people use to understand their world and pass on important traditions. However, oral cultures are highly limited by their mode of communication. All the knowledge of a particular culture has to be memorized by a trained specialist (Boyer & Wertsch, 2009), a professional storyteller (Havelock, 1963; Rubin, 1995). Knowledge consists of memorizing important stories passed down as tradition: stories about gods, heroes, important battles, how the seasons change, and more mundane skills, like how to hunt or make a spear. Knowledge is also a performance, a social act. The community often participates with the storyteller in retelling or singing the story together in a ritualized way (Havelock, 1963, pp. 44-47). In ancient Greece, oral performance consisted of acting out events through the production of public theatre, which involved singing, dancing, drunkenness, and prizes for competing storytellers (Demand, 1996, pp. 213-14).
4. Writing and reading were developed only about 5,000 years ago, although it is still unclear which cultures actually invented writing independently and which cultures merely borrowed the technology. Most scholars agree that writing was developed independently in at least three places: the Middle East, China, and, later, in Central America. Once writing was developed in a society, it opened up new possibilities for communication and critical thinking. Writing created a way to permanently store information as an object, which opened new opportunities for thinking about accuracy and meaning.
5. It also allowed information to travel beyond the single performance of a local group. The written text could be independently read by many people over longer periods of time and over greater distances. A text may have been written down, or authored, by a specific person, but not always. Regardless, written texts were still social, albeit not shared by the whole society. They became the intellectual property of a small group, often students and teachers, rather than the shared property of the whole society.
6. A text also became an object, which was independent of its author or group. Thus, people could either read a text to understand its author's intended meaning, or readers could bring new meanings into the text that the author did not intend. A text allowed a reader the time and mental space to think about the idea being communicated. It also allowed the reader to agree or disagree with that idea or how it was communicated (Goody, 1977, pp. 37, 78, 149; Goody, 1986, pp. 12, 78), which is a process we now call reader-response theory (Fish, 1982).
7. Writing opened many new possibilities for critical thinking that were not previously available. In oral cultures, a person was only able to memorize or listen to stories, and it is difficult, if not impossible, to think about a story while in the act of memorizing it, listening to it, singing it, or dramatically acting it out. When people read a text, they have more options. They can re-read passages, investigate the meaning of words, or they can stop reading at any time to think about what they have read and ask questions, like whether or not the information is true or false, good or bad (Goody, 1986, pp. 38, 78; Ong, 2004/1958, pp. 110-111). The development of writing opened the door to a new intellectual practice called philosophy, the art of critical thinking.
1.2 Philosophy: The Origins of Critical Thinking
8. In ancient Greece, India, and China the invention of writing gave birth to a new activity called critical thinking, or "philosophy." The word philosophy is derived from two ancient Greek words: philo ("love") and sophia ("wisdom"). In India, the ancient Sanskrit word for philosophy is darsana, which means using rational thought to "see clearly" (Nussbaum, 1997, p. 45). Around the 6th century B.C.E., philosophia and darsana were new intellectual practices that brought many benefits. The philosopher sought to investigate a text and compare it to reality to see if it was true or false. Furthermore, philosophers created and tested theories about the world in order to explain how and why things were the way they were (Gottlieb, 2000). Philosophers were also concerned with virtue and ethics. They wanted to know if certain ideas or activities were good or bad. Knowledge of the true and the good enabled wisdom, which can be roughly defined as using knowledge practically to live life better and to enable a more prosperous and harmonious society (Nozick, 1989, pp. 267, 270).
9. In ancient Greece, philosophy was practiced by sophoi (“wise men”). In ancient India, they were called parivrajaka, or "wanderers" who sought truth and spiritual enlightenment. These men created truth, lived their truth, and they used rhetoric to debate values and to arrive at some collective notion of the good life. They also were teachers who tried to educate the young about the truth and wisdom through discussing important social problems and demonstrating the best way to live (Gottlieb, 2000; Skilton, 1997, pp. 14-17). It is important to remember that most sophists did not consider truth to be singular, nor did they agree about what the good life entailed. Even within the fairly homogeneous culture of ancient Greece, there were always competing truths, values, and ways of life. Likewise, in ancient India, the parivrajaka had many different ideas about spiritual enlightenment and the best way to live, and different sects competed for followers and financial support (Skilton, 1997, pp. 17-18). Sophists and parivrajaka constantly debated over different visions of the good; thus, they had to master rhetoric as well as critical thinking in order to persuade others why one version of the "truth" was right and another version wrong.
10. Socrates is perhaps the paradigmatic ancient Greek sophist, or teacher of wisdom, given what we know about him through the stories of Plato. A sophist was a professional teacher, who may or may not have accepted money for teaching. Socrates taught for free. While Socrates created and lived his own truth, he also debated with fellow citizens and sophists, never taking his own ideas too seriously and always examining all ideas of goodness and truth. He investigated the oral myths of his society and found many to be false and immoral (Havelock, 1963).
11. Many people did not appreciate his critical investigation of the traditional truths that everyone in his society took for granted. But Socrates took his vocation as sophist very seriously, and he continued to ask difficult and unsettling questions. In fact, he took his vocation as philosopher so seriously that he died for his pursuit of truth and his way of life when his teaching became unacceptable to his society. Offered the chance to be exiled or to die, Socrates chose death (Plato, 1997). His followers later used his example as a martyr for truth to attack other teachers who merely used philosophy as a vocation to make money. Socrates' student Plato turned the word sophist into a loaded label to attack these working philosophers, some of whom corrupted truth in search of profit. It was Plato who popularized the new term philosophy as a disinterested practice of truth for the sake of truth.
1.3 Training Elites: The Origins of Schools
12. For thousands of years, literacy and critical thinking were reserved for a relatively small group of social and political elites. These tools enabled merchants to record their business inventories and financial transactions and think of new ways to build profits. Writing and thinking enabled political rulers to record laws, preserve the official history, and stay in power. And they enabled religious leaders to write down sacred stories and codes of conduct to preserve the moral order. Official histories and sacred stories were primarily transmitted to the common people using the older oral practices. The majority of people were not literate because they still found their older oral traditions important and meaningful, and because there were not many schools to teach common people how to read, write, or think. Also, reading and writing were not really necessities for most people who spent their short lives doing manual labor in the service of a monarch or aristocratic lord.
13. For much of the world up until the 19th century, schooling was reserved for a small population of male elites in each society. This privileged group learned to read and write for two primary purposes: to be a bureaucrat in the service of the king, or to be a priest in the service of the church. Over time, a few more occupations opened up: law, medicine, and education. In Europe, by the 13th century, privileged boys were chosen around the age of seven to begin instruction at newly created schools called "universities," where they studied the liberal arts of Latin grammar, Latin rhetoric, and philosophy. Successful students became a “master of arts” by around sixteen or seventeen, and then continued to study medicine, law, or theology for four more years (Ong, 2004/1958, pp. 136-37). Students could use their medical or law degree in service of the king, or their theology degree in the priesthood. The priesthood also enabled a career as a professor in the university, which was an official organization of the church (p. 152).
14. In ancient East Asia, schools taught young boys how to read and write in the imperial language of the Chinese empire. Literacy included the memorization of classical Chinese texts and ritualized socialization in the various arts of war and formal etiquette (Mote, 1971). While Confucian and neo-Confucian educational principles did stress individual development as “self cultivation,” the emphasis of formal schooling, especially in later neo-Confucian institutions, focused more on situating the individual within the hierarchical “structure” of society than on actually developing the potential of individuals (Kalton, 1977, pp. 6-9, 82). Thus, much of a student’s instruction was geared toward a socialization process, whereby one learned appropriate social discourse, deference to superiors, and traditional rituals. Instruction culminated in a final “examination” that served as the gateway to a social title and a position in the state bureaucracy.
15. This East Asian educational system produced a small population of literate and cultured elites, trained in a traditional and largely unchanging body of ethical and technical knowledge. In South Korea these literate elites, known as yangban, served as ministers in a "rigidly hierarchical bureaucracy" and ran the day to day operations of the state (Palais, 1984; Seth, 2002, pp. 9-12). The yangban class became a hereditary aristocracy during the Choson dynasty (1392-1910), and thus, access to quality education and the civil service examination became restricted by birth. The Korean system of schooling, much like the Chinese system, was based on socializing a small political elite, teaching them how to attain and keep "power, privilege, and status" (Seth, 2002, p. 12; Lett, 1998, pp. 19-21).
1.4 The Printing Press & Mass Literacy
16. It wasn't until the invention of the printing press in China and Europe, and later the European Protestant reformation, that literacy began to spread to the common people. Early printing presses were invented independently in China in the 11th century and in Korea in the 13th century (Febvre & Martin, 2010/1958, pp. 75-76). The modern movable type printing press was first invented in Europe around 1450 by the German printer Johannes Gutenberg. As paper became cheaper, the printing press enabled the spread of books, newspapers, and literacy (Wright, 2007, p. 110). By 1500, around 8 million books had been printed, but this number grew astronomically over the next 200 years to almost 200 million (Febvre & Martin, 2010/1958, p. 115). But even up until the 18th century, most people still could not read or write, and "the book was still the preserve of a small and favored elite" (Febvre & Martin, 2010/1958, p. 104).
17. The primary book that European printing presses sold was the Bible. The diffusion of this book to greater numbers of people created a need for literacy. People who owned a copy of the Bible wanted to commune with the words of God directly, instead of listening to a priest, so they learned how to read. Increased access to books created a new demand for teaching literacy, both for adults and children. Increased literacy and ownership of books, in turn, led to new generations of readers that "valued book-learning" and the need for more complex systems of schooling, including the creation of higher education (MacCulloch, 2003, pp. 73, 75).
18. The development of widespread literacy and the mass-production of books eventually led to two important cultural developments in Europe. These developments not only furthered the creation of new knowledge, but they also led to new political debates over how society should be governed. First, humanist scholars began to develop new methods for analyzing the authenticity, accuracy, and meaning of books, which created a new style of academic learning that would lead to a Socratic type of philosophical criticism of Christianity and monarchism. The second cultural development was the Protestant Reformation, which sought to "overthrow the old ecclesiastical system" of the Roman Catholic Church in order to create new forms of religious devotion based on direct readings of the Bible and personal communication with God (MacCulloch, 2003, p. 83).
19. The Protestant Reformation led to a profoundly new emphasis on literacy and the education of common people. It produced a new cultural focus on reading the Bible, which meant the development of public and private schools for literacy instruction (MacCulloch, 2003, pp. 583-590; Howe, 2007, p. 449). These fundamental cultural developments in Europe led to a revolution in learning, which in turn led to a revolution in politics with the birth of modern democracy. In the 21st century, we must never forget the important connection between literacy, critical thinking, and political freedom.
Education & Democracy (Part 2) The Origins of Public Schooling and the News Media
1. For most of human history, common people were exploited by elites and often discouraged or prohibited from going to school. Once common working people began to read, write, and send their children to school, the whole structure of society began to change. These newly educated people no longer wanted to be ordered about by established elites. Eventually, common people began to rise up. They debated traditional elites and demanded new democratic systems of government, a revolutionary idea which continues to powerfully shape the 21st century. Increased knowledge not only fosters political responsibility and economic development, but it can also open up new possibilities, which can lead to momentous transformations in technology, society, and government.
2. In Europe during the early modern period (1500s - 1700s), the Catholic Church and established monarchs were concerned that the spread of literacy (and later public schooling) would undermine both religious and secular authority (Glenn, 2011; MacCulloch, 2003). If people could read and write, as well as be able to critically analyze the religious, political, or economic views of their leaders, then these activities might very well lead to dissent and possibly to revolt. Religious and political authorities were right to worry (MacCulloch, 2003; "How Luther went viral," 2011). As late as 1832, the Catholic church warned against the corrosive influence of literacy, public schooling, and democracy. In the papal encyclical On Liberalism and Religious Indifference, Pope Gregory XVI attacked modern education, which "corrupted" youth and led to "the perversion of morals," the "destruction of public order," and the "overturning of all legitimate power" (as cited in Glenn, 2011, p. 138).
3. As more and more people began to read, write, and think for themselves, they also began to demand some say over how their society should be organized, who should lead it, and on what terms. This led to the development of "popular sovereignty," the idea that governments and political leaders should act in the general interests of the majority, the "people." This idea was inspired by the democracy of the ancient Greeks and the republicanism of the ancient Romans, but its modern variation was largely invented by the English in the 17th century (Morgan, 1988), and later developed by the Americans who used the notion of popular sovereignty to rebel against the British (Wood, 1991, p. 243). The idea of popular sovereignty eventually led to the idea of modern democracy and to several democratic political revolutions during the 17th and 18th centuries in England, the United States, France, and Haiti. In a democracy, the common people did not just want to be represented by the government; they also wanted to fully participate in the political system and rule themselves.
2.1 The Political Origins of the Newspaper
4. But how would the uneducated majority learn about their best interests and become responsible citizens? How would the fractured and powerless people become mobilized into responsible voters who would debate issues and make their political will known? Books were an important source of information, but they remained relatively expensive through the 19th century. They were also harder to transport due to weight and harder to hide from snooping authorities due to their size. Thus, one of the primary tools of political education, mobilization, and participation during the 17th to 18th centuries was the newspaper (Wood, 1991, pp. 60, 107).
5. Newspapers were smaller than books and thus much cheaper to make, faster to produce, and easier to distribute and hide. The broadside and the pamphlet were the earliest and cheapest forms of the newspaper. The broadside was a single large piece of paper printed front and back on heavy, quality paper (usually rag stock). It included advertisements, engravings, literature, political opinions, and news. The pamphlet consisted of a few sheets printed front and back and folded into a small book-like object. This early form of printing often used engravings or woodcuts, which were produced by local artists or by printers themselves (King, 1991; "Portrait of the artist," 2011). Later newspapers were produced by large machines (printing presses), and they took on what would become their standard form: large pieces of cheap paper printed in black and white, with text on the front and back, and folded into pages.
6. While newspapers served an important political function, it is important not to forget that they were primarily business ventures supported by advertising, either directly selling the wares and services of the printer who made the paper, or by selling the services and products of local businesses. Early forms of the newspaper were first used by printers to advertise the books and print services they were selling, sometimes including excerpts from the books they made. Gradually, newspapers became mostly a medium to discuss current events and share opinions. By the 18th century, newspapers had become a profitable business and an essential tool to inform the public about important issues and events. But we must remember that any "news" printed in newspapers up through the 19th century was highly local and reflected the biased opinions of the printer, who usually wrote all the articles himself, or had his apprentice write them.
7. Like ancient oral stories or religious texts, the popular press was a "social media" that brought people together through common cause and motivated people to act ("How Luther went viral," 2011). Newspapers were usually bought, sold, read, and discussed in local meeting places, like pubs and coffee houses, establishments that often served a variety of functions. A single pub offered a number of services: restaurant, bar, hotel, general store, post office, and later, a voting station ("Back to the coffee house," 2011).
8. By the 18th century, newspapers became a primary medium of democratic discussion and political activity. They were a "marketplace of ideas" (Schmuhl & Picard, 2005). Newspapers served many important social functions: a public forum to post information, a site for competing political groups to share views and debate, a tool to mobilize political participation, and a focal point for discussion and debate during social gatherings (Schudson, 2008; Thorson, 2005).
9. Accordingly, monarchs and the church began to fear the power of the popular press. These established authorities often took steps to control, censor, and sometimes ban the production or distribution of certain newspapers or books. In many places around the world, governments still to this day censor the press. Sometimes they even jail journalists who write on forbidden topics.
10. Political elites also learned how to use the popular press to mobilize their own supporters and wage verbal and ideological battles against their democratic opponents. These reactionary attacks eventually led to the development of "interest" based politics (Clemens, 1997; Wood, 1991, p. 257), and newspapers played a central role in publicizing the causes of various religious and political groups ("How Luther went viral," 2011; Schudson & Tifft, 2005, p. 19). To promote their cause, politicians and other political actors used the "competitive democratic" political arena of the popular press to win the support of voters who would, thereby, vote for a candidate (Wood, 1991, p. 257). Once elected, these politicians would later try to create laws to promote the particular interest of the group that supported them, like subsidies for farmers, tax breaks for merchants, or political rights for women.
11. Although interest based politics had become the only game in town, the established elites often used the press to promote their own traditional ruling class interests. Elites would often claim to be disinterested, above the fray of fractious democratic politics, and focused on political unity. Conversely, they claimed their democratic opponents were selfish, unruly, and bent on political conflict, sometimes even violence. James Madison famously argued in the Federalist newspaper article No. 10 that the ruling elite should use the government as a "disinterested and dispassionate umpire" to settle the petty squabbles of the "different passions and interests" of the common people (as cited in Wood, 1991, p. 253).
12. But not everyone was fooled by such clever ploys. The 19th century Norwegian dramatist Henrik Ibsen (2009/1882), in his famous play An Enemy of the People, warned against the power of political elites to manipulate public opinion in the popular press. Ibsen showed how the press can both inform voters about important issues and manipulate voters with the clever lies of politicians. Politicians themselves were very aware of this fact. In 1807, as president of the United States, Thomas Jefferson complained to a friend, "The man who never looks into a newspaper is better informed than he who reads them, inasmuch as he who knows nothing is nearer the truth than he whose mind is filled with falsehoods and errors" (as cited in Schudson & Tifft, 2005, p. 19). From the very origins of modern democracy, political propaganda went hand in hand with the popular press.
2.2 The Development of Public Schooling
13. While newspapers and books became available to a wide audience of general readers by the 18th century, formal schooling was still largely restricted to a small class of privileged elites. Up until the 18th century, most households in Europe and America educated their children at home, and their instruction primarily consisted of learning to read the Bible (Glenn, 2011, p. 3; Howe, 2007, p. 449). The Protestant reformer Martin Luther was one of the first educational reformers to argue that parents were not qualified to teach their children; thus, "public" school systems needed to be built. Reformers further argued that the state needed to "compel the people to send their children to school" (as cited in Glenn, 2011, p. 5). Luther called for the "secularization of the organization, though not in any respect of the content, of schooling" (Glenn, 2011, p. 4). By the early 19th century, Prussia had become the first European state to create an organized, secular "public" school system (Glenn, 2011), which became the envy of the world. The German system influenced American educational reformer Horace Mann, who helped create the first American public school system in Massachusetts (Cremin, 1980).
14. By the 1840s, America had one of the highest literacy rates in the world. But subordinated non-white minorities and the growing immigrant working class were often excluded from public schooling, and black slaves were prohibited from reading entirely. According to the U.S. census, which first asked questions about literacy in 1840, about 91 percent of adult white Americans were literate. This rate was similar to that of Prussia (in northern Germany), the most literate country in Europe at the time, and much higher than that of England, whose population was only 59 percent literate (Glenn, 2011, p. 455). In the New England states, the American literacy rate was 98 percent and above, while the state with the lowest white literacy rate was North Carolina, at 72 percent, still a high proportion of literate adults (p. 455).
15. Prussia and the United States were the two nations with the highest literacy rates in the 19th century, and they also had the most developed systems of public schooling. Americans had embraced the idea of state-funded public schools during the Revolution, but individual states did not really begin to support “common” primary schools open to the general public until the early 19th century. Common schools were first established in the East and Midwest, with the southern states lagging behind because of popular suspicion of centralized schooling and aristocratic contempt for educating the common masses (Cremin, 1980). Only about 40 percent of southern white children attended schools (Howe, 2007, p. 452), and it was still largely illegal to teach black people to read until after the Civil War (Anderson, 1988).
2.3 Institutions of Higher Education
16. As a bastion of privileged elites, most institutions of higher education, up until the middle of the 19th century, were intensely local, highly religious, and discriminatory. In the United States by 1848 there were 113 small, liberal arts colleges, mostly founded by various Protestant denominations, especially Presbyterians, Methodists, and Baptists. Only 16 colleges were state-funded public institutions (Howe, 2007, pp. 459, 462). These institutions of higher education enrolled a small population of wealthy, white, young, Protestant men (Cremin, 1980, pp. 400-409; Thelin, 2004, p. 107), although four colleges did enroll women before mid-century (Howe, 2007, pp. 460-61). Educated elites lived mostly in New England, so higher education developed from this geographical base. Most American colleges were located in the East and Midwest until the late 19th century, and the Northeastern establishment remained the center of the American intellectual world until at least the mid-20th century.
17. Northeastern colleges were formative in the socialization of wealthy American gentry. This liberally educated gentry class actively excluded many groups from full participation in the social, political, economic, and educational opportunities that America had to offer (Dawley, 1991). By the late 19th century, a social and political transformation was taking place in Europe and the United States, as democratic pressure began to open up schools and the labor market to more Americans. Slowly, very slowly, more middle-class and lower-class people were able to gain upward social mobility, political representation, and economic stability (mostly because of various radical social movements and protests). Because of this increased opportunity, more American young adults gained access to elementary schools, high schools, and higher education.
18. Public colleges, which later developed into research universities, did not spread widely across the U.S. until the second Morrill Land-Grant Act of 1890, which institutionalized steady state funds for higher education. By the late 19th century, practically oriented and publicly open state systems of public higher education began to emerge in places like Wisconsin and California, and similarly oriented private universities also emerged, such as the University of Chicago and Johns Hopkins University. These new institutions were academically modeled on the modern Prussian research university. Gradually they became more democratically oriented, as they broadened their student base to include a larger swath of ambitious white, middle class Americans, especially white Protestant women. These institutions also began to develop, according to the educational leaders of the time, a new national-oriented Americanism, rationalized professional standards, depoliticized civil service training, and a Protestant-infused mission focused on efficiently engineering solutions to social problems in the name of the public good (Thelin, 2004; Veysey, 1965).
2.4 Opening Up: Democracy & Education
19. It would take over a century for the “progressive” social and political movements to open American society and its systems of primary, secondary, and post-secondary education to a majority of citizens (Dawley, 1991; Foner, 1998; Jilson, 2004). Only five percent of the 19- to 22-year-old population was enrolled in an institution of higher education in 1910. A more diverse array of white, middle class, Protestant men were the first to break into exclusive American colleges after the Civil War (excluding the few Roman Catholic colleges that exclusively served Catholic men, a largely Irish population). White, middle class Protestant women took advantage of co-educational public institutions, and by 1880, women constituted about one-third of all American college students (Howe, 2007, p. 464).
20. By the turn of the 20th century, other white ethnic/religious groups, like reformist Protestant sects, Jews, and Catholics, were allowed greater access to mainstream institutions of higher education, but there were often implicit, if not explicit, discriminatory quotas that limited particular ethnic and religious groups to a certain percentage of the total student population. Only belatedly, in the second half of the 20th century, did the most disadvantaged Americans (non-white ethnic and racialized minorities, the working class and poor, and the physically and learning disabled) gain full political rights and access to some form of higher education (Foner, 1998; Jilson, 2004; Thelin, 2004).
21. While education has been central to the development of democracy across the world, access to education has been restricted in most countries due to various forms of discrimination that lasted until the late 20th century. Now that education has become more and more important in our globalized world, many people have argued that access to public primary and secondary schools and universities should be universal in order to give everyone a fair chance at success, especially in democratic countries; otherwise, education becomes just another means of discrimination by the rich and powerful. But the idea of social and political equality for all people is still a new idea, as most democracies (including the U.S.) have actually excluded (and continue to exclude) many groups of people from full participation in society and the political system. In the 21st century, the spread of literacy will have to be combined with the spread of social and political democracy in a process economist Amartya Sen (1999) has called "development as freedom."
22. Democracy can be a complicated notion to understand because this word is often used in two different ways. Democracy can be used to describe an actual form of government that certain nations practice. Democracy can also be used to express an ideal form of government that may or may not have ever been practiced by an existing nation. Both the idea and the practice of democracy are very, very old, about 2,500 years (Dahl, 2000). The first documented democracies were the ancient Greek city-states, most famously Athens, which formed around 500 B.C.E. Another ancient example was republican Rome, formed several hundred years later. The word democracy comes from the Greek word demokratia, which combines the words demos (people) with kratos (to rule). Thus, democracy meant ruling or governing by the people. The word republican comes from the Latin res publica, which combines the words res (thing or affair) and publicus (public or the people). Thus, republican meant “the thing that belonged to the people” (p. 13).
23. But these early democracies never extended citizenship to all the people who lived within the nation. In both Greek and Roman societies, there were different classes of people, and not everyone was considered worthy of freedom or participating in government. Both societies held slaves, who were considered non-persons. Women, children, and foreigners were also considered non-persons. Without the full rights of citizenship, these groups were not free and they could not participate in government. Rome also distinguished between rich and poor. The small, ruling class of rich people were called patricians, and the majority of poor people were called plebeians. It took a long time for plebeians to gain full citizenship in ancient Rome, and even once they did, very few were able to realize and practice full political rights, such as participating in government (Dahl, 2000).
24. There have been a few other historical examples of democratic governments after ancient Greece and Rome, but the most influential example would become The United States of America. But like its predecessors, America was also divided by various classes and types of people, and not everyone could be a full citizen with political rights and the responsibility to participate in government. When the nation was founded after the Revolutionary War, in the late 1780s, citizenship was restricted to white males, but not all white males, because most states had property requirements for a white man to gain full legal rights and be able to vote and participate in government. Women, children, and non-white people, like Blacks and Native Americans, were not allowed to be citizens.
25. Political scientist Robert A. Dahl (2000) has argued that democracies need to have inclusive citizenship, freedom, and equality in voting and participation in government if these types of governments are to be truly called democracies. Using Dahl's definition, the United States of America did not really become a democracy until 1965, when the Voting Rights Act, for the first time in the country’s history, enabled almost every person in America to have full legal rights (pp. 38, 85; Foner, 1998; Jilson, 2004). However, to this day some groups of American citizens, like homosexuals and Native Americans, still do not have full political and social rights.
26. The ideal form of democracy was expressed by Thomas Jefferson in The Declaration of Independence, where he stated that a democracy was a form of government made by free people to protect their freedom, lives, and property. The free people living under a democratic government were expected to participate in that government by making their ideas and needs known, and to make sure that the government did not do anything to endanger their rights. If the government was not working well or if it started to infringe on the people’s rights, then citizens had the ability to change the government or make a new one that would work better. At the heart of any ideal democracy are two fundamental principles: political equality and an educated citizenry. First, political equality means that every member of the society, all the people, should have full rights and should be able to freely and equally participate in the government and society at large. And further, for people to be able to build a fair democratic government, they need to be educated so that they can make good political decisions.
27. However, education for democracy means more than just memorizing facts and learning an occupation. Education for democracy means something more personal. It means learning how to understand one’s self and the world one lives in, to grow as a human being, to be able to communicate with fellow citizens, and to have the thinking and social skills necessary to participate in society and government. Both democracy and education, according to the philosopher John Dewey (1916/1966), were ways of living (pp. 6, 89, 99). The purpose of living an educated and democratic way of life was to realize one’s full humanity and to change the world for the better (Dahl, 2000, pp. 8-9; Gutmann, 1987).
2.5 Democracy as a Way of Life
28. The nation called The United States of America has developed over several centuries and is still in flux today. The American people have almost all been immigrants coming to North America, both freely and as slaves, from diverse parts of the globe. Here they have mingled together, often violently, not only with one another, but also with the native inhabitants. From the start, notions of an “American” nation and an “American” people were contested ideological battlegrounds on which diverse participants verbally, symbolically, and physically fought over the defining contours of a nation. The idea of America remains to this day an unsettled and contested ideological terrain – the contours of which remain divisive and ever changing. As the historian David Waldstreicher (1997) pointed out, the history of our country "shows us that America's common political culture consists of a series of contests for power and domination, contests over the meaning [of America]...and who counted as truly 'American'" (p. 352).
29. Professor of education James A. Banks (2008) has argued that a major problem facing modern, multicultural nations is “how to recognize and legitimize difference and yet construct an overarching national identity that incorporates the voices, experiences, and hopes of the diverse groups that compose it” (p. 133). One solution to this problem has been offered by English professor Gerald Graff (1992). He argued, educators should show students that “culture itself is a debate” and, thereby, “teach the conflicts,” which define American culture both past and present: “Acknowledging that culture is a debate rather than a monologue does not prevent us from energetically fighting for the truth of our own convictions. On the contrary, when truth is disputed, we can seek it only by entering the debate” (pp. 8, 12, 15).
30. The historical record makes it very clear that America has rarely been either a democracy or an equal society, and if America is ever to become a fully functioning democracy, then more and more Americans need to learn how to participate effectively in their culture and political processes. Many Americans have challenged the elitist, antidemocratic, and exclusive pronouncements of political and social leaders since the birth of this nation. And yet courageous and inspiring individuals have not always been able to change their world in the ways that they would have liked. American democracy is an unfinished project that still requires knowledgeable, committed, and courageous individuals who will work on furthering the ideals of equality, freedom, and justice for all people – not just a privileged few.
31. But the political knowledge and engagement of Americans has been stagnant, if not decreasing, over the past 50 years. According to one report, recent college graduates have about the same political knowledge as the high school graduates of the 1940s. The opportunity to teach college students how to be knowledgeable and engaged citizens has been “wasted” for the past half century (Colby et al., 2008, pp. 45, 51). Because many young people don’t care about their public responsibilities as democratic citizens, economic inequality and social injustice in America have been steadily growing over the last quarter century (Mishel, Bernstein & Allegretto, 2007). One of the great challenges of democracy is to create an educated and informed citizenry willing to stand up and work for democratic values. Most of the students I have tried to educate over the last fifteen years have not known the meaning of democracy, let alone tried to practice or promote it.
32. Politics is a defining feature of all of our lives (Smith, 2011). It is important for all Americans to learn how to enter and contribute to public debates, whether for scholarly purposes or for social and political discussions with friends, family, or strangers. There are many political settings in which we take part: learning in a school, listening to a public lecture, reading a book, going to a political event, or participating in a social organization. Each one of these activities is political, and each enables people to argue over what is good and what is right. The literary theorist Kenneth Burke (1941/1973) called the myriad political debates of all societies the “unending conversation” of history (pp. 110-111). Burke’s unending conversation was a metaphor for the pursuit of human knowledge and the peaceful practice of citizenship (Beach, 2012). This conversation is composed of all the people who actively try to understand their world and its problems, so that they can debate the best solutions for a better world:
Imagine that you enter a parlor. You come late. When you arrive, others have long preceded you, and they are engaged in a heated discussion, a discussion too heated for them to pause and tell you exactly what it is about. In fact, the discussion had already begun long before any of them got there, so that no one present is qualified to retrace for you all the steps that had gone before. You listen for a while, until you decide that you have caught the tenor of the argument; then you put in your oar. Someone answers; you answer him; another comes to your defense; another aligns himself against you…the discussion is interminable. The hour grows late, you must depart. And you do depart, with the discussion still vigorously in progress. (pp. 110-111)
It is important for every human being to gain knowledge about his or her life and times in order to participate in public debates and help contribute to the betterment of society, not only locally, but also nationally and globally.
33. One of the goals of education in diverse democratic countries should be to enable new generations to participate in this conversation. In the 21st century, everyone should be able to read, write, and think in order to enter into the conversation of history, debate fellow citizens, and enact a true democracy (Gutmann, 1987; Smith, 2011). The historian David Waldstreicher (1997) reminds us that "The 'nation' is never just an idea or a thing; it is also a story, an encompassing narrative or set of competing narratives...[that] suggest not only identification but [also] a script or course of action" (p. 142). Democracy is based on the premise that all citizens should have a say in defining and debating the identity of a nation. But citizens need to be knowledgeable, critical thinkers in order to effectively and responsibly exercise their political rights.
34. The concept of a vast, unending conversation is an appropriate metaphor for the future of 21st century literacy in a globalized, multi-cultural world. We can imagine the many global debates going on right now as an orderly, yet “heated” discussion conducted by engaged human beings. Hopefully, they are using logical, evidence-based arguments to debate each other over the best way to live peacefully, promote prosperity, and solve our common problems. The 21st century world needs more educated and politically engaged people to take part in these diverse conversations. But to fully take part in such important debates, you will need not only traditional skills, like reading, writing, and arguing, but also new skills, like critical thinking, scientific reasoning, and open debating methods. And where does one learn these new and necessary skills? They are taught in institutions of higher education.
Subjectivity, Culture, & Common Sense
1. We do not see the world clearly, and we do not completely understand what we see. Our mind is not a “mirror of nature,” as many early theologians, philosophers, and scientists assumed (Abrams, 1953; Polanyi, 1964; Rorty, 1979). Human consciousness is not a passive receptor of experience, like a mirror simply reflecting an image of the real world. Instead, our consciousness is like a lamp shining on the real world, coloring what we see with the light of our own unique vision. Our consciousness actively engages with experience through our perceptual process in order to create knowledge, meaning, and values. Our brain connects the "fragments of knowledge" we experience into a coherent narrative. We understand our experience through our meanings and values, thereby making our knowledge useful (Kahneman, 2011, p. 75). Consciousness also colors our experience with emotion, which helps us remember important events and give them meaning (Pinker, 1997). Our perception does not directly reflect the reality of the world we experience. Instead, we see a subjective world that is mediated by our biological brain and also by our culture.
2. We all inherit ways of thinking and acting that are particular to our unique social context. We call these ways of thinking and acting culture. We mimic the actions and beliefs of the individuals who shape us, such as our parents, peers, teachers, and priests. Culture also includes larger social institutions that mold our behavior, such as families, schools, churches, organizations, and governments. Culture entails the language we speak, the customs we practice, and the beliefs we think are true (Geertz, 1973/2000; 1983/2000). All of this makes up our "social heritage" (D'Andrade, 2002, p. 223). Culture is a tool. It is a large assembly of "technological and social innovations" (Pinker, 2002, pp. 65-66), which fosters our development as human beings, gives us meaningful lives, and helps us survive (Nussbaum, 1997).
3. While we are born into a culture, we have the power to accept and reject the various sub-cultures to which we are exposed. We can shape our own individual identity and character. Social scientists call this phenomenon subjectivity. Our subjectivity is our own unique identity and personal world view. But we are influenced by others in our culture, and since we seek to be like our friends and family, our subjectivity will be very similar to those around us. We use our subjectivity to understand our world, create knowledge, and communicate with others. Our subjectivity co-creates experience with the objective world, and our minds create what has been called "subjective realism" (Flanagan, 2011, p. 66). The phenomena we see and experience (Kant, 1781/1994, p. 48) are real to us. They appear real, although they might not be objectively real; therefore, other people might not be able to verify what we see and believe.
4. While subjectivity enables us to live a rich and meaningful life, it can also cause many problems. Our brain can often misperceive the objective world, and these misperceptions can lead us to make bad decisions (Kahneman, 2011; Thaler & Sunstein, 2008). For instance, we might hear a noise at night and believe there is a burglar in our house. We might see a weird flying shape in the sky and believe it to be a UFO. We might see a political protester burn a flag and believe the act to be unpatriotic. We might see a soldier step on a Qur'an and believe the act to be sacrilegious and an affront against our God. The conclusions reached in each of these examples may or may not be objectively true, but every example is subjectively true: the individual believed the phenomenon to be true as he or she experienced it. In each case the culture of the individual shapes perception, which leads the individual to classify experience in a particular way. Prior belief in house burglars, UFOs, a patriotic ideal, or a religious code would lead an individual to classify new experience with these frames of reference. This process of framing is all part of a normal functioning brain.
5. Outside of subjective framing, our brains can also malfunction or become damaged. Such malfunctioning can lead to false perceptions that can exacerbate the problem of subjectivity. The brain perceives a phenomenon that seems very real, but which is a product of the brain itself and does not objectively exist in the real world. Such malfunctioning might include color blindness, schizophrenia, or autism. It could include deliberate malfunctioning, such as taking mind altering drugs. It could also include being manipulated to believe a false memory, but really all our memories falsify reality even when the brain is working properly. We all tend to believe that our unique subjective vision represents the objective world as it really exists. But, in fact, the phenomenon we see is more a product of our own brain than of the objective world we partially (and imperfectly) perceive or remember. Even when it's fully functional, our biological brain does not work very well. It is, as behavioral economist Dan Ariely (2008) points out, "predictably irrational" (p. xx). We often make "naive, random" decisions based on "gut feelings," which can be "self destructive" (pp. 45, 53, 166), and even the brains of experts and scientists fall prey to these same flaws (p. 197). We are all, as one reporter explained, "confident idiots." Dan Ariely (2008) concluded his bestselling book on the brain by saying, "We are pawns in a game whose forces we largely fail to comprehend. We usually think of ourselves as sitting in the driver's seat, with ultimate control over the decisions we make and the directions our life takes but, alas, this perception has more to do with our desires - with how we view ourselves - than with reality" (p. 321).
You Are Almost Definitely Not Living In Reality
6. The flawed process of subjective belief gets augmented and further distorted by our culture. Particular individual beliefs become shared by a large group of people, and thereby, they become the orthodox or official beliefs of that group or culture. Anthropologists and political scientists call orthodox beliefs "ideology" (Geertz, 1973/2000; Eagleton, 1991) or "common sense" (Geertz, 1983/2000). Anthropologist Clifford Geertz (1983/2000) explained common sense as a widely shared "cultural system" (p. 76) that everyone accepts as "normal" and "natural" (p. 81). It is a collection of minds shaped by the same "presuppositions" (p. 84), which, when heard over and over again, come to seem true through a default mechanism in our brain (Kahneman, 2011, p. 62).
7. Culture often acts like a "rubber stamp," which is "inked with advertising slogans, with editorials, with published scientific data, with the trivialities of the tabloids and the platitudes of history" – all imprinting our plastic minds with common sense truths that we passively accept (Bernays, 1928/2005, p. 48). The early 20th century intellectual Walter Lippmann (1922/1997) explained, "For the most part we do not first see, and then define. We define first and then see. In the great blooming, buzzing confusion of the outer world we pick out what our culture has already defined for us, and we tend to perceive that which we have picked out in the form stereotyped for us by our culture" (pp. 54-55). Thus, common sense is "what anyone clothed and in his right mind knows" (Geertz, 1983/2000, p. 75) because he or she has heard it proclaimed and seen it as truth so many times before.
8. But common sense varies between different cultures, a fact which causes a lot of conflict when different cultures come into contact with each other. What seems “normal” or acceptable common sense in one culture can be labeled outrageous by another culture. Just think, for a moment, about how you instinctively view cannibals. Most, if not everyone, in our culture would say cannibalism is disgusting and immoral because it violates our common sense values of life and liberty. But how do you think cannibals view you? Likewise, think about the horrible atrocities committed by the Nazis in the early 20th century. The Nazi regime created a program to systematically brutalize and murder Jews, Communists, homosexuals, and other undesirable groups of people who were deemed inferior by the standards of common sense. Heinrich Himmler, the head of the SS, which controlled the Gestapo secret police, explained, “In my work for the Fuhrer and the nation I do what my conscience tells me is right and what is common sense” (as cited in Kihlstrom, 2013, p. 11). Most people do not think about the values and behaviors considered common sense by their culture – they just do what everyone else is doing, even if that includes exterminating another group of human beings.
9. But we can learn to question our culture, and we can deliberately choose to accept or reject what we are told by others. In her memoir, Tara Westover (2018) recounted how she was brought up in a small community in the woods of Idaho. She was told never to question her parents and to do whatever they asked her to do. She was prohibited from going to school or even going to the doctor. She explained, "My life was narrated for me by others. Their voices were forceful, emphatic, absolute. It had never occurred to me that my voice might be as strong as theirs" (p. 197). But eventually she left her family and went to college (even though she had never been to school before and had only basic reading and writing skills). While she was in college, Westover learned about the world beyond the small community where she grew up, and this new information challenged the way she saw her world and herself: "Something had shifted...I had started on a path of awareness, had perceived something elemental about my brother, my father, myself. I had discerned the ways in which we had been sculpted by a tradition given to us by others, a tradition of which we were either willfully or accidentally ignorant" (p. 180). Eventually, Westover rejected the way of life she had been forced to lead as a child and chose a new one, going to Europe to earn her PhD and becoming Dr. Tara Westover, a college professor.
10. But unlike Dr. Westover (2018), many people never become fully aware of traditions, let alone question or reject them. By definition, common sense is "fiction accepted without question" (Lippmann, 1922/1997, p. 80). Common sense is declared "self-evident truth," as Thomas Jefferson proclaimed in the Declaration of Independence, because everyone already knows that it is supposedly true. Common sense cultural fictions are very important to our psychological and social well-being. Common sense is the glue that makes society work. The historian Edmund S. Morgan (1988) explained, "fictions are necessary, because we cannot live without them... [they] make our world conform more closely to what we want it to be... The fiction takes command and reshapes reality" (p. 14). While the subjective magic of fiction can be denigrated by outsiders as mere myth-making, all human beings have their own ideologies and need their myths in order to survive.
11. And when our experience doesn't fit our ideology or common sense, then most people disregard or "disguise" the facts (Geertz, 1983/2000, p. 82) so as to reaffirm what they already believe. Most people are detached from the reality of the objective world. Instead, they rest serenely in their own subjective illusions - safe in the self evident truth of common sense. As PR man Edward L. Bernays (1923/2011) explained, it is the culturally programmed mind of the average person that "is the greatest barrier between him and the facts" (p. 133).
12. We can never escape our subjectivity, nor can we wholly eradicate the cultural influences that have shaped us since we were born. The English philosopher Francis Bacon (1561-1626) called such subjective and cultural phenomena "idols" (Gaukroger, 2001, p. 120). He saw human subjectivity as "a corrupt and ill-ordered predisposition of mind" (as cited in Klein, 2003). Bacon believed, as have many scientists since, that we can destroy and abolish these "idols" so as to see the world with pristine and unencumbered eyes – as through "clear glass" (as cited in Klein, 2003). But this belief is a lie. Complete objectivity is a "false ideal" (Polanyi, 1962, p. 18). We can never escape Plato's (1997) epistemological cave.
13. We cannot "command" our nature nor the objective world. Our minds can never be "thoroughly freed and cleansed" (as cited in Klein, 2003). As the Scottish philosopher David Hume (1739/1888) famously put it, "We have, therefore, no choice left but betwixt a false reason and none at all" (p. 268). The American philosopher Ralph Waldo Emerson (1844/1957) later agreed: "We have learned that we do not see directly, but mediately, and that we have no means of correcting these colored and distorting lenses which we are, or of computing the amount of their errors" (p. 269). Our perceptual tools are naturally flawed.
14. But Bacon, Hume, and Emerson admitted that our subjectivity was not a prison, as Plato once thought. As part of the natural world, we are still uniquely situated and endowed with an inborn capacity to know the objective world, however flawed that knowledge may be. Bacon said that we can "interpret" ourselves and the natural world, but this ability is grounded by the constraints of the physical world, including the limits of our own biological brain, which "must be obeyed" (Bacon as cited in Gay, 1995, p. 312). As reflective and critical beings, we can become more aware of how our biology, subjectivity, and culture influence our perception and behavior. We can also become more aware of how our biology, subjectivity, and culture can, in turn, be influenced and modified: changed, not commanded. Our ability to alter ourselves and our environment produces the conditions of true freedom and moral responsibility (Dennett, 2003, pp. 1, 162).
News Media: Biased and Unreliable
1. Where do most people get information about the world in which they live? Most of us get information from the people around us, our friends and family. This information mostly comes in the form of gossip, which may or may not be completely true. We also get information from various news media. Some news media present information written by professional journalists or scholars, but most news media sources offer only the subjective opinions or gossip of regular people who have no specialized training. Do you know which news media sources of information are true or false? Can you tell the difference between professional journalism and amateur gossip?
2. Most people don't know much about the world they live in. We trust authoritative individuals or like-minded groups rather than critically analyze information to decide if it is true or false. But trusting information, rather than knowing information, is very dangerous. We need to be able to critically evaluate the news media in order to know what is true or false. There is an important connection between literacy, critical thinking, and political freedom. Democratic societies rely on the open and free exchange of ideas. But not all ideas are equally true or useful. Many people don't realize that traditional news media sources deliver biased and false information, which makes almost all news media stories highly unreliable forms of knowledge. This chapter will explain why.
4.1 Trusting Tradition and Authority
3. Most people trust authority figures, like parents, teachers, priests, business leaders, and politicians. These figures usually work for an authoritative institution, which people also trust as an "official" source of knowledge, like a school, university, church, successful business, or government agency. Most people also instinctively trust like-minded people who share the same worldview, which includes a shared cultural identity, language, belief system, and values. These worldviews are often officially represented by political parties, religious groups, academic disciplines, or professional organizations. We naturally trust any information that supports our established worldview, whether it is actually true or not. We find it very difficult to accept new information that would call into question previously held beliefs. We get most of our information from authority figures in the form of "common sense," which is not really knowledge because we don't really know how or why a claim is true. We just believe the authorities whom we trust. Trusting common sense is dangerous because without knowing how or why something is true, we cannot tell truth from lies.
4. Besides trusting authority figures, most people turn to the news media for information, especially about the world beyond one's local community. Every society has a few traditional sources of news, which most people trust to some degree. In America these traditional news sources include newspapers, like The New York Times, The Wall Street Journal, and The Washington Post; television programs, like CBS News and 60 Minutes; and television networks, like CNN and Fox News. Each country around the world has its own traditional news media outlets. In some places, the news media is directly subsidized by the government, like the British Broadcasting Corporation (BBC) in the United Kingdom. Or it can be partly funded by the government and partly funded by private donations, like the Public Broadcasting Service (PBS) and National Public Radio (NPR) in the U.S. Most of these traditional news media sources employ professional journalists to write and report the news. People usually trust news media sources that offer a shared cultural worldview (often defined by political ideology or religious affiliation). But increasingly, people turn to news media for entertainment and escape rather than information.
4.2 Getting the News: The Evolution of Social Media
5. Where does new information, "the news," come from? Ultimately, it comes from people who have done something important or from people who directly observed an important event. The news is usually first spread through word of mouth (what we often call "gossip"), and then it gets written down and published, often via official media outlets. Since the 17th century, printed newspapers have been the primary source of written news in the Western world. Books are also an important source of news, but because books take so long to write, produce, and distribute, the news is not so "new" once the book is published, a fact which makes all books somewhat outdated by the time they are available to read. Because newspapers are so cheap to produce and quick to distribute, for several centuries they were the single most important source for daily news. But since the mid-19th century, several new types of news media have been developed to compete with the newspaper. Each new media development (first radio, then television, and recently the internet) delivered the news more quickly, to a larger audience, over greater distances. Each new form of technology caused the public to shift its attention and trust towards the new media and away from printed newspapers. After newspapers had flourished for over three hundred years, television finally eclipsed them in the 20th century as the major source of news for most people in developed countries.
6. Since the development of television, and more recently the internet, and due to the increased costs of paper, printing, and shipping, scholars and journalists have wondered whether printed newspapers might eventually become extinct. But surprisingly, print newspapers remain a popular source of news all over the world, more so in developing countries, like China and India, and less so in developed regions, like Europe, the United States, Japan, and South Korea. While most experts see the internet overtaking both television and print newspapers as the primary source of news within the next few decades, no one seriously believes that the newspaper will disappear anytime soon (Bulletins from the future, 2011; Fallows, 2010; "The strange survival," 2010).
Handout: Where Do Americans Get Information?
7. It has also become fashionable these days to get excited about new forms of "social media," as if these developments were a brand new phenomenon. In fact, social media simply refers to any technology, or tool, that facilitates communication and social interaction. There have been many different types of "new" social technologies that have excited people over previous centuries, starting with the book and the newspaper (as discussed in chapters 1 and 2). Each new medium "alter[ed] the physics of perception, changing the ways that people saw, experienced, and understood the material world and their place within it" (Ewen, 1996, p. 67). The first "new" modern social media technology to rival the newspaper was the telegraph, which was co-invented in the 19th century by Samuel Morse and others, and was later developed and commercialized by the Western Union Company. This advanced technology spread the news almost instantly across large distances and connected people in faraway places as was never before possible in human history (Howe, 2007, pp. 693-97; Blondheim, 1994, pp. 11-29).
8. Later in the 19th century, the radio was invented. By the early 20th century, the Radio Corporation of America (RCA) had developed relatively inexpensive radios for the home, which enabled this new technology to rival newspapers as the most popular medium for news and entertainment. In 1921, only five radio stations existed in the world, but by 1927, there were several hundred stations. By 1949 there were 1,600 radio stations in the United States alone (Schudson & Tifft, 2005, p. 26; Patterson, 1996, p. 348). Politicians, like Franklin D. Roosevelt, were quick to realize the vast political potential of the radio, and they began to bypass newspaper reporters and hostile newspapers to speak directly to the public (Schudson & Tifft, 2005, p. 26).
9. Other "new" forms of media appeared in the 20th century. The magazine, like Time (1923), Newsweek (1933) and U.S. News & World Report (1933), became a popular medium for current events, although this form of media was mostly read by educated audiences who wanted more substance than a typical newspaper provided. Unlike the daily newspaper, magazines were published once a week or month. They took longer to produce because they offered a broader synthesis and deeper analysis of current affairs. Magazines catered to a middle class audience. They offered an elevated focus on style and culture, reviewing art, literature, music, drama, and dance (Teachout, 2002). And they included lots of photographs, which were popular with audiences, and which helped develop a new type of news called photojournalism (Ewen, 1996, p. 53; Schudson & Tifft, 2005, p. 25).
10. Another form of news media developed in the early 20th century was the partisan "think tank." This political institution brought academic researchers together with political strategists to create public policy recommendations based on science (McGann, 2005; Troy, 2012). A think tank often uses scientific methods to research public issues, but not always. Think tanks are funded to support partisan values and enact partisan legislation, so think tank research is often openly biased. A director of a think tank once explained how these institutions use “the trappings of scholarship” to “put a scientific cover on positions arrived at otherwise,” usually because of political ideology (Crawford, 2009, p. 108).
11. One of the oldest and most respected think tanks in the United States is the liberal Brookings Institution, which was founded in 1916. More conservative think tanks were developed during the Cold War. The center-right RAND Corporation was founded in 1948. The conservative Heritage Foundation was later founded in 1973. The mission statement of the Heritage Foundation is to "formulate and promote conservative public policies" (as cited in Alterman, 2003, p. 84). Burton Pines, a past vice president of the Heritage Foundation, said, "We're not here to be some kind of Ph.D. committee giving equal time. Our role is to provide conservative public-policymakers with arguments to bolster our side" (as cited in Alterman, 2003, p. 83). While many think tanks produce quality scholarship, this information is often compromised by the political agenda of the organization.
12. From about 45 think tanks during World War II, the number in the United States has grown to over 1,800 today. Most focus on single policy issues, like the environment, energy, or policy in the Middle East, but many are comprehensive in scope. While the majority of think tanks founded before 1960 employed mostly academic researchers with PhDs, more recently founded think tanks have been enlisting greater numbers of partisan thinkers and writers without academic credentials, many of whom transitioned from political careers into "research" careers (Alterman, 2003, p. 83).
13. Because of their political bias, the quality of think tank research varies greatly. Some think tanks produce exceptional research and can be considered "universities without students" (Troy, 2012). But many of these organizations produce no research at all, publishing only partisan talking points to help members of their political party get elected. Political influence is, of course, the primary purpose of all think tanks. In fact, some of the more successful, like the conservative Heritage Foundation, have seen more than 60 percent of their policy recommendations adopted by presidential administrations.
14. By the 1950s, the TV was mass-produced and became a fixture in the American living room. In 1948, only 172,000 households in the U.S. owned a TV, but by 1952, TVs were in 15.3 million U.S. households, and this number increased to 32 million in 1955, representing around 75 percent of all homes (Patterson, 1996, p. 348). By 1960, over 87 percent of American homes had a television, in what economist Jeffrey D. Sachs (2011) called "the fastest adoption of a major new technology in history" (p. 138).
15. Serious news reporting and coverage of political campaigns became an important function of all TV networks. Americans were turning to TV news as their primary source of information by 1963 (Shenkman, 2008, p. 102). However, most of the programming on this new device was created to entertain American audiences, not inform them. For example, the TV quiz show was one of the most popular early forms of television entertainment, until the public was scandalized by news that the shows were rigged (Goodwin, 1988). Later, the sitcom and the serialized "soap opera" became major genres of TV entertainment, and these shows helped give rise to another powerful news medium, the TV advertisement.
16. The soap opera was actually created by advertisers in the 1930s as a way to get people to listen to the radio for extended periods of time, thereby exposing listeners to multiple product advertisements, which helped dramatically expand sales. These radio serials were called "soap operas" because soap and other cleaning products were marketed to housewives who would listen to the radio while they cleaned the house and took care of the children. The basic formats of the news program, the sitcom, the soap opera, and the advertisement were all first created for radio and later developed into their current forms on television.
17. Up until the early 20th century, news media were primarily focused on delivering information for civic purposes. But over the last half century, many new forms of media have been developed solely for the purpose of entertainment and social networking, not for information about current affairs or for participation in political processes (Jacoby, 2009, p. 125). These various new forms of entertainment media include movies, music, music videos, video games, entertainment magazines, and the internet. The new social networking media include blogs; social networking websites, like Facebook and LinkedIn; and micro-blogs, like Twitter.
4.3 Conflicting Purposes: Inform or Entertain?
18. Because news media have always been businesses that want to make a profit, the "line" separating news-as-information and news-as-entertainment has never been clear. But over the 20th century, it seems that many news media organizations have tended to focus more on making money and entertaining than on informing citizens for the public good (Jacoby, 2009, p. 125). Some news organizations, like The Economist, The New York Times, and The Washington Post, deliver high quality, evidence-based reporting with rigorous documentation and fact-checking processes, but these organizations are the exceptions rather than the rule. As many scholars have pointed out, by the end of the 20th century that "line between news and entertainment, already blurred, became fuzzier still" (Schudson & Tifft, 2005, p. 38). Many media critics refer to the artful blending of information and entertainment as "infotainment," mindless amusement or political commentary masquerading as news (Jacoby, 2009, p. 262).
19. The various forms of "new media" are changing the way people think about and access news, especially with the development of the internet. But these new forms of social media should be handled with caution. For the most part, these sources should rarely be used in an academic paper. This is because most "new media" delivers highly unreliable and heavily biased information, although some forms of new media are much more unreliable than others. Some news organizations have rigorous fact-checking and documentation processes, but most media organizations publish anything that will make money. H. L. Mencken (1914), one of the most famous reporters of the early 20th century, sarcastically noted, "One of the principal marks of an educated man, indeed, is the fact that he does not take his opinions from newspapers." Take Twitter, for example. Due to its short and constrained format, it is almost impossible to say anything halfway intelligent on Twitter. Thus, most Twitter users "tweet" incomplete thoughts in idiosyncratic codes, which outsiders find hard to decipher. Scholars of digital media are still investigating who writes and reads these "tweets," and why.
20. Almost everyone has access to social media these days, airing all sorts of personal opinions about everything from what they're eating or wearing to their plans for the weekend. People are even sharing the "news" that they are bored and have nothing to say! About 64 percent of Americans use some form of social media. The most popular is Facebook, which 51 percent of Americans use ("Between main street," 2012, p. 5). The public broadcasting of all these personal opinions has produced what The Economist has called a "baffling blizzard of buzz" ("Too much buzz," 2011).
21. Are all these opinions even real, let alone true? Did you know that between 25 and 30 percent of online product reviews are fake (Lindstrom, 2011, p. 113; Streitfeld, 2012)? People are paid to write bogus reviews, which most of us assume come from real consumers or professional reviewers. Sometimes you may want to listen to biased information, such as when talking to friends or when trying to understand personal opinions about important issues. For this purpose, using the internet, blogs, Facebook, or Twitter could be appropriate. But even then, social media should be used cautiously because you never know if people are being honest. Most people seem to intuitively know this, as only 30 to 34 percent of Americans trust social media sites or blogs ("Between main street," 2012, p. 6). Besides, even if you want to know about public opinions, there are more sophisticated and valid ways of measuring them, like using survey research conducted by professional or academic organizations (Igo, 2007).
22. The internet and social media are exciting new ways to communicate quickly to a wide audience, and these new media forms were given a lot of positive publicity as a force for citizen journalism and democratic change in the Arab revolutions of 2011 ("Internet democracy," 2011). However, on the whole, these sources should be considered highly unreliable given that they represent such biased personal points of view. The internet is a useful tool, but as Susan Jacoby (2009) has pointed out, it can “foster the illusion that the ability to retrieve words and numbers with the click of a mouse also confers the capacity to judge whether those words and numbers represent truth, lies, or something in between” (p. xviii). Always remain vigilant and skeptical of online information.
23. Yet we should not entirely discount the internet and social media as valid news sources. These media also allow users to exchange valid types of information, such as e-books, articles, and documentary videos. But in these cases, the internet or social media would not be considered the source of the information, just the medium by which it is exchanged. Social media and the internet can thus act like digital libraries, a virtual depot, where we can rapidly exchange traditional and valid forms of information. It is important to remember that the internet is an open-access medium containing a wide variety of information, some of it good, but most of it heavily biased and highly unreliable. The internet is no more or less reliable than the public street corner or a local book store. It contains many different types of individuals and organizations with many different types of ideas. Some deserve our attention and careful consideration, but many do not.
4.4 Understanding News Media Bias
24. How do we decide which news is true or false? This is a difficult question that has always plagued the news media. The recent development of new forms of social media and the proliferation of news organizations, websites, personal blogs, and Twitter accounts has only intensified the importance of these questions. First we need to remember, as journalist Walter Lippmann (1922/1997) once explained, "news and truth are not the same thing, and must be clearly distinguished" (p. 226). To find the truth in any news story, we must first understand "the limited nature" of all news media (p. 228). The news is a complex, contradictory, and profit-focused business that tries to make sense of a highly political world. We can't expect much truth under such circumstances.
25. For every story, there seem to be multiple sets of facts and multiple interpretations (Hirschorn, 2010). Journalists almost never agree on what happened and why. We are also bombarded by advertisers, public relations firms, and political leaders who seem to invent their own versions of the truth, which often conflict with the views of journalists. Many public relations (PR) and advertising professionals seem to have a "contempt for truth" (Ewen, 1996, p. 80). Ivy L. Lee, a founder of PR, famously argued, "Besides, What is a fact? The effort to state an absolute fact is simply an attempt to...give you my interpretation of the facts" (as cited in Ewen, 1996, p. 81). Did you know that in 2008 in the United States there were over three times as many PR operatives spinning corporate-sponsored "news" as there were professional journalists reporting actual news (Sullivan, 2011)?
26. Everyone seems to have their own version of the truth. There seems to be no way to decide if anyone is really right or wrong. Marvin and Meyer (2005) argue, "Just as printing destroyed the illusion of an authoritative biblical text and challenged a Catholic hierarchy, conflicting journalistic accounts cast doubt on the press's ability to present the world with authority" (p. 401). Some journalists are sincere, but biased. Many are just plain ignorant. Professor of Journalism Eric Alterman (2003) explained, "most reporters are ignorant about most things, which is rarely seen as a barrier to coverage. Ignorance is not the same thing as bias" (p. 105). Ignorant journalists selling news is an old phenomenon. The famous early 20th century journalist H. L. Mencken criticized the press corps as "rogues and charlatans" who sell "intolerable incompetence and quackery" (Teachout, 2002, pp. 228, 1). Few journalists have ever stood up for the truth. Most are afraid of criticizing each other, their editors, corporations, or political leaders, even when it means attacking outright lies. Reporters also fear being labeled "biased" by political opponents, which could undermine their appeal or authority with certain audiences (Poniewozik, 2012).
27. There is so much confusion and outright lying in the news media that a whole new type of news organization has recently been created just to expose the lies of other news organizations and our political leaders. On the one hand, you have satirical news shows like The Daily Show with Jon Stewart and The Colbert Report, which use comedy to point out lies and hypocrisy (Bates, 2011; Moyers, 2007; McGrath, 2012). On the other hand, you have encyclopedia-type news organizations, like PolitiFact.com (see article on Politifact's mission) or FactCheck.org, which rate the truthfulness of statements made by politicians and other journalists, including fake journalists like Jon Stewart ("Political fact-checking," 2011). The public is now pressured to wade through multiple, conflicting news stories and compare different opinions in order to find some truth ("Political fact-checking," 2011). Recent evidence suggests that people are not doing a good job of finding "true" news because most people believe a lot of false information (for example, see this recent article). As The Wilson Quarterly recently reported, we are living in a "disinformation age" (The Wilson Quarterly, 2018).
Don't Believe Everything You See on the Internet
Video Manipulation Is a Threat to Truth
28. While over half of American voters say they have heard information they consider misleading or false, they don't seem to know how to find truthful information. Around 88 percent of Americans think that fake and misleading news has caused widespread confusion over basic facts (Barthel, Mitchell, & Holcomb, 2016). One voter fretted, "They want you to vote intelligently, but how do you find the facts?" (as cited in "Medicare," 2012, p. 32). Around 30 to 90 percent of Americans have false beliefs about important political issues (depending on the issue), largely due to trusting misleading (or lying) sources of news (Ramsay et al., 2010; Popkin, 1994, p. 86).
Americans Can't Distinguish Fact from Opinion
29. Part of the problem lies in purposeful deception. For as long as there have been newspapers, there has been "fake news" printed alongside factual news (read about the history of fake news). Even in the 21st century, there are a large number of "fake news" websites peddling false, and sometimes outlandish, stories. Political operatives create some fake news stories. They want to attack an opponent, support a candidate, or influence an election. Other stories are created by savvy teenagers trying to make money by duping gullible people into reading fake stories. One such teenager claimed, "A fake news article is way more opened than any other" (as cited in Cimili & Satter, 2016, para. 1). The proliferation of fake news has led to a "bewildering assault of misinformation and propaganda," according to Cimili and Satter (2016, para. 18). Many people can't tell the difference between fake news and real news, or they just don't care. Alarmingly, one gullible reader acted on false reports of an underground slavery ring under a pizza parlor by showing up at that restaurant with a loaded rifle, threatening the owners and customers (Dvorak, 2016). Do you know the difference between real news and fake news? Sadly, most students do not.
30. But even when accurate news stories are found and corroborated, they are "often disturbingly short on detail" (Garber, 2008, p. 44). While some news stories "tell the truth," it usually isn't "the whole truth" (p. 44). Bias can be detected in both what is and what isn't said. The media serves as a "gatekeeper," talking only about particular issues deemed important and framing those issues with a particular spin. The news media shape not only how we think, but also what we think about (Popkin, 1994, pp. 85-87). Awareness of such bias has created a lot of cynicism. Many people no longer trust news sources, and they seem to trust political leaders even less (Ramsay et al., 2010).
31. Since the 20th century, many newspapers and other forms of media have explicitly sought to act in the general "public interest" by informing citizens with factual information. Professional journalists argue that the news media should inform citizens from a neutral perspective so people have the knowledge they need to act responsibly, especially in a democratic country like the United States of America. Many professional journalists take this political mission very seriously, using factual reporting to influence legislation and inform voters (Schorr, 2005). However, the mission to serve the "public good" has always been compromised by other interests, like business and political concerns. Many times a journalist will claim to be working on behalf of the public interest while publishing biased reports meant to help a politician or political party.
32. Regardless of any noble claim to serve the public, most news organizations are private companies seeking to maximize profits, and for these organizations money is the primary objective. The profit motive often leads news companies to publish stories that will sell newspapers or increase television viewers (and thus advertising revenue), and conversely, news organizations will sometimes not publish controversial or boring material, however true and important it may be. Profit-motivated journalism has been called "yellow journalism" because it usually exaggerates or sensationalizes the facts to make the story more interesting, or in some cases, completely fabricates a fabulous story (see example). Many times the commercial motives of a news company conflict with or compromise the democratic goal of informing citizens about the public good (Picard, 2005; Marvin & Meyer, 2005). Sometimes raising revenue by pleasing advertisers is more important than investigating and communicating the truth, especially if the truth involves discrediting a powerful leader or damaging the reputation of a successful company.
33. Perhaps one of the most famous cases of business interests trumping the public good was the story of Jeffrey Wigand (Brenner, 1996), who was a scientist and former head of Research & Development for a cigarette company. He was one of the first to publicly prove not only that cigarettes were unhealthy and caused cancer, but that cigarette companies lied for decades about the health risks of their products. The cigarette companies exerted powerful pressure to discredit Wigand and threatened to sue the CBS news program 60 Minutes if it broadcast Wigand's story. The investigative reporter behind Wigand's story was Lowell Bergman, who is a Pulitzer Prize winning journalist and distinguished professor of journalism at the University of California, Berkeley. Bergman tried to get the whole truth broadcast on air, but CBS made a business decision to avoid expensive litigation and forced 60 Minutes to drastically edit the interview. This decision caused a fury in the press. Bergman quit the show (under pressure from management). Eventually, numerous journalists corroborated Wigand's story, and CBS was heavily criticized and discredited by the journalistic community for holding back the truth. In 1999, these events were dramatically captured in the award winning film The Insider, directed by Michael Mann.
34. But business interests are not the only source of bias in the news media. Acting in the "public interest" is also complicated by the fact that there is no single "public" and no single "interest." In the 21st century, most people live in pluralistic societies composed of multiple "publics," each with different worldviews, beliefs, and values. Therefore, there are many competing "interests," and objective, fact-based journalism is bound to upset some group for one reason or another. Diverse groups of citizens continually argue over their competing worldviews and political interests. Often, we align ourselves with a particular social or political cause and we band together into "interest groups," or like-minded people who support the same issue. People tend to listen to only those people with similar values and ignore (or attack) those people with different values. Thus, public debates often turn into shouting matches rather than actual discussions, and in the ensuing conflict, important issues cannot get solved.
35. Particular news organizations often privilege the beliefs or values of one interest group over another, which is called cultural or political "bias." Even if a news organization tries to be neutral, it often ends up offending some community by being noncommittal to any cause and sitting indecisively in the middle of the road. Thus, in order to fully understand the news media, and to decide who has true or false information, we need to first understand that all news organizations are politically biased in some way, and being officially "unbiased" is actually just another form of bias. (A full discussion on the nature of subjective bias and how it affects knowledge will come in chapter 7). Even scientific researchers working within academic disciplines are biased; however, as we will discuss later in this book, academics strive to control their subjectivity, clearly state their bias, rationally explain it, and leave it up to the reader to decide.
36. Rarely does the news media or business media expose manipulation and outright lies. In fact, the news media often reproduces such misinformation. Journalism professor Eric Alterman (2012) claims we live in a "post-truth" society (p. 11). In such an environment, he argues, American journalism "ultimately fails to justify itself in its most basic purpose: to ensure accountability for citizens and their leaders and to offer the kind of information necessary to help voters make an educated choice for the future of their country" (p. 11). In the 21st century, news media can rarely be trusted, and merely trusting a traditional source of authority is dangerous because we don't know what information is actually true or false. If you read news media sources, then you always need to be cautious. Someone may be (and probably is) trying to manipulate your opinion. In order to find the truth and make an informed opinion, it is always best to turn to a scientific study produced by academics working at a research university.
Fallacies: Argumentative Magic Tricks
1. There are a lot of people out there who try to manipulate you. These "hidden persuaders" want to "engineer" your consent (Packard, 1957/2007, p. 200). As one of the early inventors of modern propaganda, Edward L. Bernays (1928/2005) explained, "We are governed, our minds molded, our tastes formed, our ideas suggested, largely by men we have never heard of" (p. 37). These hidden persuaders want to trick you into doing something that you probably don't want to do. These hucksters know that "our irrational minds, flooded with cultural biases rooted in tradition, upbringing, and a whole lot of other subconscious factors, assert a powerful but hidden influence over the choices we make" (Lindstrom, 2010, p. 18).
2. Thus, we need to be on guard not only against liars and manipulators, but also against the weaknesses of our own minds. Sure, everyone lies at some point in their lives (Ariely, 2012), but many people are paid to lie for a living: politicians, advertisers, public relations spokespeople, and some news reporters. These people are paid to "phish for phools," as Nobel Prize-winning economists George A. Akerlof and Robert J. Shiller (2015) have explained (p. xi). We are susceptible to their manipulation for many reasons. We are ignorant of basic facts. We are ignorant of our brain's flaws. We are enveloped by our subjectivity and culture. And we don't understand the common tricks of liars and propagandists. Thus, we often "allow" ourselves "to be manipulated" (Sachs, 2011, p. 133; Packard, 1957/2007, p. 240). This chapter will give you the basic tools to unmask the tactics of hucksters and con artists who "try to invade the privacy of our minds" by preying upon our ignorance and our psychological weaknesses (Packard, 1957/2007, p. 240).
3. The oldest and most notorious manipulators are politicians (Packard, 1957/2007, p. 171). Relying on ancient traditions of power and authority, politicians use "magical words," leading people to believe anything that can be spun into a convincing story ("Deeds," 2012, p. 33). The political scientist John J. Mearsheimer (2011) identified seven types of common lies political leaders make to manipulate their own people. He explained that lying is "a useful instrument of statecraft in a dangerous world," and sometimes leaders believe in "noble lies," which are falsehoods used in the name of a good cause (p. 12). News reporters can be another type of propagandist. Partisan media, like Fox News, routinely lie and spin the news in order to manipulate viewers to think and vote in certain ways. Modern news organizations have developed "technologies of mass persuasion" (Sachs, 2011, p. 137), which most often utilize fallacies to alter the truth in order to "distort our most important decision-making processes" (p. 142).
4. Advertisers are also in the business of lying. They peddle all sorts of tricks and falsehoods to manipulate consumers into buying expensive products they don't need (Lindstrom, 2011). In his popular handbook on copywriting, Robert W. Bly (2005) explained how copywriters use "false logic" to effectively "manipulate" consumers (pp. 71-74). Advertisers manufacture consumer wants and needs just as effectively as businesses manufacture products. Marketers also pay scientists to not only research the flaws of the human brain, but also to develop specialized techniques to take advantage of these flaws in order to manipulate us more effectively (Lindstrom, 2011; Packard, 1957/2007, p. 31). These corporate scientists know that the human brain is malleable and can be easily influenced by tantalizing stimuli that often affect us unconsciously. They also know that we are very vulnerable to addictive substances and habits. Did you know that many popular foods, like Doritos chips, were specifically designed by scientists to make them irresistibly delicious and addictive (Moss, 2013)? Corporate scientists have spent decades studying "the whys of our behavior, so that they can more effectively manipulate our habits and choices" (Packard, 1957/2007, p. 32). Even experts like myself can fall victim to lies and manipulation, as the letter below illustrates.
Letter to Car Dealer over Deceptive Practices
5. Public relations is another profession built on lies (Ewen, 1996; Tye, 1998). PR representatives use language to spin the truth by either questioning the validity of facts to manufacture doubt, or by inventing fiction and masquerading it as fact. The fictional protagonist in the movie Thank You for Smoking is an exceptional PR man who explains to his son, "If you're paid to be right, then you're never wrong" (Reitman, 2005). The first professional PR man was Edward L. Bernays who wrote several influential books, including Crystallizing Public Opinion (1923) and Propaganda (1928), which explained how PR is the "conscious and intelligent manipulation of the organized habits and opinions of the masses" (Bernays, 1928/2005, p. 37).
6. Bernays (1923/2011) used PR to "sell" the image or brand of a company, not just its products (p. 71). But Bernays wasn't interested in just selling a company, he actually wanted to re-engineer public opinions and behavior. His biographer explained, "Hired to sell a product or service, he instead sold whole new ways of behaving, which appeared obscure but over time reaped huge rewards for his clients and redefined the very texture of American life" (Tye, 1998, p. 52). Bathing at least once a day with soap, the healthy "toasted" delight of cigarettes, and even America's favorite breakfast of bacon and eggs all originated from the mind of Edward L. Bernays (Tye, 1998). One Supreme Court justice called Bernays one of the most insidious "professional poisoners of the public mind" (Tye, 1998, p. 63).
7. You probably don't know that politicians and businessmen have been applying scientific research on human thinking and behavior for over a hundred years. Why? To better manipulate you. In 1895 the French social psychologist Gustave Le Bon published La Psychologie des Foules (The Psychology of the Crowd). This book instructed conservative politicians on how to manipulate and control their citizens, who were then pushing for more political and social democracy. Le Bon's work influenced many American writers, including Walter Lippmann, author of Public Opinion (1922), and Edward Bernays, author of Crystallizing Public Opinion (1923) and Propaganda (1928). Both Lippmann and Bernays used their knowledge of social psychology to help the U.S. government and American corporations develop various forms of propaganda, which were used to manipulate public opinion and control the behavior of American citizens.
8. Most people were unaware of these propaganda efforts until journalist Vance Packard (1957/2007) published his exposé in 1957 on the public opinion industry, The Hidden Persuaders. Now there are many writers trying to uncover the lies and manipulations of politicians and businesses, including the muckraking journalism of Thomas Frank (2000) and the exposés of neuro-marketer Martin Lindstrom (2010; 2011). Interestingly, Lindstrom turned against his neuro-marketing profession and now warns consumers about the tactics he once used to help businesses manipulate consumers, in such works as Buyology and Brandwashed. Lindstrom (2011) has even demonstrated how companies have gone so far as to use human psychology to literally addict consumers to certain brands and products (p. 61).
9. Lies are often easy to detect if you are well informed about public issues and human psychology. But professional liars know that the majority of Americans are largely ignorant about such matters. Yet, even if you are highly educated about the facts, there are still tricks that professional liars can use to manipulate you. These tricks often work unconsciously, so they are much harder to guard against than simple lies. These tricks are called fallacies, which some scientists call "psychological weapons" (Van Der Linden, 2018).
10. A fallacy is a logical sleight-of-hand. It is an argumentative magic trick, which presents a claim as true without any logical reasoning or evidence. In fact, many fallacies are designed to bypass our critical thinking skills in order to exploit our automatic thinking biases: fast System 1 thinking overpowering our slow System 2 thinking (Kahneman, 2011, p. 28; see Ch 10, para. 5-9). Fallacies manipulate us. They lead us to draw conclusions that are usually false. Fallacies look like part of an argument, but instead of evidence and reasons to back up a claim, there is a trick. Drink our brand of soda because it is refreshing! But what does "refreshing" even mean? How do you know it is refreshing? And what about all the other refreshing drinks? Fallacies are often implicit arguments designed to subliminally affect our judgment without any argument at all. Why do you think images of naked or half-naked women are used so much in advertising (Lindstrom, 2010, p. 177; Packard, 1957/2007, p. 95)? You don't even need an ad campaign or a slogan. Just put a naked woman next to your product and many men will buy it because their hormones override their brains.
11. In order to help you guard against fallacies, this chapter will explain many of the most common. I have broken up these fallacies into four categories. The first is Errors of Reasoning. This type of fallacy has two causes. The first is accidental. As humans, we have many cognitive biases that distort both our perception and our reasoning, some of which were discussed in chapter 7 (See picture below). People think illogically much of the time, and they accidentally reach erroneous conclusions based on bad data and/or bad thinking. But with this knowledge, an unethical person can use these biases against the unwary. Tricksters can manipulate the brain of the ignorant and engineer a false conclusion.
You Are Almost Definitely Not Living in Reality
12. The other categories are all intentional tricks. The second kind of fallacy is Evading the Issue. This is an attempt to ignore your argument (because you know you cannot prove it), and instead change the subject to a topic you can win. The third category is Attacking the Opponent. Like the second category, this category seeks to avoid addressing an argument, but it redirects the audience's attention with a specific trick: Attacking the personal character of the opponent. Finally, the last category is Appealing to the Audience. This category of fallacy seeks to affect the audience psychologically so as to manipulate the audience's ignorance or emotions.
13. But beware: these are only a few of the most common fallacies. There are many, many fallacies out there, and they are often used in novel combinations that make them very hard to detect. Hopefully, after understanding some of the most commonly used fallacies, you will be able to better defend yourself against the snares of professional liars and cheats.
Handout: Common Types of Fallacies
A. Errors of Reasoning
14. A Hasty Generalization usually follows most of the rules of good argumentation. This fallacious argument has a claim, evidence, and a conclusion; the problem is that it does not have enough evidence, or the right kind of evidence, to prove a claim true or false. Instead of doing the hard work of finding the best evidence, a person using this fallacy just jumps to a premature conclusion, which, of course, is not justified. Sometimes a hasty generalization can be made with no evidence whatsoever, but that is rare.
15. One of the most common hasty generalizations is the stereotype, an idealized category that supposedly captures the essential characteristics of a whole group of people or things. But the stereotype is usually based on only a few examples, often from personal experience and cultural common sense; in other words, it is a category based on a highly limited amount of evidence and is, therefore, not valid. Often stereotypes have a grain of truth about them; however, it is important to remember that even partially true stereotypes "will cover only some of the truth for part of the time" (Lippmann, 1922/1997, p. 97).
16. Stereotypes usually lead to two other types of fallacies, hasty generalizations that are mirror images of each other: the Ecological Fallacy and the Fallacy of Composition. The ecological fallacy claims a stereotypical essence for a group, and then claims that all members of this group must possess these essential characteristics. To illustrate, all Asians are bad drivers; therefore, because you are an Asian, you must be a bad driver. Soldiers are patriots; therefore, if you are a soldier, you must be a patriot. People who smoke are more likely to die of cancer; therefore, because you smoke, you will die of cancer.
17. The fallacy of composition works in reverse. An individual with particular characteristics is used to make a stereotype of a whole group to which that individual supposedly belongs. The terrorists of 9-11 were Muslim; therefore, all Muslims are terrorists. Those protesters attacked the police; therefore, all protesters are violent. That atheist behaved immorally; therefore, all atheists are immoral. In each case, a whole group of people is labeled based on the actions or characteristics of only a single person.
18. False Cause is another error of reasoning that follows most of the rules of argumentation. This error is usually caused by the ignorance of the arguer, but it can be a deliberate attempt to exploit the ignorance of the audience. A false cause fallacy usually mistakes a correlation or temporal sequence as evidence for a cause. Just because one variable is often linked with another variable, it does not logically follow that one causes the other. Likewise, just because one event often follows another event, it does not logically follow that the first event causes the second. For example, rain and wind often come together, but it would be false to claim that rain causes wind, or vice versa. Losing at least one game comes before winning a championship, but it would be false to claim that losing a game causes winning a championship. In some cases, a false cause fallacy is simply a false cause. Adolf Hitler claimed that Germany's decline in the 1920s was caused by a conspiracy of the Jews. It was not.
19. One specific and often used type of false cause is the Appeal to History. History is not an exact science. Due to the scarcity and subjective nature of most historical documents, it is incredibly difficult, if not impossible, to prove exactly what causes important historical events. Many times an arguer will claim that "history shows" or "history proves" and then use this fallacious appeal to history to supposedly "prove" another claim true. Unless there are copious amounts of historical detail and references to reputable historical studies, be highly skeptical of any historical claim. Most people are completely ignorant of history and invent their own imagined version of the past. Another variant of this type of false cause is the Appeal to Nostalgia. Often arguers will present an exaggerated, idealized, or simply false version of the past and claim it as a factual representation of what really happened. In the United States, one of the most common appeals to nostalgia is the idealized "small town America," which is a heavenly place of good people united together by a single set of traditions, values, and practices. Of course, such a place has never existed and never will.
20. A False Analogy is an error of interpretation caused by the ignorance of the arguer or a deliberate attempt to manipulate the audience. (An analogy is a metaphor in which two people, objects, or events are described as similar: if we understand the meaning of event A, the comparison helps us understand the meaning of event B.) An analogy is an attempt to make a fact meaningful, usually based on a system of values, so that an audience knows what to do about a situation or problem. For example, the U.S. invasion of Iraq was going badly in 2004 and 2005, so many media commentators made the analogy that Iraq was either like World War II (a just war of liberation that needed more time to succeed), or they said it was like Vietnam (an unjust war that failed and got worse the longer the armed forces remained). In both cases, the analogy was supposed to help the American public understand the war and act accordingly. However, the war in Iraq was not really like World War II or Vietnam, so these analogies misled the American public.
21. Begging the Question, also called Circular Reasoning, attempts to prove the main claim by simply restating the main claim or by stating another claim that is related, hence, the description of circular. This type of fallacy is usually a result of ignorance rather than an attempt to manipulate an audience because the arguer believes the claim being made is common sense; therefore, they believe there is no need to prove it. For example, after the events of 9-11 in the United States, it was common to hear media reports condemn "terrorists" as evil because terrorism is evil. Such reporting demonstrates circular reasoning since these claims all rest on a faulty stereotype that is not defined or proven. This series of claims does not answer anything; it just raises questions (hence the phrase "begging the question," which means an ambiguous or argumentative statement that raises a lot of questions needed to clarify or prove it). What are (and are not) terrorists? Why are their actions horrific? Is everything a terrorist does terrorism, and is all terrorism always horrific? What makes terrorists evil? What is evil? Another example is that free trade is good because it allows an unrestricted flow of buying and selling. This simply restates the claim as "proof," but does not actually explain why free trade is good. This trick works because our brain is naturally programmed to believe repeated claims it hears over and over again.
22. As with the previous fallacy, Appeals to Authority are based on ignorance and common sense. One of the oldest traditional forms of reasoning is the "do what you're told" command by an authority figure. Most parents practice this type of reasoning with their children, and it is highly popular with businessmen, politicians, and religious leaders. Instead of explaining why something is true and providing evidence, one simply states that a claim is true because an authority figure said so. The Nobel Prize-winning physicist Albert Einstein warned that "foolish faith in authority is the worst enemy of truth" (Isaacson, 2007, p. 22).
23. The only element that has changed in the traditional appeal to authority is the source of authority that people accept as common sense. Over the 20th century, political and religious leaders were displaced by scientific and business leaders, and later there was a decisive swing toward the authority of youth. Celebrities are also seen as authority figures, even though many celebrities have no expertise or special skill. Such celebrities are famous merely for being famous, as political activist Jerry Rubin pointed out: “People respect famous people – they are automatically interested in what I have to say. Nobody knows exactly what I have done, but they know I’m famous” (as cited in Jacoby, 2009, p. 173).
24. One of the earliest and most effective advertising campaigns used an authority figure to endorse a product. Edward L. Bernays was one of the first PR men to employ this tactic. For example, Bernays advised cigarette companies on how to ignore the health risks of smoking by having doctors endorse their products and claim that certain brands were actually healthy (Tye, 1998, ch 2). Bernays also invented the deceptive spin of "toasted" cigarettes, which supposedly were "kind to your throat" because they were "free from harsh irritants" (Tye, 1998, p. 45). The use of doctors' authority and the "toasted" slogan were deliberate lies that deceived audiences by appealing to their ignorance. Once lawmakers established that cigarettes did, in fact, have health risks, they enacted the Federal Cigarette Labeling and Advertising Act in 1965, which banned advertisements that claimed health advantages for cigarettes. Clever advertisers kept the authority figure of the doctor and, in a deliberate red herring ploy (a fallacy explained later in this chapter), changed the claim from "healthy" to "less irritating" and "It's toasted."
25. Besides the traditional common sense appeal of authority, we also need to understand that authority figures have psychological power over us. Recent research on the human brain has shown that "people will actually stop thinking for themselves when a person they perceive as an expert offers them advice or direction" (Lindstrom, 2011, p. 177). When we get advice from someone we trust as an expert, our brain basically stops thinking and intuitively accepts as true everything that expert or authority figure tells us. We want to believe them. Thus, we have to learn how to distance ourselves from professional advice so that we can critically analyze what these experts say in order to make sure it is actually true and good advice. This is not an easy task, and we have to fight against our brain's natural inclination to defer. Even when authority is justified, like a doctor, lawyer, or professor, the authority figure must still prove his or her arguments with valid evidence instead of trying to manipulate people on the basis of authority alone. Remember, even experts make mistakes. As finance professor Burton G. Malkiel (2012) stated, “We should not take for granted the reliability and accuracy of any judge, no matter how expert” (p. 168).
26. As a side note, it’s important to recognize that experts are not the only source of authority in our world. Celebrity works in the exact same way. Most people intuitively trust and believe celebrities, as if they were experts or authority figures, which they usually aren’t (Lindstrom, 2011, p. 165). What qualifies a celebrity to know about clothing design, cars, perfume, or any other product? Nothing. Celebrities are just being paid a lot of money to make false claims about a product so you will go out and buy, buy, buy.
27. An Appeal to Probability is an error of reasoning which falsely claims that because something could possibly happen, it must or will happen. This type of reasoning simplifies and exaggerates a cause and effect sequence. Usually an appeal to probability leads to a Slippery Slope fallacy. Like taking one step over the edge of a mountain and quickly falling down the slope, this fallacy simplifies a series of events by stating that if a person takes one step in that direction, then these future events must happen, leading to a horrible end.
28. For example, many anti-drug and anti-smoking campaigns use this fallacy. Smoking can lead to various forms of cancer, and cancer can lead to an early death, so these campaigns often just jump to the conclusion and say smoking will kill you, usually implying that it will do so right away. As you know, not all smokers get cancer, and not all people with cancer die an early death. Further, many people who never smoke get cancer, and some people who never smoke and never get cancer will die an early death. Jumping to one conclusion because it fits a theory, stereotype, or agenda is always a fallacy. Such tactics are meant to manipulate people into believing a false claim.
29. The appeal to probability often leads to two other fallacies: the Fallacy of the Single Cause and the False Dilemma. Cause and effect sequences are always highly complex with multiple variables as causes producing a wide range of effects. Often an ignorant or unscrupulous arguer will oversimplify the equation and claim that one cause will necessarily lead to one effect. Drugs = Death. War = Patriotism. Capitalism = Freedom. Atheism = Immorality. Wealth = Happiness. Whenever you see a highly simple equation like this, it is always a fallacy. Life is never so simple. This type of simple equation also creates a false dilemma fallacy. To take one of the examples above: If you are an atheist, then you must be immoral, so if you want to be moral, you must believe in God. Of course, this simple line of reasoning begs more questions: Are all believers in God moral? What makes God moral, and how do we define morality? Whose God? Are all gods moral? Are any gods or their followers immoral? Are any atheists moral?
30. Overly simplified arguments that involve false dilemmas usually employ the fallacy of Reification, which I like to call the fallacy of bullshit. Often people use generalized words without fully understanding what these words mean. All general ideas have very limited meaning because objective reality is defined by details. No two trees, people, or places are exactly the same. People use general words in order to reduce the complex diversity of the world into universal categories. Bullshit categories include: religion, society, capitalism, terrorism, or faith. It can also include any general noun, such as dog, tree, bird, or person. These words are too general and vague; therefore, they are almost meaningless.
31. Take the word dog, for example. While we all instinctively think of a furry creature with four legs and a tail, does this general idea fit any specific dog? If I said, “I have a dog,” would you know what my dog looks like or how it behaves? You wouldn’t have a clue. You would only have a vague, generalized idea in your head, which doesn’t correspond to any real dog. Or take the word society. Society is a large collection of diverse individuals, organizations, and institutions, many of which are in open conflict with each other. So what could statements invoking society actually mean? "Our society believes." "Our society must act." "Our society prohibits." When we use such language, we are often referring to general categories, which are an important type of knowledge, but when you are talking about categories, you need to tie them down to specific examples; otherwise, it’s nonsensical bullshit that is utterly meaningless.
32. The Naturalistic Fallacy is a common error of thinking that is usually the result of ignorance. This fallacy confuses what realistically does exist with what ethically ought to exist – it confuses an is for an ought. Violence is part of nature, and humans are just animals; therefore, it is right to behave violently. War has been a major tool to solve international conflict; therefore, war is just. Everyone cheats and cheating leads to winning; therefore, cheating is the right thing to do. Just because people do something does not automatically make it the right thing to do. This form of sloppy reasoning is routinely used to justify what is called the status quo, what people are already doing, because it would be too hard to imagine something different.
33. The last error of reasoning is the Non Sequitur Fallacy, which is also a common error due to ignorance. This Latin phrase means "it does not follow" – i.e. the conclusion is not logically connected to the original claim of the argument. This fallacy entails any claim or evidence that is not logically relevant to the original claim of the argument. Ignorant people often don’t understand true causes so they explain events either haphazardly or according to cultural common sense. Ask an unintelligent person why the president did or did not do something and they will often come up with some wild conspiracy theory. Why did an earthquake happen? Some might say aliens caused it, or negative energy forces, or the gods, or bad luck. Often, but not always, a non sequitur represents sincere confusion in the mind of an ignorant person who just doesn’t know any better, but sometimes this fallacy can be used deliberately to evade the argument at hand.
B. Evading the Issue
34. Another category is composed of tricks used to evade an argument by changing the subject. This highly effective tool relies on your opponent’s and/or audience’s ignorance. These fallacies generally cannot work on intelligent people because these fallacies are not errors of reasoning; they are tricks designed solely to deceive the foolish. You need to make sure that every supporting claim is logically connected to the thesis, and you need to make sure that all evidence (if there is any) is logically connected to each supporting claim. Sometimes you will run into a dishonest arguer who will throw curve-balls that have nothing to do with the claims at hand. In such a case, call attention to them, dismiss them as irrelevant, and move on.
35. The first trick of evasion is the Red Herring fallacy. It is like the non sequitur in that both illogically connect unrelated claims or evidence, but the Red Herring is used deliberately to manipulate an audience. The Red Herring leads away from the main claim (which can't be proven) toward another claim or claims (which often can be proven). Supposedly, this fallacy is named after a cultural practice of using smelly fish to divert scent hounds from the pursuit of a criminal suspect. One reason to use a red herring is to avoid talking about a relevant claim that you cannot actually prove true or false. By mentioning the claim and then steering around it with a red herring, you make it seem as if you have addressed the issue when, in fact, you have not.
36. Another reason to use this fallacy is to divert attention away from a relevant claim you cannot prove true or false. Instead of facing a claim you can't prove, you quickly turn toward an unrelated claim you can prove, making it appear as if you won the argument, except it’s not the same argument anymore. For example, in the debate over health care reform in the U.S., a common tactic of opponents of this reform is to re-frame the debate away from the topic of health care and toward the different, more general topic of personal freedom. In arguing for personal freedom and winning that argument, it may seem as if the argument for health care had been lost, when, in fact, the shift was just a red herring diverting attention away from the original argument.
37. Another type of evasion is Shifting the Burden of Proof onto your opponent. The rules of open argument and scientific research are clear: if you make a claim, then you need to prove your claim with evidence and reasoning. The burden of proof is on the person who makes the claim; it is not on critics to prove that the claim is untrue. If an audience is ignorant of these basic rules of argumentation, an unscrupulous debater may declare a claim true without any evidence unless an opponent can prove the claim wrong. If the opponent cannot supply the evidence in question, then it seems as if the claim is true, but it is not, because no evidence has actually supported it.
38. For example, in the classic debate about the existence of God, pointing out this fallacy is one of the strongest moves available to atheists. Besides the stories found in holy books, nobody has ever provided direct empirical evidence for the existence of any god, gods, angels, demons, or fairies. Some individuals have claimed to have seen or experienced these beings, but no two subjective accounts are alike, which raises questions about the validity of these claims. The same goes for claims about Bigfoot, aliens, elves, trolls, and other mythical creatures. People who make a claim must prove the claim with sufficient evidence from the objective world. If the arguer cannot, then the claim should not be taken seriously, and the doubter of such a claim does not have to prove anything.
39. A final trick of evasion is the Argument Ad Nauseam fallacy. Basically, this trick claims that the argument has been repeated too much to argue anymore; thus, the person using this tactic simply declares the matter settled, true or false, when, in fact, no such consensus has been reached. For example, a religious believer might claim that for thousands of years people have been trying to disprove the existence of a supreme deity, what many call God, and they haven't done so yet because most people still believe in God. This person might claim that we should just move on and accept that God actually exists. Well, the first part of this statement raises two questions: Has anyone actually disproved the existence of God? And can't people go on believing even if a belief has in fact been proven wrong? Also, this claim shifts the burden of proof onto the doubter rather than the believer. Shouldn't the believer have to prove that God exists, rather than making the unbeliever prove that God does not exist?
C. Attacking the Opponent
40. The next category is a different set of tricks designed to evade an argument by changing the subject. But instead of an issue-based red herring, the arguer shifts attention to the personal character or words of the opponent. If you can personally discredit the character of your opponent, then you don't actually have to deal with the argument being discussed. In principle, this fallacy works mostly on the ignorant because they are easily distracted. But because all people have ethical values they hold dear, it can be difficult for even intelligent people to avoid becoming sidetracked by a juicy attack on someone's character, especially if the attack concerns dishonesty or a lack of integrity.
41. One of the most common attacks is Poisoning the Well, although it must be noted that this popular fallacy can also be used in reverse. Poisoning the well involves some disparaging remarks designed to frame a topic, claim, evidence, or even the opponent in a negative light. Our brains have a natural framing bias that operates without our conscious control. When we hear positive or negative words, they will unconsciously shape how we think about the topic being discussed (Kahneman, 2011, p. 88). So, if you can smear your opponent's argument or character from the start, then the audience will be predisposed to view anything your opponent says in a negative, distrusting frame of reference. Likewise, this tactic can also work in reverse with praise, thus giving the audience a more favorable disposition. Even if your negative or positive claims are false, they still have an unconscious psychological effect on the audience.
42. Attacking the personal character of an opponent is called an Ad Hominem Attack, and it is one of the oldest tricks in the book. It does not matter whether the attack is true or not because the character of a person has no logical bearing on the truth or falsity of a claim. Only empirical evidence can prove or disprove a claim. Even if a person is a habitual liar, it still does not logically follow that everything the individual says is automatically a lie. Even a liar can sometimes tell the truth. Thus, an ad hominem attack is always, regardless of whether it is true or not, a deceptive attempt to draw attention away from the argument at hand in order to confuse and manipulate the audience. Unscrupulous arguers use this tactic because it works. People frequently get swayed by these types of attacks, even intelligent audiences. We can't help ourselves. Hence, political campaigns in democratic countries tend to get dirty with ad hominem attacks (often called "slinging mud") the closer it gets to election time because this form of appeal really works with undecided audiences. Ad hominem attacks can also be used as a type of red herring, which The Economist calls "whataboutism" ("Muddying the Waters," 2017, p. 30). When a speaker is personally attacked, rather than address the claim, the speaker deflects attention away from their own failings (real or imagined) and redirects blame to others. Whataboutism often goes: Ignore my guilt..."what about" her guilt?
43. Another common attack is called the Straw Man fallacy. Unlike the last fallacy, this attack is not focused on the character of an opponent, but on his or her argument. In an open argument, there is a procedure for criticizing someone's claims. First, summarize and explain those claims to show the audience that you have examined the claims, evidence, and conclusions with an open mind to fully understand them. Only then do you begin to criticize the argument. A straw man fallacy takes a shortcut. Instead of honestly summarizing the whole argument, the arguer cherry-picks certain points, usually takes them out of context, and then alters their meaning.
44. Instead of fairly criticizing the whole argument exactly as it was stated, the arguer, intentionally or not, uses the straw man to misrepresent an opponent's argument to make it look weaker than it is so that it can be attacked and refuted more easily. The unscrupulous arguer could take short quotes out of a larger context to misrepresent the claim. Or the arguer will overly generalize a specific argument and then attack the misrepresentation for being too general. Sometimes a critic is simply ignorant and does not understand the argument at hand, so they summarize and criticize their own misunderstanding, rather than the original argument. In rare cases, but it happens, an arguer will simply lie and attribute to an opponent a claim that was never actually made.
45. The Argument from Silence is our last popular and dishonest evasion tactic. Basically, the arguer takes the silence of an opponent as "evidence" of being right, which, of course, is nonsense. Silence is not evidence of anything. Usually, this tactic will not work in a face-to-face situation because an opponent can always say something to refute a claim. But in print or in a lecture, your opponent appears only in the way you want, so you can always have the last word. It is common for a sneaky arguer to start with a straw man, criticize that misrepresentation, and then rhetorically claim that the opponent could not possibly refute this criticism, thereby leading to a strong (but illusory) conclusion.
D. Appealing to the Audience
46. The last type of fallacy includes tricks designed to appeal to the psychology of the audience so that the audience's mental weaknesses can be manipulated. On the one hand, this type of fallacy is an evasion tactic. Instead of trying to prove a claim, the arguer manipulates the weak minds of the audience to make it seem as if the claim has been proven. But this type of fallacy is also an error of reasoning because it relies on a traditional form of common sense logic. Basically, these fallacies presuppose that if a group of people all agree that something is true, then it must be true. But, of course, this thinking is illogical. A group of people could all agree that the world is flat (as many ancient peoples once did), but this group consensus is in no way proof of anything about the objective world. It is simply an agreement based on subjective opinions.
47. The most basic form of this fallacy is the Appeal to the People or Bandwagon Appeal. This tactic can occur at various places within an argument: before, during, and/or after. Often a smooth arguer will use it in all three places. All audiences like to be spoken to directly; they like to know that their experiences and opinions matter, and they like to feel they play a part in the argument itself. All arguers need to speak to and include an audience, but there are honest and dishonest ways to do so, and the line between the two is not always clear. Crossing the line into dishonesty involves trying to manipulate the audience into believing a claim for which there is no evidence. For example, one might say, "All of us [insert stereotype: Americans, good people, guys, soccer fans] know this issue is wrong, except my opponent." This statement creates an "us" vs. "them" frame, thereby putting the arguer and the audience on the "right" side, and poisoning the well against the isolated opponent on the "wrong" side. The tactic relies on this oversimplified either/or framing to hide the fact that no evidence has been presented to prove any claim right or wrong. Clearly, it just manipulates the audience.
48. Politicians and marketers know that people are social animals. We form flocks or herds, just like cows, sheep, or birds. Everyone likes to be part of a group and it is very uncomfortable for most people to stand out or be apart from a group (Lindstrom, 2011, p. 107; Thaler & Sunstein, 2008, ch 3). Because of this social and psychological instinct, people are very susceptible to peer pressure. As marketer Martin Lindstrom (2011) has explained, "We instinctively look to the behavior of others to inform the decisions we make" (p. 108). Peer pressure is one of the easiest ways to manipulate people into believing a claim. If you can convince your audience that everyone else already believes, then peer pressure and the psychological need to conform will do all the work, eliminating the need to prove the claim with evidence.
49. Most appeals to the people involve an Appeal to Ignorance because an intelligent person who knows the rules of argument generally is not susceptible to such tricks. Unscrupulous arguers know that the majority of people in most countries have never been to college, don't know the rules of argumentation, have a limited amount of information about the objective world, and live their lives based on common sense and tradition. To a certain extent, we are all "confident idiots" (Dunning, 2014). An audience like this is easy to manipulate if you know the right buttons to push. Hence, arguers can really say anything at all, no matter how outrageously false or ridiculous, as long as they don't get caught! Of course, getting caught is the major limitation of this highly effective strategy. No audience likes to feel duped. So, the arguer has to be careful to keep bullshit or lies plausible; otherwise, this tactical advantage can turn into a liability. An audience can turn against you if they figure out they are being manipulated.
50. A stronger and safer tactic is the Appeal to Emotions, to which all people are susceptible, smart and ignorant alike. Similar to the bandwagon appeal, this tactic is not necessarily cheating, until it replaces evidence and crosses the line into manipulation. As with the bandwagon appeal, the barrier between good argument and manipulation can be hard to discern. In one of the earliest treatises on rhetoric, Aristotle explained how important it is for an arguer to address the emotions of the audience because emotions are an important part of being human. Modern cognitive scientists now understand that emotion plays a significant role in our "intuitive judgments and choices" (Kahneman, 2011, p. 12). Because emotions are hard-wired into our brains, we are naturally susceptible to "emotional contagion," which simply means we are predisposed to be sympathetic to other people and their emotional state of mind (Bloom, 2004, p. 116). We feel the same way as those around us feel. If they are sad, we feel sad. If they are happy, we become happy. Psychologist Daniel Kahneman (2011) has shown how our "emotional attitude...drives [our] beliefs" and our behavior (p. 103). Marketers believe that consumers base around 80 percent of their buying decisions on emotion (Lindstrom, 2011, p. 100). One of the most powerful marketing tools is the "brand," an image engineered with personality and feeling, which appeals to us solely on an emotional level (Packard, 1957/2007, p. 65; Lindstrom, 2010, p. 27).
51. Emotional thinking is part of the automatic, intuitive part of our brain, the part Kahneman called system 1 thinking. We are not fully conscious of how emotions affect our reasoning, which means we cannot fully control our emotions, and so we are vulnerable to manipulation. And strong emotions, like fear or sadness, can easily take over our thinking processes and lead to dangerous decisions. Fear is one of the most effective ways to trick people into accepting a claim, especially if you are selling a product (Lindstrom, 2011, ch 2). Generally, in an open, academic argument with a focus on claims, evidence, and the reasonableness of conclusions, the emotions should play little to no part. Emotions can distract us from system 2 critical thinking, which is something an audience needs to evaluate the truth or falsity of claims. Always be on the lookout for emotional appeals in arguments because, more often than not, they are ploys to manipulate an audience, rather than honest appeals to our humanity.
52. In conclusion, you always need to be on guard against fallacies and lies in arguments. Many unscrupulous arguers just want to win an argument and gain some measure of power over an audience. Most politicians and media personalities actively engage in these underhanded tactics. It is important to remember that when evaluating an argument, you need to focus on the claims, evidence, reasoning, and conclusions of the arguer. If all of these parts are not present or clearly explained, then you should become very skeptical. The absence of one or more of these parts could be a clue that the arguer is using fallacies. Remember, the burden of proof is always on the arguer who makes a claim. Be on the lookout for arguers who do not fully or clearly make an argument, or who engage in fallacies to manipulate an audience. If you encounter such people, and there are many out there, do not take them seriously. In general, such people cannot be argued with because they have no commitment to the truth. An appropriate response is to simply walk away.
53. In the 21st century, "truth" or "facts" have to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. Further, truth alone doesn't do anything. One has to argue for the truth in open debate in order to convince a skeptical public. Debating with others about truth means both arguing for the truth and demonstrating it with valid logic and evidence. It also means arguing against false opinions, manipulations, and lies. 21st century literacy entails not only being able to construct knowledge with scientific methods, but also openly arguing with diverse publics to explain and prove the truth. We will discuss the process of how to make an effective argument in the next chapter.
Science: The Search for Objective Knowledge
1. We currently live in an age of "Truthiness" where there seem to be no empirical facts and no objective world – only endless, conflicting opinions (“The Death of Facts,” 2012). When facts are invoked in debates over public policy, many political groups try to fight facts with what Oreskes and Conway (2010) call the "Tobacco Strategy" (pp. 5-6). Unscrupulous political groups manipulate ignorant audiences through systematic, relentless, and well-funded "doubt-mongering" (pp. 5-6, 16, 18, 34). Partisan media and think tanks reduce reporting to an "echo chamber" of politicized talking points and manufactured lies to manipulate the public (Oreskes & Conway, 2010, pp. 7, 236). Success in politics and law seems to require cynicism, as the indistinct line between subjective and objective reality is mystified behind obfuscated doublespeak, manipulated spin, and the "cacophony of conflicting claims" (Oreskes & Conway, 2010, p. 241). Most of us already know that social, political, and business leaders engage in various acts of official deception, which often take the form of outright lying (Mearsheimer, 2011; Sachs, 2011, p. 24).
2. The philosopher Isaiah Berlin (2000) noted, “Because there is no hard and fast line between 'subjective’ and ‘objective’, it does not follow that there is no line at all” (p. 170). The unscrupulous might try to erase this line out of ignorance, deceit, desire for power or profit, but the line remains. The truth exists because the objective world exists. However, knowing and understanding that objective world has proven much harder than anyone expected. The world is big and complex, but we can directly experience only a small part of it. Even when honest, academically trained professionals try to understand the objective world, philosopher Amartya Sen (2009) has pointed out, there is no "guarantee of reaching truth" (p. 40). Even "the most rigorous of searches," Sen explains, "could still fail" (p. 40). Objective truth is out there, but it is very hard – and sometimes impossible – to reach.
3. We all see the objective world every day, but very few of us actually look past our subjectivity or culture to know the objective world. Stuck in a highly local context, most people dwell within their subjectivity and culture and rarely venture outside of it. People believe what they subjectively see, which is often called common sense. Most people believe they know the objective world, but what they really know is their own unique subjective experience of that larger objective world. And they only really know about what they directly experience in their local context, which is an infinitesimally small part of the complex universe. But while we are separated from the objective world by our consciousness, we are still connected to and a part of that objective world. Our brain and our bodies are physical parts of the objective world. Our subjectivity is naturally attuned to an objective world that is knowable, largely because our mind is part of and has evolved within this objective world. Thus, we can understand the objective world fairly well. But do we really see and know the objective world as it actually exists?
Handout: The Ecology of Knowledge
4. For most of us, the answer is a qualified “yes.” We see the surface appearance of the immediate objective world, what the ancient Greeks called doxa. This limited form of knowledge creates a sense of practical realism that helps us make day-to-day decisions.
5. Our culture also produces general beliefs about the world called common sense. The ancient Greeks called this orthodoxia, or the body of official and unquestioned beliefs shared by a culture. But our common-sense knowledge is always highly limited because we can empirically validate only what we directly experience on a daily basis in our local environment; all the rest we have to trust based on the authority of cultural institutions or powerful people. However, even our senses can fail us. When we directly perceive our local world, we usually see only the superficial surface of objective reality. Rarely do we fully understand the depths of the objective world that surrounds us. Thus, while subjectivity (doxa) and culture (orthodoxia) are forms of knowledge, they are very limited forms of knowledge that often tell us more about ourselves and our culture than about the objective world in which we live.
Handout: Common Sense vs. Science
Handout: How Do You Know? The Seven Types of Knowledge
5.1 Science Is a "Technology of Truth"
6. The most reliable information comes from scientists or professionals trained in scientific methods, but even science is not a perfect form of knowledge. And what's worse, science often becomes a form of magic for the average person because most people don't understand why scientific conclusions are better than common sense. Thus, many people either trust science as a cultural authority, or they reject science as just another biased cultural belief. Many, if not most, people do not understand why science produces the most valid forms of truth (Sagan, 1996; Jacoby, 2009, p. 211). Even practicing scientists can't always agree on what they do or how it works. The practice of science is an important human endeavor that can get us very close to objective reality, but it is not a perfect tool leading to absolute certainties. The promise of science is not in the conclusions it reaches, nor in the technology it produces. Instead, the promise of science lies in its unique process of knowing the objective world. In order to know more about the objective world so that we can make more informed decisions, we first need to understand how science works and what its limitations are.
7. What is called science is actually a bunch of different research disciplines (see chapter 4) that are unified by one basic method, the scientific method. However, this basic scientific method is actually adapted by each discipline in a particular way. It has been broken up into a diverse set of procedures that can be used to explore the objective world in different ways. Some believe that scientific activity is too diverse to talk about as a single, distinct phenomenon. To make matters worse, the endeavor of science is a "continually evolving" set of ideas and techniques, so it is never stable enough to pin down exactly (Toulmin, 1961, p. 109). But it is possible to generalize a basic process and purpose of science. These similarities account for the significance of science as a general knowledge-creating tool. These similarities constitute the basic foundation of all scientific practice common to all scientists. As the biologist Ernst Mayr (1997) pointed out, "One would not be able to speak of science in the singular if not all sciences, in spite of their unique features and a certain amount of autonomy, did not share common features" (p. 33).
8. Physical and social scientists carefully create theories, methods, and technologies. These tools enable greater description, explanation, and sometimes prediction of the objective world. The knowledge scientists create allows for some measure of control over ourselves, our society, and our environment. Science is the practice of disciplined reflection, theory, experiment, and critical debate. Science is “a technology of truth,” as Daniel Dennett (2003, p. 6) described it. Scientists filter out their subjectivity by using precise research methods. They use theories to experiment on the objective world, gather evidence, draw conclusions from the evidence, and then present their conclusions to a critical community so that others can analyze these conclusions and try to prove them wrong. The end result of this process is the creation of provisional truths. The scientific process is based on a belief that empirical observations and laboratory experiments produce a knowledge that "corresponds" with reality (Bloor, 1991, pp. 40-45).
Handout: The Scientific Method
9. Often scientists are trying to investigate the deeper layers of reality that we cannot directly see without specialized methods and equipment (Deutsch, 1997, pp. 3, 7). Many scientists believe that they can use the scientific method to not only describe the objective world, but also to explain "the fabric of reality itself" (p. 3). Once scientists gain an insight into the objective world, they use their knowledge to create better technology, a term which refers to new tools that help improve our society and our lives. Scientists also use their knowledge to try to predict the future. By anticipating possible causes and effects, researchers try to solve the important social, economic, political, and ecological problems that threaten the stability and sustainability of our societies and our planet.
10. The first step of the scientific process is the literature review. Scientists explore not only a specific topic, but they enter into an academic conversation about that topic, a debate that has been happening for years, decades, and sometimes centuries. Scientists need to understand what is known and unknown about a topic. They also need to understand how the current knowledge was created: what tools did previous scientists use and how well did the various tools work? Once a scientist figures out the best tools to use, then they plan out a research project in order to re-test what is known or to discover some new knowledge.
11. A scientist starts with a theory, which is a model explaining how some part of the objective world works. A theory will not only explain all the parts, but it will explain how all the parts work and fit together into a structured whole. Philosopher and motorcycle mechanic Matthew B. Crawford (2009) explained how he once could not understand the workings of a motorcycle engine because he lacked a theoretical model to explain how an engine was supposed to work: “A more experienced mechanic has pointed out to me something that was right in front of my face, but which I lacked the knowledge to see. It is an uncanny experience; the raw sensual data reaching my eye before and after are the same, but without the pertinent framework of meaning, the features in question are invisible” (p. 91). A theory is a model of an object, interaction, or process, which provides a “framework of meaning,” as Crawford (2009) pointed out. The theory makes the data or evidence meaningful by explaining how the evidence fits together into a larger whole.
12. Once a scientist has chosen a specific theory, she will use this model to create several hypotheses, which are clear statements that can be proven true or false. Then the scientist will use specific tools called research methods to discover and collect data, which will be evidence used to prove the hypothesis true or false. There are thousands of specific research methods to collect different kinds of data, depending on the academic discipline a scientist works within and the topic being studied. We will discuss some basic types of research methods and evidence later in this chapter (ch 6.5). Once enough data has been collected, a scientist will use another kind of tool, analytical methods, which are used to make the data meaningful by organizing the data into theoretical categories and connecting the data according to the theoretical model.
13. While the individual research of each scientist is important, the combined effect of the community of scientists and all their research is much more important. This scientific community is developed and fostered by scientific institutions, like university academic departments, conferences, and peer reviewed journals. These institutions form the social foundation of science, which is a group activity. After an individual scientist has finished his or her research and proven a hypothesis true or false, then that scientist shares these findings with the community of scientists. This is the last step of the scientific process, and the most important part of the scientific method. This last step is a social process called "peer review." The purpose of this general process is to critically analyze the theories, methods, data, and conclusions of scientific research in order to look for errors and false conclusions.
Scientific Peer Review: Meet the "Data Thugs"
14. Each scientist proposes theories about reality based on research and then submits a finished report to a peer-reviewed journal. There are over 50,000 such scientific journals around the world (Judson, 2004, p. 276). Reviewers for these journals criticize research papers and decide if these papers should be published. Once a paper is published in a journal, its findings are then evaluated, judged, and later refined through the disciplined debate of the larger scientific community (Popper, 1959/2002; Judson, 2004). Oreskes and Conway (2010) argue that peer review "is what makes science science...no scientific claim can be considered legitimate until it has undergone critical scrutiny by other experts" (p. 154).
15. But critical debate is not enough. Scientific truths are also evaluated with further tests designed deliberately to falsify claims. If the scientific claim cannot be falsified by such tests, then it still remains provisionally “true,” although now accepted as more true than before (Popper, 1959/2002; Mayr, 1997, pp. 47, 51). Most scientific claims are falsified in small ways and modified over the years so as to become more true, but rarely, if ever, completely true. A fact is a theory that has been "repeatedly confirmed and never refuted" (Mayr, 1997, p. 61). But even theories that don't reach this final stage are still useful tools that help explain how reality might work once the evidence is found to back them up (Mayr, 1997). As David Deutsch (1997) explains, "In science we take it for granted that even our best theories are bound to be imperfect and problematic in some ways, and we expect them to be superseded in due course by deeper, more accurate theories" (p. 17).
16. Over the past two centuries, the practice of science has improved the lives of billions of people through increased knowledge and technology, which have increased the health of individuals and the wealth of societies. To take but one notable example, the agricultural scientist Norman Borlaug won the Nobel Peace Prize in 1970 for developing disease resistant and high-yield food crops in the 1950s and 1960s, which led to the Green Revolution. This scientific improvement was a major breakthrough for the human species. It kept hundreds of millions in the developing world from starving, and it stabilized and sustained the economic market for food (Norman, 2009). Nobody can honestly deny the possibility of objective knowledge and the benefits that science and technology can provide.
VIDEO: "Me & Isaac Newton"
5.2 Provisional Truths: On the Limits of Science
17. The practice of science is perhaps the greatest human innovation of all time. However, science and technology are not unqualified goods. Science is not perfect. It is only the best method we have to create truth and the technology upon which we increasingly depend. But as the practice of science advances and pushes back more and more boundaries, we must collectively acknowledge that "science does have the potential to do great harm, as well as good" (“And Man”, 2010, para 5). Nuclear and biological weapons, as well as environmental pollution, are all scientifically produced evils that threaten life on earth. As Dr. John Ioannidis has stated, "The scientific enterprise is probably the most fantastic achievement in human history, but that doesn't mean we have a right to overstate what we're accomplishing" (as cited in Freedman, 2010, p. 86). Thus, as the philosopher of science Robert Klee (1999) has argued, while science's "track record of achievements is neither errorless nor continuous," the strength of the scientific process has been continuously proven by the ability of scientists to "learn more from mistakes than from successes" (pp. 2-3). With this in mind, it is instructive to demonstrate some of the limitations of science as it is currently practiced so that we can evaluate the validity of scientific research more carefully and accurately. There are many serious flaws in the practice of science, including limits to what scientists can and cannot accomplish, and the unintended consequences of scientific discoveries.
18. While science helps us better understand the objective world, scientists themselves are not immune to subjectivity and culture. Scientists can be just as irrational and biased as the rest of us (Judson, 2004, p. 148). Physicist Freeman Dyson once humbly acknowledged, "We are scientists second and human beings first" (as cited in Finkbeiner, 2006, p. xxx). We must never trust scientists merely because of their social status as scientists because these knowledgeable people are still human and make mistakes. For example, many scientists, both men and women, have a strong bias against female students, believing women are less intelligent than men when it comes to practicing science (Chang, 2012). Chang (2012) explained that this bias "probably reflected subconscious cultural influences rather than overt or deliberate discrimination" (para. 3). Scientists also routinely make mistakes (Kahneman, 2011, p. 8), just like the rest of us, and they don't like to admit when they are wrong. Some scientists don't question their assumptions before investigations. Many cling to traditional scientific theories like religious "dogma" (Gray, 1995, p. 231; Kahneman, 2011, p. 9). The widespread practice of holding on to discredited theories led philosopher of science Thomas Kuhn (1996) to assert that powerful scientific paradigms resist change, even when new evidence has proven them wrong. One scientist acknowledged, "Even when the evidence shows that a particular research idea is wrong, if you have thousands of scientists who have invested their careers in it, they'll continue to publish papers on it" (as cited in Freedman, 2010, p. 84).
19. Some scientists even lie and cheat, which is not science at all. Such deceptive practices have been called pseudo-science, junk science (Jacoby, 2009, p. 210), voodoo science (Kirsch, 2010, p. 53), or just fraud (Judson, 2004). These pseudo-scientists may willfully distort data or plagiarize the ideas of others, which of course, is a "violation" of the principles of science (Goodstein, 2010, p. 1). Even respected scientists, doctors, and academics can cross the line into pseudoscience, especially if they are speaking through a news media outlet like a television show. Dr. Oz is a respected Columbia University heart surgeon, but he has been accused of peddling false and misleading information. Charles Gross (2012) recounts several studies of scientific misconduct where roughly 7 to 27 percent of scientists (depending on the study) reported firsthand knowledge of "fabricated, falsified or plagiarized research over the previous ten years" (p. 26). Another study claims that 67 percent of journal article retractions are due to ethical "misconduct" rather than scientific "errors" (Basken, 2012, para. 11). And there is evidence that scientific misconduct has been rising in recent years, due to increased pressure to publish more research so that scientists can get better jobs, earn more research money, and gain more recognition (Zimmer, 2012). Because of such misconduct, there are a large number of retractions each year.
20. But even when there isn't a willful distortion of the data, there is still a lurking subjective tendency to acknowledge only the data that corroborates one's theory. This type of bias is called "cooking" the data (Goodstein, 2010, p. 33). It is very widespread, even in published research. Cooking the data can also take another form, called "publication bias." This type of bias occurs when research journals accept and legitimate only certain types of claims or theories, which they favor, while ignoring and refusing to publish other types of research that may be unfashionable or controversial (Kirsch, 2010, p. 25).
21. Both voodoo science and the plain old variety of bad science perpetuate themselves for many reasons: the prestige of powerful scientists; the inherent bias of funding sources, which can often create conflicts of interest; and the subjective "black box" of the peer review process itself (Judson, 2004). The scientific publishing industry has come under heavy scrutiny over the past decade. It is well known that funding sources often bias research results. For instance, pharmaceutical companies tend to publish only those data that support their drugs and ignore unfavorable data (Kirsch, 2010). But less well known is the inherent bias within the peer review system itself (Judson, 2004, ch. 6). Many articles are published not solely on their merit, but because of contingent factors, such as the subjective interests of reviewers or the professional connections of the writer. Several scientists have argued that the peer review system "rewards conformity and excludes criticism"; thus, popular consensus can often lead to mindless replication of the same conclusion (Taubes, 2007, p. 52).
22. Scientists are also not critically analyzing each other enough through peer review. Even when many scientists are all publishing on the same topic, they are mostly focused on their own research (and reputation), and rarely do they do the important work of retesting each other's findings to replicate results (Freedman, 2010; Gross, 2012). A lot of errors go undetected by the scientific community because few want to do the hard and thankless work of peer review. In fact, Dr. John Ioannidis’ meta-analysis of medical research has found that 80 percent of non-randomized studies offer false conclusions. He also found that around 41 percent of the most cited medical research from the most prestigious journals turns out to be "wrong or significantly exaggerated" (as cited in Freedman, 2010, pp. 80-81). Instead of doing peer review, scientists want to design their own studies to attract professional acclaim and research dollars. No one gets famous for criticizing the work of others or proving theories wrong. As physicist David Goodstein (2010) has argued, "It's far better to prove a theory is right than to prove it is wrong" (p. 132).
23. But even good science filtered through peer review is rife with problems. The practice of science is expensive, complex, and time consuming. Studies ideally last years, if not decades. This length of time creates a backlog of untested theories and a long delay before any conclusions are reached. And because scientific studies are so complex and time intensive, they cost a lot of money. The best studies are usually the longest and largest studies, and, of course, they cost the most. The National Institutes of Health and the National Science Foundation are the primary sources of government funding of science in the United States. Together they invest tens of billions of dollars every year, and the vast majority of applications (almost all quality projects) are turned down for funding. As one economist put it, "Theory is cheap, and data are expensive" (Solow, 1997, p. 75).
24. Good scientists are also overly influenced by research traditions. Young scientists will generally use the theories and methods of their professors from graduate school, who also practice the same methods they had previously learned in school. Hence, the cliché: a "school of thought." These traditions lead to a perpetuation of older, established methods and theories, and a neglect of newer ones, even though the new methods and theories might be better. Currently, the positivist model of empirical physical science has been held up as the only true or valid model for all knowledge claims. This theoretical model has led to the denigration or dismissal of diverse scientific practices, especially in the social sciences and humanities (Cole, 2009, pp. 100, 151-155).
25. The practice of science is also narrowly specialized, which creates a unique problem at the heart of the scientific process. Many scientists struggle to keep up with the vast proliferation of research in their own narrow niche and have to rely on faith in the ability of other scientists working in different fields. As Michael Polanyi (1962) explained, "Nobody knows more than a tiny fragment of science well enough to judge its validity and value at firsthand. For the rest [the average person] has to rely on views accepted at secondhand on the authority of a community of people accredited as scientists" (pp. 163, 216). Professional scientists have been trained to research the objective world in a very narrow domain, and outside of this domain, scientists are often just as subjective and ignorant as the rest of us.
26. Scientists must also have faith that all the "fragments of evidence" within their own narrow domain and across the curriculum add up to some significant advance in the whole of knowledge (Taubes, 2007, p. xxii). Most of the truths scientists produce are relevant only to other specialized scientists. Thus, whatever progress happens is far removed from the daily lives of most human beings. Scientific knowledge only occasionally pays "practical human dividends" in terms of solving real world problems (Toulmin, 2001, p. 79). Most of the time, scientific knowledge cannot be used by the average person. But it could be potentially useful to specially trained experts one day.
27. Another problem is the overly simplified outcome of scientific studies, which often focus on a handful of discrete variables in a causal chain (Wheelan, 2013, p. 2). Thus, scientific studies either ignore or try to control for various complex systems surrounding the phenomenon being studied, which gives a false picture of how variables actually interact within densely layered and overlapping environmental levels: sub-atomic, atomic, chemical, biological, social, and ecological. Because scientists often ignore or overly simplify the larger, contextual ecology of the real world, they aren't aware of the significant bias in their data gathering methods.
28. For example, in the field of psychology, the traditional methods of data gathering have been so biased as to call into question the validity of the entire discipline. Only recently did psychologists realize that the majority of their test subjects were "WEIRD" ("Experimental," 2012; Watters, 2013). This acronym stands for Western, Educated, Industrialized, Rich, and Democratic people. For the past half century, most academic psychologists had been studying undergraduate volunteers, primarily from the United States, and making the assumption that these young American college students represented the whole of humanity. From 2003 to 2007, approximately 96 percent of research in psychology used WEIRD test subjects (Watters, 2013, p. 49). But Joseph Henrich and his colleagues have pointed out that these Western test subjects, who represent approximately 12 percent of the world population, have a distinctive culture, which leads to distinctive thought patterns and cultural beliefs. Thus, scientific conclusions based on Western test subjects will not be valid if these conclusions are used to understand other, non-Western cultures. Because of the extreme uniqueness of American culture, as Watters (2013) reports, researchers have “concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations” about humanity as a whole (p. 50). Scientists need to be more aware of the larger, social and political contexts surrounding their research.
29. Scientific knowledge does not exist in a vacuum. Not only do scientists need to be aware of larger cultural contexts in order to set up valid experiments, but scientists also need to be aware of how their culture might interpret or use their research. Valid scientific research can be misinterpreted and/or misused by the public, especially by politicians and journalists. During World War II, the political leaders of both Germany and America controlled the field of nuclear physics in order to use scientific research to develop weapons of mass destruction (Finkbeiner, 2006, p. 7). American scientists during the Vietnam War helped the government create new weapons, which escalated the war and killed many civilians (Finkbeiner, 2006, p. 113). Physicist Robert Oppenheimer helped build the first atomic bomb, which was later dropped on hundreds of thousands of innocent civilians in Japan. He explained, "When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success" (as cited in Finkbeiner, 2006, p. 42). However, there is a problem with this widespread attitude. Once scientists invent potentially destructive technology, it’s hard to put the genie back in the bottle and keep it from harming human life or the environment. Many American scientists were shocked and horrified by the dropping of the atomic bomb, which they helped to create, and yet, they bear some responsibility for the devastation it released.
30. All scientific studies have to be translated by journalists in the popular press for a mass audience, but much of this reporting turns out to be wrong or exaggerated. Untrained in scientific methods, many reporters and public intellectuals mistake pseudo-science or voodoo science for the real thing. Thus, they mislead the public with false information (Jacoby, 2009, p. 63). Other reporters who do have training in science often focus on eye-catching research and write about early groundbreaking results, but these reporters rarely, if ever, follow up on the peer review process that exposes the many mistakes scientists make. When it comes to science writing in the popular press, the public is often misled, rarely getting refined scientific truth ("Journalistic," 2012).
31. Scientists themselves can also mislead the public by oversimplifying complex research. Take, for example, the medical research and public health policy surrounding nutrition and diet. Gary Taubes (2007) has convincingly argued that "nutritionists for a half century oversimplified the science to the point of generating false ideas and erroneous deductions" (p. 152). He concluded,
“Practical considerations of what is too loosely defined as the ‘public health’ have consistently been allowed to take precedence over the dispassionate, critical evaluation of evidence and the rigorous and meticulous experimentation that are required to establish reliable knowledge. The urge to simplify a complex scientific situation so that physicians can apply it and their patients and the public embrace it has taken precedence over the scientific obligation of presenting the evidence with relentless honesty” (pp. 152, 451).
Scientists, like all people who claim expertise, can also become overconfident in their knowledge and abilities, and thereby, be more prone to make mistakes. In one study of the flawed judgment of experts, a researcher found that "people who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys who would have distributed their choices evenly over the options" (Kahneman, 2011, p. 219). There is also evidence that scientists themselves can deliberately obscure the truth for political purposes, ranging from hiding the dangers of smoking to denying the reality of global warming (Oreskes & Conway, 2010).
32. Because of all these flaws, the social status of science, the judgments of scientists, and the products of science should not be venerated as perfect knowledge. Scientific knowledge is never perfect. Science "is not, and cannot be, as authoritative" as most scientists wish it to be (Lindblom & Cohen, 1979, p. 40). Too often we believe scientists "because of their visible display of the emblems of recognized expertise and because their claims are vouched for by other experts we do not know" (Shapin, 2010, p. 88), elevating scientific claims to near divine, dogmatic status. The flaws of science are significant problems because scientists often situate their work or themselves within larger political debates. Scientific expertise often informs political policy, which affects the public. Laws based on flawed science (or bad information in general) can lead to serious adverse consequences for the whole society.
33. Thus, constant vigilance and criticism are warranted. In science, both honest mistakes and even outright fraud should be expected, as part of the "sloppy" nature of scientific research (Feyerabend, 2010/1975, p. 160). As Susan Jacoby (2009) has pointed out, you should never trust anyone just because they use “scientific-sounding language” (p. 221). You always need to critically analyze and verify the methodology, evidence, and reasoning behind every conclusion. But even when the larger scientific community examines and verifies objective truth over time, it is important to remember that scientifically produced “truth” is never static or absolute. There is no room for complacency or orthodoxy. Scientifically produced truth is only provisional and "probable,” as philosopher of science Hans Reichenbach argued, "whose unattainable upper and lower limits are truth and falsity" (as cited in Popper, 2002/1959, p. 6).
34. Science is built on the foundation of constant critique and revision of old, probable truths so as to continually create better truths or new truths. The sociologist Max Weber argued, “In science, each of us knows that what he has accomplished will be antiquated in ten, twenty, fifty years. That is the fate to which science is subjected; it is the very meaning of scientific work…We cannot work without hoping that others will advance further than we have” (as cited in Judson, 2004, p. 30). The whole notion of scientific truth is always in perpetual flux, albeit often within relative limits based on large bodies of data (Mayr, 1997, p. 77).
35. But even with all these flaws, the practice of science is still one of the most significant and useful human technologies ever conceived. The philosopher of science Karl Popper went so far as to claim, "Next to music and art, science is the greatest, most beautiful and most enlightening achievement of the human spirit" (as cited in Mayr, 1997, p. 41). Even without fully understanding the practice of science, most people benefit from it, using the products of science to live better, know better, and communicate more clearly. We all benefit from the laborious practice of scientists, even if we do not behave scientifically or do not completely understand scientific processes or outcomes.
36. Science is an important invention, and it will continue to be a worthy endeavor, but this "technology of truth" (Dennett, 2003, p. 6) does not eliminate the problems of subjectivity and culture. Science needs to be understood as an important, but limited tool. We should not blindly trust science. As David Lindley (2008) recognized, "Scientific knowledge, like our general, informal understanding of the everyday world we inhabit, can be both rational and accidental, purposeful and contingent. Scientific truth is powerful, but not all-powerful" (p. 216). Karl Popper (1979) concluded, "There is no absolute certainty...The quest for certainty, for a secure basis of knowledge, has to be abandoned" (p. 37). At root, the indeterminacy of science is related to the foundational flaw of all human knowledge: We still have to subjectively interpret the objective world (Toulmin, 1961, p. 81). Therefore, in a world without perfect and direct knowledge of the objective world, we must find the best methods to improve the quality of our knowledge. But regardless of what method is used, all techniques are flawed and all lead to partial approximations of the real world. Thus, all truth must be built on the provisional and tumultuous foundation of reasoned debate and human judgment.
Information Literacy: Critically Evaluating Different Types of Evidence
1. Evidence is the most important part of any inductive argument. You need to be able to prove your claims with evidence. But it is important to recognize that not all evidence is reliable. Some types of evidence can't be trusted at all. And even when good evidence is used, it might not be appropriate for certain types of claims. Thus, evidence has to be both reliable and logically appropriate, in a word, evidence needs to be valid, in order to prove a claim true or false.
2. To oversimplify for instructional purposes, I want to explore nine major types of evidence that are most commonly used in arguments, starting with the weakest form of evidence and moving toward the strongest. I will briefly explain each type of evidence, focusing on how reliable it tends to be (strong to weak) and how it could be appropriately used.
Handout: Types of Evidence & Sources
3. The first type of evidence is personal experience, sometimes called testimony or an "eye-witness account." This evidence is based on what an individual experiences directly or indirectly. Psychologically, we naturally consider this type of evidence to be the strongest because it is the most immediate and intimately tied to our local world, which we know the best. However, from a sociological perspective, personal experience is actually the weakest type of evidence. Critics have noted the unreliability of personal experience since writing was invented. About 2,500 years ago, the ancient Greek historian Thucydides warned: "Those who were eyewitnesses of the several events did not give the same reports about the same things, but reports varying according to their championship of one side or the other, or according to their recollection" (as cited in Schiff, 2010, p. 10).
4. Even when our eyes see clearly and our brain works well, what we directly see is only an infinitesimally small part of the objective world. What we can directly see and experience is not statistically significant. Our perception is unique. Our experience of a situation does not represent the experience of the average person. We are also not able to fully understand what we see. When a scientist tries to make a generalization about the complex objective world, he or she needs to gather a lot of evidence to filter out the idiosyncratic or random data, called outliers. Scientists use large bodies of data in order to determine statistically a valid generalization that matches the larger population or phenomena being studied. Statistical analysis is hard for us because our brains cannot easily comprehend statistical averages, and such knowledge is difficult to acquire (Kahneman, 2011).
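A brief sketch can make the point about outliers concrete. The numbers below are hypothetical, invented purely for illustration: a single extreme data point can drag an average far away from what is typical, which is one reason scientists gather large bodies of data and use robust statistical summaries rather than trusting a handful of individual experiences.

```python
import statistics

# Hypothetical daily commute times (in minutes) reported by nine people,
# plus one extreme outlier: a commuter stuck behind a highway accident.
typical = [22, 25, 19, 30, 27, 24, 21, 26, 23]
with_outlier = typical + [240]

# A single extreme data point drags the mean far from the typical value...
print(statistics.mean(typical))         # about 24.1
print(statistics.mean(with_outlier))    # 45.7

# ...while the median, a more robust summary, barely moves.
print(statistics.median(typical))       # 24
print(statistics.median(with_outlier))  # 24.5
```

The mean nearly doubles because of one unrepresentative observation, while the median stays put. Personal experience works like a sample of one: it cannot tell us whether we have observed the typical case or the outlier.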
5. Our personal experience is an awful basis for knowledge because our brains never work perfectly. Our brains are fundamentally flawed due to the evolutionary biology of our species. Some of these flaws are based on older evolutionary adaptations suited for a simpler way of life. Many automatic thinking reflexes bias our judgment in numerous ways. Psychologist Daniel Kahneman (2011) calls these biases "system 1" or "fast thinking" (p. 28). While automatic thinking reflexes may be very useful when avoiding predators or natural disasters, these natural biases interfere with our ability to really understand our situation. They usually make our conclusions unreliable and false (Thaler & Sunstein, 2008).
6. In order to control these automatic thinking reflexes, we have to deliberately slow our thinking down. We have to engage in what Kahneman (2011) calls "system 2" thinking, or what philosophers call critical thinking. We need to examine our thought process to become aware of the errors caused by the automatic reflexes of system 1 so we can correct them (p. 28). But even when we engage critical thinking, Kahneman’s research demonstrates that "biases cannot always be avoided" (p. 28). Our minds are naturally "gullible," "biased to believe" almost anything, and "lazy" when it comes to critically thinking (Popkin, 1994, p. 218; Kahneman, 2011, pp. 44-49, 81). Finance professor Burton G. Malkiel (2012) explained, “Understanding…how vulnerable we are to our own psychology can help us avoid [our own] stupid[ity]” (p. 258).
7. We take shortcuts whenever we can. This can sometimes be useful because it saves us time and energy. According to Kahneman (2011), our brains are naturally biased to "jump to conclusions on the basis of limited evidence," leading us to see the world with a WYSIATI framework: "What you see is all there is" (p. 86). Our brains often behave like a drunk looking for car keys in the dark: Instead of looking in the most logical places, which are dark, a drunk starts looking in the most improbable places because there is better light (Popkin, 1994, p. 218). While thinking shortcuts save us time and energy, they often produce unreliable information.
8. But automatic thinking reflexes are not the only problem with subjective experience. Our perception can also be clouded by emotions, such as fear or happiness. We literally cannot see clearly when we are overcome by strong emotion. Information presented in a highly emotional way will almost always take precedence over information presented neutrally (Popkin, 1994, p. 16; Kahneman, 2011). Our brains can also become damaged or chemically unbalanced in many different ways. Such distorted perception can cause people to hear voices, see blurred images, or smell non-existent scents. Because modern psychology has shown that personal experience is so unreliable, most modern courts of law would not convict a person on the basis of only a single eye-witness.
9. While we live our entire lives on the basis of our own personal experience, we rarely explore the flaws of our personal knowledge. For the most part, we don't need to because our brain is naturally tuned to the frequency of the objective world, and we generally don't need much knowledge on a daily basis to live our lives. And luckily, we maintain many social relationships while living in large, complex societies. Our personal experience and knowledge is moderated daily by the experience and opinions of others. The input from our community helps correct any errors we might make.
10. But our social networks can also help compound our thinking errors into larger cultural fictions that everyone in our culture accepts (i.e. the common sense of conventional wisdom). Most people have a limited grasp on the objective world. But we don't realize how little we know. We are either unaware of our own subjective thinking errors, or we are trapped in the collective deception of cultural common sense. We suffer from the "pernicious illusion" that we understand the world we live in (Kahneman, 2011, p. 201). As Daniel Kahneman (2011) has pointed out, "Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance" (p. 201).
11. The second type of evidence consists of documents, images, or other cultural artifacts. Anthropologists call these things material culture. These forms of evidence should also be considered unreliable because they are produced by individual human beings, largely on the basis of that individual's personal and cultural experience. However, documents and artifacts become more reliable when large numbers are analyzed and compared with each other. The triangulation of multiple artifacts helps filter out subjective or cultural bias.
12. Historians, for example, will pore over the major newspapers and books of a time period in order to compare the common elements of all the stories, which most likely (but not always) get down to the basic facts of an event. But in order to get a fuller picture of what actually happened, these basic facts would also need to be correlated with the diaries, letters, and other personal documents of people who were part of the event. The historian might also look at the actual geography of the place to see where the event took place. They might try to find historical artifacts from this geography, often cataloged in museums. Or that historian might be an archeologist who would dig beneath the ground to find new artifacts that might still be buried. The materials and technology used to make the artifacts can tell us much about the society in which they were created.
13. The third type of evidence is the survey. Thousands of years ago, various governments developed the census survey to keep track of the population for military, commercial, and tax purposes. The modern survey was developed in the late 19th century as a way to gauge the opinions and conditions of people in democratic countries, ostensibly to find out what social problems existed so the government or private charity organizations could fix them (Igo, 2007, p. 29; Ewen, 1996, p. 186). Historian Stuart Ewen (1996) explained, "The polling system and a burgeoning market research establishment would provide the channels through which the public would be known and then responded to" (p. 186).
14. The survey is simply a list of questions asked to a large sample group. A sample group is a random collection of data meant to represent the larger population of which it is a part. The survey was designed as a tool to accurately describe a large population according to specific categories, which were represented by specific questions on the survey. However, it became apparent to early social scientists who developed the survey that it was an "admixture of fact and opinion such that it cannot be classified as a thoroughly scientific study" (as cited in Igo, 2007, p. 34). This is why many scientists look down on survey data as a less than legitimate type of evidence. Behavioral economist Richard H. Thaler (2015) noted tongue in cheek, "To this day, the phrase 'survey evidence' is rarely heard in economics circles without the necessary adjective 'mere,' which rhymes with 'sneer,'" although Thaler pointed out that survey data can be very useful if appropriately filtered through statistical analysis, which we will discuss below (p. 47).
15. One problem with surveys is the unreliability of respondents. There is no way to verify the truthfulness of the answers people give. Sometimes people lie for various reasons. Therefore, because surveys rely on the subjective opinions of ordinary people, they suffer from the same basic flaws of personal experience as a form of evidence. But surveys are also flawed by the common sense or theoretical assumptions of the authors who design them (Igo, 2007, pp. 54, 59; Ewen, 1996, p. 188). These assumptions not only influence what questions are asked (or not asked), but also the definitions of important words separating one category from another. These assumptions can often affect how questions are asked. Leading questions are loaded with particular assumptions, and they push people to answer in a particular way in order to manipulate public opinion. Ewen (1996) has pointed out, "Carefully worded questions could be crafted in order to elicit any desired response" (p. 188). Thus, the validity of surveys must be measured against the questions asked, the theoretical or cultural assumptions behind those questions, and the sample size and composition of the population polled. Generally, surveys give an interesting but highly unreliable indication of public opinions, especially if the sample is biased (see example); thus, surveys should never be taken too seriously. A survey should also be statistically analyzed to check the quality of the data.
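The effect of a biased sample can be demonstrated with a small simulation. The population and support rate below are hypothetical, chosen only to illustrate the principle: a random sample tracks the true opinion of the population reasonably well, while a sample drawn disproportionately from one group badly distorts the picture.

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

# A hypothetical town of 10,000 people, 30% of whom support a new policy.
population = [True] * 3000 + [False] * 7000
random.shuffle(population)

def pct_support(sample):
    """Percentage of a sample answering 'yes' to the survey question."""
    return 100 * sum(sample) / len(sample)

# A random sample of 500 tends to land near the true 30% support rate...
random_sample = random.sample(population, 500)

# ...but a biased sample -- say, polling mostly outside a supporters'
# rally, so 400 of 500 respondents are supporters -- inflates the estimate.
supporters = [p for p in population if p]
opponents = [p for p in population if not p]
biased_sample = random.sample(supporters, 400) + random.sample(opponents, 100)

print(round(pct_support(random_sample)))  # close to 30
print(round(pct_support(biased_sample)))  # 80
```

The biased poll reports 80 percent support for a policy that only 30 percent of the town actually favors, which is why the composition of the sample matters as much as its size.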
16. The fourth type of evidence is the interview. Like the survey, this type of evidence is highly flawed because it rests on subjective experience and opinions. However, structured interviews are always better than surveys because the interviewer can ask the interviewee follow-up questions to get more elaborate and nuanced answers. An interviewer can also read the body language of the interviewee to capture clues about the reliability of the answers given. Also, conducting an interview in an embedded setting, like interviewing teachers at the school where they work, can lead the researcher to reevaluate questions based on the environment. Direct knowledge of a local environment can open up new lines of inquiry that the investigator may have overlooked as an outsider.
17. One of the biggest logistical problems with interviews is that they take a lot of time and effort to conduct and transcribe. It is difficult to get more than a handful of interviews for a research project, a circumstance which greatly limits the researcher's ability to generalize the data. However, the in-depth answers of interviews compensate for this disadvantage. To get the fullest picture of public opinion, whenever possible, it is best to combine large scale surveys that are statistically analyzed with in-depth interviews. This way you can combine statistically valid generalizations of the whole population with rich specific details from the unique experience of individuals.
18. The fifth type of evidence is field research, which comes in many different forms depending on the population or phenomenon being studied. To name a few: the study of living human cultures is called ethnography, the study of extinct human cultures is called archeology, the study of animal culture is called biology or sociobiology, the study of extinct animals is called paleontology, the study of inanimate ecosystems is called geology, and the study of living ecosystems is called ecology. Field research is similar to laboratory research (evidence types 8 and 9), except that field research is messy because it is conducted in uncontrollable "natural" settings (Hammersley & Atkinson, 2003, p. 6; Diamond & Robinson, 2010). In a natural setting, the researcher can see how variables interact with each other within the dynamic environment where they naturally occur.
19. There are trade-offs between natural and laboratory settings. In laboratory settings, scientists study isolated variables in the controlled environment of the laboratory; doing so can be very insightful because scientists concentrate on only a few interactions between a small number of variables. But in the real world, myriad variables interact with each other in dynamically changing environments. Also, when dealing with conscious organisms that can modify their behavior (like people, chimpanzees, or dogs), a laboratory setting can often induce highly unusual behavior in the test subject, which could bias the results. In a natural setting, organisms are likely to behave as they normally would under routine circumstances; as a result, researchers may get a more accurate and complex understanding of such behavior. However, if the researcher becomes too conspicuous in the natural setting, this intrusion too could change the behavior of the subjects and bias the data.
20. A researcher in the field will describe, measure, and categorize various phenomena. For example, a biologist might watch monkeys all day, recording their size, movements, relationships, and habits. Anthropologists study people rather than monkeys, but they observe the same kinds of factors. This process of collecting data is limited, however, by the fact that researchers are "outsiders" to the natural setting (Geertz, 1973/2000; Geertz, 1983/2000; Hammersley & Atkinson, 2003). Scientists do not understand the setting like an "insider" would; hence the need to conduct interviews with insiders whenever possible to get the insider point of view. Of course, this technique requires knowledge of insider language, and it obviously doesn't work with animals, which greatly limits our understanding of animal behavior and culture. Field research is very valuable because it combines structured observation of the subject with the analysis of the environment, artifacts, and, where appropriate, surveys, interviews, and statistical analysis of the data.
21. The sixth type of evidence is statistical research, often called quantitative data. Statistics is a branch of mathematics that has become one of the main scientific methods for collecting and analyzing data. There are two different forms of statistics: descriptive and inferential. Descriptive statistics take a large amount of data and organize it into defined categories. This form of statistics simply describes the data in a few relatively simple categories and expresses it with numerical values, like whole amounts, percentages, or fractions.
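As a concrete illustration, here is a minimal Python sketch of descriptive statistics. The survey data are invented for illustration only:

```python
import statistics

# Hypothetical survey data: hours of sleep reported by 12 respondents.
hours = [6, 7, 5, 8, 7, 6, 9, 7, 6, 8, 5, 7]

mean_hours = statistics.mean(hours)      # the average value
median_hours = statistics.median(hours)  # the middle value
# Express part of the data as a percentage: respondents sleeping under 7 hours.
share_under_7 = sum(1 for h in hours if h < 7) / len(hours)

print(f"mean = {mean_hours:.2f} hours")       # mean = 6.75 hours
print(f"median = {median_hours} hours")       # median = 7.0 hours
print(f"under 7 hours: {share_under_7:.0%}")  # under 7 hours: 42%
```

Nothing here infers anything beyond the data itself; it merely summarizes the twelve answers in a few numbers, which is all descriptive statistics does.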
22. The second form of statistics is much more complex. Inferential statistics take two or more variables and try to determine two basic things: whether the variables are connected and how strong that connection is. The connection between variables is called "correlation." If there is a strong correlation between two variables, then where you find X you will usually also find Y. For example, when you find people who smoke two packs of cigarettes a day, you also expect to find lung cancer because there is a high correlation between these two variables. The strength of a correlation falls somewhere between random and perfect. Random implies there is no connection at all, while a perfect correlation means that when you find X you always find Y. Most correlations fall somewhere between these two poles. If the relationship between two variables is strong, then a scientist would say that this relationship is "statistically significant," which means that, according to the statistical mathematics, the connection is very unlikely to be a product of chance.
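Correlation can be computed directly. The sketch below uses the standard Pearson correlation coefficient; the eight data points are invented for illustration, not real medical data:

```python
import statistics

# Hypothetical data for eight people: packs smoked per day (x) and a
# made-up "lung damage" score (y). The values are invented for illustration.
packs = [0.0, 0.5, 1.0, 1.0, 1.5, 2.0, 2.0, 3.0]
damage = [1.0, 1.2, 2.1, 1.8, 2.6, 3.1, 2.9, 4.2]

def pearson_r(xs, ys):
    """Pearson correlation: the covariance of x and y divided by the
    product of their spreads. The result always falls between -1 and +1."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(packs, damage)
print(f"r = {r:.2f}")  # close to +1: a strong positive correlation
```

A value near +1 means the two variables rise together, a value near -1 means one falls as the other rises, and a value near 0 means no linear connection at all.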
23. Statistics is a strong form of evidence; however, it has its limits. First of all, in order to demonstrate a statistically significant relationship, you need to have a lot of good data. A small study would need at least a couple hundred data points, while a large study would have at least 1,000. The larger the study, the better. You need a lot of data because the world is a complex and diverse place, so you need a large set of data to show that the connection you found is not a random fluke. The mathematics behind this point is very complex, but basically a statistical analysis done with fewer than a hundred data points is unreliable and should not be taken seriously.
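Why sample size matters can be seen by correlating two variables that have no relationship at all. With only 10 data points, sizable correlations appear by pure chance; with 1,000, those flukes largely wash out. A small seeded simulation (all data here are randomly generated, so the "variables" are unrelated by construction):

```python
import random
import statistics

random.seed(42)  # seeded only so the illustration is reproducible

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def random_correlation(n):
    """Correlate two completely unrelated random variables of size n."""
    xs = [random.random() for _ in range(n)]
    ys = [random.random() for _ in range(n)]
    return pearson_r(xs, ys)

# With 10 data points, unrelated variables can look noticeably correlated.
small = [random_correlation(10) for _ in range(5)]
# With 1,000 data points, chance correlations stay near zero.
large = [random_correlation(1000) for _ in range(5)]

print("n=10   :", [f"{r:+.2f}" for r in small])
print("n=1000 :", [f"{r:+.2f}" for r in large])
```

The larger samples hover near zero, as they should for unrelated variables; the tiny samples drift much further from zero purely by accident, which is exactly the "random fluke" problem described above.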
24. Second, how you get your data is also important. Good data will produce reliable statistical conclusions, but bad data produces statistics that are either meaningless or misleading. Economist Charles Wheelan (2013) explains, “If the data are poor, or if the statistical techniques are used improperly, the conclusions can be wildly misleading and even potentially dangerous” (p. xiv). For statistics to work properly, you need a "random sample" of the population that you are studying. This requirement creates two common problems, both of which can reduce the reliability of statistical results. The first problem is the definition used to label and characterize a population (Wheelan, 2013, p. 38). Who are "smokers," or "Asians," or "youth," or "middle-class housewives," or "non-traditional students"? What are the characteristics that all members of this supposed group share, and what happens if a test subject has only some of those characteristics – does this subject go in or out?
25. Answering these simple questions is problematic. If I want to study the voting preferences of Asian Americans, who do I study? Asian immigrants? Only those immigrants who have passed a citizenship test, or all of them? Documented and undocumented? Adults and children? What about the daughter of a third-generation Asian man who married an Irish woman? What about a second-generation son of immigrants from India? India is part of South Asia, but are Indians considered Asians? In order to study a phenomenon or group, you need strict definitions to make sure you are studying only the phenomenon you have identified. This can be a very difficult process.
26. But even more difficult is selecting a "random sample" of this phenomenon or group. Finding such a sample of your defined population is the second most common problem. If the group were finite and small, you could collect data on every member. But most of the important issues that need to be statistically analyzed are very large, complex phenomena with a huge number of potential members. It would be impossible and extremely expensive to interview every American, or every resident of New York, or every student at your college, or every 24- to 32-year-old mother in the United States with a 4-year-old son. How would you even start to look for this population of mothers? And what makes you think they all want to participate in your study?
27. So because we cannot feasibly measure all the members of the group we want to understand, we must choose to measure only some of those members. But which ones? If we want to understand the whole group, then we need a "random sample" that represents the whole group. Instead of interviewing all residents of New York, we call every 200th person in the phone book until we reach the end. Instead of interviewing all students at our college, we randomly pick students from every academic discipline: we get an official list of students from the registrar's office, broken down by major, and we interview every 25th student on the list.
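The two sampling strategies just described can be sketched in a few lines. The roster and sample sizes below are hypothetical:

```python
import random

# A hypothetical registrar's list of 2,000 student IDs.
roster = [f"student_{i:04d}" for i in range(2000)]

# Systematic sample: take every 25th student on the list.
systematic = roster[::25]  # 2000 / 25 = 80 students

# Simple random sample: 80 students, each with an equal chance of selection.
random.seed(7)  # seeded only to make the example reproducible
simple_random = random.sample(roster, k=80)

print(len(systematic), len(simple_random))  # 80 80
```

A systematic sample (every Nth entry) only approximates a truly random one: if the list happens to have a hidden pattern that repeats every 25 entries, the systematic sample will still be biased, whereas `random.sample` gives every student the same probability of being chosen.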
28. If the sample is not "random," then the data is "biased," and this will lead to faulty conclusions. If we talk only to New Yorkers who are walking down Madison Avenue, then we will most likely get a skewed sense of the people of New York. The same is true if we talk only to students in the Art department when we really want to know about all the students at the university. When you are constructing a statistical research project, as Darrell Huff (1993) has pointed out, you are always "running a battle against sources of bias" in your data (p. 24).
29. Once you have a specific set of phenomena to study, and you have gathered a random sample of data, then you crunch the numbers to see how connected your variables are. But here is a third major problem with statistics: there is never a perfect connection between any two variables, and there are always outliers in every set of data. If there is a correlation between two variables (when people smoke there is also a higher likelihood of lung cancer), this connection is never perfect. Thus, a statistical correlation is always expressed as a fractional number where 0 equals no connection and 1 equals a perfect connection. Some people who don't smoke also get lung cancer, and some people who smoke two packs a day never get lung cancer. These people are called outliers because they don't fit the general pattern of connection. In general, the more a person smokes, the more likely they are to get lung cancer. Let's say, for argument's sake, that around 9 out of every 10 heavy smokers develop lung cancer. Roughly speaking, this would be a 0.9 correlation, which is a very strong connection. But approximately 10 percent of heavy smokers still don't develop lung cancer, so our correlation is not perfect.
30. There are special statistical tests to determine the strength of a correlation. A correlation with a lot of outliers is very weak, while a correlation with only a few is very strong. Fewer outliers mean that most of the data fit the general pattern. More outliers mean that a lot of the data do not fit the general pattern, in which case there may not actually be a pattern at all. The strength of a statistical correlation is usually expressed in terms of probability, which can be quantified using a measurement called the standard error. Because no correlation is ever perfect, there will always be some outlying data. These outliers are considered a type of "error" because they are evidence against the pattern we are trying to prove.
31. Because there are always outliers, our statistical conclusions will always be expressed as a probable range of outcomes. As Charles Wheelan (2013) explains, “Statistics cannot prove anything with certainty” (p. 144). If there are a lot of outliers, then the standard error will be larger, and the larger the standard error, the less accurately we can predict whether new data will support our pattern. Here are two examples. Taking what we assumed about smoking and lung cancer, if I were a doctor and I had a patient who smoked heavily, I could say that this person has approximately a 90 percent chance of getting lung cancer. That is a very probable, but not completely certain, outcome. Now what about the link between heavy smoking and heart disease? This correlation is only around 0.3, or 30 percent, which means that the majority of heavy smokers are outliers and there is no strong general pattern. So, if I were a doctor, I could not tell my patient that he or she will probably develop heart disease, but there is a moderate, 30 percent chance, which is still significant. Probable knowledge is the best that statistics can provide.
32. Finally, there is a fourth major problem with statistics. Just because you have established that two things are highly correlated does not mean that one causes the other. For example, Steven Levitt (2009) found an inverse correlation between abortions and crime: more abortions are connected to less crime; fewer abortions are connected to more crime. But this finding does not mean that the presence or absence of abortions causes crime, or vice versa. It is much more complicated than that.
33. Moreover, just because two things are highly correlated does not mean that there aren't many other correlated variables, and it doesn't mean that the correlation explains anything, because the connection may just be random. For example, while abortions and crime are correlated, there are many other factors that are also correlated with crime. Levitt (2009) found that increased numbers of police and innovative policing strategies were also correlated with decreased crime, but the correlations were not strong. Some variables that people thought were connected to reduced crime (like increased use of capital punishment or increased carrying of concealed weapons) actually had no connection at all. And sometimes two variables that have nothing to do with each other can be randomly connected. For example, did you know that the spending of the United States government on science is almost perfectly correlated with suicides by hanging, strangulation, and suffocation? What does this mean? Absolutely nothing! The connection between these variables is completely random and meaningless. Just because two or more variables are correlated does not mean they are actually connected by a cause-and-effect process in the objective world. We call these random flukes "spurious correlations." Statistics is a limited but powerful form of knowledge. It can tell us with great precision how connected variables are, but it cannot tell us how those variables fit together, nor which variables cause other variables to act in certain ways.
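A spurious correlation is easy to manufacture: any two series that merely drift in the same direction over time will correlate strongly. In this sketch both series are invented and causally unrelated; they correlate only because both happen to trend upward:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two fictional, causally unrelated series over ten years: both simply rise.
spending = [100 + 5 * i + (i % 3) for i in range(10)]  # made-up "spending"
deaths = [50 + 2 * i - (i % 2) for i in range(10)]     # made-up "deaths"

r = pearson_r(spending, deaths)
print(f"r = {r:.2f}")  # very close to +1, yet the link is meaningless
```

The high correlation is a byproduct of a shared time trend, not of any cause-and-effect process, which is exactly why a strong r by itself proves nothing about causation.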
34. If prepared correctly, statistics are a strong form of evidence. However, statistics are often not done correctly. Bad statistics can be the result of human error, or they can be due to deliberate manipulation by the unscrupulous. There are many ways to lie with statistics (Huff, 1993). Just because you see statistics does not mean that you have good evidence. Statistical charts and graphs often have serious errors, the most common being a small sample size or a biased sample. If you go looking for a connection between car crashes and alcohol and only conduct research in bars, then you have a very biased sample that does not reflect the population at large. And even when the statistical mathematics is done correctly, bad data will create absurdities. Charles Wheelan (2013) warned, “Some of the most egregious statistical mistakes involve lying with data; the statistical analysis is fine, but the data on which the calculations are performed are bogus or inappropriate” (pp. 117-118).
35. Statistics are also dangerous because people are naturally bad with numbers and statistical reasoning (Popkin, 1994, pp. 72-73; Kahneman, 2011, p. 77); thus, audiences can be easily manipulated by the magic of numbers they don't understand. This also leads to a natural bias against mathematics and statistical reasoning. Most people will not easily understand the true value of statistics. They will often discount statistics if given a countervailing, emotionally powerful narrative that connects with their personal experience (Kahneman, 2011, p. 174). To compensate for this natural bias against statistical reasoning, statistics have to be fully explained with words to a general audience in order to be effective. Statistics are most effective, and more easily remembered, if they are explained using a cause and effect story.
36. The seventh type of evidence is the reasoned sequence of ideas. In the 21st century, this type of evidence is usually expressed in mathematics, a highly specialized form of knowledge that relies primarily on deductive reasoning. We all know that 2 + 2 = 4 and that 4 + 4 = 8, and further, that 4 is 1/2 or 50% of 8, while 2 is 1/4 or 25% of 8. These are logical truths that are independent of the objective world, although many mathematicians and natural scientists, like Galileo Galilei, have claimed that mathematics is the language of nature. Galileo was one of the first astronomers to use a telescope to collect data about the universe, which he used to modify existing theories about how the universe was organized. As a reward for his painstaking contribution to science, the Catholic Church denounced Galileo's work, banned his books, and forced him to recant his findings under threat of torture. Galileo (1623/1957) claimed that the laws of the universe were written in the "language of mathematics" and the shapes of "geometric figures, without which it is humanly impossible to understand" the natural world (p. 238).
37. Mathematics is a language based on quantification (measuring phenomena using numbers), computation (adding, subtracting, dividing or multiplying), and logical relationships, like geometry and calculus. Most scientists use mathematics because it is the most precise and unbiased language we have to describe the objective world. However, most people do not understand even simple mathematics; therefore, many think that math is magic. When they see numbers or equations, most people make the dangerous assumption that these numbers or equations are automatically true. When performed correctly, mathematics is one of the most reliable types of evidence we have; however, math always needs to be explained in common words and/or with common examples so that general audiences can understand it.
38. The eighth and ninth types of evidence are different forms of the controlled scientific experiment: the standard controlled experiment and the random and blind controlled experiment. A scientific experiment uses a theory to predict how particular variables interact to produce an outcome. In order to test the theory, a scientist will isolate the variables in a controlled laboratory setting so as to limit any interference by other variables that could produce unforeseen consequences. Consider, for example, an experiment about how refined carbohydrates affect weight gain. To truly isolate the effect of refined carbohydrates, the test subjects would have to be kept in a hospital for weeks or months at a time and fed a strict diet. If you only surveyed people's eating habits, you would have to trust that the test subjects aren't lying and that they know the exact quantities and types of food they eat, which most people don't. The only way to study the specific variable of refined carbohydrates in relation to human weight is to completely control the diet of a test subject, a difficult and costly process.
39. The random and blind scientific experiment adds quality-control mechanisms that make the data much more reliable than data produced in a laboratory without these controls. Random means the test subjects are chosen indiscriminately so as to represent a general population; a properly random study avoids sampling bias. Blind means that the test subjects do not actually know the purpose of the study; therefore, they cannot consciously interfere with the study in any way. A double-blind experiment means that neither the test subjects nor the researchers collecting data know what the experiment is about. Researchers who are looking for particular types of data are more likely to find it, either because of a concentration bias (i.e., they are concentrating so hard on finding something that they are more likely to see it, or think they see it) or because they deliberately falsify data to confirm their theory. The deliberate falsification of data happens more often than scientists like to admit. A double-blind experiment greatly reduces these risks.
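The mechanics of random assignment and blinding can be sketched in code. Everything below (subject names, group sizes, the coding scheme) is a hypothetical illustration, not a real trial protocol:

```python
import random

random.seed(3)  # seeded only so the example is reproducible

# Twenty hypothetical volunteers.
subjects = [f"subject_{i:02d}" for i in range(20)]

# Random assignment: shuffle, then split into treatment and control groups,
# so the researcher makes no choice about who goes where.
shuffled = subjects[:]
random.shuffle(shuffled)
treatment, control = shuffled[:10], shuffled[10:]

# Blinding: everyone handling the trial sees only anonymous codes; the key
# linking codes to subjects is withheld until the data are collected.
codes = random.sample(range(1000, 10000), k=len(subjects))
blinded_key = {subject: f"code_{code}" for subject, code in zip(subjects, codes)}

print(len(treatment), len(control))  # 10 10
```

Because neither the subjects nor the data collectors can tell from a code who is in which group, neither side can consciously or unconsciously steer the results.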
Handout: Question the Evidence
40. It is important to understand the advantages and limitations of various types of data. You need to analyze the arguments of others to find their strengths and weaknesses. Then, you need to construct better arguments for yourself. When you critically analyze a source, you need to do more than just understand the conclusions an author makes. You need to understand how the author arrived at those conclusions and if those conclusions are logically justified by the data. And further, you need to investigate how the data were created and analyzed to make sure that the research was done correctly, and to see if there are any flaws or limitations to the data.
41. Remember, no argument and no scientific study is perfect. There are always flaws and limitations. Relying on flawed sources of data means making decisions based on unreliable or false information. Relying on flawed sources could lead you or others into danger, so be careful and critical. Never trust a source unless you fully investigate all the claims, evidence, methods for gathering and analyzing the data, and conclusions. A good rule of thumb: assume every argument is false until it can be proven true with substantial and valid evidence.
Handout: How to Read a Scientific Journal Article
Commentary: How to Read a Scientific Paper
Ask the Experts: How to Read a Scientific Paper
Rhetoric: Understanding the Rhetorical Context
1. One of the most important philosophical discoveries of the 20th century was the idea that truth cannot be passively "discovered," and that once discovered, truth is not readily accepted by all. Early philosophers and scientists had assumed that truth was easy to find, and that once found, everyone would simply agree on the truth because it would be obvious to all. These early philosophers and scientists had naive beliefs about the simplicity of the natural world, the powers of the human mind, and the existence of a divine order, which is why they thought knowledge was so easy to find and prove. The universe turned out to be a lot more complex than people realized, and scientists discovered that our brain and cultures had many flaws that influenced, and sometimes distorted, our knowledge.
2. In the 21st century, we now know that "truth" or "facts" have to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. Further, truth is never obvious to all people and truth does not fix anything by itself. One has to argue for the truth in open debate in order to convince a skeptical public, which includes arguing about how to use the truth to solve specific problems. Debating with others about truth means both arguing for the truth and demonstrating how it is true with valid logic and evidence. It also means arguing against false opinions, manipulations, and lies. 21st century literacy entails not only being able to construct knowledge with scientific methods, but also openly arguing with diverse groups of people to explain and prove the truth. This is never an easy task.
10.1 Rhetoric: Persuading an Audience with Arguments
3. Rhetoric is a very old tool that humans have been using for thousands of years. For as long as there has been language, people have been using words to convince others to accept particular beliefs and act accordingly. The ancient Greeks were one of the earliest cultures to devise rules for speaking persuasively to an audience. This was largely due to the democratic nature of their political system, which required citizens to speak knowledgeably and persuasively in order to influence public policy. The ancient Greek philosopher Aristotle (1995) wrote Rhetoric, the foundational textbook on persuasion in the ancient world. In this well-known book, Aristotle taught his pupils how to engage in "political debate" by arguing persuasively about different topics and making reasonable "judgments" (p. 2153). But rhetoric involved more than just an argument being made by a speaker. It also involved creating a "persona," which helped the speaker appeal to an audience (p. 2155). Aristotle's three parts of rhetoric have become known as the "rhetorical triangle": text, speaker, and audience. But as I'll explain in this chapter, the rhetorical context is much more complex than Aristotle originally thought.
4. Up until the 19th century, most scholars and speakers understood rhetoric in the simple formula Aristotle had first conceived back in the 4th century B.C.E. But our understanding of rhetoric would greatly change in the 20th century. The American philosopher Kenneth Burke formulated a new conception of how rhetoric works. Burke (1950/1969) explained how difficult it is for a speaker to persuade an audience with good reasoning, and how it is even more difficult to "move people" into action (pp. 41, 42, 46). Most speakers fail to move their audience. It is so difficult, in fact, that many, if not most, speakers cheat by using dishonest language in order to "mystify" their true purpose (Beach, 2012, pp. 79, 81).
5. Speakers will often try to appeal to their audience using plain-spoken language and common sense appeals. This tactic supposedly proclaims, in an honest and direct way, "no rhetoric here" (Beach, 2012, p. 79). But Burke warned that this tactic is dishonest because the speaker is still employing rhetoric. Rhetoric always shapes our communication (Beach, 2012, p. 79). Even simple and clear language can be used to manipulate people. Simple language often only appears clear when, in fact, it disguises hidden assumptions and manipulative agendas. Burke was one of the first philosophers of language to delve into the revealing and concealing nature of language. He argued that we need to focus on what people don't say just as much as on what they do say. Likewise, we can detect the motives and truthfulness of speakers as much from how they speak as from what they say.
Handout: The Rhetorical Context
6. When we communicate, we share not only our personal beliefs about the world, but also our cultural world-view, often unconsciously. Aristotle paid little attention to culture because the ancient Greeks were mono-culturalists. They thought that Greek civilization was superior to all other cultures, and that all non-Greeks were inferior "barbarians." The Greek word barbaros was an antonym for politis, which meant a citizen of a Greek city-state. If citizens of Greece were good, then everyone else was bad and was treated accordingly. From birth we are shaped by our culture (parents, teachers, priests, politicians, media, and peers) to see the world in a very particular way. This cultural "common sense" is reflected in the ideas we talk about (or don't talk about), the language we use (or don't use), and the behaviors we engage in (or don't). We usually don't see or understand this cultural process until we meet someone from a different culture or travel to a different country. When this happens, the familiar becomes strange and we learn to see the world in a new way.
7. When we talk about our knowledge, we often use "loaded" language that reflects our own cultural biases of what we consider "normal." Our common sense knowledge usually consists of stereotypical truths. As the journalist Walter Lippmann (1922/1997) once explained, we use language to create and reinforce cultural stereotypes. We use "a small vocabulary" to "express a complicated world" (p. 18), and thereby, we "represent" the world as "a simpler model" that we can use to "manage" our lives and navigate our environment (pp. 10-11). Stereotypes make us feel safe because they take "the great blooming, buzzing confusion of reality" and "order" it with simplistic and familiar cultural categories (p. 63). Lippmann (1922/1997) pointed out, "the way we see things is a combination of what is there and of what we expect to find" (p. 76). Stereotypes can be quite comforting, but they can also create conflict and lead us into trouble.
8. While we would like to think that our 21st century civilization is much more advanced than the ancient Greeks, in fact, we are still constrained by the same problem that even the wise Aristotle could not see or understand. We are all brought up to think that our culture and values are "normal," while other cultures and values are weird or wrong. This attitude creates a lot of unnecessary conflict, and it is a constant cause of violence. Kenneth Burke helped revise the old mono-cultural concept of rhetoric to fit the multi-cultural realities of the modern world. Thus, we need to revise Aristotle's rhetorical triangle to include the cultural influences of the speaker and the audience, and also of the text itself.
9. Burke (1941/1973) argued that we use language in "strategic" ways to achieve specific purposes (pp. 1, 109). A good speaker should use rhetoric honestly by using open and appropriate language to make a direct argument with sound reasoning and evidence. Open rhetoric allows the audience to understand an argument in a clear way so that they can weigh reasons and evidence in order to make up their own mind. But we must always guard against speakers who use rhetoric dishonestly. These speakers "mystify" their argument with language tricks and lies, thereby manipulating the audience and coercing them to do things they might not want to do. We will discuss these tricks, which are called "fallacies," later on in this chapter. First, we need to understand Burke's revised rhetorical triangle, which I will call the rhetorical context for argumentation (see chart below).
10. We need to investigate the "rhetorical context" of an argument in order to understand what people say, why they say it, and whether or not it is true. Likewise, if we are going to argue effectively to an audience, we need to first understand the rhetorical context of our argument. Knowing this context allows us to present our ideas clearly and effectively.
11. Unlike Aristotle, we now know that the first and most important part of the rhetorical context is our own culture. This cultural context is often taken for granted by many speakers because it seems like "common sense" that is shared by all "normal" people in a particular community. However, we are now more aware of our globalized world, which is full of diverse, multicultural countries. Every nation has lots of "sub-cultures," which are different ethnic, political, and religious groups – all of which hold different beliefs, practice different customs, and usually speak different languages. We are born into the nation and sub-culture of our parents, but as we grow up, we make decisions to accept or reject our parents' culture. We also can voluntarily join new cultures, adopting new beliefs, practices, and languages.
12. Our personal acceptance, or rejection, of cultures and sub-cultures shapes our individual identity and character, making us who we are. Social scientists call this subjectivity, our own personal world view. We partially create ourselves as "subjects." We also partially create our views of the world. As individuals, we have the power to accept and reject the subcultures in which we are raised. We use our subjectivity to understand our world, create knowledge, and communicate with others. As speakers, we present our subjectivity as a "persona," which is the particular role we play when we argue in front of an audience. Often, we adopt different personas for different audiences. For example, a person might present one persona at church, a different persona at home, and a drastically different persona with friends at school. We often take it for granted that our audience will understand and accept our persona and will share our "common sense." These assumptions are not always true. Sometimes our audience will reject our persona as disingenuous or even fake, thereby cutting off any hope of communication. Thus, before we begin to communicate with an audience, we first need to explore who we are (our subjectivity) and where we are coming from (our culture).
13. Once we understand our cultural background, beliefs, and identity, then we need to investigate these same qualities in our audience. This is the second part of the rhetorical context. Who are we speaking to (subjectivity)? Where do these people come from, and what beliefs do they hold (culture)? Understanding an audience requires some research into their culture, history, and geography. Speaking to one's local community is relatively easy because you already know a lot about the audience. But what about a diverse audience at a university lecture in Los Angeles or an art gallery in New York? What about the audience for a talk show interview on national television? And how about speaking to a group of businessmen in Hong Kong or to government officials in Paris?
14. You have to know who you are talking to in order to decide how to communicate effectively: What language should you speak? What words should you use (or not use)? What topics should you discuss (or not discuss)? How much should you explain? What kinds of examples should you use? What cultural assumptions might your audience share that differ from your own? Every speaker fails to reach the whole audience because it is impossible to anticipate the background of every single individual in the audience. You will always be disliked and/or misunderstood by someone. But a good speaker will make an effort to understand the basic cultural characteristics of the audience so as to communicate with the majority of listeners. There always needs to be some modification of one's persona and one's text in order to reach a particular audience more effectively. If you are locked in your own subjectivity and culture, and if you try to speak to all audiences with the same persona in the same biased way, then you risk being rejected and misunderstood.
15. Once you have analyzed both yourself and your audience, then the final part of the rhetorical context is the text you are composing. Now, many might consider the text to be the most important part of any argument, but it is not. First, you need to understand yourself before you start writing. You need to critically analyze your subjectivity and culture. This background will affect your choice of topic, thesis, main ideas, logic, and evidence, not to mention the language you use and the assumptions you make. So you need to make sure you know where you are coming from and why you are speaking, especially if you are creating an academic or scientific text that will demand a high degree of objectivity. Second, you need to understand your audience before you start writing. Who will you be speaking to and how will you persuade these people to accept your thesis and evidence? It is only after you have investigated both your subjectivity and your audience that you are ready to research your topic and outline your essay or speech. But even writing your text involves culture – the culture of the text, or "intertextuality."
16. We never write or speak in isolation. What we say is always connected to what other people have said before us. Thus, we need to be aware of not only what other people have said about our topic (the conversation), but also how they said it (genre), and how previous audiences accepted or rejected those arguments (history). All of this is called intertextuality because this historical context is preserved in other types of texts, which you will need to research before writing. Most students don't realize that they need not only to research their topic, but they also need to research the long conversation that people have been having about this topic. Some of the oldest and most important topics on religion, government, or art might involve researching several thousand years’ worth of data from cultures all over the world!
17. You need to be aware of intertextuality for three main reasons. First, you need to know what others have said. You want to avoid past errors, build on agreed-upon facts, and, hopefully, say something original that will drive the conversation forward. Second, you need to know about intertextuality so that you can better reach your audience. Many people in your audience will have already heard about your topic from a variety of sources. Some of these people might be very old and have a very large historical context, and some might be very young and have no context at all. You have to think about your audience's previous knowledge of the topic in order to situate your text within the long history of writing on your topic.
18. A third reason to be aware of intertextuality is "genre." This term refers to how you speak – the form of your writing or speaking. Most audiences will expect you to speak in a very specific way based on what they've heard before or the conventions of specific social contexts. Take for example the university lecture. You have to know what discipline you are working within so that you can follow the professional conventions and use the professional jargon that your audience deems essential and appropriate to the topic. If you do not meet these formal expectations of your audience, then they might consider you an amateur (or worse) and not listen to what you have to say. Likewise, speaking in verse at a political rally, or using prose at a poetry reading, might also get you into trouble with your audience. You need to know not only what to talk about (topic), but also how your audience expects you to communicate (genre).
Research Before Writing (Step 1): Preparing Secondary Research
1. 21st century literacy entails many higher order skills. You need to be able to critically evaluate the reliability of diverse sources of knowledge in order to construct knowledge with scientific methods. 21st century literacy also entails openly arguing with diverse groups of people to explain and prove the truth that you have found. But 21st century skills are built on the foundation of traditional literacy: reading and writing. This chapter will review the basic skills you will need in order to prepare and create an academic essay or speech.
2. Most students don't realize the fundamental key to all good writing: good research. Knowledge is the essential first step, as we have already discussed. Research can take many forms. But there is a single basic theoretical approach that will help you understand how to write about your research. The literary critic Kenneth Burke (1941/1973) described the marketplace of ideas as one vast "unending conversation" throughout human history (pp. 110-111). As you listen to your first lecture or read your first newspaper or academic book, you enter a long conversation, which has been taking place for years, decades, or sometimes even centuries. Academic experts, policy makers, and the public at large have been discussing important contemporary topics for a long time, way before you entered the conversation by picking up the newspaper. And most likely, a new generation will be discussing these same topics long after you have died.
3. Your job is to "enter" a conversation that you find interesting or important. You first participate as a listener (or reader) so that you can understand how the topic is being discussed: What are the issues at hand? What are the central concepts and keywords? What are the problems that need to be solved? What are the possible solutions? Have any already been tried, and if so, how did they work? How much money or resources are available? What do the experts say? You need to first listen to the conversation in order to become knowledgeable. Eventually, once you have some knowledge to share with the group, you can "join" the conversation and become a participant.
4. While knowledge is the all-important first step, most students want to rush past it. Ludwig Wittgenstein and Bertrand Russell were two of the most famous philosophers of the early 20th century. In an influential book (Wittgenstein, 1921/2011) that grew out of ideas the two exchanged, and for which Russell wrote the introduction, Wittgenstein warned that no one should ever rush into any important public conversation. In fact, he forcefully stated that if you don't actually have any real knowledge to share, then you should just sit silently. He made this principle the closing proposition of his book on human knowledge and the limits of science.
Opinion: The Ignorant Have No Right To An Audience
5. Quoting Wittgenstein, Russell (1921/2011) declared, "What can be said at all can be said clearly; and whereof one cannot speak thereof one must be silent" (p. xix). This rather simple statement contains several complex claims. First, a person must be knowledgeable before opening his or her mouth to speak. Second, one of the best tests of knowledge is clarity: Can you clearly explain what you're talking about? If you can't, you don't truly know. Many people try to hide the fact that they don't actually know what they're talking about, which is a form of lying. They use vague language, especially big, important-sounding words, like "society" or "religion," to make it seem as if they have knowledge. But they don't. They are just throwing around big words, as a smoke screen, without any understanding of the topic.
6. We call this vague, insincere nonsense "bullshit" (Frankfurt, 2005; Fredal, 2011), and it is the favorite tactic of many know-nothings, including students, teachers, politicians, and barflies. Frankfurt (2005) goes so far as to say that bullshitting is worse than lying! He argues, "The bullshitter...does not reject the authority of the truth, as the liar does, and oppose himself to it. He pays no attention to it at all. By virtue of this, bullshit is a greater enemy of the truth than lies are" (p. 61). If you know your topic, then you will be able to talk about it in clear and detailed language. If you can't, you don't have real knowledge. This brings us to Wittgenstein's final point. If you don't have knowledge, then keep your mouth shut. No one wants to waste time listening to lies or bullshit. If you don't really know, then say so, and seek out someone who does.
7. However, it is important to remember that these lessons are not meant for our everyday lives. These rules were meant to govern public speaking. If we tried to live up to these principles in our private lives, then most of us would walk around as mutes. No, these principles were meant specifically for scientists, and in a larger sense, for concerned citizens, professionals, politicians, and all who have a role and a responsibility in addressing public problems and shaping public policy (Feinberg, 2012, p. 17). These principles were certainly meant for you, too, as students entering professional disciplines in a research university. You are currently being trained to be a scientist, or you are being trained to be a professional who can understand scientific literature and use it to make informed decisions. Thus, you need to take these principles to heart.
8. So the question becomes: How do you acquire the knowledge you need in order to be able to communicate clearly with an audience? The answer to this question is the important subject of this chapter. First, let me turn to psychology to point out an important feature of our brain in relation to the acquisition of professional expertise. Daniel Kahneman (2011) has explained that while the skill and knowledge of a professional might seem magical, it is, in fact, a "large collection of mini-skills," which have been refined through "thousands of hours of practice" (pp. 238-41). It is also important to read information several times to produce "cognitive ease" (pp. 61-62), which helps with both understanding and memory. The more times you see and think about information, the more likely you are to understand it and remember it.
9. This chapter will introduce you to the skilled use of three important tools that professional researchers use to read, to fully understand what they read, and to prepare information for inclusion in a research paper or speech. Most researchers use all three of these tools as a single process. First, while you read a source, you should annotate as you go along. Then, you should skim the source again and re-read your annotation so that you can outline the thesis, main ideas, and major details of the source, especially noting the page numbers where important information is located. Finally, you should re-read your outline and write an abstract, which is a short summary (and sometimes analysis) of the source. Doing all three steps of this process with every major source will not only help you become more knowledgeable, but it will also help you remember this important information.
10. These three steps help produce knowledge, albeit a lesser form of knowledge because you are not actually producing anything new. Reading the knowledge of other people is known as "secondary research," which is not as important as "primary research." Secondary research means reading and summarizing the primary research of other scholars without conducting primary research yourself. The best form of knowledge comes from conducting primary research through the scientific methods discussed in a previous chapter of this book. For primary research, you directly design a research study with a methodology appropriate to the discipline you are working in, collect and analyze data, and draw conclusions from your data and analysis. As an undergraduate, you most likely will not engage in primary research until your junior or senior year, if at all. Most students aren't taught how to do real primary research until graduate school. Secondary knowledge is an important form of knowledge, and much better than common sense, but its validity should not be overstated. This form of knowledge is limited because what you "know" is basically a summary of other people's knowledge.
8.1 Annotation
11. The practice of annotation is as old as writing. It’s the process of reading a book with a pen in hand and underlining important passages or writing notes in the margins. It’s a way to leave footprints in the text so that you can easily and quickly find the thesis, main ideas, and key concepts of the text without having to re-read the whole thing. It is also a way to physically and intellectually have a conversation with the author by writing responses onto the text. Annotation is also known as "glossing" or "marginalia," which in effect are short "conversations" between reader and author, which the reader writes into the book.
12. Writing marginalia is an important exercise for a couple of reasons. First, it keeps you alert and engaged with the text, as if you were listening to a lecture and constantly looking to raise your hand to ask questions or make critical comments. Second, it helps you make focused comments or ask questions that get to the heart of the main ideas, which you might be able to use in your own writing. Third, by reading and re-reading important parts of a text, you are actively thinking about the information and building a longer-term memory of it, which will help you more clearly remember the text's important data or arguments. Finally, it is a time-saving mechanism. By writing notes on all of the important pages of the books, you can quickly and easily find the main ideas of the text for future reference, especially if you are going to use the book for years to come. I read many of the books on my shelf decades ago, and I simply don't have time to re-read every book when I need to use it again. But because I have left footprints in all of my books via annotation, I can quickly skim through each book to find all of the important parts within hours, instead of spending days re-reading the entire book.
8.2 Outlines
13. Outlines are another important tool, for both note-taking and thinking. Outlines are effective because they combine both written and visual communication, and they force a researcher to crystallize the thesis and main ideas of a text in short, clear language. What information goes on an outline? All of the important information in a text: topic, thesis, main ideas, some major details, and page numbers. You may also want to include important references cited in the text, which you might want to read to more fully understand the topic. I will show you a sample outline in the next chapter.
14. The basic task of an outline is to organize information, both thematically and visually. You break down a text into its fundamental parts so that you can identify all of the parts and understand how they all fit together to prove the author's thesis. Outlining can also help you identify the weaknesses of the text. Some authors don't fully prove each point with enough evidence. Some don't make clear points at all. Some main ideas may support the thesis, but some may be unrelated, which signals disorganized thought. Some points may not be documented, and some documentation may be sloppy. By organizing all of the important information on an outline, you are going part by part to analyze and evaluate the strength of the text. You are also demonstrating your knowledge of the text by clearly summarizing the topic, thesis, main ideas, and major details. This process of analysis, evaluation, and summary (activities that will be fully explained in later chapters) also helps you understand and remember the information of the text, which prepares you for using the text in your own research or writing.
8.3 Abstracts
15. The final stage of understanding and preparing your secondary research involves a short summary and analysis of each major source. This summary is called an abstract, and it should include properly formatted in-text citations and references. While this last stage might seem like a lot of extra work, there are many reasons for writing an abstract. First, by reviewing your annotation and outlines, you are consciously re-thinking the thesis, main ideas, and major details, which helps you understand and remember the material. Understanding and memory are also reinforced by turning the short, visually organized information on the outline into verbally organized information in sentences and paragraphs. Second, you can more fully summarize and analyze a secondary source via sentences and paragraphs than on an outline. Writing an abstract forces you to more fully explain and critique the information, which helps you understand it better. Finally, the abstract prepares this secondary information for inclusion in a longer research paper. While most scholars don't simply cut and paste abstracts into a research paper, you might be able to re-use some sentences or maybe even a paragraph.
16. In an abstract, you will have crystallized your understanding of a source in clear and organized language. When they go to write, many scholars re-read and summarize their abstracts (rather than re-reading the whole book or article). By focusing on the abstract, rather than the original text, a scholar has all of the important information clearly stated, while still having the verbal flexibility to write about that information in a way that fits the future organization and purpose of the research paper. Cutting and pasting the exact same language of an abstract is dangerous: these old sentences or paragraphs may break up the flow and coherence of the future research paper.
17. It is important to remember that annotation, outlining, and abstracts are just tools to help you understand and remember information. They are not finished products in and of themselves. Going through this three-step process is the most effective way to understand and remember all of the information you read, especially if you are reading many sources in preparation for a long research paper. The more work you put into the initial research stage, the better the final research paper will be. There is a reason academic articles and books take many months and years to write. It takes a long time to read and understand all of the secondary literature on any given topic, all of which has to be finished before a scholar can even begin to conduct primary research. The creation of knowledge is not an easy or quick endeavor.
How to Write (Step 2): Communicate Your Knowledge,
not Your Opinions
1. Once you have completed your research, then you are ready to move on to the next step, which is communicating your knowledge. But beware: this next step is tricky. You cannot just blurt out random bits of information from all the sources you just read. Nor should you ignore the majority of your research by relying on just a few sources. You need to review all of your research to decide what to use and how to use it. You should try to use most, if not all, of your research, which could mean using five, ten, twenty, or fifty sources. But how do you decide? The foundation for any successful writing (academic, business, or personal) is starting with a clear point, which is often called the "thesis." A thesis is a clear statement that your paper will prove true or false by the strength of your evidence, the quality of your reasoning, and the clarity of your prose. But even when you have a good thesis, writing a paper (especially a long and complex paper) is not an easy task.
2. Writing an academic paper takes careful planning, just like building a house. No contractor in his or her right mind would just start nailing boards together to build a house. Every contractor has to first produce a detailed set of plans, called blueprints, which describe the structure of every part of the house, the materials needed to build it, and the order in which parts have to be built. Before any concrete is mixed or any board is nailed, these blueprints have to be examined by architects, contractors, city officials, and, of course, the owners of the new house. You can know everything about a house by looking at blueprints. They outline the entire structure of the house and they show you how everything fits together. By looking at blueprints, an architect or city official can tell if the house will be solid, safe, and energy efficient, among other things. Once the plans are approved, the contractor follows the blueprints exactly and begins to build the house.
3. Writing an academic paper is not really different from building a house. As Stanley Fish (2011) has pointed out, writing is a “technical” activity: It allows us to “organize” and communicate our knowledge of the world (p. 7). Like a contractor building a house, a writer needs to gather resources, make a detailed plan, and then build. Only the materials are different. For writing, the first step (as described in the last chapter) is doing your research to see what others know, and don't know, about a topic. You need to make sure that you are fully knowledgeable about your topic before you begin to write. Once the research is complete, the next step is formulating a thesis statement, which is the clear point that you will prove true or false in your essay. The thesis is your purpose. It explains why you are writing.
4. Your thesis also allows you to make a detailed plan of action, the outline. The thesis tells you which sources will help you explain and prove your point and which sources will not. The outline organizes everything you are going to say in a logical order (and it is important to remember that there are many different logical orders, for example, from strongest to weakest point, or first event to last event). The outline also organizes your sources around the main ideas you will present in order to prove your thesis. The logical organization you have outlined is called the "structure" of your essay, just like the framework for a house.
5. Only after the outline is completely finished are you ready to write. You need to "think before writing," as the famous writer H. L. Mencken once explained (Teachout, 2002, p. 168). Most students dread writing academic essays because they rarely know what to say. But this problem occurs only if you don't have any knowledge about your topic, or if you haven't organized a plan. If you are knowledgeable about your topic, and if you plan an outline correctly, then "not knowing what to say" should never be a problem.
6. A good writer always knows exactly what to say because he or she has carefully researched a topic and outlined a plan in advance. With a detailed outline, the actual writing of your prose should be quick and easy because you have already organized every point you are going to make and every detail you will use to explain and prove your points. The next-to-last step in the writing process is sitting in front of a computer and typing up sentences and paragraphs. Once this work is finished, the last step is editing your essay to make sure you explained and proved your points as clearly and logically as possible.
9.1 Making a Point: Topic and Thesis
7. Many students don't fully understand the difference between a topic and a thesis. To fully understand these words, it is helpful to go back to their linguistic and social origins in ancient Greece. The word topic comes from the ancient Greek word topos, which meant "a place." The ancient Greek philosopher Aristotle used a variant of this word, topoi, to mean subjects that people commonly discussed or debated - conceptual places, or what we would call subjects. The word thesis comes from the ancient Greek thesis, which meant to take a position on a topic by making a statement that could be proven true or false with evidence and reasoning. The ancient Greeks believed that humans use language to publicly debate different points about important topics in order to try to persuade other people what is true and what is false, and thereby, how people should act on knowledge to create a better society (Burke, 1945/1969).
8. Public debating and persuading an audience is called rhetoric, from the ancient Greek rhetorike, the art of public speaking. Rhetoric is how you talk in order to communicate and move people to act. It is a repertoire of all the verbal tools a speaker uses to communicate effectively and to persuade an audience (Aristotle, 1984; Burke, 1950/1969). In order to talk about a topic, you need to first know about that topic (i.e. research), then take a position on that topic. Taking a position means communicating a thesis. A thesis is a statement about your topic which you will prove true or false through evidence, reasoning, and the effective use of language. Your ability to speak clearly, logically, and persuasively to an audience represents your knowledge and authority on the topic, which you publicly declare as a responsible citizen seeking to make your polis, or society, a better place – an activity the ancient Greeks called "politics." All knowledge is political in this way. We want not only to know, but also to communicate our knowledge. We use our knowledge to be responsible citizens, which includes trying to create a better society.
9. But how do you know what thesis point to make about a topic? The answer comes from your research. When you investigated the secondary research on your topic, you entered into a conversation of experts. These people have been debating what they know and what the public should do about that topic. There are a range of existing positions in any public debate. Your job is to understand these different positions and take a side for, against, or somewhere in between (see chapter 12, section 12.1 for ten thesis statement strategies). Rarely does someone add a completely new insight or position to a debate. Those who can add something new are sometimes called visionaries or geniuses because they make us see existing topics in a whole new way (think of scientists like Charles Darwin or Albert Einstein, or inventors like Alexander Graham Bell or Steve Jobs).
10. Most people take an existing position and refine that position in some small but meaningful way by adding new information or a new type of argument. For students, this is what you need to do. As you research, you need to choose a position that seems to be the most reasonable based on the evidence available. You will both agree and disagree with the various sources you read, so you should look for a way to contribute to the conversation with your new information or new argument. The "new" element that you bring to the debating table is your thesis, your point, your contribution that establishes you as an active participant in an important conversation.
9.2 Pre-write: Outline Essay
11. Once you have identified your topic and taken a position on that topic through a clear thesis point, then you are ready to outline your essay. The outline is an overview of the structure of your essay, which includes the major parts organized in a coherent and logical way. Every essay has the same basic foundational parts: topic, thesis, main ideas, details/evidence, and sources. These parts are organized into four major sections: Introduction, Body, Conclusion, and Reference Page. You've already found your topic and formulated your thesis. The next step is to articulate the supporting points (sometimes called main ideas) that you will need to explain and prove.
12. The supporting points are specific points that you will combine to prove your thesis true or false. Each main idea is a point about your topic that you will explain and prove with lots of detailed evidence in your body paragraphs. On your outline, you need to organize these main ideas into a logical sequence (what point needs to come first, second, etc.), and you need to review your research to find the best evidence to explain and prove these main ideas. You are unlikely to use all your sources, and you certainly can't discuss all the evidence that might be available. You must choose the best sources with the best evidence, and then logically organize this information around your main ideas. As you do this, you must make sure to indicate where you got each piece of evidence. Use in-text citations on your outline to save you time later on when you write the essay (see section on Citations below). If you can organize all of your evidence and citations on the outline, then you will have a lot less work when you actually write the first draft of the essay.
Handout: Sample Outline Format
13. The size and detail of your outline will vary depending on your skill as a thinker and writer. A professional academic writer will usually map out only the foundational parts of an essay, but not plan every paragraph. On the other hand, developing writers should not only plan the foundational parts, but also organize these parts down to a specific number of paragraphs and map out all the important parts for each paragraph.
Handout: Two Methods to Organize Paragraphs
14. This may be a lot more work in the short-term, but doing all this work saves you time in the long-term and will ensure an organized and fully-developed essay. By outlining every paragraph, you make sure to fully develop each paragraph with clear topic sentences, enough detail, and proper citation. It also allows you to see how each paragraph logically fits together into the whole essay. Knowing the full structure of an essay will help guarantee appropriate transitions (which can also be placed on the outline). Plus, when you have a fully developed outline, there will never be any “writer's block” on the first draft because the outline contains all of the information that will go into the essay.
15. Since developing writers should outline down to the paragraph level, it is important to distinguish between two different methods for organizing paragraphs. A paragraph is a self-contained unit within the larger whole of the essay, just like a house has several self-contained rooms that make up the larger whole. All paragraphs have the same basic function: they contain a point and the details and evidence needed to explain or prove that point. Thus, all paragraphs have the same basic parts: a topic sentence (the point), details (to support and explain the point), and a conclusion sentence (repeats the main point), as well as the transitions needed to connect these parts, and to connect each paragraph to the other paragraphs preceding and following it.
16. A simple paragraph contains a single main idea and all of the details needed to explain and prove it. But sometimes a point is very large and/or complex, and it will take multiple paragraphs to fully explain it. In this case, students should use the complex paragraph method, which breaks up one main idea into several paragraphs (see chart above). One paragraph could contain the main idea and some details, and then several paragraphs could follow with more details to explain and prove the same main idea. This method is especially useful if you are using scientific studies as your evidence. You might introduce your main idea and then go into one study in a single paragraph. Then you might take three more paragraphs to explain three more scientific studies, all proving the same main idea introduced in the first paragraph of this section of your essay.
Sample Student Argument Outline
9.3 Academic Essay: Reporting Knowledge
17. The academic essay has one specific purpose: to communicate knowledge. This information includes explaining what is known and unknown about a topic. It also includes arguing with other knowledgeable people in order to point out agreement or errors. But how does an academic writer know what is true or false, what is fact and what is error? This question is actually very important and difficult to answer, and it was discussed in earlier chapters on subjectivity, science, evidence, and fallacies. Right now, it’s important to remember that all academic essays are based on extensive research. This research consists of the data (also known as facts), which are produced through a vast array of scientific and humanistic research methods. These methods are ways of finding, gathering, and making sense of the data.
18. In this chapter, we will be more focused on how academics write, rather than on what they write about. We will discuss the major writing tools that all academics use to write professional research essays. The purpose of this chapter is to outline and explain these tools so that you can learn how to use them as you begin to write academic essays.
A. Citation
19. The most important and distinctive feature of academic writing is the citation. We use citation as a tool to indicate the source of our information. But it is also a way of rational thinking. For most of human history, it did not really matter where you got your information because there were only a few authoritative places to go for information about the world, and these sources were generally seen as public property. Thus, it was right and proper to simply quote an authoritative passage as if it were your own without any mention of the author or where you found the quote. Many cultures still operate this way. And how did you know if the quote was true or not? Well, no one bothered to ask such questions. Of course it was true; otherwise, why would anyone say it or write it down in the first place? For thousands of years, the basis of all truth has been authority, usually religious and/or political authority. If a king or priest or holy book or teacher said something was true, then it was true – case closed.
20. Although some early philosophers questioned traditional authorities and truths, like Socrates who declared "the unexamined life is not worth living" (Plato, 1997, p. 33), there was no systematic effort to really formulate a new way of knowing what was really true or false until the 17th century European Enlightenment and the development of a powerful new tool called the scientific method. It is important to remember that many early philosophers and scientists were hated and harassed by their society. The philosopher Socrates was put to death for asking questions that threatened the authority of powerful politicians, questions like what is really true or what is really good. Later, the scientist Galileo was threatened with torture and death for discoveries he had made that threatened the authority of the Catholic Church. The threat of excommunication, torture, and/or death has kept many people from questioning authorities and examining what is really true about the world we live in.
21. It wasn't until the 17th century that philosophers systematically investigated the nature of truth by measuring all claims against the new standard of empirical evidence, which is evidence that can be seen and confirmed through our senses (i.e., not taking something as true because of faith in doctrine or trust in authority). This time in history has been called the "Age of Enlightenment" because so many philosophers and scientists were writing and arguing about the truth and searching out new evidence. Echoing the earlier motto of Socrates, the French philosopher Denis Diderot proclaimed, "Everything must be examined" (Gay, 1995/1966, p. 142).
22. These 17th and 18th century philosophers were the first scientists. The practice of science grew out of philosophy, the original organized effort to investigate human knowledge. One of the new methods these philosophers developed was the footnote. This tool grounded truth claims in a source of rational or empirical argument, rather than in the edicts of religious or political authorities (Grafton, 1997). The footnote also demonstrated a method for thinking: it traced the origin of a cited fact or rational argument and placed that claim within the larger context of other writers, which demonstrated a dialogue or debate with other rational participants (Gay, 1995/1966, p. 176).
23. These philosophers were also very concerned about educating the public. They wanted to find ways to teach people how to know the truth and how to make more rational decisions (Gay, 1996/1969, p. 511). The footnote was a way to point out the important texts on a topic so readers could educate themselves by reading the larger debate. The footnote also demonstrated a new way of thinking and writing, which was grounded in the give-and-take of debate. This new intellectual style helped promote both rational thinking and the broader use of peaceful debate to discuss and solve social problems (Gay, 1995/1966, p. 176).
24. We still use this same basic tool of citation today, but now we have many different forms of citation beyond the footnote, and there are also many specific rules about how to properly present specific types of citations within specific academic disciplines. Today we have two basic forms of citation: (1) the footnote or end-note and (2) the in-text or parenthetical citation. Most scholars do not use the footnote anymore unless they are writing a book, and then the standard has become the end-note, rather than the footnote. Historians still regularly use the footnote in their research essays and books. But, most scholars use the in-text or parenthetical citation method. You may have noticed that this book has been using these in-text citations in order to indicate the sources for specific factual claims. I have been doing this in order to demonstrate how in-text citations should be used in academic writing.
25. I chose APA style because it represents a style of citation that most students will actually have to use in their professional lives. Few students study English or modern languages, so there is no reason to force students to learn MLA only to have to un-learn it later, and then learn a new style in their academic major. Chicago style is mostly used by historians and those who publish academic books for general audiences. APA is used mostly by psychologists and social scientists, but it is also similar to many of the citation styles used in the physical sciences, medicine, and engineering. And unlike these other citation styles, APA is widely incorporated into most writing handbooks.
26. The footnote or end-note is the oldest form of citation. This type of citation is still used to write academic essays in some disciplines, like history, and it is used when writing an academic book. This form of citation can come in two different places: the footnote comes at the bottom (or foot) of the page, and the end-note comes at the end of the essay, or book. A footnote or end-note will include all of the important bibliographic information about a source: author's name, title of source, publication information, and page numbers. But this form of citation can also include extra information, evidence, or sources that may help the reader more fully understand the topic. This form of citation is now rarely used because it takes up a lot of extra space on the page, making the page visually cluttered, and if there is lots of additional information, it can also make extra work for the reader. But this type of citation really gives readers all the bibliographic information they would need to become fully informed about a topic.
27. The in-text or parenthetical citation has the same function as the footnote/end-note, but this form of citation takes up much less space. It also goes inside of a sentence, hence the name "in-text" which refers to the location of this citation inside of the sentence structure, rather than "outside" the text, found at the foot of the page or end of the essay. I've been using in-text citations throughout this book, so you have already been exposed to several examples. This form of citation is used most widely today in academic essays because it does not break up the flow of the paragraph (unless there are several citations used at once). It also creates less work for readers because they do not need to stop reading to go down to the foot of the page or the back of the book to find the citation. In-text citations always carry much less information than footnotes, usually only author, date, and page number, and sometimes (when using MLA format), only author and page number. Finally, there will never be any extra information inside an in-text citation.
28. Whatever method you use, it is important to remember that an academic essay must always cite the source of all information discussed; otherwise you will be accused of plagiarism. This word refers to many forms of academic dishonesty or incompetence. First of all, plagiarism means that you intentionally stole or unintentionally borrowed the intellectual property of another person without giving the owner credit. But it also means that you are a sloppy researcher and writer. It leads your reader to believe that you don't really know what you are talking about because you didn't explain where the information comes from.
29. An academic essay must always discuss sources because you have to explain why the sources you are using are credible, authoritative, and factual. If you don't, then your reader will not trust what you say. You don't really know a fact is a fact unless you know both (a) the source of the fact and (b) that the source is credible and authoritative. Footnotes, end-notes, and in-text citations are different ways of alerting the reader to the source of the information you use in your essay. These tools enable readers (1) to know the broader conversation, (2) to find for themselves the information you referenced in the essay, and (3) to trust you as an authoritative researcher and writer.
B. Summary, Paraphrase, and Quoting
30. Most academic writing involves talking about what others have said, and there are three specific ways to do this important activity: summarizing, paraphrasing, and quoting. First we will discuss the art of summary, which is the most frequent tool that writers use. Then we will discuss paraphrasing and quotation.
31. A summary has a simple purpose: to restate the idea of another writer, but in fewer words. How do you do this? While the idea belongs to another writer, the words you use to explain the idea are your own. A summary needs to condense the idea down into a shorter amount of space without losing any of the original meaning. It also needs to accurately explain the idea without changing that meaning, for example, by adding something the author did not say, or leaving out something the author did say. In essence, you are trying to form a generalization based on the original author's essay. In order to do this, you first need to make an outline of what you read in order to pull out the thesis, main ideas, major details, and some minor details. Second, you need to focus on just the most important parts of the original text. You need to restate and explain the author’s general idea in your own words, without most of the original author's details. You will also need to locate the most important words an author uses so that you can quote these words on your outline.
32. Let us look at an example using the text of the Declaration of Independence. Remember, the purpose of an outline is to break information into core parts (main ideas, major details, some supporting details) organized around the most important part of any essay, the thesis statement. The outline helps you understand not only the point of the essay, but also how that point is broken down into main ideas and how those main ideas fit logically together to support the thesis.
A Sample Outline
Thesis: many “truths” are “self evident” for the people of the new nation: truths about the rights of the people and the duty of government (author, date, p. #)
1. “all men are created equal” (p. #)
2. “all men” are “endowed” by a “Creator” with “inherent and inalienable rights” (p. #)
3. Three of the rights are “life, liberty, and the pursuit of happiness” (p. #)
4. Governments
a. supposed to “secure” the people’s “rights” (p. #)
b. “instituted among men” (p. #)
c. gets “just powers” from “consent of the governed” (p. #)
5. A fourth right of people
a. if government is not securing rights (p. #)
b. people can “alter” or “abolish” government (p. #)
c. people can make a new government that will protect rights and create “safety and happiness” (p. #)
33. There are three important pieces of information that you should note when you read a text: the author, the title of the text, and the date it was published. The next piece of information noted on the outline is the thesis statement, which is the main point and unifying idea of the whole text. Sometimes thesis statements are hard to identify and sometimes they are very easy to identify. Jefferson’s text began with a clear subject-verb-object clause (“We” “hold” “truths”), and this clause ended with a colon ( : ), which signaled that a restatement or a list would follow. It is clear that Jefferson is providing a list of “truths” because the rest of the abridged text uses the word that to begin five separate ideas, and each ends with a semi-colon ( ; ), except for the last, which ends with a period.
34. The outline reproduces the core concept with key words of each main idea. You will also notice that the 4th and 5th main ideas had multiple parts, which required further subdivision. I broke these larger ideas down into smaller supporting points, indicated with the lower-case letters. Within the thesis statement, I demonstrated my categorization of Jefferson’s truths by breaking down this main point into two categories: “rights” of people and duties of government. I was able to more accurately and clearly explain Jefferson’s ideas with this categorization.
35. After the outline is complete, the next step is to summarize the important information that you have just identified and organized. As already discussed, a summary is a condensed explanation of the main ideas and overall purpose of a text; it restates and condenses the author’s main ideas. However, in order to summarize well, two additional tools are required: quoting and paraphrasing.
36. A quotation is simply a reproduction, word for word, of what an author said. There are three basic rules that should guide when to use quotations. The first rule: Try to quote only those words or phrases that are either clearly important and central to the text or unique to the author. A quote should be a special word, which cannot be paraphrased. If you can paraphrase the idea or word, then do so. But be on the lookout for special words that cannot really be paraphrased because the language is a subjective value judgment, unique, or artful. Jefferson’s phrase “all men are created equal” is an important part of the text, and it has a lot of historical significance, so we would probably want to quote it, or at least the words "all men" and "equal," which are the most important value words. Jefferson also used two unique words that express subjective value judgments, inherent and inalienable, to describe rights, so it would seem appropriate and detail-oriented to quote these two adjectives. The second rule: do not quote too much. If you quote too much, your voice or your coherence might get lost in the author’s words. Ideally, about 5% or less of the total words used in your essay should be quotes, unless there is some specific reason why you need to quote more, like being a historian who wants to reproduce the specific words of important people. The third rule: you must explain difficult, unclear, vague, or imprecise words used by the original author. For instance, I wanted to quote “inherent” and “inalienable,” but these words need some explanation. My summary sentence might sound like this:
Jefferson argued that all men had "inherent" and “inalienable” rights that could not be separated from them as human beings.
37. The second tool is paraphrase. This tool is a precise restatement of another person’s ideas, explained and clarified using your own words. Most of the time, we need to quote and paraphrase together because the meaning of an author's words is not always clear. While a summary is always shorter than the original, a paraphrase is almost always longer because it seeks to fully explain and clarify the author's idea and wording.
38. For instance, Jefferson declared a short phrase that is highly ambiguous, “We hold these truths.” To explain and clarify this statement, I could state,
Jefferson believed the new American government and its people should share certain foundational “truths,” which defined the core principles of the new nation (Jefferson, 1776/1995, p. 341).
Notice that as I begin to summarize, I also include in-text citations so that I do not forget them in the final draft of the essay. As you may have noticed, my paraphrase is much longer than the author’s original words. Sometimes an author will say something complex or profound, and it will take more space to explain exactly what he or she might have meant. When we paraphrase, we often substitute certain words or phrases for the original text so that we can more clearly explain the meaning to a contemporary audience. For instance, Jefferson used the word hold and I interpreted that to mean both believed and shared based upon my understanding of the overall text. We might also add the historical context, which affected Jefferson, but which he did not directly mention in the text because he took it for granted. Thus, I mentioned the new American government in my summary, even though Jefferson did not specifically talk about the new type of political system that he and the other founding fathers were creating. A summary is always focused on what an author said, but it sometimes also needs to include background information, which an author often leaves out because contemporaries within the shared historical context take it for granted.
39. Students or professionals write summaries so they can accurately and concretely restate another person’s ideas. As writers, our primary purpose is often to converse with other people. We read what others have said and we respond to their ideas. Often this exchange of ideas takes place via writing instead of talking directly to a person. Summary is an important part of the conversation process. We need to not only understand other people’s ideas, but we also need to be able to restate them clearly to show that we understand what has been said. As writers, we usually want to respond to, criticize, or argue with what other people have said; however, before we can get to that next stage, we first have to be able to restate other people’s ideas accurately and clearly. We summarize to show that we know what others have said, and to show that we fully understand their claims and evidence. We also need to show that we are honest and trustworthy writers by not misrepresenting others’ ideas, facts, or opinions.
40. When you write a summary, you also need to be clear about when you are speaking (as the summarizer) and when the original author is speaking. To be clear, you need to continually remind the reader by stating the name of the speaker (“Thomas Jefferson said” or “Jefferson argued”), or by using third person pronouns (“He went on to say”). You can also use the first person "I" to refer to yourself ("I think he meant"). There are some basic rules about how to refer to an author. Once you have introduced an author’s first and last name, it is customary to refer to that person, male or female, throughout the rest of the essay by his or her last name or a third person pronoun. Never use an author’s first name unless you know that person intimately, and even then, it is rarely done. The other point to be aware of is time. If you are summarizing a voice from the past, you need to use past tense verbs. If you are summarizing a voice from the present, then use present tense verbs. This seems obvious, but many students refer to long dead authors as if they were sitting next to them in the room.
41. The last point about summarizing is that you should not make any judgments or criticisms about the author’s work. When you summarize, you are simply restating and clarifying what an author has said. You need to understand the author and his or her ideas from the author’s point of view and/or historical context, which may be different from your own. When you summarize, you must refrain from inserting your own reactions to what you read. Imagine that you are a reporter and that your job is simply telling the audience what happened and what someone else has said. The point of summarizing another person’s ideas is so that you can understand that person and his or her point of view. Now, of course, everyone wants to talk back to an author by making a comment, a criticism, or asking a question, but these insertions should always come after you summarize. We discussed how to respond and argue in a previous chapter.
42. I would like to end this section with a sample summary paragraph of Jefferson. Pay particular attention to how I used summary, paraphrase, and quotations. Also, notice how I follow my outline exactly, as I turn each part of my outline into one or two sentences to fully explain each point. Finally, note how I end the summary with a concluding point (in italics) where I insert an argumentative claim, which could then serve as a thesis for a longer argument that would follow my summary.
Thomas Jefferson (1776/1995) wrote The Declaration of Independence in 1776 in order to declare the birth of a new nation. He set forth the proposition that many “truths” were “self evident” about the rights of the American people and the duty of their new government (p. 341). He stated first that “all men are created equal” and that they were “endowed” by a “Creator” with “inherent and inalienable rights” (p. 341). Some of those rights included “life, liberty, and the pursuit of happiness,” but Jefferson made it clear that the people had many other rights as well, although he did not name these rights (p. 341). Jefferson then went on to discuss the role of the new American government. He declared that governments were supposed to “secure” the rights of the people (p. 342). Because governments were made by human beings for the protection of human beings, they should get their “just powers” from the “consent of the governed” (p. 342). The people’s consent would be a way to keep the government responsive to the people it is supposed to serve. Jefferson ended by stating a fourth “right” of the people, which, in essence, asserted that people can “alter” or “abolish” their government at any time if the government is not protecting the rights of the people (p. 342). If the government is not securing the rights of the people, then the people can make a new government that will protect their “rights” and work towards their “safety and happiness” (p. 342). Jefferson’s Declaration made the foundational rights of American citizens and the core role of the government very clear; however, it is not clear at all if the federal government has ever completely lived up to Jefferson’s idealized principles.
43. You should have noticed that my stated purpose was to accurately summarize the thesis and main ideas of Jefferson's text, and thus, I did not critically analyze, question, or disagree with any of his points until the very end of the last sentence. I inserted this last argumentative point to show how you can use summary to lead into an argument; however, if it were not my intention to argue, then I could simply delete this last point to keep the focus on Jefferson. It is important to separate the activity of summary from the very different activities of critical analysis and argument. An accurate summary should always come before an argumentative claim so that the reader can see that you are both knowledgeable and fair-minded.
C. Charts & Graphs
44. Charts and graphs are an important component of almost all academic essays. These tools help visually organize and display large amounts of quantitative (numerical) information in a small amount of space. Charts and graphs efficiently organize a lot of information, which both helps the writer prove a point and helps the reader comprehend that point much more quickly and easily than reading a lot of words. To put a half-page table into sentences and paragraphs would take at least one or two pages of text. Charts and graphs are often based on statistical research, which relies on the quantitative data of numbers. These numbers represent many different empirical examples of the objective world that were collected by a researcher. A claim backed up with a lot of data is generally going to be much stronger and more credible than a claim backed up with only a few data points or no data at all. A chart or graph can quickly communicate to a reader how much data a researcher collected and how significant that data is.
45. Charts and graphs are a form of statistical analysis. As already discussed, there are two different forms of statistics: descriptive and inferential. Most charts or graphs used for a general audience will use descriptive statistics. This simple form of statistics takes a bunch of data and organizes that data into specific categories, which are expressed with numerical values, like whole amounts, percentages, or fractions. The second form of statistics is more complex and rarely used for general audiences because it is not as easy to understand. Inferential statistics take two or more variables and try to answer two basic questions: are these variables connected, and how strong is that connection? The connection between variables is called "correlation." If there is a strong correlation between two variables, then where you find X, you would also find Y. The strength of a correlation is somewhere between random and perfect. Random implies there is no connection at all, while a perfect correlation means that when you find X you always find Y.
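The strength of a correlation is conventionally reported as a coefficient between -1 and 1, where 0 means random (no connection) and 1 or -1 means a perfect connection. As a rough illustration of the idea, here is a minimal sketch that computes Pearson's correlation coefficient for two small data sets; the variable names and numbers are entirely hypothetical, invented only to show the calculation:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation: the covariance of x and y divided by
    the product of their standard deviations."""
    n = len(xs)
    mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)
    return cov / (statistics.stdev(xs) * statistics.stdev(ys))

# Hypothetical data: hours studied (X) versus exam score (Y)
hours = [1, 2, 3, 4, 5]
scores = [52, 60, 71, 80, 88]

print(pearson_r(hours, scores))  # very close to 1: a strong positive correlation
```

With these made-up numbers, more hours studied almost always means a higher score, so the coefficient comes out very close to 1 (a strong positive correlation); reversing the trend in the scores would push it toward -1.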
The Economist Explains How to Design Charts for Instagram
46. I will not explain how to create inferential statistics in this book because it is too complex for most undergraduates, both in terms of the foundational mathematics and in terms of the computer software required to perform statistical analysis. In this chapter, I want to focus on descriptive statistics, which are much simpler. Descriptive statistics can be constructed by all academic researchers and they can be understood by all audiences. There are four major types of descriptive statistics: the Table, the Bar Graph, the Pie Graph, and the Line Graph.
47. The Table is simply an organized list of numerical data broken down into categories. Why would you use a table? A table uses space more efficiently than trying to explain all the information with words alone. Now just because you use a table does not mean that you don't still have to explain what is on the table and how to read it. But you don't have to explain all of the data on the table because that would defeat the purpose of the table. You would first need to explain the point in a clear title at the top of the table, and then you would pick out some of the most important pieces of information to explain in the body of a paragraph.
48. The Bar Graph, Pie Graph, and Line Graph are also effective ways to display a lot of data in a small amount of space. These charts and graphs take the raw, organized data from a table and display part of the data as a picture, which makes it easier to understand. The Bar Graph displays two types of information for comparison. Along the bottom of the chart is a horizontal list of variables, each of which is linked to a vertical scale of numerical values on the left side of the chart. The audience will quickly see the differences in height between the variables, which spatially represent the different numerical values. The Pie Graph is used to represent percentages of a whole. A subject is broken down into its different constituting characteristics or parts, which are represented by different size slices of the pie, each of which corresponds to a percentage of the whole. The Line Graph comes in simple and complex forms. All line graphs have one or more variables represented by a line that goes across the graph. The vertical axis is usually some numerical value, and the horizontal axis is usually a period of time. A simple line graph will trace one variable over time to show how it increases and decreases. A complex line graph will do the same with several lines so the reader can compare and contrast the different variables over time.
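Because a pie graph simply represents percentages of a whole, the descriptive statistics behind it are easy to compute. Here is a minimal sketch that turns raw category counts into the percentage slices a pie graph would display; the survey topic, categories, and counts are hypothetical, invented only for the example:

```python
def pie_shares(counts):
    """Convert raw category counts into percentages of the whole,
    rounded to one decimal place, ready to label pie-graph slices."""
    total = sum(counts.values())
    return {category: round(100 * n / total, 1) for category, n in counts.items()}

# Hypothetical survey: citation style used by 200 students
counts = {"APA": 110, "MLA": 60, "Chicago": 30}
print(pie_shares(counts))  # {'APA': 55.0, 'MLA': 30.0, 'Chicago': 15.0}
```

The same percentages could label a table or a bar graph; the pie graph simply draws each percentage as a proportional slice of the circle.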
D. Editing
49. The last step of all types of writing is editing. There is no secret to good writing, just as there is no secret to being good at any other human activity. As already mentioned earlier in this book, success does not depend on "a single skill but rather a large collection of mini-skills" (Gladwell, 2008, p. 238). Every successful person perfects a core group of mini-skills with around 10,000 hours of sustained practice in a specific domain. It’s no different for writing. If you want to write a good essay, it takes practice and hard work.
50. The final stage of the writing process is one of the most important, and one of the most overlooked, especially by students. Nothing is ever perfect, or even very good, when you first put it together. A seasoned craftsperson will always critically analyze and refine the first attempt, whether it is an essay, a book, a sculpture, a house, or an iPod. Steve Jobs, the founder of Apple, described the greatness of his company in much the same way as I am describing the writing process: "We would start off with a version and then begin refining and refining...It's a lot of work, but in the end it just gets better" (as cited in Isaacson, 2011, p. 419). For writing, most of the refining process takes place in the last stage: editing. Once you have said everything you planned to say on your outline, you need to read over your essay carefully to see what you actually wrote and to look for places where you could be more clear, persuasive, logical, or concise. And, of course, you want to look for any errors you might have made in terms of grammar, sentence structure, and word choice. Every word and piece of punctuation matters, and every error we miss can cause embarrassment, or much worse. Did you know that the state of Maine lost over $250,000 because lawmakers forgot a single comma (Gonzales, 2018)?
51. But there is an important trick to editing well. You must learn to see differently and think differently. So how do you see your writing differently? When you write, your brain knows what you intend to say, but your eyes often see what your brain is thinking, rather than what is actually on the page. You will often make mistakes of syntax, grammar, and word choice without realizing it.
52. If you try to edit while your brain is still in "writing mode," then chances are you will not be able to see your mistakes because your brain will still be reading what it thinks it said, rather than what it actually said. So, the first trick to editing is taking some time, preferably a lot of time, to forget what you wanted to say, in fact, to forget the whole essay altogether. Doing so allows you to come back, say a day or a week or a month later, and then read your writing with a new pair of eyes, a pair of "reader's eyes," rather than "writer's eyes." And while you are reading, you will also need to think differently. You will need to use the dorsal stream of consciousness to really concentrate hard on the words you've actually written and what those words are actually saying (Lehrer, 2012, p. 134). Then you need to compare what you did say with what you intended to say, and make adjustments accordingly. You also need to look out for all the small errors of syntax, grammar, and word choice that you may have made. Fixing these small errors is often called proofreading or copy-editing, which refers to the proof or copy draft of a piece of writing, the very last draft before publication.
53. The other trick that professionals use in the editing stage is criticism. Many people have negative feelings about criticism, but professionals in every field know that criticism is one of the most important and helpful skills that ensures success. Every professional writer seeks out knowledgeable readers to criticize their work. The best way to see your writing differently is to literally get another pair of eyes to look at it. Scientists in particular rely on a critical community to analyze not only the writing itself, but also the data and logical conclusions being presented as truthful.
54. As you edit your own writing, you should also seek out other writers or professionals in your field to read your work as well. These critics will not always find all your errors. In fact, sometimes critics will see errors that aren't in fact problems at all. Regardless if these critics are correct or not, a critical community allows you to see your writing from new points of view. These diverse views will enable you to develop your paper more completely and clearly for a wider audience. Research has shown that people who subject their ideas to critical debate are often much more creative and successful than people who do not (Lehrer, 2012, pp. 159-63).
55. Editing is not easy or fun. In fact, it is the hardest and most uncomfortable part of writing. But it is important. It is perhaps the single most important skill that separates the talented novice from the accomplished professional. You cannot ever become a strong writer without first becoming a strong editor. And further, you cannot become a strong writer without trusted critics. You need to seek out other trusted writers and professionals in your field to ask for critical feedback. But asking for criticism is not enough. This leads to the final trick of professional writers. You must learn how to graciously accept criticism and use it constructively. This is especially true when you don't like what others have said or when you don't agree with the points they make.
56. Pixar is one of the most successful movie studios of all time, but even it has made serious errors along the way. The secret is to learn from your mistakes. As Lee Unkrich, a member of the Pixar creative team, explained, "If it feels easy, then you're doing it wrong. We know that screw ups are an essential part of what we do here. That's why our goal is simple: We just want to screw up as quickly as possible. We want to fail fast. And then we want to fix it" (as cited in Lehrer, 2012, p. 69). Every professional makes mistakes. Every first draft is filled with errors. If you want to be a successful professional, then you need to learn how to do the hard work of editing so that you can find your mistakes and learn from them.
How to Make a Claim (Step 3): What’s the Point?
1. Before you can make an argument, you first need to thoroughly investigate your topic and find the facts. Remember, truth has to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. However, many very smart people forget that truth alone doesn't do anything. The truth must be used to solve real-world problems. But in order to use the truth, you first have to convince other people that your facts are actually true. Then you have to convince these same people that you know how the truth should be used to solve the problem. You will have to argue for the truth in open debate in order to convince a skeptical public. Debating with others about truth means both arguing for the truth and demonstrating it with valid logic and evidence. It also means arguing against false opinions, manipulations, and lies. 21st century literacy entails not only being able to construct knowledge with scientific methods, but also openly arguing with diverse groups of people to explain and prove the truth.
2. Writers and speakers have many ways to make an argument, but to simplify the process, I will explain one basic approach that most academics use. This process has three main parts: the literature review, the formal argument, and the proposal for action. The literature review surveys what has already been said about a topic. Its purpose is, first, to organize relevant information into connected categories, preferably focused on a single explanatory theory, which provides the logical connections. Second, by outlining what has already been said, the literature review allows an arguer to come up with a new, original argument, hopefully one based on new evidence.
3. Once this evidence is collected and organized, a new argument can be written. At the end of most formal arguments is a proposal for some type of action. In the academic world, the proposal is often focused on new lines of research to explore unexplained areas or to re-test problematic theories or experiments. In the professional world, the proposal usually seeks to address some practical problem and present a solution. In the political world, the proposal asks the audience to think about the argument presented, agree with it, and then engage in some sort of political action, like voting or donating money. It is important to remember that all arguments are practical tools to move audiences into action, but the proposed actions will vary depending upon the genre of argument, which itself depends upon the composition and context of the audience.
12.1 The Literature Review
4. Before you speak in public, it is important to know what has already been said about a topic so as to add something new to the existing conversation. In a scientific setting, it is important to understand what is known and what is not known, so you can test new theories and find new evidence. The literature review is a tool to organize and communicate the existing conversation on a topic. Previous information or debates need to be categorized, and these categories need to be logically connected in some way. In order to come up with categories and logically connect them, you need an explanatory theory, which provides a basic framework for how all the parts fit together and towards what end. The theory also provides key concepts and a technical language to explain those ideas.
5. Depending on the setting and how serious your formal argument is, a literature review can be very short, say a few pages in an undergraduate research essay, or it can be very long, like 50 pages in a PhD dissertation. But regardless of the length, every formal argument must first explain the context of the argument in the introduction. This context includes who has said what about a topic and why, organized into clear positions on the topic. Your audience cannot fully understand your own claims and evidence unless you first provide this larger context.
6. The first part of a literature review is a clear explanation of the broad topic. You need to define the large topic and break it down into parts. You want to explain what people already know about the topic, what people don't know, and the points of controversy or disagreement. You also want to cite the major researchers who have helped produce the most important knowledge on the topic. Depending on the audience, you may also want to introduce these academics and their research history as well. Remember, explaining a topic includes both the information and the specific people who have produced that information, which also includes introducing specific published work, like the titles of articles and books. You may also want to explain the influence of specific authors or works on the field of study.
7. Next, you want to move from the broad topic and multiple positions of the conversation down to a specific focus. You will join members of an existing position and narrow down to focus on a specific issue or problem that has been discussed. You will usually agree and disagree with the existing arguments to uniquely position your own point. This narrow domain will be the specific topic of your paper.
8. Once you have established your topic and positioned it within the existing literature, then you need to articulate your point to argue. This is your thesis, an argumentative point that you will prove with evidence. Your thesis should address this specific issue or problem you identified in the literature review. Your thesis should seek to either fully explain what is already known, or add some new knowledge. However, adding new knowledge is incredibly difficult. This step is usually expected only for graduate students and working professionals.
Entering Academic Debates: 10 Strategies
1. Kiss Ass
2. Piggyback
3. Leapfrog
4. Peacemaker
5. Pick a Fight
6. Take on the Establishment
7. Drop Out
8. Crossbreed
9. Discover New Evidence
10. Create New Theory
9. There are several ways to construct a thesis point in relation to the existing scholarly conversation. Mark Gaipa (2004) explained eight specific strategies for entering into an academic debate. I will be organizing these strategies into three groups and I’ll add two additional strategies Gaipa overlooked. Some strategies are focused on agreement with other scholars. First, you can Kiss Ass. You can agree with an established scholar and explain how his or her theory and evidence are correct. Often this strategy is used against a critic who has attacked the established scholar with whom you agree. Second, you can Piggyback an established scholar, standing on the shoulders of an intellectual giant. You agree with a scholar but then expand on his or her work by adding a new idea or new piece of evidence, or by applying the existing theory or evidence in a new way. Third, you can Leapfrog a scholar, which is also known as biting the hand that feeds you. Here you agree with an established scholar, but then point out a problem, missing evidence, a contradiction, an inconsistency, or an error in the existing research.
10. Other strategies are more antagonistic. The fourth strategy is playing the Peacemaker. If there is an academic fight, then you can enter into the middle of the fray and try to resolve the debate between two or more scholars. This strategy can entail praising the strengths of each scholar while also criticizing their errors. A fifth strategy is Picking a Fight. You can critically analyze a scholar's argument and prove how it contains errors or how it is completely wrong. The sixth strategy is very similar, except you are criticizing a group of scholars, or perhaps a whole discipline. This strategy is called Taking on the Establishment. You attack a serious error of fact or theory, which a group of scholars has uncritically accepted as true. You should be very careful with such a strategy, as you will most likely anger a lot of people by attacking their work.
11. The final strategies focus on originality. The seventh strategy involves Dropping Out of an existing debate to focus on an issue, theory, or piece of evidence that everyone else seems to have overlooked. Sometimes this involves bringing back the work of an older scholar who has been ignored due to falling out of fashion. The eighth strategy is Crossbreeding. If you have knowledge of multiple disciplines then you can take the theories or evidence from one field and use it to solve a problem or revise a theory in another field. This technique is often the source of exciting intellectual developments. The ninth strategy grows out of Kissing Ass or Dropping Out. You can use the theories of established scholars to Discover New Evidence, which helps to redefine a field or solve existing problems. The final and most difficult strategy, as the philosopher Thomas Kuhn (1996/1962) has pointed out, entails Creating a New Theory, or making a paradigm shift. It is rare, but sometimes a brilliant scholar will see the world in a new way and propose a completely new theory for understanding objective reality. Examples include Isaac Newton, Charles Darwin, and Albert Einstein.
12.2 The Formal Argument
12. A formal argument seeks to prove a central claim, or thesis, true or false. An informal argument might just explore a topic or claims without any definite conclusions, but a formal argument seeks to clearly articulate and fully prove a point. Thus, a formal argument requires many supporting claims (the grounds) and much valid evidence to prove these claims. A formal argument also requires a logical sequence to connect these supporting claims together in an organized way so as to better prove the thesis. Finally, a formal argument also seeks to propose some course of action in the conclusion. Proposals could include further research, a re-test of existing experiments, a solution to a problem, or an appeal to the audience for some type of action.
13. In order to compose a formal argument, you need to know all of the major parts and how these parts logically fit together. The most important part of a formal argument is the main claim or thesis, which is a statement that can be proven true or false. The purpose of your whole argument is to explain and prove the thesis. In order to do that, you need to make a series of supporting claims, which are often called the grounds of an argument. A ground consists of a supporting claim, evidence to prove that claim true or false, and sound reasoning to explain the evidence and connect it logically to the supporting claim. Each of the grounds can be considered a mini-argument that is a self-contained unit. These grounds need to be logically connected in some organized sequence so as to fully support and prove the thesis of the whole argument. Finally, the whole formal argument rests on a warrant, which is a principle, value, ideology, or scientific theory.
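The nested parts described above can also be pictured as a simple data structure. The sketch below is only an illustrative model (the class and field names are my own invention, not standard terminology from any software library): a thesis rests on a warrant and is supported by an ordered list of grounds, each bundling a supporting claim, its evidence, and its reasoning.

```python
from dataclasses import dataclass, field

@dataclass
class Ground:
    """One mini-argument: a supporting claim, the evidence that proves it,
    and the reasoning that connects the evidence to the claim."""
    claim: str
    evidence: list
    reasoning: str

@dataclass
class FormalArgument:
    """A thesis proven by an ordered sequence of grounds, resting on a warrant."""
    thesis: str
    warrant: str
    grounds: list = field(default_factory=list)

    def is_complete(self):
        # A formal argument needs at least one ground, and every ground
        # needs evidence to back its supporting claim.
        return bool(self.grounds) and all(g.evidence for g in self.grounds)

# Invented example: one ground supporting a thesis.
argument = FormalArgument(
    thesis="Smoking leads to heart disease.",
    warrant="Public health claims should rest on empirical evidence.",
)
argument.grounds.append(Ground(
    claim="Smoking correlates with heart disease.",
    evidence=["large-scale statistical studies"],
    reasoning="A consistent correlation across large samples supports a link.",
))
```

The point of the model is simply that the grounds are self-contained units: remove the evidence from any one of them and the whole argument is no longer fully proven.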
Handout: The Academic Argument
14. Everyone is biased in some way, which is to say we all have principles, world views, and values that inform who we are, what we think about, and what kind of world we want to inhabit. It is important to be open and honest about the foundational principles of your argument so that you can explain and justify these principles to your audience. Not every audience will accept your principles or theories, but in being honest and upfront about your world view and why it is justified, you give your audience a choice to listen to your argument or not. As a developing academic arguer, you are being trained in a professional form of argumentation that is predicated on certain foundational values, like truth, honesty, respect, and freedom. Because you are bound by these values, they affect not only what you talk about, but also how you think and talk.
15. As you know from our earlier discussion, the scientific method for inductive argumentation is based on evidence. Evidence is the key ingredient to a formal argument, and every claim should be fully grounded in sufficient data. It’s the quality of the data that proves each supporting claim (the grounds) to be true or false, and these grounds, in turn, support the truth or falsity of the thesis. When you critically analyze the claims of others, and you can show that your data falsifies their claim, then you can say that you have refuted their argument, which means you have conclusively proven their claim false by means of valid evidence.
16. Usually, you will find that your opponent also has data to support his or her claims. Sometimes an opponent has made a point which you cannot refute because of solid evidence. Rather than engaging in trickery, you should concede your opponent’s point. A concession means that you acknowledge the validity of your opponent’s claim because it was proven with sufficient evidence. If one of your main values is finding truth about the objective world, then you should welcome a concession because it has brought you closer to your goal. Remember, discovering truth is a team effort. Science is predicated upon both the discovery of truth and the refutation of error. For the academic arguer, both events should be considered a form of success.
17. If evidence cannot completely prove the claims you make, then you need to qualify your claims to explain how certain you are based on the evidence you've found. A qualification might sound something like this, “Based on the evidence I found, it is reasonable to conclude X, however, I am not completely certain of X, so more evidence is needed to conclusively prove it.” Each of the italicized words is a qualifying remark. A qualified claim explains the limitations of the data, which lead to a provisional conclusion, pending more data. Never overstate your certainty. Overstatements easily lead to hasty generalizations or lies, which can be attacked by your opponent and could be used to discredit your whole argument. It is acceptable to make provisional claims if the evidence is insufficient. Such a claim is honest and open, albeit weak because there is not yet enough evidence to prove it.
18. Remember, your goal is to seek truth, not win arguments, although the two can sometimes be connected. And you should always be looking for points of agreement with others, especially supposed opponents, because consensus over facts is the surest way to reach truth about the objective world. A formal argument seeks to lay bare all of the evidence and reasoning used to make conclusions so that your peers can review your thinking to make sure that you made a strong and valid argument. Peer review is one of the most important parts of the scientific method. Constructing an open, formal argument helps the peer review process work more quickly and effectively.
12.3 The Proposal
19. Once you have completed your formal argument, the last step is to construct a proposal, which will ask the members of your audience to do something. We argue not just for the sake of arguing, but because we want to affect our world and change it in some way. Thus, the proposal is an argument for the specific change that we seek. The proposal needs to be connected to the thesis of your argument, and it needs to recommend a specific action or set of actions. If the proposal is too vague, then your audience will not fully understand what you are asking and will end up doing nothing.
20. The proposal is another mini-argument based on a principle, similar to the argument for your warrant. And like the warrant, it is outside the framework of your thesis, but still related to it. Your proposal must have a main claim, which like your thesis needs to be proved with supporting arguments. However, if it is a simple proposal claim, then it doesn't necessarily have to be this complicated. Your proposal claim is a claim just like any other, so you have to use evidence and reasoning to prove your claim true, or at least reasonable. Rarely can you prove a proposal claim true. Proposals address the future because they predict the consequences of particular actions. You have to try to foresee the consequences of your proposed action and argue those consequences are good based on some principle and also worth the costs associated with accomplishing them. And remember, not all audiences have the same skills or are willing to expend the same amount of effort, so you may have to alter your proposal when speaking to different audiences.
12.4 Different Types of Claims
21. We previously discussed three main types of claims: claims of fact, claims of meaning, and claims of value. Now we need to explore these types of claims in more detail to further break down these categories into more specific types of claims. When constructing a formal argument, you need to know which kind of claim to make because each type of claim requires a particular form of evidence and reasoning. Some types of claims are easier to make than others. And some types of claims lead to stronger conclusions because they are based on stronger forms of evidence. It is important to understand what kind of claim you are making so that you can fully and logically prove that claim, and so you can organize your supporting claims into a logical sequence to prove your thesis.
A. Factual Claims
22. The first type of argument is the factual argument. It is the simplest and strongest argument to make. A Factual Claim is a statement that seeks to prove that a fact is a fact. This type of claim needs to be specific, and each part of the factual claim needs to be proven with sufficient empirical evidence. Sometimes a factual claim discusses a general phenomenon or activity that actually occurs differently in different places but still retains the same basic characteristics of the common category or ideal form. A simple example of a factual claim would be, "All cats have four legs." Now, I made the mistake of saying all, so I would have to literally investigate every species of cat all over the world and take many pictures to prove they all have four legs. That would be quite a chore, and it would take a long time and cost a lot of money. And what about the occasional three- or two-legged cat? To remedy this mistake, I would have to qualify my argument to say the natural body form of all cats has four legs, but some cats can be born malformed and some cats can be injured, thus some cats may have fewer than four legs. Now, because I don't have enough time and money, I would probably want to pick just a few species of cat, and given the expense of travel, I might want to pick those species which are close to where I live.
23. For a more complex factual argument, I could make a set of Compare and Contrast Claims. Whenever you compare and/or contrast, you should include at least two claims to make a logical argument. For example, I could claim, "Law enforcement officers in the U.S. still use racial profiling; therefore, the U.S. justice system is still racist." This argument has two claims, and I might want to start with the last one first because it is historical and, therefore, sequentially prior to the first claim. Because the U.S. has three loosely connected political levels (local, state, federal), I would need to settle on a narrow period of time to investigate particular locations in specific states. I would then use this data to make a larger claim about racism in the United States, but qualified by the specific data because specific locales or specific states might have practiced racism differently.
24. I would also have to make a Definition Claim to define what the concepts of race and racism mean in order to decide if these historical examples are or are not proof of racism, and further, why there is or is not any variation in the practice of racism. A definition claim needs to combine both logical concepts and concrete historical practices or phenomena to ground the definition in the objective world of fact. Then I would need to do the same type of research on contemporary law enforcement and use the same definition of race and racism to determine if the current practices also indicate racism. Then I would need to compare the historical forms of racism with the contemporary forms of racism to see if they are similar. Depending on my historical time frames, this study would take a HUGE amount of work, years or even decades of research. For practical considerations, therefore, I would want to restrict my research to one locale, or if I were really ambitious, one state, but doing any more than that could take a year or more.
25. Another complex factual argument could be based on Causal Claims (or Cause & Effect Claims), claims that will prove that phenomenon A causes X. For example, I could claim, "Smoking leads to heart disease and early death." In order to prove this claim, I would first need to argue for definitions of my key concepts smoking, heart disease, and early death. Then I would need to find large scale statistical data (preferably a random study with over 1,000 participants) proving a correlation between smoking and heart disease. Then I would have to find the same kind of studies to link heart disease to early death. But remember, these studies only prove a connection and not a cause, so I would have to find further evidence from controlled laboratory research on mice (or some other animal) that proves smoking causes heart disease. Now, why do I say a study with mice and not humans? Well, it would be cruel and illegal to pay people to potentially kill themselves in a laboratory experiment, which is why scientists use animals for dangerous studies. Of course, the ethics of using animals for laboratory research is currently a subject of debate; however, most scientists think the medical benefits of using animals outweigh the harm done to these creatures, and there are now specific procedures to guard against abuse.
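To clarify what "proving a correlation" involves, here is a short, hypothetical calculation of Pearson's correlation coefficient, the most common statistical measure of association between two variables. The numbers are invented for illustration and are not real study data:

```python
def pearson_correlation(xs, ys):
    """Compute Pearson's r, a measure of linear association between
    two equal-length lists of numbers, ranging from -1 to 1."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of the two variables (unnormalized).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    # Spread of each variable on its own (unnormalized variance).
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Invented example: cigarettes smoked per day vs. a disease-risk index.
cigarettes = [0, 5, 10, 20, 40]
risk_index = [1.0, 1.4, 2.1, 3.0, 5.2]
r = pearson_correlation(cigarettes, risk_index)
```

A value of r near 1 signals a strong positive association, but, as the paragraph above stresses, even a perfect correlation cannot by itself establish causation; that requires controlled experiments.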
26. Categorical Evaluation Claims could be the basis for a different form of factual argument. This type of argument brings together definition claims with compare and contrast claims. A categorical evaluation seeks to use an established category, or set of categories, that represent a special kind of phenomenon in order to evaluate other types of phenomena to see if they match. Doctors use categorical evaluations to decide what disease or ailment you might have based on your list of symptoms. Bosses and teachers use categorical evaluations to judge performance as satisfactory or unsatisfactory. Machine mechanics use categorical evaluation checklists to inspect a car or factory robot to make sure these machines are operating properly.
27. In order to make this type of argument, first you need to argue for the validity of your categories, which should be based on observable, objective characteristics that usually have specific values. A doctor might ask, "Is your breathing restricted?" This is a specific characteristic of many health problems, and its value is high because restricted breathing is a serious problem. An auto mechanic might ask, "Is your ventilation system obstructed?" This system is a specific part of a car, but this part has a low value because it is not important to overall car performance, so it would be a low-priority fix.
28. Once you have your phenomena broken down into specific categories with clear definitions, then you go looking at another phenomenon to see if it matches so you can make an "evaluation" ranging from good to bad. For example, a university professor will have a set of criteria to evaluate the quality of a student's essay or research project. The phenomenon being observed and tested might be described as a "professional essay" or a "professional research project." The professor will examine your work and check off each criterion to decide how good or bad it is; the evaluation is usually represented with A-F letter grades or percentages (100% to 0%). This grade is correlated to how "professional" your work is, and if it is highly professional (A or B level grade), then you are most likely ready to perform professional level work in a real job.
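The professor's checklist can be sketched as a simple mapping from a percentage score to a letter grade. The cutoffs below are hypothetical, chosen only to illustrate how a categorical evaluation turns a measured value into a category:

```python
def letter_grade(percent):
    """Map a 0-100 percentage to a letter grade (illustrative cutoffs)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    # Walk the category boundaries from highest to lowest; the first
    # boundary the score clears determines the category.
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if percent >= cutoff:
            return grade
    return "F"

def is_professional(percent):
    """Per the evaluation above: A- or B-level work counts as professional."""
    return letter_grade(percent) in ("A", "B")
```

The evaluation itself is just a comparison against fixed category boundaries; the hard argumentative work is justifying why those boundaries are the right ones.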
29. Finally, the last type of factual claim is the Proposal for Action. The proposal is a special type of fact. A proposal is basically a claim foretelling the future, which is always a risky business. You might ask, how is foretelling the future a "factual" type of claim? Well, because it involves trying to predict the factual state of the objective world. Short-term predictions are always easier than long-term projections because the world usually doesn't change all that much over the course of days or months – although in times of crisis, the speed and severity of change intensifies. The primary job of many business and government analysts is to predict the future. Take, for example, an analyst who recommends stocks. Stock analysts must predict not only how a company will do in the near future (usually a projected time frame of one year), but also how the larger national and global economy will be doing; this is a very complex and difficult prediction to make. That's why most analysts’ stock predictions are rarely accurate.
30. A proposal for action often combines all of the types of claims mentioned above. There will be definition claims for the phenomenon being studied and the type of change being predicted. There will be causal claims examining whether, if we do X, we should expect Y to occur. Predicting cause and effect will usually also entail compare and contrast claims because usually you have to predict a couple of reasonable scenarios, compare these scenarios, and then argue why scenario 1 is better than scenario 2. In order to make this evaluation logical, you would also need to make a categorical evaluation based on the end result you would like to see happen and then compare each scenario to this end result to see which one gets closest. While proposal for action claims are quite complicated and difficult to make, they are essential to the well-being of human life and our societies. Thus, most professionals routinely have to make proposals for action, hence the "proposal" part of every formal argument.
B. Meaning Claims
31. The second category of argument consists of meaning claims. In arguments of meaning, the facts are usually agreed upon, but not always, so the disagreement is over what the facts mean. How do we define the concept of meaning? This is actually a tricky type of phenomenon that lies between the subjective and objective worlds. Meaning is the significance, relevance, or importance we place on facts so as to use the facts to make our lives better (or worse). For example, some might say, "A rock is a rock." But a rock can become a paperweight, a musical instrument, a weapon, a canvas for art, a geological artifact, or an economically valuable mineral. Each of these is a "meaning" placed on the rock. All the meanings are possible and plausible, and different people could all be looking at the same rock at the same time and have all of these meanings (and more) in their heads as they stare at the rock.
32. Now, here comes the hard part. Which meaning is the true or right meaning? These are two distinct questions with two different answers. The answer to the first question is that they are all objectively true. How can subjective meaning become objectively true? Well, once I state my subjective belief in words, especially in print, it becomes part of the objective world. It becomes a fact. I said that statement. It is true that I said and believe that statement. Now comes the harder part. Is my meaning right? This question is what a claim of meaning is all about.
33. Remember that right refers to the practical use of values (chapter 8.4). As humans, we use our value systems (ideologies or world views) to assign meaning to the objective world, and this subjective and cultural meaning makes our lives better in some way. So the right meaning depends on the right value, and the right value depends on the situation at hand along with the cultural common sense of the audience. For example, if my audience is a group of businessmen, the right meaning of the rock would most likely be an economically valuable mineral. If it is a group of artists or geologists, then the right meaning would be different. Here is where things get especially tricky. What happens when you have a diverse audience filled with businessmen, artists, and geologists who all want to use the same rock for different purposes? Whose meaning will win? There are two options. Historically, the quickest way to resolve this conflict is to just threaten and then kill your competitors. Problem solved. Might makes right! But if you value peace and cooperation, then you have only one other option: Argue over whose meaning is the more useful, appropriate, or valuable given the particular time and place of the argument. How do you do this?
34. The main type of argument of meaning involves the Critical Analysis Claim of Significance. How does this claim work? First, you have to critically analyze the phenomenon that is being argued over, in this case, the rock. Critical analysis means to engage System 2 rational thinking to break down the issue or problem into parts so as to understand the characteristics of the phenomenon and how it works. Then you have to analyze the possible significance or meaning of the phenomenon.
35. You will have to use many of the factual claims outlined above to analyze each one of these options in order to compare and contrast them and establish the strengths and weaknesses of each possible meaning. The guiding framework of your analysis will be based on some principle or value you think is the most important or relevant to the situation. Some values will be seen as more neutral or contentious, depending on the audience. A neutral value would apply to all or most of the audience, while a more contentious value would apply only to some.
36. For example, a neutral value could be the value of utility. A utilitarian analysis might ask: How could this rock be used by different groups? How many people compose each group? What meaning would benefit the most people? A more contentious value would be artistry because how many people in your audience would even value the use of the rock as an artistic canvas? This meaning would be acceptable and benefit only a few. However, an artist might argue that the value of artistry is more important than the value of utility. This argument would then move the grounds of the argument down into the foundational claims of value, which would have to be solved before the argument over meaning could be decided.
37. There are two other important forms of meaning claims. They are basically the same type of claim, but one of them applies to a general range of situations and the other is specific to courts of law using "common law" legal reasoning, a form of law practiced in the U.S. and England. A Resemblance Claim is the more general type. Basically, this claim is like a factual compare and contrast claim, except it is doing much more than simply comparing the facts; it is also comparing the meaning of the facts. Hence, this is a highly complex form of comparison and contrast. You not only have to show how the facts in different cases are similar, but you also have to compare different meanings that have been attached to these facts to show how meanings for the two cases are similar.
38. Often a resemblance claim is the first part of a larger argument. Once a resemblance claim is made, an arguer often moves into a cause and effect argument and a proposal argument. Because the previous case resulted in X, this new case will also result in X. For example, I could argue, "The war in Iraq is much like the war in Vietnam; therefore, the same unfortunate consequences will result." A similar type of claim can also be made in a court of law where a lawyer argues a Legal Precedent Claim. In common law, a legal precedent tells a judge how to rule based on the rulings of previous judges. The law must be interpreted the same way with the same or similar results. The only judges who can rule in a new way with a new meaning are the Supreme Court justices; they can throw out old laws or modify existing laws based on the warranted principles found in the national Constitution. If I were a lawyer, I might claim, "The 1st Amendment states X, and previous courts have upheld X; therefore, I claim that X must be upheld again today as my current case is clearly an example of X." Such an argument rests on my ability to prove that the present case is similar to past cases, based on specific criteria to prove the comparison.
There is a meaning claim in the use of the word "litter" and two value claims in the blended word "unlAWFUL." Can you explain these three different claims?
C. Value Claims
39. We have already discussed the nature of values, so our discussion here will be brief. Values come in two basic types. First, there is the Claim for the Good, a universal principle of goodness (which also entails its opposite, a universal principle of badness). But principles are meant to be useful. Humans create principles as rules to guide our actions. This need for action leads to a second type of value, the Claim for the Right thing to do. A value argument for right action seeks to apply a claim for goodness to a particular situation in order to argue that a particular act should or should not be done. There are specific values that individuals and groups hold sacred. These are sometimes called ideals or principles. There are also larger systems of values, which define a whole culture or group within a culture. Anthropologists and political scientists usually call these systems of values ideologies or world views.
40. While values are among the most important aspects of human nature, they cause us many problems, especially in a globalized world filled with diverse cultures, each of which has different beliefs about what is good and right. Further, different cultures with different values often misunderstand each other, which leads to negative judgments and disrespect. This misunderstanding, in turn, often leads to a "collision of values" (Berlin, 2000a, p. 11), resulting in disagreements, conflicts, and violence. Often values cannot be reconciled; therefore, competing groups scream at each other, or worse, kill each other.
41. When constructing an argument over values, there is no best way to argue. In fact, it is almost impossible to win such an argument. You can't even fall back on the facts because, as Walter Lippmann (1922/1997) once pointed out, we see only "those facts which fit our philosophy...we adjust the facts we see to [our] code [of values]" (pp. 78-79). More recently political scientist Larry M. Bartels (2008) reiterated Lippmann’s basic point: “Careful logical arguments running from factual premises to policy conclusions are unlikely to persuade people who are ideologically motivated to distort or deny the facts” (p. 160). Facts aren’t even facts when ideology is involved. People see through their ideology, rather than their eyes, and one person’s fact becomes another person’s fiction.
42. As Lippmann (1922/1997) pointed out almost a century ago, our values "determine what group of facts we shall see, and in what light we shall see them" (p. 82). Values define not only who we are as humans, but who we want to be and our ideal vision of the perfect life. When our values are criticized, we personally feel under attack and we want to defend not only our lives, but also our culture and way of life. For centuries, humans have fought wars over rival values, almost always described in terms of the “civilized” (us) vs. the “barbarians” (them). Many people consider values worth dying for.
43. People who compose a value argument usually claim that their values are normal and good, while their opponent's values are abnormal and bad. People instinctively believe that "he who denies either my moral judgments or my version of the facts, is to me perverse, alien, dangerous...We believe in the absolutism of our own vision, and consequently in the treacherous character of all opposition" (Lippmann, 1922/1997, p. 82). When it comes to values, most people basically claim that my values are right because they're mine, and your values are obviously wrong, so just accept my values (Fish, 1994, p. 35). Thus, to win such an argument, your opponent must reject his or her own values and assumptions, which rarely, if ever, happens. Often there can't even be an argument in the first place because the different world views underlying the different sets of values are expressed in different languages that appear nonsensical to outsiders (Berlin, 2000b, p. 345). It’s as if a German tried to argue with a Persian, neither one speaking the other's language. It would be impossible to even start such an argument, let alone win it.
44. Because of the propensity for diverse cultures to produce a "collision of values," the political philosopher Isaiah Berlin focused on the importance of pluralism as the only hope for humanity. Berlin (2000a) explained, "What is clear is that values can clash....We can discuss each other's point of view, we can try to reach common ground, but in the end what you pursue may not be reconcilable with the ends to which I find that I have dedicated my life" (p. 10). Berlin went on to argue that "values may easily clash within the breast of a single individual" (p. 10). Thus, every individual, every group, every sub-culture, every culture, and every nation must continually set provisional "priorities" (p. 14) where one value or set of values is privileged for a particular reason due to particular historical circumstances.
Read Berlin's Warning about Conflict of Values
45. But setting priorities entails making "trade-offs" (Berlin, 2000a, p. 15), and no one likes to sacrifice his or her own values or interests in favor of another, which is why politics is so messy, corrupt, torn by conflict, and potentially violent. The political realm is the site where these values clash. It is where political and cultural leaders battle for supremacy and sometimes strike bargains. Often political leaders have to ask, What is the best that we can do for these people under these circumstances at this time? But there are no permanent solutions in the political realm. All bargains and trade-offs are tentative, and many simply raise new problems which provoke more arguments and another clash of values. Berlin (1994/2014) explained, and here I wish to quote his wise words at length: "The central values by which most men have lived in a great many lands at a great many times - these values, almost if not entirely universal, are not always harmonious with each other. Some are, some are not. Men have always craved for liberty, security, equality, happiness, justice, knowledge, and so on. But complete liberty is not compatible with complete equality - if men were wholly free, the wolves would be free to eat the sheep...Justice has always been a human ideal, but it is not fully compatible with mercy. Creative imagination and spontaneity, splendid in themselves, cannot be fully reconciled with the need for planning, organization, careful and responsible calculation. Knowledge, the pursuit of truth - the noblest of aims - cannot be fully reconciled with the happiness or the freedom that men desire...I must always choose: between peace and excitement, or knowledge and blissful ignorance...if these ultimate human values by which we live are to be pursued, then compromises, trade-offs, arrangements have to be made...some values clash...so we must weigh and measure, bargain, compromise, and prevent the crushing of one form of life by its rivals" (p. 37).
46. What can be accomplished in such an imperfect and incompatible political environment? Amidst such conflict, how do we begin to articulate and argue for the truth, let alone make a proposal for a better world? There are no easy answers. There are no guarantees that we can know or understand the truth, and if we can, there are no guarantees that we will convince others through our arguments that we are right. The truth is but one among many values, and it is not often supported by all. The philosopher John Gray (1989) argued that the truth is just "a preference which few are likely in the end to share" (p. 248). So even academics and scientists who value the truth above all else must make an argument for the good of truth and why it is the best tool to solve problems. But because the truth is ever elusive, we can’t always find it or understand it. Thus, we are often reduced to making an imperfect judgment based on limited evidence. There are no perfect arguments, self-evident values, completely valid conclusions, or easy solutions. All must be argued and debated. In most cases, imperfect and provisional decisions will be reached under difficult circumstances.
47. The best we can hope, as the philosopher Isaiah Berlin (2000a) explained, is for the humility of sound judgment. We must strive for a full investigation of the important problems we face and make the best judgment we can under imperfect circumstances: "There is no escape: we must decide as we decide; moral risk cannot, at times, be avoided. All we can ask for is that none of the relevant factors be ignored, that the purposes we seek to realize should be seen as elements in a total form of life, which can be enhanced or damaged by decisions" (p. 15).
48. Making a reasonable judgment and an effective argument is very hard. First, you have to critically investigate your topic, evaluate the reliability of many different sources of information, and find the facts. Truth has to be actively constructed by critical thinkers through meticulous and rigorous scientific methods. The next step is to argue for the truth in open debate in order to convince a skeptical public. In order to use the truth to solve real-world problems, you first have to convince other people that your facts are actually true. Then you have to convince these same people that you know how the truth should be used to solve the problem. Debating with others about truth means both arguing for the truth and demonstrating it with valid logic and evidence. It also means arguing against false opinions, manipulations, and lies. You will never be able to convince ideological extremists who view their system of values as the only and best way to see the world. But there is some evidence that rational arguments can have an effect on the undecided or ignorant when it comes to political decision making (Bartels, 2008, p. 129).
49. 21st century literacy entails not only being able to construct knowledge with scientific methods, but also openly arguing with diverse groups of people to explain and prove the truth. You will need the tools explained in this book to explore yourself, your society, and your world. You cannot rely on what other people have said. You will need to find the truth for yourself. However, be warned. Few truths are certain. No argument is perfect. No audience is completely receptive. No decisions are final. You need to take responsibility for your knowledge and your arguments. And, to the best of your ability, you need to convince others that you are correct. You need to demonstrate that you have the most reliable information. This is a hard thing to do.
50. But arguing is just the start. There is a lot more work involved in order to actually fix a real-world problem or begin a new endeavor. There is more important, and perhaps more difficult, work. In order to actually change the world, you have to mobilize and organize people into action. Only by cooperating with large groups of people can we build schools, start businesses, engineer bridges, clean up pollution, care for the poor, or keep the peace. The work is endless. Competent professionals are needed.
How to Argue (Step 4): Making a Scientific Argument Based on Facts
Rather than a Subjective Argument Based on Common Sense
1. Although the search for knowledge and the creation of technology is thousands of years old, science as we know it is a relatively recent innovation in human history. Modern science was developed in 19th century German research universities and spread from there across the globe. But the various scientific disciplines and their research methods did not become clearly defined and professionally practiced until the middle of the 20th century. While the invention of modern science is one of the most important events in human history, few people understand what science is, let alone use its methodology to make daily decisions. While science delivers the most reliable form of knowledge, it is difficult to practice and not practical for everyday use. Even trained scientists would find it difficult, cumbersome, and costly to use scientific practices in daily life.
2. The journalist Vance Packard (1957/2007) explained, "It would be a dreary world if we all had to be rational, right-thinking, non-neurotic people all the time" (p. 240). Even if it were possible to use science for every daily decision, this complex and arduous method would lead to paralysis or “total absurdity” (Berlin, 2000, p. 168). The Nobel Prize winning psychologist Daniel Kahneman (2011) warned, "Continuous [scientific] vigilance is not necessarily good, and it is certainly impractical" (p. 28). Social scientist Charles E. Lindblom went so far as to say that "examining everything is a path to madness" that "goes far beyond human capacity" (Lindblom, 1990, p. 43). But examine we must if we are to find the best knowledge to improve our lives, our societies, and the world at large.
3. Thus, we need to know not only when to use science to critically analyze the claims of others, but also how to critically analyze claims using scientific methods. But how can we think rationally and make informed decisions? Most people rely on personal experience combined with the common sense of cultural traditions to make daily decisions. While this method can work fairly well some of the time, often this method leads to false conclusions and is, therefore, dangerous, especially when making moral decisions (Kahneman, 2011; Greene, 2002). Many people are "grossly ignorant," "incompetent," and do not gather enough information to make "rational" decisions (Shenkman, 2008, pp. 3, 50; Kahneman, 2011; Popkin, 1994, p. 42). In some ways, we are all "confident idiots" (Dunning, 2014). Rather than thinking critically about the world to investigate the truth, most people rely on what one social scientist calls "gut rationality," i.e. listening to your "gut," or what some people call instincts or intuition (Popkin, 1994, p. 44). Gut rationality includes the opinions of established authorities, tradition, cultural myths, and common sense.
Handout: Common Sense vs Science in Arguments
4. Take, for example, the field of medicine. Most cultures at some point in their history have believed in fictitious entities as the root cause of disease, such as evil spirits, demons, ill humors, black bile, yellow bile, and phlegm. For almost 2,000 years in Europe, it was standard "common sense" practice to poke patients with a knife, or place leeches on them, in order to bleed the mystical "ill humors" out of the sick body. This so-called medical treatment probably killed more patients than it cured. When scientists proposed the new germ theory of disease in the 19th century, the proposition of micro-organisms, like bacteria, was seen as an absurd belief in ghosts. The medical establishment lambasted it as quackery. The modern theories of evolution, relativity, and global climate change have met with similar disbelief and outrage. But this type of resistance to science is nothing new. For thousands of years people believed the Earth was flat and at the center of a divinely ordered universe. When scientists and explorers postulated a round Earth, which revolved around the sun, these theories were derided as crazy nonsense.
Science vs. Common Sense: The Daily Show Explains
5. Every culture has its official "common sense" beliefs, which the ancient Greeks called orthodoxia, and what social scientists nowadays call "conventional wisdom" (Galbraith, 2001, p. 18). Philosophers and social scientists have known for over a hundred years that the conventional wisdom is almost always wrong, often dangerously so. It is important to understand the common sense method of thinking because it leads to conventional wisdom. We need to guard against this type of thinking in our own lives, and we need to be able to understand and criticize this type of thinking in others, especially when it is used to influence the law or public policy. As we already discussed, all people subjectively classify their world through the bias of personal experience, cultural traditions, and common sense beliefs. Sometimes our conventional wisdom is benign, but often it can lead to dangerous situations (Greene, 2002). For example, some people hold their common sense religious beliefs so dear that they refuse to see a doctor when they are sick, believing their god will automatically cure them. Others use their common sense beliefs to act violently against a hated "other."
6. To illustrate, prejudiced people do not see homosexuals as human. Instead, these intolerant people classify homosexuals as "evil," "perverted," "demonic," or "unnatural" creatures. Such pejorative labels often lead to prejudicial treatment and sometimes violent abuse. Likewise, one hundred years ago, white Europeans classified black-skinned Africans as animals or some other sub-human entity, a supposition which led to the brutal killing and enslavement of millions. In all of these cases, a "common sense" belief led to biased claims, poor reasoning, and false conclusions – which, of course, also led to hateful actions and violence.
7. Common sense arguments use a form of deductive reasoning. This form of thinking is based on the authority of an assumed truth, which is usually based on an institution or cultural tradition (Greene, 2002). This assumed truth is then used to arrive at other truths. For example, witches are evil. If witches are evil, then it is necessary and right to kill them. That girl is a witch. Therefore, it is necessary and right to kill her. These claims are all based on the assumed common sense "truths" that witches exist, that witches are evil, and that we all know what witches look like. In Europe, these truths were based on larger common sense truths embedded in Christianity and various cultural traditions of witch hunting. For hundreds of years, this relatively simple belief in witches was used to murder thousands of innocent women in Europe and America (Demos, 2008). An example such as this should make it clear how dangerous common sense can be, especially if you are a despised or distrusted minority.
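The witch argument above can be laid out formally to show why such reasoning feels airtight. The notation below is my own sketch in standard predicate logic, not part of the original example:

```latex
% A common sense deduction, written as a chain of conditionals
\begin{align*}
\text{Premise 1 (assumed truth):}\quad & \forall x\,\bigl(\mathrm{Witch}(x) \rightarrow \mathrm{Evil}(x)\bigr)\\
\text{Premise 2 (assumed truth):}\quad & \forall x\,\bigl(\mathrm{Evil}(x) \rightarrow \mathrm{MustKill}(x)\bigr)\\
\text{Premise 3 (classification):}\quad & \mathrm{Witch}(\mathit{girl})\\
\text{Conclusion (valid inference):}\quad & \mathrm{MustKill}(\mathit{girl})
\end{align*}
```

Notice that the inference itself is perfectly valid: if the premises were true, the conclusion would follow. The danger lies entirely in the premises, which common sense accepts as self-evident and therefore never examines.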
8. Common sense is also dangerous because you cannot argue against it. Most people don't know why their self-evident truth is "true" – they simply believe it. As psychologist Daniel Kahneman (2011) explains, we all have "answers to questions that [we] do not completely understand, relying on evidence that [we] can neither explain nor defend" (p. 97). If you pressed people to explain why they believe their common sense is true, you would get "bullshit" answers. Bullshit is the use of generalized language to hide the fact that we don't know what we're talking about (Frankfurt, 2005, pp. 46-47).
9. But be careful when pointing out bullshit. Generally people don't like criticism or being called a liar. Throughout human history, many have been put to death for criticizing the common sense or lies of their community. Perhaps the most famous example was the philosopher Socrates. He was put to death almost 2,500 years ago because he dared to question the common sense held by the citizens of ancient Athens. For hundreds of years in Europe, the Roman Catholic Church tortured and burned heretics at the stake for the crime of questioning the common sense truth of official doctrines of the Church. But the Catholic Church’s suppression of dissent was not unique. The practice of killing or banishing critics has been widespread around the world.
10. The scientific method for inductive arguments is a much better method than common sense for understanding the world and for finding reliable knowledge. Science uses inductive reasoning. This way of thinking links empirical evidence from the objective world to an argumentative claim in order to prove it true. Empirical evidence consists of facts that can be verified with the senses (tasting, seeing, touching, hearing, and smelling). Or empirical evidence can be facts that are verified with an instrument, the results of which can be verified with our senses (instruments such as a telescope, a stethoscope, a compass, a thermometer, or a video recorder). Inductive arguments make claims about the objective world. Truth is determined based on evidence. The evidence corresponds with reality and proves that the world objectively exists beyond our own subjective experience.
11. For example, I might claim that it is raining outside today. In order to prove this claim true, I would need to supply evidence of rain. Some types of evidence are better than others. I could point to the weather forecast, but these reports are often unreliable. I could interview a person who said he or she saw rain, but the individual could be lying. I could go outside and take a picture, but how do you know I took the picture today? I could show you a published news report of rain, but is it accurate for your particular location? Of course, the best evidence would be for me to take you outside and let you see and feel the rain yourself. Short of that, I could interview five or ten people, and if they all agreed that it’s raining, then their collective response would also be good evidence. I could also video record the rain at a local landmark with date verification.
12. The strength of my argument rests on the validity of my evidence. Validity is a scientific concept that describes the quality of evidence: it measures how well a researcher logically connects evidence to a claim. Relying on an internet weather forecast is highly unreliable and would most likely be considered invalid evidence. Relying on multiple interview subjects, a local newspaper story, and a video recording combined together would be considered highly valid evidence. Using multiple types of evidence like this to confirm a single phenomenon is called triangulation. The method of triangulation is considered one of the most valid research strategies because it does not rely on only a single method.
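The payoff of triangulation can be illustrated with a toy probability sketch (my own illustration, not from the text). If we treat each source of evidence as an independent witness with a stated reliability, Bayes' rule shows how agreement among several imperfect sources produces far more confidence than any single source:

```python
def posterior_rain(prior, reliabilities):
    """Probability that it is raining after several independent sources all report rain.

    Each reliability r means the source reports rain with probability r
    when it is raining, and with probability 1 - r when it is not
    (a simplifying assumption made only for illustration).
    """
    odds = prior / (1 - prior)
    for r in reliabilities:
        odds *= r / (1 - r)  # each agreeing, independent report multiplies the odds
    return odds / (1 + odds)

# One 80%-reliable source vs. three that all agree, starting from a 50/50 prior
print(posterior_rain(0.5, [0.8]))             # about 0.80
print(posterior_rain(0.5, [0.8, 0.8, 0.8]))   # above 0.98
```

A single fairly reliable source still leaves real doubt, while three independent sources that all agree push the probability above 98 percent. This is the statistical intuition behind preferring multiple methods over any one method.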
13. But inductive arguments based on evidence are not perfect forms of reasoning because they are still grounded on some form of bias, which is connected to our human need to make life meaningful – and meaning is not an objective quality of the physical world (Flanagan, 2007). We do not see the world directly; not even scientists do. We all perceive through our values or world views. For scientists, all observation is grounded in a scientific theory. As one scientist explained, "all observation in science is 'theory-laden,'" which means that scientists use their chosen theory to make meaning of the data they observe (Goodstein, 2010, p. 9).
Handout: How Do You Know? The Seven Stages of Knowledge
14. While the problem of meaning and its resulting bias is the root problem of all human knowledge, it can be controlled to a certain extent. It is also important to remember that not all biases are bad. As discussed earlier, everyone has biased beliefs based on common sense, principles, rational arguments, and/or personal experience. Scientists have bias based on scientific theories. This bias helps us. Bias allows us to understand our sensory observations and gives that information meaning. Bias also helps us make decisions fast because it creates an automatic reflex. But we need to be continually aware of our bias to make sure it is reasonable and justified; often it is not.
15. Burning witches because your priest said they are evil is not reasonable. Burning wood to stay warm in the snow is reasonable. The bias in the first example is blind devotion to tradition and authority, which leads you to kill another person. The bias in the second example is a rational belief that living is better than dying and that heat keeps a body warm in the snow. Scientists and academic researchers try to understand their bias in order to judge its rationality. They will also try to control it when they conduct research so as to be objective and fair-minded.
16. If our bias is reasonable, then we need to fully disclose it as the foundation of our knowledge so that we can explain why it is warranted in our argument. Take for example my claim about rain. This knowledge claim was based on my principled beliefs in the value of truth, empirical evidence, and open argument. I would not go through the time and trouble of assembling valid evidence and constructing a reasonable argument about the rain unless I believed that open, empirical arguments were good, and further, that knowing the truth about the objective world is also good.
17. But these are cultural values with which some may not agree. Some people may believe that winning an argument is good, regardless of whether it is true or not. That would be a different principled bias. Such people would, therefore, not care about empirical evidence or open argument, and they might instead rely on manipulation and lies to trick you. Others might believe that God can make rain or withhold rain and that we should just believe whatever a holy person says about the presence or absence of rain. This belief would be another principled bias. People with such a belief would not care about empirical evidence or arguments of any kind, and they would instead appeal to the authority of their religion.
18. An honest speaker who is interested in the truth about the objective world needs to make an open argument, which should disclose and explain the speaker's bias so that the audience can decide if it is reasonable. The disclosure of bias, often in the form of a foundational principle, is called a "warrant" (Toulmin, 1958, p. 98; Toulmin, Rieke, & Janik, 1979, p. 26). Most warrants are implicit in an argument because most arguers are not fully aware of their bias and the common sense foundation of their arguments (Toulmin, 1958, p. 98). And sometimes a warrant is deliberately kept secret so as to better manipulate an audience. Scientists, academic researchers, and honest public debaters strive to make open arguments; thus, they will always state an explicit warrant at the start of an argument. But it is important to understand that "warrants are not self-validating" (Toulmin, Rieke, & Janik, 1979, p. 58). They need to be fully explained and supported by a reasonable argument, which is sometimes called "backing."
Handout: Types of Warrants & Claims
19. Explicit warrants can take three basic forms: (1) The philosophical principle, (2) the legal principle or precedent, and (3) the scientific theory. The first two are value principles, which seek to argue that a certain value is good or bad based on the qualities of the principle and/or on how that principle has been used in the past. Often value principles are explained as universal principles that are good or bad at all times in all places. These principles are derived from the beliefs of a culture or from a specific legal tradition, or sometimes both. A legal warrant will cite a law or legal principle, which often contains a sacred cultural value. Thus, there is often a blending of philosophical and legal warrants. One example is the American Constitution, which enshrined the principles of liberty and justice (among other values), so when Americans discuss freedom they often point to the Constitution as both the source of that value and also as a legal guarantee of that value. Below is a government sign that is doing something similar. Drivers are reminded that littering is not only against the law, but it is also an "awful" practice, which is a common sense expression of badness. The sign assumes that you already know the common sense values that make littering awful, such as damaging the environment (the principle of environmentalism) or ruining a beautiful landscape (the principle of beauty).
A philosophical and legal warrant
20. The third type of warrant is different. A scientific theory is a provisional model of how the objective world works, which has to be tested with experiments and validated with sufficient evidence. Different scientific theories lead to different methods for collecting data, which, in turn, lead to different types of conclusions about the objective world. Take, for example, the general difference between a "fundamental" scientist and an applied "forensic" scientist. The fundamental scientist will use the best theory that explains the phenomenon being studied, but a forensic scientist must always attend to specific laws and court procedures, and will therefore choose only theories and methods that will hold up in a court of law. And unlike users of the other two types of warrants, most scientists will readily admit that their theory might be wrong, and they will actively look for disconfirming evidence that could prove it false.
21. Let us examine three scenarios to explore how these three warrants could be used in an open argument. Suppose that a chemical factory next to a small town is leaking a toxic pollutant into the local water supply. Someone might offer the warrant that the principle of health is both a public and a personal good (we all want to live healthy lives); thus, a toxic pollutant affecting the water supply would be bad, and if it is bad then it should be stopped and cleaned up. In an American court of law, precedent is important. If a court has ruled one way on a principle, then courts in the future have to rule that same way, unless the Supreme Court changes the law. So, if law code X was used in the past to convict companies guilty of environmental pollution, then a lawyer would argue to the judge that this same law code X should be used in the current case. And if the previous penalty under law code X was 100% of cleanup fees plus damages to victims, then the judge should use that same established principle to penalize the current company if it is found guilty.
22. Finally, a scientist might be brought into the case to ascertain whether or not the contaminating chemicals in the water are actually toxic. This scientist will use a particular theory about toxicity to study the particular chemicals. The scientist might explain in court that theory Z predicted certain outcomes, that evidence was found confirming these outcomes, and that four separate experiments were conducted under laboratory settings to make sure the results could be replicated. Therefore, the empirical evidence confirms the theory, and the theory verifies that the current pollution is toxic. The judge could then use this argument to declare the company guilty under law code X and assign fines according to past precedent.
23. The first part of an open argument is making a thesis claim, which is the main claim of an argument. Then the foundational warrant for this thesis claim needs to be analyzed and justified with good reasons. Next, the supporting claims must be stated and organized. Each supporting claim will form a small argument with its own array of evidence leading to logical conclusions. All of these supporting claims will be connected back to the thesis so as to prove the main claim of the argument true or false (or somewhere in between). The more supporting claims and evidence, the stronger the argument. Supporting claims are often called the "grounds" of an argument because they establish the truth or falsity of the main claim, much like stakes pounded in the ground stabilize a tent, keeping it up even in strong wind.
24. When constructing grounds, you need to know that there are three types of claims, each of which needs different kinds of evidence and reasoning to prove it: (1) claims of fact, (2) claims of meaning, and (3) claims of value. The easiest type to prove is the claim of fact. This claim purports to describe or explain part of the objective world. In order to prove such a claim, there needs to be empirical evidence that corresponds to objective reality, making the claim a fact. The second type of claim is more difficult to prove because it does not directly deal with facts. The claim of meaning seeks to prove a certain interpretation of the facts because facts do not inherently mean anything. A rock is a rock, but when does a rock become an obstacle, or valuable, or toxic, or ornamental, or a weapon? When does fighting between nations become a war? When does a relationship end? When does a brain-damaged human die? These are questions of meaning. Often everyone can agree on the facts (although not always), but because humans need meaning, these facts take on symbolic significance, and that meaning added to the fact must be argued for and made reasonable. But the meaning that people place on objects or events derives from the third type of claim, a claim of value.
25. Claims of value are the most difficult claims to make. Why? For many reasons. They are contentious because every culture and sub-culture has its own set of values. They also cannot be proven with any type of evidence. Values do not exist in the objective world. They are purely subjective phenomena and exist only in the brains of human beings. One cannot empirically point to justice or truth or beauty or goodness. Dogs, ants, and monkeys have no concept of efficiency, honesty, or evil. Humans have invented values to make our lives more meaningful and to increase social cooperation.
26. But different individuals and different cultures have diverse sets of values, sometimes strikingly different ones. And this is another reason value arguments are so difficult to make: it is often impossible even to make rival values intelligible to an opponent. Take, for example, the practice of cannibalism. Some people are cannibals; they eat other people. This is a fact. Now, what does this practice mean? Well, it means different things in different cultures, and these different meanings are grounded in different values of human life. Some cultures regard human life as the highest good; thus, they condemn cannibalism as one of the greatest evils because it requires one not only to kill, but also to disrespect the body of the dead person by eating it, rather than praying over it and burying it. In cultures that have practiced cannibalism, by contrast, eating the dead has been understood as a way of honoring them or of absorbing an enemy's strength. How would you even try arguing with cannibals in order to convince them that their common sense value is "bad," and that your rival values are "good"? It is almost impossible to conceive how such an argument could work. Such a confrontation would most certainly end with aggression or violence – possibly followed by dinner, with you as the main dish!
27. Values come in two basic types. First, there is the claim for "what is good," which would be a universal principle of goodness (which also entails its opposite, a universal principle of badness). There are myriad examples of cultural "goods," such as life, property, sexual virility, strength, beauty, efficiency, kindness, love, individualism, competitiveness, even death. But principles are meant to be useful. Humans create principles to be rules that guide our actions. Thus, there is a second type of value claim: "What is the right thing to do?"
28. A value argument for right action seeks to apply a claim for goodness to a particular situation in order to argue that a particular act should or should not be done. If I am a cannibal and hold the value that life and death are equally good, especially the goodness of preserving my own life and killing my enemy, then this value would lead me to look for enemies to fight, kill, and eat. If I am a typical American and hold the value of life to be good for all people, which entails the opposite view that death is bad for all people, then this value would lead me to preserve my own life and not to bring death to others. Further, my principles would lead me to condemn, lock up, and perhaps execute a cannibal as a criminal, although the practice of execution would violate the principle of life I supposedly hold dear, since I believe that death should not be inflicted on anyone, ostensibly even on cannibals who try to eat me.
29. The above conflict between the values of life and justice is a perfect example of moral ambiguity. This is where a real-life situation puts our moral principles to the test and we have to think critically about the right action to take. Sometimes the right action in a particular situation violates our principles. Sometimes there is no right action, and we have to choose the lesser evil among several bad options.
30. While values are one of the most important aspects of human nature, they cause us many problems, especially in a globalized world filled with diverse cultures, each of which has different beliefs about what is good and right. Values are an integral part of our common sense beliefs, but most people don't know why they hold the values they do. Further, most people have no real understanding of why the values they hold are actually good or bad. When we are young, we are told by parents, priests, or politicians to accept certain values, and they become part of the cultural air that we breathe.
31. Thus, different cultures with different values often misunderstand each other, which leads to negative judgments and disrespect. This situation, in turn, often results in disagreements, conflicts, and violence. Because humans take their beliefs so seriously, disagreements over values have been one of the most common causes of murder and war. A verbal argument over facts and the meaning of facts can often turn violent once it becomes clear that the participants hold different sets of underlying values that cannot be reconciled. To illustrate, proponents of abortion rights believe in the supreme values of the mother's physical health and freedom of choice, while their opponents believe in the supreme values of the life of the fetus and the divine origin of souls. These values cannot be reconciled; thus, these two groups often resort to screaming at each other, or worse, killing each other.
32. But if in an age of globalization we hold the values of peace, tolerance, freedom, and cooperation, then arguing is the only constructive tool we have to resolve conflicts and come to some agreement about building a better world. So, it is important to understand how human beings think and how arguments are constructed. You need to be able to evaluate the thinking and arguments of others to decide if you agree with their claims and will agree to their proposals for action. You also need to be able to evaluate your own thinking and the arguments you might make to move an audience.
33. Different types of audiences will require different types of arguments and evidence, so you must always understand the rhetorical context of any argument to fully understand how and why someone is arguing a certain way. Understanding the different rhetorical contexts will also help you understand how to judge the quality of the speaker's claims, evidence, and reasoning. But there is no silver bullet when it comes to arguments. Everyone makes errors, takes thinking shortcuts, and displays poor judgment (Kahneman, 2011; Popkin, 1994, p. 218). In addition, we can never find the appropriate words or tone to reach everyone. We will always be misunderstood by someone in our audience. Arguments are always fragile and imperfect constructions. However, if we hold dear the values of openness, freedom, and truth, then arguments are the only tool at our disposal to move audiences into collective action. Without arguments to convince an audience, we would have to resort to the older political tools of deception, coercion, and violence.