The problem of good faith, part 2

To continue some of the thoughts from yesterday:

Zeynep Tufekci has been working nonstop–I think she must not sleep more than two hours a night–on the complexities arising from the COVID-19 pandemic, especially the problems of interpreting data and, most importantly for this discussion, what she terms the metaepistemological problem of how we talk about, and ask questions about, the very idea of evaluating knowledge.

For Tufekci, metaepistemology involves among other things the “mundane skill” of “reading between the lines.” She first expounds on this skill in an essay called “Lessons from a Pandemic Anniversary,” in which she identifies three working principles (one could also call these “heuristics”) for evaluating not so much the content of specific knowledge as tell-tale signs about that knowledge, a kind of metaepistemological “tag” or “tell” that will help you understand the truth about the facts presented. You’ll need to read the essay to make sense of these items. Here I’ll just list them.

  • The Principle of “You Can’t Finesse the Steep Part of an Exponential” (in other words, if the smoke is dense and the temperature is climbing quickly, the fires are likely raging no matter what explanation is given)
  • The Principle of “Always Pay Attention to Costly Action” (in other words, watch where resources are expended instead of listening to what people say about motives, plans, etc. I ran into this principle in senior leadership, where the saying was “if you want to see a university’s real strategic plan, don’t look at the plan, look at the budget”)
  • The Criterion of Embarrassment, which Tufekci calls “something historians use all the time … the idea that something that embarrasses or puts the speaker in a difficult position is more likely to be true.”

I encourage you to follow the link to the “criterion of embarrassment,” as you’ll learn how important this criterion has been to New Testament research–which is not to say that by itself the criterion is always reliable. Indeed, part of the problem with the criterion of embarrassment is that it can lead to the sometimes useful, sometimes treacherous heuristic that “absence of evidence is evidence of absence,” which is essentially an irrefutable argument.

This problem in particular appears in Tufekci’s follow-up essay, “Critical Thinking Isn’t Just A Process”:

One of the things I noticed throughout the past year has been that a lot of my friends who had grown up in authoritarian or poor countries had a much easier time adjusting to our new pandemic reality. My childhood was intermittently full of shortages of various things. We developed a corresponding reflex for stocking up on things when they were available, anticipating what might be gone soon. That was quite useful for the pandemic. So was trying to read between the lines of official statements—what was said and what was not, who was sitting with whom on the TV, and evaluating what the rumor networks brought in. It turns out those are really useful skills when authorities are lying at all levels.

A principle that’s often useful in these situations is that most deliberate misinformation from authorities—especially in places that are mid-range in terms of institutional trust and strict licensing—comes from omission, not saying the truth, rather than outright lying. That offers a way to get at the truth by trying to detect a picture, and looking at the parts that have been obscured, to make out the actual shape.

Notice that “reading between the lines” really means “reading the real lines and ignoring the things the liars are calling the lines.” I say this because many of my students believe that literary analysis is all about reading between the lines, when it’s truly all about reading the lines–that is, attending to the words, their order, the arrangement of chapters and lines and rhymes and voices, etc. In other words, “reading between the lines” is a process of substitution, while what I want my students to do is to become skilled at attending to the lines themselves, where implied or symbolic meanings actually reside. (But I digress.)

In the end, it seems to me, Tufekci’s argument in “Critical Thinking Isn’t Just A Process” derives from a prior assumption, based on experience, that “authorities are lying at all levels.” This assumption is also a conclusion, and one that drives interpretive strategies for finding out what’s really true. But by now it’s clear that this survival strategy, born of hard experience and, from what I can tell, eminently justified, is also at the heart of conspiracy theories and, even less dramatically, the habit of “critique” and corrosive skepticism that one can routinely find in my profession. (Rita Felski and Lisa Ruddick have done very important work in this area. Ruddick’s “When Nothing Is Cool” has been a touchstone for me in this regard.)

Tufekci’s conclusion is sobering indeed, both as a salutary warning and, in its shadow, a strategy whose guardrails may not hold:

There is often talk of teaching people “critical thinking” skills, and that’s certainly something worth doing. A mistake, though, is to think that such critical thinking skills are independent of knowledge: that there is a recipe, or a way of interrogating conclusions, that can turn into “critical thinking.” In reality, the process by itself isn’t where the magic happens.

These do not seem complicated skills in some sense—and especially not in retrospect, once the actual answer is known. But they require more than parsing of words. The institutional operation, and the status and psychological incentives of the people, matter greatly to discerning the truth. Like most knowledge, this is more than “word games.” It is a mixture of sociology and psychology—if we are putting them into fields—but also involves probability: what’s the most likely outcome? What types of evidence would help tip the balance in which direction? How do these institutions operate? What are the personal and professional incentives of this particular person? And so on.

Critical thinking is not just formulas to be taught but knowledge and experience to be acquired and tested and re-examined, along with habits and skills that can be demonstrated and practiced. But there is no separating the “process” from the “substance”.

I am not sure any community can survive the relentless practice of inquiring cui bono? about every single expert or authority. I cannot imagine reading a full financial disclosure of every physician I consult, though I can at the same time imagine a duty to inform that would compel the physician to disclose any clear conflicts of interest–while also raising the question of whether the clear conflicts are as dangerous as those that seem less defined and less of a problem. The question of how to judge a conflict of interest is itself not as straightforward in every case as it might seem. Some apparent edge cases may determine very unhappy outcomes.

But all of that said, I do think that “knowledge and experience … acquired and tested and re-examined, along with habits and skills that can be demonstrated and practiced” sound more like wisdom than simple “competency,” and point to a cycle of learning and thinking and learning and thinking that sounds very much like the practices a liberal arts education seeks to model and encourage. It also does not sound like “critique” or “skepticism,” but like a positive commitment to hope and trust that has to be renewed and re-asserted over a lifetime. That is to say, an enlightened, wise approach to assuming good faith where such assumptions are warranted, or even where such assumptions are necessary whether or not one can judge their warrant.

Which brings me to Montaigne and Bacon–but that’s for the next part.

