On talking about history to a lay audience

In this essay series, I will write down my own thoughts about Eliezer Yudkowsky’s essays in the Rationality: From AI to Zombies series from the point of view of a historian. My reason for writing these is primarily to organize my own thoughts regarding the use of rationality as a mental tool in my own profession, and as such, I do not presume to appeal to a very wide audience. However, if you are reading this disclaimer and find my essays insightful or entertaining, more power to you, and I implore you to go and read the original essays if you have not already.

Compared with most scholars and scientists, historians are in quite an easy position when we have to explain our research to a lay audience. Apart from some specialized brands of history – usually interdisciplinary explorations – even articles published in influential historical journals tend to limit professional jargon. In fact, many publications instruct prospective submitters in their author guidelines to avoid jargon as much as they can in favour of clarity. As a proponent and defender of the popularization of history, I find this to be a good thing. I want people to be able to understand what we are talking about, and despite the benefits of a professional language that allows professionals to discuss topics with useful shortcuts, we should take a few steps back and translate our thoughts into more common language when we address a wider audience.

When you present jargon to a lay audience, you are not only being unkind and failing in your duty as an educator (which I think is a duty of all scientists and scholars to some extent); I am also inclined to think that you are trying to smuggle your agenda through by masking it in confusing words. Alternatively, you are trying to save face and hide the fact that in actuality you have nothing substantial to say. We rely on the audience to give us the benefit of the doubt and find an agreeable way to interpret what we say, regardless of what we actually say. Usually this works, too, especially within the narrow confines of academia, because people want to listen in good faith. They may even think they are too stupid to understand, and let you off the hook. This way, no matter what is said, the façade of professionalism remains.

Yudkowsky considers this issue in his essay “Rationality and the English Language”¹ and includes a highly relevant quote from George Orwell:

“When one watches some tired hack on the platform mechanically repeating the familiar phrases—bestial, atrocities, iron heel, bloodstained tyranny, free peoples of the world, stand shoulder to shoulder—one often has a curious feeling that one is not watching a live human being but some kind of dummy . . . A speaker who uses that kind of phraseology has gone some distance toward turning himself into a machine. The appropriate noises are coming out of his larynx, but his brain is not involved, as it would be if he were choosing his words for himself . . . What is above all needed is to let the meaning choose the word, and not the other way around. In prose, the worst thing one can do with words is surrender to them. When you think of a concrete object, you think wordlessly, and then, if you want to describe the thing you have been visualising you probably hunt about until you find the exact words that seem to fit it. When you think of something abstract you are more inclined to use words from the start, and unless you make a conscious effort to prevent it, the existing dialect will come rushing in and do the job for you, at the expense of blurring or even changing your meaning. Probably it is better to put off using words as long as possible and get one’s meaning as clear as one can through pictures and sensations.”²

Using jargon, stock phrases, and vague, question-begging statements invites multiple interpretations, when we should strive for our words to be understood as we intended. It is better to be literal and simplistic than to sound authoritative or deep, even if we wish to retain our professionalism or avoid conciseness for fear of patronizing the audience. Rather than constructing convoluted sentences that take time to unpack, or hiding the things we don’t know by calling them ‘complex’ or an ‘emergent phenomenon’, we should strive for clarity and be ready to admit that we do not know all the details. Self-aggrandizement and trying to hoodwink an audience are unflattering.

 


¹ Eliezer Yudkowsky, Rationality: From AI to Zombies (Berkeley: MIRI, 2015), 282–285.

² George Orwell, “Politics and the English Language,” Horizon (April 1946).

Obviously they should have seen it coming

In this essay series, I will write down my own thoughts about Eliezer Yudkowsky’s essays in the Rationality: From AI to Zombies series from the point of view of a historian. My reason for writing these is primarily to organize my own thoughts regarding the use of rationality as a mental tool in my own profession, and as such, I do not presume to appeal to a very wide audience. However, if you are reading this disclaimer and find my essays insightful or entertaining, more power to you, and I implore you to go and read the original essays if you have not already.

The reason historical actors tend to appear to us as either Masterminds or Imbeciles can be attributed both to hindsight bias and to the fact that people – historians especially – are very keen on constructing coherent narratives of the past. While it is considered essential to weigh only what the people themselves knew at the time of any particular source, the historian usually has the full narrative in mind, from start to finish. We know what’s coming next, and thus sometimes we need to remind ourselves that the people at the time did not. It is notoriously difficult to predict the future, or even the consequences of your own actions; there are simply too many factors to consider. Even if, in hindsight, some particular feature stands out above all else, it may do so only because it happened to be the straw that broke the camel’s back.

In his essay concerning hindsight bias, Yudkowsky uses the Challenger disaster as an example, reminding us that preventing the disaster ‘would have required, not attending to the problem with the O-rings, but attending to every warning sign which seemed as severe as the O-ring problem, without benefit of hindsight. It could have been done, but it would have required a general policy much more expensive than just fixing the O-Rings.’

As a result of hindsight bias, we tend to think that successful people succeeded in their endeavours because they could plan their course meticulously. Meanwhile, those who failed ought to have been able to predict that one crucial thing, and in failing to do so, appear to have been idiots. Humans are not well equipped to rigorously separate forward and backward messages, so even mindful historians can fall prey to letting forward messages be contaminated by backward ones.

Examples of this kind of thinking are especially rife in political history.

Another thing that causes bafflement in students of history at every level is the assumption that most other people likely share your interpretation of a message’s contents. This is compounded by the historian’s perspective: we usually know what the message was supposed to say, because we can see the consequences of its misinterpretation.

In “Illusion of Transparency: Why No One Understands You”¹, Yudkowsky recounts a Second World War example, used in a heuristics study by Keysar and Barr, to illustrate an over-confident interpretation:

“. . . two days before Germany’s attack on Poland, Chamberlain sent a letter intended to make it clear that Britain would fight if any invasion occurred. The letter, phrased in polite diplomatese, was heard by Hitler as conciliatory—and the tanks rolled.”

It is an instinctive reaction to tear at one’s figurative beard at the stupidity of both parties involved – how could Chamberlain have left any room for interpretation, and what possessed Hitler to think that, in the absence of a direct threat, Britain would stall military action? However, Chamberlain’s style was to be very cautious and mild-mannered in his communication, and it had never resulted in a war before. Similarly, Hitler may have decided to act regardless of the word choices in Chamberlain’s message. We may never know, but knowing how the war ended and what it cost, this exchange makes both appear as Imbeciles.

Hindsight bias is one of those mechanisms of the mind that historians are well aware of and actively work to undermine, yet too often end up submitting to. Be it hubris, attachment to one’s own narrative, or simply metacognitive laziness, we all make this mistake sometimes. Still, it should be considered a required professional skill to be able to think backwards and separate one’s own knowledge from the information that motivated a particular source.


¹ Eliezer Yudkowsky, Rationality: From AI to Zombies (Berkeley: MIRI, 2015), 34–36.
The study he refers to in the essay: Boaz Keysar and Dale J. Barr, “Self-Anchoring in Conversation: Why Language Users Do Not Do What They ‘Should,’” in Heuristics and Biases: The Psychology of Intuitive Judgment, ed. Thomas Gilovich, Dale Griffin, and Daniel Kahneman (New York: Cambridge University Press, 2002), 150–166, doi:10.2277/0521796792.

Why it always takes longer than expected

In this essay series, I will write down my own thoughts about Eliezer Yudkowsky’s essays in the Rationality: From AI to Zombies series from the point of view of a historian. My reason for writing these is primarily to organize my own thoughts regarding the use of rationality as a mental tool in my own profession, and as such, I do not presume to appeal to a very wide audience. However, if you are reading this disclaimer and find my essays insightful or entertaining, more power to you, and I implore you to go and read the original essays if you have not already.

The planning fallacy is one of those biases that is guaranteed to hit most starting academics below the belt. To illustrate it, Yudkowsky gives a few sample results from studies exploring this heuristic.

The following are direct quotes from his essay “Planning Fallacy”,¹ where he summarizes the findings. I am including them because the point deserves to be driven home with an anvil.

Buehler et al. asked their students for estimates of when they (the students) thought they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. Would you care to guess how many students finished on or before their estimated 50%, 75%, and 99% probability levels?

13% of subjects finished their project by the time they had assigned a 50% probability level;

19% finished by the time assigned a 75% probability level;

and only 45% (less than half!) finished by the time of their 99% probability level.


Newby-Clark et al. found that

  • Asking subjects for their predictions based on realistic “best guess” scenarios; and
  • Asking subjects for their hoped-for “best case” scenarios . . .

. . . produced indistinguishable results.


Likewise, Buehler et al., reporting on a cross-cultural study, found that Japanese students expected to finish their essays ten days before deadline. They actually finished one day before deadline. Asked when they had previously completed similar tasks, they responded, “one day before deadline.” This is the power of the outside view over the inside view.
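
To make the calibration gap concrete, here is a minimal sketch of how results like Buehler et al.’s are tallied. The project data below is invented purely for illustration; it is not the study’s actual dataset:

```python
# Each tuple: (days estimated at 50% confidence, days estimated at 99%
# confidence, days the project actually took). Invented illustrative data.
projects = [
    (10, 20, 9),
    (14, 25, 30),
    (7, 15, 16),
    (21, 40, 38),
    (5, 12, 25),
]

for label, idx in (("50%", 0), ("99%", 1)):
    finished_in_time = sum(1 for p in projects if p[2] <= p[idx])
    print(f"Finished within the {label} estimate: "
          f"{finished_in_time / len(projects):.0%}")

# Well-calibrated forecasters would see roughly 50% and 99% here;
# the students in the studies managed only 13% and 45%.
```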

The planning fallacy has the most impact on the practical side of academia, and its lessons ought to be heeded especially by PhD researchers and others taking on an expansive research and writing project, perhaps for the first time in their lives. Without prior experience of such projects, we tend to over-analyze the project, and counter-intuitively this leads to overly optimistic estimates of how long it will take us to complete it. Yudkowsky calls this thinking in terms of the unique features of the project the ‘inside view’, and recommends switching to the ‘outside view’ instead when planning one’s own projects.

The outside view is, in all of its simplicity, deliberately avoiding thinking about the unique features of your current project and simply asking how long it took others to finish broadly similar projects in the past.
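
As a concrete illustration, here is a minimal sketch of an outside-view estimate in Python; the reference durations are hypothetical, invented for the example:

```python
import statistics

# Outside view: ignore the unique features of your own project and look
# only at how long broadly similar projects took others in the past.
# These reference durations are hypothetical, purely for illustration.
past_dissertation_years = [4.5, 5.0, 6.0, 4.0, 7.5, 5.5, 6.5]

median_estimate = statistics.median(past_dissertation_years)

# A pessimistic quantile guards against the long tail of delays;
# quantiles(n=10) returns nine cut points, and index 8 is roughly
# the 90th percentile.
pessimistic_estimate = statistics.quantiles(past_dissertation_years, n=10)[8]

print(f"Outside-view median estimate: {median_estimate:.1f} years")
print(f"Outside-view pessimistic estimate: {pessimistic_estimate:.1f} years")
```

The design choice worth noticing is that the estimator takes no input at all about your own project: that omission is the whole point of the outside view.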

This should be good news especially to PhD candidates working on their dissertations, as they have a multitude of peer examples to draw from. Not only that, they also have their advisors, who have not only completed a dissertation themselves but have likely supervised a few of them to completion as well. Their estimates should not be brushed off, and one ought not to underestimate other PhD candidates either – they likely had their reasons for their projects extending beyond what was initially planned. The “inside view” does not take into account unexpected delays and unforeseen catastrophes.

… And still, I expect my own dissertation project to be finished in the year 2023, maternity leaves in between and all. In my defense, I completed my BA and MA around 25% faster than the average student, and I am in a particularly favourable position because I have steady funding until the end of 2022.

Let this blog entry stand as a lesson in humility and the perils of hubris, should 2024 arrive without me holding a PhD.


¹ Eliezer Yudkowsky, Rationality: From AI to Zombies (Berkeley: MIRI, 2015), 30–33.

The studies quoted are, in order of appearance:

  1. Roger Buehler, Dale Griffin, and Michael Ross, “Exploring the ‘Planning Fallacy’: Why People Underestimate Their Task Completion Times,” Journal of Personality and Social Psychology 67, no. 3 (1994): 366–381, doi:10.1037/0022-3514.67.3.366; Roger Buehler, Dale Griffin, and Michael Ross, “It’s About Time: Optimistic Predictions in Work and Love,” European Review of Social Psychology 6, no. 1 (1995): 1–32, doi:10.1080/14792779343000112.
  2. Ian R. Newby-Clark et al., “People Focus on Optimistic Scenarios and Disregard Pessimistic Scenarios While Predicting Task Completion Times,” Journal of Experimental Psychology: Applied 6, no. 3 (2000): 171–182, doi:10.1037/1076-898X.6.3.171.
  3. Roger Buehler, Dale Griffin, and Michael Ross, “Inside the Planning Fallacy: The Causes and Consequences of Optimistic Time Predictions,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases, 250–270.