Knowing Your Blind Spots

What you don't know that you don't know can hold you back.

All of us have blind spots. Oftentimes, these blind spots develop over years of experience. Experience is a great benefit, but it also brings biases. I don’t mean biases in the typically negative sense of the word; rather, over time we accumulate a set of solutions and ways of thinking about problems. We develop these biases because those solutions have worked consistently in the past, but they can leave us unable to perceive the best way to handle things, especially when the fundamentals have shifted.

An Example from History

Now, this is not a post about politics. However, it was inspired by a post about politics…or at least the study of American political history.

In the article “I Thought I Understood the American Right. Trump Proved Me Wrong.”, Rick Perlstein discusses at length the changing perceptions of the history of American conservatism in the 20th century. To sum it up: as the study of the American right evolved over the late 20th century, a framework emerged to explain conservatism’s trajectory from William F. Buckley Jr. to Ronald Reagan and onward. While this framework generally made sense of the course of history, it also often conveniently ignored factors and events that didn’t neatly fit the narrative.

Nonetheless, the narrative persisted because, overall, it still provided the clearest means of explaining American conservatism’s political evolution. Even if some pieces didn’t quite fit, nothing fundamentally challenged the overall framework - until Trump.

The rise and ultimate victory of Trump is forcing historians of American conservatism to rethink their framework. As it turns out, many of the inconvenient events and factors that once seemed insignificant because they didn’t fit the framework were far more consequential than historians had acknowledged.

(I won’t go deeper here, but I definitely recommend reading the article if you are curious.)

Another Example

To use another recently topical example, United Airlines had probably been using the same process for handling overbooking for years. It may not have always worked perfectly, but it functioned. Until recently, they could dismiss its minor failures as outliers that didn’t fundamentally challenge how they overbooked flights or handled situations with more passengers than seats.

In this case, however, the system failed catastrophically. United’s initial responses showed that they were still trying to treat the incident as a routine outlier within an otherwise functioning framework. Only after realizing the scale of the failure, and its impact on their business, did they appear open to fundamentally rethinking the problem and how to solve it. (Though I have my doubts as to whether they will find a reasonable solution - these sorts of biases can be very hard to root out.)

Recognizing Your Own Blind Spots

I suspect most of us can recall landing in similar scenarios in our own work - I know I can. As a software developer, for instance, you may be attached to a particular language, framework, or tool that has served you well for years. Every project that lands on your desk then becomes simply a matter of figuring out how it fits into your language/framework/tools of choice.

Oftentimes, just like these historians, we don’t see that the ground has shifted until our solution fails - in this case, until our software project either cannot be completed or fails in some other catastrophic manner.

The point is that we all need to work hard at recognizing our own blind spots, so that these sorts of failures don’t sneak up on us.