
OPEN QUESTION

(a blog of existential crises)

Theorics - 003

2021/08/25

Hedge your bets

( . . . if it makes sense)

I can be an . . . opinionated person. My wife often likes to remind me that, before we were married, more than one of my family members warned her that I can be overly stubborn when I think I’m right. And who doesn’t think they’re right most of the time? I have the luxury of often being able to spend the time to do some basic research to come up with the “best” answer when forming an opinion. (Vim is the best editor. Fight me.) But what if I’m wrong?

Every one of us is continually bathed in a sea of opinions. As I discussed in a previous post, we choose so much of what we do based on seeking reinforcement of our beliefs. It is so easy to end up only being exposed to views you already agree with, and thus it can often seem unfathomable that we’re wrong. Maybe, “literally every person I know does that.” Or, “every comment I’ve read on this site agrees that it’s not true.” This can be such a difficult cognitive bias to overcome. So much so that it becomes imperative to question even our most strongly held beliefs — especially our most strongly held beliefs.

I am by no means saying that all views should be given equal weight or validity. However, admitting the possibility, however small, that the opposing viewpoint is, in actuality, the very real truth of reality leads to an inevitable thought experiment: what would that mean? If I’m wrong, what consequences does that imply? What changes? What else would I need to rethink? What do I do differently?

I find it most helpful to consider a hypothetical case in which my belief, if true, carries no consequences regardless of whether or not I believe it. Conversely, the opposing view, if true, would carry severe negative consequences if I went on believing the opposite.

As a fabricated example, suppose there is a gun sitting on a table that I know is unloaded. I removed the magazine, emptied the chamber, did a visual inspection. I verified that the gun is unloaded, myself, in every way possible, and it has not left my sight since doing so. Then someone walks up, picks up the gun, points it at me, and says they will pull the trigger in 5 seconds.

I now have a choice. I can stand there, confident in my knowledge and wait for a hollow click, or I can simply move out of the way. Of course, I’m going to move out of the way. But why? I am sure I am right. All of my observations say I have nothing to worry about. But being wrong has very real, very life-and-death consequences. I gain nothing by standing still, even if I’m in no actual danger. It’s trivial to perform the necessary actions to avoid the consequences of being wrong. So, despite full confidence in my belief, I will take action contrary to my belief. That’s the better choice, because of the severity of the consequences of being wrong.

(Obviously, this is a contrived example. Life is rarely so black and white. Everything is a continuum, and nothing is this cut and dried. But this is a thought experiment to serve as a baseline.)

While the previous example demonstrates the consideration of extreme consequences, it relies on two conditions that cannot always be assumed: reasonable ease of action and a non-trivial possibility that I was wrong. As a further example, now suppose that someone comes up to me and tells me that tomorrow Earth’s gravity will reverse, and we’ll all be flung into space. Of course, I don’t believe them, but once again, that’s a pretty severe consequence.

This time, I would say it’s pretty safe to ignore the crackpot. But what changed? The consequence is even more severe than before. Not only am I risking my own life, but the lives of every human on the planet! That’s a pretty big ask. Again, probability and actionability.

Before, with the gun, while I was confident, there was also the possibility that I had missed something. Maybe I didn’t do as good a job as I thought clearing it as safe. Maybe the stranger performed some sleight of hand and swapped it out before I could notice. There’s always some chance. This time, though, the gravity-reversing claim of our stranger would require our entire understanding of the physical Universe to be completely and utterly wrong. It would require tomorrow to somehow behave so vastly differently from the billions of years previous that no other observation ever made would make any sense whatsoever. The probability of this person being right is vanishingly small.

Furthermore, even if this crazy notion is somehow actually correct and going to happen, what could I do about it? Maybe I could go home and start nailing down my furniture so it doesn’t float away? Maybe tether myself to something solid? Those feel like things I could reasonably do. But hold on . . . the floor I’ve nailed my furniture to is sitting on a concrete foundation that’s just poured into the dirt underneath. That’s a pretty minor tensile force compared to the newfound gravitational repulsion. I should probably anchor my house to some bedrock. Great, now I have to get some construction crews out here on less than a day’s notice. But wait again . . . that bedrock isn’t molecularly bonded to the rest of the Earth; it’s pretty much just sitting there due to gravity too. Oh, and the air. There won’t be any of that around anymore either. This is getting out of hand. . .

This is obviously a reductio ad absurdum, but it underscores the balance that must be struck when evaluating positions you don’t agree with. Your response to other possibilities has to be informed and influenced by the severity of the consequences of being wrong, the probability of that outcome, and the actionability of the solution.

At least for the moment, I’ve tried to actively avoid directly talking about current events, political ideologies, or hot-button issues. We’re building a toolbox for critical thinking, regardless of stance on a particular issue, and I don’t want someone to ignore something that might be helpful only because I come down on some topic differently than they do. However, distribution and ingestion of misinformation seems to be at an all-time high. Please don’t think I’m talking only about “them,” whoever “them” is to you. I’m talking to you, you reading this article, regardless of your political background, your religious predilections, or your scientific literacy. And I’m talking to me, because I have to remind myself every day that I don’t—can’t—know everything.

Could you be wrong? Is there even the slightest possibility that the millions or billions of humans on this planet that see things differently than you might have a point? Everyone has an opinion, but that doesn’t mean you have to stake your life on it. Be kind to those that think differently than you. Be open to being wrong sometimes. Think about what it would mean if you are. And, if it’s reasonable to do so, hedge your bets.


Comments for this post are on the /r/openquestion subreddit.