Monday, April 2, 2018

Question Your Assumptions

I always thought I would learn more as I grew older. In fact, the opposite is happening: The older I get, the more I realize how much I don’t know. Maybe that’s why my search for answers seems to widen and intensify each year.

Our questions differ, but we’re all seeking answers. Our digital technologies, led by search, theoretically make it easy to find answers, too – but they aren’t necessarily the right ones. This is a growing problem. Whatever crazy thing you want to believe, a quick internet search will turn up some “expert” to confirm you’re right.

In Niall Ferguson’s latest book, The Square and the Tower, he spends a good part of one chapter documenting the high percentage of people, not just in the US but all over the world, who believe in one conspiracy theory or another. That would be funny if it weren’t so sad. One of the things Niall points out is that the dominance of Facebook and Twitter has tended to break us down further into tribes, where we increasingly talk just to our own kind, reinforcing our parochial beliefs and idiosyncrasies.

Worse, we are increasingly overconfident in our own beliefs, even without expert confirmation.

That’s no surprise to stock traders; they’ve long known that the crowd is often wrong. But they also know that the crowd can believe itself to be right a lot longer than skeptics think it can.

That’s how we get asset bubbles.

Today’s Outside the Box is a short Quartz article by Olivia Goldhill, discussing a new paper by social psychologist David Dunning. Extreme wonks might recognize the name because, with Justin Kruger, he defined the “Dunning-Kruger effect”: people who lack knowledge on a particular topic tend not to recognize that lack. In other words, we don’t know enough to know how little we know.

In his latest research, Dunning says we often make bad decisions, not because other people trick us, but because we trick ourselves. “To fall prey to another person you have to fall prey to your belief that you’re a good judge of character, that you know the situation, that you’re on solid ground as opposed to shifty ground,” he says.

When we were all living in small bands on the savannah, this was actually a good behavioral trait to have. Back then it was pretty easy to see who we could trust and who we couldn’t, because the decisions we were making were pretty simple. The world has gotten extraordinarily more complex, and we often end up relying on “experts” to make decisions for us, based on their training and knowledge, when in reality they bring their own biases, assumptions, and agendas – limitations that often they aren’t even aware of. As I write this, I can’t help but think of Fed economists, but the problem is pervasive.

We’re hit with such a constant tidal wave of information that no one person can stay on top of it anymore. So what we are “sure” about is no longer as sure as we once thought it was. In a world of social media, where we are breaking up into tribes that live in their own echo chambers (rather than as one big happy family, which is what the developers of social media thought we would become), it’s harder to know what is right and true, and thus the shout goes up, “Fake news!”

My Mauldin Economics colleague Patrick Watson has recently taken note of research similar to Dunning’s. In last week’s Connecting the Dots, he described a neuroscientific study showing that car dealers – experts on selling cars – have almost no idea what motivates car buyers. The researchers call this “expert blindness.” Our own knowledge can keep us from seeing what’s real.

That’s a pretty deep thought, but it’s an important one that we should all keep in mind. It tells me I should question my assumptions and do more research before I make important decisions.

As should we all. So read this Outside the Box and then resolve to question yourself.

On a related note, I’d like to ask your help. It is increasingly clear, given the multiple demands on my time, that I need to streamline my writing schedule. Producing both Thoughts from the Frontline and Outside the Box is not getting any easier, given the seemingly ever-increasing amount of research I have to do to stay on top of my game. And there’s just more – a lot more – going on in my business life than there was five or ten years ago.

We’re exploring some new ideas that will let me continue to deliver the quality information you deserve and even to improve it. I know change is hard (especially for me), and I also know some of you may have ideas I haven’t considered.

Your constantly questioning his assumptions analyst,

John Mauldin, Editor
Outside the Box


The Person Who’s Best at Lying to You Is You

By Olivia Goldhill


In 2008, the psychologist Stephen Greenspan published The Annals of Gullibility, a summary of his decades of research into how to avoid being gullible. Two days later, he discovered that his financial advisor, Bernie Madoff, was a fraud who had cost him a third of his retirement savings.

This anecdote comes from a paper by University of Michigan social psychologist David Dunning, due to be presented at the 20th Sydney Symposium of Social Psychology in Visegrád, Hungary, in July. It highlights an unfortunate but inescapable truth: We are always most gullible to ourselves. As Dunning explains it, Greenspan – despite being the expert on gullibility – fell prey to Madoff’s fraud not simply because Madoff was some master manipulator, but because Greenspan had, essentially, tricked himself.

“To fall prey to another person you have to fall prey to your belief that you’re a good judge of character, that you know the situation, that you’re on solid ground as opposed to shifty ground,” says Dunning. Greenspan, Dunning notes, failed to follow his own advice and take appropriate cautionary steps before trusting someone in a field he knew little about. Though he wrote the book on how not to be overly confident in your own judgments, Greenspan went against his own advice when he handed over his savings without properly interrogating either Madoff’s confidence in himself or his own sense of confidence in Madoff. Had he followed his own counsel, Greenspan would have recognized that he knew little about financial investments and would have done far more research before handing his money over to Madoff.

Dunning is an expert on the human tendency to place too much confidence in our own knowledge and beliefs. In 1999, together with social psychologist Justin Kruger, Dunning identified the co-eponymous Dunning-Kruger effect: people who are incompetent and lack knowledge in a field tend to massively overestimate their abilities because, quite simply, they don’t know enough to recognize what they don’t know. So hugely unqualified people erroneously believe that they’re perfectly qualified. (This effect has an unfortunate tendency to create the worst possible bosses. It’s also the opposite of imposter syndrome, which describes when qualified people worry that they aren’t qualified.)

In his latest presentation, Dunning highlights the studies that collectively show how we repeatedly and consistently fool ourselves into thinking we know more than we do, and so convince ourselves that our opinion or choice is right – even when there’s absolutely no evidence to support this. There are dozens of studies supporting this hypothesis, showing, for example, that British prisoners rate themselves as more ethical and moral than typical citizens, and that people mistakenly believe they’re better than others at reaching unbiased conclusions.

People tend to be just as confident in their false beliefs as their accurate ones. In one 2013 study, participants were asked a physics question about the trajectory of a ball after it was shot through a curved tube. Those who said the trajectory would be curved (wrong) were just as confident that their answer was correct as those who correctly stated the ball would have a straight trajectory.

A body of research has also established what scientists call “egocentric discounting”: If participants are asked to estimate a particular fact, such as the unemployment rate or a city’s population, and are then shown someone else’s estimate and asked if they’d like to revise their own, they consistently give greater weight to their own view than to others’, even when they’re not remotely knowledgeable in these areas.

Our false confidence in our own beliefs also deters us from asking for advice when appropriate – or even from knowing whom to turn to. “To recognize superior expertise would require people to have already a surfeit of expertise themselves,” notes Dunning.

Gullibility to oneself is not a modern phenomenon. But the effects are exacerbated in the age of social media, when false information spreads rapidly. “We’re living in a world in which we’re awash with information and misinformation,” says Dunning. “We live in a post-truth world.”

The issue is that the current environment convinces people they’re more informed than they actually are. It might, says Dunning, actually be better for people to feel uninformed. “When people are uninformed, they know they don’t know the answer,” he says, and so they will be more open to hearing from others with real expertise. If we think we know enough, however, we’ll just “cobble together what seems to us to be the best response possible to someone asking us our opinion, or a policy, or what we think,” says Dunning. And, he adds, “unfortunately we’re programmed to know enough to cobble together an answer.”

There’s no quick fix for this, but there is a key step we can take to avoid being so willfully misinformed. We need not only to evaluate the evidence behind newly presented facts and stories, but also to evaluate our own capability to evaluate that evidence.

The same questions we consider when evaluating whether to trust another person should apply to ourselves: “Are you too invested in this thought or belief you have? Are you really giving the conclusions you’re reaching due diligence? Are you in over your head?” says Dunning.

That said, constantly questioning ourselves would be impractical, leaving us in a perpetual state of self-doubt and uncertainty. Most effective, says Dunning, would be to focus on situations that are new to us and where the stakes are high. “Normally those two situations go together,” says Dunning. “We only buy so many cars in our lives, we only invest large sums of money every so often, we only get married every so often.”

Of course, as that last example shows, at some point you have to give up being savvy and just trust your own judgment – both in yourself and others. Dunning quotes novelist Graham Greene: “It is impossible to go through life without trust…that would be to be imprisoned in the worst cell of all, oneself.”

We can, though, learn to be a little more careful and wise. Just as we don’t blindly trust every person we meet, there’s no reason to be utterly trusting and gullible to ourselves.
