969: How to Make Better Decisions by Wisely Evaluating Claims with Alex Edmans

June 10, 2024

Alex Edmans shows you how to think smarter, sharper, and more critically so you can make better decisions.

You’ll Learn:

  1. How our biases are holding us back 
  2. The ladder of misinference that mucks up our thinking 
  3. Why we end up mistaking statements for facts 

About Alex

Alex Edmans is Professor of Finance at London Business School. Alex has a PhD from MIT as a Fulbright Scholar, and was previously a tenured professor at Wharton and an investment banker at Morgan Stanley. Alex has spoken at the World Economic Forum in Davos, testified in the UK Parliament, and given TED/TEDx talks with a combined 2.8 million views. He was named Professor of the Year by Poets & Quants in 2021.


Thank You, Sponsors!

  • Acorns. Start saving and investing for your future today with Acorns.com/awesome
  • Storyworth. Give the fathers in your life a unique, heartfelt gift. Save $10 at StoryWorth.com/Awesome with the promo code AWESOME.

Alex Edmans Interview Transcript

Pete Mockaitis

Alex, welcome.

Alex Edmans

Thanks, Pete, for having me on.

Pete Mockaitis

Well, I’m so excited to dig into your book, fantastic title, May Contain Lies. Could you please open us up, perhaps, with a wild tale about a story, a study, or a statistic that exploited our biases and the mayhem that erupted from that?

Alex Edmans

Certainly. So, one example is the link between breastfeeding and child development. So, everybody tells you that breast is best. They even give the impression that you are not a good mother if you’re not breastfeeding your kids, if you’re taking the easy option of using the bottle. And so, this is based on some evidence which seems cast iron, pretty clear, that breastfed kids do better than bottle-fed kids across a range of outcomes. This might be physical development, it might be child IQ, it might even be maternal-child bonding.

However, the concern here is that whether you breastfeed or not is not random. It’s driven by other factors. So maybe mothers with a more supportive home environment, they are able to breastfeed because breastfeeding is tough, and it could be their supportive home environment is what’s causing the improvement in child IQ or child health.

So, when you control for that, when you strip out the effect of parental background on IQ, you actually find no effect of breastfeeding on child development. And so, this is striking. Why? Because everybody tells you that breastfeeding is pretty much the only way to go, but once you have a more careful look at the data and rule out alternative explanations, you find that the evidence there is much weaker.
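To make the confounding logic concrete, here is a minimal simulation sketch in Python. It is not from the interview or the book, and the variable names and numbers are made up: a hypothetical “background” variable drives both who breastfeeds and the child outcome, breastfeeding itself has no true effect, and the raw gap disappears once you compare like with like.

```python
# Minimal confounding illustration with made-up numbers:
# parental background drives both breastfeeding and outcomes;
# breastfeeding itself has no true effect.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: supportive home environment (0 = less, 1 = more).
background = rng.integers(0, 2, size=n)

# Mothers with more support are more likely to breastfeed (70% vs 30%).
breastfed = rng.random(n) < np.where(background == 1, 0.7, 0.3)

# Outcome (an IQ-like score) depends only on background, not on feeding.
outcome = 100 + 5 * background + rng.normal(0, 10, size=n)

# Naive comparison: breastfed kids look better...
naive_gap = outcome[breastfed].mean() - outcome[~breastfed].mean()
print(f"Naive breastfed-vs-bottle gap: {naive_gap:.2f}")  # roughly +2

# ...but comparing like with like (within each background group), the gap vanishes.
for b in (0, 1):
    mask = background == b
    gap = outcome[mask & breastfed].mean() - outcome[mask & ~breastfed].mean()
    print(f"Gap within background={b}: {gap:.2f}")  # roughly 0
```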

Pete Mockaitis

Okay, Alex, so right from the get-go, busting myths. Okay. I was fed with a bottle, and I turned out pretty well, I think, and so I’m intrigued. Some studies, it seems, take the care to carefully explore potentially confounding variables and rule them out, and zero in on what’s really driving the variation, or the impact, and others don’t. And most of us are not, in fact, digging into the details of every scientific study that’s referenced in a news article. So, I guess if we don’t get into that level of depth, we may very well find ourselves with some misinformed views of the world.

Alex Edmans

That’s correct. And sometimes we don’t want to get into that level of depth. Why? Because if we see a study whose conclusions we like, then we accept it uncritically and don’t even bother to ask whether there are alternative explanations. So, it’s a bit like if you are a police officer and you think that a person is guilty, then you might interpret every piece of evidence as being consistent with his or her guilt, even if it’s also consistent with other suspects.

So, this is something known as confirmation bias. We have a view of the world and we will latch on to anything that supports that view of the world, even if the evidence is actually pretty weak. And so, what might this be in the breastfeeding study? We believe that something natural is better than something artificial, that’s why we assume natural flavorings are better than artificial flavorings, and that’s why the idea that breastfeeding is better than bottle-feeding just sounds good. It seems to accord with our view of the world, and so we don’t ask, “Is this correlation but not causation?”

Pete Mockaitis

Like, “Processed food is bad. It’s, oh, so beautiful to see a picture of a mother and baby in that intimate moment.” So, there are a number of things that point us in one direction, so we’ve got the confirmation bias in action. We’re going to dig into some real detail about cognitive biases. I’d love it if, first, you could share anything that really surprised you as you were putting this together. Like, you’re pretty well-versed in this stuff, did you make any new discoveries that made you go, “Whoa”?

Alex Edmans

Well, I think that one thing that surprised me is how much I fell for this myself, because my day job as a finance professor is to think carefully about data and evidence. And then, when I went to parenting courses myself before my son was born, I believed all of this. It wasn’t until I looked into the data much more carefully that I found it was something quite different. So, despite being somebody who should do this for a living, I fell for it.

There are also other cases I describe in the book of things that I taught to my students without, again, looking deeply at the data. One is Malcolm Gladwell’s 10,000 hours rule, which claims that you can become an expert in anything if you just put in the hours. Professors like to give that message because we like to say, “Yes, you might not like finance, but if you put a lot of effort in, you can really change the direction of your life.” And again, without looking at the evidence really closely, which I only did for this book, I was duped into this myself.

And in my defense, it’s not just me. What the evidence tends to suggest is that more intelligent people, or more sophisticated people, fall for misinformation more. That’s surprising. You might think, “Isn’t it the case that the smarter you are, the more likely you are to defend against misinformation?” But the answer is no, because the smarter you are, the more you deploy your intelligence selectively.

So, if there’s a study you don’t like the findings of, you’re able to come up with reasons to dismiss it, to knock it down, but then when there’s a study whose findings you like, you selectively choose not to use your discernment and to accept it. So, given you use your intelligence selectively and in a one-sided manner, this might actually lead to you becoming more misinformed rather than less.

Pete Mockaitis

That’s intriguing, and it makes sense when you put it in that context there. So, I’m curious, what’s the big idea then behind this book? And how is it helpful and relevant for professionals looking to be awesome at their jobs?

Alex Edmans

Well, the big idea is that the solution to misinformation is to look within you. So, we often think misinformation is somebody else’s problem, that the government should prosecute people for producing misinformation. But that’s a problem for a couple of reasons. So, number one is that misinformation is produced far faster than the government can regulate, and, number two is that many forms of misinformation are subtle.

So, they are not cases of somebody flagrantly lying or coming up with a deepfake. The statement that breastfed kids have higher IQ than bottle-fed kids, that is a correct statement. You can’t be prosecuted for making that statement, but the implication that this means that breastfeeding caused the high IQ, that’s where the problem is. And so, given that often statements aren’t incorrect and can’t be prosecuted, the cost of misinformation might be that we ourselves make incorrect inferences from correct facts.

So, what I’m doing in the book is to highlight our own biases that lead us to make incorrect interpretations, and then come up with a simple set of questions we can ask ourselves to make sure that we’re not being misinformed.

Pete Mockaitis

Lovely. Do you have any cool stories about a professional up-leveling their game in this domain and making superior decisions with superior outcomes as a result?

Alex Edmans

Well, unfortunately, you don’t hear about the cases in which disasters were avoided. You hear about situations where bad decisions were made because those are the things that make the news. So, if somebody did something which avoided a disaster, that’s not going to make the news because if there’s no disaster, there’s nothing newsworthy. But you do know of cases in which people did not heed this and there were disasters.

So, one big disaster was Deepwater Horizon. That was a case involving an oil rig; they ran some tests to see whether it was safe to remove the rig. All these tests failed, but because the people were so smart, they came up with an excuse. They were able to fabricate a reason for why the tests failed. They called this the bladder effect. And because of this bladder effect, they gave themselves an excuse to run a quite different test. That different test passed, and so they thought the well was safe, and this led to the disaster.

Now, in the inquiry afterwards, the government found that this bladder effect was completely made up, that it was a fiction. But the engineers were so desperate to finish the job, and they had a strong bias because Deepwater Horizon was the best performing rig, so they made up this reason and then thought the well was safe.

So, there are certainly cases in which we have these disasters which are a result of these biases. The cases in which acknowledging the biases led you not to make mistakes are much harder to come by. Why? Because if a mistake was not made, it’s not as newsworthy.

Pete Mockaitis

Okay, fair enough. That’s so meta, really, Alex, in terms of even there’s a selection bias at work in terms of the cases we hear about on bias.

Alex Edmans

Unfortunately, yes, because what makes the news, what do we hear about? We hear about when things go wrong. So, if, indeed, correct application, correct inference leads to things going right, we would not be hearing about that because of the selection.

Pete Mockaitis

Sure thing. Well, I mean, there’s a huge career benefit in an extra dose of disaster avoidance, both for the poor creatures of the ocean and our own careers and colleagues and customers and products, etc. So, break it down for us, you mentioned we got two big old biases that are largely to blame for us getting snookered, fooled by misinformation. Can you unpack these for us?

Alex Edmans

Certainly. So, one that I’ve alluded to is confirmation bias. So that applies when we have a pre-existing view of the world, and then we interpret evidence as always supporting that view. And notice here that this pre-existing view need not be deeply ideological. So, one might think, “Okay, maybe confirmation bias applies to things like gun control or abortion or immigration,” but it applied to something more subtle like breastfeeding.

And even though I don’t have a particular ideology about breastfeeding, something as subtle as my belief that something natural is better than something man-made led me to fall for that trap. So, that’s confirmation bias, and that kicks in when we have a pre-existing view of the world, even a subtle one.

But what happens when we don’t have a pre-existing view of the world, if we think we’re open-minded? So that’s when a second bias comes in, and this bias is called black-and-white thinking. So, what is that bias? So even if we have no preconceived view, if we view the world in black and white terms, we think something could be either always good or always bad, then we will be swayed by misinformation which is extreme.

So, let’s give a practical example. The Atkins Diet was about carbs. Now, that’s something where people don’t really have strong views. Protein, people think protein is good. You learn that protein repairs muscles, that’s why you’ve got all these protein supplements that you can buy. Fat, we think it’s bad, it’s called fat because it makes you fat. But carbs are not so clear-cut, so many people might not have had strong opinions on carbs until the Atkins Diet, which demonized carbs. It said try to have as few carbs as possible.

That played into black-and-white thinking. There were no shades of gray there, and that made the diet really easy to follow. You didn’t need to count your calories and figure out whether carbs were within 30 to 40 percent of your intake. You just looked at the carbs figure on the nutritional information, and if it was high, you avoided it. But notice, if Atkins had proposed the opposite diet, saying try to eat as many carbs as possible, he might also have gone viral, because that also plays into black-and-white thinking and is easy to implement.

So, what this means is that to be famous, to have an impact, you don’t necessarily need to be right. You need to be extreme. And, indeed, what we typically see here are lots of extreme statements: “No bottle feeding at all. Exclusive breastfeeding,” “Don’t eat any carbs. Eat as many superfoods as possible.” These things leave no potential for nuance, but they become successful because of this black-and-white idea.

Pete Mockaitis

Well, Alex, that is well said. You don’t need to be right. You need to be extreme, and that’ll do it.

Alex Edmans

Yeah, and if you can put it in 280 characters, then that’s something which will be really easily shared, and people want to share things which sound simple. And the reason people share misinformation is not that they’re bad people. They want to share useful practical tips. And so, if the tip is, “Just avoid X or eat as much of Y as possible,” that’s something that people share because they think it’s useful information that people can implement. It’s much easier than saying, “Make sure that X is between 30 and 35 percent of your daily calories.”

Pete Mockaitis

While we’re talking about biases, I’ve been giving a lot of thought lately to overconfidence. It seems like that can just sort of make everything a little bit worse. People are fooled, and then they seem quite certain about their point of view being correct, or true, or “This is the way. This is the only way.” Any thoughts on overconfidence?

Alex Edmans

Absolutely. I think it’s tied to my earlier comment about how more sophisticated people, or more intelligent people, suffer more from misinformation. Why? Because their biases are stronger, and potentially overconfidence plays into this. How can it play into this? It’s that overconfidence can make the confirmation bias stronger. How?

So, one of my fields is sustainable finance. That’s the idea that companies that do good for the world perform better in the long term. And I might think, “Well, why did I go into this field? I could have looked at many, many areas of finance.” The reason that I’ve chosen to go into sustainable finance must be that the evidence on this is really rock solid. There must be rock solid proof that sustainability improves performance.

And so, if, indeed, there’s a new study which comes out, saying, “Well, actually, the evidence for sustainable investing or ESG is less strong than people believe,” I might be even more stringent in rejecting that. Why? Because I know that my field is sustainable investing, and the fact that I’ve chosen to be in this field means that I know more than anybody else, and it must mean that I chose to be in this field because the evidence is strong, and so that’s why I might choose to ignore people on the other side.

Pete Mockaitis

Yes, understood. Okay. Well, you’ve got a really cool tool, your ladder of misinference, and a few steps along that ladder. Could you walk us through these and give us some examples?

Alex Edmans

Absolutely. So, why did I come up with this ladder of misinference to begin with? It’s to provide a practical solution for the reader, to help them figure out how to be awesome at their job by spotting misinformation. Now, you might think spotting misinformation is hard because, “There are thousands and thousands of types of misinformation out there, how can I remember all of them and guard against them in practice?”

So, I wanted to categorize them into just four, and I illustrate this with what I call the ladder of misinference. Why do I use the ladder as the graphic? It’s that when we start from some facts and then draw some conclusions, it’s like we’re climbing up the ladder. And the reason I call it the ladder of misinference is that we actually make missteps up the ladder. We draw conclusions that are not valid.

So, the first misstep is a statement is not fact, it may not be accurate. So let me unpack that, and, again, with an example as you suggested. So, one big piece of evidence which supported the over-prescription of opioids in the US, which led to the opioid epidemic, was an article in the New England Journal of Medicine entitled “Addiction Rare in Patients Treated with Narcotics.” That has been cited over 1,650 times.

Now, that is a statement and there’s no misinformation there. The article was truly called “Addiction Rare in Patients Treated with Narcotics.” It was truly in the New England Journal of Medicine. But if you click on the article, you find it’s just a letter to the editor. There was no study behind it, there was no science; somebody just wrote in to the editor. And so people cited this article without reading it, without seeing the context, which was that this was a letter to the editor rather than a scientific study. This was seen to be one of the reasons why opioids were so readily prescribed, obviously with fatal consequences.

And even if you think the letter was completely accurate, and it wasn’t made up, the letter considered patients in hospital. And maybe if you’re in hospital, you won’t get addicted because you’re given narcotics on a prescribed basis. That’s quite different from giving them to an outpatient who might take them whenever he or she wants to. So, again, a statement might not be flawed or made up, but it can still mislead if you don’t see the context: this was a letter, not a study, and it only looked at hospitalized patients.

So, you might think, “Well, the solution is just to check the facts. Let’s go to the original source, read the full context, and that’s enough.” But that’s not enough because of the second step up the ladder. This is the idea that a fact is not data, it may not be representative. So, again, let me give an example. So, one of the most famous TED Talks of all time led to a book called Start with Why by Simon Sinek. This argues that if you have a why, a passion, a purpose, you’ll be successful. Again, those are things that we want to be true. We believe in the power of passion.

And he gives the examples of Apple, clearly successful, that’s a fact. Wikipedia, clearly successful, it’s the world’s fount of knowledge. The Wright Brothers, clearly successful, they achieved the first powered flight. But those are just cherry-picked examples. There could be hundreds of other companies that started with a “why” and then failed, but Simon Sinek will never tell you about them because they don’t support his theory.

So, even if the facts are correct, they might only be a small part of the picture. They’re not giving you the full picture and, therefore, they’re misleading. They’re not data.

Pete Mockaitis
Intriguing.

Alex Edmans

So, you might think the solution is to get the full picture, because a fact is not data; it may not be representative.

Pete Mockaitis

So, when you say a fact is not data, I mean, I suppose, not to mince words here, a fact could technically be data, but it’s incomplete, non-representative data. So, I guess an isolated fact is not the whole relevant universe dataset. That’s not as pithy though, Alex.

Alex Edmans

Correct, yeah. No, but you’re absolutely right. You could say it, technically, counts as data, but it’s selected data, so what you want is a full representative sample, a representative data sample, rather than just something cherry-picked and selected.

Pete Mockaitis

Well, Alex, is starting with “why” not a good move? Is that not a research-backed approach to success?

Alex Edmans

It doesn’t seem to be research-backed. So, there were a couple of companies which have been successful, but actually Apple never even started with “why.” So, if you look at Simon Sinek’s book, it says Apple had this “why” which was “Everything we do, we believe in challenging the status quo” but Apple never said that. And, again, this is something that I wanted to look at in the research for my book, as I thought that was a fact but it was never said, it was never in any of Apple’s documentation.

And also, Simon Sinek says, “Well, people don’t buy what Apple does. They buy why they do it. They buy the iPhone because they believe in Apple’s wanting to change the status quo.” Really? Don’t we buy Apple because of its functionality, its apps, its usability, the fact that it’s got great after-sales service? Do people really think about the higher purpose of Apple when they buy the products? No. What they go for is how useful it is. But the idea that a “why” is what leads to success, that’s empowering. Why? Because anybody can come up with a “why” if they have enough brainstorming sessions and flip charts, and that is a nicer message to give than “you need to produce an awesome product.”

Not everybody can produce an awesome product or be really innovative, and that’s why that book and that message have been so successful: they’re empowering. They tell us that the secret to success is in our own hands, and that it’s something easy to do rather than something much more difficult, the hard work of designing a really good product with great functionality.

Pete Mockaitis

That’s intriguing. And then as we’re thinking critically about these assertions, it seems like a lot of times the conclusions are more nuanced. Like, “Having a clear ‘why’ can result in increased motivation that boosts results. However, having a great ‘why’ is by no means a proven success principle that we can hang our hat on as something that will undoubtedly, massively increase our odds of victory.”

Alex Edmans

You’re absolutely right, Pete. So, what causes success? There are lots and lots of factors which contribute to the success of a person or a company, and there’s also luck which comes into it. But a book is never going to lay out all the different things that a company or a person needs to do to become successful. Books typically have one idea. And I know this from having tried to publish books: whenever you have a pitch, they say, “What is the big idea? Not the 10 ideas, what is the one idea in the book?”

And so, this is why a lot of books try to highlight one thing as the secret to success. This could be starting with “why,” or it could be grit, to take Angela Duckworth’s book, or it could be 10,000 hours, to take Malcolm Gladwell. They focus on one particular thing and say that’s what drives success, when it might not. There might be lots of other factors driving success. And even if your one factor works, it might not work in every circumstance. It might work only when combined with a lot of other stuff.

So, maybe a “why” does matter, and indeed some of my work is on the benefit of purpose, but it also needs to be combined with flawless execution, with discipline, and with knowing which “why” projects to turn down because, no matter how purposeful they are, maybe they’re pie in the sky. But those messages are much more nuanced. Instead, the simple black-and-white message, which plays into black-and-white thinking, that “why” will always lead to success in every situation, that’s something which sells, and this is why a lot of books with that message have been very successful.

Pete Mockaitis

Okay. So, we got a statement is not a fact, a fact is not data or the whole dataset. And next, we got data is not evidence.

Alex Edmans

Correct. So, when you get the whole dataset, you might think, “Ah, this is the solution. Let’s get the whole dataset. Companies that started with ‘why’ and failed, and companies that succeeded even though they didn’t start with ‘why,’ and once we have the whole dataset, can we not then just draw a conclusion from that?” And the answer is no. Why? Because of the third misstep: data is not evidence, it may not be conclusive.

So, what do I mean by evidence? People use the terms data and evidence interchangeably, but for the word evidence, let’s think about a criminal trial, because that’s where we often hear that word. And evidence is only evidence if it points to one particular suspect. So, if the evidence suggests that Tom or Dick or Harry could have killed Emma, that is not evidence, because it’s consistent with multiple suspects. And the problem with lots of datasets is that, even though they look at the full picture, they could point to multiple conclusions.

So, if I go back full circle to the breastfeeding example at the start, “Breastfed kids have better outcomes than bottle-fed kids,” is it that breastfeeding causes the higher IQ, or is it that parental background leads some parents to breastfeed, and that parental background also leads to the higher IQ, so that it’s a correlation without causation? And, yes, everybody knows, in the cold light of day, that correlation is not causation, but often we forget this if we like the story being paraded. Due to our confirmation bias, we switch off our discernment and just don’t ask that question if we like the conclusion.

Pete Mockaitis

Intriguing. So, data is not evidence. In the incidence of a crime, we might have data in terms of “The window was shattered.” It’s like, “Okay.” “The window was shattered with a hammer.” “Okay, so that’s some information that we know, yep, that window was shattered with a hammer.” But it’s not evidence because any number of people could have done that window-shattering with a hammer. So, these are just kind of facts that we’ve collected as opposed to things that are really strongly pointing in a particular direction.

Alex Edmans

That’s entirely correct. But if you’re a police officer and you already have a particular suspect in mind, you might interpret all of these facts as consistent with your suspect, even if there were alternative suspects. So, then, what’s the practical tip for the listener or the reader? How do we know that we have the correct interpretation of the data and are not being blind to alternative explanations? It’s to imagine that the data had the opposite result.

So, let’s assume that the data have the result that we don’t like. So, let’s say the data found that breastfed kids perform worse. Now, that goes against our biases because we think that something natural should have a good outcome. So, then we would try to appeal to alternative explanations, or alternative suspects. We might say, “Well, maybe the women who can afford formula are wealthier. They can afford to buy it, and maybe it’s their wealth which leads to the better outcomes of bottle-fed kids.”

So, now that we’ve pointed to the fact that there’s an alternative suspect, which is parental wealth, we have to ask ourselves, “Does that alternative suspect still apply when the result is in our direction?” And the answer is yes: it could well be that wealthier parents are able to breastfeed because, while it’s so exhausting, they can afford home help, and maybe it is that income which is also behind the higher IQ and other outcomes.

And so, why is the idea of imagining the opposite so powerful? It’s because it unlocks the discernment which is already naturally within us. So, when we hear about misinformation, we might think, “Oh, this is so difficult for me to tackle. I’m a time-pressed, busy person. I don’t have time to dig into the weeds of a study, and I don’t have a PhD in statistics.” But what I’m trying to highlight is that we already have discernment.

Whenever I see a study posted on LinkedIn that people don’t like the findings of, there’s no shortage of reasons as to why it’s correlation but not causation, or why the dataset is not complete. So, the idea of imagining the opposite is to trigger that same discernment when you find a study you do like and are tempted to just lap up.

Pete Mockaitis

That’s good. Okay. And then the fourth and final step, evidence is not proof. Lay it on us.

Alex Edmans

Yeah, absolutely. So, let’s say you found a perfect study which has perfect causation, that is evidence, but it’s not proof. So, what’s the difference? A proof is universal. So, when Archimedes proved that the area of a circle is pi times the square of the radius, that was not only true in the 3rd century BC in ancient Greece, it’s true in 2024 around the world. But evidence is only evidence in the setting in which it was gathered. So, if the evidence pointed to Tom killing Emma, and Tom was the husband, this doesn’t mean that in every case when a woman dies, it’s always the husband that did it. So, evidence has a particular setting.

So, I go to the 10,000 hours rule. Malcolm Gladwell claims that in any setting, from chess playing to neurosurgery, you need to put in 10,000 hours to be successful. But the evidence he cited was just on violin playing, and what leads to success in violin playing might be quite different to what leads to success in neurosurgery. Violin playing, this is a very predictable environment. You play the sheet music. You can practice that same sheet music 10,000 times.

Whereas, with neurosurgery, one surgery might be very different from another, there’s lots of other factors going on. So, what works in one setting might not work in others. But if you want to sell a bestselling book, you want to say that you’ve identified the secret to success in every situation. Had Malcolm Gladwell claimed the 10,000 hours rule for success in violin playing, he would have not had the same impact that he did.

Pete Mockaitis

That’s right, much smaller audience, the violin. Yes, ambitious violin virtuosos in training are a much smaller market than the broader group of customers for that book.

Alex Edmans

Absolutely. So, we want to claim a theory of everything, a secret to success in all situations, and so the broader we make the claim, the more impact we’ll have, but often these claims are over-extrapolating from evidence gathered in one specific targeted setting.

Pete Mockaitis

Okay. Well, so then, to recap, we’ve got a statement is not a fact, a fact is not data, or the whole dataset, data is not evidence, and evidence is not proof. Well, Alex, it would seem to follow, then, that not a lot of things are actually proven, based on all the ways this could fall apart. Is that fair to say?

Alex Edmans

That’s absolutely fair to say. And while we might think this is really shocking, because it means we don’t know anything, it actually means that we can live our lives in a more relaxed way, because often things are said to us as if they’re definitive proof: “You are a bad mother if you don’t breastfeed your kid,” “If you want to lose weight, you should never eat any carbs,” “If you want to train for a marathon, you should never drink any alcohol.” Often the reality is much less black and white than these prescriptive statements say.

So, being discerning with evidence, rather than being exhausting because we need to question everything, is actually less exhausting, because if we question stuff, we realize that some of these dictums and rules we’re given are not as well-founded as people claim, and this allows us to live a freer and more relaxed life.

Pete Mockaitis

Now, Alex, I’m loving the way your mind is working and processing, and sometimes I go here in terms of, you know, being curious and skeptical and exploring, “Well, hey, could it be this or could it be that, and maybe it’s not fair to interpret this or that way?” Alex, do you find that when you do this in practice with teammates, colleagues, collaborators, they just get annoyed with you? Like, “Oh, my gosh, Alex, you’re slowing us down. You’re making this much harder and longer than it needs to be.” How do we deal with some of these interpersonal dynamics when we’re vigorously pursuing truth?

Alex Edmans

Yeah, thanks for the question, Pete. And I think people can sometimes get annoyed if you’re doing it in the wrong way. So, what do I mean by the wrong way? Sometimes, if you oppose an idea based on the evidence, people think you have a different goal from them when, in fact, it’s only your approach that’s different. So, let’s give an example.

So, some of my work is in diversity, equity and inclusion, and I would love the evidence to be overwhelming, that diversity pays off. I’m an ethnic minority myself but I point out that actually some of the research on this claiming that DEI improves financial performance is much flimsier than often claimed. So, people can get annoyed and say, “Oh, you must be racist or sexist if you’re anti-DEI.”

Pete Mockaitis
Oh, wow, that’s hardcore.

Alex Edmans

Well, that is pretty hardcore, and that’s why, on issues like these where there’s strong confirmation bias, it is sometimes hard to speak out. But what I’m saying here is that I absolutely am pro-DEI; my concern is the evidence, and the evidence here on DEI might not be as strong. Why? Because all these studies look at is gender and ethnicity. They whittle down the complexity, the totality, of a person to just their gender and ethnicity.

That gives the impression that if you’re a white male, you can never add to diversity, even if you’re the first in your family ever to go to university, even if your background is humanities rather than sciences, which is what everybody else in your company is doing. So, what I’m saying is that the problems with these diversity studies don’t mean that diversity is a bad thing, but if we are to put in a DEI policy, it needs to go beyond gender and ethnicity and look at socio-economic diversity and diversity of thinking, and at not just diversity but also equity and inclusion.

So, by saying, “Hey, I’m not trying to debunk the whole DEI movement,” but rather that if we want to implement DEI, it has to be broader than the rather reductive measures analyzed by these studies, hopefully the message comes across as more positive, rather than being seen as nitpicking and getting in the way of people.

Pete Mockaitis

Oh, that’s helpful, Alex. You’re sort of sharing where you’re coming from, the context, your goals, and what’s going on there. And, well, while we’re here, a brief detour. Alex, my understanding of the DEI research is that in jobs that require creativity and kind of novel thinking and approaches, the research is pretty robust that having diversity in these contexts, sure enough, does result in more and better ideas and good outcomes. Since you know, and I don’t, is that an accurate snapshot of the state of DEI research?

Alex Edmans

That is the claim, but even that claim is not particularly backed up by data. So, let’s take one famous datapoint or one famous study. This is the TED Talk which initially was called “Want to be more innovative? Hire more women.” So, what this argued is that in an innovative setting, the more women we have, the better the performance is.

But why was the evidence flawed? Well, number one, the measure of diversity looked at six different dimensions of diversity, not just gender diversity but age diversity and lots of other forms of diversity. So, even if the results were correct, it could have been any of those diversity metrics, but they homed in on gender diversity because that’s the one which gets a lot of popular support.

Pete Mockaitis

It’s a good title, Alex.

Alex Edmans

Well, it was a good title, but a misleading one. There were so many complaints to TED about that title that they were forced to change it. So, the title of that talk is now “How diversity makes teams more innovative.” But even that title isn’t accurate, because how did they measure innovation? What they looked at was the percentage of revenues generated by products invented in the last three years. And that’s not necessarily a measure of innovation. It could just be a measure of the obsolescence of your prior products, so maybe you’re just doing a bad job of maintaining your prior products.

Pete Mockaitis

Yeah, your old products sucked.

Alex Edmans

It might be, yeah, they all just suck. You’re just not able to maintain them. And it could be that the new products you’ve developed in the last three years are just incremental changes over what you had previously. There’s nothing there which captures the magnitude of innovation. And, also, number three, it could be correlation but not causation. It could be that a great CEO both hires more diverse workers and is more innovative, so it’s not necessarily that diversity causes innovation; something else causes both.

So, those are really basic errors. You measure diversity incorrectly, you measure innovation incorrectly, and also there could be no clear link between the two, but because that’s a nice message that people want to hear, this is something which has been well paraded. So, again, if I go back to, “How do I then approach this?”

Well, my goal is the same as everybody else’s: I want high-functioning organizations, and I’m a supporter of diversity. So, the reason I’m raising objections is not that I’m anti-DEI, but that my approach is to look beyond just gender and ethnicity at these other forms of diversity. And when you look at more careful research, then you’re right, Pete, in innovative settings these broader measures of diversity, such as socio-economic and cognitive diversity, do lead to better outcomes.

Pete Mockaitis
All right. Well, Alex, what you’re showing here is that proof, and even evidence, is hard to come by. But lay it on us, some of your favorite tactics and strategies for smarter thinking. I love that notion of the alternative suspect. Let’s pretend the data came out the opposite way. What would we conclude? Or where would we be pointed in terms of suspects? Well, now, how does that inform how it did come out? So that’s a lovely approach. Can you lay on us a few more tools like that?

Alex Edmans

Absolutely. And to do this, I’m going to go beyond just analyzing specific studies, because to think smarter you want to get different viewpoints more generally, not just from studies. So, in an organization, where will these different viewpoints come from? From our colleagues. But often, we have an environment in which people might be unwilling to speak out. So, what can we do to actively encourage dissenting viewpoints?

So, there was a time when Alfred Sloan was running GM, and he concluded a meeting by saying, “Does everybody agree with this course of action?” And everybody nodded, and then Sloan said, “Well, then we’re going to postpone the decision until the next meeting to give you the opportunity to disagree with me.” So, he recognized that no decision, no course of action that he came up with was going to be 100% perfect. So, if there were no objections, it was not because his proposal was flawless, but simply because people didn’t have time to come up with objections.

And, more generally, what can we do within an organization to encourage dissent, to encourage people to speak up? Again, going back to diversity, people think a lot about just demographic diversity, but it’s not sufficient to bring in a mix of people. We need to make sure that they feel safe to speak up. One example could be a meeting where you propose a strategy, and most people agree, and then one person, let’s call him David, speaks up and says, “Hey, I actually have some concerns with this strategy ABC.”

Now, despite David raising the concerns, you still go ahead. Then the chair of the meeting, at the end, goes to David privately and says, “You know, I really appreciate you speaking up. Even though we ended up going with the strategy, we will take all of your concerns into account.” So why is that useful? Because in the absence of that, David might have felt, just like the question you asked me earlier, Pete, “It’s costly for me to raise a dissenting opinion. People might have seen me as being annoying, and maybe the next time I have some concerns, I’m not going to speak up and say anything, because it made no difference anyway and I just annoyed a lot of people.”

But here, if the chair just takes five minutes to say, “No, we really value in this organization people who come up with dissenting opinions,” then maybe next time, the equivalent of the Deepwater Horizon disaster would have been avoided, because then somebody like David would have said, “Hey, we have failed this negative pressure test three times. We need to take seriously the possibility that this rig is unsafe to be removed.”

Pete Mockaitis

Beautiful. Any other top strategies?

Alex Edmans

Yeah, so, in addition to this, one thing that you can do is try to assign a devil’s advocate in particular situations, which is somebody to critique a particular course of action. So, this happens in academia, my field. So, whenever a paper is presented at a conference, after the presentation, a discussant comes and comments on this. And the discussant is somebody who’s assigned to read the paper in advance and to come up with critiques, particular blind spots that the author might have.

And so, the analogy in a business setting might be an investment management firm where a team is proposing a particular deal, and somebody is assigned to poke holes, to scrutinize the deal and highlight all the things that can go wrong. Now, ideally, you might have a devil’s advocate emerge anyway; the culture might be such that people are willing to share their concerns. But if you’re not at that stage, if the culture is still developing, then just assigning somebody to find flaws is a way of getting different viewpoints.

So, this is something that John F. Kennedy came up with when faced with the Cuban missile crisis. The immediate response to seeing these missiles being installed in Cuba was to bomb the missile sites and launch a full-scale invasion, but he created an executive committee of the National Security Council with two teams, one proposing the invasion, another proposing the blockade, and each team critiquing the other’s proposed course of action, so that he was able to see both sides of this difficult situation.

Pete Mockaitis

Lovely. Well, tell me, Alex, anything else you want to make sure to mention before we shift gears and hear about some of your favorite things?

Alex Edmans

It’s just to highlight that this misinformation issue is really important. So, you might think, “Why do I need to listen to an academic who goes through life reading and scrutinizing academic papers? In my job, I never read a single academic paper.” But what I’m trying to highlight is that whenever we make decisions, they are ultimately based on research. So, if we choose to breastfeed or bottle-feed our child, we are doing this on the basis of research.

When we’re trying to invest in a sustainable way or implement particular DEI policies, those are ultimately based on research, and so it really matters whether we use the best research, and to discern whether the research is best, we don’t need to be a scientist. We don’t need to scrutinize every footnote in a paper. We just need to ask simple, common-sense questions.

Pete Mockaitis

All right. Well, now could you share a favorite quote, something you find inspiring?

Alex Edmans

Yeah, so it’s from a Columbia finance professor called Laurie Hodrick. She was asked in the Financial Times, “What is your greatest lesson learned?” And she said, “You can do everything you want to and be everything you want to be, but not all at once.” So why do I like this? It’s that there are loads of things that we want to do in our lives, and lots of people end up spread really thinly, doing so many things that they just get burnt out. Instead, we can have different chapters to our career.

Pete Mockaitis

Okay. And could you share with us a favorite book?

Alex Edmans

Yeah, so The 7 Habits of Highly Effective People by Stephen Covey was something that I was given as a teenager. I didn’t read it because, as a teenager, I was busy doing other stuff. But then I read it about ten years later, and I wished that I had read it back then. There are some new books which try to play theme and variations on this, books like Atomic Habits or Deep Work, and they’re not bad books, but I think the original authority on questions such as time management and discipline and focus was the Stephen Covey book.

Pete Mockaitis

And do you have a favorite habit? Perhaps one of the seven or something homegrown?

Alex Edmans

Yeah, so it’s to try to just immerse myself without any distraction, to engage in deep work. So, there will be certain days where I will have zero meetings the whole day. That was the case yesterday, I had no meetings, and there are no meetings at all tomorrow, so that I can get really immersed in something. I’ll try to work without my phone near me. I’ll try to have my internet blocker on so that I’m not distracted by email, so that when I am doing some writing, which I’m going to do tomorrow, I can do this and be in full flow.

Pete Mockaitis

All right. And is there a key nugget you share that really seems to connect and resonate with folks; they quote it back to you often?

Alex Edmans

I think it might be from my first book, which was on purposeful business, and actually the TEDx talk that that book was linked to: to reach the land of profit, follow the road of purpose. And so, why is that a quoted phrase? It’s that often, when people think about purpose, they claim it’s about being woke and saving the dolphins and saving the coral reefs, and that a serious business person should not care about this.

I’m trying to highlight that a purposeful business is not just one that is good for wider society; it’s good for the ultimate long-term success of the company as well. And so, this idea that there’s a business case for purpose, a commercial and financial case, not just a moral and ethical case, is something that resonates with people, particularly those who would otherwise be skeptical of purpose.

Pete Mockaitis

All right. And if folks want to learn more or get in touch, where would you point them?

Alex Edmans

So, my website, AlexEdmans.com, where Edmans is E-D-M-A-N-S. I’m on social media, LinkedIn and X, as @aedmans. And my new book May Contain Lies has a website attached to it, MayContainLies.com, where, if there are instances of misinformation that I learn about after finishing the book, I do simple blog posts on them.

Pete Mockaitis

All right. And do you have a final challenge or call to action for folks looking to be awesome at their jobs?

Alex Edmans

I’ll say, just question stuff. So, if you want something to be true, apply this idea of imagining the opposite and think about how you would shoot it down. I think it’s really important to try to be discerning and to try to overcome our biases. These biases are so strong, and they are things that I myself have suffered from, but if we can overcome them, we will significantly improve our performance at our jobs.

Pete Mockaitis

All right. Alex, thank you. This has been a lot of fun and I wish you much truth in your future.

Alex Edmans

Thanks so much, Pete. Really enjoyed the interview. Thank you so much for having me on.
