
Editorial Reviews
4.9 out of 5
97.50% of customers are satisfied
5.0 out of 5 stars How to Beat the Overconfidence Effect in Yourself and Others
In 1933, the philosopher Bertrand Russell wrote that "the fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt." While this is just as true today as it was in the early twentieth century, the problem actually runs deeper: almost everyone recognizes arrogance and overconfidence in others—but never in themselves.

Since the time of Russell, what's become known as the Dunning-Kruger Effect has been experimentally validated. Research shows—and personal experience confirms—that those who are the least knowledgeable in a subject tend to be the ones who overestimate their own knowledge and abilities, while those who are full of doubt know enough about the topic to better gauge the extent of their ignorance. And so the telltale sign of a lack of knowledge is, paradoxically, arrogance and overconfidence, whereas in those with actual expertise you often see the opposite: humility, doubt, and open-mindedness.

Far more people fall on the side of overconfidence. This is due, at least in part, to widespread access to the internet, where people can quickly read articles and watch videos (of varying quality and credibility) on any conceivable topic, creating the impression of deep knowledge where only a very superficial understanding has been gained.

Overcoming this unfortunate state of affairs is the subject of organizational psychologist Adam Grant's latest book, Think Again, which seeks to show us how to overcome our own unjustified overconfidence by developing the habits of mind that force us to challenge our beliefs and, when necessary, to change them.

Grant begins by telling us that when we think and talk, we often slip into the mindset of three distinct professions: preachers, prosecutors, and politicians. We become preachers when the unwarranted strength of our convictions compels us to convert others to our way of thinking; prosecutors when our sole aim is to discredit the beliefs of others; and politicians when we seek to win favor from our chosen constituency.

What all of these mindsets have in common is the assumption that our beliefs are infallible and that no one could possibly have anything to teach us. Trapped in the prison cell of our own dogma, we don't set out to learn anything or update our beliefs; our job is simply to convert others to our way of thinking because, of course, we are right.

These habits of mental imprisonment can take hold at any level of knowledge or experience, and intelligence itself has at times been shown to be a disadvantage, as those with high IQs can have the most difficulty updating their beliefs. As Dunning himself said, "The first rule of the Dunning-Kruger club is you don't know you're a member of the Dunning-Kruger club." You may think all of your beliefs are correct (otherwise you wouldn't hold them), but there is little doubt that at least some (probably many) of them are false or oversimplified.
If your mind remains closed, you'll never discover which of these beliefs require updating.

The key question, then, is this: if most of us are unaware of the extent of our own ignorance, how can we hope to overcome our resistance to change?

The first step, as Grant recommends, is to detach your sense of self from any specific beliefs. If you identify with a fixed set of core beliefs, you will be far less likely to change your mind in the face of new evidence or better reasoning. Grant recommends instead grounding your sense of self in mental flexibility, taking pride in the fact that you're willing to change your mind and update your beliefs. To achieve this, you must treat all of your beliefs as provisional hypotheses and then seek to disprove them, in the process becoming more knowledgeable by being wrong more often. With this approach, you will have discovered the ideal mindset for personal development and learning—not the mindset of a preacher, prosecutor, or politician, but the mindset of a scientist.

The scientist, Grant tells us, has one overarching concern: the truth. The individual who adopts a scientific mindset will be equally motivated to challenge their own beliefs and the beliefs of others, testing hypotheses against the evidence and continually updating their beliefs in the process.

Of course, as Grant points out, being an actual practicing scientist does not guarantee the adoption of this mindset. There are plenty of dogmatic scientists who don't abide by the principles of their own training. The scientific mindset, as Grant describes it, is not necessarily the mindset adopted by scientists, but rather the ideal mindset that follows the principles of science as an open-ended pursuit of knowledge, constantly updated in the face of new evidence.

In one interesting study described by Grant (the book is filled with fascinating examples and studies of this sort), two groups of entrepreneurs were given training. One group was taught the principles of scientific thinking while the control group was not. The researchers found that the scientific-thinking group "brought in revenue twice as fast—and attracted customers sooner, too." As Grant wrote: "The entrepreneurs in the control group tended to stay wedded to their original strategies and products. It was too easy to preach the virtues of their past decisions, prosecute the vices of alternative positions, and politick by catering to advisers who favored the existing direction. The entrepreneurs who had been taught to think like scientists, in contrast, pivoted more than twice as often."

Individuals who embrace the prospect of being wrong—and so expand their knowledge more often—tend to be more successful and to hold more accurate, nuanced beliefs. It's not that they lack confidence; it's that their confidence is of a different nature. Flexible-minded individuals have confidence in their ability to learn, and to unlearn beliefs that are outdated or no longer serving them well. Their confidence lies in their ability to change and adapt rather than in the strength of their convictions concerning any single set of beliefs. As the Nobel Prize-winning psychologist Daniel Kahneman put it, "Being wrong is the only way I feel sure I've learned anything."

There is definitely a line to walk here, and the reader may wonder just how far to take this advice. To constantly question every one of your beliefs would result in paralyzing doubt.
Sometimes it is the strength of our convictions that gives us the energy and perseverance to pursue and accomplish our goals. So this is surely a balancing act, and while we all have to find the sweet spot between timidity and arrogance, conviction and doubt, there is little question that too many of us tend toward the extreme of overconfidence.

After showing us how to become better rethinkers ourselves, in the second part of the book we learn how to open other people's minds. Grant shows us how world-class debaters win debates, how a black musician talked white supremacists out of their bigoted views, and how doctors persuaded anti-vaxxers to get their children immunized. In every case, we learn the same lesson in the art of persuasion: to change someone else's mind, you have to help them find their own internal motivation to change.

This is not easy. The mindsets we typically slip into tend to have the opposite effect. Act as a preacher, and people will resist being told what to think (even if the facts are on your side). Act as a prosecutor, and people will resent your condescension and become further entrenched in their original views. Act as a politician, and you're just saying what you think people want to hear. None of these approaches is effective as a tool of persuasion. It turns out that your best bet is to adopt, once again, the mindset of a scientist—and to try to get others to do the same. This transforms disagreements from battles to be won and lost into a collaborative pursuit of the truth.

The most skilled negotiators, debaters, and persuaders all use similar tactics: they first find common ground and points of agreement, ask more questions to get the other person thinking more deeply, present a limited number of stronger points, and introduce complexity into the topic to move the other person's thinking away from black and white and into shades of gray.

Complexifying the issue is key. Most people exhibit what psychologists call binary bias, the "basic human tendency to seek clarity and closure by simplifying a complex continuum into two categories." If you can show people—through skillful questioning—that a topic they think they understand deeply (the Dunning-Kruger Effect at work) is actually far more complex than they originally thought, with more than two distinct positions, then you can plant the seeds of doubt that eventually lead to real change.

One example Grant uses is climate change. We tend to think that people fall into one of two categories—climate deniers or alarmists—when in fact there are six distinct positions people can take, ranging from dismissive, doubtful, and disengaged to cautious, concerned, and alarmed—with shades of nuance in between. It's often the recognition of this complexity that gets people talking and engaged in productive debate.

In the final part of the book, Grant shows us how to use the skills of rethinking to engage in more productive political debates, to become better teachers, and to create more innovative cultures at work. Grant provides a host of compelling examples, but my favorite is the middle-school history teacher who gets her students to think like scientists by having them rewrite textbook chapters that failed to cover important historical events in sufficient depth. Her students pick a time period and topic that interests them and then, through independent research, rewrite the textbook chapter, in the process cultivating the habit of always questioning what they read.
This is a far better approach than simply delivering a lecture and forcing students to regurgitate the information on a test.

Bertrand Russell was once asked in an interview whether he was willing to die for any of his beliefs. His response: "Of course not. After all, I may be wrong."

It's a shame that most people adopt the opposite attitude, and Grant's latest book will go a long way toward remedying this. Think Again is a timely exploration of the importance of humility and of the capacity to rethink your own positions while helping others do the same.

But in the spirit of the book—and to "complexify" the topic—it's worth considering when displaying doubt and humility might actually backfire. Grant wonders this himself, pointing out, for example, that displays of doubt and humility have been shown to have negative effects in the workplace for those who have not already established their competence. They can also be less effective when delivering a presentation to an already sympathetic audience. Does Grant downplay the frequency of these types of situations?

Another area where excessive doubt and humility might backfire is one that Grant fails to consider in much depth at all: arguing with bad-faith actors. When discussing politics, Grant seems to assume that in most cases both sides are equally motivated by the truth, and that each side has simply failed to understand the complexity of the topic or the merits of the other side. But we know that this is not always the case. In politics, people argue from a host of motives that sometimes have very little to do with the truth: the desire for power, money, or influence, and sometimes simply the desire to offend and get a rise out of people. Grant does not cover how to handle these situations—or how to identify them—and it is highly unlikely that the tactics of the book will work in them. Additionally, it seems that the masses respond better to confidence when electing political representatives; we know that Trump was not elected based on his knowledge or competence—and certainly not on his humility.

When dealing with bad-faith actors, perhaps a good strategy is to start with a simple question, one Grant mentions in the book: "What evidence would change your mind?" If the answer is "nothing," then it's probably best to walk away. Either way, a chapter or section on bad-faith actors and the questions you can ask to identify them would have been a welcome addition to the book.

But of course, this book is not the final word on the topic, and Grant wouldn't want it to be. As we gain better evidence and more experience, it's our responsibility to continually rethink and update our beliefs. As Russell said, "If you're certain of anything, you're certainly wrong, because nothing deserves absolute certainty."
5.0 out of 5 stars For real we all need to think again.
I read this book because of a work challenge and loved it because it's actually really, really good. I have recommended it to so many of my friends. Questioning what we think we already know is a bit scary, but when we do, it opens up so many more possibilities. This is one of those books that I will end up re-reading every couple of years to refresh. I highlighted my Kindle copy, but I may just need to purchase a hard copy so I can highlight and take notes.
4.0 out of 5 stars Think Again: The Power of Knowing What You Don't Know, by Adam Grant
The author, Adam Grant, is a professor of organizational psychology at Wharton, with a special interest in evidence-based management.

When we think of smart people, we usually understand them to be able to deal with complex problems quickly. It is common to presume that if a person has to rethink and unlearn what they know, it is because they aren't that smart and didn't think well enough in the first place. The thrust of this book is the demonstration that two cognitive skills matter more than any others: the ability to rethink and the ability to unlearn.

Consider this: you have just completed a multiple-choice test, and you have enough time left to review your work. When you come across an answer that you are not sure is correct, would you change it or leave it? (Pause for your instinctive answer.) Research indicates that three-quarters of people feel it will hurt their score to change, and so they stick to their first answer. Research also shows that most of them would have been right to change; only a quarter would have been wrong to change the answer they selected. This is called the 'first instinct fallacy.'

People seem quite willing to change many parts of their lives, such as their wardrobe or kitchen. However, we are unwilling to change deeply held knowledge or opinions, because doing so threatens our identity, our understanding of who we are: I am a capitalist, I am a member of this faith, I only use alternative medicine, and so on. We are inclined to hold on to beliefs for the comfort of conviction rather than face the discomfort of doubt.

Grant was part of Harvard's first online social network. It connected freshmen before university started, and one in eight of the large intake participated. Once university began, the students abandoned the network and it was shut down. The lesson everyone drew was that online tools connect people who are far away, not those who live within walking distance of each other. Five years later, Mark Zuckerberg started Facebook on the same campus. This experience caused "rethinking to become central to my sense of self," Grant explains.

How does rethinking happen? People of 'super smart' or 'regular' intelligence alike have all the tools they need for rethinking; the challenge is remembering to use them. If one needs any incentive to take this valuable skill to heart today, here are some medical statistics: in 1950 it took 50 years for medical knowledge to double; by 1980 it was doubling every 7 years, and by 2010, every 3.5 years. Clearly medicine is not the only field growing at this rate.

Philip Tetlock (author of 'Superforecasting', reviewed in this column) has a useful description of the mindsets we tend to slip into to avoid rethinking our ideas.

The first is the 'Preacher': when our sacred beliefs are in jeopardy, we deliver sermons to protect and promote our ideals. Changing our minds would be a mark of moral weakness.

The second is the 'Prosecutor': we recognize the flaws in the other person's position and marshal arguments to prove them wrong and win our case. By prosecuting others who are wrong, we ensure we are not persuaded and never have to admit defeat.

The third is the 'Politician': the outcome we desire is winning over an audience, and we will change position in response to whatever is more popular.

The correct and most valuable mindset is that of the 'Scientist', because it is a sign of intellectual integrity: the scientist's mindset shifts when shown sharper logic and stronger data.
It doesn't see learning as a way to affirm our beliefs but rather (and this is so important) as a way to evolve our beliefs. I cannot think of any professional activity that would not be enhanced by this stance. This is not capitulation: it is the evolution of your opinion and belief.

It is easy to see the value of the scientific approach in research on startups. Unschooled in the scientific mindset, the control group averaged less than $300 in annual revenues; the group taught scientific thinking averaged more than $12,000.

Grant raises the question of whether mental horsepower guarantees mental dexterity. The unequivocal answer is no. In fact, it has been shown to be a liability.

A study of American presidents was undertaken to identify one trait that could consistently predict presidential greatness, controlling for years in office, wars, and scandals. What emerged was "their intellectual curiosity and openness." All the presidents who contributed significantly to the country were interested in hearing new views and revising their old ones. They may have been 'politicians' by profession, but they solved problems like scientists.

This is just as true in business. In 2004, a group of Apple engineers, designers, and marketers tried to persuade Steve Jobs to adapt the best-selling product of the time, the iPod, into a phone. Jobs was strongly against dealing with mobile data and voice suppliers because of the constraints they imposed on cellphone manufacturers. After six months of discussion, Jobs agreed to develop the iPod so that it could have calling capacity. Four years after it launched, the iPhone accounted for half of Apple's revenue.

In a US-China study of the leadership characteristics of the most productive and innovative teams, it was found that such teams were run neither by merely confident leaders nor by merely humble ones, but by leaders with high levels of both confidence and humility. This combination gives a leader faith in their strengths while keeping them keenly aware of their weaknesses.

Great discoveries don't start with a high five and a shout of "Eureka!" Rather, they start with "That's funny..."

Ray Dalio, founder of the extraordinarily successful hedge fund Bridgewater, remarked: "If you don't look back at yourself and think, 'Wow, how stupid I was a year ago,' then you haven't learned much in the last year."

Reading Grant's book will assist.

Readability: Light --+-- Serious
Insights: High -+--- Low
Practical: High ---+- Low

*Ian Mann of Gateways consults internationally on strategy and implementation, is the author of 'Strategy that Works', and is a public speaker. Views expressed are his own.
Loved it
Loved the book, another hit by Adam. Makes you humble and eager to learn more.
best book ever
It is an amazing book and concept, and it is really well written. The art of questioning what we already know is intriguing and also exciting.
A must-read!
Opens up your mind
Love how this book makes us aware of our own self-serving bias. If you are planning to find out how to lead better, interact better, or just learn better, then this is the book for you.
Excellent
Excellent for relearning, among other things.
Think Again: The Power of Knowing What You Don't Know
BHD9851