Irrationality (Audible Audio Edition): Stuart Sutherland, Kris Dyer, Audible Studios

Why do doctors, generals, civil servants and others consistently make wrong decisions that cause enormous harm to others? Irrational beliefs and behaviours are virtually universal.

In this iconoclastic book Stuart Sutherland analyses the causes of irrationality, the different kinds of irrationality, the damage it does us, and the possible cures.



This is a reissue of a 1992 book, and there are now many books available on cognitive biases. But research in this area since 1992 shows that the book is by no means outdated, and the findings Sutherland reports are generally still valid. So I would say that this book remains one of the better books on cognitive biases, because it has an extensive scope and summarizes many studies and their conclusions, all in a style which is quite readable and sometimes entertaining. I can see how some 'sensitive' readers might get rubbed the wrong way by Sutherland's occasional bluntness, but I like the way he gets to the point and doesn't mince words.

Because the book covers its scope well and I can find no real fault with it, I see no reason to give it less than 5 stars. Here are some key points from the book:

(1) 'Irrationality' involves cognitive processes which are prone to result in inaccurate conclusions or suboptimal decisions relative to available evidence and time constraints. So irrationality will often lead to worse outcomes, but not always ('luck' is a factor also).

(2) The prevalence of irrationality may be due to our cognitive processes having evolved for physical and social environments which are different from modern life. So what appears as a problematic 'bias' today may have been adaptive in the distant past. But we also need to remember that heuristic shortcuts, though prone to bias and inaccuracy, can also be very effective in many situations, and in the real world we don't have the luxury of thoroughly analyzing everything (rationality is bounded).

(3) Emotions (which can lead to self-deception and denial), stress, rewards, punishments, and even boredom can contribute to irrationality, but the issue is mostly cognitive.

(4) As situations become more complex, we tend to become more irrational. So taking a more systematic approach to making decisions can be helpful as complexity increases (eg, listing pros and cons of options).
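
Sutherland's suggestion lends itself to a simple weighted-scoring sketch. The criteria, weights, and options below are hypothetical, purely to illustrate the mechanics:

```python
# Hypothetical weighted pros/cons scoring: weight each criterion by
# importance, score each option per criterion, pick the highest total.

criteria = {"cost": 0.5, "quality": 0.3, "convenience": 0.2}  # weights sum to 1

options = {
    "option_a": {"cost": 4, "quality": 3, "convenience": 5},
    "option_b": {"cost": 2, "quality": 5, "convenience": 3},
}

for name, scores in options.items():
    total = sum(weight * scores[criterion] for criterion, weight in criteria.items())
    print(f"{name}: {total:.2f}")  # prints option_a: 3.90, then option_b: 3.10
```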

(5) A good grasp of basic statistics can significantly reduce irrationality.

(6) Many cognitive biases are tied to availability bias, which involves latching on to what comes to mind readily. Availability is tied to strong emotions, recency, first impressions, etc. Anecdotes feed availability bias.

(7) The halo effect is when people overgeneralize and infer too much about a person from a few traits, and it can undermine the effectiveness of interviews. It's generally better to assume that people are a mix of strengths and weaknesses.

(8) Deference to 'experts' and other authority figures, and conformity with groups (eg, peer pressure), are natural instincts, and they also provide people with 'cover', with the result that they can greatly compromise group effectiveness by promoting groupthink (homogenization of group cognition), increasing risk-taking by the group, and even promoting violent group behavior. There can be a vicious cycle here, where group dynamics make groups become increasingly extreme rather than more moderate, and increasingly hostile towards other groups. Contrary to common opinion, sports increase this hostility rather than decreasing it. Groups may also be irrational because of individuals in the group putting their own interest ahead of that of the group in various ways (ingratiating themselves with group leaders, refusing to speak up because it may compromise their position in the group, etc.).

(9) People are often NOT open-minded. We tend to filter and interpret according to what we want to believe, and once someone has a firm belief, presenting them with disconfirming evidence can cause them to rationalize in a way that results in the belief becoming even firmer; we're good at 'making up' explanations for things. This confirmation bias can affect our basic perception and our memory, not just our cognition, and can result in 'illusory correlation', where we falsely perceive a relationship between things. So it's important to sometimes test our beliefs by deliberately trying to falsify them, simultaneously consider multiple competing hypotheses, and consider both positive and negative cases.
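
One concrete way to 'consider both positive and negative cases' is to fill in all four cells of a 2x2 table instead of only the confirming cell. A minimal sketch with invented counts, where focusing on the confirming cell alone would suggest an illusory correlation:

```python
# Suspected association: "symptom predicts disease". Counts are invented.
a = 30  # symptom present, disease present (the confirming cell we notice)
b = 10  # symptom present, disease absent
c = 60  # symptom absent,  disease present
d = 20  # symptom absent,  disease absent

p_disease_given_symptom = a / (a + b)      # 0.75
p_disease_given_no_symptom = c / (c + d)   # 0.75

# Identical probabilities: despite 30 confirming cases, the symptom
# carries no information about the disease.
print(p_disease_given_symptom, p_disease_given_no_symptom)
```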

(10) Even when correlations are real, it's important not to mix up cause and effect, nor to ignore 'common causes' (a third factor which produces the correlation between two effects, with neither effect being a cause of the other). When multiple causes contribute to an effect, we will sometimes pick out the most cognitively available one as the dominant cause.
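
The 'common cause' trap is easy to demonstrate by simulation: below, two variables are each driven by a third and end up strongly correlated even though neither causes the other. The variables and parameters are hypothetical:

```python
import random

random.seed(0)

# Hypothetical common cause: temperature drives both ice-cream sales
# and swimming accidents; neither causes the other.
n = 1000
temp = [random.gauss(20, 5) for _ in range(n)]
sales = [t + random.gauss(0, 2) for t in temp]
accidents = [t + random.gauss(0, 2) for t in temp]

def corr(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

print(corr(sales, accidents))  # strongly positive despite no causal link
```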

(11) Combining implausible and plausible information tends to make the implausible information and its source appear more plausible and credible.

(12) Anecdotes and small samples are often non-representative of overall trends and can result in incorrect inferences. But beware that even large samples can be biased for various reasons (eg, self-selection bias).
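
A quick simulation makes the small-sample point vivid: proportions of heads from a fair coin swing widely at n = 10 but settle near 0.5 at n = 1000. Purely illustrative:

```python
import random

random.seed(1)

# Proportion of heads in a sample of n flips of a fair coin.
def heads_proportion(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

print([heads_proportion(10) for _ in range(5)])    # swings widely, eg 0.2 to 0.8
print([heads_proportion(1000) for _ in range(5)])  # all close to 0.5
```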

(13) We often fail to recognize that many small probabilities can add up to a large probability in cumulative situations (eg, event risk vs lifetime risk), whereas the probability of an event will be reduced to the extent that a concurrence of factors is needed for the event to occur (ie, joint probability).
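
Both halves of this point reduce to one line of arithmetic each; the 1% and 50% figures below are just examples:

```python
# Cumulative risk: a 1% per-event risk, repeated over 100 independent events.
p = 0.01
cumulative = 1 - (1 - p) ** 100
print(f"{cumulative:.2f}")  # ~0.63 -- far larger than the per-event 1%

# Joint probability: an event requiring five independent 50% conditions.
joint = 0.5 ** 5
print(f"{joint:.3f}")  # ~0.031 -- far smaller than any single condition
```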

(14) A 'slippery slope' can be thought of as a process of normalizing deviance.

(15) We generally tend to be more concerned with avoiding losses than making gains. It may be that losses cause more pain than gains cause pleasure. Insurance is an example where we (rationally) take a definite but smaller loss in order to prevent an uncertain but larger loss.
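
The insurance case can be made concrete: the premium normally exceeds the expected loss (that margin is what keeps insurers in business), yet buying can still be rational when the rare loss would be ruinous. The figures below are invented:

```python
# Invented figures: a 0.1% annual chance of a $200,000 loss,
# insurable for a $300 premium.
p_loss = 0.001
loss = 200_000
premium = 300

expected_loss = p_loss * loss
print(expected_loss, premium)  # 200.0 vs 300 -- premium exceeds expected loss

# The buyer trades a definite small loss ($300) for protection against
# an uncertain ruinous one; for a risk-averse agent that is a rational trade.
```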

(16) Continuing in an unpromising direction or refusing to cut losses may be due to sunk-cost bias. The remedy is to focus on the future, rather than the past (other than learning from the past).

(17) Offering a substantial reward for performing a task which is inherently pleasant can backfire by causing that task to become devalued. This has implications for education and employee compensation.

(18) People will tend to dislike something more if it's forced on them rather than freely chosen.

(19) Punishing children, and neglecting them when they cry, are strategies which usually backfire.

(20) We have some bias towards short-term focus, often to the detriment of our long-term interests.

(21) The way questions, situations, and options are framed can dramatically (and irrationally) influence what decisions are made. The 'power of suggestion' exploits this, and the 'anchoring bias' is an example of this.

(22) Overconfidence bias can result in our overestimating our ability to make predictions about the future and overestimating our knowledge and skill in general, as well as the hindsight bias of thinking that we would have made a better prediction or decision in the past than someone else did. Hindsight bias is also driven by inaccurate memories of past events, as well as our knowledge of actual outcomes and our ability to generate causal explanations (which may not be accurate), which can lead us away from considering alternative possible outcomes. Hindsight bias can inhibit our learning from the past and thus allow mistakes to recur. 'Experts' are as prone to these biases as anyone else, and in some cases more prone to them.

(23) We tend to take personal credit for successes, while blaming situations for our failures. But we tend to take the opposite views with the successes and failures of others. And we tend to assume that other people are more similar to us than they actually are.

(24) Bad ergonomic design (eg, instrumentation layout) can contribute to human errors and failures. Designers need to account for the limitations and tendencies of human operators, including how operators may change their behavior in unexpected ways in response to changes in system design.

(25) Risk assessment can be extraordinarily difficult with complex systems. And risks which are insidiously 'spread out' in time and space tend to be taken less seriously than risks which can cause a concentrated loss, even though the former may represent a much higher overall risk.

(26) Our bounded rationality and need to satisfice can lead to irrationality in various ways, such as our neglecting major factors and/or overweighting minor factors when making decisions. On a broader level, we may also spend too little time on major decisions and too much time on minor decisions.

(27) 'Regression to the mean' is the tendency for extreme events to be followed by less extreme ones, rather than comparably or more extreme ones. It's therefore irrational to expect that extreme events will continue in the same extreme direction, or to be surprised when they don't.
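
Regression to the mean falls straight out of a test-retest simulation: each score combines a stable 'ability' with independent noise, and the top scorers on the first test score lower, on average, on the second. Parameters are arbitrary:

```python
import random

random.seed(2)

# score = stable ability + independent noise (arbitrary parameters)
n = 10_000
ability = [random.gauss(100, 10) for _ in range(n)]
test1 = [a + random.gauss(0, 10) for a in ability]
test2 = [a + random.gauss(0, 10) for a in ability]

# The 100 highest scorers on test 1...
top = sorted(range(n), key=lambda i: test1[i], reverse=True)[:100]

mean1 = sum(test1[i] for i in top) / len(top)
mean2 = sum(test2[i] for i in top) / len(top)
print(mean1, mean2)  # the test-2 mean is markedly lower: regression to the mean
```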

(28) Our intuition/judgment is often less accurate than formal analysis, even though we tend to be (over)confident about our intuition/judgment. But the judgment of groups is usually (not always!) better than that of individuals, since errors tend to cancel out in groups. Boredom, fatigue, illness, and situational distractions can all compromise judgment. This issue applies to interviewing as well, so formal testing is a useful interviewing method.
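
The 'errors cancel out' claim is the wisdom-of-crowds effect, which holds when individual errors are independent and unbiased; the simulation below assumes exactly that (both assumptions are my framing, not the book's wording):

```python
import random

random.seed(3)

# 100 independent, unbiased estimates of a true value of 50.
true_value = 50
estimates = [true_value + random.gauss(0, 10) for _ in range(100)]

mean_individual_error = sum(abs(e - true_value) for e in estimates) / len(estimates)
group_error = abs(sum(estimates) / len(estimates) - true_value)

print(mean_individual_error)  # ~8 on average
print(group_error)            # ~1 -- the averaged judgment is far closer
```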

(29) The perceived marginal benefit of money tends to decrease as we have more money.
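
Diminishing marginal utility is often modeled with a concave utility function; logarithmic utility is a standard textbook choice, used here purely as an illustration rather than anything the book commits to:

```python
import math

# Under log utility, the same extra $1,000 is worth less and less
# as wealth grows.
for wealth in (1_000, 10_000, 100_000, 1_000_000):
    gain = math.log(wealth + 1_000) - math.log(wealth)
    print(f"wealth ${wealth:>9,}: utility gain {gain:.4f}")
```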

(30) We tend to not accurately predict how we will feel emotionally in situations we haven't previously experienced (eg, marriage, career change, relocation, etc.).

(31) Irrationality can be reduced by working hard to make a habit of rationality.

Product details

  • Format: Audible Audiobook
  • Listening Length: 10 hours and 32 minutes
  • Program Type: Audiobook
  • Version: Unabridged
  • Publisher: Audible Studios
  • Audible.com Release Date: December 17, 2009
  • Whispersync for Voice: Ready
  • Language: English
  • ASIN: B00316QK90


Reviews


Most of the examples we all know about. Many of our decisions make sense until we experience the results. Let's try to learn from our past mistakes before we rush into our current decisions!
We are more irrational than I thought!
I just recently purchased this book; it was written some time ago (early 1990s) and doesn't have the objectivity one might expect from newer works. I like a science-oriented book to cite the results of a test, study, or experiment, and let those results suggest something. I'm happy to have the author expound on the result. "Irrationality" often had statements of fact that weren't clearly supported, again, probably just reflecting the time it was written in. Readers expect a bit more nowadays.
Provocative and interesting reading
The book covers most areas of irrational decision making in a way that makes it easy to follow and transfer to one's own setting. Widely useful.
This book uncovers some huge misconceptions that the general public has. It presents many studies and experiments that illustrate them. It uncovers situations in which irrationality thrives. I highly recommend this book to all.
While the book has some interesting insights into the irrational decisions people make, very little is new information. Chances are that you have read most of this information in other books or articles.

On the other hand, his last few chapters go into rating decisions on the basis of a numerical system (say 1 to 5). But in most cases the arguments for or against a decision have no known outcome. For example, if you think you will live to be 90, you would give the decision to have back surgery a high rating (rather than live with the pain). On the other hand, if you expect to live to only 85 at most, you would give the choice for surgery a lower rating. All you need is a memo from God to tell you when you will die.

I would not discourage you from reading the book. I had fun taking the "pieces apart." But chances are that you could spend your time more productively.