
April 25, 2011

Just how irrational are we? Very?

Very, or at least potentially very, if we define "irrational" and "rational" in terms of the great project of Descartes and his followers, it seems.

In a blog post at Discover, in follow-up to his column last week at Mother Jones, Chris Mooney notes that the journal Behavioral and Brain Sciences has devoted an entire issue to what he covered at Mojo, with links to summaries of key content.

Here are a couple of key excerpts:

First:
Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing, but also when they are reasoning proactively from the perspective of having to defend their opinions.
And more, from a response to some of the issues:
When people reason alone, there will often be nothing to hold their confirmation bias in check. This might lead to distortions of their beliefs. As mentioned above, this is very much the case. When people reason alone, they are prone to all sorts of biases.

In short, as Mooney notes, classical Cartesianism appears more and more dead in the water. First, Dan Dennett (and others) said there is no little man, no Cartesian homunculus, making magic rationality decisions inside us.

Now, BBS et al say that, even if there were such a critter, he wouldn't be a disinterested rationalist anyway.

(The argumentative theory of reasoning is explained more here.)

But, not all commenters on Mooney's post want to accept that, it seems.

I responded to one:
Nullius, (you seem to present) a great defense of the “traditional” view of reasoning or whatever …

BUT, I’m going to argue with you.

First, the “reasoning as argumentation” model I think explicitly says this is NOT, NOT, NOT, a “human failing.” Rather, it is, if I may, “human ISness.”

I won’t propose abandoning “rationalism,” but I will say that it is even more unnatural than you may want to admit.

And, that IS a conflict with Cartesianism, which postulates that rationality is a cornerstone of Homo sapiens.

Sorry, but, either you don’t get the degree of implications this involves, or …
You DO, unconsciously, understand precisely what is up and by your conscious argumentation, actually support the fact at hand.
Of course, maybe I have reasons for my argumentation. And, I do.

One is to get people to accept that a Cartesian, or Platonic, idea of humans as homo rationalis simply doesn't exist. Not even in the most notable of today's skeptics. Witness Lawrence Krauss defending his billionaire hedge fund buddy.

That said, should we stop trying to be more rational? No. But, we should recognize that even apparent growth in rationality may have ulterior motives.

Since then, Nullius has responded to me, and I offered some thoughts in return:
Since, contra certain Pop Ev Psychers, we evolved various mental skills over different environments and different times in the past, I would say it can't be called a "failing" today either. We didn't evolve *for* any particular time in the future, just to better adapt to the time at hand.

So, in light of human "reasoning" and today's issues, I don't consider the relative lack of rationality a "failure," primarily because I think the concept is inapplicable. So, to that, per Doug Hofstadter, I apply the Zen Buddhist "mu."

Next ...

As you note, rationalism is a skill. And, per your note on Descartes, the question is, how easy to learn, or difficult to learn, is this skill? Is it like learning to throw a bowling ball at a set of pins, or to hit a major-league curveball? More like high-school algebra or advanced differential equations? I opt for the latter. (Of course, it may be some point in between.) Where you, and others, fall on how you understand this relates in part to the "failure" above and to broader issues about how much we should expect from rationality.

One implication, per the link on the argumentative view of reasoning, is that when two groups each come to a theoretically well-reasoned decision within their own group, we have to ask how likely it is that "metarationality" can be applied between them.

Examples would be global warming and vaccination. In both cases (largely for mercenary reasons in the first, but largely for sincere reasons in the second) "doubters" have focused on the uncertainty issue, while to some semi-deliberate degree, at least, the scientific side has downplayed the issue.

Some examples are already at hand. Kahneman et al. in behavioral economics have *hugely* debunked the idea of man as a rational actor there. There are many implications, but basically none of them have made it to the level of changing political policy.

Reasoning within a group, vs. alone, vs. "to" another group has implications for sociology and out-groups, and for how much, or how little, we can expect people's behavior, and even more their attitudes, to change.

I'm sure the dialogue will continue.

Meanwhile, this SciAm blog explains some of the reasons for our irrationality, in terms of motivators.

And Stanley Fish's column is even partially relevant:
(C)hanges of mind tend to be local and piecemeal, not systemic. Wholesale conversions like Paul’s on the road to Damascus do occur, but more often a change will affect only a small corner of one’s conceptual universe.
So, even if rationality spreads, it won't grow by leaps and bounds.

1 comment:

SteveA said...

Very interesting thoughts in your blog and this post. I've always wondered about how we change our mind. I come from a very rationalist Christian heritage where in the first half of the 20th century debates were prominent and decisive. These debates "proved" that having mechanical instruments of music in church worship is not authorised for Christian worship, for instance. There have been changes in our group but the change has from my perspective been glacial. Will check out Fish's column. Must get ready for work.