Sunday, May 18, 2014

Critical Thinking Short Cut

Introduction
Critical thinking sounds fancy, but it's something most of us do every day.  In fact, most of us are quite good at it...so long as we are properly motivated.  Unfortunately, we absolutely suck at it when the motivation is wrong, and I'm not talking about money.

What do I mean by all this talk of correct and incorrect motivation?  What I mean is that, when someone argues against our own cherished positions and beliefs, most of us are very good at pointing out the problems with our opponent's arguments.  We are primarily motivated to be 'right' and to protect our most cherished beliefs from criticism, and so, in such cases, our critical thinking skills are generally quite good.

We are absolutely horrible critical thinkers when an argument or evidence favors our existing cherished beliefs. Why? Because, as I said before, we are primarily motivated to be right and to protect our beliefs, we will usually uncritically accept any argument or evidence that supports our position.  Any argument or evidence that serves our purpose is, ipso facto, good.  Consider: when was the last time you were critical of an argument or evidence that supported your own position on an issue?

What's the moral of the story here?  Most of us already have an intuitive grasp of critical thinking; what messes it up is our motivation to be right and to protect pre-existing conclusions.  So, if we want to be good critical thinkers, we need to manipulate our motives, or at the very least be aware of their capacity for distortion.

What is Critical Thinking?
To understand what good critical thinking is, it helps to contrast it with poor critical thinking.  The way most people reason is to look at the conclusion of an argument, or the conclusion some evidence implies, and assess whether that conclusion agrees with their pre-existing beliefs and positions.  If the argument/evidence agrees with or supports their pre-existing position, then the argument/evidence is considered good.  If it contradicts their position, then the argument/evidence is considered defective.  To summarize the problem: most people focus on whether they agree or disagree with a conclusion rather than on the quality of the argument/evidence.  This is not good critical thinking.

In critical thinking we don't care two hoots whether we agree or disagree with the conclusion: all we are interested in is whether the argument or the evidence is good support for the conclusion.  Critical thinking is mainly about two things: (a) standards of evidence (i.e., what constitutes good evidence?) and (b) the logical relationship and relevance of an argument's premises to its conclusion (i.e., does the conclusion follow from the premises?).  That's all.  The end.  Good night.  (For convenience, I'll refer to both of these aspects as "quality of evidence/arguments".)

The Secret that Professors Don't Want You to Know:
Good critical thinking is all about focusing on the quality of the arguments/evidence relative to conclusions, but unfortunately our brains are hardwired to look at conclusions relative to our pre-existing beliefs. Because we should really be focusing on the quality of arguments/evidence, we need a trick to overcome the tendency to focus on the conclusion.

The Ol' Switcheroo Version 1:  (a) If an argument or evidence supports your position, ask yourself if you'd find the argument/evidence compelling if the same quality of evidence/justification supported the opposite conclusion.  (b) If an argument/evidence is against your current position, ask yourself if you'd find the quality of evidence/justification compelling if it supported your position.


For example (a):  Suppose you think vaccines cause autism, and to support your conclusion you cite the fact that your nephew has autism and was vaccinated; therefore, vaccines cause autism.  To apply critical thinking special secret #1, we construct a similar argument but for the opposite conclusion: e.g., I have a nephew who was vaccinated and isn't autistic; therefore, vaccines don't cause autism.

If your original position was that vaccines cause autism, would this second argument cause you to change your position?  Nope, I doubt it would, and for good reason: a single case doesn't tell us anything about causal relations.  Notice that applying secret thinking sauce #1 allows us to focus on the quality of the evidence rather than on whether we like the conclusion. So, if the second argument fails as good support for its conclusion, so does the first, even though it supports your position.  Boom! goes the dynamite.
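If you want to see just how little a single case can show, here is a minimal sketch in Python.  The vaccination and autism rates in it are made-up round numbers chosen purely for illustration: it simulates a population in which vaccination has no effect whatsoever on autism, then counts how many people fit each kind of "nephew" anecdote.

import random

random.seed(0)
vaccination_rate = 0.95   # assumed for illustration: almost everyone is vaccinated
autism_rate = 0.02        # assumed baseline rate, drawn independently of vaccination

population = 100_000
vaccinated_autistic = 0
vaccinated_not_autistic = 0

for _ in range(population):
    vaccinated = random.random() < vaccination_rate
    autistic = random.random() < autism_rate   # note: no causal link to vaccination at all
    if vaccinated and autistic:
        vaccinated_autistic += 1
    elif vaccinated:
        vaccinated_not_autistic += 1

print(vaccinated_autistic, "vaccinated people with autism")
print(vaccinated_not_autistic, "vaccinated people without autism")

Both kinds of anecdote show up in droves even though vaccination plays no causal role in this toy model, which is why a single nephew, in either direction, carries no evidential weight.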

Let's try another example (b):  Suppose you are an anthropogenic climate change denier.  Someone argues against your cherished beliefs by pointing out that 97% of climate scientists agree that human activity is responsible for climate change.  Your natural reaction is to discount this as an insignificant argument because it contradicts your pre-existing position.  Now apply critical thinking secret sauce #1 and ask yourself: if 97% of climate scientists denied that human activity has any effect on the climate, would you consider that good support for your position?  If you would, then consistency demands that you treat the actual consensus as equally good support for the opposing position.

Let's try a moral example:  In the "homosexuality is bad vs. homosexuality isn't bad" debate, both sides often appeal to what is or isn't natural behavior as justification for their position.  Let's apply the critical thinking secret sauce to both sides to show why both justifications are weak: 

"Homosexuality is morally wrong because it's unnatural."  The justification here is that moral wrongness is a function of whether something is unnatural.    Now, applying the ol' switcheroo, we ask the person who takes this position: Supposing homosexuality were natural, would you then agree that homosexuality is morally permissible? They will likely answer, "no" thereby indicating that naturalness is a poor justification for moral permissibility. 

But it isn't just evangelical moralists who use poor justifications for their claims.  Let's apply the same test to those who argue that homosexuality is morally permissible because it is natural for a certain percentage of the population to be gay (usually some sort of genetic argument is given).  Let's try applying the ol' switcheroo:  

Suppose scientists discover that there is no "gay gene" and that homosexual behavior is purely a matter of some combination of socialization and personal choice.  If this were the case, would proponents of the argument then say, "welp, I guess homosexuality is morally wrong after all"?  Probably not.  And the reason is that whether a behavior is natural or not tells us nothing about that behavior's moral status.  

Whatever one's opinion on the moral status of homosexuality, the ol' switcheroo shows us that neither position can be supported through appeals to "naturalness".  That is, the quality of that particular justification is weak regardless of which conclusion we are sympathetic to.

The Ol' Switcheroo Version 2:  Sometimes issues are such that the simple switcheroo won't do much to focus our minds on the quality of arguments/evidence, so we need a variation to deal with those situations.  Here it is: (a*) If an argument/evidence supports your pre-existing position, ask yourself if a similar argument or evidence would convince you on a different issue, one where you take the opposing side.  (b*) If an argument/evidence is against your cherished beliefs, ask yourself if a similar argument would convince you on an issue where you are a proponent.  

Basically, in this version we generalize the principle being used to justify a conclusion and then apply it to other cases to see whether the principle is applied consistently, or whether (as is often the case) it is invoked when it supports a conclusion we like and denied when it supports a conclusion we dislike.

Example (a*):  Suppose you think homeopathy works and that you are generally skeptical of conventional medicine.  To support homeopathy you cite a particular scientific study showing that 70% of subjects no longer had condition X after homeopathic treatment.  The study has a sample size of 10 and there's no control group.  Ask yourself: would such a study convince you that a new conventional medication was effective for condition X?

Of course not.  A sample size of ten is way too small to conclude anything of consequence, and the lack of a control group makes a study, especially of this size, essentially worthless.  If the evidence in the second case wouldn't be good support for the conclusion, then the same applies to the first case.  Critical thinking secret v.2 allows you to see why the evidence you've provided isn't good.
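To make the "essentially worthless" point concrete, here is a minimal sketch in Python.  It assumes, purely for illustration, that condition X clears up on its own 60% of the time (a made-up figure), and then asks how often a 10-person group that receives no treatment at all would still show "70% recovered".

import random

random.seed(0)
spontaneous_recovery_rate = 0.60   # assumed for illustration: condition X often clears up on its own
simulated_studies = 10_000
studies_hitting_70_percent = 0

for _ in range(simulated_studies):
    # Ten patients who receive no treatment whatsoever.
    recovered = sum(random.random() < spontaneous_recovery_rate for _ in range(10))
    if recovered >= 7:
        studies_hitting_70_percent += 1

print(f"{studies_hitting_70_percent / simulated_studies:.0%} of untreated 10-person groups show 70%+ recovery")

Under that assumed recovery rate, well over a third of purely untreated groups hit the 70% mark by chance alone.  That is exactly the information a control group would have supplied, and exactly what an uncontrolled study of ten people cannot rule out.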

Example (b*):  Suppose again that your pre-existing position is that global climate change is not caused by human activity.  Someone points out that 97% of climate scientists think the opposite: that global climate change is attributable to human activity.  Now apply critical thinking secret sauce v.2: pick an issue where you have a pro position, or even one where you don't have a position; suppose it's the claim that unrestricted gun ownership is consistent with the 2nd amendment.  Ask yourself: if 97% of all constitutional experts agreed that unrestricted gun ownership is consistent with the 2nd amendment, would you consider this a good reason in favor of your position?  If yes, then you have to allow that it's a good reason in the first case too.
