Introduction
The previous chapter on arguments focused on how differences in systems of belief can give rise to arguments. People with disparate systems of belief hold differing values and beliefs, which in turn influence what they consider to be basic assumptions (to be used in an argument as premises). Reasoning from these different sets of basic assumptions often yields conflicting conclusions about what is the ethically and politically 'right' thing to do, both in general and in specific cases.
It should also be mentioned that sometimes the difference isn't that the values themselves differ in an absolute sense, but that they are held to different degrees. For example, much research in social psychology has shown that conservatives favour attributing moral status and providing resources to "in group" members, while liberals concern themselves more with "out groups" than conservatives do. This is not to say that conservatives don't care about "out groups" or that liberals don't care about "in groups"; rather, it is a matter of relative value.
For more information on the psychological differences between conservatives, liberals, and libertarians, check out this great website: http://www.moralfoundations.org/index.php?t=home
So, why does this all matter to us as critical thinkers? There are a host of reasons, but here are two important ones: The first is that understanding the role of systems of belief in an argument can help make us aware of biases in the premises (both in our opponent's argument and in our own). The second is that understanding an opponent's bias can give us hints as to how we might sway the opponent to our own point of view.
Mommy? What's a Bias?
A bias is an "inclination or prejudice for or against" some fact or point of view. In arguments, this means that we are prone to giving undue favour or neglect to some fact or point of view. Everybody does this (except me, of course); it's part of being a human being. As the physicist Richard Feynman put it, "the first principle [of critical thinking] is that you must not fool yourself--and you are the easiest person to fool"!
There is a wealth of evidence in the psychological "litra-cha" demonstrating that we begin with our position first, then collect or reject evidence and reasons to support that pre-existing position. Our pre-existing position is usually grounded in emotion and preferences rather than in "Reason."
The more emotional our investment in an issue, the greater the likelihood that some kind of bias has crept into our supporting arguments--either by attributing undue strength to a supporting assertion or by overlooking or dismissing contrary reasons and evidence. To quote the philosopher David Hume, reason "is, and ought only to be the slave of the passions."
Biases: Too Illegit to Quit?
We've established that people (except me) have biases. Now what? Do we automatically reject everybody's arguments 'cuz they're biased? Nope.
We can make a distinction between legitimate and illegitimate biases. The distinction depends mostly on how opposing reasons, evidence, and arguments are portrayed, and on whether there are any intentional omissions of important material. As you might have guessed, an illegitimate bias is one in which the arguer poorly or dishonestly represents those elements, or in which the bias leads to weak logical connections between the premises and the conclusion. Any website or blog with a strong political bias in either direction will usually provide excellent samples of arguments with illegitimate biases.
A legitimate bias is simply favouring a point of view, but not in a way that misrepresents the opposing position; it still allows an impartial observer to fairly evaluate the proposed point of view. For example, I think everyone should be allowed to own an assault rifle-bow that fires swords for self-defense. That's my point of view.
My argument is that assault rifle-bows are not prohibited by the Constitution; therefore, they should be legal.
My opponents reply that the 2nd Amendment isn't about arms for personal self-defense but about a well-regulated militia that should be controlled by the Gov't. They might also argue that just because a small group of people a few hundred years ago voted on something, that doesn't mean we need to accept it now. Societies and circumstances change, and the best laws reflect that.
Notice that even though I'm biased toward people owning assault rifle-bows that fire swords, I don't distort the opposing arguments.
Vested Interests
An arguer has a vested interest when they (or someone paying them) stand to benefit from having their point of view accepted. When vested interests are involved, there's a very high likelihood of illegitimate bias.
For example, when certain industries spend millions of dollars to pay lobbyists and "donate" to politicians, we can be fairly certain that their arguments for special treatment or exemption contain illegitimate biases.
Not all vested interests need be financial. One might be motivated by the desire for power, fame, revenge, attention, sex, and so on, or by the desire to get out of trouble or prove one's innocence.
We should be cautious about dismissing arguments out of hand just because the arguer has a vested interest in the outcome. That someone has a vested interest tells us nothing about the argument's validity, which should be evaluated independently. When there is a vested interest, it simply means we should be extra cautious about illegitimate biases (and omissions); it doesn't automatically follow that the argument is invalid.
Conflict of Interest
A conflict of interest is a vested interest on steroids; i.e., a case where the vested interest is extreme. In such cases there is usually an ethical issue involved too, and in professional settings, conflicts of interest have to be disclosed.
For example, in medical research, if a university study of a drug is funded by the company that produces the drug, this is a conflict of interest for the researchers, for obvious reasons. It must be disclosed at the beginning of any research that is published. This is actually quite a big problem in medical research, because drug studies funded by the drug's producer systematically report more positive results than studies of the same drug conducted by a neutral party. For more info check out the link:
http://blogs.scientificamerican.com/guest-blog/2012/09/23/can-the-source-of-funding-for-medical-research-affect-the-results/
But bias in medicine (and elsewhere) isn't only on the "proponent's" side. Often people who oppose something for ideological reasons are just as guilty of bias.
An important recent example of an undisclosed conflict of interest in medicine is Andrew Wakefield's anti-vaccine research article in The Lancet. What he did not disclose was that he had been paid hundreds of thousands of pounds by lawyers seeking to sue vaccine manufacturers, and that he had filed a patent for an alternative to the conventional vaccine.
The conflict of interest was clear: he stood to gain a great deal if his research showed that conventional vaccines were unsafe while he had a stake in an alternative.
In the end, his results were never replicated, his methods were shown to be unethical, his data came from a sample far too small to support his conclusions (just 12 children), and the article was retracted by the publisher. However, because of the fear generated by his "research," there was, and continues to be, tremendous damage to public health.
http://debunking-andrew-wakefield.tumblr.com/
Summary:
We all have biases. What matters is the degree to which they distort the presentation of evidence and reasons in arguments both for and against the arguer's position. Biases are illegitimate when they cause distortion such that arguments cannot be fairly evaluated.
For some excellent examples of how biases affect how we interpret the world, this is a beautiful article.