Wednesday, February 5, 2014

Lecture 3B: Burden of Proof, Conditions of Premise Acceptability/Unacceptability/Questionability

HW 3B & Review
1.  Show some of the best. (Didn't have time to post all)
2.  Some people didn't quite get it.  (Read the instructions, bad argument vs the specific fallacies, completing the explanation).
3.  Bonus. (Nicole Cobeo)
4.  Quick review of key concepts.

Examples of Common Beliefs

Burden of Proof
1. BoP and Conclusions:  Starting point for who has to provide evidence and how much.
2.  Context and BoP: Historical, Cultural, and Audience.
Women's Education Group
(Is Borat right? Average brain sizes? Race)
3.  BoP and Arguments from Authority
4.  Proving a Negative/Arguments from Ignorance/Inappropriate burdens of proof.

5.  Null hypothesis and Gumballs

BoP and Premise Acceptability



Premise Acceptability
1. General Heuristic:  Acceptable vs Unacceptable
Step 1: Would the audience accept the claim without further support?
Step 2:  Are the claims reasonable?
If the argument is aimed at a universal audience, then satisfying (1) also satisfies (2).
Step 3: Move up the layers in the argument applying (1) and (2).
Unacceptable:  If at any stage the answer to either (1) or (2) is "no", then further support is needed (see the toy sketch below).

Questionable:  The wording is vague OR there isn't enough information to decide either way OR you don't have the background knowledge to decide either way OR any combination.
When you evaluate a premise as "questionable," you must support this assessment by citing one of the aforementioned reasons. Also, when you evaluate a premise as questionable, you must employ the principle of charity and ask, "what evidence would have to be provided for this premise to be acceptable?"  If that evidence would be accepted by a reasonable audience, then you may give the premise the benefit of the doubt (for homework and tests, you show this process in writing).
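
To make the three-step heuristic above concrete, here is a toy sketch in Python (entirely illustrative; the class, names, and truth values are all invented, not part of the textbook's method).  It treats an argument as a tree of claims and applies questions (1) and (2) at each layer:

    # Toy model of the acceptability heuristic (all names/values hypothetical).
    class Claim:
        def __init__(self, text, audience_accepts, reasonable, support=None):
            self.text = text
            self.audience_accepts = audience_accepts  # question (1)
            self.reasonable = reasonable              # question (2)
            self.support = support or []              # sub-argument, if any

    def acceptable(claim):
        # Steps 1 and 2: accept if the audience would accept it and it is reasonable.
        if claim.audience_accepts and claim.reasonable:
            return True
        # Step 3: otherwise apply (1) and (2) to the next layer, the sub-argument.
        return bool(claim.support) and all(acceptable(c) for c in claim.support)

    sub = Claim("Defended sub-premise", audience_accepts=True, reasonable=True)
    main = Claim("Contested premise", audience_accepts=False, reasonable=True,
                 support=[sub])
    print(acceptable(main))  # True: the sub-argument supplies the needed support

Real premise evaluation is a judgment call, of course; the sketch only captures the recursive structure of steps (1)-(3).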

Exercises: Do 8A 1, 2, 3.

Conditions of Premise Acceptability: The Nitty-Gritty
1.  Acceptable by Definition or Self-Evidentially Acceptable.
  E.g., bachelor, triangle.
  E.g.  Law of non-contradiction, law of the excluded middle, disjunctive syllogism
2.  Acceptable as a Factual Statement Reporting an Observation or as a Statement of Eye-Witness Testimony. 
  E.g., "It's cold outside"
  E.g., "Yesterday, I ate eggs for dinner."
  Issue: Grant benefit of the doubt unless the source is known to be unreliable.
3.  Acceptable by Common Knowledge or Assent
  Issue:  Distinguish between descriptive and normative claims/judgments.
  E.g., The ACA was drafted by the Obama administration.
  E.g., The ACA is a horrible/excellent piece of legislation drafted by the Obama administration.
  Issue: Factual claims directed at expert audiences vs universal audiences.  A claim may be acceptable without further support for an expert audience, yet require support when directed at a universal audience.
  Issue:  Actual vs. expected knowledge of the audience.  Availability of knowledge.
4.  Acceptable Because it Is Defended in a Reasonable Sub-Argument
  E.g., Aquinas' cosmological argument; More sea ice; Death penalty p. 203
5. Acceptable on the Authority of the Arguer or an Expert
  E.g., if the arguer is an authority/expert on a topic.
  E.g., if the arguer cites an expert on a topic.

Exercises:  Do 8B a, b, c

Conditions of Unacceptability (besides failing general rules 1 or 2)
1.  Internal Inconsistency
(a) E.g., Donald Trump on climate change.
       
(b) E.g., (P1)  Only claims that can be verified by 3rd parties can be trusted.
          (P2)  Hundreds of people have claimed to have been abducted by aliens.
          (C)   Therefore, alien abductions are really happening.

 (c) E.g., The US Government is one of the most incompetent institutions anywhere on earth--especially under G. W. Bush.  9-11 was a top-secret government co-ordinated false flag operation to take away our civil liberties.

(d) E.g., From facebook:
Post: After 8 years of trying to convince my father to stop taking his statins (cholesterol lowering medications he was prescribed by his GP), he's finally done his own investigation, and, after examining the evidence, has decided to stop taking them. 
http://drhyman.com/blog/2014/01/06/stop-statins/
Commenter: I wish I could convince my dad that dr are people too an not experts on everything!!
Poster: Brenda-- if he's on statins, please show him this amazing documentary that interviews experts in the field that actually go over the data

http://www.newyorker.com/online/blogs/borowitzreport/2014/05/gop-bombshell-an-evil-mastermind-behind-most-elaborate-cover-up-in-us-history.html

Warning!  Don't confuse internal inconsistency with tu quoque.

2.  Begging the Question
(a) E.g.,
(P1)  Alice says she is honest.
(P2)  If an honest person says something, it must be true.
(C)   Therefore Alice is an honest person, because an honest person says so.

(b) E.g.,  p. 206 Good Reasoning Matters
How do we know the bible is the right criterion of truth?  All through the scriptures are found...expressions such as "Thus says the Lord," "The Lord said," and "God spoke."  Statements like "Thus says the Lord" occur no less than 1,904 times in the 39 books of the Old Testament.

(c) E.g.,  (From Russell's Problems of Philosophy)
  (P1)   Other people besides myself report seeing a table.
  (C)  Therefore, the table exists as a physical object, not just an idea in my mind.
  (MC)  Therefore, there are physical objects and a mind-independent reality.

 (d) "It's morally permissible to eat animals because we are humans and they are animals."

(e)  "Gay marriage is wrong because it goes against tradition."

3.  Problems with Language (Vagueness) 
   E.g., "I've never been really sick."
   E.g., "I don't drink regularly."
   E.g., "I've never had any major problems with my vehicle."
   E.g., "Prostitution is a violation of human dignity."

Exercises: 8C a, b, c

How to Skeptic
John Oliver on Miss America and Scholarships
1.  Fact-check every major premise and, for politically controversial facts, be sure to check several sources.  I suggest beginning with Wikipedia for scientific and conspiracy claims or any topic where there will be expert opinion.  You may use blogs, but be judicious: thoroughly check the credentials of the blogger (see: Arguments from Authority).

2.  Be cautious of partisan websites posing as impartial.  Always try to find out where they get their funding.  You may use partisan websites, but do not accept any statements that aren't supported by a citation.  Be alert for the fallacy of confirming instances.  Cut and paste the source of the data (i.e., any study that is cited to support a premise/evidence) into your browser and at least read the abstract to check for slanting by omission.

3.  For "too-good-to-be-true" health and/or conspiracy claims, type in the claim followed by "debunk."  Be careful because some snakeoil websites are figuring out this strategy and are including "debunk" in their search terms for the article.  It's a war zone out there!  Be sure to secure your tinfoil hat to your head!

Study: Experts' vs. Online Consumers' Evaluations of Website Credibility

Examples
1. E.g., Cantor, Feb. 4: The CBO’s latest report confirms what Republicans have been saying for years now.
Under Obamacare, millions of hardworking Americans will lose their jobs and those who keep them will see their hours and wages reduced.
Boehner tweeted, "Pres. Obama’s #hcr law expected to destroy 2.3 million jobs."
2. E.g. Obama:
"What is and isn't a Schedule I narcotic is a job for Congress."
http://www.politifact.com/truth-o-meter/statements/2014/feb/04/barack-obama/barack-obama-says-its-congress-change-how-feds-cla/

3.  Natural News vs Mercola vs Sciencebasedmedicine

Homework 3B for CSN and UNLV Class
From the text book do
Ex 8A 1-6
Ex 8B a-g

Monday, February 3, 2014

Lecture 3A: Cognitive Biases, Fallacy of Confirming Instances, Falsificationism, Slanting by Distortion and Omission

Homework 2B & Announcements
A.  Questions?
B.  Are there any relevant distinctions between funding sources and policy in regards to public benefit?
C.  Student work & recognition
D.  Challenging your beliefs & arguments

Game

I. Cognitive Biases: Confirmation Bias, Negativity Bias, Availability Bias

Comic

A. Confirmation Bias: Only remembering/counting the hits and forgetting/ignoring the misses.
B. Negativity Bias: Tendency to over-emphasize negative results/events/data.  We tend to remember negative events better than positive events. (Except in the elderly)
C. Availability Bias:  Tendency to over-emphasize events/data that are recent or come easily to mind.
D.  Effects of Bias on Reasoning:
Chris Mooney on Political Bias and Scientific Evidence
Political Bias and Numeracy
Effect of Bias on Reasoning
Backfire Effect
Bias in "Science" on social media (Chocolate study)
Effect of biases on perception

Can you spot the difference?




E.  Examples

Trump on Global Warming
SIDS and Vaccines
Argument From Design/Teleological Argument

Influenza B Vaccine and Diabetes

II. Fallacy of Confirming Instances

Relevant Stats

More Examples
Green Sweater Story
The catastrophe survivor
Challenge: What are some examples of the fallacy of confirming instances (both sides) in the gun control debate?  The Obamacare debate?
Atheists on Religion
Obamacare Anecdotes and Arguments
E.g.  Gamblers on their own gambling; How the media reports gambling stories (focus on winners vs. losers).

III. Falsificationism
A.  If X then Y will happen.
Y happened; therefore X is true. (Valid or invalid?)
Compare the falsifying form: If X then Y; not-Y; therefore not-X (modus tollens). (See the truth-table sketch after this list.)
B.  Explanatory vs Predictive Power
Mercury and Autism
C.  Relationship to chance.
D.
E.  "A woman explained to other members of the audience that they don’t need a blood test to see if they’re affected. “There’s a simple test you can take with hydrogen peroxide and red wine in your mouth. And when you spit it out, you will see all those nanoparticles collect in the bowl. Every single one of you are affected. Every single one.”[1:33:45]"

IV. Slanting by Omission
A.  Omission


http://skepdic.com/fullmoon.html
Full Moon and Emergency Room Visits


Antarctica Is Gaining Ice

File Drawer Effect

Military spending cuts: Almost 500 billion over 10 years!
British Empire
Context
Context w/Allies

B. Comparisons
  1. Averages and Distribution
Average (mean) household income: $69,000 vs. median: $50,000 (see the sketch after this list)


  2.  Apples to Oranges
Movie ticket in 1960 vs Now
male-female wage gap

3. Several types of misleading comparisons:
Female hurricane names and "deadliness"

4. Difficult cases: http://www.nybooks.com/articles/archives/2014/nov/20/why-innocent-people-plead-guilty/
How prevalent is the phenomenon of innocent people pleading guilty? The few criminologists who have thus far investigated the phenomenon estimate that the overall rate for convicted felons as a whole is between 2 percent and 8 percent. The size of that range suggests the imperfection of the data; but let us suppose that it is even lower, say, no more than 1 percent. When you recall that, of the 2.2 million Americans in prison, over 2 million are there because of plea bargains, we are then talking about an estimated 20,000 persons, or more, who are in prison for crimes to which they pleaded guilty but did not in fact commit.
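
Returning to item 1 above, here is a minimal Python sketch of why mean and median diverge in a skewed distribution (all incomes are invented for illustration):

    # Mean vs. median in a right-skewed distribution (figures are invented).
    from statistics import mean, median

    incomes = [20_000, 30_000, 35_000, 40_000, 45_000,
               50_000, 55_000, 60_000, 65_000, 500_000]

    print(mean(incomes))    # 90000  -- dragged up by the single outlier
    print(median(incomes))  # 47500.0 -- closer to the "typical" household

Citing the mean alone slants by omission whenever the distribution has a long tail, as household income does.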

Misleading Comparisons Using Graphs
https://www.boundless.com/statistics/frequency-distributions/frequency-distributions-for-qualitative-data/misleading-graphs/
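
A minimal sketch of one common trick, the truncated y-axis (values invented; assumes matplotlib is installed):

    # The same two bars drawn honestly and with a truncated axis.
    import matplotlib.pyplot as plt

    labels = ["Group A", "Group B"]
    values = [49.5, 50.5]            # about a 2% difference

    fig, (ax1, ax2) = plt.subplots(1, 2)
    ax1.bar(labels, values)          # honest: axis starts at 0
    ax1.set_title("Axis from 0")
    ax2.bar(labels, values)
    ax2.set_ylim(49, 51)             # truncated: the same data look dramatic
    ax2.set_title("Truncated axis")
    plt.show()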

comparison of drugs in terms of harms
Actual study


Homework 3A
Meme:  Go to http://memegenerator.net/
A) 1.  Pick an issue.  2.  Design a meme for each side of the issue that commits the fallacy of confirming instances to deliberately mislead.  3.  Provide the additional information to correct/counter/give context to the point the argument in the meme is making.
B) 1.  Pick an Issue.  2.  Design a meme that slants by omission.  3.  Provide the additional information to correct/counter the point the argument in the meme is making.
C) I would like to post some of the best memes.  If you don't want your work posted on the blog, please let me know and I can post it anonymously or not at all.
D) Be sure to follow the instructions and create a meme that commits the specific fallacies--not just general poor arguments.
E) Read the following article http://www.vox.com/2014/12/22/7433899/debunk-how-to and answer the following questions (1) What is the background attitude one should have in order to avoid acquiring false beliefs? (2) What is the information deficit model and why doesn't it fully explain the backfire effect?

Bonus:
What critical thinking error was committed in HW 2B?  How would you redo the homework assignment to avoid committing this error?

Sunday, February 2, 2014

Lesson 3A: Confirmation Bias, Fallacy of Confirming Instances, Falsificationism, Slanting by Omission and Distortion

Introduction
In this section we are going to start learning how to detect BS. Let's move beyond the general notion of 'bias' and get more specific about biases and how they affect the strength and validity of arguments.  Recall that one way we can classify biases is according to how much skin the arguer has in the game; that is, the degree to which the arguer stands to gain from his audience accepting his position.  In this respect we can make 3 broad categories of bias: legitimate, illegitimate, and conflict of interest.  By now you should be able to say something about each type.  Moving on...

Cognitive Biases
Another way to classify bias in an argument is according to how the hard-wiring in our brains affects the way information is presented and interpreted.  A cognitive bias is when our brain's hard-wiring has an unconscious effect on our reasoning.  It is a current area of philosophical debate as to whether cognitive biases are on the whole beneficial or detrimental to our reasoning.  We'll set these concerns aside for this class and operate under the assumption that in many instances cognitive biases do negatively influence our capacity to reason well.

There are hundreds of cognitive biases, but the most common, and the one to which we can trace most errors in reasoning, is called confirmation bias.  Confirmation bias is when we only report the "hits" and ignore the "misses"; in other words, we only include information/evidence/reasons in our argument that support our position and we ignore information that disconfirms it.  Confirmation bias is often (but not always) unintentional and everyone does it to some degree (except me).

What?  You don't think you do?  Oh, I get it.  You're special.  Ok, smarty pants.  Here's a test.  Let's see how smart you are.  And don't forget you've already been given fair warning of what's going to happen.  The smart money says you will still fall into the trap.

Click on this link and do the test before you continue:
http://hosted.xamai.ca/confbias/index.php
.
.
.
.
.
I said do the test first!
.
.
.
.
.
.
Well?  Vas happened? I'm going to continue with the assumption that you committed the confirmation bias.  Hey, don't feel bad--we're hardwired for it.  Before we move forward and discuss how and why confirmation bias works, let me take you on a philosophical aside.

Aside on Falsificationism
I promised myself I wouldn't do this but it'd be helpful to bring in a little philosophy here.  Please meet my good friend Karl Popper (no relation to the inventor of the popular snack food known as Jalapeno Poppers).

Popper made a very important philosophical observation in regards to how we can test a hypothesis: he said we cannot prove a hypothesis' truth; we can only test for its falsity.  This is called falsificationism.  In other words, no amount of confirming evidence can demonstrate that a hypothesis is true, but a single disconfirming instance is enough to show that it is false.  We should focus on looking to falsify rather than to confirm.

In technical philosophy we refer to an instance of a falsification as a counter example.  A counter example is a case in which all the premises are true but the conclusion is false (more on this later).

For illustrative purposes let's apply this principle to the number-pattern test from the link.  You were given a series of numbers and asked to identify the principle that describes the pattern.  Suppose (unbeknownst to you) the ordering principle is any 3 numbers in ascending order.  How did you go about trying to discover the ordering principle?  You looked at the numbers and, like most people, thought it had something to do with even numbers evenly spaced.  You looked at the sample pattern and tried to make patterns that conformed to your hypothesis.

For instance, if the initial pattern was 2, 4, 6 you might have thought, "ah ha! the pattern is successive even numbers!"  So, you tested your hypothesis with 8, 10, 12.  The "game" replied: yes, this matches the pattern.  Now you have confirmation of your hypothesis that the pattern is successive even numbers.  Next, you want to further confirm your hypothesis so you guess 20, 22, 24.  Further confirmation again!  Wow! You are definitely right!  Now, you plug your hypothesis (successive even numbers) into the game, but it says you are wrong.  What?  But I just had 2 instances where my hypothesis was confirmed?!

Back to Confirmation Bias
Here's the dealy-yo.  You can confirm your hypothesis until the cows come home.  That is, there are infinitely many ways to confirm the hypothesis.   However, as Popper noted, what you need to do is to ask questions that will falsify possible hypotheses.  So, instead of testing number patterns that confirm what you think the pattern is, you should test number sequences that would prove your hypothesis to be false.  That is, instead of plugging in more instances of successive even numbers you should see how the game responds to different types of sequences like 3, 4, 5 or 12, 4, 78.  If these are accepted too, then you know your (initial) hypothesis is false.

Let's look at this from the point of view of counter-examples.  Is it possible that all our number strings {2, 4, 6}, {8, 10, 12}, {20, 22, 24} are true (i.e., conform to the actual principle--ascending order) but our conclusion is false (i.e., that the ordering principle is sequential even numbers)?  The answer is 'yes', so we have a counter-example.  In other words, it's possible for all the premises to be true (the number strings) yet for our conclusion to be false.

How do we know our premises can be true and the conclusion false?  Because our selected number strings are also consistent with the actual ordering principle (3 numbers in ascending order).  If this is the case (and it is), all of the premises are true and our conclusion (our hypothesis) is false.  We have a counter-example and should therefore reject (or in some cases further test) our hypothesis.

If you test sequences by trying to find counter-examples you can eventually arrive at the correct ordering principle, but if you only test sequences that further confirm your existing hypothesis, you can never encounter the evidence that would lead you to reject it.  If you never reject your incorrect hypothesis, you'll never get to the right one!  Ah!  It seems sooooooo simple when you have the answer!
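
Here is the whole game in a few lines of Python (a sketch: the hidden rule and the test sequences are the ones assumed in this example, not taken from the actual site):

    # The number-pattern game: a hidden rule vs. a too-specific hypothesis.
    def hidden_rule(seq):
        a, b, c = seq
        return a < b < c              # any 3 numbers in ascending order

    def my_hypothesis(seq):
        a, b, c = seq
        return a % 2 == 0 and b == a + 2 and c == b + 2  # successive evens

    # Confirmation strategy: only try sequences that fit my hypothesis.
    for seq in [(8, 10, 12), (20, 22, 24), (30, 32, 34)]:
        print(seq, "accepted?", hidden_rule(seq))   # always True -- no news

    # Falsification strategy: try sequences my hypothesis says should fail.
    for seq in [(3, 4, 5), (1, 9, 500)]:
        if hidden_rule(seq) != my_hypothesis(seq):
            print(seq, "is a counter-example: reject the hypothesis")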

Why do we care about all this as critical thinkers?
When most arguments are presented, they are presented with evidence.  However, (usually) the evidence that is presented is only confirming evidence.  But as we know from the number-pattern example, the evidence can support any number of hypotheses.  To identify the best hypothesis we need to try to disconfirm as many hypotheses as possible.  In other words, we need to look for evidence that could make our hypothesis false.  The hypothesis that stands up best to falsification attempts has the highest (provisional) likelihood of being true.

As critical thinkers, when we evaluate evidence, we should look to see not only if the arguer has made an effort to show why the evidence supports their hypothesis and not another, but also what attempt has been made to prove their own argument false.  We should also be aware of this confirmation bias in our own arguments.

Bonus Round:  Where do we often see confirmation bias?
Conspiracy theories and alt-med are rife with confirmation bias.  Evidence is only used that supports the hypothesis.  Alternative accounts of the results are not considered and there is often no attempt to falsify the pet hypothesis.

Confirmation Bias and the Scientific Method:
We'll discuss the scientific method in more detail later in the course but a couple of notes are relevant for now.  The scientific method endeavors to guard against confirmation bias (although, just as in any human enterprise, it sometimes creeps in).  There are specific procedures and protocols to minimize its effect.  Here are a few:
  • When a scientist (in a lab coat) publishes an article, it is made available to a community of peers for criticism.  (Peer review)
  • Double blinding
  • Control Group
  • Incentives for proving competing hypotheses and theories wrong (be famous!)
  • Use of statistical methods to evaluate correlation vs causation
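
As a small illustration of why such guards matter, here is a toy simulation (all numbers invented) of the "file drawer effect" mentioned earlier: when only striking results get reported, a non-existent effect looks real.

    # 1000 "studies" of a true effect of zero; only big results get "published".
    import random

    random.seed(0)
    published = []
    for _ in range(1000):
        result = random.gauss(0, 1)   # noise around a true effect of 0
        if result > 1.645:            # roughly a one-sided p < 0.05
            published.append(result)

    if published:
        print(len(published), "of 1000 studies published")
        print("average published effect:", sum(published) / len(published))
    # The published average is strongly positive even though the true effect is 0.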
Confirmation Bias 2:  Slanting by Omission and Distortion
Slanting by omission and distortion are 2 other species of confirmation bias.  Slanting by omission, as you might have guessed, is when important information is left out of an argument to create a favorable bias.

Perhaps a contemporary example can be found in the gun-rights debate.  We often hear something like "my right to bear arms is in the Constitution."  While this is true, the statement omits the first clause of the Second Amendment, which qualifies the second: the right to bear arms arises out of the historical need for national self-defense.  The Constitution is silent on the right to bear arms for personal security.  There is also the troublesome phrase "well-regulated".

Omitting these facts slants the bias in favor of an argument for an unregulated right to bear arms based on personal self-defense.  This may or may not be a desirable right to have, but it is an open question as to whether this right is constitutionally grounded.

Another example of slanting by omission might be the popular portrayal by the media of terrorists in the US as being of foreign origin.  Such an argument omits many contemporary acts of domestic terrorism perpetrated by white American males (for example, Ted Kaczynski, aka the Unabomber, and Timothy McVeigh).

Slanting by distortion is when opposing arguments/reasons/evidence are distorted in such a way as to make them seem weaker or less important than they actually are.  Think of slanting by distortion as something like white lies.

For example, famously, when Bill Clinton said "[he] did not have sexual relations with that woman," he was slanting by distortion in the way he deceptively used the term 'sexual relations'.

Summary
  • A common type of bias is confirmation bias in which only confirming evidence and reasons are cited, and falsifying evidence is ignored.  
  • A good way to test a hypothesis or argument is to ask whether it's possible for all the premises to be true and the conclusion to be false; that is, whether there are counter-examples.  Instead of emphasizing confirming evidence, a good argument also tries to show why counter-examples fail.  In other words, it shows why, if all the premises are true, we must also accept the particular conclusion rather than another one.  
  • As critical thinkers assessing other arguments, we should try to come up with counter examples.
  • Slanting by omission is when important information (relative to the conclusion) is left out of an argument.
  • Slanting by distortion is when opponents' arguments/evidence are unfairly trivialized.

Wednesday, January 29, 2014

Lecture 2B: Critical Thinking Manifesto, Bias, Detecting Illegitimate Biases, Legitimate and Illegitimate Arguments from Authority

Homework
1. What happened to Sidgwick and Ami's insights?
2. Discuss Moral Foundations results:  Expectations vs Actual.

Ami's Critical Thinking Manifesto
1. Thou shall ignore the conclusions of arguments and instead focus on evaluating the quality of support for a conclusion: thou shall align one's position with the strongest evidence and arguments.
2. Thou shall employ the principle of charity.  Thou shalt not commit the strawman fallacy.
3. Thou shalt not commit the fallacy of confirming instances; Thou shall evaluate the entire data set.
4. Thou shalt not concern one's self with absolute values (numerical, probabilistic, or moral); instead thou shall always concern one's self with relative values. This is particularly important when assessing risk.
5. Thou shalt not use anecdotes as evidence unless one's conclusion is that one is a poor reasoner.


Relative Risk: http://www.cdc.gov/vaccines/vac-gen/whatifstop.htm
Risk vs hazard based policy: https://risk-monger.com/2016/05/04/risk-based-or-hazard-based-regulation/
"Stickiness" of false beliefs and effect of biases: http://www.newyorker.com/science/maria-konnikova/i-dont-want-to-be-right

Introduction Biases Activity. Google and Cancer Causes/Cures.

Bias, Illegitimate Bias, Vested Interests, and Conflicts of Interest
A.  Definitions
Bias:  An inclination or prejudice for or against. (Examples of bias in social media)
Illegitimate Bias: A bias that interferes with one's judgment or reasoning.
Illegitimate Bias Vs. Legitimate Bias:
Vested Interest:  When the arguer stands to gain in some important way if their conclusion is true.  I.e., there is a personal benefit to the arguer if their position turns out to be true or is believed to be true.
Conflict of Interest:  Vested interest on steroids.  "A conflict of interest is a set of circumstances that creates a risk that professional judgement or actions regarding a primary interest will be unduly influenced by a secondary interest.  Primary interest refers to the principal goals of the profession or activity, such as the protection of clients, the health of patients, the integrity of research, and the duties of public office.  Secondary interest includes not only financial gain but also such motives as the desire for professional advancement and the wish to do favours for family and friends, but conflict of interest rules usually focus on financial relationships because they are relatively more objective, fungible, and quantifiable." (Wikipedia)

B.  Examples
Climate Denial Arguments from Vested Interests
Wakefield
Kids for Cash
Oil and Gas Policy and Political Contributions
Oil and Gas and Politicians
Alcohol Industry and Marijuana Legislation
Immigration Law and Private Prison Industry



C.  Example of HW:  Conflicts of Interest in Politics?  Say it ain't so!
Oil and Gas
http://votesmart.org/

Arguments from Authority:  Legitimate and Illegitimate
2.  Degrees of Legitimate Arguments from Authority
   a.  Individual vs. Individual
   b.  Individual vs. Consensus
   c.  Galileo/Einstein Argument
3.  Problems Part 1
   a.  Analogy to the overconfidence bias.   
Not all policy disputes turn on issues amenable to scientific investigation, of course, so no one would or should expect that what scientists have to say will resolve every conflict. But when empirical assessments of risk and risk abatement are exactly what members of the public are fighting about, why is the prevailing opinion of scientists—on questions only they are equipped to answer—so infrequently treated as decisive?
4.  Solutions: 
   a.  The secret philosophy professors don't want you to know:  The Ol' Switcheroo
   b.  A priori agreement.
5. Articles:
http://www.motherjones.com/environment/2014/05/harry-collins-inquiring-minds-science-studies-saves-scientific-expertise
Periodic table of Expertise
Problems with Online Pay-to-Publish Journals


Homework:
1.  Go to http://influenceexplorer.com/ and pick an industry.
2.  Summarize which party is getting most of the money and by about how much.
3.  Pick one or two of the top recipients then put their name in the search field of http://votesmart.org/. "Voting Record" should come up in the search results.
4.  Look to see how they voted on legislation that involves the industry/interest group from which they received their funding.
5.  Write a brief paragraph on your own thoughts on the relationship between money and democracy.  How well are the general public's interests represented by elected officials?  Propose one or two reforms and provide an argument for how your reforms might address particular problems with the existing system.
6. The Revolving Door and Financial Regulation

Lesson 2B: Biases, Vested Interest, Conflicts of Interest

Introduction
The previous chapter on arguments focused on how differences in systems of beliefs can give rise to arguments. People with disparate systems of beliefs hold differing values and beliefs, which in turn influence what they consider to be basic assumptions (to be used in an argument as premises).  Reasoning from these different sets of basic assumptions often yields conflicting conclusions about what is the ethically and politically 'right' thing to do (generally and specifically).  

It should also be mentioned that sometimes the difference isn't so much that the values are different in an absolute sense, but that they are held to different degrees.  For example, much research in social psychology has shown that conservatives favour attributing moral status and providing resources to "in group" members, while liberals often concern themselves more with "out groups" (than do conservatives).  This is not to say conservatives don't care about "out groups" or that liberals don't care about "in groups"; instead, it is a matter of relative value.

For more information on the psychological differences between conservatives, liberals, and libertarians check out this great website: http://www.moralfoundations.org/index.php?t=home

So, why does this all matter to us as critical thinkers?  There are a host of reasons, but here are two important ones:  The first is that understanding the role of systems of belief in an argument can help make us aware of biases in the premises (both in our opponent's argument and in our own).  The second is that understanding an opponent's bias can give us hints as to how we might sway the opponent to our own point of view.

Mommy?  What's a Bias?
A bias is an "inclination or prejudice for or against" some fact or point of view.  In arguments, what this means is that we are prone to giving undue favour or neglect to some fact or point of view.  Everybody does this (except me, of course); it's part of being a human being.  As physicist Richard Feynman says, "the first principle [of critical thinking] is you must not fool yourself--and you are the easiest to fool"!

There is a wealth of evidence in the psychological "litra-cha" demonstrating that we begin with our position first and then collect or reject evidence and reasons to support that pre-existing position.  Our pre-existing position is usually grounded in emotion/preferences rather than "Reason."

The more emotional our investment in an issue, the greater the likelihood that some kind of bias has crept into our supporting arguments--in attributing either undue strength to a supporting assertion or in overlooking or dismissing contrary reasons or evidence.  To quote another philosopher, David Hume, "reason is slave to the passions."

Biases:  Too Illegit to Quit?
We've established that people (except me) have biases.  Now what?  Do we automatically reject everybody's arguments 'cuz they're biased?  Nope.

We can make a distinction between legitimate and illegitimate biases.  The distinction will depend mostly on how opposing reasons, evidence, and arguments are portrayed, and on whether there are any intentional important omissions.  As you might have guessed, an illegitimate bias is one in which the arguer poorly or dishonestly represents the aforementioned elements, or in which the bias leads to weak logical connections between premises and the conclusion.  Any website or blog with a strong political bias in either direction will usually provide excellent samples of arguments with illegitimate biases.

A legitimate bias is simply favoring a point of view, but not in a way such that the opposing position is misrepresented.  It allows an impartial observer to fairly evaluate the proposed point of view.  For example, I think everyone should be allowed to own an assault rifle-bow that fires swords for self-defense.  That's my point of view.


My argument is that they are not prohibited by the Constitution, therefore, they should be legal.  

My opponents reply that the 2nd Amendment isn't about arms for personal self-defense but about a well-regulated militia that should be controlled by the Gov't.  They might also argue that just because a small group of people a few hundred years ago voted on something, it doesn't mean that we need to accept it now.  Societies and circumstances change, and the best laws reflect that.

Notice that even though I'm biased toward people owning assault rifle-bows that fire swords, I don't distort the opposing arguments.

Vested Interests
A vested interest is when an arguer (or someone paying the arguer) stands to benefit from their point of view being accepted.  When vested interests are involved there's a very high likelihood of illegitimate bias.

For example, when certain industries spend millions of dollars to pay lobbyists and "donate" to politicians, we can be fairly certain that their arguments for special treatment or exemption contain illegitimate biases.

Not all vested interests need be financial.  One might be motivated by the desire for power, fame, revenge, attention, sex, etc., or to get out of trouble/prove one's innocence.

We should be cautious of dismissing arguments out of hand just because the arguer has a vested interest in the outcome.  That they have a vested interest tells us nothing about the argument's validity, which should be evaluated independently.  When there is a vested interest, it simply means we should be extra cautious about illegitimate biases (and omissions).  It doesn't automatically follow that their argument is invalid.

Conflict of Interest
A conflict of interest is a vested interest on steroids; i.e., when vested interests are extreme.  In such cases there is usually an ethical issue involved too, and in professional settings, conflicts of interest have to be disclosed.

For example, in medical research, if a university study of a drug is funded by the company that produces the drug, this is a conflict of interest for the researchers for obvious reasons.  It must be disclosed at the beginning of any research that is published.  This is actually quite a big problem in medical research because drug studies that are funded by the drug producer systematically report more positive results than when the same drug is studied by a neutral party.  For more info check out the link:
http://blogs.scientificamerican.com/guest-blog/2012/09/23/can-the-source-of-funding-for-medical-research-affect-the-results/


But bias in medicine (and elsewhere) isn't only on the "proponent's" side.  Often people who oppose something for ideological reasons are just as guilty of bias.  

An important recent example of a conflict of interest in medicine that wasn't disclosed was Andrew Wakefield's anti-vaccine research article in the Lancet.  What he did not disclose in his research was that he had been paid several million dollars to do research on vaccines by a company that was developing an alternative to the conventional vaccine.

There was a clear conflict of interest because he stood to gain so much if his research showed that conventional vaccines are unsafe while the company that had funded the research was developing an alternative.

In the end, his results were never replicated, his methods were shown to be unethical, his data were drawn from a tiny sample (12 children), and the article was subsequently retracted by the publisher.  However, because of the fear generated by his "research," there was and continues to be tremendous damage to public health.
http://debunking-andrew-wakefield.tumblr.com/

Summary:  
We all have biases.  What matters is the degree to which they distort the presentation of evidence and reasons in arguments both for and against the arguers position.  Biases are illegitimate when they cause distortion such that arguments cannot be fairly evaluated.  

For some excellent examples of how biases affect how we interpret the world, this is a beautiful article.

Sunday, January 26, 2014

Lesson 2A.3 Audiences

Review
Hey guys, in the last section we looked at systems of belief from the point of view of the arguer.  Doing so helps us to become better critical thinkers in two important ways:

(a)  it helps us to identify what might be hidden assumptions in the argument that we might (i) attack or (ii) (if we agree with the position) try to strengthen.  

(b) When we turn a critical eye on our own beliefs and values, understanding systems of belief allows us to identify premises or beliefs that might not be accepted at face value by our opponent(s).  If we can identify these elements, we can anticipate where our opponent will attack our argument and launch a pre-emptive defensive strike by strengthening those premises/assumptions.

One final review note is to recall the elements that influence our system of belief 
(often unbeknownst to us). They include things like: race, sex, nationality, culture, language, family, economic class, social class, religion/non-religion, peer group, career, education, and whether you like cilantro or not. 

Systems of Belief and the Audience
Obviously facts about the person making the argument are important (especially when it's me!) but as critical thinkers and arguers it's also good to consider the system of belief of the audience to whom the argument is addressed.  

There are two general ways to "chop up" the concept of 'audience': (a) according to clusters of values and (b) according to anticipated receptivity to our argument.

When we consider an audience as a group that shares common beliefs and values we call this a specific audience.  Some examples would be Catholics, faculty, Democrats, hockey fans, the NRA, the ACLU, Hispanics, tourists, people that live in Summerlin, philosophers, and so on.  There are often specific audiences within larger specific audiences.  For example, Republicans are a sub-group of Americans, and 'Ron-Paul Republicans' are a sub-group of Republicans.  Wherever there are 'clumps' of values, there are specific audiences.

A universal audience is more of an abstract concept than an actual flesh-and-blood audience.  While it's debatable whether there is a set of (non-trivial) values that unites everyone, you should think of a universal audience as "the common person."  As an arguer addressing a universal audience, you'd want to begin with assumptions/values/beliefs that just about any rational person could agree to (such as: pizza makes us happy).


Suppose you were a Ron-Paul-lovin', Ayn-Rand-worshipping, pick-up-truck-drivin' Libertarian and you wanted to logically explain to a Karl-Marx-lovin', Grateful-Dead-listenin', group-hug hippy Liberal why there should be no restrictions on the right to bear arms.  It might do you some good to consider something about your audience's values and basic assumptions.  Much of what you might say about gun rights would take for granted things that those damn hippies would object to!


So, what should you do?  Well, what you'd want to do is "construct an argument that makes an effort to respond to your audience's convictions and concerns" (p. 19).  If you begin with premises/assumptions/values that you share with the hippies, then you stand a chance of crafting an argument that they will at least consider.

Conversely, if you begin your reasoning with premises that bear no relation to those of your audience, they won't even try to follow your reasoning because you are beginning with premises to which they don't agree.

Key point:  A good argument is sensitive to the values/beliefs/convictions of the intended audience.  A good arguer will modify their argument depending on the audience.

Three Types of Audiences based on Receptivity:
Generally we can distinguish between 3 types of audience based on (anticipated) degree of receptivity to the argument. 


A sympathetic audience probably already agrees with many of the values connected to the conclusion of the argument.  For example, if I'm arguing against abortion to a group of evangelical Christians, I probably don't have to spend much time arguing for the premise that a fetus is a person with rights.

An open audience does not share our position but is open to considering it.  Such audiences generally don't have values too disparate from those of the arguer, so we don't have to search too hard to find common ground in values and beliefs from which we may begin to reason toward our conclusion.

A hostile audience does not share our position or many of our values and beliefs, and is not open to considering it.  For obvious reasons this is the toughest type of audience to argue with.  When common beliefs and values are scarce, it is difficult to find a starting point from which to begin.  Some political debates can appear this way because some groups value individual autonomy over collective needs.  When differences are so fundamental, it's hard to know where to begin.


Also, with a hostile audience, because the differences in beliefs and values are so fundamental, they are central to that group's identity.  Relinquishing those values might mean leaving the group, something to which most are averse.  The emotional component makes arguing with a hostile audience even more difficult because heightened emotions often shut us off to 'reason'.

The Flip Side
While it is very helpful to take into account your audience's beliefs and values, we should be cautious not to exploit them.  We see this happen all the time with cults, psychics, medical quackery, and--of course--politics.  An unscrupulous cult leader or "psychic" can appeal to an audience's values for reasons of exploitation.


Recall from previous lessons that most of our values and beliefs are acquired uncritically as a result of how we experience the world.  Because of their uncritical origins, we are often eager to agree with anyone who shares our beliefs/values.  Right?  Now look into my eyes and give me all your money!

However, while you might be able to persuade a particular audience with an argument that appeals to specific values, once you try to apply that same argument to a broader audience, you will surely encounter resistance!

Conclusion
The lesson here is that it is important to take into account the values and beliefs of your audience in how you present your argument.  The most effective arguments begin with the values and beliefs shared by the specific audience at which the argument is targeted.  And then, using reason, reasons, and evidence you lead them down the garden path into the waiting jaws of your conclusion.

A caveat is that, while your argument should be tailored to a specific audience, it should not rely so heavily on the beliefs and values of that audience such that a more general audience wouldn't take the argument seriously. 

Lesson 2A.2 Systems of Belief and Sidgwick's Insight

Review: 
In the last post we gave a formal definition of an argument: a set of reasons and evidence that support a conclusion.  We also discussed the two main components of an argument: the premises and the conclusion.  Recall that the conclusion is the central claim that the arguer is trying to make.  If they do their job well, they will support that claim with relevant premises (i.e., reasons and evidence).  If they don't, they might as well just be waving their hands in the air and jumping up and down.

In this next section we will look at how certain facts about the person or group making an argument influence various aspects of their argument.

Arguers and Systems of Belief
Overview
As much as many of us would like to think we are objective thinkers, we often are not.  Hume famously argued that "reason is slave to the passions."  The general idea is this:  We begin with a position that we are emotionally attached to and we collect evidence and arguments to support what we already believe.  This is as opposed to how most people think they operate; that is, collect evidence and consider reasons and then see where that leads.  There is a wealth of psychological research showing that Hume was right about most people, most of the time.

Mommy, Where do Beliefs Come from?
As we go through our early life, we uncritically acquire a "web" of beliefs based on experiences.  How we experience the world, and the types of experiences we have depend heavily on things out of our control.  Typical elements that form our system of belief include: race, culture, socio-economic class, attractiveness, gender, education, family life, religion/non-religion, nationality, geography, and so on.

Examples of how politics and social media interact to shape our view of the world.

Often, before our ability to reason develops, many of these beliefs become central to our identity.  To have them shown to be false would be to admit that something important to our identity is false.  Having our identity come under scrutiny is often an emotionally painful experience and so we vigorously protect the beliefs that form the core of our identity--often ignoring contravening reasons and evidence.

So, why does this all matter?  Because when it comes to arguments about things that are really important to us, our arguments are often driven by emotion rather than by reason and even-handed evaluation of reasons and evidence.  So, on such issues, instead of entering the debate with the attitude, "well, let's look at the reasons and evidence for both positions and evaluate which is best," what often happens is we enter a debate with a pre-existing position.  We then use arguments to defend the position that we already held--no matter the relative quality of argument for the other position.

In other words, we are emotionally attached to a conclusion before any real critical thought begins.  From that conclusion, we use argumentation, reason, and logic to arrive where we already were!  Our reason is slave to the passions; i.e., reason serves to justify the positions we already hold.  Or, to paraphrase Hume again, "man is not the rational animal but the rationalizing animal."

(Note: There are several recent trends in psychology and philosophy that argue that rather than having a distorting effect, emotions play an important role in various domains such as social and ethical reasoning.)

Now, to be clear, there's nothing wrong with holding a position on an issue; however, what is important is to be aware of how our web of beliefs and emotions influences our ability to effectively argue for a position and evaluate the issue.

Elements of a Web of Belief
As critical thinkers we need to pay close attention to how a person's web of beliefs influences the assumptions they will make; that is, what sorts of things will they take for granted. For example, in the abortion debate, opponents of abortion will often take it for granted that a fetus is a person.  This assumption stems from many facts about their personal history.  Such facts might include: race, religiosity and religion (or lack of), gender, sex, education, career, and socio-economic class.

Some proponents of abortion rights might even agree that a fetus is in some ways a person.  But for them the desires of the autonomous woman carrying the fetus outweigh those of the fetus.  But is this a scientific question where someone in a lab coat can put all the fetus' desires into a beaker, put all the pregnant woman's desires into another, then put them on a scale and measure which have more weight?  No.  To demonstrate that one set of desires has more weight than the other requires argument--and that argument must begin from common premises if opposing sides are to have any hope of agreement.

For many people in this debate, the answer to this question will depend heavily upon the different elements that helped to build that individual's web of beliefs.  Their position will likely not come out of having spent months studying the academic literature on the issue and carefully evaluating the arguments on all sides.  It is for this reason that arguers must seek out and begin with common ground with their opponents.

Why Do the Elements that Build Someone's System of Belief Matter?

How to Win an Argument/Sidgwick's Insight
What is interesting is that based on a person's web of beliefs we can sometimes "reverse engineer" some of the elements that influenced their web of beliefs and also identify what many of their unstated assumptions are.  Doing so can be an important step in deciding how to engage with the arguer.

If our goal is to show our opponent why his argument is problematic or to persuade him to our point of view, we must be able to search for and identify common ground from which we can build to our conclusion rather than his.  If we both begin from different assumptions, no progress will likely ever be made!

Sidgwick's Insight: A key to bringing someone to your point of view is to find common assumptions (premises) and show how your conclusion, rather than your opponent's, follows from these assumptions.

Ami's/Psychological Insight:  There is a growing body of research in psychology showing that people either reject or disbelieve facts if they conflict with their existing beliefs.  In short, trying to bring someone around to your point of view by citing facts often accomplishes nothing at best but more likely results in further recalcitrance.  Because of the counter-intuitive "weakness" of using factual evidence in many debates, it's often a more successful strategy to begin your argument by appealing to values that are shared in common with your opponent (or audience).


How to be a Philosopher
A true philosopher seeks truth above all else--or at least (non-foolish) consistency.  Just as we can use our understanding of systems of belief and the elements that form them to evaluate others' arguments, we can also use this information on ourselves.

It would be foolish to think that, magically, we are the only ones without ideological blind spots and unexamined assumptions!  Introspection on how our own gender, culture, religion/non-religion, family, education, career, peer group, etc... shape the way we experience the world (and in turn our beliefs and assumptions about it) is a valuable exercise. Doing so allows us to see where we have uncritically accepted certain views.