Barriers and Obstacles to Critical Thinking: The Individual/Psychology and the Group/Sociology

These barriers are natural, but they provide unreliable standards for judgment, or bypass judgment altogether. While it is normal for us to be individuals with private perspectives, thoughts, emotions, desires, motivations, and self-interests, many barriers to critical thinking come from this aspect of human nature. Likewise, while we are innately social beings who require coexistence with others for development and success, this aspect of ourselves can also prevent us from thinking critically. All of the points below are potential barriers to critical thinking. To think critically, one must first make sure that none of these barriers is in effect.

1.) Egocentrism: Using one's own perspective as authoritative, or one's own self-interests as standards, for no other reason than that they are one's own perspective or interests. One is being egocentric when one fails to view reality or evaluate beliefs from any vantage point other than the one that comes most naturally. This is tragic in the sense that such a person will be unable to see reality as anything other than what their limited view or motivations make of it, which is not even helpful for the pursuit of their own self-interests.

2.) Self-serving biases: We tend to overestimate our own strengths and merits while ignoring our weaknesses and faults; it is easy to think that we are above average for no other reason than that we are us; others tend to be blamed for our failures, and we tend to take all the credit for our successes. This is a kind of double-standard we apply to ourselves. It prevents us from seeing things as they are, because we distort our vision with our own self-delusions.

3.) Subjectivism: We commit the subjectivist fallacy when we think that, just because we believe something is true, it is true for no other reason than that we believe it. It is easy for us to be deluded into thinking that our beliefs are true just because they are ours, but this is obviously misleading. A further problem arises in that our brains release dopamine, which makes us feel happy, when we think we are right. If I say, "You're right," you are likely to feel happiness at this. In this way there is a physiological incentive for us to think we are right, whether we are right or not.

4.) Confirmation bias: One commits confirmation bias when one attempts to prove that a belief one holds is correct by looking only for evidence in its favor. This sounds like proper investigation, but it is flawed. If one only looks for evidence that confirms one's beliefs, all one will accomplish is the confirmation of those beliefs. One must also look for disconfirming evidence, that is, evidence that would contradict the belief. There is enough evidence in the world to confirm any belief at all. The trick is to figure out which beliefs have the most and the best evidence in their favor. To do this, we have to try to confirm and disconfirm beliefs at the same time. There may be a tiny fragment of confirming evidence for my belief; if that is all I look for, I could be ignoring a mountain of evidence that disconfirms my belief.

5.) Selective attention: When a particular belief of ours limits our ability to fairly analyze all the relevant information, we are selectively tuning out certain information. This is a form of narrow-mindedness that should be avoided because it cuts us off from the facts we need to make good decisions. One is being selectively attentive when a prior belief makes certain later beliefs impossible to hold or even evaluate, because the information that would lead to those later beliefs is simply ignored.

6.) Avoidance: When we intentionally block out certain sources of information because they do not conform to our pre-existing beliefs. This is a way of sidestepping the hard work of evaluating the support for beliefs: by simply not paying attention to other beliefs, ours look like the best-supported ones. We are essentially making sure that our beliefs are the only game in town as far as we're concerned.

7.) Anger: We become angry when our beliefs are challenged, so we react toward whatever challenged them in a hostile way. The impulse is that if we can destroy the source of conflict, or force it to retreat, then our beliefs will somehow be more true. This is irrational. Anger is an emotion, and it is incapable of discerning true from false beliefs because it is a response, not a way of evaluating. By forcefully ejecting the source of opposition, we simply make it easier to seem correct. Anger is an understandable response, but we should not feel attacked when our beliefs are challenged. We should want to see whether the challenge is legitimate. I identify with my beliefs, but I am not well served if I identify with false beliefs.

8.) Cliche: A cliche is a platitude that may be used to deflect, disarm, or simply end the evaluation of beliefs. "To each their own," "Everyone is entitled to their own opinion," or "That's true for you but not for me" are simply ways to end an otherwise legitimate search for true beliefs because we don't want to be proven wrong or to hurt someone else. When a cliche is employed this way, it works against critical thinking. Be cautious of tactical uses of cliche: I may be inclined to use one to end a debate when I know that I'm about to lose. Don't let me off the hook so easily.

9.) Denial: We commit denial when we simply refuse to connect evidence to the beliefs it supports. I am in denial when I simply refuse to think, act, or behave in accordance with what is known. This is generally done because one would rather hold the belief, because it matters that much to them for some reason, than have a true belief; it happens when one thinks that beliefs are supposed to do something other than be true (like make us happy, make us feel good, give us power, etc.).

10.) Ignorance: We are prevented from thinking critically if we are ignorant of relevant information we need in order to evaluate a belief. This can occur deliberately or accidentally. I can choose to be ignorant of information so that I'm not guilty of blunt denial, if holding my belief matters that much to me. I may willfully refuse to understand a topic because I am afraid to uncover information which would force me to either change my cherished beliefs or bluntly commit to denial.

11.) Struggle: I may choose to struggle as a way of deferring my understanding of a topic instead of allowing that understanding to force me to revise my beliefs. If evidence leads me to believe that my most cherished belief is false, I may choose to make the struggle against that evidence my focus rather than focus on my belief. I may simply bury myself in information until I'm lost, and choose to stay lost, so that I don't have to proportion my belief to the evidence. I would be constantly working on understanding but refusing to acknowledge what I need to do with that understanding. At a certain point, conclusions can be drawn. If I let the fact that we don't have absolute knowledge about the issue prevent me from drawing a conclusion I don't like, then I'm choosing to struggle. Humans do not have absolute knowledge about anything; inquiries are always ongoing, but we have more or less evidence in certain cases and so should have more or less confidence in the beliefs which that evidence supports. Letting the imperfections of human knowledge keep us from accepting a belief we don't like is a cop-out.

12.) Distraction: When I defer coming to a conclusion I don't like by filling my mind with irrelevant matters and tasks, I'm simply distracting myself in a tactical way to preserve the lifespan of the belief I'm worried will have to be revised. There will always be something to distract ourselves with, and others may also keep us in the dark by distracting us. Keeping priorities regarding truth intact regardless of where that pursuit leads is critical thinking. Anything else is running away.

13.) Rationalization: We rationalize whenever we come up with a justification for a belief or act after we have already decided to hold the belief or do the act. This is the reverse of the proper order: only accepting a belief when we have a good reason to. If one holds a belief and only needs to justify it later, then by definition one didn't have a good reason to hold that belief in the first place. Making a justification up later is simply writing a story to make the belief seem more acceptable after it has already been accepted. One essentially cherry-picks a belief and then concocts a story to make it seem more acceptable to oneself and others. This will never yield a true belief except by accident.

14.) Double-Think/Double-Standard: One is guilty of double-think when one holds two contradictory views as true. A contradiction is when two beliefs cannot both be true at the same time; thus, at least one of the beliefs is false. It is easy to do this, since our worldviews are collections of a great many beliefs which we generally pick up in a scattershot way. We also tend to use our beliefs pragmatically: as long as they work, they are good beliefs. Thus it becomes possible for me to do something like believe that there should be a separation of church and state, then vote for a candidate whose agenda represents that candidate's religious beliefs because those beliefs match mine. This could be ignorance of the contradiction on my part, or I may understand the contradiction but not care because I gain from it. In one context, it makes sense to believe one of those beliefs. In another context, the other seems fine. Taken as a whole, they contradict. The deeper concern is when we double-think in a self-serving way, that is, when we flip-flop our principles whenever it is advantageous for us personally. If an opposing candidate has religious views that don't conform to mine, I may play my separation-of-church-and-state card. When a candidate comes along whose views do support mine, I may play my vote-for-the-person-who-will-best-represent-my-political-interests card. This is simply abandoning a commitment to true beliefs so that one can use them as weapons.

15.) Stereotypes: A stereotype is an overgeneralization about a group, normally of people. We'll look at overgeneralization more later, but understand it this way: I overgeneralize when I conclude something about a group based on evidence from an insufficient number of members of that group. We extend past what the evidence can account for. We have to make generalizations to live: if I refuse to pay for my groceries because I haven't determined whether every cashier in the world is reliable, and so don't know about the one in front of me, I'm being ridiculous. At a certain point, we must decide. Then again, we are too easily lazy with our generalizations and too able to cherry-pick, since generalizing is an imperfect way of understanding. Stereotypes are employed against people as a way of gaining some kind of power over them. It is easy to remove someone's power or worth by placing them in an incredibly broad category, asserting some common feature of that category which would justify the treatment, then assuming that the person is one of the members of the group with that feature. Overall, stereotypes are a way of making cheap determinations on insufficient evidence.

16.) Fear of Challenge: Many barriers come from a fear of revising cherished beliefs, undermining oneself, or having to do the hard work of determining what is true or good. Critical thinking is hard, tedious, and uncomfortable. The urge to take the path of least resistance is strong. But if we don't challenge ourselves, we simply will not grow, personally or as a society. We'll simply stop at a certain point, saying things are good enough. This is a kind of laziness and avoidance which essentially ends the human project of advancement in every way. We are never finished, never perfect, but we can always get better. The only thing that is good enough is realizing this and trying to get better.

17.) Misperception of random data: Humans impose meanings on otherwise meaningless information. The human mind is a meaning-producing machine: it organizes all of the sensory data that pours in according to certain concepts, rules, and beliefs. In the absence of order, we make order, no matter what. This is why, in ambiguous phenomena like smoke, clouds, crumbly walls, or cheese-puffs, we see things that aren't there. The lesson for critical thinking: don't assume that the world is the way it seems at first glance. Make sure you're not putting meaning where it doesn't actually belong. If you believe something, make sure that there's more going on than just you making it believable.

18.) Memorable events error: The human mind recalls events primarily according to the strength of their emotional impact. We recall evidence from memory based on things like fear, love, trauma, or bliss, and then we use that recollection as a standard. This runs the risk of giving us a bad belief to use for evaluation, which then causes selective attention. If I need to decide whether or not I should eat a salad, recall only the salad that made me sick ten years ago, and then refuse to eat salad now because it might make me sick, I'm simply misrepresenting the probabilities because I'm letting my emotions give one factor a power it doesn't have.

19.) Innumeracy: Humans are notoriously bad at judging probabilities and statistics, or even doing simple math, in an experiential way. We often commit the gambler's fallacy: thinking that our actions determine the outcomes of random events (pay attention to the meaning of the word random). We also tend to mythologize events, making probable things seem improbable. For example, we often marvel at what a coincidence it is when something happens which we were just thinking about. In fact, we are retroactively picking that thought out of the thousands of thoughts we had in the previous hour and assigning it a value it doesn't have, based on the fact that it corresponds to an event we just experienced. Think of a friend calling you right after you thought of them. You likely think of friends every minute in an unconscious, quick, associative way without noting it. When the friend calls, you retrieve the thought and assign it a kind of magical improbability. When we say, "What are the chances of that?", the chances are often quite good. Don't assign a mystical value to these situations. The truly improbable claim is that your thought of the friend somehow caused them to call, which is far more improbable than the coincidence you made too much of. The problem is that, if we add up small situations like this, it becomes too easy to think the world allows for some kind of magic in general.
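
To see how ordinary such coincidences are, here is a rough back-of-the-envelope simulation, not from the lecture: every number in it (THOUGHTS_PER_DAY, CALL_PROB_PER_DAY, and so on) is a made-up assumption for illustration only. It estimates how often, over a year, a friend's call would happen to land within ten minutes of a fleeting thought of that friend, purely by chance.

```python
import random

# All numbers below are hypothetical assumptions for illustration, not data.
THOUGHTS_PER_DAY = 20        # brief, unnoticed thoughts of one particular friend
CALL_PROB_PER_DAY = 2 / 30   # that friend calls on roughly 2 random days per month
WINDOW_MIN = 10              # a call within 10 minutes of a thought feels "psychic"
WAKING_MINUTES = 16 * 60     # minutes awake per day

def eerie_calls_per_year(rng: random.Random) -> int:
    """Count days in one simulated year where a call lands near a thought."""
    hits = 0
    for _ in range(365):
        thoughts = [rng.uniform(0, WAKING_MINUTES) for _ in range(THOUGHTS_PER_DAY)]
        if rng.random() < CALL_PROB_PER_DAY:           # does the friend call today?
            call_time = rng.uniform(0, WAKING_MINUTES)
            if any(abs(call_time - t) <= WINDOW_MIN for t in thoughts):
                hits += 1
    return hits

rng = random.Random(0)
years = [eerie_calls_per_year(rng) for _ in range(200)]  # average over 200 simulated years
print(f"average 'I was just thinking of you!' calls per year: {sum(years) / len(years):.1f}")
```

Even under these modest assumptions, chance alone produces several "eerie" coincidences a year, which is exactly the point: the improbable-seeming event is, on inspection, quite probable.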

20.) Self-fulfilling prophecy: When a belief causes its own confirmation, it is a self-fulfilling prophecy. If I believe that I am unlikable and so become antisocial, fewer people will like me, thus confirming my initial belief by bringing into existence evidence that wouldn't otherwise have been there. Make sure that beliefs don't produce their own evidence. This is a kind of accidental confirmation bias which is quite dangerous. If I believe that all Muslims are terrorists, then go about destroying and alienating as many Muslims as I can, I'm quite easily making more terrorists, which only confirms my initial belief more and more. To avoid this kind of self-fulfilling prophecy, one must keep asking why until one reaches the origin. If I don't look back far enough, I may just see that I'm friendless and therefore unlikable, or that terrorists are more prevalent and hence that Muslims must be terrorists.

21.) Absolutism: See the previous lecture notes on this topic. Also consider relativism a barrier.

22.) Conformism: When one simply accepts the values, beliefs, and practices of their social group without any additional evidence in favor of the legitimacy of those values, beliefs, or practices, they are simply conforming to their group without thinking critically. We do this when we are young because we have few other options; we are simply socialized into whatever random society we happen to be born into. Maturity, however, brings the option of revising that initial random selection by questioning your socialization. Experience with diverse groups is crucial here, since the tools to fight conformity come from outside the initial group. Only by maximal experience with diversity can we be sure that we are not only avoiding conformity, but picking the best beliefs for the best reasons through synthesis and comparison.

23.) Ethnocentrism/group-think: This is the same problem as egocentrism, but with a broader circle. Instead of only seeing things from the perspective of one's own subjective view, with ethnocentrism or group-think one only sees things from the perspective of whatever social group one happens to be in. The limitations are exactly the same, however. In fact, the risk is worse. The world will consistently disconfirm the egocentrist's conviction that they are, in fact, its center. The world will likewise disconfirm the ethnocentric person's conviction that their group is the center of it all, but it is far easier to ignore or rationalize this disconfirmation. An egocentric person may be able to push their narcissism to nearly psychotic proportions, but not often. A person who completely conforms to their socialization, however, by definition has a crowd of other people to use for confirmation. An egocentric person has to start blaming their failures to understand and act effectively on everyone and everything else. An ethnocentric person still starts blaming everything external for their failures, but they are able to see others like them and use that as confirmation that they aren't crazy. The problem is that they might simply be in a group of irrational people who use the fact that there are others like them as confirmation that they are behaving or thinking rationally. If they are properly ethnocentric and committed to groupthink, full conformity, they'll have no other perspective to judge by than their group's standards, so even their definition of 'rationality' will be determined by their irrational group. This is why diversity is the only antidote to fanaticism. At the very least, we should be concerned about using group-beliefs as a lens through which to interpret the world. While this is unavoidable, not all beliefs track reality in equally adequate ways. Group expectations can incite mass delusions which are nearly unstoppable. A collective delusion is simply a bogus worldview shared by a group. At that point the irrationality has officially become all-powerful. The Salem Witch Trials are a good example of this. One may see the Holocaust similarly: it was based on a collective delusion that genocide would somehow be advantageous to enact or permit.

24.) One of us/them: From an extreme enough kind of groupthink, it becomes easy to see the social world in absolutely dualistic terms: either one of us or one of them. This is easy to do for extremist groups, who get a lot of push-back from the world around them. They are able to turn that disconfirmation into confirmation by blaming that world and the others in it, rather than their own irrationality, for the shortcomings caused by their limited perspective. This is a kind of self-serving bias at the level of group psychology, in the sense that "they are bad and we are good" is an absurdly simplified picture. The error makes sense: first, the limited perspective of group-think makes it hard for the group to be effective in its endeavors because it doesn't understand the world adequately; then the group can rationalize this frustration by externalizing it onto another group, since its members seem rational enough to themselves and have no external standard to check against; finally, this act of turning another group into an evil group makes sense to the initial group, since they don't actually understand the group they've just overgeneralized anyway, due to the initial lack of perspective which caused the whole mess in the first place. A significant danger comes from groups whose identity is either threatened or loose to begin with. If a group's identity needs defining, strengthening, or clarifying, it is cheap and easy to construct the revised identity as a negative definition: we are not them. Groups that lack substance make this move easily. Those who wish to construct a certain group will use this to their advantage. This often takes on a moral connotation: they are bad; we are not them, so we are good. Nothing makes a group cohesive like predicating its identity on an enemy when that identity is otherwise illegitimate, broken, or impotent. Those who seek control over groups know this, and are able to construct obedience by constructing enemies to fear, then mobilizing the obedient followers with hate. This is the road to political authoritarianism. You should be able to see it interpersonally as well. When two friends are talking together but have very little to talk about, the conversation can easily be propped up and the bond deepened by listing off all of the ways that they dislike a third, non-present friend. In this way, groupthink creates myths that serve the group, myths which often take on dualistic dangers with real consequences.

25.) Peer-pressure: Peer-pressure is a way for social groups to maintain themselves by ensuring conformity through coercion. All social groups are made of individuals who are members of that group in virtue of sharing some kind of beliefs, values, or practices with the rest of the members of that group. When a member deviates from the group identity by not sharing the appropriate beliefs, values, or practices which define that group, the group self-corrects by (1) making sure the deviation is made explicit and (2) punishing the deviation. In this way, groups stay alive by reproducing their rules ("We believe, value, do this....") as though they were replicating DNA. Peer-pressure is that auto-correct. It may be laughter; it may be killing someone; it may be anything in between. People succumb to peer-pressure and then conform because there is an incentive to: they may be punished, humiliated, or harmed, or they could simply have the benefits of being a member of that group revoked. This could include anything from security, money, or a future, to the love of friends and family. None of this has anything to do with true beliefs in the slightest. It works the same for any belief. This is why we must examine our socialization. We may be coerced into propagating a falsehood because we didn't like the consequences.

26.) Face-saving: We save face whenever we believe, claim, or do something simply because we want to affirm that we are the kind of person who believes, says, or does that kind of thing. This is often for some kind of reward like group-membership or personal gratification. Again, it has nothing to do with the truth. It's a form of self-marketing.

27.) Anthropocentrism: Anthropocentrism is when we limit our view to the human perspective. It runs the same risks as egocentrism and ethnocentrism, just with a broader circle. Anthropocentrism views everything from the vantage point of humans. On the one hand, this is natural. On the other hand, the truth of things is not disclosed by their relation to humans alone. Things are not for humans. Reality both preceded and will exceed humans in time and space. Reality is not FOR human understanding or use. If we think that reality is FOR us, we will never understand it as it is. One must understand a phenomenon from a non-human perspective as much as it can be understood that way. This eliminates biases. First, this is how we avoid anthropomorphizing things, that is, turning them into people or understanding them by analogy with people. We used to understand everything this way, by attributing some human-like god or spirit to it. To understand the weather, we cannot understand it as the product of the human-like desires and actions of Zeus or Poseidon; we do better when we understand it as atmospheric variation, as what it is. To understand our cat, we'd do better to understand him as a cat, not as Mr. Fluffy the refined gentleman. Anthropomorphizing is not a good way to gain access to true beliefs; it is an obvious projection of ourselves onto the world around us. A further danger of anthropocentrism is believing that things exist for our own use. A biologist who understood a tree only through the lens of an object to be used would be a terrible biologist. They might understand the features of a tree that make it a good bit of lumber, or a producer of fruit, or a bit of cover to hide from enemies behind, but they could never understand the tree for what it is in itself, in all of its features. Crucially, the tree's being does not conform to our uses of it. It has features that exceed our use of it. It is something more. We have to eliminate our human biases as much as possible in order to objectively analyze things in their non-human being.

28.) Diffusion of Responsibility: This is when people are less inclined to take responsibility due to the presence of other people. In short, the more people there are in a situation, the less likely those same individuals are to take responsibility for something, compared to if they were in a situation with fewer people, or alone. The error lies in conforming to group-think unconsciously, where we for some reason ignore moral impulses under conditions which are irrelevant to the moral standing of the situation. If something is wrong in a group of 10, it is wrong in a group of 1,000 if nothing but the quantity of people changes. Humans, however, aren't wired to operate like this. Make sure you are not letting this diffusion affect your beliefs.

Prof. Eckel, U. Toledo, FA17
