Saturday, September 24, 2011

Pat Robertson Made Me Think...

Abstract

My assignment this past week was to write about the impact of critical thinking on my doctoral studies. Events conspired to provide me with the perfect opportunity to consider that question in a real-world application. I turned on the radio to hear a talk show host questioning a particular response that televangelist Pat Robertson had given a caller the previous week. In reviewing Robertson’s initial answer along with the cacophony of criticism it generated, I was struck by the inadequacy of a single authority to answer certain types of complex questions without the balancing effect of counterargument to keep the truth on track.

Treatment

Recently on his daily television show, The 700 Club, televangelist Pat Robertson was asked by a caller whether the husband of a dementia patient could divorce his ill wife and remarry prior to the sick spouse’s actual death. Robertson responded by suggesting that the healthy spouse’s experience of living with a spouse suffering from Alzheimer’s is similar to the experience of enduring that spouse’s death. As such, divorce was acceptable, perhaps even desirable, from a doctrinal standpoint. Although Robertson footnoted his answer by referring the caller to an ethicist for a more thorough examination of the issues, respondents immediately leaped on the latent inhumanity of such a claim and condemned it. I, too, was appalled by the answer’s horribly lopsided application of grace. But further investigation revealed that Robertson’s answer, flawed though it might be in some respects, also demonstrated evidence of the counterargument desirable within our religious institutions.

Human beings use various methods for seeking out reliable information, and the methods often shift depending on the type of information we seek (Meltzoff, 2001, pp. 3-6). For instance, when issues of guilt or innocence are at stake, we ask a jury to determine truth based on the evidence presented by opposing representation. On the other hand, if we are trying to decide whether or not to drink from a carton of milk past its expiration date, we are more likely to rely on a simple “sniff test” without further investigation. The caller on The 700 Club demonstrated yet another approach, which is to seek out the opinion of a trusted authority. This can be a useful method of discovering the veracity of a claim. A pharmacist can help a patient avoid danger by advising the proper administration of a medication. Consumer Reports magazine can help a buyer choose the most fuel-efficient car.

But what happens when the authority figure is in error? In The Experts Speak: The Definitive Compendium of Authoritative Misinformation, editors Cerf and Navasky (1998) catalog a list of quotes from respected authorities whose wisdom failed the veracity test. Among them was Alfred Nobel, founder of the Nobel Prize and inventor of dynamite. So convinced was he that his explosives would deter war, he declared, “My dynamite will sooner lead to peace than a thousand world conventions” (p. 274). Another famously inaccurate prediction came from Herbert Hoover. Just months before the stock market crash of 1929, he boasted, “We in America today are nearer to the final triumph over poverty than ever before in the history of any land” (“Our Presidents,” n.d.). While the naiveté of these quotes is readily apparent now, that was not the case for those who blindly trusted the judgment expressed then. Total reliance on the word of an authority leaves the seeker vulnerable to error.

Fortunately, other methods of testing the veracity of a claim are available to us. Critical thinking calls for any claim to be weighed and measured against counterarguments before a conclusion is drawn, making that conclusion far less prone to error. Pat Robertson himself alluded to the value of critical thinking by referring the caller to an ethicist and encouraging further exploration. Even within (or perhaps especially within) institutions like organized religion, which are founded on authoritative truth, multiple perspectives are necessary to stay on track.

In fact, Pat Robertson’s answer was in itself evidence of growing counterarguments to long-standing church tradition within the evangelical movement. Historically, both Catholic and Protestant traditions forbade divorce for any reason save infidelity or desertion (Smith, 1990). The contemporary Christian critic of Robertson most commonly quoted in the press supported this conservative view of marriage. Russell Moore, writing for The Baptist Press, condemned Robertson’s remarks as “an embarrassment… a repudiation of the Gospel of Jesus Christ” (2011). Ironically, Robertson’s expanded allowance for divorce is a turnabout even for Robertson himself: his Christian Coalition used scripture to lobby for reform of no-fault divorce laws in the 1990s (Drew, 1996). Moore suggests that Robertson’s reversal is due to the televangelist’s tendency towards a “prosperity gospel with more in common with an Asherah pole than a cross.”

Even so, Robertson’s more progressive interpretation has support elsewhere in the evangelical movement where counterarguments to the rigid stance of earlier church traditions have been examined. These newer suppositions consider the various nuances involved with a broken marriage covenant as we have come to understand them from a psychological or sociological perspective. Fidelity and commitment can be compromised in ways other than adultery and desertion and yet have the same profound effect, such as in the case of spousal abuse and addiction (Keener, 1991). As a result, divorce may have a wider application than originally defined. As troubling as Robertson’s answer was, it allowed for considerations other than church tradition to be examined for the purposes of informing church doctrine. It demonstrates the effectiveness of counterargument in informing matters of faith with the knowledge and understanding gleaned from other fields of study.

Conclusion

In summary, Pat Robertson made me think. Even more unexpectedly, he made me think critically. The inadequacy of his answer demonstrated for me the necessity of measuring authoritative judgment against the proofs of sound reasoning. But it was also proof of the corrective power of counterargument to move even some of the most rigid traditions towards a greater understanding of truth.

References

Cerf, C., & Navasky, V. S. (Eds.). (1998). The experts speak: the definitive compendium of authoritative misinformation. New York: Villard.

Drew, B. A. (1996). What God hath joined together: divorce laws challenged in Michigan. Freedom Writer. Published by the Institute for First Amendment Studies. Retrieved from http://www.publiceye.org/ifas/fw/9607/divorce.html

Keener, C. S. (1991). And marries another: Divorce and remarriage in the teaching of the New Testament. Baker Academic. Retrieved from http://books.google.com/books/feeds/volumes?q=0801046742

Meltzoff, J. (2001). Critical thinking about research: Psychology and related fields. Washington, DC: American Psychological Association.

Moore, R. D. (2011). First person: Alzheimer’s, Pat Robertson & the true Gospel. The Baptist Press. September 15. Retrieved from http://www.bpnews.net/BPFirstPerson.asp?ID=36119

Smith, D. L. (1990). Divorce and remarriage from the early church to John Wesley. Trinity Journal, 11(2). Retrieved from http://christiandivorceservices.com/Documents/Divorce%20And%20Remarriage%20From%20The%20Early%20Church%20To%20John%20Wesley.pdf

Our presidents: 31. Herbert Hoover. (n.d.). Retrieved from http://www.whitehouse.gov/about/presidents/herberthoover

Bring it on: Alzheimer’s. (2011, September 13). The 700 Club. Retrieved from http://www.cbn.com/media/player/index.aspx?s=/mp4/BIO_091311_WS

Sunday, September 11, 2011

The Invisibility of Bias: Friend or Foe

Abstract


The invisible nature of bias on the web can be both an obstacle to investigation and an opportunity for imagination.


Treatment


Issues of bias are not new with the advent of the Internet. What is new is the degree to which bias can escape detection. In a revolutionary way, the medium of the web makes possible the discussion of ideas with greater anonymity than ever before. This invisibility of bias can be either an obstacle or an opportunity, depending on how you use the information gleaned from the web.


Invisibility as obstacle


If your use of the web is to gather accurate information of scholastic integrity, this invisibility can be deadly. Alan November, on his novemberlearning.com website, points out a particularly disturbing example when he draws attention to a website whose URL is martinlutherking.org (November, n.d.). This website presents itself as one with historically accurate information about the influential civil rights leader. Further examination, however, reveals the site is actually registered to stormfront.org, a white supremacist organization with a revisionist agenda towards the civil rights movement. The ease with which an organization can cloak its bias can deceive an earnest seeker looking for information. This places an additional burden on the researcher to be especially vigilant in identifying the potential prejudices behind information in order to ascertain its veracity.


Invisibility as opportunity


If your use of the web is to problem-solve creatively, which requires the exploration of multiple perspectives, the invisibility of bias can be a salvation. Studies comparing electronic brainstorming with more traditional forms of brainstorming found that the anonymity of the web eliminated the problem of evaluation apprehension (Kay, 1995). Participants felt greater freedom to pose solutions without fear of recrimination. This freedom allows ideas to be evaluated on their own merit without the distraction of outside prejudices. In this way, the invisibility of bias becomes an invaluable partner in stumbling upon innovative solutions that otherwise might have been missed.


Conclusion


As an academic, I am tempted to see the invisibility of bias on the web only as an impediment to investigative research, but that same invisibility can be the launching pad for imaginative exploration.


References


November, A. (n.d.). Find the publisher of a website. Retrieved February 28, 2011, from http://novemberlearning.com/resources/information-literacy-resources/v-find-the-publisher-of-a-website/


Kay, G. (1995). Effective meetings through electronic brainstorming. The Journal of Management Development, 14(6), 4-4. Retrieved from http://search.proquest.com/docview/216298221?accountid=10868

Sunday, September 4, 2011

My Baptism in Web 2.0

When I began my career as a college professor, it was following a long hiatus from education. My college experience had been dominated by wise professors who doled out bits of wisdom in lecture format. So, naturally, I prepared similarly, expecting the same grateful appreciation from my students. Additionally, I viewed the web as nothing but an enormous well of information from which I could draw to sprinkle my presentations with film clips and YouTube videos. Thinking I was cresting the wave of technology, I would soon discover that I was all wet. Web 2.0 would baptize me into a radically new perspective on education altogether.


Before the first day of class had even begun, a student approached me asking if I would allow him to leave each class period an hour early because he had the opportunity to work in a job where he believed he would learn more than he could in my class. I was so shocked I didn’t know what to say. It would never have occurred to me to approach a professor in such a manner when I was in college. It would have been disrespectful, and I would not have believed I was qualified to make such a judgment. I quickly learned I was dealing with a very different generation, one accustomed to greater agency in its educational process. It surprised me to discover that this sense of agency lived quite peaceably alongside their respect for authority.


Once I got past this one student’s rather awkward breach of protocol, I realized students had equal access to the same enormous well of information I had drawn from when compiling my lectures. They needed and wanted something different from me. I came to view this as a tremendous advantage. Instead of taking precious class time filling the pool with knowledge they already had access to, we could spend more time manipulating and interpreting that knowledge.


However, this self-directed access is not without problems. The dangerous riptide latent in web-acquired knowledge is one of authority. My students frequently use the defense “I read it on the internet” with the confidence of a Rhodes Scholar after a yearlong research sabbatical. This is not surprising, since the web is still quite young and we are virginal in our interaction with it. Previous generations embraced television with similar naiveté before being burned by events like the quiz show scandals of the fifties. This tendency to swallow information whole still makes monitoring the knowledge acquisition process necessary on some level.


I’ve been experimenting with ways to deal with this problem while salvaging the obvious benefits of digital media. I’ve found that using applications like ScreenFlow to record repetitive concepts helps protect class time for more theoretical discussion. But this week a colleague introduced me to a more advanced paradigm that already exists at the high school level, where the traditional roles of homework and lecture are completely flipped. This paradigm relegates the lecture to video viewed at home, where it can be referenced repeatedly and supplemented with an unlimited supply of web sources. Simultaneously, this model redeems the realm of the classroom for Socratic inquiry and relational interaction.




For a teacher, this is salvation. I’ve seen the light. And it’s on the web.

Sunday, August 28, 2011

Looking for the black swan...



What is critical thinking? I recently “Googled” the term and the search revealed 22,500,000 items for perusal. It’s not surprising that so many people have something to say on the topic, considering the amount of information with which we are deluged each day. Critical thinking is a skill in danger of being shelved along with 8-track tapes and VCRs. A study by eMarketer revealed that the American consumer spends 11 hours a day interacting with major media. With such over-stimulation, it is admittedly easier to resort to a favorite litmus test than to evaluative thought. But critical thinking is a human activity necessary to prevent an inevitable drift away from reason. Consider the following two elements.


Critical thinking propels thought forward by asking questions. Dr. Richard Paul, an internationally recognized authority on critical thinking, expressed it this way: “Questions define tasks, express problems and delineate issues. Answers on the other hand, often signal a full stop in thought. Only when an answer generates a further question does thought continue its life as such.” The habit of asking questions prevents us from settling just short of the truth. The information gleaned is only as powerful as the question asked.


The twentieth-century philosopher Karl Popper illustrated this when his critique of the scientific method prompted his search for the elusive black swan. He proposed that if you postulated “All swans are white,” you would “prove” such a hypothesis not by seeking out every white swan, but by diligently searching for a black one. A multitude of white swans might satisfy a complacent questioner, but only a persistent, inquisitive mind finds the black swan and, finally, the truth.


Critical thought checks itself against its own assumptions and biases. We cannot move forward without assumptions any more than a baby can walk before he crawls. However, when we fail to acknowledge those underlying presuppositions, not only can our inferences err, but the unintended consequences of our reasoning can be detrimental.


I was struck by the profundity of such a concern when I read Philip Meyer’s article on Stanley Milgram’s obedience experiments. Per Meyer, Milgram’s studies were an effort to prove the Shirer thesis that Germans were fundamentally flawed and therefore willing accomplices to Hitler’s inhumane treatment of the Jews. Not only did his studies fail to prove the thesis, but his methodology resulted in intense pain and suffering on the part of his own “naïve subjects.” Under the pressure of the experiment, subjects would stutter, sweat, and break into nervous bouts of laughter to the point of seizure, among other disturbing manifestations. And even though subjects were eventually apprised of the ruse, is it not reasonable to consider the potential damage such self-knowledge might cause? When questioned about this ironic juxtaposition, Milgram’s explanations echoed eerily the justifications he had given his “naïve subjects” for continuing to inflict pain on an innocent person. One has to wonder whether, had Milgram acknowledged his bias against Germans, he might have been more likely to speculate about his own capacity to inflict pain and suffering.


In closing, critical thinking is hard. It means delaying a solution until information is poked, prodded, and sniffed. It means continuing to ask questions even when we think we have the answer. Critical thinking is destabilizing. It means weighing our biases and assumptions which means our foundation is always threatened. But critical thinking is our best friend. Like gravity holding us fast to the earth, critical thinking is the human endeavor that grounds us to the truth.

Saturday, August 20, 2011

A curious beginning...

As a child in the seventies, my early memories of television were largely filtered through my friends at school. The flicker of our black and white set at home surrendered to static before I reached second grade, and my conservative family would not replace it with a new television until my junior year of high school. The limited exposure to media only magnified my awareness of its influence on the world around me. Every morning I would scan my friends’ conversations for clues about the previous night’s programming. I punctuated my sentences with Fonzie’s “Aaaaay” and cut my hair with Farrah Fawcett’s feathered flip before I had ever even seen a full episode of Happy Days or Charlie’s Angels. While frustrating for an adolescent, my evangelical parents’ quarantine on television was well intentioned. Mourning the perceived perpetual loss of moral high ground, my father preferred to more closely monitor my introduction to life and story. But despite his best efforts, it was still television that introduced me to the intersection between the two. Ironically, my early deprivation would eventually blossom into a career in the entertainment industry.

For fifteen years I worked as a production coordinator and supervisor in television and movies on projects ranging from afterschool specials for CBS to summer blockbusters like Face/Off and Big Fish. I loved the logistics of filmmaking: the challenge of its relentless pace, the outrageousness of its demands, and the camaraderie of its cast and crew. And while I stayed happily immersed in actual production, I remained curious about the interaction between media and identity development that I had first observed as a schoolgirl. I was also intrigued by the mercurial history the evangelical community has had with media throughout most of the 20th century. Eventually, that curiosity got the best of me and I decided to pivot midstream. For now, I’ve shelved my IMDb credits to pursue a PhD in Media Psychology at Fielding University. I’m also teaching film production and criticism classes to a host of bright and challenging students at Biola University.


Through my research at Fielding and my work at Biola, I hope to introduce the evangelical community to a more mature understanding of how human beings can thrive with mediated messages. Instead of censoring or fashioning inadequate alternatives, I hope to illuminate for societies of faith a shrewd, but dynamic interaction between human development and the appropriation of media.