Book Review - The Cult Around the Corner

Cultic Studies Review, 3(2/3), 2004

The Cult Around the Corner

Nancy O’Meara and Stan Koehler

Foundation for Religious Freedom International, Los Angeles, CA, 2003

Reviewed by Steve K. D. Eichel, Ph.D., ABPP

For many years, it seemed impossible to find a non-academic, “practical” book about cults that did not advance a countercult viewpoint.  Balanced views of the cult phenomenon seemed relegated to weighty (and expensive) books from academic presses.

The Cult around the Corner (TCATC) is an unabashed attempt by Nancy O’Meara and Stan Koehler to correct this imbalance by “bringing reason, understanding and open communication to an often explosive subject” (from the Introduction, p. 5).  Mr. Koehler is identified as a conflict resolution teacher.  Ms. O’Meara is identified as an interfaith hotline volunteer.  I have argued strongly that, given the controversial nature and claims of all sides on the cultic studies spectrum, it behooves researchers, clinicians, and writers on these topics, myself included, to make our affiliations and a priori assumptions (or biases) known, especially when we publish (cf. Dole & Eichel, 1981; Dubrow-Eichel, 1999; Dubrow-Eichel, 2002).  Both O’Meara and Koehler are on the Board of the new Cult Awareness Network (CAN), which many have argued is in fact a Church of Scientology “front” organization.  Whether or not this claim is true, I think everyone agrees on the origins of the new CAN and that members of the Church of Scientology play a very active role in it.  Ms. O’Meara’s connection with CAN is only hinted at in her biography in TCATC, and Mr. Koehler’s affiliation is not mentioned at all.  Interestingly, although the CAN website lists some Board members’ religious affiliations (Koehler is identified as a Buddhist, and a former Secretary is identified as a member of the Movement for Spiritual Inner Awareness), it says nothing about O’Meara’s full-time staff status with, and rank within, the Church of Scientology, a fact she proudly acknowledged during our various in-person conversations.

A cursory Google search on 7/20/04 yielded 524 references or “hits” on the Internet, a testimony to the broad dissemination and potential influence of TCATC.  It is an easy-to-read, 88-page book designed to be accessible to parents, friends, families, educators, and clergy concerned about group (cult) membership.  It is organized into four sections, with the majority of the book concentrating on what loved ones and friends of group members should and should not do.  A third section offers the authors’ unique explanation of the underlying cause of teenage problems that may contribute to involvement in “bad” cults (it’s all due to illiteracy) and then proceeds to berate “brainwashing” theory and other criticisms of new religions.  This section ends with an appeal to reason and tolerance.  Brief case vignettes, presumably drawn from CAN files, are presented to illustrate many of the authors’ points.

Writing a balanced practical guide to the cult phenomenon is a laudable goal, and O’Meara and Koehler at times approach achieving it.  Unfortunately, the book ultimately descends into the usual “us” vs. “them” dichotomy, the “us” being enlightened civil libertarians and liberal religionists, and the “them” being coercive deprogrammers-cum-exit counselors (and their supporters) who continue to hide behind discredited theories of brainwashing in their attempts to spread hatred and religious bigotry.

What makes this attempt so disappointing is that it sporadically includes advice with which I strongly agree.  On page 9 the authors stress the importance of staying in communication with the group member, noting that “the importance of communication cannot be over-stressed in a situation with deep belief differences.”  More specifically, the authors suggest, “You write.  You send letters, packages.  You send photos.  You let the person know you love them, even if you disagree with some of his choices” (p. 48).  Who can disagree with that?  The authors state that “communication does not mean a one-way flow of ideas from you to the other person...”  Also quite true.  I strongly advise families and friends not to argue, lecture, or harangue their loved ones.  However, the authors assume that parents (and other nonmembers) are the only parties guilty of one-way communication.  In my experience, among the primary complaints made by families of cultists are how difficult (if not impossible) it is to contact a loved one, and that the loved one now seems capable of only one-way communication (e.g., proselytizing).  Friends report to me that the member now seems almost incapable of truly meaningful dialogue and discussion.  I know one set of parents whose daughter has been on full-time staff with Scientology for the past five years; they still average a dozen calls before they can reach her, even when calling during agreed-upon times.  Years ago the father gave his daughter a cell phone in an attempt to rectify the matter, but it soon “disappeared” (the parents wonder whether it was appropriated by other Scientology staff), and their daughter declined a replacement.

The authors suggest that, if we have difficulty understanding a group member’s actions, “it is highly likely that you are missing information [and] the way to get that information is through communication” (p. 11).  O’Meara and Koehler correctly note that groups are not static; they change, and past information may no longer be valid.  In addition to talking to the new group member, they advise talking to group leaders and visiting the group.  They advocate getting more information before taking any actions, and I strongly agree with that advice.  However, according to TCATC the best source of valid information seems to be the group itself or the group member (that is, until the member leaves the group).  Lastly, they note that “getting information directly from the group does not preclude you from obtaining information from other sources” (p. 23).

But what are these “other sources”?  Meeting group members socially, talking with a college ombudsman, discussing your concerns with a clergyman, and visiting the group’s Internet website are among the “other sources” listed.  So are professors of religion, history, and sociology (but not psychology!).  The authors also recommend visiting “independent” websites, generalizing that those sponsored by universities and interfaith organizations, “and ones which clearly state the authors of the information,” are good examples.  By this standard, the current CAN website does not seem to be a good example.  As of July 22, 2004, it listed the authors of articles under “Articles/Papers,” but not all the authors of articles that appear under the category “Into Infamy.”

Prominent among “other sources” not recommended by O’Meara and Koehler are former members.  Apparently, only current members are valid information sources.  Talking with critical ex-members is denigrated, and we are warned against taking their stories seriously:  “The vocal critics of new religious groups are frequently sour former members...[who] have had a bad experience with one organization and have turned it into a generalized hysteria” (p. 54).

I imagine there would be hundreds of former members who would be deeply offended by these comments and by those found in the subchapter “What About Negative Books.”  In this section, O’Meara and Koehler begin with the reasonable suggestion that we “check out the author’s agenda.  You may have to read between the lines.”  (In my opinion, this advice pertains to all books on the topic of cults, including this one.)  Former members who criticize new religions are dismissed as people who were “kicked out 20 years ago after failing to live up to the moral standards of the group,” as duplicitous people who joined undercover and were “never honest with the group,” or are compared to “a group leader’s ex-daughter-in-law...[who does not report] how much money she accepted [from the group] as a cash settlement.”  (The latter seems to refer to Nansook Hong, former daughter-in-law of Sun Myung Moon.)  It seems offensive, even hateful, to treat critical former members in such a dismissive manner.

Section 2, “What Not to Do,” begins with a rule “... carved in stone:  Don’t ever pay someone to talk anybody out of anything.”  More specifically, O’Meara and Koehler admonish readers never to pay for a deprogramming, an exit-counseling, or an intervention.  The ethics of hiring exit-counselors are always worthy of reasoned discussion.  I have rarely known a family to engage an exit-counselor except in near desperation and after considerable moral struggle.  O’Meara and Koehler would have done better to stick with their ethical concerns.  Instead, however, they resort to ad hominem attacks and undocumented claims.  At various points, the authors suggest that opinions about groups should not be based on second-hand experiences, or strictly on the research of “arm-chair critic[s]” whose findings are based on “interviewing only people with negative opinions [about a group]” (p. 35).  This is good advice, and I wish the authors had taken it when they wrote this section.  They seem to base their findings only on people with negative opinions about their experiences during exit-counseling.  The opinions of scores of people who describe their exit-counseling experiences in highly positive terms are not mentioned, let alone thoughtfully considered.  Instead, exit-counselors are universally described in TCATC as “browbeating,” as withholding the truth about their “very low success rate,” and as contributing to “mak[ing] the [family] situation worse than ever” (p. 45).  I don’t know how many exit-counselings the authors have witnessed firsthand, but as a researcher with some expertise in this area (Dubrow-Eichel, 1989, 1990), I do not believe “browbeating” is an accurate description of the process I experienced and wrote about.  In fact, the exit-counselors I know strongly advise against any kind of “browbeating,” if only because it is highly counterproductive.  I will admit that I do not have objective data across a good sample of exit-counselings to state categorically that they never involve “browbeating.”  But neither do O’Meara and Koehler.  The same holds true for their claim of a “very low success rate.”  The authors do not provide data backing this claim.  The only deprogramming/exit-counseling outcome data I’m aware of (which are now seriously outdated) suggested a success rate of about 60-67%, which was comparable to the success rates reported in hundreds of therapy outcome studies.

TCATC focuses a great deal on the need to respect unusual and unconventional beliefs.  Again, I am in full agreement.  (As I wrote this, I was reminded of a time when I was chastised by a Church of Scientology President for belonging to a profession that is intolerant of unconventional or dissenting beliefs.  I replied that anyone who believes psychologists are coercive mind manipulators who are intolerant of unconventional beliefs has clearly never been to a meeting of the American Psychological Association, where practically any and all opinions about human behavior may be heard.  If anything, psychologists are typically accused of being too liberal and overly tolerant of lifestyles and beliefs that most of society finds alien.)  True, there are families and friends whose objections to groups labeled “cults” seem to be based primarily on the “incorrectness” of their beliefs.  However, this is typically not what motivates families or friends to take action.  Parents come to see me because their sons or daughters, who could always be trusted, have suddenly begun to lie a lot, or have spent their college tuition on a group’s “courses” (without their parents’ knowledge).  Spouses come because a husband or wife has abruptly left their marriage and their children, or has suddenly refused medical care.  One person contacted me because his wife began “channeling” entities while driving.  One of my most recent cases involves a young woman who has just announced she will not see her family again, ever, and that she will not visit her sister, who is about to give birth.  In another case, a family contacted me because their daughter gave a group that advocates “detachment from materialism” her entire $350,000 inheritance.  (The leader is presumably “immune” from materialism.)  Most concerned families and friends who have sought my help have done so because of highly unusual behaviors, not controversial beliefs.  O’Meara and Koehler do not address what to do when a loved one’s behavior suddenly and drastically becomes upsetting, hurtful, unscrupulous, or potentially harmful after he or she becomes involved with a group.

The third section closes with a strongly worded invective against the “hoax of brainwashing and mind control.”  This review is not the place to address this complicated debate in detail.  I will only address the most obvious problems I had with the authors’ treatment of this topic.  I agree that the terms “brainwashing” and “mind control” have been overused and applied in a simplistic and reductionist manner.  However, the authors (and, from what I can tell, a great many others) continue to foster the false notion that “mind control” is an updated term for a process approaching black magic or “voodoo.”  They engage here in several logical fallacies.

On page 70, the authors state that “the whole subject of brainwashing as it applies to religious groups has been debunked by competent scholars and repeated studies.  Religious people do think for themselves.”  Again, I know of no one who makes the claim that all religious people do not think for themselves.  However, it is equally false to claim that all people (religious or otherwise) do think for themselves.  This is not a black or white issue.  Unlike pregnancy, it is indeed possible to be “a little bit” or partly unduly influenced.

Second, there is no single theory of “mind control,” but rather a variety of theories of undue influence or “mind control.”  These theories are far from uniform, but they all agree on four major points:  (1) behavior and beliefs can be influenced and even radically changed; (2) influence can occur outside awareness (and therefore outside one’s control); (3) people are influenced to various degrees due to various factors, and some people seem more vulnerable than others; and (4) people can be influenced to commit grave acts of harm.  No one disputes that there are people who seem able to resist “mind control,” or who voluntarily exit “mind control cults.”

No one disputes that the CIA’s MK-ULTRA (and related) projects failed to reach their impossible goals:  to develop a perfect, consistent, and totally reliable means of breaking Soviet agents and then converting them into double agents, and to create a cadre of “perfect,” unknowing CIA “sleeper agents” who, if captured by our Soviet counterparts, could never be broken (or would commit suicide).  No such perfect technology was developed.  However, the CIA’s experiments did succeed in some cases.  If we applied the same “if it doesn’t work 100% of the time, then it isn’t valid” criterion to open-heart surgery or antibiotic research, there would be quite a rise in unnecessary heart failures and deaths from bacterial infections.  We all know these medical procedures are not 100% effective and probably never will be, yet they enjoy almost universal acceptance as being “valid.”

Meanwhile, over half a century of social psychology research has shown that the behavior of “good” people can be drastically reshaped and influenced, as recent events at Abu Ghraib have sadly reminded us.  The behavior of U.S. prison staff was eerily similar to that reported in Dr. Philip Zimbardo’s famous prison study at Stanford University 30 years ago.  The concept of “undue influence” has been recognized by law for centuries.  Our own government seems to recognize that something akin to brainwashing occurs in Al Qaeda training camps and, even more frightening, in certain Islamic schools.

My own experience in discussions with staff and leaders of “cultic groups” is that they readily recognize the existence of “mind control” when it is purportedly practiced by psychiatrists, psychologists, exit-counselors, and so-called deprogrammers.  The authors of TCATC ignore the serious issues raised by these studies in favor of a gross generalization that all cult critics continue to buy into the Singer-Ofshe model of “mind control.”  There have always been critics who questioned all or parts of this model (including me), and there have been revised theories that incorporate the findings of more recent social psychological studies.  I hope that, because Ms. O’Meara has attended AFF conferences, she has had some exposure to theories other than the Singer-Ofshe model and will modify her blanket statement about the “hoax of brainwashing” in a second edition of TCATC.

References

Dole, A., & Eichel, S.  (1981).   Moon over academe.  Journal of Religion & Health, 20, 35-40.

Dole, A., Langone, M., & Dubrow-Eichel, S.  (1990, July).  The new age movement:  Fad or menace?  Paper presented at the International Council of Psychologists, Tokyo.

Dole, A., Langone, M., & Dubrow-Eichel, S.  (1990). The new age movement:  Fad or menace?  Cultic Studies Journal, 7, 69-91.

Dubrow-Eichel, S.  (1989). Deprogramming:  A case study.  Part I:  Personal observations of the group process [Special issue].  Cultic Studies Journal, 6(2).

Dubrow-Eichel, S.  (1990). Deprogramming:  A case study.  Part II:  Conversation analysis.  Cultic Studies Journal, 7, 174-216.

Dubrow-Eichel, S.  (1999). Can scholars be deceived? Empirical evidence from social psychology and history.  Paper presented at “Religious and Spiritual Minorities in the 20th Century:  Globalization and Localization,” the 13th International Conference of the Center for Studies on New Religions, Bryn Athyn, PA.

Dubrow-Eichel, S.  (2002).  Can scholars be deceived? Empirical evidence from social psychology and history.  Cultic Studies Review:  An Internet Journal of Research, News & Opinion, retrieved July 22, 2004 from http://www.cultsandsociety.com/csr_articles/dubroweichel_steven.htm