Sociological Analysis © 1984 Association for the Sociology of Religion, Inc.

Constructing Cultist "Mind Control"

THOMAS ROBBINS

Who am I to say that's crazy
Love will make you blind
In the church of the poison mind
Culture Club

Introduction

Facts and values are badly entangled in controversies over "cults." Can it be plausibly maintained that the analysis of social processes in terms of "brainwashing" or "coercive persuasion" is primarily an objective scientific matter which can be detached from judgmental ideological and policy considerations? Concepts such as "brainwashing" or "mind control" are inherently normative. Szasz (1976: 10) notes, "We do not call all types of personal or psychological influences 'brainwashing.' We reserve this term for influences of which we disapprove." The application of such concepts to a given group necessarily stigmatizes that group; however, the stigma is often primarily connotative. It derives not directly from what is actually empirically established about the group in question but from the choice of terminology or the interpretive framework within which empirical observations are considered.

This essay will examine the rhetorical conventions, underlying assumptions, interpretive frameworks, and epistemological rules which make possible the brainwashing allegations against cults; it is, in effect, an exercise in demystification. It is not our contention that authoritarian and "totalistic" sects do not present some difficulties for American institutions, that some groups do not perpetrate "abuses" in a number of areas, or that legal measures and controls may not sometimes be appropriate. We do assume, however, that there is a certain relativity to "social problems," which may be viewed as social movements striving to define certain aspects of reality as problematic and requiring social action (Mauss 1975). "It is the conflict over the 'definition of reality' that provides the heart of any 'social problem'" (Wolf-Petrusky 1979: 2).1 A "politics of reality" operates (Goode 1968). Allegations of brainwashing and coercive mind control on the part of cults are thus essentially interpretive and involve assumptions and frames of reference which interpenetrate the "objective facts." Finally, it is our view that the overwhelming popular, legal, and scholarly focus on the processes by which individuals become and remain committed to cults is misleading in the sense that it shifts attention away from what we consider the ultimate sources of social and professional hostility to cults. We see the issue of coercive persuasion in cults as an ideological "superstructure" which mystifies an underlying "base" of threats posed by today's movements to various norms, groups, and institutions.

Underlying Sources of Tension

Beckford (1979), Robbins and Anthony (1982), Shupe and Bromley (1980), and others have discussed the underlying sources of tension between contemporary religious movements and various groups and institutions which appear to be ranged against them. These factors may be briefly summarized: (1) Groups such as Hare Krishna or the Unification Church may be said to be incivil religions which claim a monopoly of spiritual truth and legitimacy and in so doing contravene American civil religion qua "religion of civility" (Hammond 1981; Cuddihy 1978; Robbins 1984b). (2) Such groups are frequently communal and totalistic and thus additionally contravene the norm of personal autonomy (Beckford 1979) and the value of individualism, which is central to modern Western culture. (3) Groups such as Scientology or the Unification Church are highly diversified and multifunctional and therefore compete with and threaten many groups and structures in modern society (Robbins 1981, 1984a). (3a) Close-knit, totalistic "cults" operate as family surrogates and thus disturb the parents and relatives of converts (Bromley et al. 1983; Schwartz and Kaslow 1979), who are also concerned with converts' termination of conventional career goals. (3b) Dynamic religious movements diminish the pool of young persons available to participate in conventional churches and denominations; moreover, religious movements elicit an intense and diffuse commitment from converts which contrasts with the limited commitment of most churchgoers (Shupe and Bromley 1980). (3c) Gurus and new movements compete with certified secular therapists and healers; moreover, the latter are increasingly taking advantage of opportunities as counselors, rehabilitators, and quasi-deprogrammers of "cult victims" and of families traumatized by the "loss" of a member to a close-knit movement (Robbins and Anthony 1982). The conflict between "cults" and "shrinks" also has an ideological dimension involving the conflict between the socially adjustive ethos of mental health and the various deviant visions of transcendence, apocalyptic transformation, mystery, and ecstasy (Anthony and Robbins 1980; Anthony et al. 1977). (4) The totalism and multifunctionality of some movements encourage a strong dependency on the part of devotees, who may be subject to exploitation (Robbins 1981; Thomas 1981). (5) Finally, the totalism, diversification, and transformative visions of cults burst the normative bounds of a largely "secular" culture and, in particular, repudiate the expected differentiation of secular and religious spheres of action (Anthony and Robbins 1980). Some new religions do not "know their place."

These troublesome aspects of "cults" would cause concern even if individuals entered and remained in cults voluntarily. However, in the context of the constitutional guarantee of "free exercise of religion," it is difficult to constrain or control deviant religious movements. There is a paradox to freedom: one cannot be truly free unless one is free to surrender freedom. However, this consideration, and civil libertarian objections to action against cults, can be obviated if it is established that the involvement of converts in offending movements is in fact involuntary by virtue of "coercive" tactics of recruitment and indoctrination plus consequent psychopathology and converts' diminished rational capacity. The "cult problem" is thus "medicalized" (Robbins and Anthony 1982). Cultist claims to "free exercise of religion" are neutralized by the implication that cultist religion is not really free because cults "coerce" their members into joining and remaining and because the latter may lose their capacity for decision-making.2 Discourse on cults is thus displaced onto models of conversion and persuasion and onto disputes over how persons enter and leave (or don't leave) cults. In effect, what is considered is not so much the nature and goals of these groups as their procedures of recruitment and indoctrination (Beckford 1979).

In the bulk of this essay we will discuss the assumptions and conventions of reasoning and rhetoric which constitute the "issue" of "forced conversion" in religious movements. We will discuss the following: (1) the simultaneous employment of a critical external perspective to analyze and evaluate processes within cults and an empathic internal perspective to interpret processes entailing the seizure, "deprogramming," and "rehabilitation" of devotees; (2) "epistemological manicheanism," which imputes absolute truth to the accounts of hostile apostates and nullifies the accounts of present cult converts as insincere or delusory; (3) the use of a broad and only tenuously bounded concept of "coercion"; (4) the assumption that it is intrinsically "coercive" or reprehensible for movements to recruit or "target" structurally available or "vulnerable" persons; and (5) exaggeration of the extent and effect of deception utilized as a recruiting tactic by some groups.

Epistemological Issues: Internal and External Interpretive Frameworks

Many elements involved in controversies over alleged cultist brainwashing involve transvaluational conflicts. Behaviors and processes which might otherwise be seen mainly as indications of intense religious commitment, zealotry, and dogmatic sectarianism are reinterpreted as signs of pathological mind control. Repetitive chanting, "obsessive prayer," repetitive tasks, evocations of sin and guilt, and "intense peer pressure" are viewed as "coercive" (or even "hypnotic") processes which paralyze free will and enslave the devotee (e.g., State of New York 1981). Speaking in tongues is considered by some clinicians as an aspect of coercive mind control (Mackey 1983). Cult-induced psychopathology and "thought disorders" are inferred from a convert's unconcern with conventional career goals, stereotyped and dogmatic responses to questions (Delgado 1977), and from an alleged pattern of absolutist and polarized thinking which impairs cognition such that "the thinking process is limited to a black-white totalistic perspective where everything external to the cult is evil and everything within is good" (Rosenzweig 1979: 150-1).

Any social process can be evaluated from two perspectives: an empathic internal or actors' phenomenological perspective or an external critical observer's perspective. As we have seen, evocations of sin and guilt, repetitive chanting, and "obsessive prayer" are interpreted as "coercive processes" which destroy free will, although the application of an alternative perspective would yield different interpretations.

It is arguable that the case against cults with respect to brainwashing is grounded in the simultaneous employment of a critical external perspective to interpret processes within religious movements, and an empathic internal perspective to evaluate activities in which movement participants are pressed to de-convert and guided in the negative reinterpretation of their experiences in stigmatized movements. However, it is also arguable that defenses of cults against mind-control allegations tend to entail the combination of a critical external perspective on deprogramming and an empathic "inner" or phenomenological perspective on processes within controversial new movements.

Defenders of controversial religious groups have protested the radical transvaluation implicit in some applications of external perspectives. A civil liberties lawyer criticizes "the name calling which is typical of programs of denigration."

A religion becomes a cult; proselytization becomes brainwashing; persuasion becomes propaganda; missionaries become subversive agents; retreats, monasteries, and convents become prisons; holy ritual becomes bizarre conduct; religious observance becomes aberrant behavior; devotion and meditation become psychopathic trances. (Gutman 1977: 210-11)

Another legal writer maintains that arguments in support of deprogramming essentially transvalue the intensity of faith in inferring psychopathology or coercion from items such as total involvement in a movement, unconcern with public affairs, dualistic thinking, etc. Converts who "subordinate their reason to imperatives of faith" and "demonstrate the depth of their commitment by insisting upon their beliefs as ultimate concerns, should not find the intensity of their faith being used as proof of their incompetence" (Shapiro 1978: 795).

In legal terms, Shapiro is arguing that the use of allegations of polarized thinking or unconcern with public affairs as rationales for state intervention violates the absolute quality of freedom of belief. But, whatever the legalities, clinicians may still insist that certain behavioral and thought patterns are objectively coercive or pathological or constitute mind control, notwithstanding legal constraints on the use of such allegations or the traditional quality of behavior such as glossolalia. On the other hand, some sociologists have argued that the behavioral and linguistic patterns from which clinicians have inferred a general "depersonalization" or a basic alteration of personality may really be indicative of situationally specific role behavior (Balch 1980). The different conceptual frameworks of sociologists, civil libertarian lawyers, and students of religion produce different interpretations of the same phenomenon.

It is important to note that conflicts of internal vs. external perspectives also emerge with respect to counter-cult activities. A clinical psychologist who supports the practice of deprogramming comments, "Although lurid details of deprogramming atrocities have been popularly supplied by cults to the press, the process is nothing more than an intense period of information giving" (Singer 1978: 17). Alternatively, deprogramming has been externally viewed as coercive persuasion (Kim 1978) or even something akin to exorcism (Shupe et al. 1978).

Many elements involved in controversies over alleged cultist brainwashing entail transvaluational conflicts related to alternative internal vs. external perspectives. The display of affection toward new and potential converts ("love bombing"), which might be interpreted as a kindness or an idealistic manifestation of devotees' belief that their relationship to spiritual truth and divine love enables them to radiate love and win others to truth, is also commonly interpreted as a sinister "coercive" technique (Singer 1978). Yet successfully deprogrammed ex-devotees have enthused over the warmly supportive and "familial" milieu at post-deprogramming "rehabilitation" centers such as the Freedom of Thought Foundation (e.g., Underwood and Underwood 1979). Could this also be "love bombing"? One study indicates that processes of deprogramming, intervention, and therapy appear to exert influence on ex-cultists in the direction of assisting them to reinterpret their experiences in terms of brainwashing (Solomon 1981). In this connection, the literature of sociology is replete with "external" conceptualizations of psychotherapy as a persuasive process, a process of thought reform, a context of conversion, a context of negotiation and bargaining in which the greater power of the therapist is crucial, or a social control device (Frank 1980; Schur 1980).

The case against cults with respect to alleged brainwashing tends to be grounded in the simultaneous employment of a critical external perspective to evaluate and analyze processes within movements and an empathic internal perspective to interpret the activities outside of religious movements through which devotees are physically coerced, pressed to de-convert, or guided in the reinterpretation of cultist experiences. Likewise, the polemical defense of cults tends to combine a critical external perspective on deprogramming and anti-cult activities with an empathic internal orientation toward what goes on in cults (e.g., Coleman 1982). Since so many cult issues involve transvaluational conflicts, one's evaluation of conflicting claims may be largely a function of one's a priori interpretive framework or perspective.

Epistemological Manicheanism

An additional aspect of the epistemological dimension of cult-brainwashing controversies is the issue of who is a credible witness. This is the problem of evaluating the conflicting testimonies of present devotees and apostates. Some critics of cults seem to embrace a kind of epistemological manicheanism whereby the accounts of recriminating ex-devotees are accepted at face value while the accounts of current devotees are dismissed as manifesting false consciousness derivative from mind control or self-delusion.3 However, defenders of cults have been criticized - perhaps justly - for too readily discounting the testimonies of ex-converts because they are allegedly under the influence of new anti-cult reference groups or because their recriminations against cults are self-justifying, while naively accepting the accounts of current devotees (Zerin 1982c). Doubts have been cast on the accounts of fervent devotees (Schwartz and Zemel 1980), and knowledgeable circumspection has also been urged with respect to the accounts of ex-converts whose interpretations have been influenced by deprogrammers, therapists, parents, and anti-cult activists (Beckford 1978; Solomon 1981) and whose current interpretations may function to disavow deviant stigma and facilitate social and familial reintegration. Epistemological manicheanism often characterizes both fervent indictments and defenses of cults. One's analysis can too easily be predetermined by one's implicit epistemological exclusionary rule.4

Construction of Coercive Persuasion Claims Through Assumptions and Definitions

It is our view that debates about mind control and brainwashing in cults are inherently inconclusive. Arguments on either side depend upon arbitrary or a priori assumptions, interpretive frameworks, and linguistic conventions. Our argument does not imply that reprehensible manipulative practices and strong peer pressures are absent from the proselytization and indoctrination repertoires of some movements. It does imply, however, that certain key issues are assumptive, definitional, or epistemological, and thus in a sense ideological and not susceptible to decisive empirical resolution. Propositions may sometimes hinge on the arbitrary use of terms.

Coercion

Arguments to the effect that religious movements "coerce" their participants to remain involved generally depend upon broad conceptions of "coercion" which need not be tangible (e.g., physical), and of which the "victim" need not be aware (Ofshe 1982). Thus, a bill passed by the New York State legislature identified an individual's subjection to "a systematic course of coercive persuasion" as a necessary condition legitimating the appointment of a guardian over a member of a communal group. "Systematic coercive persuasion" may be inferred from a variety of indices specified in the bill, including "control over information" or the "reduction of decisional capacity" through "performance of repetitious tasks," "performance of repetitious chants, sayings or teachings," or the employment of "intense peer pressure" to induce "feelings of guilt and anxiety" or a "simplistic polarized view of reality" (State of New York 1981).

Can persons be "coerced" by repetitious chanting or by peer pressure in a formally voluntary context? Perhaps, but what has emerged is a relatively broad and unbounded conception of coercion which transvalues as reprehensibly "coercive" activities which have otherwise been viewed as innocuous religious staples (e.g., repetitious chanting). What pressure cannot be viewed as "coercive" along these lines?

In general, no distinction between "coercive" and "manipulative" processes seems to be made by critics of cults. Disparate processes and pressures arising in cults are labeled "coercive." It is arguable, however, that in common linguistic usage the term "coercive" is employed to denote a situation in which an order and a threat are communicated such that the "coercee" is aware that he is being pressured and that his action is involuntary. Subtle manipulative influence via information control or seductive displays of affection (i.e., Moonist "love bombing") would ordinarily be viewed as manipulative rather than coercive processes. Thus, Lofland and Skonovd (1981) distinguish between the manipulated ecstatic arousal or "revivalist" techniques of the "Moonies" and true "coercive conversion" or brainwashing. "Coercion," however, has a stronger negative connotation of the overriding of free will, and is thus an ideologically superior term. Broad conceptions of coercion inevitably have rhetorical and ideological significance because they generalize a negative connotation to disparate situations and groups, which become psychologically and morally equivalent. Formally voluntary associations such as religious movements are sometimes acknowledged to embody a different form of coercive persuasion than POW camps, but are then viewed as essentially equal in coerciveness or even more coercive than the latter because seductive cultist proselytizing may indeed sometimes be more effective than the techniques used to indoctrinate prisoners. On the other hand, it is arguable that seductive appeals are often more effective than the persuasion of captives in part because they are less coercive and thus do not elicit the crystallization of a prisoners' adversary culture or resentment syndrome.

The concept of "coercive persuasion" has, in fact, been used in some significant research. A respected model of c.p. is the one developed by Edgar Schein and his colleagues (1961). Schein argues that if the notion of coercive persuasion is to achieve objectivity, it must be seen as transpiring in a wide range of - often culturally valued - contexts, e.g., conventional religious orders, fraternities, mental hospitals, the army. Coercive persuasion is generally stigmatized only when its goal is detested, e.g., producing communists or Moonies.

Schein's analysis strives for stringent ideological neutrality. Nevertheless, he may have contributed somewhat to a subjective and stigmatizing use of the concept of coercive persuasion by downplaying an essential distinction between forcible physical restraints (e.g., those imposed on the prisoners of war he studied) and the more voluntaristic contexts to which he aspired to generalize the concept. Notwithstanding this effect, it is important to realize that assumptions about free will are external to Schein's model and some other models (Solomon 1983). "Coercively persuaded" subjects are not necessarily helpless robots (Shapiro 1983).

In rhetorical applications of c.p. models an unexamined and arbitrary assumption is often made with regard to the involuntary nature of the participation of persons in movements allegedly utilizing coercive persuasion. References to "forced conversions" and similar notions arise (e.g., Schwartz and Isser 1981), although the inference as to absence of volition is not warranted by the mere technical applicability of c.p. models (Shapiro 1978, 1983; Solomon 1983). Interesting in this respect is the use by Singer (1979) and Zerin (1982a) of the vocabulary of "technologies" of coercive persuasion employed by cults, as if influence processes in cults involved precise instruments or machines operating automatically on passive cogs. In short, arguments as to the involuntary quality of involvements are often supported more by connotative imagery and rhetorical reification than by sophisticated applications of models of coercive persuasion or thought reform.

Models of "coercive persuasion," "brainwashing," and "thought reform" vary in the stringency of their existential criteria (Richardson and Kilbourne 1983). Sargent (1961) interprets religious revivals as a form of brainwashing. Schein's (1961) model is the broadest and is clearly applicable to cults, as well as to college fraternities, reputable religious orders, etc. Lifton's well-known model (1961) of "thought reform" is applicable to various cults (Richardson et al. 1972; Stoner and Parke 1977: 272-6), and is probably applicable to any authoritarian and dogmatic sect. Recently, Lofland and Skonovd (1981) have argued that Lifton's criteria embody "ideological totalism," which is a broader phenomenon than true brainwashing or coerced conversion. The latter, according to Lofland and Skonovd, is delineated by the more stringent criteria employed by Somit (1968), which would exclude practically all formally voluntary groups. Given the array of diverse models of varying restrictiveness, cults can "brainwash" and be "coercive" depending upon which model is employed. Polemicists tend to conflate different models or shift back and forth between models (Anthony, in preparation).

Finally, neither the growing number of studies reporting that the Unification Church and other cults exhibit substantial voluntary defection rates (Skonovd 1981; Barker 1983; Beckford 1983; Ofshe 1976), nor studies indicating that there is a substantial "failure rate" in cultist indoctrination and recruitment (Barker 1983; Galanter 1980), can settle the debate about cultist coercion. Voluntary defectors and non-recruits can be said to lack the "vulnerability" traits which allow coercive pressures in cults to operate (Zerin 1982a). The basic issue is largely definitional and is not susceptible to empirical resolution.

Targeting the "vulnerable"

One of the characteristics of cultist recruitment and proselytization which is widely excoriated is the alleged tendency of cults to exploit the "vulnerability" of young persons who are lonely, depressed, alienated, or drifting away from social moorings. Cultist mind control is held to be differentiated from respectable, innocuous monasticism by the reluctance of the latter to "concentrate, as do religious cults, on the weak, the depressed, or the psychologically vulnerable" (Delgado 1977: 65). "Cult recruiters tend to look for the 'loners,' the disillusioned or floundering ones and those who are depressed" (Schwartz and Kaslow 1979: 21).

The above allegations concerning the nature of cultist recruitment are not false. Social movements in general tend to recruit individuals who are "structurally available" and who are not integrated into "countervailing networks" which would operate to inhibit recruitment into a new movement (Snow et al. 1980). This is the case with respect to those "authoritarian" movements in which participation is exclusive in the sense that "core membership may even be contingent upon the severance of extra-movement interpersonal ties" (ibid: 796). Movement organizations of this type tend to proselytize in public places and to recruit relatively unattached persons who are "more available for movement exploration and participation because of the possession of unscheduled or discretionary time and because of the minimal countervailing risks or sanctions" (ibid: 793). In contrast, groups with less exclusive participation patterns exhibit a greater tendency to "attract members primarily from extramovement interpersonal associations and networks, rather than from public places," i.e., existing members recruit their pre-conversion friends and associates. Although some stigmatized "cults" such as the Divine Light Mission of Guru Maharaj-Ji (Downton 1979) appear to be of this latter variety and to have recruited largely from existing interpersonal networks, it appears likely that relatively authoritarian and totalistic groups such as Hare Krishna or the Unification Church recruit many unattached individuals whose lack of binding ties and commitments renders them structurally available.5

It is arbitrary, however, to stigmatize this mode of recruitment as coercive or reprehensible. Clearly, social movements and proselytizing religious sects will "target" the more "vulnerable" potential participants. Young persons occupying transitional and ephemeral statuses (e.g., students), bereft of consolidated careers, salaries, dependents, spouses, and children, and not harmoniously nestled into other affiliative structures such as fraternities or clubs, will surely be prime "targets." Such individuals have less to lose in joining a communal sect or a messianic movement than other persons. A greater proportion of "available" relative to "unavailable" persons recruited to a movement would seem to this writer to be indicative of a voluntary rather than a "forced" quality of participation. Some sort of hypnotic or "coercive" device might be indicated if a disproportionate number of "unavailable" middle-aged executives with large families, numerous dependents, and satisfying social affiliations were recruited.6

It also seems rather plausible that unhappy or "alienated" persons who are dissatisfied with either themselves or "the system" are more likely to be recruited to messianic movements than complacent "pharisees." The special importance of messianic religion for "miserable sinners" is a rather traditional evangelical theme. While some persons may be more "vulnerable" to cultist involvement than others (Zerin 1982b), it seems arbitrary to view the "targeting" of such persons as illegitimate or as an indication of the involuntary or irrational quality of involvement.7

Demonology of deception

The role of deception in the proselytizing of cults is receiving increasing emphasis in allegations of cultist mind control. There appears to be some tendency to treat deception as a functional equivalent of the raw physical coercion which is used initially to bring individuals into POW or concentration camps. The absence of physical coercion, which is a defining attribute of classic brainwashing contexts such as POW camps, is thus neutralized as an indicator of voluntariness or lack of coercion.

The writings of Richard Delgado on cults express succinctly a clear and coherent conception of the crucial role and significance of deception in cultist mind control:

The process by which an individual becomes a member of certain cults appears arranged in such a way that knowledge and capacity, the classic ingredients of an informed consent, are maintained in an inverse relationship: when capacity is high, the recruit's knowledge of the cult and its practices is low; when knowledge is high, capacity is reduced. (Delgado 1980: 28).8

The potential recruit, attending his first meeting, may possess an unimpaired capacity to make rational choices. "Such persons, if given full information about the cult and their future life in it, might well react by leaving. For this reason, the cult may choose to keep secret its identity as a religious organization, the name of its leader or messiah, and the more onerous conditions of membership until it perceives that the victim is 'ready' to receive this information. These details may then be parceled out gradually as the newcomer, as a result of physiological debilitation, guilt manipulation, isolation, and peer pressure, loses the capacity to evaluate them in his ordinary frames of reference" (Delgado 1980: 28-9). The necessary conditions for voluntariness are knowledge and capacity; however, the cult convert "never has full capacity and knowledge at any given time; one or the other is always impaired to some degree" (ibid: 29). In short, gross deception lures the victim onto the premises, where he is fairly quickly relieved of his mental capacity. It is claimed that by the time the veil of deception drops, the disoriented convert is not in a position to take advantage of his knowledge (see also Schwartz and Zemel 1980).

Let us examine some issues arising from Delgado's formulation. First we need to consider the question of generality. How typical is Delgado's account?

Let us examine three examples. (1) First, it would be difficult for someone to become involved with the Hare Krishna sect without knowing at the very outset that he had encountered a very eccentric and somewhat regimented communal sect. The Krishnas are known to solicit funds deceptively, donning wigs and business suits to solicit in airports (Delgado 1982). Such ruses are employed for the purpose of soliciting funds - not warm bodies. Devotees seem relatively straightforward in acknowledging the stringent membership requirements, and they do not place intense pressure on marginal hangers-on who attend festivals at the Krishna Temple to become encapsulated in the communal sect.9 A six-month screening period allows persons who cannot take the discipline to select themselves out (Bromley and Shupe 1981).

(2) During the middle 1970s one of the most controversial cults in the northeast was the Church of Bible Understanding (COBU), formerly the Forever Family.10 Several members were deprogrammed and rehabilitated at the Freedom of Thought Foundation in Tucson, Arizona. Members of this movement wore large buttons saying GET SMART GET SAVED. They would accost one on the streets and inquire, "Do you know the Lord?" The author was invited to visit the loft which a large number of devotees were occupying. After "hanging around" the group for a few days, certain properties of the group were rather obvious (and, in the opinion of the writer, would have been readily apparent to anyone). Though not as well organized as the Krishnas or "Moonies," the sect was regimented and authoritarian. Large numbers of converts lived in decidedly unhygienic conditions in lofts from which they were later evicted on health grounds. The members were expected to take odd jobs and relinquish their pay to the leaders, who researched and publicized available jobs. The group focused on the Bible and had an eccentric exegetical technique of "color coding" scriptural passages. Doubtless, there were salient aspects of the movement's lifestyle and interactional process which escaped the writer during his brief observation. What is significant, however, is that a number of rather unpleasant aspects of the group were immediately apparent. These were not elaborately or effectively concealed. It would have been impossible to be fooled into thinking that this was a discussion club or a respectable church outreach program. Recruits were presumably individuals who were willing to sacrifice amenities and "take risks" of various sorts to pursue truth and salvation.

(3) Finally, the writer attended a three-day indoctrination workshop sponsored by the Unification Church in 1974. There was a salient element of deception: the link to Reverend Sun Myung Moon was not emphasized, and the writer was struck by the absence of pictures of the leader of the movement, whom the writer already knew to be an object of devotion to followers. But other elements of the workshop were relatively straightforward. There was no real concealment of the eccentric "religious" character of the group. Lectures commenced on the first day and dealt with the nature and relationship of God and man, and other matters derived from church doctrine. The regimented character of the movement could easily be inferred from the disciplined and ascetic quality of the workshop; men and women were separated; participants were awakened at 7:00 for calisthenics. There were 5-8 hours of lectures each day. There was clearly a manipulative quality to the workshop, although it seemed relatively crude and heavy-handed.11 But extreme deception entailing concealment of the basic nature of the group simply did not appear to be present.

It does not seem likely that deception in the above groups was of sufficient magnitude to account for the initial involvements of participants. The latter would likely be aware of the eccentric, authoritarian, and ascetic aspects of these groups from the outset.

Another implication of Professor Delgado's analysis needs to be considered. It is clear that Delgado believes that the pre-convert loses his or her "capacity" during the period in which he is denied knowledge, i.e., deceived. How long is that period? In the above examples, which involved very authoritarian communal sects, thorough deception as to the nature of the movement and its internal milieu was not really accomplished. It does seem, however, that some Moonist groups, particularly the "Oakland Family" operation at Booneville, California, utilized a greater degree of deception than that experienced by the writer in New York (Bromley and Shupe 1981). But how long does deception last? No estimates this writer has heard involve a period longer than three weeks. The question thus arises whether a person can actually "lose capacity" during this period. Conceivably, there might be methods involving brutality and torture which could "break" a person in a matter of weeks or even days. But these methods are not used. Scheflin and Opton (1978) argue that cults do not really utilize extreme brainwashing or coercive persuasion methods: "Some might do so if they could, but they cannot, for dehumanization is excruciatingly painful. Most people who were not prisoners, or tied to the group by strong bonds of loyalty, or by lack of anywhere else to go, would leave" (ibid: 60).12

Given the likelihood that unmotivated persons will shun a stringently regimented authoritarian milieu, supporters of mind-control allegations are heavily dependent upon claims with regard to deception. Yet it is problematic whether deception is as widespread, extreme, or significant as the influential Delgado model suggests. Cultist deception is really a (reprehensible) foot-in-the-door tactic and cannot plausibly provide the motivation for a person to tolerate otherwise objectionable conditions.13

One additional point is worth noting. The Moonist indoctrination center at Booneville, California once utilized deception to a degree which exceeded the manipulation experienced by the author at another Moonist indoctrination center. As Bromley and Shupe (1981) note, the modus operandi of the Unification workshop at Booneville has been generalized by opponents of cults and by the media to "cults" in general. Indeed, the pervasive general stereotype of the deceptive cult which lures unwary youth to totalistic communes under false premises is largely based on the Unification Church, and in particular on its operation at Booneville (Bromley and Shupe 1981; Robbins and Anthony 1980). The overgeneralization of the Booneville Moonist modus operandi to contemporary deviant religious movements in general has been partly a product of fortuitous circumstances (Bromley and Shupe 1981) and partly a deliberate tactic of anti-cultists (ACLU 1977) to exploit the notoriety of the Moon sect. The allegation of widespread cultist mind control is thus constructed in part through an overgeneralization of the extreme deceptive proselytization of one group.14

Conclusion

While the manipulative and heavy-handed recruitment and indoctrination practices of some groups cannot be gainsaid, arguments imputing extreme mental coercion, mind control, and brainwashing to cultist practices tend to depend upon arbitrary premises, definitions, and interpretive and epistemological conventions. Arguments in this highly subjective area are too often mystifications which embellish values and biases with the aura of value-free science and clinical objectivity.

As acknowledged in this essay, there are many difficulties and conflicts associated with cults. These conflicts would be legitimate objects of concern even if commitments to troublesome movements were acknowledged to be voluntary. Rhetorical mystiques about mind control have the consequence of implying that cultist involvements are involuntary and that devotees are not fully capable of making rational choices. In consequence, these arguments serve to legitimate social control measures which treat devotees as if they were mentally incompetent without formally labeling them as such and without applying rigorous criteria of civil commitment (cf. State of New York 1981).

The debate over cultist brainwashing will necessarily be inconclusive. The contending parties ground their arguments on differing assumptions, definitions, and epistemological rules. Nevertheless the debate will continue. The medicalized "mind control" claim articulates a critique of deviant new religions which not only obviates civil libertarian objections to social control but also meets the needs of the various groups which are threatened by or antagonistic to cults: mental health professionals, whose role in the rehabilitation of victims of "destructive cultism" is highlighted; parents, whose opposition to cults and willingness to forcibly "rescue" cultist progeny are legitimated; ex-converts, who may find it meaningful and rewarding to reinterpret their prior involvement with stigmatized groups as basically passive and unmotivated; and clerics, who are concerned to avoid appearing to persecute religious competitors. An anti-cult coalition of these groups is possible only if medical and mental health issues are kept in the forefront (Robbins and Anthony 1982) and if the medical model is employed in such a way as to disavow the intent to persecute minority beliefs and to stress the psychiatric healing of involuntary pathology.

As argued above, the debate over whether cult devotees are "coerced" via "mind control" and "psychologically imprisoned" will necessarily be inconclusive. To some degree one can choose from an array of brainwashing/coercive persuasion/thought reform models with existential criteria of varying stringency and, by then selecting appropriate background assumptions, imagery, and epistemological rules, "prove" whatever one wishes. The argument will persist, however, because it articulates an "acceptable" indictment of cults which is arguably compatible with respect for religious liberty, and which avoids a direct confrontation with the underlying issue of the limits of "church autonomy" in the context of the increasing diversification of the functions (e.g., educational, political, healing, commercial) of various kinds of religious groups. Because they use "mind control," it has been argued, cults can be set apart from other religious organizations (Delgado 1977), which arguably are not threatened by constraints on cults. Medicalization of deviant religion compartmentalizes issues involving "cults" and obscures some of the underlying conflicts and broader implications of conflicts over contemporary movements. A shift of focus will be necessary to transcend the inconclusive psychologism of debates over brainwashing. Such a shift will not isolate cults as a special theoretical compartment but will reconsider the uneasy general boundary of church and state in the 1980s.15

Notes

The author wishes to thank Dick Anthony, whose collaboration with the writer over a number of years contributed to the development of perspectives which are reflected in this essay.

1 Dr. Wolf-Petrusky's unpublished paper, "The Social Construction of the 'Cult Problem'," represents a pioneering formulation, which bears some similarities to the present analysis. However, the present writer's analysis of the construction of anti-cult claims through a priori premises, epistemological rules and definitional parameters diverges somewhat from Dr. Wolf-Petrusky's natural history approach, which follows Mauss (1975) more closely. The present writer has only been slightly influenced by the earlier work of Dr. Wolf-Petrusky.

2 Interestingly, the medical model is less salient in conflicts over cults in France and West Germany where norms of civil liberties and religious tolerance are weaker and deviant cults can be directly attacked as anti-social and culturally subversive (Beckford 1981).

3 A recent survey of psychopathological symptoms among ex-converts, distributed by anti-cult activists (Conway and Siegelman 1982), made no attempt to include "returnees," i.e., ex-cultists who had returned to their religious groups, in its sample. Schwartz and Zemel (1980) suggest that converts' allegations of lack of deception in their recruitment are not credible because acknowledgment of deception would be cognitively dissonant with their present fervent belief. The authors do not apply a cognitive dissonance argument to the claims of recriminating ex-cultists, for whom a lack of deception and manipulation may be dissonant with their present disillusionment, anger, and activism.

4 It is worth noting in this connection that several recent studies (Skonovd 1981; Wright 1983) have indicated that there are substantial numbers of ex-cultists who do not recriminate against cults or interpret their experiences in terms of brainwashing in the manner of those embittered deprogrammed apostates whose testimonies and allegations have been widely publicized. The absolute contrast of devotees' and ex-devotees' accounts is an appearance which arises from the fact that more public attention has been focused on a subset of ex-devotees who have usually been deprogrammed and have become assimilated to an anti-cult subculture or social network (Solomon 1981).

5 Snow et al. (1980) present data comparing Hare Krishna with the less totalistic Nichiren Shoshu movement, which support their argument. A more recent study of Hare Krishna by Rochford (1982) came up with somewhat different findings: Krishna recruitment patterns varied from city to city, and overall there was significant recruitment from social networks. See Wallis and Bruce (1982) for a conceptual critique of the "structural availability" concept.

6 Interestingly, elderly persons also appear to be prime "targets" for cults (see, for example, the ABC-TV 20/20 report of November 24 on an eternal life cult). Elderly persons, like young persons, are often poorly integrated into the occupational structure. Such marginality qua "rolelessness" may enhance one's susceptibility to the appeals of extraordinary groups.

7 The concept of "vulnerability" seems to have an interesting affective connotation, i.e., one isn't considered "vulnerable" to something positive such as a promotion. The implicit imagery is mildly medicalistic, i.e., a "vulnerable" person is like a weakened organism whose defenses against germs have been impaired.

8 See also Delgado (1977, 1982).

9 The author conducted preliminary participant observation among Hare Krishnas in 1969-70 in Chapel Hill, North Carolina. A close colleague and collaborator conducted observation in Berkeley during 1970-2.

10 F. E. Gaiter, "Inside a New York Cult," New York Daily News series (Jan. 1-4, 1979). The author briefly observed this group in the middle 1970s.

11 For a description of the workshop as observed by the author, see Robbins et al. (1976).

12 Some cult critics have acknowledged that it is social bonds which "incapacitate" a devotee from leaving the group. By the time a neophyte Moonie is undeceived as to the identity of the group he has joined, he is "bonded" to the group and its members and cannot leave (Edwards 1983). But do we generally consider social bonds to nullify free will, e.g., is someone who loves his spouse a "prisoner" in his or her marriage?

13 A Superior Court judge in San Francisco recently issued a 30-page opinion granting summary judgment for the defendants in a case in which two former "Moonies" sued the Unification Church for "false imprisonment" (through mental coercion) and fraud. The ex-converts had initially been deceived as to the identity of the group, and claimed that by the time the deception was lifted (after 2-3 weeks) they had become psychologically dependent upon the group and were not capable of choosing to leave voluntarily. The court found that, initial deception notwithstanding, the plaintiffs' lengthy subsequent involvement with the church was essentially voluntary; moreover, coercive persuasion without force or threat of force was not sufficient to establish actual imprisonment. See Molko and Leal vs. Holy Spirit Association for the Unification of World Christianity, et al., California Superior Court, City and County of San Francisco, Department No. 3, Order No. 769-529. The facts of this case, involving both deception and alleged "coercive persuasion," closely correspond to the model used by Delgado (1982) in proposing a civil remedy for cultist mind control.

14 See Schwartz and Kaslow (1982) for a description of a "typical" cultist recruitment scenario, which appears in fact to be a description of the notorious Moonist "Camp K" at Booneville, California, and which, in our view, is of limited generality.

15 It has recently been argued (Robbins 1984a) that a general crisis of church and state is emerging in the United States because of three converging factors: (1) the increasing state regulation of "secular" organizations, from which "churches" are exempt; (2) the increasing functional diversification of religious groups, which increasingly perform functions similar to those of secular organizations; and (3) the failure of the liberal ideal of providing goods, services, and meanings essential to enhance the "quality of life" under state auspices. As religious groups such as evangelicals or cults strive to "fill the gap," they increasingly become embroiled in conflicts with other groups and institutions (e.g., minorities who feel dependent upon public services). The debate over "mind control" obscures the linkage between controversies over cults and other "church autonomy" conflicts.

References

American Civil Liberties Union. 1977. Deprogramming: Documenting the Issue. New York.
Anthony, Dick. Monograph on Cults and Coercive Persuasion. In preparation.
Anthony, Dick and Thomas Robbins. 1980. "A Demonology of Cults." Inquiry 3 (15): 9-11.
Anthony, Dick, Thomas Robbins, Madalyn Doucas, and Thomas Curtis. 1977. "Patients and Pilgrims: Changing attitudes toward psychotherapy of converts to Eastern mysticism." American Behavioral Scientist 20 (6): 861-86.
Balch, Robert W. 1980. "Looking Behind the Scenes in a Religious Cult." Sociological Analysis 41 (2): 137-43.
Barker, Eileen. 1983. "Resistible Coercion: The significance of failure rates in conversion and commitment to the Unification Church." Forthcoming in D. Anthony, J. Needleman, and T. Robbins eds. Conversion, Coercion and Commitment in New Religious Movements. Unpublished.
Beckford, James. 1978. "Through the Looking-glass and out the other side: Withdrawal from Reverend Moon's Unification Church." Archives de Sciences Sociales des Religions 45 (1): 71-83.
1979. "Politics and the anticult movement." Annual Review of the Social Sciences of Religion 3: 169-90.
1981. "Cults, controversy and control: A comparative analysis of the problems posed by new religious movements in the Federal Republic of Germany and France." Sociological Analysis 42, 3: 249-63.
1983. "Conversion and apostacy." Forthcoming in D. Anthony, J. Needleman, and T. Robbins eds. Conversion, Coercion and Commitment in New Religious Movements. Unpublished.
Bromley, David and Anson Shupe. 1981. Strange Gods: The Great American Cult Scare. Boston: Beacon Press.
Bromley, David, Bruce Busching, and Anson Shupe. 1983. "The Unification Church and the American Family: Strain, Conflict and Control." Pp. 302-11 in E. Barker ed. New Religious Movements: A Perspective for Understanding Society.
Coleman, Lee. 1982. "Psychiatry: The Faith Breaker." Pamphlet.
Conway, Flo and Jim Siegelman. 1982. "Information Disease: Cults have created a new mental illness." Science Digest 90 (1): 86-92.
Cuddihy, John M. 1978. No Offense: Civil Religion and The Protestant Taste. New York: Seabury Press.
Delgado, Richard. 1977. "Religious Totalism: Gentle and Ungentle persuasion." Southern California Law Review 51: 1-99.
1978. "Investigating Cults." New York Times, Op-ed. (Dec. 27, 1978): A27.
1979-80. "Religious Totalism as Slavery." New York University Review of Law and Social Change 9: 51-68.
-1980. "Limits to Proselytizing." Society 17 (March/April): 25-32.
1982. "Cults and Conversion: The Case for Informed Consent" Georgia Law Review 16 (3): 533-74.
Downton, James. 1979. Sacred Journeys: The Conversion of Young Americans to the Divine Light Mission. New York: Columbia University Press.
Edwards, Chris. 1983. "The Nightmare of Cult Life." Lecture at Central Michigan University, January 25.
Frank, Jerome. 1980. Persuasion and Healing. New York: Schocken.
Galanter, Marc. 1980. "Psychological Induction into the Large-Group: Findings from a Modern Religious Sect." American Journal of Psychiatry 137 (12): 1574-9.
Goode, Eric. 1968. "Marijuana and the Politics of Reality." Journal of Health and Social Behavior 10: 83-94.
Hammond, Phillip. 1981. "Civil Religion and New Movements." In Robert Bellah and Phillip Hammond eds. Varieties of Civil Religion. New York: Harper & Row.
Kim, Byong-Suh. 1978. "Deprogramming and Subjective Reality." Sociological Analysis 40 (3): 197-208.
Lifton, Robert. 1961. Thought Reform and the Psychology of Totalism. New York: Norton.
Lofland, John and Norman Skonovd. 1981. "Conversion Motifs." Journal for the Scientific Study of Religion 20 (4): 373-85.
Mackey, Aurora. 1983. "The truth about cults." Teen Magazine 27 (4): 12-14, 97.
Mauss, Armand. 1975. Social Problems as Social Movements. Philadelphia: Lippincott.
Ofshe, Richard. 1976. "Synanon: The people's business." Pp. 116-38 in C. Glock and R. Bellah eds. The New Religious Consciousness. Berkeley: University of California Press.
1980. "The social development of the Synanon cult: The managerial strategy of organization transformation." Sociological Analysis 41 (2): 109-27.
1982. "Regulating diversified social movements." Seminar presentation at the Graduate Theological Union, April.
1983. "The role of out-of-awareness of influence in the creation of dependence on a group: An alternative to brainwashing theories." Forthcoming in D. Anthony, J. Needleman, and T. Robbins, Conversion, Coercion and Commitment in New Religious Movements. Unpublished. Richardson, James and Brock Kilbourne. 1983. " classical and contemporary applications of brainwashing models: A comparison and critique." In D. Bromley and J. Richardson, The Brainwashing-Deprogramming Controversy. Toronto: Mellon.
Richardson, James, Robert Simmonds, and Mary Harder. 1972. "Thought Reform and the Jesus Movement." Youth and Society 4: 185-200.
Robbins, Thomas. 1979. "Cults and the therapeutic state." Social Policy 10 (1): 42-6.
1979-80. "Religious movements, the state and the law." New York University Review of Law and Social Change 9 (1): 33-50.
1981. "Church, state and cult." Sociological Analysis 42 (3): 209-25.
1984a. "Religious Movements and the Intensification of Church/State Tensions." Society 21 (4): May/June.
1984b. "Incivil religions and religious deprogramming." Presented to the Midwest Sociological Society, Chicago.
Robbins, Thomas and Dick Anthony. 1980. "The limits of 'coercive persuasion' as an explanation for conversion to authoritarian sects." Political Psychology 2 (1): 22-37.
1981. "Harrassing cults." New York Times, Op-ed (Oct. 16): A31.
1982. "Brainwashing, deprogramming and the medicalization of deviant religious groups." Social Problems 29, 3: 283-97; 2 (2): 22-6.
Robbins, Thomas, Dick Anthony, Madalyn Doucas, and Thomas Curtis. 1976. "The Last Civil Religion: Reverend Moon and the Unification Church." Sociological Analysis 37 (2): 111-25.
Rochford, E. Burke, Jr. 1982. "Recruitment strategies, ideology and organization in the Hare Krishna movement." Social Problems 29 (4): 399-410.
Rosenzweig, Charles. 1979. "High demand sects: Disclosure legislation and the free exercise clause." New England Law Review 15: 128-59.
Sargent, William. 1957. Battle for the Mind. Garden City, NY: Doubleday.
1961. Battle for the Mind. London: Heinemann.
Scheflin, Alan and Edward Opton. 1978. The Mind Manipulators. New York: Paddington.
Schein, Edgar, I. Schneier, and C. H. Barker. 1961. Coercive Persuasion. New York: Norton.
Schur, Edwin. 1980. Politics and Deviance. Englewood Cliffs, NJ: Prentice-Hall.
Schwartz, Lita and Natalie Isser. 1981. "Some involuntary conversion techniques." Jewish Social Studies 43 (1): 1-10.
Schwartz, Lita and Florence Kaslow. 1979. "Religious cults, the individual, and the family." Journal of Marital and Family Therapy 6: 301-8.
1982. "The cult phenomenon: Historical, sociological, and familial factors contributing to their development and appeal." Pp. 3-30 in F. Kaslow and M. Sussman eds. Cults and the Family (special double issue of Marriage and Family Review 4 (3-4).
Schwartz, Lita and Jacqueline Zemel. 1980. "Religious cults: Family concerns and the law." Journal of Marital and Family Therapy 6: 301-8.
Shapiro, Eli. 1977. "Destructive cultism." American Family Physician 15 (2): 80-3.
Shapiro, Robert. 1978. "Mind control or intensity of faith: The constitutional protection of religious beliefs." Harvard Civil Rights-Civil Liberties Law Review 13: 751-97.
1983. "On persons, robots and the constitutional protection of religious beliefs." So. Cal. Law Review 56 (6): 1277-1318. A shorter version of this paper is forthcoming in T. Robbins, W. Shepherd, and J. McBride, The Law and the New Religions. Chico, CA: Scholars Press.
Shupe, Anson, Roger Spielmann, and Sam Stigall. 1978. "Deprogramming: The new exorcism." Pp. 145-60 in J. Richardson ed. Conversion Careers. Beverly Hills, CA: Sage.
Shupe, Anson and David Bromley. 1979. "The Moonies and the anti-cultists: Movement and countermovement in conflict." Sociological Analysis 40: 325-34.
1980. The New Vigilantes. Beverly Hills, CA: Sage.
Singer, Margaret. 1978. "Therapy with Ex-Cult Members." National Association of Private Psychiatric Hospitals Journal 9 (4): 14-18.
1979. "Coming out of the Cults." Psychology Today 12 (8): 72-82.
Skonovd, L. Norman. 1981. Apostasy: The Process of Defection from Religious Totalism. Unpublished Ph.D. dissertation, University of California-Davis (sociology).
Snow, David, Louis Zurcher, and Sheldon Ekland-Olson. 1980. "Social networks and social movements." American Sociological Review 45: 787-801.
Solomon, Trudy. 1981. "Integrating the Moonie experience." Pp. 275-94 in T. Robbins and D. Anthony eds. In Gods We Trust. New Brunswick, NJ: Transaction.
1983. "Programming and deprogramming the Moonies." In D. Bromley and J. Richardson, The Brainwashing-Deprogramming Controversy. Toronto: Mellon.
Somit, Albert. 1968. "Brainwashing." Pp. 138-43 in vol. 2 of D. Sills ed. International Encyclopedia of the Social Sciences. New York: Macmillan.
State of New York. 1981. "An act to amend the mental hygiene law, in relation to the appointment of temporary guardians." In assembly, March 31.
Stoner, Carroll and Jo Anne Parke. 1977. All God's Children: The Cult Experience - Salvation or Slavery? Radnor, PA: Chilton.
Szasz, Thomas. 1976. "Some call it brainwashing." New Republic (March 9).
Thomas, W. John. 1981. "Preventing non-profit profiteering: Regulating religious cult employment practices." Arizona Law Review 23: 1003-29.
Underwood, Barbara and Betty Underwood. 1979. Hostage to Heaven. New York: Potter.
Wallis, Roy and Steve Bruce. 1982. "Network and Clockwork." Sociology 16 (1): 102-7.
Wolf-Petrusky, Julie. 1979. "The Social Construction of the 'Cult Problem.'" Paper presented at the annual meetings of the Association for the Study of Religion, Boston.
Wright, Stuart A. 1983. "Post-involvement attitudes of voluntary defectors from controversial new religious movements." Presented to the Society for the Scientific Study of Religion, Knoxville, TN.
Zerin, Margery. 1982a. The Pied Piper Phenomenon: Family Systems and Vulnerability to Cults. Dissertation. The Fielding Institute.
1982b. Review of F. Kaslow and M. Sussman eds. Cults and the Family (Haworth, 1982). Pp. 7-9 in Cultic Studies Newsletter 1 (1).
1982c. "Reply to Dr. Robbins," Cultic Studies Newsletter, forthcoming.
