This paper, slightly modified here, first appeared in volume 4 of the 'Skeptical Intelligencer', 2001.
In this paper I shall discuss what I consider to be similarities between the attitudes of psychiatric patients to their maladaptive or delusional beliefs and the attitudes of individuals to experiences that they attribute to anomalous or paranormal phenomena. It is not my contention that the latter are suffering from a psychiatric disorder; rather I wish to consider whether there are common underlying processes.
To approach this question, I shall start by considering in simple terms some important characteristics of the way the human brain processes information from external and internal sources and thereby enables us to consciously experience the world as complex and meaningful.
Superficially it appears to us that our brain receives information in a passive manner, as though the immediate receptors of this information are pre-programmed to recognise instantaneously complex chunks of information that correspond to the stimuli involved. For example, we instantaneously recognise written and spoken words, faces, pieces of music, foods, odours and so on; in fact, unless we withdraw our attention from them, we often find the act of recognition virtually impossible to resist.
Despite this, it is clear that while there are receptors and nerve cells that are specialised to register fundamental aspects of the sensory input in the various modalities, the greater the complexity of the information to be recognised, the more the brain is involved in an active process of construction. One hypothesised process has been termed 'analysis by synthesis' (Neisser, 1967) whereby a match is found between a sample of the sensory input and an existing template constructed by the brain.
Major influences on this process are redundancy and expectation, which collectively enable us to attend to an abridged sample of the input, although we thereby increase the risk of errors should the information be contrary to our expectations. An everyday example of this is reading: we do not attend to each individual letter or even word, and not uncommonly we misread what is there owing to false expectations. Also of relevance are the cognitive biases associated with emotional states such as anxiety, which will be described later in this paper.
All of this is akin to hypothesis testing, but normally it is contrary to our subjective impression of our perceptual experiences, which seem so immediate. We become more aware of it when the sensory input is unusual, fragmented or degraded in some way. Everyday examples are when we are trying to identify something in the dark or a very faint object, decipher somebody's handwriting, or recognise an unusual sound. 'Is it X? …. No, it doesn't look or sound like X. … Is it Y?….. No. …. Z? …. Oh yes, of course, it's Z'.
This process of generating hypotheses is exploited in puzzles such as 'Find the hidden objects in the picture', 'What is this familiar object from an unfamiliar angle?' or 'Guess whose voice this is?' Certain visual art forms also evoke this at a conscious level.
The process is fallible and one consequence is that it is relatively easy for people to perceive meaningful stimuli that are not really there (as in ink blot tests) or create patterns when there is only random noise - visual or auditory. Another consequence is that we can be remarkably good at leaping to correct (or incorrect) conclusions from very limited perceptual input. For example, I see a momentary flash of red and luminous green reflected in the glass of the picture above my fireplace and I immediately make my way to the front door in anticipation of the postman. Or consider this example: 'Have you had a good day?' a mother asks her young son, as usual, when he arrives home from school. 'Yes' he replies, as always. But the almost imperceptible delay in his reply, or the slightly lower-than-usual eye gaze, or whatever, leads Mum to conclude that he has probably been in trouble again. Or this one: in the middle of the night I hear a sudden sharp noise and immediately conclude that the hook that I stuck on the bathroom wall has just fallen off.
All of these conclusions are hypotheses that can be actively tested by further observation. In the third example, I get up and go to the bathroom, only to find that the hook is still stuck to the wall. My hypothesis is falsified. Had the hook been on the floor, my hypothesis would have been supported, but not proved: the noise might still have been caused by something else. All of this is Karl Popper applied to everyday life, even if he may not have accurately represented the scientific method (Gardner, 2001).
This process of construction, matching and hypothesis testing also characterises memory and is again more consciously accessible when the information - in this case the memory traces - is fragmentary. Thus: 'What was that woman's name? …. Doris? … No. ….. Dora? - No. Dorothy? … Yes, of course'. Or 'How does that tune go?' …. etc.
Hypothesis construction and testing in the case of internally generated stimuli or activity is also exemplified by our making sense of physical symptoms, including pain - 'Is this indigestion/ heart failure/ cancer?' - a process known as 'attribution'.
We can extend this discussion to the cognitive processing of information more complex than simple stimuli. Suppose that Fred comes home one day and makes these observations. The door is not locked, the answerphone is switched on and there is no sign of Fred's wife. On the kitchen floor are several carrier bags full of food items. The hypothesis that Fred comes up with first is that his wife has just arrived back from shopping and has popped into the neighbours' house with something she promised she would pick up for them.
Some years ago there was a fascination for a certain type of puzzle in which people would be given a description of an unusual scenario and, by a series of questions requiring 'yes' or 'no' answers, deduce the explanation or the antecedents of the situation described. (I vaguely recall that one of them was about a one-legged man who receives a parcel through the post, takes it out to sea on a boat, opens it, laughs, and drops the contents of the parcel, a human leg, into the sea.) In fact in the early 1960s there was a television quiz in which a panel of celebrities was similarly challenged by members of the public who had had what at first appeared to be rather unusual experiences. I recall two of these, one of which was not at all interesting. The other went as follows: the person in question opened her front door one morning to be confronted on the doorstep by a lavatory brush and a saucer of milk. The show, incidentally, was called 'What's it all About?' In this case the panel quickly homed in on the answer, namely that on returning home in the dark the night before, the person saw an object on her doorstep which she perceived (i.e. hypothesised) to be a hedgehog and she very thoughtfully gave it a saucer of milk. The next morning, as a result of further observations, the lady constructed the most likely hypothesis: the brush had fallen out of the bathroom window above the front door while she was out the previous night.
The processes involved in interpreting the information illustrated in the above examples are continuous with those previously mentioned that govern the perception of very simple meaningful material, and again they become more apparent when the material is fragmented and incomplete.
So far, this is a simplified overview but one that is sufficient for present purposes. A final point to emphasise is that an important part of the above process is that we are able to give some consideration to the relative likelihood of various interpretations, again based on what is already known about the world and our ability to reason. Similarly we are able to acknowledge that we are fallible and may be wrong on many occasions, even when we think in the most rational way, since our interpretation of any event may be based on limited information and may be contradicted by further evidence.
From the above discussion, it is no surprise that most reporting of anomalous phenomena occurs under the conditions described above: viz. ghosts, UFOs and unusual creatures such as the Loch Ness monster, Big Foot and large cats in the United Kingdom; likewise, instances of what are more rationally classified as 'cold readings', e.g. by astrologers, tarot card users and mediums. Thus, in the dark and under restricted viewing conditions, a moving observer may misinterpret, say, a planet as a saucer-shaped flying object moving at fantastic speed. Compare with this the case of someone given the following information by a medium: 'I have a man here who is holding his chest. I am getting a name beginning with J or G - is it Joe or George? He is talking about 'the garden'. Can anyone help me with this one?' Here again, we have ambiguous and fragmented information about which the observers (members of the audience) are required, if they can, to construct a hypothesis.
Let us analyse these processes in greater depth. One assumption I shall make here is that amongst the unique characteristics that human beings possess is that they are capable of being rational and, in consequence, they are capable of being irrational. (It may be possible to demonstrate to a limited degree something resembling both qualities in some of the higher primates, but I am focusing here more on the cognitive rather than behavioural manifestations.)
I have so far argued that as humans we habitually perceive and make sense of our environment (including internal stimuli) according to rules that are rational, logical and not dissimilar to those characteristic of the scientific method. We construct hypotheses that are consistent with the evidence provided by our senses, and test them in a logical manner, falsifying or supporting them. Moreover, we construct hypotheses that, from our experience, provide the most likely explanations in a manner consistent with Occam's Razor. In the example I gave earlier of my hearing a noise in the night, I do not immediately deduce that a friend has just called in for tea; when Fred comes home to find his wife is absent, he goes for the most likely hypothesis first - that she has popped into the neighbours' - rather than immediately inferring that she has run off with the milkman or, less likely in Fred's case, that she has been abducted by aliens.
And yet, we do not always think and act in this rational manner. I once heard an amusing story of two girls who were playing tennis and one of them lost her contact lens. As they were both crouched down pawing the ground, an elderly lady who was passing by called out, 'Are you looking for this?' and held up a tennis ball.
We all make mistakes like this and psychologists have demonstrated how all of us can misperceive or be irrational in quite basic ways. I believe that this is why we should be sceptical of the support offered for unusual claims that goes 'Fred Higgins is a down-to-earth Lancastrian who is not easily fooled…' or even 'Amongst the observers of the UFO was Flight Lieutenant Reginald Mainwaring with 25 years' flying experience…'. Not long ago a sighting of a puma-like cat was reported in Nottinghamshire by a woman who declared, 'I ought to know the difference between a large cat and a dog. I'm a vet!'. In fact, the most salient attribute common to all these people is that they are human beings, and human beings are fallible, particularly when they perceive stimuli under restricted viewing conditions.
Nevertheless, failure to arrive at rational conclusions and beliefs is not simply due to a lack of competence or oversight. Irrationality - meaning here any departure from the rules of hypothesis construction and testing - occurs in predictable ways. Very often there is some purpose or advantage for us, although this is not always apparent. One obvious reason for departing from the rules is wishful thinking: we seek out and interpret evidence in the manner that suits us. Our own selfish interests make us prejudiced.
It is not hard to conclude that this is happening when someone is being given, by a medium, messages supposedly from a departed loved one. This indeed illustrates one of the principles of cold reading. Sceptics also often assume that believing that one has had an extraordinary or paranormal experience is valued in itself and people will be biased to some extent to perceive and interpret certain events accordingly, and to resist disconfirming evidence.
I postulate also (and cognitive dissonance theory predicts) that publicly declaring such a belief will in itself render a person reluctant to accept more likely explanations and the evidence for such. This is something that could be investigated systematically. I would, for example, predict that someone who declares that he or she has just seen a tiny pony in a nearby field would be more accepting of the possibility that what he or she saw was the farmer's new wolfhound, than someone who declares that he or she has just seen a puma in the field. ('I do know the difference between a cat and a dog, and that was no dog!')
More deep-seated beliefs about the world tend to be particularly resistant to disconfirming evidence. One reason may be that there are good adaptive reasons why we are reluctant to change our beliefs. We need predictability, consistency, and familiarity. Our beliefs, opinions, attitudes and assumptions provide us with that. They enable us to view the world in a more orderly and predictable way. In that way we are less anxious. Consequently, we are prejudiced: we tend to interpret information and to behave in ways that will confirm or, at least, not disconfirm our beliefs.
To return to our consideration of the person who declares that he or she has had an unusual or paranormal experience, another reason for the tenacity of that belief, even in the face of more likely, but mundane, explanations and disconfirming evidence, is that we value having such experiences and believing in their authenticity. Life is thus more exciting and offers many more possibilities; perhaps we are even special in some way. The reader may be acquainted with the 'Ah, yes but' mind-set of those who are reluctant to consider more reasonable explanations for their unusual experiences. A few years ago a 'UFO' was sighted over the Sheffield area by a number of people, some of whom telephoned the local TV station. The reports were studied by the news team who realised that the description of the UFO was identical to that of a telecommunications instrument that hovered over a local stadium. That evening on the local TV news, one of the newscasters rang a lady who had earlier described to them what she had seen. The newscaster simply informed her that what she had sighted was certainly the device in question. But she wouldn't have it! 'No, that's not what I saw', she said and then, very significantly, declared, 'and do you know, since I saw it, it's changed my whole life!'
Now it may be that what this lady saw was indeed an extraterrestrial craft. But it seems that she had somehow lost the ability to consider the relative likelihood of the various hypotheses that she could construct to explain her observation. This is an oft-noted characteristic of claims of anomalous sightings. Recently I saw a television programme promoting the existence of an unknown giant bear in Siberia. The commentator stated that one man who claimed to have seen it was unlikely to have misidentified a common brown bear because of his years of experience hunting these creatures. Here we have two hypotheses: firstly that a hitherto unknown giant bear is alive in Siberia or, secondly, an experienced hunter made a mistake. Both are unlikely, but which is the less likely on the available evidence?
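The comparison the commentator glosses over can be made explicit with Bayes' theorem: what matters is not whether each hypothesis is unlikely, but their relative probability once the sighting is taken into account. The figures below are purely illustrative, invented for the sake of the arithmetic rather than drawn from any data:

```python
# Illustrative Bayesian comparison of two hypotheses for the Siberian sighting.
# All probabilities are hypothetical, chosen only to demonstrate the arithmetic.

# H1: a hitherto unknown giant bear exists and was seen.
# H2: an experienced hunter misidentified a common brown bear.
prior_h1 = 1e-6    # prior probability of H1 (a new large species): extremely low
prior_h2 = 0.05    # prior probability of H2 (expert error): low, but far higher

# Probability of the reported observation under each hypothesis.
# An honest misidentification would produce the same report as a real sighting.
likelihood_h1 = 0.9
likelihood_h2 = 0.9

# Unnormalised posterior weights (the numerators of Bayes' theorem).
posterior_h1 = prior_h1 * likelihood_h1
posterior_h2 = prior_h2 * likelihood_h2

# With these (invented) figures, expert error is ~50,000 times more probable.
print(posterior_h2 / posterior_h1)
```

The point of the sketch is that "he is too experienced to be mistaken" only raises the likelihood term; unless it raises it enormously, the vast difference in the priors still dominates the comparison.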
There is one more process that I wish to discuss, namely how unusual beliefs may become progressively more elaborate, often in exponential fashion. Let us first recapitulate. The starting point is information that is unclear, incomplete, and ambiguous. We then have a belief that this indicates something unusual or even supernatural. A declared commitment is made to the authenticity of the belief (possibly because it is personally valued by the believer).
Beliefs thus acquired demand relaxation of the usual constraints that govern rational thinking. One risk of this is that the door is thus open to further unusual ideas and beliefs, which may all too readily seek admission. The restraints are down and the boundaries widen further and further to accommodate claims that are increasingly more unusual and bizarre. The adherents of what has now become a belief system find it difficult to 'blow the whistle' and declare, 'Wait a minute! Stop! This is crazy! We've gone wrong somewhere!'
Why this can happen is as follows. To sustain the original claim or belief, new information has to be interpreted accordingly. This is akin to making facts fit the theory. Similarly, new claims tend also to be accepted as authentic. For example: somebody sees a flying object and, in the absence of any obvious explanation, declares it to be an extraterrestrial (ET) craft. Other similar sightings are reported, thus supporting the original hypothesis that ET beings are visiting the earth. Someone then claims to have actually seen an ET craft on the ground. Hence these ET beings are now landing on Earth. Another observer actually sees figures that she interprets as ET beings themselves. Now they are amongst us. Someone claims to have interacted with them. Someone then claims to have been taken on board an ET craft and someone else …. and someone else … etc. until eventually hundreds of people are claiming to have been thus abducted. Somebody then claims to have observed evidence that millions of people have been thus abducted. A massive epidemic of alien abductions is declared. Later, people report that ET beings have conducted surgery on them and interfered with their reproductive organs and a belief develops that women abductees are being used to breed ET beings. Some 'abductees' claim to have had pieces of metal implanted in them. And so on and on and on …. The belief system spreads, seemingly with no boundaries, nothing to check its growth and no restraints on what ideas and claims attach themselves to it. Contrast this with the development of scientific theory, with its strict rules of accountability that ensure a process of self-correction according to the evidence and the application of logic and mathematics.
Now, I have no doubt that at some stage some individuals involved in the belief system are going to draw the line and deny the credibility and authenticity of some of the claims that are being made. But the problem is that there are no clear rules for determining what is most likely to be authentic and what is otherwise. Hence, once you start to question, say, the belief and the evidence that women are being reproductively programmed by ET beings, where does your doubting stop? Just as when one starts to pull at the loose thread of a garment, so when one starts to question these claims, the whole belief system is in danger of unravelling all the way back to the original claim that ET craft are flying around Earth.
Elements in the above process are evident in varying degrees in everyday life. A not dissimilar process is at work in the case of claims and beliefs that are driven by fear, as in mass hysteria. Extreme examples characterise the belief systems of cults, particularly where the cult is physically separated from mainstream society. But of greatest relevance here are the kinds of unusual beliefs that are of interest to sceptics. As well as UFOs, major examples are claims of ritual satanic abuse, multiple personality disorder, and crop circles (for which, with increasing complexity of the patterns, explanations more elaborate than UFO landing sites have had to be provided to preserve the claim for their extraordinary nature). In the case of alternative medicine we can likewise see the difficulties that adherents have in accepting some, but not all, of the methods that claim to be 'natural and holistic': if one rejects, say, crystal healing and colour therapy, why should one not also reject homeopathy and acupuncture?
I believe that where a belief system becomes increasingly complex in the above way, the label 'psychotic' is not inappropriate, at the very least by way of analogy to the thinking of psychotic patients who have delusional beliefs. But more of this later. Right now it is worth also observing how adherents of the kinds of beliefs in question often need to resort to the incorporation of 'paranoid' ideas into their belief system. This is usually necessary to account for the dearth of good evidence or the refusal of experts in mainstream disciplines or 'the establishment' to take seriously the beliefs in question. 'There is a government cover-up' is a common assertion by UFO advocates. In America, the CIA has been cited as being in cahoots with the False Memory Foundation in the backlash against the diagnosis of multiple personality disorder. The medical establishment is often cited as being involved in a conspiracy against alternative medicine. Of course, some unusual belief systems are entirely based on the idea of a conspiracy.
I have been discussing the nature of unusual beliefs. I am particularly interested in beliefs based upon unusual interpretations of everyday experiences. I have asserted (although more systematic evidence is required on this) that people are reluctant to disclaim such ideas and beliefs. Individuals are biased to interpret further information and events in a manner that is consistent with the belief (even when logically this evidence opposes it). One occasional consequence of this is what I have termed a 'psychotic' process, namely a reluctance or inability to disclaim related beliefs and assertions that may become increasingly bizarre over time. Finally, 'paranoid' attitudes may be expressed in the face of a lack of acceptance of the beliefs in question, notably by orthodox and authoritative persons and organisations. These processes are potentiated when unusual beliefs are actively shared by groups of individuals, and at the extreme we have the phenomenon of cults.
It is my contention that the processes identified above not uncommonly represent the thinking of normal individuals who are not necessarily suffering from any form of psychiatric disorder. But what about people whose ideas and beliefs are unusual to the extent that they are defined as suffering from a psychological disability or even as being mentally ill? Such individuals interpret their experience of their external world and their own internal experiences (thoughts, images, memories and physical feelings) in extremely unusual ways that may be part of a far-reaching belief system.
If we define mental disorders as those that are currently classified in the Diagnostic and Statistical Manual of Mental Disorders of the American Psychiatric Association (DSM-IV) or the International Classification of Diseases (ICD-10) then we have a considerable range of human problems, only the most serious of which we would refer to as 'mental illness'.
Over the last 30 years, the psychological treatment of such disorders has been greatly influenced by the cognitive approach. This endeavours to understand the thoughts and beliefs of people with mental disorders because usually it can be shown that these are erroneous, unfounded, unrealistic or irrational. If one can help clients and patients to acquire more realistic beliefs (i.e. hypotheses) about themselves and their world, then their emotional and behavioural problems may accordingly ease.
Consider the following:
In their rational moments, Nita and Joe would probably acknowledge that their fears are unreasonable. However, they will act as though they are true. Part of therapy will be to help them adopt more realistic beliefs when they are beset by their fears. Another part is to encourage them to test out their beliefs in reality; for example Joe could be prevented from washing his hands, thus allowing him to find out if anything terrible really does happen. These two methods constitute much of what is termed cognitive-behaviour therapy. The second method is similar to the process of hypothesis testing by performing an experiment.
In this case David can be said to have constructed a 'catastrophic' hypothesis from the evidence of his physical experiences. In contrast to Nita and Joe, David may seriously believe that his fears are well founded. Sometimes patients may need some advice and factual information to help them correct their erroneous beliefs. So it is important that David understands that there is much evidence to show that, in a healthy person, a panic attack does not cause fainting or cardiac arrest. However, it is important that David test his initial hypothesis against the one offered by his therapist by performing an experiment in which he remains in a situation in which he is panicking and discovers whether he does indeed faint.
Just as Nita and Joe may be able to accept that, despite their persistence, their fears are completely unrealistic, Mahmood may be able to tell himself that in reality he is highly qualified, has a successful career, is a dutiful husband and father, has never broken the law, and so on. Yet still he disparages himself. In contrast, Jason may well insist that he is correct in his belief that his baldness is a severe handicap for him, that other people think less of him for it, and that the key to his happiness is acquiring a fine head of hair. Despite his therapist's efforts to encourage him to challenge his ideas about his baldness and to consider the relevant evidence, Jason may prove extremely resistant to what ought to be welcome news.
So far, one difference between the beliefs of these people and beliefs in paranormal experiences is that the former are mainly personal, whereas the latter relate more to the nature of the external world. Another difference is that the beliefs of these patients appear to be unwelcome and destructive for the person concerned, whereas, as I have argued, in many (but not all) cases, paranormal claims are highly valued by their advocates. Nevertheless, as I have illustrated with Jason, it is not necessarily easy to persuade patients that there are more realistic and constructive ways of thinking. For example, hypochondriacal patients, 'somatisers' (those whose psychological problems tend to be manifested as physical symptoms), and illness phobics (such as Maria) when given the physical 'all clear' do not necessarily perceive this as a cause for celebration. Not uncommonly they may devalue the evidence, calling into question the doctors' competence, or finding loopholes such as 'I was feeling OK when they did the tests' or in the case of one somatiser I saw, 'They took blood out of my right arm but the pain is in my left arm'.
Even poor Mahmood may be peculiarly resistant to the evidence that he has at least as many positive attributes and as few negative ones as most people. 'They don't really mean it' or 'They are just saying this to please me' may be his stock answers when it is pointed out to him that other people express admiration of his positive qualities.
Patients such as Mahmood, Jason, David and Maria are predisposed to interpret every incident or event as evidence in support of their beliefs. For example, if someone laughs in Mahmood's vicinity, he may immediately conclude that that person is surely laughing at him because he is so stupid; likewise Jason may conclude that his appearance is being ridiculed. As soon as David notices his heart rate has increased, he believes that he is going to panic and faint. When Maria has a headache, to her it is a clear sign of a brain tumour.
In fact, beliefs that are associated with high levels of fear or anxiety predispose us to two cognitive biases. Firstly, with increasing levels of anxiety we allocate more attention to information that is perceived to be potentially threatening. Secondly, we are increasingly prone to interpret information in a threatening way. (There is a sub-group of individuals, called 'repressors', who tend to do the opposite, but these will not be discussed here.) These cognitive biases have adaptive value; that is, they favour survival. However, one theory (Eysenck, 1997) is that anxiety disorders arise through a positive feedback loop, whereby anxiety generates these attentional and interpretative biases, which generate further anxiety and so on. Thus the anxiety spirals out of control. A panic attack is a good example of this, the cognitive biases being concerned with bodily processes; likewise illness phobia.
This model explains why ill-founded and irrational beliefs that generate anxiety, and are therefore not valued by the person, can nevertheless, especially when the anxiety level is already high, be held with unshakable tenacity and remain resistant to rational influences.
Sometimes one cannot help but come to the conclusion that the persons concerned may have some reason, perhaps even unknown to themselves, for holding onto their beliefs. For example, it may somehow be psychologically less threatening for Maria to assume the sick role than to be pronounced healthy and able-bodied. With patients such as Jason, it is possible that, rather like Mahmood, they constantly feel unfulfilled and afflicted by unhappy and anxious feelings and therefore construct their own explanations or hypotheses about these experiences, in Jason's case that it is his baldness that is handicapping him. Jason's distress thus becomes more understandable to him, likewise the solution to his distress. Hence, whilst he and, for that matter, Maria, in no way value their fear, distress and unhappiness, they may be said to value, and indeed overvalue, their beliefs or hypotheses as to why they feel this way. Hence it is not easy for them to entertain the possibility that there may be other, more personal, explanations for their dissatisfaction and unhappiness.
Here we have an obvious case in which there can be no doubt that the person values his beliefs.
It is these kinds of delusional beliefs that I find are often most similar to paranormal beliefs or belief systems. Not uncommonly, the thoughts, feelings, communication and behaviour of such patients appear quite normal and rational when they are not engaged in the subject matter of their delusions. When they are so engaged, however, they will be completely impervious to any suggestion, evidence or reasoning that they may have got things wrong. They will be alert to and interpret any scrap of evidence as supporting their theory, even when it obviously contradicts it, and will not be prepared to consider that a different interpretation is possible. For example, when Sean writes Donna a letter asking her to meet him and she does not turn up, it is evidence that she feels the need to test his love for her. When she looks upset next time he walks past her in the pub, clearly it is because she is in love with him, but is not yet able to commit herself; something or someone is holding her back. An ex-boyfriend confronts Sean and explains that Donna is not interested in him: clearly this person is jealous of Donna's love for him. Sean overhears Donna at the bar saying in a loud voice 'I need more time'. Clearly this message is directed at him and not at the group of people she is with. And as for her choice of record on the jukebox when Sean is present - 'Just wait a little longer' - this says it all.
It makes sense to speculate (and again the theory of cognitive dissonance predicts this) that the more a person commits himself or herself to a belief, the less easy it is for him or her to disengage from it, particularly when that commitment is manifested in action. People have harmed themselves and others, and have even committed murder, on the basis of their delusions. It is plausible to consider that having thus acted upon their deluded beliefs, the people concerned are less likely to disclaim them when they are refuted by reason and evidence.
Indeed, the staff of the local Granada Rentals shop have become very concerned about Joe's visiting them and haranguing them to do something about this.
My impression is that generally the more unusual or irrational the belief, the greater the tenacity with which it is held by the person. Direct challenges to the belief may provoke hostility and, in the case of certain delusions, psychiatric staff may be perceived as being part of a conspiracy or persecutory network. For example, Dawn may conclude that the refusal of psychiatric staff to support her belief in her pregnancy implies that they are in cahoots with the medical doctors who have denied her claims.
There are a number of theories about auditory hallucinations, one being that they are simply the result of aberrant neurophysiological activity. However, an understanding that has informed cognitive-behaviour therapy in helping people cope with their voices is summed up by the expression 'misattribution'. The principal problem is that patients construct sinister and frightening hypotheses concerning these internal cognitive experiences, often attributing them to external persecutory agencies rather than to their own thought processes. This formulation then opens up the possibility of using cognitive-behaviour therapy to assist the patient in constructing more rational hypotheses about his or her experiences.
There are patients whose thinking is so disorganised that it is virtually impossible to engage in any meaningful dialogue with them. When one sits down with someone who is floridly deluded, like Karl, and listens to his or her story, one often has an experience analogous to turning on a tap: an endless stream of bizarre and irrational ideas and nonsense comes pouring out. Indeed one may wonder whether there is any limit to this and whether the details are not being simply generated on the spot rather than being representative of the person's existing daily pre-occupations. It is unusual to have this kind of extreme experience with a non-patient with a paranormal belief system, but it does happen. (I have to say that listening to interviews with Mr David Icke reminds me of my experiences with such patients. For this reason I find it distasteful that he has been used as fodder for chat-show entertainment, one presenter even gleefully remarking, 'But people are laughing at you, not with you'.)
More commonly in the non-patient population, the flight of ideas is more organised and coherent, and the ideas themselves are accepted and valued by others. However, as previously stated, one still has the impression of the lack of some kind of critical regulation of the ideas being entertained.
On what grounds are we entitled to assert that any or all of the above people are mistaken in their beliefs? This is a fair question. So far, I have talked, in admittedly simplistic terms, about a failure to apply rational thinking to the available evidence and the need to test hypotheses that are constructed on the basis of the available evidence. However, one can apply rational thinking to reliable evidence and thus arrive at the most likely interpretation and still be completely wrong. It may even be that the least likely hypothesis is the correct one. In such cases, this should become apparent as further evidence accumulates. For example, in due course, Maria's doctors may indeed detect the presence of a brain tumour or Stephan's insistence that there is a global economic conspiracy may be vindicated. Thus, one may be irrational but correct.
At this juncture it should be emphasised that 'having irrational beliefs' on the one hand and 'having a psychological problem or disorder' on the other are not one and the same. This is indeed one of the central points that I am making in this paper. It is usually only when the person starts to suffer, or cause suffering to others, that we say that he or she is a 'case'. Otherwise, Stephan, Joe, Dawn and so on are quite free to hold onto their beliefs without having the ministrations of the mental health services offered to or inflicted upon them. Hence having a psychological disorder or being 'mad' is not synonymous with thinking irrationally.
For example, one may argue that we are not entitled to dismiss Stephan's theories as irrational even if they make no sense to us. Some would even argue that truth is relative and the world that one individual constructs for himself or herself is as valid as that of any other individual. Suppose, however, Stephan starts to believe that members of his family have been indoctrinated by the government and are part of the global conspiracy. Now he interprets things that they say and do, and have said and done in the past, as evidence that they are part of the conspiracy. Naturally he alienates himself from them, and his wife and children thereby suffer. Suppose, quite naturally again, he seeks to defend himself against them by physically attacking them in a serious way.
At what point in this chain of events do we identify Stephan's thinking as irrational? Not necessarily when he becomes 'mad'; most likely he has not been thinking rationally for a long time as he has developed his world conspiracy theory. (Often the boundary line between non-pathological and pathological beliefs is crossed when the beliefs become highly personalised and this adversely affects the individual's day-to-day behaviour. For example, a man may sincerely believe in alien abduction. He may further believe that he himself was once abducted. Then he may believe that his abductors have a special mission for him. He therefore gives up his job, leaves his family, demands audiences with world leaders, and so on. Usually such people are very isolated in their beliefs rather than part of a group of like-minded individuals.)
Even at quite basic levels, each one of us constructs his or her experience of the world and comes to his or her own understanding of it, in a manner which is not dissimilar to that of the scientific ideal. That is, the hypotheses we construct are based upon what experience and logic have told us are most likely to be true, and we test these hypotheses in ways that help us to support, modify, or refute them.
This is the ideal, but it is not as simple as it sounds. When I have given talks on scepticism I have often found myself stating that 'Truth' - that is, true, reliable knowledge about the nature of the material world - is very difficult to come by. Indeed, I believe that this is a pivotal, but usually only implicit, tenet of scepticism. It applies not only to science but also to all academic and applied disciplines that form part of the 'tree of knowledge'. It also applies to information about the wider world that each one of us acquires through his or her education and, especially, through the media. Much of the latter is cheaply gained and hence unreliable and often, in large measure, false. It is often remarked how much our beliefs and opinions are simply based on illusions. We perceive the world 'through a glass darkly'; this is so even for the knowledge and understanding of our immediate world that we receive through our senses and encode in memory.
That we all fall short of this ideal is probably, most of the time, of little detriment. In fact, perhaps in a manner analogous to spontaneous errors in genetic material, it may even be a good thing that we deviate from the rules. Life can thereby be more interesting and exciting, and we may be more creative and even achieve important insights and discoveries that we might otherwise miss were we always to adhere rigidly to the rules. Our greatest mistakes are made when we do not acknowledge our fallibility; that is, when we are unwilling or unable to consider that our observations may be unreliable and our hypotheses incorrect, and that there are alternatives that may be true and even more likely to be so. Likewise we fall short when we unduly bias our interpretation of sensory information and internal experiences to support our existing hypotheses.
These 'mistakes' are made by all people, whether or not they hold paranormal beliefs and whether or not they have psychological problems, mental disorders or mental illnesses. What interests me is how much these mistakes may characterise the thinking of patients with even the most serious mental illnesses. In this respect they may not differ as much as we like to believe from other people who have unusual beliefs. Certainly the statement 'I am not mad!' adds little authenticity to any claim that the person has witnessed a supernatural or anomalous phenomenon.
Eysenck, M.W. (1997) Anxiety and Cognition: A Unified Theory. Hove: Psychology Press.
Gardner, M. (2001) A skeptical look at Karl Popper. Skeptical Inquirer, 25(4), 13.
Neisser, U. (1967) Cognitive Psychology. New York: Appleton-Century-Crofts.