In this paper I look back on my life, on what I have tried to achieve professionally, and on what trends and theories I have opposed. Next I discuss the convictions and considerations that constituted the building blocks for the spiritual foundation of my life. I reach the conclusion that the building blocks were derived from Judaism and explain why that is so. Finally, I conclude that the scientific and the theological ingredients of my life relate reciprocally and do not exist independently.
Keywords: spiritual life; professional life
I am not a philosopher, not someone who by profession reflects upon the key issues of the human condition. Certainly, I think. Actually I am constantly in discussion, either with others or with myself. Meditation, conceived as the attempt to empty the mind as completely as one can, is foreign to me. I ponder my profession, psychiatry, and life—my own life, not living in general. Is the course I have plotted the right one to achieve my goals, as a professional and as a human being?
These reflections are the substance of this treatise. First I discuss the professional objectives, then the spiritual anchorpoints.
“I have tried to bring about some changes in psychiatry.” Thus I began my book Beyond the Mainstream. About the Scientific Anchorpoints of a Psychiatric Career.1 I have resisted views I considered to be counterproductive for the psychiatric profession, either scientifically or in terms of practice. I have fought against sacred cows, which I considered to be golden calves. I did not leave the matter at that. A critique which offers no alternatives is a sterile approach and a presumptuous one at that. I did suggest alternatives. Or more precisely, the alternatives generated my critique. I finished the above mentioned book as follows: “A river bed has of course two sides: this one and beyond. The former is reached when one arrives at the stream, the other side after traversing it. I hope I have moved beyond the mainstream”.
Undeniably, that sounds pretty pedantic. Iconoclasts—and that characterisation approximates to my self image—are by definition opinionated and not devoid of self conceit, simply because they evaluate particular extant situations and notions as unsatisfactory and hence they reach the conclusion that they know better than the mainstream. May they be forgiven. The pressures of life on the barricades and the tenacity one needs to continue may counterbalance the conceit.
To which causes did I devote myself? I will discuss the most prominent ones, briefly—almost telegrammatically—and arrange them in chronological sequence.
1958–1980: The struggle for an empirically based psychiatry
The principal action point in this period was to mount opposition against a predominantly essayistic psychiatry which lacked an appreciable empirical foundation, and to make concerted efforts to provide this profession with a scientifically established base.
Psychiatry, in this period, was dominated by two philosophies: psychoanalysis and phenomenology. Both were pre-eminently subjectivist and averse to objective methods and general statements. Psychoanalysis regards psychiatric disorders as being strictly individual expressions of strictly individual (disturbed) emotions. The origin of disturbed emotion is sought in the (disturbed) developmental history of the particular individual seeking treatment by that particular therapist. Psychiatric examination, so runs the line of thought, is a process of individualising, not of generalising.
The phenomenology of Jaspers is particularly interested in the experiential world of the individual patient. What is passing through his or her mind when the diagnostician has diagnosed a particular disorder? The phenomenologist tries, so to speak, to enter the private emotional world of that singular patient. He identifies with him, in order to understand what moves that patient; what the patient experiences when an outsider speaks of phenomena such as delusions, hallucinations, or mood lowering. He aspires to understand (in German verstehen) rather than to explain (erklären).
As in psychoanalytic thinking, so in phenomenology the unique subject holds central position. Both philosophies remained averse to attempts to arrive at objective and generalising statements regarding a particular category of psychiatric patients. Without detracting from the contributions these philosophies made to psychiatry, one has to count against them that both have acted as brakes on the evolution of psychiatry into an evidence based discipline, that is, a discipline in which evidence is experimentally collected or verified.
Attempts to get biology accepted as one of psychiatry’s basic disciplines
Opposition to a predominantly psychologising psychiatry, a psychiatry mainly interested in the psychological determinants of abnormal behaviour, was another of my action points in those years. Opposition to a de-biologised psychiatry went hand in hand with the struggle for acceptance of (neuro)biology as one of the legitimate basic disciplines of psychiatry (although emphatically not the sole such discipline). Students of phenomenology and psychoanalysis were “a-biological”, but not fundamentally antibiological. They were simply not interested in the brain and thought it unlikely, if not impossible, that biology would ever furnish fundamental insights into the origins and treatment of mental disorders. It is true that Freud occasionally wrote sympathetically about biology, the tenor of that writing being that biology might at some stage construct a bridge between abnormal psychology and psychiatry, but the matter remained restricted to platonic declarations. In practice psychoanalysis floated ever farther away from biological points of view, to the point where one could properly speak of alienation between them. I remember the threat uttered by a psychoanalyst some 30 years ago: “If you prescribe this depressed patient Lithium, I will stop treating him”.
The accidental discovery of the modern psychotropic drugs (lithium, antipsychotics, antidepressants, and anxiolytics) fell, then, on infertile soil. For several years these drugs were belittled as sops that would conceal the true roots of a mental disorder, or as interventions of last resort, in those cases where psychotherapy had failed. In the leading psychiatric circles, their therapeutic significance was recognised only slowly and reluctantly.
A third movement called antipsychiatry, arising at the end of the 1960s, was forthrightly antibiological. Though short lived, it shook the psychiatric world forcibly. Antipsychiatry was interested in the social determinants of abnormal behaviour, and exclusively so. Its extreme protagonists denied even the very existence of mental disorders. They were considered to be “would be” diseases, constituted by behaviour which in fact made a lot of sense as a way of accommodating to a diseased society. This movement was first and foremost politically inspired, and felt no need at all to evaluate its viewpoints empirically.2 Antipsychiatry obstructed the development of biological psychiatry, not for a long time perhaps but certainly forcefully.
Endeavours to achieve diagnostic validity
A third province of concern was diagnosis. I opposed the chaotic, unvalidated state of psychiatric diagnosis, and formulated proposals for its standardisation and operationalisation within a multi-axial framework. The proposed axes were symptomatology, aetiology, course, severity, and premorbid personality structure, to be scored independently, because no significant mutual intercorrelations could be demonstrated and the predictive value of those variables appeared to be quite limited.3
In those years psychiatric diagnosis was anything but standardised. Without much exaggeration one may state that there were as many taxonomies in existence as there were psychiatric textbooks, and most psychiatrists of renown felt compelled to write such a textbook. Diagnosis varied not only from country to country, but within one country from institution to institution. Diagnostic constructs were usually poorly operationalised and employed different criteria (such as symptomatology, aetiology, course, prognosis, and premorbid personality structure) either alone or in varying combinations.
Empirical investigations must above all be reproducible, and must also have an accurate and unambiguous definition of the object of study. The chaotic state of psychiatric diagnosis hardly allowed the repetition of any experiment. Understandably then, empirical research was not in vogue in those days, and was technically scarcely feasible for lack of adequate methods.
1980—present: Continued endeavours to achieve diagnostic validity
A prominent objective in this period was critical analysis of the newly introduced diagnostic approach, embodied in the third edition of the DSM (Diagnostic and Statistical Manual of Mental Disorders).5 In my opinion it showed several essential failings and yet, shortly after its introduction it “solidified”, by which I mean it was by and large exempted from critical evaluation of its basic assumptions. Criticisms were allied to advocacy of a manner of diagnosis which has been named functional psychopathology.6–8
Unquestionably, in 1980 psychiatry made a considerable move forward. With the publication of DSM-III an operationalised and standardised taxonomy of psychiatric disorders was introduced. It was the first taxonomy of this nature covering the entire domain of psychiatric disorders. It was pretty well instantly embraced by the entire psychiatric world, clinicians and researchers alike. It was “love at first sight”, prompted by two things in particular: first, the need of the experimental research that had begun to blossom in those years to characterise its objects of study in detail, and, second, the waning influence of psychoanalysis, phenomenology, and antipsychiatry.
Why then, at the same time, did I engage in opposition against a development whose necessity I duly acknowledged? Because, though I agreed with the principle, I did not agree with the way in which it was formulated. I put forward a number of points which I believed needed to be considered.1,9,10 First, the system was cast in a nosological mould. Disorders were defined on the basis of five criteria (so called “axes”). To qualify for a particular diagnosis, all five criteria had to be fulfilled. In this way “disease packages”—that is, disorders, were demarcated. Criteria were basically determined by groups of experts in consensus meetings, and by literature reviews. In some cases field studies were carried out, but even these were seldom repeated for confirmation. Most disease-constructs were introduced without adequate prior validity studies. This was understandable and defensible: prior validity studies would have delayed the project for many years. But what was incomprehensible and indefensible was that retrospective systematic validity studies, construct by construct, also remained in abeyance or, at best, were carried out haphazardly. DSM-III was followed by DSM-III-R and DSM-IV; indeed DSM-V is on its way. Unvalidated constructs are revised, often on grounds which are themselves insufficiently validated: this does not enhance their validity!
Moreover, in clinical practice one sees many patients who do not meet all the criteria for a given diagnosis. This engendered the introduction of an avalanche of new and insufficiently validated diagnoses, and led in clinical practice to an excess of diagnoses with the qualification “not otherwise specified”. I characterised this situation as one of structured chaos.
With the introduction of DSM-III, furthermore, the refined characterisation of syndromes disappeared. In terms of symptoms one qualifies for a certain diagnosis if x symptoms out of a series of y symptoms can be demonstrated, no matter which ones. Each diagnosis thus covers a variety of syndromes. This approach has made psychiatric diagnosis much coarser, and has had the same effect on psychiatric research, dependent as it is on the precise definition and delineation of the object of study.
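The coarseness described above can be made concrete. Under a polythetic “x out of y” rule (DSM-IV major depression, for instance, asks for five of nine listed symptoms), many distinct symptom profiles receive the same label. The sketch below uses illustrative numbers and ignores any additional requirement that particular core symptoms be among those present:

```python
from itertools import combinations

def meets_criterion(present_symptoms, required=5):
    """Polythetic rule: the diagnosis applies if at least `required`
    of the listed candidate symptoms are present, no matter which ones."""
    return len(present_symptoms) >= required

# Two patients with only one symptom in common (indices 0..8 stand in
# for nine candidate symptoms) nevertheless receive the same diagnosis:
patient_a = {0, 1, 2, 3, 4}
patient_b = {4, 5, 6, 7, 8}

# Number of distinct minimal symptom profiles covered by one label:
# choosing exactly 5 of 9 candidate symptoms gives C(9, 5) = 126.
distinct_profiles = sum(1 for _ in combinations(range(9), 5))
```

Each of those 126 profiles may correspond to a different syndrome, which is precisely the heterogeneity the text complains of.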
Finally, the so called axis-I diagnoses (that is, currently diagnosable disorders) and the axis-II diagnoses (lasting personality disorders) are assessed independently. Thus a crucial concern in psychiatry is bypassed: do axis-I and axis-II diagnoses interrelate; to what extent are axis-II diagnoses responsible for axis-I diagnoses or, possibly, vice versa? Of course in many cases, this question cannot be answered with certainty. For the sake of a rational therapeutic programme, however, a statement—a hypothesis if you prefer—on the questions is absolutely indispensable. One could reply that DSM represents a classification system, not a diagnostic system. This rebuttal, however, does not hold good. The DSM is much more than a classification system. It is a system that presently governs psychiatric diagnosis completely. Anything which is not embodied in DSM carries the stigma of irrelevance. Because the question of how axes-I and -II diagnoses relate is not posed, it has lost much of its topicality. This has impoverished psychiatry.
1990—present: Endeavours to maintain evenhandedness in psychiatry
Opposition to overappraisal of biological psychiatry and against excessive desubjectivising of psychiatric diagnosis; advocacy of an “evenhanded psychiatry”, with sustained attempts to reconquer the subjective domains of psychopathology using experimental methods, and to secure rehabilitation for the concept of psychogenesis: these were my key objectives in this period.
In the period 1960–1980 biological psychiatry developed from a barely accepted stepchild of psychiatry into a leading player. Occasionally the stream has overflowed its banks, and then the biological protagonists become overexcited. One notes, for instance, headlines above page-wide advertisements in (mainly American) newspapers reading: Schizophrenia is a Brain Disease. Therefore, brain research will eventually resolve the “problem”. This notion ignores the possibility that psychological and psychosocial variables may have contributed to the cerebral disorganisation; that schizophrenia, besides being a biological disorder, may also be a psychological and psychosocial disorder; and that in that case not only the biological but also the psychological and psychosocial determinants require experimental as well as therapeutic attention.
Reconquest of the subjective
Experimental research, nowadays, has gained a central position in psychiatry. Attention is focused on phenomena that can be established and documented in a more or less objective manner. The world of subjective experiences is in danger of being marginalised as being “soft”, and thus shut off from scientific exploration and becoming (seemingly) ever more irrelevant. Leaving aside the question of whether the experiential world is closed to experimental research, we will do psychiatry no favours by allowing entire domains of psychopathology to wither through scientific neglect. So I and some like-minded colleagues came to the conclusion that a reconquest of the subjective by scientific means was a matter of considerable urgency.
Rehabilitation of the concept of psychogenesis
Overestimation of the biological determinants of abnormal behaviour had, we felt, led to an underappraisal of the concept of psychogenesis. To what degree have past events contributed to the present mental condition? Have personality features contributed to a misappraisal of particular events or to a failure to process them adequately? Can those personality features be linked to particular developmental adversities? Those questions, no less than the biological variables, are crucial if one aspires to grasp the complex ways in which psychopathology develops. The present is incomprehensible without taking the past into account. Psychiatry cannot neglect this classical truth with impunity.
Psychological development, personality structure, and axis-I diagnoses are probably intimately interconnected and should be studied accordingly.
PROGRAMMES OF ACTION
So far I have set out the most important areas for action. Next I will briefly discuss the main programmes that resulted.
Developing building blocks for an empirically based psychiatry
Psychiatry is not philosophy; it is not a philosophy of life. A philosopher is allowed to contemplate whatever he chooses, but a doctor is not. The doctor puts his views directly into practice. Practice means in this context a cohort of patients, that is, individuals in distress, who are highly dependent on him. Patients should be confident that the opinions of their doctors are based on more than private views. Therefore my first goal was to contribute to the groundwork on which an empirically based psychiatry could be developed. That groundwork has two constituents: valid diagnosis and reliable methods of patient assessment. Without a precise definition of the object of study, in this case psychopathological features, no psychiatric research is feasible. Without instruments to measure those features, even precise and valid diagnoses do not carry us any farther.
The central theme of the first phase of my career was depression. Our research programme literally started with the development of a system to diagnose depression in an operationalised, standardised, and multi-axial manner.3 A number of syndromes were described, namely vital depression, personal depression, and mixed forms. Vital depression is comparable to the syndrome of endogenous depression in the Anglo-Saxon literature, and personal depression to the syndrome described under the heading of neurotic depression. Furthermore, it was proposed to characterise depression along five axes: symptomatology; severity; duration; course, and premorbid personality structure. It was explicitly stipulated that those axes had to be assessed and scored independently of each other. This was because no mutual correlative relationships had been demonstrated (nor have they been demonstrated to this day) and the predictive value of those variables appeared to be negligible. A telling example is provided by the syndrome of vital depression. It appeared to be aetiologically non-specific, of variable severity, course, and outcome, and responded unpredictably to both psychopharmacological and psychological interventions, while premorbid personality structure varied from normal to definitely pathological. In other words, though the syndrome was definable, it could by no means be considered to represent a specific “disorder” or morbid “category”.
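The five-axis scheme, with each axis scored independently of the others, can be pictured as a simple record type. The field names and types below are illustrative assumptions, not the original rating instrument:

```python
from dataclasses import dataclass
from enum import Enum

class Syndrome(Enum):
    # The syndromes named in the text: "vital" corresponds roughly to
    # endogenous depression, "personal" to neurotic depression.
    VITAL = "vital depression"
    PERSONAL = "personal depression"
    MIXED = "mixed form"

@dataclass
class DepressionDiagnosis:
    """Five axes, each assessed and scored independently, as the text
    stipulates: no field is inferred from any other. Types and value
    ranges are illustrative, not those of the published system."""
    symptomatology: Syndrome
    severity: int               # e.g. a rating-scale total
    duration_weeks: int
    course: str                 # e.g. "single episode", "recurrent"
    premorbid_personality: str  # from "normal" to "pathological"
```

The point of the structure is negative as much as positive: because the axes showed no mutual correlations, a record must carry all five scores rather than collapsing them into a single disease label.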
Twenty years later, DSM-III introduced standardised and operationalised diagnosis for the whole of psychiatry. The principle of independent scoring of the various axes was not, however, adopted. As I have said already, I believe this has to be considered a mistake. In addition, an attempt was made to develop an interview that was standardised and operationalised in order to recognise and register depression in a reproducible manner. This resulted in the Vital Syndrome Interview, the first standardised, operationalised interview to be introduced in psychiatry.11
With these tools, we began studies into the biological and psychological effects of antidepressants in depression and into the neurobiological determinants of mood disorders.
Making biological psychiatry respectable
For many years psychology and biology were in opposition. I started residency training at the time that psychology predominated. I considered this polarisation to be fruitless and logically untenable. I proposed an alternative, namely to introduce into psychiatry the concepts of aetiology and pathogenesis, as they were used in the rest of medicine.11 “Pathogenesis” refers to the complex of cerebral dysfunctions thought to underlie the psychiatric condition, and “aetiology” refers to all influences—biological or psychological in nature, hereditary or environmental—that have contributed to cerebral dysfunction. Diagnosis entails the assessment of pathophysiological variables (to the extent they are already discernible) and of aetiological factors (be they of a biological or, very frequently, also of a psychological nature).
Research can be directed toward either pathogenesis or aetiology; the latter having a particular interest in either psychogenesis or biological factors. Treatment should be fundamentally twofold (to the extent that adequate methods are available): psychotropic drugs to normalise disturbed brain functions underlying the psychiatric condition, and psychological interventions geared towards easing existing suffering and strengthening an individual’s resistance—coping—with regard to psychotraumatic events that might upset cerebral function.11 We maintained that the biological and psychological viewpoints are complementary and not at all contradictory.
We chose mood disorders as the central theme of this research programme. The biological component posed two questions: the first was whether the therapeutic actions of antidepressants are associated with changes in mono-aminergic systems; and the second was whether mono-aminergic disturbances play a role in the pathophysiology of depression, particularly in those subtypes that tend to respond favourably to antidepressants that impact on mono-aminergic systems. Several observations supported the first assumption, and positive indications were also found in support of the second, particularly with regard to neuronal systems requiring serotonin as a neurotransmitter.12 Several observations suggested that the functioning of these nerve cells can be impaired. These observations have contributed to the development of a new group of antidepressants, the so called selective serotonin reuptake inhibitors (SSRIs), compounds that increase the efficiency of serotonergic nerve cells.
Moreover these studies showed that biological research in psychiatry is not an esoteric pastime but an exercise that can generate data with practical implications.
Development of a functionally oriented psychiatry
The diagnostic constructs proposed in the third edition of the DSM and in subsequent editions are in many respects heterogeneous and for the most part insufficiently validated. This makes them a hazardous starting point for scientific endeavours.10 This allegation is serious, but hard to refute. It has generally been settled by ignoring it. On the other hand, it would have been irresponsible and quixotic of us if we had suggested simply throwing the system overboard—after all it represents the lexicon of the language modern psychiatrists speak. Instead we proposed upholding the DSM’s categorical constructs as diagnostic starting points, but refining the diagnostic process by adding a number of diagnostic steps.13
The first step is conscientious differentiation among, and characterisation of, distinct syndromes. Unfortunately, however, syndromes frequently appear in incomplete form and often the patient presents (parts of) more than one syndrome. Hence the next step should be a precise stocktaking of the symptoms which make up the psychopathological condition in question. Psychopathological symptoms are really effigies—effigies of underlying disturbances in the psychological regulatory and control systems. Psychopathological symptoms are the way those dysfunctions are perceived by the observer and experienced by the patient. For instance, hearing voices is a symptom; a particular perceptual disturbance is the underlying dysfunction. Psychological dysfunctions are the basic units of psychopathology.
The next step in the diagnostic process, therefore, should be analysis and assessment of the psychological dysfunctions underlying psychopathological symptoms. This procedure will ensure a detailed map of those components of the psychological apparatus whose functioning is impaired. An additional advantage of this approach is that psychological dysfunctions are much more readily measurable than are psychopathological symptoms, entire syndromes or nosological entities. We have called this strategy the “functionalisation” of psychiatric diagnosis. Once fully developed, it will put psychiatric diagnosis on a firm scientific footing.11,14
Reconquest of the domain of subjective pathology
A considerable part of psychopathology does not manifest itself in observable behaviour, nor is it communicated verbally in a direct and unequivocal manner. It declares itself exclusively or predominantly in the subjective realm, a terrain which experimental psychiatry tends to shun. Its attributes are hard to ascertain and measure and hence seem, scientifically speaking, “weak” and thus uninteresting. This, however, is a fallacy. The world of subjective experiences is an integral branch of human existence and, in consequence, of psychiatry. The same is true for the “life history” as experienced by the subject in question. It is of course important to know what has actually happened in a person’s life, but we also want to know how those events disturbed his life. We cannot simply disregard important domains of psychopathology. We have to develop methods of charting these subjective domains reliably and reproducibly, projecting factual life history and experiential life history on top of each other in the search for meaning. Comparing this image with that obtained from “average citizens” under comparable circumstances will yield important data on the psychological vulnerabilities and endurance of a given patient.
The reconquest of the subjective domain (I speak of reconquest because it has been the pre-eminent sphere of work of psychoanalysts and phenomenologists) will be difficult, but by no means impossible.15 As a first step we have proposed the reintroduction of the free interview as a research tool, through detailed definition and explanation of the questions that have to be answered after the interview. If several assessors evaluate the interview it seems that quite reliable judgments about subjective phenomena can be made.
A second method for evaluating subjective phenomena, in which my department in Maastricht in particular has been heavily involved, is the so called “experience sampling method”. The subject is asked to fill out diaries containing structured questions about mood states, disturbing thoughts, and emotion laden events, encounters, and situations. Entries in the diaries are to be made several times per day, at random time points, and over several days per week. In this way “on line” information is gathered on mood states, the way they fluctuate, on precipitating factors, if any, and on possibly disturbing thoughts.16–20
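The sampling scheme just described (diary entries several times per day, at random moments, over several days) can be sketched as follows. The function name and all parameter values are illustrative assumptions, not the protocol of the studies cited:

```python
import random
from datetime import datetime, time, timedelta

def sample_schedule(start_date, n_days=6, beeps_per_day=10,
                    day_start=time(7, 30), day_end=time(22, 30),
                    seed=None):
    """Draw random diary-entry times ("beeps") for each study day.

    A sketch of the experience sampling idea: entries are requested
    at unpredictable moments within waking hours, so that reports of
    mood states are not shaped by anticipation. All defaults here are
    illustrative assumptions.
    """
    rng = random.Random(seed)
    start = datetime.combine(start_date, day_start)
    # Length of the daily sampling window, in seconds.
    window = (datetime.combine(start_date, day_end) - start).total_seconds()
    schedule = []
    for day in range(n_days):
        offsets = sorted(rng.uniform(0, window) for _ in range(beeps_per_day))
        day_base = start + timedelta(days=day)
        schedule.append([day_base + timedelta(seconds=s) for s in offsets])
    return schedule
```

Randomising the moments of measurement, rather than fixing them, is what yields the “on line” picture of mood fluctuation the text describes.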
The experiential world can be penetrated by experiments; these experiments should be undertaken and the experiential world explored, or experimental psychiatry will wither.
So my professional life has revolved round the borderlands between body and soul. This has never tempted me to take the position that after all the soul is nothing more than a product of brain activity. This type of reasoning bespeaks a primitive kind of materialism. Of course, brain activity is a necessary precondition for the existence of the soul, but the soul is so much more; just as a portrait by Rembrandt is so much more than a complex of well definable and distinguishable strokes of paint. The soul, I never ceased to believe, is a unique gem; dependent on a functioning brain, certainly, but made of a singular “substance” and of a singular identity, and on whose structure and orientation neurobiology will remain silent. Orientation means in this context directedness. Spiritual life is not enacted at random. It has targets that determine how we will organise our lives, even if those targets appear to be beyond our reach. Such targets act as spiritual anchorpoints and give meaning to an existence that requires more than the gratification of bodily and material needs.
What have been the spiritual anchorpoints of my life? They can be reduced to a common denominator, Judaism.21 That, at least, is the way I have sensed it. In fact, I should not say simply “Judaism”; the common denominator is my Judaism: Judaism as it has been experienced and interpreted by me. This sounds somewhat presumptuous, but it is not meant that way. Judaism permits such private interpretations. It is the least theological of the monotheistic religions, in the sense that it has always been averse to formulating a creed, a catechism, a set of dogmas in which one has to believe in order to be considered as faithful. It is precisely because Judaism has not formulated an official and canonised theology that so many Jewish traditions can exist peacefully side by side. Up to a point the Jew is free to formulate his own profession of faith.
Well then, the following is an attempt to do just that. The “articles of faith” will be rendered in broad strokes and, just like the professional anchorpoints, point by point, in telegraphese and simplified. Somewhat artificially I will distinguish general and specific “articles”. The general ones concern an attitude to life, the specific ones the pursuit of life.
The general “articles”—an attitude to life
The idea of a unique Supreme Being, which was introduced and elaborated by the Hebrews and preserved by them and their descendants in a pure form, appeals to me. It is the idea of a steering, driving force; in essence unknowable, indefinable, impossible to depict, indivisible; yet recognisable to those who are open to it; who have in other words developed a sixth sense which makes them sensitive and responsive to metaphysical experiences. In the Hebrew Bible the concept of God does not figure as an abstraction. The descriptions are anthropomorphic and refer to recognisable human archetypes—for example, the judge, the ruler, the father figure, who can be well pleased, cross, caring, or vengeful. Only much later, in the Middle Ages, is the concept of God framed in such abstract terms as the Omnipresent, the Supreme Being, the Infinite Principle. Its abstract configuration represents for me the essential notion of the concept of God.
I am not a determinist, one who disparages free will and believes that everything in life is predestined. Neither do I believe that it is only chance that is steering our lives. I hold the view that human beings themselves are responsible for the way they manage their lives, at least in large measure. I take this responsibility, however, not to be unlimited. I recognise, or rather assume, that human beings certainly “steer” their lives, but they are steered as well. I am not now alluding to unconscious forces but to Faculties that exceed human faculties of cognition—to be on the safe side, I mark them with a capital letter. For me a hypothesis such as this meets a need, both intellectually and emotionally. Without this assumption I would apprehend life as too prosaic: literally, too down to earth.
A second element in Judaism which is important to me is that it leaves its members free to think according to their conscience. It thus offers a surprising intellectual ambience. It knows, as said, little of a formalised theology in the sense of a set of doctrines which have to be accepted without doubts or criticisms in order to be accepted in the religious community to which one aspires to belong. One formidable exception of course is the central belief of Judaism that God is one; an indivisible Unity. The biblical texts themselves are immutable, but their interpretation allows maximal freedom of thought. Jews are permitted, even encouraged, to think about what God intends to communicate to mankind. Judaism sets high standards as to the way life should be lived. Judaism allows men a large measure of latitude, however, in terms of their thoughts. This combination provides an intellectual residence in which I can feel at home.
It will come as no surprise that the inhabitants of that residence are attracted to words, ideas, concepts. This preference goes hand in hand with a dislike of dogma and a desire for exegesis, interpretation, and dialectics. Such an attitude generates my third “article”. Nothing is definitely established; everything is open for interpretation and discussion, even God’s word. Moses, Abraham, Job, and in a certain sense the writer of Ecclesiastes, are at issue with God, challenging Him, arguing with Him. Jacob fought against God, or an angel of God, and was from then on allowed to call himself Israel: “He who fought with God”.22 This event has been interpreted by some as Jacob’s fight with his lower appetites. An interpretation that appeals more to me, however, is that Jacob fought God over an issue unknown to us, and that God was prepared to admit having been wrong. My final example: God entered into a covenant with the Jewish people. It is binding upon both parties. In this way Abraham could reprove God when he was about to destroy Sodom and Gomorrah. “Shall not the Judge of all the earth deal justly?” (The Holy Bible,22 Genesis 18:25). The covenant permits man to call God to account.
The Talmud is symbolic of this dialectic, of the flexibility and pliancy of the human mind. It became a sanctuary during 2000 years of persecution, oppression, and discrimination. Study of the Holy Books became a “freeport of the spirit” within which a life that otherwise would have been drab and without perspective brightened up and maintained hope.
I admit that this third “article” renders a somewhat idealised image of the spiritual climate within Judaism. It is hard to deny that in the past 150–200 years a measure of intellectual stiffening has crept into Judaism, at least in (some) Orthodox circles. I consider this trend to be foreign to Judaism, to “my” Judaism, Judaism as I savour it.
A fourth core attitude which I have encountered in Judaic culture, and which has attracted me to such a degree that it too has become a governing “article” of my life, is togetherness—alliance with the group to which one feels oneself to belong; an attachment to a common culture and value system; a readiness to keep fostering those spiritual values even if such a decision brings danger with it—traits that are strongly ingrained in the Jewish people. Such was the case in biblical times when the group had to hold its own within dominant cultures foreign to its nature. It held true for the 2000 years of the Diaspora, when for most of the time the Jews lived in communities inimical to them. It holds true in our own day, when the risk of total assimilation is greater than ever.
I treasure the security of, and atmosphere in, a home with ethnic, cultural, and historic foundations; to put it more strongly, I need such shelter. In the course of the years the home of the Jews has been besieged many times. The occupants never surrendered. They rejected time after time the facile way of total assimilation and abjuring their heritage, in spite of the colossal pressure to do so. This can truly be called strength of mind. I am proud to belong to this people.
The specific “articles”—the pursuit of life
Let me continue to discuss some of the special elements that have shaped my way of life, or become its guidelines. They, too, are borrowed from the world of Jewish ideas and sentiments. The reader may possibly think them rather trivial; and so they are, up to a point. Everyday life is largely guided by practical considerations, not by high-sounding incantations.
As a first point of orientation I mention messianism, an idea that arose from Judaism. I am not alluding to messianism in its eschatological, utopian form (concerning the return of the Jewish people to the Promised Land and the restoration of Creation prior to the Fall of Adam and Eve). Rather I am alluding to the messianic idea in its more unobtrusive, evolutionary sense. This is the expectation that this world and its inhabitants, in spite of all their misery, injustice, violence, and cruelty will evolve toward a better future. Indissolubly connected with this vision of the future is the idea that such an evolution will not come of itself; rather, mankind has to make concerted efforts to usher in the messianic epoch. Our deeds are what bring it about. Mending our disfigured world requires the utmost dedication of every Jew, of every human being. “Walk before me and you shall be complete” (The Holy Bible,22 Genesis 17:1). Purification is a process initiated by men, not by God. Human beings are conceived as God’s stewards.
For me, this idea holds a penetrating appeal.
The same holds for the forceful emphasis the Hebrew Bible puts on the here and now. Ultimately we will be judged by what we do and have done, not by what we believe. Humans are held responsible for what they do and for what they do not do. They are free to choose in the direction of right or of wrong, and will be judged according to those decisions. The Jewish religion is directed toward this world, much more than to a world to come. The ideas of an afterlife, of an immortal soul, of resurrection after death occur infrequently in the Hebrew Bible. It was only much later, at the time of the Pharisees, that those tenets were introduced into Jewish thought. The revelation on Mount Sinai provided a blueprint of a social order, a set of rules which laid out how human beings should behave toward each other, here, on this planet.
The Jewish people are summoned to be a holy people. This call bears not on piety or passive subjection to divine laws; it bears on actions, on good deeds, on an internalised system of standards and values, on an attitude to life that ultimately may lead a subject to grow into a Zaddik, a righteous man (or woman). I would prefer to qualify this by referring to an “honourable man”. The righteous one, the homo justus, follows worldly rules. Worldly laws are by definition time bound. The honourable man, the homo pius, follows moral laws. Moral laws are, if not timeless, much less prone to culturally determined oscillations. The Zaddik for me is a homo pius, much more than a homo justus.
The Zaddik is a key character, an archetype, in the Jewish concept of Man. He—or she—is no saint in the Christian sense. He does not rise above himself. On the contrary, he turns inwardly, searching for his best mental attributes, however modest those essentially may be. He tries to arrange his life as if those attributes constitute his authentic self.
The honourable man tries to live an honourable life not in order to earn a place in heaven but because he sees it as a duty to try seriously to regard others as his equals and to treat them accordingly. He does not need to do much more than this, though it would of course be desirable; but to do much less is insufficient.
A last “article” I want to mention is the central position the family occupies in the Jewish way of life. The family, not the individual, is seen as the very foundation of our society. The family is regarded as the vehicle that keeps the standards and values of the group alive by truly living them and, through example, by transferring them to the next generation. Continuity, the passing on of the spiritual and cultural heritage from one generation to the next, is an essential tenet of Jewish life. Esau, I take it, lost his birthright not because of unbearable pangs of hunger but because the privilege of birthright seemed immaterial to him. After all, we all die, he notes, “and what profit shall the birthright do to me” (The Holy Bible,22 Genesis 25:32). With this declaration he neglects the tenet of continuity and defies the binding covenant which God had drawn up for the Jewish people: “And I will make of thee a great nation, and I will bless thee, and make thy name great; and be thou a blessing” (The Holy Bible,22 Genesis 12:2).
The centrality of the family is an unfashionable tenet today, sounding waggish and conservative. But for me it is still a modern concept; progressive, in that it makes for a better society. A well structured family can be compared to a work of art: something which can never be considered finished, and which has to be worked on continuously, cultivated and perfected so that it will remain a source of buoyancy and will provide its members with the measure of satisfaction its founders hoped for at the time they entered into the partnership. A family is indeed never finished. Cultivating it reaches far beyond the limits of the family and will contribute to the repair of a distraught world.
All this may sound like an edifying tractatus—“edifying” sounding in some languages rancid, stale, discordant. The same is true of the word decency—for instance, in the Dutch language. This has not always been the case; the behaviours covered by those terms used to be key constituents of Dutch culture. Their cultural downgrading went hand in hand with the “modernisation” of Dutch society in the 1970s, whereby it changed from a strictly regulated, religious community into a (hyper)permissive one. In other cultures and in other languages this transformation was much less obvious. In English, for example, words such as “devotion” and “decency” are still respectable and acceptable. Obviously, the discord arises not in the concepts themselves but in the ears of the hearer, and as such is a culture bound transformation that does those cultures no credit.
Can I be considered a religious, or even devout, person? Is my outlook on life religiously based? Yes and no. No, in the sense that the concept of God, for me, is in the nature of a myth. Thus far I concur with the views of Freud. Yes, in that for me the concept “myth” has an entirely different meaning from the one it had for Freud and (many of) his followers. Myths for me are much more than non-committal human narratives. In my perception myths express a reality that is hard to put into words but which has considerable meaning for a particular individual or group of individuals and meets certain of their needs.
A myth may concern the needs of a group: national myths are an example. For instance, after his untimely death, Kennedy became for Americans (and not only for them) a symbol of the ideal leader: intelligent, enlightened, involved, impartial, creative, trustworthy, charismatic, and progressive. We now know that this image is open to criticism.23 Yet it persists; apparently it meets a deeply rooted need. Rabin became for Israel and much of the Western world a hero of peacemaking. This view is in part mythical. He was above all a war hero. It was only with difficulty that he was persuaded to walk in the way of peace.24 After the second world war, the Netherlands gladly accepted the image of having been a nation that bravely and with self sacrifice dedicated itself to rescuing its fugitive Jewish compatriots. With all due deference to the brave people involved, this self image is for the greater part mythical. There was also large scale collaboration with the Germans. Only in Poland was the percentage of Jews who survived the second world war smaller.
Myths may also serve personal needs. Religious myths are a telling example. They confer upon a life a sense of purpose, imbuing it with meaning, tilting it, so to speak, from a horizontal plane towards a more vertical one. I follow in this the line of thought of the Jewish theologian and philosopher, Neil Gillman.25,26 Myths, he states, are attempts to provide a measure of sense to events and situations that may seem at first rather meaningless. They consist of symbols which are systematised in a certain way. The symbols represent a reality hiding itself from rational explanation: a reality beyond the reach of the immediately perceptible and scientifically provable. Religious myths thus provide answers to such existential questions as: where do we come from; where are we going; why are we here; why does the world around us show a certain order; and why do so many of the processes we observe or deal with show goal directedness and progress with predictable regularity? Religious myths procure a measure of sense, a degree of anchorage, for a seemingly senseless human existence. In the absence of meaning, little more remains than the hedonic gratification of needs.
Gillman compares religious myths with the beams supporting a house. Though they are invisible on the outside, they are indispensable to keep the house erect, to give it durability and stability. Religious myths are beyond experimental proof, but at least for some people, including myself, they possess what I call “face validity”. Subjectively they are experienced as true, since they evoke reactions and allow interpretations that are sensed intuitively as adequate and significant.
In an objective sense religious myths are neither true nor false. Rather, they are important, in proportion to their ability to imbue someone’s life with a sense of meaning. We do not invent God, Gillman says, we discover Him. He presents in this context a striking analogy. Freud conceptualised the idea of a personality centre, called the ego. He gave a name to a common experience. Via a scale of relational experiences and observed behaviours we perceive our fellow men as stable entities, each of them unique and easily distinguishable one from the other. That uniqueness is framed in the word ego. Freud did not invent the ego, he discovered it and gave it a name.
PIETY AND SCIENCE
Piety and science do not exist independently. I cherish the religious myth because I realise that the empirical methods employed in science can illuminate only part of the world through which we journey, in terms that are rationally satisfactory. Those considerations constitute the cognitive root of my religious attitude towards life.
Its emotional root is fashioned by two components: amazement and admiration. Heschel used the phrase “radical amazement”, alluding to “amazement not only at one facet of the world but in the startling fact that there are facets at all”.27 But I do not subscribe to that view. My amazement and admiration concern particular realities. Let me mention a few of these:
- amazement and admiration for a control system, encompassed in the brain, consisting of billions of nerve cells and their innumerable mutual junctions, a system that in spite of this unimaginable complexity is able to act as a well organised unity and only rarely falls into chaos;
- amazement and admiration for the faultless goal directedness by which, in most cases, a fertilised egg will develop into a viable individual;
- amazement and admiration for the wealth of forms in which nature has shaped its creatures, and for which Darwin’s theory of evolution and its offshoots provide, at least in my understanding, only a partial explanation;
- amazement and admiration for the creative potential with which humankind is blessed.
Those are a few of the many experiences that awe me, intellectually speaking, into silence. Just as a painting without a frame impresses one as being unfinished, so these emotions, amazement and admiration, need a framework. They need to be fitted within the pattern of values and convictions that direct someone’s life. The religious myth is, for me at least, well suited to this purpose. It is a framework, moreover, so richly ornamented that it moves me also aesthetically to the heart (or, more properly, to the soul).
The family, scientific work, and Judaism have structured my life, provided it with meaning, and preserved its alignment. Its direction has been predominantly toward the future, but with a strong allegiance to the past.
This configuration has made me into a contented and thankful man: thankful to my family, thankful to the inexhaustible riches of my profession, covering a territory reaching from molecular neurobiology to domains belonging to philosophy and theology, and thankful, finally, to the divine myth, the most noble of myths humanity knows, and, I presume, ever will know.
The “Odyssey” in this issue is the first in a new series of occasional features in Medical Humanities, in which practitioners at or near retirement can reflect on their careers and on the influences which encouraged—or, perhaps, challenged—humanistic practice as they see it. Intending contributors are invited to send manuscripts to the Editors.