Borderlands: a historian's perspective on medical humanities in the US and the UK
Harold J Cook
Brown University, Providence, Rhode Island, USA
Correspondence to: Harold J Cook, History Department, Box N, Brown University, 79 Brown Street, Providence, Rhode Island 02912, USA; halcook7@mac.com


Medical Humanities has developed as an important pedagogical ideal, although it is not a research field, at least not yet. Anyone with clinical, administrative, or personal responsibilities for other people is well aware of a duty of care requiring judgement. Judgement is an attribute applying as much to values as to knowledge, an ability to take appropriate decisions attuned to circumstances of person, context and moment. What one knows and has experienced provides a necessary foundation for deciding a course of action. And when patterns repeat themselves again and again, judgements can become routine. But because judgement is in the end about what best course of action to follow on particular occasions for particular people, general rules and universal truths can only ever be important aids, never determinative answers about how to act. Put another way, judgement is not about drawing conclusions but deciding what to do. The duty of care requires the practitioner to decide not what is true but what is best, taking the views of those under care into consideration, too.1 The complexities and ambiguities of the human condition, then, inform medicine profoundly, and exploring them often goes by the term ‘medical humanities’.

Given that the subject is aimed at forming good judgement rather than informing the content of any particular field of knowledge, the medical humanities can have no unified research programme. Its advocates share an understandable concern about the potential narrowness of medical education and training, which is a special problem for young men and women who have not otherwise been exposed to a breadth of human experience. While this has been felt most keenly in the large university medical schools of the post-war USA, the recent amalgamation of medical schools in Britain and the consequent routinisation of education for the resulting masses have brought attention here, too, to the problems that the medical humanities are meant to address. Once upon a time, small and diverse schools meant variety of education and example, and plenty of personal mentoring. In content-driven and exam-based learning, however, character formation carries less weight among the measures of success. Many deplore such developments and struggle to find alternatives.

In the USA, pressures on students to base their judgements on scientific decision trees rather than a breadth of vision about their patients' circumstances prompted a response by the late 1950s. The Flexner Report of 1910 had signalled the fundamental importance of scientific competence in medical education, and major funding initiatives from private and government sources underwrote scientific research, especially after the successes of World War II. But in a land where the ‘good bedside manner’ continued to have personal appeal and financial reward, voices were also raised about how the great focus on science caused students to undervalue other aspects of medicine, in turn leading to deficits in clinical skills. In 1957, therefore, the medical school at Case Western Reserve introduced reforms meant to give more attention to patients, and other schools followed. Given the apparent existence of two cultures, so successfully articulated by CP Snow, these programmes often earned the epithet of ‘medical humanities’.

But while the field may have been founded with a common concern for educational breadth, when academic definitions of research are applied, as they must be in universities, it resolved into three subject areas: history, ethics and literature, with the fine arts as a supplementary interest. By 1957, the history of medicine already existed at many medical schools. Earlier in the century, at some of the most scientifically advanced medical schools in Canada and the USA, the William Oslers of the world had founded great rare book libraries for the use of faculty and students, where they could imbibe the rich heritage acquired by doctors over two and a half millennia and study forgotten cases. In the clinic they asked questions about the discoverers of eponymous diseases, and during leisure hours they participated in reading and dining clubs that passed on a sense of a great tradition. A formal American Association for the History of Medicine had been founded in 1925 to foster the subject, and, based on the German example, a formal Institute for the History of Medicine was founded at Johns Hopkins in 1929, with its Bulletin becoming the chief research journal in English. At most medical schools, the history of medicine seemed more worthwhile for the informal broadening of lecturers and students than for encouraging research, but the topic was widely cultivated.2

By the later 1960s, history was joined by a new field called biomedical ethics. It arose from new questions about ‘rationing’ technoscientific treatments. While the origin of the field is sometimes associated with the revival of professional declarations of ethical principles following the Nuremberg Trials, codes of moral conduct had long been essential components of professional self-regulation. More transformative were the political needs for collective and institutional decision making. Particularly significant were the decisions of a committee in Seattle that had to ration chronic haemodialysis treatment (introduced in 1961), and an ad hoc committee at Harvard that in 1968 published criteria for defining irreversible coma, both of which were widely debated in the press. More generally, as patients increasingly litigated against their doctors, the principle of informed consent emerged as a legal right (beginning in 1972 in the District of Columbia). Such developments, coupled with the scandal that closed the Tuskegee syphilis study (also in 1972), moved the US Congress to pass the National Research Act of 1974, which mandated institutional review boards (IRBs) to examine and approve all human subjects research in applications for government research funding. While many practitioners continued to prefer to think of medical ethics as best taught through informal mentoring, the requirement to institute IRBs and other such bodies compelled most medical schools in the US to give the subject formal attention, although not always formal instruction. The professionalisation of this area of life sciences research oversight, sometimes by bringing philosophers into the medical schools, led to the coining of the term ‘biomedical ethics’.

Additional fields of study related to the medical humanities also appeared. For instance, at The Institute for the Medical Humanities of the University of Texas Medical Branch at Galveston (founded in 1973), the subject included work in medical history and ethics, and literature as well. The study of literature and medicine came to emphasise the importance of narrative in medical cases. Narrative, it turns out, is one of the most basic forms of human communication, offering description and explanation, so that being attentive to how it works in medicine and science can illuminate a great deal. A journal, Literature and Medicine, began publication in 1982, sponsored by the Institute in Galveston. In more recent years, medicine and art has also been added to the medical humanities, partly as a pedagogical tool and partly as a therapeutic agent.

It should be noted, however, that the shortcomings of education based on science alone were not completely resolved by reflections on humanities in medicine. Many US medical schools also began to add elements of the social sciences to medical teaching, as represented by the Department of Social Medicine and Health Policy begun at Harvard in 1980. By the late 1980s, many medical schools also showed a widespread interest in adopting what was often called a ‘doctor, patient and society’ course, an eclectic mix of topics emphasising the social basis of medical problems, the need for better skills in communicating with patients, and the breaking down of social, ethnic, gender and racial prejudices. At most medical schools, such as the University of Wisconsin–Madison, where I served a period as Chair of the curriculum committee, successful efforts were also made to get the mainly suburban, middle class students out into rural and inner city clinics, and even impoverished overseas communities. Equally importantly, from the 1980s, increasing efforts were being made in many medical schools to recruit a proportion of students who had education or life experience in non-bench science areas, even from the arts, which helped to change the culture of the student body.

In every medical school, however, pressures on curriculum time produced opposition to the formal teaching of such subjects: from faculty in scientific and clinical fields who had too little time in the curriculum to convey all they thought necessary, and from students who had no wish to spend their efforts on ‘soft’ subjects when excelling in the ‘hard’ sciences offered a clearer path to high exam scores and so to professional advancement. Since the main aim of the curricular reforms was encouraging breadth of interest rather than acquisition of research skills, light-touch elective courses and courses taught by doctors with an interest in the humanities or social sciences tended to be the national norm. Not all medical schools made, or could afford to make, academic appointments in the medical humanities and social sciences, and fewer still made a commitment to recruiting the critical mass necessary for carrying on research. If more and better mentoring rather than the acquisition of academic knowledge was the underlying raison d'être for curriculum change, it was understandable that those with medical backgrounds rather than specialised subject knowledge took first place. It was therefore often difficult for academics in the medical humanities to find space in the curriculum, or if they did, to find many students who took the subject content seriously.

Despite 50 years of effort, then, the well intentioned efforts to encourage the medical humanities in medical schools have met with far from universal success. In places and moments where truly interdisciplinary conversations flourish because of especially good personal relationships, the diversity of the field and its practitioners is a strength. But the lack of academic coherence has also made it difficult to find consensus about the field's goals or content, meaning that most of its practitioners have stronger loyalties to their own disciplines and to colleagues in related research departments than to the still-inchoate medical humanities.

But perhaps this is changing. Now that medical schools in the UK are facing many of the same problems of factory-style education, while government is seeking to move patients into the role of consumers,3 the same concerns are present that led US medical schools to think carefully about how they can add other values to those of acquiring the kind of knowledge tested by examination. These concerns have recently led the BMJ group to create this journal, and have drawn the attention of the huge charity, The Wellcome Trust, which has recently funded two Centres for Medical Humanities at Durham and King's and has begun to transform its History of Medicine funding programme into one for Medical History and Humanities. Whether funding for research in the subject will finally transform a pedagogical ideal into a coherent academic field remains to be seen. Trying to make an academic field of education itself, for instance, has not been an unqualified success. The risk is that by shifting funds from a research field in which the Trust has supported global excellence for 50 years into one that is meant to create better doctors, there might be a fizzling away of excellence in one field without the creation of new sparkle in the other. Much depends on whether the new leaders of the movement in Britain and their postgraduate students, backed by Wellcome funding and spurred by curricular crises, can find their way to defining a research subject that will be of clear and evident importance. It will be no easy task. But improving the quality of medical practice and practitioners certainly requires attention to the proper formation of judgement as well as the inculcation of knowledge. One can only wish them well.

Footnotes

  • Funding Wellcome Trust.

  • Competing interests None.

  • Provenance and peer review Commissioned; not externally peer reviewed.
