It is argued that the phrase “Necessary Inhumanity” describes the alienation required of doctors in some circumstances more accurately than do modern sanitised coinages such as ‘clinical detachment’. ‘Detachment’ and ‘objectivity’ imply separation rather than engagement, creating distance not only from patients but from the self. The process may well be required, but where it becomes too extreme or prolonged it can damage everybody: patients, family members, doctors themselves, and wider society. An awareness of the history of health care in the context of our society might assist self-reflection; it might help keep initiates in touch with the culture they have been induced to leave, and might help them remain humane despite the bruising process of training.
- medical culture
- presumed consent
The title of this paper is taken from the teaching of the eighteenth century surgeon-anatomist, William Hunter, who urged his students to gain “a Necessary Inhumanity” by dissecting the dead.1 Hunter knew that trainee doctors could not be too tender, that this inhumanity would stand his students in good stead in dealing with the surgery of the day, which—in the days before anaesthesia, antisepsis or transfusion—needed to be not just accurate but fast if it was to be successful.
Nowadays, we call this necessary inhumanity “clinical detachment” or something similar which sounds less emotive, more scientific. But in a sense Hunter's words are more honest. They help clarify what he was actually urging—inhumanity—but only to a necessary degree. The phrase has more precision, a suggestion of calibration, even a hint of warning, which clinical “detachment” and “objectivity” lack.
Whatever we call it, it clearly has value even today: it can be protective for both patient and doctor, and it allows each generation to learn how to examine, diagnose, treat, operate, and verify diagnoses after death. But there are also dangers. A doctor currently practising in the UK recollects:
“Dissecting earthworms in biology was no preparation. … One of the students was unable to sit through the introductory lecture, which was about scalpels and forceps, and fat and fascia, because of the thought of dissecting. And the first week that we were in the dissecting room he spent throwing up in the loo. At the end of the first week he blew his brains out with a shotgun.”
“ … Dissecting [is] a strange way to be introduced to patients … we start with a pickled patient. This curious introduction resulted in such misbehaviour as games of cricket played with human arms and large blood clots. Even shy and gentle me looked down one day to see that I was swinging a human head nonchalantly by its windpipe.”
This passage conveys a painful understanding of the impact of conventional medical training. This doctor is still haunted by it. He reveals the astonishment he felt when he realised that against all the odds, he had somehow acquired a detachment which apparently extended to his own arm. This doctor suffered a sort of existential unease at what he'd been forced to undergo, and took a conscious decision to deal with it at an early stage:
“I found myself looking at the body as a wonderful machine, but not as a creature with a soul—that worried me a bit. What in fact I had to do was consciously unlearn that sort of thing, and start to look at human beings as human beings.” (Personal communication: my informant currently wishes to remain anonymous.)
He speaks very simply, and I think rather downplays the importance of what he's saying, but what this doctor is describing is fundamental to humane medicine.
Lessons from the past
Three episodes from the past, each of which has implications for the present day, illustrate the potential for inhumanity in clinical detachment.
I: DRUGS BUDGETS
To control the national drugs budget, Poor Law contracts of employment ruled that all medicines dispensed to the sick in workhouses were to be paid for out of the doctor's salary.
In the midsummer of 1872 Dr Joseph Rogers was appointed medical officer at the overcrowded Westminster workhouse. He was conducted around the building by a Mr French, who'd held the post for the previous forty years. In the course of this tour, Rogers discovered that the conditions inside the workhouse were atrocious, worse than anything described by Dickens, and Mr French laughingly confided the trade secret that he pocketed all his salary by means of the simple expedient of giving no physic. All patients—whether in mild, severe or even mortal pain—were prescribed coloured peppermint-water.2
The method of remuneration appealed to, and benefited, the worst motives of unscrupulous doctors, while penalising well-motivated and benevolent ones. Rogers perceived it as doubly pernicious: it corrupted and brutalised doctors, while exacerbating the suffering of patients in these huge old workhouse infirmaries, congested with sick and dying people.
So what do Mr French and Dr Rogers have to do with us? With the help of the Lancet, Rogers campaigned against the gross abuse of penalising workhouse doctors for prescribing proper treatment. The Poor Law administration was eventually shamed into establishing a system of capitation payment for salary, with a separate dedicated drugs budget. These remain the basis of remuneration for general practitioners in the National Health Service.3
A recent report in Pulse, concerning balancing practice remuneration against prescribing costs, quoted a Chelmsford GP, Dr Anne Dyson:
“We have made as many savings as we can without cutting into patient care. The flesh was cut right back to the bone a long time ago. How can we be unbiased in prescribing when you know it might come out of your pocket?”4
II: PRESUMING UPON CONSENT
Transplantation is often presented as a phenomenon of the twentieth century. In fact it's a development in the much longer history of surgery, and rooted in anatomical exploration. Looking back at that history, with a consciousness of the current shortage of organs for transplant, one cannot help but perceive that problems like those of the past are being played out afresh in our own time.
The surgeon-anatomist John Hunter (brother of William—he of the “Necessary Inhumanity”) performed successful autotransplants on cockerels, moving the spur from a bird's heel to its own head, where it proliferated. It was only a short step to the practical application of such ideas to human subjects. In the 1770s, he recommended the transplantation of teeth, which was rapidly adopted by high-class dentists, first using teeth supplied by grave-robbery, then from living child “donors”.5
The exploitative nature of these operations was evident to contemporaries. The cartoonist Thomas Rowlandson vilified Hunter, and the novel Adventures of a Rupee (1782) revealed the catastrophic long-term effects on already poor children of the removal of healthy second teeth: the dietary impact of being unable to masticate, and permanent damage to facial appearance, resulting in the likely loss of a normal married or working life.6–7
The demise of the practice can be traced not to ethical questionability, but to clinical failure—decomposition of the tooth invariably followed after a time. In the Medical Transactions of The Royal College of Physicians in 1785, Sir William Watson reported a fatal case: syphilis had been transmitted to a recipient in an infected tooth.8 Hunter was apparently impervious to lay ethical criticism. He denied clinical failure, and cast doubt on stories of disease spread.
I've occasionally seen the cockerel mentioned as a progenitor of modern transplantation, and Hunter is often called the “Father of Modern Surgery”, but the story of the teeth seems too often (and unaccountably) overlooked.
The problem of obtaining human materials has been largely resolved since the advent of the National Health Service. Donation supplies all UK dissection rooms, blood donors keep the blood banks going, and over eight million British citizens have registered themselves on the UK organ donor register, established less than a decade ago. Obtaining body parts by theft, purchase, coercion or trickery, as in the past, is quite unnecessary, and rightly perceived as unethical. Nevertheless the current policy of the British Medical Association is to promote “presumed consent”, or, taking without asking.
III: SPECIMEN TAKING
By far the great majority of the human specimens in UK medical museums were obtained without consent.
The most spectacular specimen in the Royal College of Surgeons' Museum in London is the skeleton of Charles O'Brien, or Byrne, otherwise known as the Irish Giant, almost eight feet tall. In the 1780s he was a living human exhibit—rather like the Elephant Man. His fear of dissection was so intense that before his death in 1783 he had accumulated a large sum of money (said to be £500) to have his body buried at sea in a lead coffin. His undertaker, however, was heavily bribed (apparently to the same amount) to deliver the corpse instead to John Hunter's dissection rooms. O'Brien's skeleton continues to serve as a monument to the morality of the medical museum, to theft, to medical acquisitiveness, and to a historic injustice.9
It was from the basement of this institution that the artist Anthony Noel Kelly took the body parts which recently landed him in jail for theft. The irony of his prosecution cannot be lost on anyone who contemplates the sources of the college's specimen collection.10
Pathology as a discipline—and its customary manner of specimen-taking, too—has preserved many of the attitudes of its forebears. Many hundreds, possibly thousands of parents in Bristol, Liverpool, Southampton, Leeds, London and elsewhere have suffered profound distress because their children's organs have been “retained” after postmortem examination (and in many cases disposed of) without consent. The attempt to maximise yield by deception has caused terrible anguish and damage to bereaved relatives, and has brought the entire medical profession into disrepute.11
I'm a rationalist. I have no problem with the idea that medical science needs body parts, that bodies need to be dissected, that students need to train, that surgeons need to understand fetal and other abnormalities, that postmortem findings are important. But I also believe that human beings have feelings, and it is the job of a caring profession to respect them. We all also have rights, and one of the most fundamental of human rights is the right to freedom of self determination.
An inhumane attitude of mind has pervaded medical dealings with a too trusting public. The attitude is inhumane because it denies our common humanity. I suspect it may derive from the fact that many doctors learn in the process of becoming doctors to deny aspects of their own humanity.
The humane doctor
As we've seen it's quite possible for doctors to behave inhumanely: Mr French laughed as he described the coloured water he'd meted out to the dying for forty years. John Hunter extracted healthy teeth from the mouths of poor children, and paid the undertaker to deny the Irish Giant his rightful burial.
These doctors were without pity, able to ignore, deny, overlook or even despise their patients' humanity. Their motives were fundamentally acquisitive. Their activities thrived in closed institutions—the workhouse, the anatomy school. In each of these stories, too, government had a hand—failing to protect children, administering a heartless Poor Law, choosing neither to oversee nor adequately to regulate the conduct of the anatomy school, the dentist's surgery, the workhouse. Dissection rooms remain immune to public scrutiny, as do operating theatres, coroners' mortuaries, research laboratories and medical museums.
Each of these stories also has its humane professional, who nudged matters towards change—Dr Rogers, serving his workhouse patients even to the extent of losing his job; Sir William Watson, lifting the lid on a corrupt and highly lucrative surgical intervention by revealing its clinical dangers, and Dr Stephen Bolsin, who blew the whistle at Bristol. In each case, the humane doctor's moral intelligence has been more in touch with public opinion than has the inhumane.
A Necessary Inhumanity?
A “Necessary Inhumanity” is in my view greatly preferable to ‘clinical objectivity’ for describing the necessary distance from the patient which the trainee doctor must attain, in order to become a good clinician. It more honestly and precisely describes an aspect of the doctor/patient relationship. Were we to resurrect the phrase, to return consciously to using it, knowing what we know, and with the science we now have, it might become evident that clinical detachment is not a simple acquisition, but a spectrum of sensibility which can range from extreme cruelty to conscious empathy.
The notion of a “Necessary Inhumanity” could be valuable because the questions it prompts might help serve as an effective calibrator: how necessary in these circumstances? For how long? And with what effect? Resurrecting and knowingly re-embracing the term “inhumanity” now might mean an increased awareness of its dangerous potential, which in turn might mean there'd be less of it about.
The author thanks Brian Hurwitz and Deborah Kirklin for their constructive observations on this piece.
Ruth Richardson, DPhil, FRHistSoc, is a historian, the author of Death, Dissection and the Destitute (Chicago UP, 2000; in press), and Monkton Copeman Lecturer at the Society of Apothecaries. She has written many articles for journals such as the British Medical Journal and the Lancet, and she is a broadcaster of documentary history programmes for BBC Radio 4 and the World Service.