Today, the word “lobotomy” is rarely mentioned. If it is, it’s usually the butt of a joke.
But in the 20th century, the lobotomy became a legitimate, even mainstream, treatment for serious mental illness, such as schizophrenia and severe depression. Physicians even used it to treat chronic or severe pain and backaches. (As you’ll learn below, in some cases there was no compelling reason for the surgery at all.) The lobotomy’s history as a mental health treatment is a surprising one.
A lobotomy wasn’t some primitive procedure of the early 1900s. In fact, an article in Wired magazine states that lobotomies were performed “well into the 1980s” in the “United States, Britain, Scandinavia and several western European countries.”
In 1935, Portuguese neurologist Antonio Egas Moniz performed a brain operation he called a “leucotomy” in a Lisbon hospital. This was the first modern psychosurgery for mental illness; the procedure involved drilling holes in his patient’s skull to access the brain. For this work, Moniz received the Nobel Prize in medicine in 1949.
The idea that mental health could be improved by psychosurgery originated from Swiss neurologist Gottlieb Burckhardt. He operated on six patients with schizophrenia and reported a 50 percent success rate, meaning the patients appeared to calm down. Interestingly, Burckhardt’s colleagues harshly criticized his work at the time.
The Lobotomy in America
In 1936, psychiatrist Walter Freeman and neurosurgeon James Watts performed the first U.S. prefrontal lobotomy on a Kansas housewife. (Freeman renamed the procedure “lobotomy.”)
Freeman believed that an overload of emotions led to mental illness and “that cutting certain nerves in the brain could eliminate excess emotion and stabilize a personality,” according to a National Public Radio article.
He wanted to find a more efficient way to perform the procedure without drilling into a person’s head like Moniz did. So he created the 10-minute transorbital lobotomy (known as the “ice-pick” lobotomy), which was first performed at his Washington, D.C. office on January 17, 1946.
(Freeman would go on to perform about 2,500 lobotomies. Known as a showman, he once performed 25 lobotomies in one day. To shock his audiences, he also liked to insert picks in both eyes simultaneously.)
According to the NPR article, the procedure went as follows:
“As those who watched the procedure described it, a patient would be rendered unconscious by electroshock. Freeman would then take a sharp ice pick-like instrument, insert it above the patient’s eyeball through the orbit of the eye, into the frontal lobes of the brain, moving the instrument back and forth. Then he would do the same thing on the other side of the face.”
Freeman’s ice-pick lobotomy became wildly popular. The main reason is that people were desperate for treatments for serious mental illness. This was a time before antipsychotic medication, and mental asylums were overcrowded, Dr. Elliot Valenstein, author of Great and Desperate Cures, which recounts the history of lobotomies, told NPR.
“There were some very unpleasant results, very tragic results and some excellent results and a lot in between,” he said.
Lobotomies weren’t just for adults either. One of the youngest patients was a 12-year-old boy! NPR interviewed Howard Dully in 2006 at the age of 56. At the time, he was working as a bus driver.
Dully told NPR:
“If you saw me you’d never know I’d had a lobotomy,” Dully says. “The only thing you’d notice is that I’m very tall and weigh about 350 pounds. But I’ve always felt different — wondered if something’s missing from my soul. I have no memory of the operation, and never had the courage to ask my family about it…”
The reason for Dully’s lobotomy? His stepmother, Lou, said Dully was defiant, daydreamed and even objected to going to bed. If this sounds like a typical 12-year-old boy, that’s because he was. According to Dully’s father, Lou took her stepson to several doctors, who said there was nothing wrong with Dully, and he was just “a normal boy.”
But Freeman agreed to perform the lobotomy. You can check out the NPR article for Freeman’s notes on Dully and more from his patients’ families. (There’s also lots more on lobotomies on their website.)
In 1967, Freeman performed his last lobotomy before being banned from operating. Why the ban? After he performed the third lobotomy on a longtime patient of his, she developed a brain hemorrhage and passed away.
The U.S. performed more lobotomies than any other country, according to the Wired article. Sources vary on the exact number but it’s between 40,000 and 50,000 (the majority taking place between the late 1940s and early 1950s).
Curiously, as early as the 1950s, some nations, including Germany and Japan, had outlawed lobotomies. The Soviet Union prohibited the procedure in 1950, stating that it was “contrary to the principles of humanity.”
This article lists the “top 10 fascinating and notable lobotomies,” including an American actor, a renowned pianist, the sister of an American president and the sister of a prominent playwright.
What have you heard about lobotomies? Are you surprised by the history of the procedure?
Photo by frostnova, available under a Creative Commons attribution license.
Psychology: The Lobotomy and Understanding the Brain
In the 1930s, Dr. Walter Freeman became the leading spokesman for a new brain surgery that promised hope for the thousands of people with mental illness who were living in state hospitals and asylums across the country.
Dr. Freeman claimed that the new surgery, lobotomy, was relatively simple to perform and could provide relief for a range of mental disorders. At a time when there were few treatment options available for the mentally ill, the public and press were quick to embrace the lobotomy as a miracle cure.
But there was a darker side to the procedure, which often used a surgical tool modeled on an icepick to sever the frontal lobe from the rest of the brain. Many physicians – as well as psychoanalysts – were strongly opposed to the experimental operations, which were performed without the support of any randomized testing or consistent follow-ups that would have revealed that many patients were left worse off rather than better, with some disabled for life.
The development of antipsychotic drugs like Thorazine dampened the popularity of lobotomies. But Dr. Freeman persisted, performing them until he was forced to stop in 1967 after a patient died during her third surgery.
Today, as brain research continues on many fronts, Dr. Freeman’s example illustrates the wisdom of the adage “First, do no harm.”
- In the early decades of the 20th century, what were common treatments for mental illness? How did the limitations of psychiatric care at the time give rise to lobotomies?
- What is a prefrontal lobotomy? How did it affect cognitive function? How did it affect the lives and personalities of patients?
- How was it decided which patients would undergo lobotomy? What symptoms were seen as sufficient grounds for performing the surgery?
- How did the development of antipsychotic drugs like Thorazine in the 1950s affect the status of prefrontal lobotomy in the medical community?
- What is deep brain stimulation?
- Given the symptoms that lobotomy was meant to treat, do you know anyone who might have undergone lobotomy if Dr. Freeman had been their doctor in the 1940s and 50s?
- From the standpoint of medical ethics, what lessons can we learn from the story of Dr. Walter Freeman and the history of prefrontal lobotomies?
- Dr. Freeman was very successful in manipulating the news media to make himself a medical celebrity and to popularize the lobotomy as a form of treatment. Is this a problem we are confronting in modern times? Are there doctors today whose fame has given them undue influence?
- Psychiatric neurosurgery (or psychosurgery) is re-emerging as a treatment option. As this form of treatment expands, how can the history of the lobotomy help us to make better decisions about the future of psychosurgery?
Describe how biological, psychological, and sociocultural factors influence behavior.
A Brief and Awful History of the Lobotomy
On November 19, 1948, the two most enthusiastic and prolific lobotomists in the Western world faced off against each other in the operating theater at the Institute of Living in Hartford, Connecticut. They performed before an audience of more than two dozen neurosurgeons, neurologists, and psychiatrists. Each had developed a different technique for mutilating the brains of the patients they operated on, and each had his turn on the stage.
William Beecher Scoville, professor of neurosurgery at Yale, went first. His patient was conscious. The administration of a local anesthetic allowed the surgeon to slice through the scalp and peel down the skin from the patient’s forehead, exposing her skull. Quick work with a drill opened two holes, one over each eye. Now Scoville could see her frontal lobes. He levered each side up with a flat blade so that he could perform what he called “orbital undercutting.” Although what followed was not quite cutting: instead, Scoville inserted a suction catheter—a small electrical vacuum cleaner—and sucked out a portion of the patient’s frontal lobes.
The patient was then wheeled out and a replacement was secured to the operating table. Walter Freeman, a professor of neurology at George Washington University, was next. He had no surgical training and no Connecticut medical license, so he was operating illegally—not that such a minor matter seemed to bother anyone present. Freeman was working on developing an assembly-line approach so that lobotomies could be performed quickly and easily. His technique allowed him to perform 20 or more operations in a single day. He proceeded to use shocks from an electroconvulsive therapy machine to render his female patient unconscious and then inserted an ice pick beneath one of her eyelids until the point rested on the thin bony structure in the orbit. A few quick taps with a hammer broke through the bone and allowed him to sever portions of the frontal lobes using a sweeping motion with the ice pick. The instrument was withdrawn and inserted into the other orbit, and within minutes, the process was over. It was, Freeman boasted, so simple an operation that he could teach any damned fool, even a psychiatrist, to perform it in 20 minutes or so.
Tens of thousands of lobotomies were performed in the United States from 1936 onward, and both these men would continue operating for decades. Lobotomy’s inventor, the Portuguese neurologist Egas Moniz, received the Nobel Prize in Medicine for his pains in 1949. Major medical centers in the United States—Harvard, Yale, Columbia, the University of Pennsylvania—regularly performed variations on the basic operation well into the 1950s.
It has become fashionable in recent years among some medical historians to argue that the operation was not the medical horror story that popular culture portrays it as being. These scholars suggest that, when considered within the context of the times, lobotomy was perhaps a defensible response to massively overcrowded mental hospitals and the therapeutic impotence of the psychiatry of the time. That is not my view, and Luke Dittrich’s book Patient H.M.: A Story of Memory, Madness and Family Secrets (2017) adds to evidence from elsewhere that Scoville (like Freeman) was a moral monster—ambitious, driven, self-centered, and willing to inflict grave and irreversible damage on his patients in his search for fame. He certainly had no time for the Hippocratic injunction: “First, do no harm.”
Ironically, Scoville was quick to denounce the crudity of Freeman’s procedure, a position the neurosurgeons in the audience were happy to endorse. Freeman, in turn, was scornful of the notion that his rival’s suctioning away of portions of the brain was “precise,” as Scoville and his supporters contended. On these points, at least, both men were for once correct.
Dittrich devotes considerable space to this remarkable surgical contest in Hartford, which he views with a suitably skeptical eye. But he opens his narrative much earlier, with the story of an accident that befell Henry Molaison, a young boy of six or seven, one summer evening. En route home for dinner, Henry stepped into the street and was struck from behind by a bicycle. The impact threw him through the air, and he landed on his head, sustaining a concussion that temporarily rendered him unconscious. Henry eventually recovered from his injuries, but only partially.
He began to suffer from epileptic seizures that increased in frequency and severity as the years went by and made his life a misery. Drugs didn’t help. Finally, in 1953, Henry’s parents brought him to see Dr. Scoville. Unlike most of the other patients subjected to psychosurgery, Henry was sane. Scoville informed the family that the epilepsy might be tamed by the brain surgery he was pioneering, and within a matter of months, Henry was wheeled into the operating theater. What occurred next made him one of the most famous patients of the 20th century.
Following his usual procedure, Scoville cut into Henry’s skull, exposing portions of his brain to view. His target on this occasion, however, lay further back, behind the frontal lobes that he usually targeted for his lobotomies. The electroencephalograph had failed to reveal any epileptic focus. Now, using a flat brain spatula, Scoville pushed aside the frontal lobes to expose deeper structures in the temporal lobe—the amygdala, the uncus, the entorhinal cortex—searching for any obvious defects or atrophied tissue. Nothing. At this point, a cautious surgeon would have cut the surgery short, since there was no obvious lesion to justify further intervention. Scoville was not such a person. In his own words, “I prefer action to thought, which is why I am a surgeon. I like to see results.” Results he obtained, although not the ones his patient was hoping for. Using a suction catheter, Scoville proceeded to destroy all three regions of the temporal lobe bilaterally.
Patient H. M., as Henry became known in the trade, suffered absolutely devastating losses. Though his intellect remained intact, he had in those few minutes lost all but the very shortest of short-term memory. Henceforth, as Scoville noted, he was left essentially helpless and hopeless, with “very grave” memory loss, “so severe as to prevent the patient from remembering the location of the rooms in which he lives, the names of his close associates, or even the way to the toilet or the urinal.” And, of course, much else besides. Those words constituted, as Dittrich puts it, “the birth announcement of Patient H.M. It was also the obituary of Henry Molaison.”
The first of many surprises Dittrich springs on the reader is the news that William Beecher Scoville was his grandfather, someone he came to know well over many years. Those family ties gave Dittrich access to all manner of materials that no outsider could have obtained, and he is clearly both a talented and persistent journalist and an excellent storyteller. I found it all the more disappointing, then, that he and his publisher elected to provide neither footnotes nor any systematic documentation of his sources. What we are left with is a gripping story, but one whose provenance is at times unfortunately quite murky.
A second surprise concerns Dittrich’s grandmother, Emily Barrett Learned, whom he affectionately refers to as Bam Bam. Emily had been a high-spirited young woman before she married the handsome Bill Scoville in June 1934. By 1944, they had three children, and Bill was serving in the Army medical corps, leaving her alone much of the time in the small town of Walla Walla in eastern Washington State. Then she found out that her husband was having an affair. She began to hallucinate and tried to hang herself. She was placed in a secure ward of a local hospital until, a few weeks later, the entire family left for Hartford, Connecticut. There, she was rushed to the Institute of Living, one of America’s oldest mental hospitals and the place where her husband would perform most of his lobotomies (though not the operation on H. M.). Scoville had been on staff there since 1941.
The Institute of Living (previously the Hartford Retreat for the Insane) was a ritzy private establishment catering to the wealthy in surroundings that superficially resembled a country club. Its grounds had been laid out by Frederick Law Olmsted, the architect of Central Park in New York. Its inmates were referred to as “guests,” though these were guests deprived of any voice in their fate. The superintendent, Dr. Burlingame, aggressively employed all the latest weapons of 1940s psychiatry: insulin comas, metrazol seizures, hydrotherapy, pyrotherapy (insertion into a coffin-like device that allowed the patient to be heated until the body’s homeostatic mechanism failed and an artificial fever of 105 or 106 degrees Fahrenheit was achieved), and electroshock therapy (ECT) in its unmodified form (which produced violent seizures).
Emily received many of these so-called treatments, to little effect. Her unsympathetic psychiatrist commented that her husband’s infidelity “has upset her to an unusual degree,” and her case notes reveal someone frightened of the ECT and still in the grip of psychotic ideation. Her subsequent release seems a bit mysterious. She was henceforth withdrawn and rather lacking initiative, a pattern that makes more sense when we learn, in the book’s closing pages, that Dr. Scoville had personally lobotomized her. One of the many poignant scenes in Dittrich’s book is his recital of a Thanksgiving dinner at his grandparents’ house, during which Emily sat silently amid her family while her ex-husband and his new wife (a younger, more attractive model) presided over the proceedings.
His book’s title notwithstanding, Dittrich spends many pages exposing these kinds of family secrets. But he eventually returns to the case of the memory-less H. M. Here was a scientific prize. Unlike the legions of lobotomized patients Scoville left in his wake (he continued to perform his orbital undercutting procedure into the 1970s, claiming it was “safe and almost harmless”), H. M. was not mentally ill, and his intellectual abilities remained intact after the surgery. That made him an ideal subject for research on human memory, and the findings of that research were what made Henry so famous (not that he was capable of appreciating that).
Early on, an eminent neuroscientist from McGill University in Montreal, Dr. Brenda Milner, was the primary psychologist studying H. M., and she made a number of pathbreaking discoveries about memory through her work with him, including the finding that humans possess two distinct and independent memory systems. One of these had survived in H.M., the one that allowed him to acquire and improve on learned skills. The other, memory for events, was utterly extinguished.
Dr. Milner soon moved her research in a different direction and lost touch with H.M. In her place, one of her graduate students, Suzanne Corkin, took over. Subsequently, Corkin obtained a faculty position at MIT. H.M. became, in effect, her possession. For as long as he lived, Corkin controlled access to him, forcing other researchers who wanted to examine him to dance to her tune, and building a good deal of her career as one of the first women scientists at MIT on her privileged access to this fascinating subject. From 1953 until his death in 2008, H.M. was regularly whisked away to MIT from the Hartford family he had been placed with, and later from the board-and-care home where he resided, to be poked and prodded, examined and re-examined, each time encountering the site and Dr. Corkin as though for the first time.
Corkin, it turns out, also had a connection to Dittrich. She had lived directly across the street from the Scoville family and had been Dittrich’s mother’s best friend when the two girls were young—not that it seems to have helped Dittrich much when he sought to interview her for his book. She first evaded meeting him and then sought to put crippling limitations on his ability to use whatever he learned from talking to her. How much this affected Dittrich’s attitude toward her is difficult to say, but it seems inarguable that he developed an extremely negative view of her behavior.
As Dittrich points out, Corkin and MIT obtained millions of dollars in research grants due to her control over H. M. Not a penny of it reached poor Henry’s pockets. He subsisted on small disability payments from the government, and not once did he receive any compensation for his time and sometimes suffering. He once returned to Hartford, for example, with a series of small second-degree burns on his chest—the result of an experiment to determine his pain threshold. After all, he couldn’t remember the experiment, so why not subject him to it? Belatedly, it seems to have occurred to Corkin that she should get some legal authorization for her experiments, since H. M. was manifestly incapable of giving informed consent. Making no effort to locate a living blood relative (a first cousin lived only a few miles away), she instead secured the court-ordered appointment of a conservator who never visited H. M. but who routinely signed off on any proposal she put in front of him.
Henry Molaison does not seem to have had much luck at the hands of those who experimented on him. Scoville cavalierly made use of him to see what would happen when large sections of his brain were suctioned out, and Corkin seems to have taken virtual ownership of him and then exploited her good fortune for all it was worth. According to Dittrich, H. M.’s travails did not end with his death. Quickly preserved, his remains were transported to the West Coast, where Jacopo Annese, a neuroanatomist and radiologist at the University of California at San Diego, carefully harvested his brain and began to reveal its secrets. The Italian-born physician Annese comes across as a superb scientist, eager to share what he was finding with the world, but also a naïf in shark-infested waters.
As his work proceeded, he discovered an old lesion in H.M.’s frontal lobes, presumably brought about when Scoville maneuvered them out of the way to reach the deeper structures in the brain he sought to remove. All the memory research on H. M., including Corkin’s work, rested on the assumption that only temporal lobe structures had been damaged, so this was a potentially important finding. Coincidentally or not, after being alerted to this discovery, Corkin called on MIT’s lawyers to reclaim H. M.’s brain and all the photographs and slides Annese had meticulously prepared. Annese had neglected to secure any sort of paper trail documenting his right to these materials, and UCSD’s lawyers hung him out to dry.
Waving a document she had secured from the court-appointed guardian she had personally nominated, Corkin, aided by MIT’s attorneys, successfully reclaimed the lot. Annese promptly resigned his faculty appointment, his research career in tatters and his slides and photographs now lost. According to Dittrich, Corkin then made one final twist of the knife. In an interview with him before she died of liver cancer in May 2016, she announced that she planned to shred all her raw materials and the laboratory records relating to her work with Henry Molaison and that she had already shredded many of them. For the last 50 years of H.M.’s life, Corkin had essentially owned him. And she planned to carry with her to her grave whatever secrets lay hidden in her files.
Excerpted from Psychiatry and Its Discontents by Andrew Scull, published by the University of California Press. © 2019 by the Regents of the University of California.
Parallels In Time A History of Developmental Disabilities
Reprinted from Elliot Valenstein's Great and Desperate Cures.
This photo, from UPI/Bettmann News photos, was taken at Western State Hospital, Fort Steilacoom, Washington, on July 11, 1949, during the height of the lobotomy craze.
The patient was first "sedated" by receiving an electroconvulsive shock.
"Immediately following the ECS," according to Elliot Valenstein in his book, Great and Desperate Cures, "Dr. Walter Freeman began the operation" to sever the brain's two frontal lobes from one another. Dr. Freeman was the foremost missionary for lobotomy in the United States.
The operation shown is the transorbital lobotomy, administered without incision by introducing cutting instruments through the eye sockets.
G.K. Yacorcynski, Medical Psychology (The Ronald Press Co., 1951)
The result, according to one California institution superintendent, was the transformation of a troublesome young woman into one who, "asked if she expected to go home soon, answered pertly: 'That's up to the staff and I never debate with the staff.'"
The transorbital lobotomy replaced the surgical lobotomy, shown here in a drawing from a 1951 textbook, Medical Psychology.
Shown here, a display from the Glore Museum of Psychiatry, where a nurse points out the instruments used in lobotomy. The inventor of lobotomy, Dr. Egas Moniz of Portugal, received the 1949 Nobel Prize in Medicine for his work.
Here is a closeup photo of the instruments from a Glore Museum display. Author Elliot Valenstein calls lobotomy "not a medical aberration, spawned in ignorance. These operations were very much a part of the mainstream of medicine of their time, and the factors that fostered their development and made them flourish are still active today." He attributes their use to "the strictly biological view of human differences."
From All Things Considered, Nov. 16, 2005
"On Jan. 17, 1946, a psychiatrist named Walter Freeman launched a radical new era in the treatment of mental illness in this country. On that day, he performed the first-ever transorbital or "ice-pick" lobotomy in his Washington, D.C., office. Freeman believed that mental illness was related to overactive emotions, and that by cutting the brain he cut away these feelings. Freeman, equal parts physician and showman, became a barnstorming crusader for the procedure. Before his death in 1972, he performed transorbital lobotomies on some 2,500 patients in 23 states."
Hear or read the whole story here:
'My Lobotomy': Howard Dully's Journey
THE FRONTAL LOBE SYNDROME AND THE DEVELOPMENT OF LOBOTOMY (C. 1935)
The historic operation that we can arguably describe as the first psychosurgical procedure was performed by psychiatrist Dr. Gottlieb Burckhardt (1836-1907) in Switzerland in 1888. Burckhardt removed an area of cerebral cortex that he believed was responsible for his patient's abnormal behavior. He followed this by performing selective resections on six patients, mostly in the temporal and parietal lobes, targeting areas that he considered responsible for the patients’ aggressive behavior and psychiatric disorders. His report was not well received by his colleagues and met with disapproval from the medical community. Subsequently, Burckhardt ceased work in this area.
The Second International Neurologic Congress held in London in 1935 was a landmark plenary session for psychosurgery. American physiologist Dr. John F. Fulton (1899-1960) presented a momentous experiment in which two chimpanzees had bilateral resections of the prefrontal cortex. These operations were pioneering experiments in the field because the animals became “void of emotional expression” and were no longer capable of arousal of the “frustrational behavior” usually seen in these animals. The behavioral change was noted but the full implications were not. These findings would become very important decades later when social scientists in the 1970s noted that aggressive behavior and rage reaction were associated with low tolerance for frustration in individuals with sociopathic tendencies.
This Congress was historic also because it was attended by personages who would leave their marks on the history of psychosurgery, the neurosurgical treatment of mental disorders. Among the participants were two Portuguese neuroscientists: Dr. Antonio Egas Moniz (1874-1955), Professor of Neurology at the University of Lisbon, and his collaborator, the neurosurgeon Dr. Almeida Lima (1903-1985). They worked together in performing frontal leucotomies for psychiatric illnesses in the 1930s. In fact, Dr. Moniz's clinical work lent support to physiologist John Fulton's finding that frontal lobe ablation subdued the behavior of aggressive chimpanzees. Also attending this Congress was American neurologist Dr. Walter Freeman (1895-1972), who would soon leave a large imprint on the history of psychosurgery in the form of the frontal lobotomy.[13,15]
But there was more. Dr. R. M. Brickner described a patient with bilateral frontal lobectomies for excision of a tumor. Postoperatively, his patient showed a lack of restraint and social disinhibition, providing further evidence of the frontal lobe syndrome.
Frontal lobotomy, the sectioning of the prefrontal cortex, and leucotomy, the severing of the underlying white matter, for the treatment of mental disorders, reached a peak of popularity after World War II. But, as we have seen, development of this surgery began in the 1930s with the work of the celebrated Egas Moniz, who also made his mark in neuroradiology as the father of cerebral angiography [ Figure 12 ]. Moniz and Lima performed their first frontal leucotomy in 1935. The following year Dr. Moniz presented a series of 20 patients, and by 1949 he had received the Nobel Prize for his pioneering work on frontal leucotomy in which, specifically, the white matter connections between the prefrontal cortex and the thalamus were sectioned to alleviate severe mental illness, including depression and schizophrenia in long-term hospitalized patients.
Professor of Neurology, Dr. Antonio Egas Moniz (1874-1955)
It was at this time that a constellation of symptoms finally became associated with frontal lobe damage and removal – for example, distractibility, euphoria, apathy, lack of initiative, lack of restraint, and social disinhibition. Some of these symptoms were reminiscent of the personality changes noted by Dr. John M. Harlow in Phineas Gage nearly three quarters of a century earlier.
'My Lobotomy': Howard Dully's Journey
On Jan. 17, 1946, a psychiatrist named Walter Freeman launched a radical new era in the treatment of mental illness in this country. On that day, he performed the first-ever transorbital or "ice-pick" lobotomy in his Washington, D.C., office. Freeman believed that mental illness was related to overactive emotions, and that by cutting the brain he cut away these feelings.
Freeman, equal parts physician and showman, became a barnstorming crusader for the procedure. Before his death in 1972, he performed transorbital lobotomies on some 2,500 patients in 23 states.
One of Freeman's youngest patients is today a 56-year-old bus driver living in California. Over the past two years, Howard Dully has embarked on a quest to discover the story behind the procedure he received as a 12-year-old boy.
In researching his story, Dully visited Freeman's son; relatives of patients who underwent the procedure; the archive where Freeman's papers are stored; and Dully's own father, to whom he had never spoken about the lobotomy.
Dr. Walter Freeman operating on a patient, c. 1950. University Archives, The Gelman Library, The George Washington University
"If you saw me you'd never know I'd had a lobotomy," Dully says. "The only thing you'd notice is that I'm very tall and weigh about 350 pounds. But I've always felt different — wondered if something's missing from my soul. I have no memory of the operation, and never had the courage to ask my family about it. So two years ago I set out on a journey to learn everything I could about my lobotomy."
Neurologist Egas Moniz performed the first brain surgery to treat mental illness in Portugal in 1935. The procedure, which Moniz called a "leucotomy," involved drilling holes in the patient's skull to get to the brain. Freeman brought the operation to America and gave it a new name: the lobotomy. Freeman and his surgeon partner James Watts performed the first American lobotomy in 1936. Freeman and his lobotomy became famous. But soon he grew impatient.
"My father decided that there must be a better way," says Freeman's son, Frank. Walter Freeman set out to create a new procedure, one that didn't require drilling holes in the head: the transorbital lobotomy. Freeman was convinced that his 10-minute lobotomy was destined to revolutionize medicine. He spent the rest of his life trying to prove his point.
Howard Dully holding one of Dr. Walter Freeman's original ice picks, January 2004. Courtesy Sound Portraits, George Washington University Gelman Library
As those who watched the procedure described it, a patient would be rendered unconscious by electroshock. Freeman would then take a sharp ice pick-like instrument, insert it above the patient's eyeball through the orbit of the eye, into the frontal lobes of the brain, moving the instrument back and forth. Then he would do the same thing on the other side of the face.
Freeman performed the procedure for the first time in his Washington, D.C., office on Jan. 17, 1946. His patient was a housewife named Ellen Ionesco. Her daughter, Angelene Forester, was there that day.
Howard, standing in front, with his parents, June Dully and Rodney Dully (holding Howard's brother Brian), in Oakland, Calif., c. 1950. Courtesy Howard Dully
"She was absolutely violently suicidal beforehand," Forester says of her mother. "After the transorbital lobotomy there was nothing. It stopped immediately. It was just peace. I don't know how to explain it to you, it was like turning a coin over. That quick. So whatever he did, he did something right."
Ellen Ionesco, now 88 years old, lives in a nursing home in Virginia. "He was just a great man. That's all I can say," she says. But Ionesco says she remembers little about Freeman, including what he looked like.
By 1949, the transorbital lobotomy had caught on. Freeman lobotomized patients in mental institutions across the country.
"There were some very unpleasant results, very tragic results and some excellent results and a lot in between," says Dr. Elliot Valenstein, who wrote Great and Desperate Cures, a book about the history of lobotomies.
Valenstein says the procedure "spread like wildfire" because alternative treatments were scarce. "There was no other way of treating people who were seriously mentally ill," he says. "The drugs weren't introduced until the mid-1950s in the United States, and psychiatric institutions were overcrowded. [Patients and their families] were willing to try almost anything."
By 1950, Freeman's lobotomy revolution was in full swing. Newspapers described it as easier than curing a toothache. Freeman was a showman and liked to shock his audience of doctors and nurses by performing two-handed lobotomies: hammering ice picks into both eyes at once. In 1952, he performed 228 lobotomies in a two-week period in West Virginia alone. (He lobotomized 25 women in a single day.) He decided that his 10-minute lobotomy could be used on others besides the incurably mentally ill.
Anna Ruth Channels suffered from severe headaches and was referred to Freeman in 1950. He prescribed a transorbital lobotomy. The procedure cured Channels of her headaches, but it left her with the mind of a child, according to her daughter, Carol Noelle. "Just as Freeman promised, she didn't worry," Noelle says. "She had no concept of social graces. If someone was having a gathering at their home, she had no problem with going in to their house and taking a seat, too."
Howard Dully's mother died of cancer when he was 5. His father remarried and, Dully says, "My stepmother hated me. I never understood why, but it was clear she'd do anything to get rid of me."
A search of Dully's records among Freeman's files archived at George Washington University turned up clues about why Freeman lobotomized him.
Howard Dully's stepmother, Lou, in California, 1955. Courtesy Howard Dully
Perhaps the lobotomy (the surgical destruction of parts of the brain) is the most controversial of the treatments developed over the past two centuries for mania and depression. Even more bizarre are the results that lobotomies have produced. Here, we'll look at the accident that sparked the revolution in psychosurgery and its development over the following hundred years as a form of therapy.
The explosives accident that sparked it all.
Phineas Gage was a mild-tempered man from Vermont, USA, earning an honest living as a foreman during the construction of America's railroads in 1848. All was fine until an explosive charge accidentally detonated, sending a 3-foot rod of metal into one side of Gage's head and out of the other. The penetration, from a cheek to the top of his head, inevitably passed through his brain. Gage was lucky to survive this accidental lobotomy with only the loss of an eye, and most of his cognitive abilities, such as memory, remained intact. But a strange personality change occurred: Phineas became almost childish in his behavior, unwilling to listen to others and often using obscenities.
What had happened?
The part of Phineas' brain destroyed in the accident was the orbitofrontal cortex, a region associated with emotion in animals and humans. Could removing this part of the brain actually benefit people with overly strong emotions, and help those suffering from anxiety and depression? Later lobotomists would find out.
The experiments begin.
Phineas Gage's change in personality left him virtually unemployable, and reports claimed he ended up on display as a circus curiosity. His legacy lived on, however, and what his case revealed about the function of the orbitofrontal cortex led to important and even more fascinating research over the next century:
1935: Becky the Chimp
Jacobsen, Wolfe & Jackson put the idea to the test on a chimpanzee called Becky. By removing the frontal lobes of her brain, they made her immune to the distress she would normally show after making mistakes. When made to take a test in a chamber, instead of becoming upset as she previously would have, she behaved cheerfully.
London's Second International Congress of Neurology
The congress, held in the summer of 1935, brought together the cream of the world of neurology. Among the attendees were Antonio Egas Moniz, a Portuguese neuropsychiatrist; his admirer-to-be, Walter Freeman; and Fulton, whose lobectomies on animals (the removal of the frontal lobes of the brain) stunned the visitors. After a demonstration and a great deal of debate about Fulton's findings, Moniz suggested applying the procedure to humans. The crowd was shocked, but by September of the same year Moniz had attempted the operation on a woman patient from an asylum. The woman's mental faculties were damaged by the operation, but the paranoia from which she had previously suffered was lessened. Moniz later published his findings, presenting his operations in a positive light, and caught the interest of his fellow 1935 conference attendee, Walter Freeman, who would carry the work forward in later years.
More on Moniz: Did you know?
Away from his life revolutionizing the world of lobotomy, Antonio Moniz enjoyed considerable influence in world politics. He helped conclude World War I as a signatory of the Treaty of Versailles and served as an ambassador, and he was eventually rewarded for his neurological achievements in 1949 with the Nobel Prize in Physiology or Medicine.
Moniz was later shot by one of his patients, but he survived the attack and died of natural causes in 1955.
Freeman was a US neurologist keen to experiment with lobotomies, and after reading Moniz's findings he collaborated with a surgeon colleague who was qualified to operate on patients, James Watts. The woman they operated on, an American with severe depression, underwent the surgery (her last concern before it was that her hair would be cut off) and awoke carefree. Although side effects affecting her communication became apparent a week later, they soon disappeared and the woman appeared cured.
It was after the Second World War that a real need to treat new victims of war-related disorders such as shell-shock and severe depression became apparent. Lobotomies grew in popularity, and by 1955, tens of thousands of people had undergone the operation.
Despite the risks and adverse effects witnessed in earlier patients, modified forms of psychosurgery remain a valid, if rare, form of treatment today. The first lobotomies did not remove brain tissue as Phineas's accident had; instead, Moniz injected alcohol to destroy the fibers linking the frontal lobe to the rest of the brain. Today, related procedures are used to treat conditions such as severe epilepsy, and operations such as the amygdalotomy still require drilling a hole in the skull.
Brief history of lobotomy
Lobotomy is a technique that has been hugely controversial in the field of psychiatry since its inception. Its roots go back to the primitive trepanations of ancestral cultures. These interventions consisted of opening holes in the skull to "expel" the evil spirits in the head, which, according to their beliefs, these cultures held responsible for mental disorders.
However, the lobotomy itself is much more modern and was developed during the 20th century. The Portuguese neurologist António Egas Moniz established the bases of this technique with his first leucotomies, performed with the aim of treating and curing psychotic disorders. The intervention consisted of cutting the connections between the frontal lobe and the rest of the brain, on the argument that this would reduce the problematic symptoms. He won the Nobel Prize in Medicine in 1949 for this technique.
Later, Walter Freeman, a doctor with some training in surgery and neurosurgery, modified the technique after encountering Moniz's leucotomy, and thus created the lobotomy. Rephrasing the Portuguese scientist's postulates, Freeman argued that behind mental disorders lay a faulty interaction between the thalamus and the prefrontal cortex, and that destroying the connections between the two structures was necessary.
To perform his technique, Freeman eventually needed barely ten minutes, and as a surgical instrument he used an ice pick. Here, the term "ice pick" is no metaphor: according to one of his children, Freeman used tools taken from his own kitchen on the brains of his patients.
The intervention was quite simple. He took the aforementioned kitchen instrument, inserted it under the upper eyelid to reach the frontal lobe and, tapping with a hammer, severed the aforementioned connections. A peculiarity of this intervention, unthinkable today, is that it was a blind operation: the lobotomist did not know exactly where the instrument was going.
In short, a lobotomy consisted of putting an ice pick into a patient's brain for ten minutes and trusting to luck. During the procedure, the patient was reportedly awake and asked questions; when the patient's answers stopped making sense, it was taken as a sign that it was a good time to stop.
It goes without saying that, at the time, little was known about the great importance of the frontal lobe, the region in charge of executive functions: concentration, planning, working memory, reasoning, decision-making and more.
Why Were Lobotomies Performed?
The lobotomy is considered one of the most barbaric treatments in the history of modern medicine. Even in the 1940s, lobotomies were the subject of growing controversy. But despite the ethical issues surrounding the procedure, it gained widespread popularity for several reasons:
- Absence of effective treatments: Antipsychotic drugs were not available until the mid-1950s. People were desperate to do something, anything, to help those with severe mental illness.
- Overcrowded institutions: In 1937, there were more than 450,000 patients in 477 mental institutions. Lobotomies were used to calm unruly patients and make them easier to manage.
- Media: At the time, press coverage was able to shape demand for the surgery. The lobotomy was portrayed as "magic and heroic."
The strange and curious history of lobotomy
It's 75 years since the first lobotomy was performed in the US, a procedure later described by one psychiatrist as "putting in a brain needle and stirring the works". So how did it come to be regarded as a miracle cure?
Deep in the archives of London's Wellcome Collection, that great treasure trove of medical curiosities, is a small white cardboard box.
Inside is a pair of medical devices. They are simple. Each consists of an 8cm steel spike, attached to a wooden handle.
"These two gruesome things are lobotomy instruments. Nothing sophisticated," says senior archivist Lesley Hall. "It's not rocket science is it?"
These spikes once represented the leading edge of psychiatric science. They were the operative tools in lobotomy, also known as leucotomy, an operation which was seen as a miracle cure for a range of mental illnesses.
For millennia, mankind had practised trepanning, drilling holes into skulls to release evil spirits.
The idea behind lobotomy was different. The Portuguese neurologist, Egas Moniz, believed that patients with obsessive behaviour were suffering from fixed circuits in the brain.
In 1935, in a Lisbon hospital, he believed he had found a solution. "I decided to sever the connecting fibres of the neurons in activity," he wrote in a monograph titled How I Came to Perform Frontal Leucotomy.
His original technique was adapted by others, but the basic idea remained the same.
Surgeons would drill a pair of holes into the skull, either at the side or top, and push a sharp instrument - a leucotome - into the brain.
The surgeon would sweep this from side to side, to cut the connections between the frontal lobes and the rest of the brain.
Moniz reported dramatic improvements for his first 20 patients. The operation was seized on with enthusiasm by the American neurologist Walter Freeman, who became an evangelist for the procedure, performing the first lobotomy in the US in 1936 and then spreading the technique across the globe.
From the early 1940s, it began to be seen as a miracle cure in the UK, where surgeons performed proportionately more lobotomies than even in the US.
Despite opposition from some doctors - especially psychoanalysts - it became a mainstream part of psychiatry with more than 1,000 operations a year in the UK at its peak. It was used to treat a range of illnesses, from schizophrenia to depression and compulsive disorders.
The reason for its popularity was simple - the alternative was worse.
"When I visited mental hospitals… you saw straitjackets, padded cells, and it was patently apparent that some of the patients were, I'm sorry to say, subjected to physical violence," recalls retired neurosurgeon Jason Brice.
The chance of a cure through lobotomy seemed preferable to the life sentence of incarceration in an institution.
"We hoped it would offer a way out," says Mr Brice. "We hoped it would help."
There were centres for lobotomy across the UK, in Dundee, North Wales and Bristol. But by far the most prolific lobotomist in the country, and indeed the world, was the neurosurgeon Sir Wylie McKissock, based at the Atkinson Morley hospital in Wimbledon.
"He was one of the great men of medicine of the 20th Century," says Terry Gould, who worked as McKissock's anaesthetist.
He believes his former boss performed around 3,000 lobotomies, as part of his famously speedy approach to surgery. "It was a five-minute procedure. Very quickly done," says Dr Gould.
As well as operating at Atkinson Morley, McKissock would travel across the south of England at weekends, performing extra leucotomies at smaller hospitals.
"He was quite prepared to travel down to whatever the hospital was on a Saturday morning and do three or four leucotomies and then drive away again," says Mr Brice.
He says the operation could have dramatic benefits for some patients, including one who was terrified of fire. "Funnily enough she finished up after I had done the operation very much better, but she went and bought herself a fish and chip shop with grossly hot oil in it."
However, he had increasing doubts about lobotomy, especially for patients with schizophrenia.
Psychiatrist Dr John Pippard followed up several hundred of McKissock's patients. He found that around a third benefited, a third were unaffected and a third were worse off afterwards.
Although he himself had authorised lobotomies, he later turned against the practice.
"I got increasingly conservative about it because I don't think any of us were ever really happy about putting in a brain needle and stirring the works," he says. "Not a nice thought."
In 1949, Egas Moniz won the Nobel Prize for inventing lobotomy, and the operation peaked in popularity around the same time.
But from the mid-1950s, it rapidly fell out of favour, partly because of poor results and partly because of the introduction of the first wave of effective psychiatric drugs.
Decades later, when working as a psychiatric nurse in a long-stay institution, Henry Marsh used to see former lobotomy patients.
"They had been lobotomised 30-40 years ago, they were chronic schizophrenics, and they were often the ones who were some of the most apathetic, slow, knocked-off patients," he says.
Mr Marsh, who is now one of Britain's most eminent neurosurgeons, says the operation was simply bad science. "It reflected very bad medicine, bad science, because it was clear the patients who were subjected to this procedure were never followed up properly.
"If you saw the patient after the operation they'd seem alright, they'd walk and talk and say thank you doctor," he observes. "The fact they were totally ruined as social human beings probably didn't count."