Tuesday, December 3, 2024

The man who lived with no brain

https://www.youtube.com/watch?v=nCWuPMUPyRk

During World War II, a Russian soldier named Leva Zazetsky suffered a wound from a bullet that penetrated his skull and severely damaged his brain. For the rest of his life, he experienced the world in a bizarrely fragmented way. Although he appeared to be normal, he could remember neither the names of objects nor the meanings of words. Although he could talk, he couldn't find the words to communicate his ideas and feelings. Before the war he had been a fourth-year student at a technical university; after his injury he couldn't read or perform simple addition.

The unfortunate young man's sense of space and his physical orientation to the world were severely disrupted. He could see only out of the left sides of both eyes. He simply had no visual awareness of things on the right side of his field of vision. He would see only parts of objects or sometimes not see them at all. For example, if he had a bowl of soup in front of him, he might be able to see merely a bit of the spoon, or he might even lose the spoon entirely if it was on his right side. In addition to leaving him able to see only parts of objects, Zazetsky's brain injury also caused him to have hallucinations. Ugly faces and rooms with odd shapes would appear when he closed his eyes, so he would open them immediately. This made it very difficult for him to sleep.

Neuropsychologist A. R. Luria worked with Zazetsky as he struggled valiantly to piece back together his disintegrated life. For twenty-five years, Zazetsky kept a journal, using it to try to recapture the thoughts, experiences, feelings, and memories that had been ripped away by the bullet that tore into his brain. In The Man with a Shattered World: The History of a Brain Wound, a book first published in Russian in 1972, Dr. Luria explained: "His only material consisted of fragmentary recollections that came to mind at random. On these he had to impose some order and sense of continuity though every word he recalled, every thought he expressed required the most excruciating effort. When his writing went well he managed to write a page a day, two at the most, and felt completely drained with this. Writing was his only link with life, his only hope of not succumbing to illness but recovering at least a part of what had been lost. This journal recounts a desperate fight for life with a skill psychologists cannot help but envy."

Dr. Luria tried to comprehend as a neuropsychologist what Zazetsky described as an existential trauma. At their first meeting, three months after the bullet wound, Zazetsky couldn't recall what had happened at the battlefront where he was injured. Finally, he remembered that it was the month of May. Then he was able to retrieve the names of the other months, but he couldn't remember, for example, which month came before September, and he couldn't remember the seasons.

Although he could see, Zazetsky couldn't interpret the things he saw. In order to learn how to read again, first he had to relearn the meanings of letters. Because he saw the visual world in shattered fragments, he could read only a few letters at a time. He had to retain these as he moved across the page, picking up other letters to combine into a single word.

Writing was easier, especially after Zazetsky realized that he could write quickly and automatically, getting a whole word down without thinking about the letters that made it up. Apparently the part of his brain that allowed him to write hadn't been destroyed. Eventually he could write as well as he had before his injury, even though he remained unable to read what he had put on the page.

Zazetsky's confusion about spatial relationships caused him to get lost even a short distance from his house and made him unable to comprehend directions. He didn't recognize places with which he'd been very familiar before his injury. In his journal, Zazetsky described how his visual problems and lack of spatial orientation would cause him to lose track of whole parts of his body: "Often I fall into a kind of stupor and don't understand what's going on around me. I have no sense of objects. One minute I stand there thinking about something; the next I lapse into forgetfulness. But suddenly I'll come to, look at the right of me, and be horrified to discover half my body is gone. I'm terrified. I try to figure out what's become of my right arm and leg, the entire right side of my body. I move the fingers of my left hand, feel them, but can't see the fingers of my right hand, and somehow I'm not even aware they're there."

The details of Zazetsky's story are unusual. Certainly his determination and persistence are rare. But medical history is replete with cases in which traumatic brain injuries have robbed their victims of some mental faculties but not others, and there is a simple reason for this: different parts of the brain coordinate different functions.

https://archive.nytimes.com/www.nytimes.com/books/first/w/winslade-brain.html

Monday, December 2, 2024

Culture, bereavement, and psychiatry

Courtesy of a colleague

Kleinman, Arthur. Culture, bereavement, and psychiatry. The Lancet, Volume 379, Issue 9816, 608 - 609.

The American Psychiatric Association (APA), as recently reported in The New York Times and an article in World Psychiatry, is undergoing a controversy over listing grief as a mental illness in the forthcoming fifth edition of its influential Diagnostic and Statistical Manual of Mental Disorders (DSM-5). Earlier editions of DSM have reasoned that after the death of a close relation, a psychiatrist should wait 1 year (DSM-III) or 2 months (DSM-IV) before labelling the sadness, disturbed sleep, loss of appetite and energy, agitation, difficulty concentrating, and other psychological and physiological sequelae of such profound loss, depression; and treating it with pharmacological agents and psychotherapy.

In fact, there is no conclusive scientific evidence to show what a normal length of bereavement is. Across the world, societies differ greatly in what they regard as normal grief: some do regard a year as a marker, and yet others sanction longer periods—even a lifetime. And intracultural differences among individuals can be important. The gender of the bereaved matters as does his or her religion, as well as the status and circumstances of the person who died. DSM-IV already stands out for the expectation that the symptoms of grief should abate by 2 months: no society, no religion holds that shockingly short expectation. This makes critics feel that APA's experts, lacking the constraint of biological measures of depression and encouraged by the pharmaceutical industry, are seeking to loosen standards and thereby create more patients. Its ubiquity makes grief a potential profit centre for the business of psychiatry. Proponents, by contrast, recognise that some bereaved individuals over time do experience their symptoms as disabling, for which they deserve a diagnosis of depressive disorder and would benefit from treatment; some wonder if it wouldn't be more generally desirable to remove the pain of grief for everyone who is bereaved.

In March, 2011, my wife died and I experienced the physiology of grief. I felt greatly sad and yearned for her. I didn't sleep well. When I returned to a now empty house, I became agitated. I also felt fatigued and had difficulty concentrating on my academic work. My weight declined owing to a newly indifferent appetite. This dark experience lightened over the months, so that the feelings became much less acute by around 6 months. But after 46 years of marriage, it will come as no surprise to most people that as I approach the first anniversary of my loss, I still feel sadness at times and harbour the sense that a part of me is gone forever. I'm not even sure my caregiving for my wife, who died of Alzheimer's disease, ended with her death. I am still caring for our memories. Is there anything wrong (or pathological) with that?

Experience, including the experience of loss, is never neat: that is, out of context. It is always framed by meanings and values, which themselves are affected by all sorts of things like one's age, health, financial and work conditions, and what is happening in one's life and in the wider world. The collective and personal process we usually refer to as culture is one sort of framing: a kind of master framing. Historically, widows in many patriarchal societies were culturally framed as grieving for a lifetime or at least, a long time. The globalisation of our era has brought in its wake an expectation of serial marriages with much shorter periods of bereavement. Still, DSM-IV's framing of normal grief as lasting only 2 months must stand out in global perspective as a shocking expectation. We can say the same about the APA's proposal for treating any grief as depressive disorder, which must be seen as a radical cultural framing peculiar to American academic psychiatric research.

Inasmuch as there is no compelling evidence that antidepressant drugs improve mood in normal people, the APA, if it wanted to authorise treatment for normal grief, had to make it over into a disease—ie, depression. Then psychiatrists could, as a routine practice, prescribe antidepressants for bereavement. This phenomenon of reframing a previously normal experience as a disease is called medicalisation and is quite far advanced in psychiatric practice, which already labels shyness as anxiety disorder and puts some people who are unskilled in negotiating social relationships in the Asperger's syndrome end of the autism spectrum. These framings represent a cultural shift, now well along its way, to remake experiences formerly regarded as morally bad, religiously sinful, disturbing, or just different as medical issues of illness and disablement. The upshot is that unprecedented numbers of people with what was earlier regarded as the ordinary distress of living are taking psychotropic medication.

The increasing secularisation of our age with the dominance of biotechnology is one factor behind this shift to a new cultural frame, just as much as the political economy of the pharmaceutical industry, the transformation of American medicine into big business, and the infiltration of bureaucratic standards and regulations ever more deeply into ordinary life. All of which brings me back to the experience of grieving. Why not medicalise it? Why not deprive death of its sting for the survivors and make the experience of loss as painless as possible? Given the parlous state of global capitalism at the moment, maybe this would also help to fund health-care systems. Professor David J Kupfer, who chairs the DSM-5 Task Force making the revisions, is reported to have told The New York Times that making grief into a disease would allow psychiatrists to treat people who were suffering so that they would get the treatment they need for being depressed. And that's the rub really. Is grief something that we can or should no longer tolerate? Is this existential source of suffering like any dental or back pain unwanted and unneeded?

My own experience, together with my reading of the literature, suggests caution is needed before we answer yes and turn ordinary grieving into a suitable target of therapeutic intervention. My grief, like that of millions of others, signalled the loss of something truly vital in my life. This pain was part of the remembering and maybe also the remaking. It punctuated the end of a time and a form of living, and marked the transition to a new time and a different way of living. The suffering pushed me out of my ordinary day-to-day existence and called into question the meanings and values that animated our life. The cultural reframing—at once subjective and shared with others in my life-world—held moral and religious significance. What would it mean to reframe that significance as medical? For me and my family, and I intuit for many, many others such a cultural reframing would seem inappropriate or even a technological interference with what matters most in our lives.

I am, however, enough of an anthropologist to recognise that this resistance on my part may simply be generational, an increasingly historical oddity out of keeping with the brave new world of technology that is remaking life and reframing the story of who we are. So that the now young adult generation, which claims to be refashioned by the internet and the rest of this transformative age of engineering and applied technology, may no longer want or need the suffering of grief to affirm its humanity, redeem its deepest values, and frame its collective and personal experience of loss. I had always imagined that if something like that happened, there would be a loss of the human. Yet, I am reminded that each age fears the loss of the human following upon changes in ways of living that affect established framings of ordinary experience. Perhaps the loss of grief will also eventuate in such unrealised fears rather than a new reality of what is ahead for human beings. So much will depend on how this professional reframing is experienced: either as just one more technological innovation or as a true cultural and subjective transformation.

The late great French historian, Philippe Ariès, obviously thought the making over of death and its charged consequences is a story of a disturbing cultural and subjective transformation, because he concluded his magisterial study, The Hour of Our Death, with this sardonic observation:

A small elite…propose(s) not so much to “evacuate” death as to humanise it. They acknowledge the necessity of death, but they want it to be accepted and no longer shameful. Although they may consult the ancient wisdom, there is no question of turning back or of rediscovering the evil that has been abolished. They propose to reconcile death with happiness. Death must simply become the discreet but dignified exit of a peaceful person from a helpful society that is not torn, not even overly upset by the idea of a biological transition without significance, without pain or suffering, and ultimately without fear.

Whatever the outcome of this particular conflict at the APA, a serious change is afoot and not just in the meaning of profound loss. Technology's attendant rational technical practices of classifying, diagnosing, and intervening do not just change the world; they carry the potential to make up new people. So does the cultural sensibility of new generations to use psychoactive substances to manage the moral and emotional discomfort of financial and social problems change the habits of the heart and create new subjectivities. With what unintended consequences?

The reframing of experience shows that medicine and its doctors may well be among the most effective and usually unrecognised agents of cultural change. Next time a bereaved person comes into the clinic and is asked about his or her background to assess how culture affects his or her health condition, the physician, before deciding whether to treat the grief as depressive disorder, should first look in the mirror to see where culture is also at work.