Playing with Fear

Some people have gotten into baking. Puzzles. Gardening. Feeding stray cats (hi, Mom and Dad!). My pandemic obsession for the past month has been trail biking. I spent June upset and anxious about injustice, Black Lives Matter protests, and police brutality and prosecutorial overreach in the city I live in. And then . . . I started doing something mostly pointless that scared me, and it began to seem that I was doing it simply because it scared me.

I got teargassed on Friday, May 29, and the next day at noon I took a Xanax to calm myself enough to go back downtown to the courthouse. Saturday afternoon protests at the courthouse have historically been safe, and although I don’t think any (left-leaning) protest at any time on any day is currently safe in my city, back on May 30, I believed it was safe, even though I was scared. After the afternoon protest, I went home to eat something, then returned downtown to meet some friends. We marched across the Martin Luther King, Jr., bridge, crossing the river to downtown, and I saw more police officers than I could count, lined up like dystopia: matching uniforms, matching shields, shoulder to shoulder, looking dangerous. I was terrified. I checked my watch and it was 8 o’clock, pumpkin time for this Cinderella, because the Xanax had worn off. I went home and left my friends to get teargassed without me. I went back the next night, but with attentive care to the timing of my drug. I asked myself then and still do: how scared is too scared?

I discovered these mountain-biking trails and began experimenting with fear, trying to do something with fear. The first time I went, I planned to stay on the green trails, but I got lost or confused. I climbed and climbed, and then the path cut down into a gully and up the other side. I’m not doing that! I thought. No way. I turned away and pedaled around to try to leave but ended up going in a circle. Fine, I thought. I’ll do it, so then I can leave. It wasn’t true—I went down and up the gully but then ended up going in a circle again. I went through the gully two or three more times, not because I thought it would lead me out, but because I wanted it to become less scary, and it did, a little.

I wished I could go there every day, because I wanted to go back again and again to try to make things less scary: the bridge that was too narrow, with no railings and a three-foot drop off of each side; the roller coasters straight down into gulches or fast downward flows around curves; the small but sharp drops that made me fear flipping the bike. Sometimes I spent the whole time I was there afraid. I wondered if taking a Xanax before going would help, and I remembered the teargas and wondered how scared is too scared.


I kept thinking if I kept going to the same scary spots over and over again, that would help. Or if I got stronger, that would help. But I kept coming to the same places and stopping as suddenly and as surely as a horse that knows it will not jump over that hurdle—just as it was when I was in seventh-grade gym class and I could do a front flip over the vault day after day after day, easy, and then one day I balked with fear and couldn’t, and I never did again.

Last night I went back again, planning to ride a trail full of roller-coaster dips and climbs that had thoroughly terrified me the day before, when I did it for the first time, but instead I had the good fortune to fall in with an experienced rider who was teaching a beginner. He’s been riding out there since the late 1980s, on those trails since they were built in the early 1990s, and he is a good teacher: he demonstrated some techniques and talked me and the other newbie through some of the things I found scary. We watched the deer nibbling leaves, and he told us how the Christmas and Easter trails got their names and how the volunteers come out to clean off the trails on Saturdays. Lore and stories and community and tips and tricks—and it all helped so much. After those two left, I kept going for another half hour, practicing what I had learned about how to make things safer.

It’s only recently that I have begun to think much about the wisdom of the body. Doing the same scary things over and over and over again didn’t make them less scary, because my body sensed that my center of gravity wasn’t where it needed to be. I needed to do something different: without thought or knowledge, my body knew the dangers, but only with thought and knowledge could I mitigate them somewhat, because I’m not intuitively athletic enough to figure this stuff out on my own.

There is a connection, obviously, to the fears I felt in June about matters of life and death and justice and cruelty, but I don’t want to force a pat conclusion. Fear has been a dominant emotion in my life, and in the lives of many, recently. I have never been one to seek out fear—I have always hated haunted houses and horror movies—so it was weird to find myself doing something frightening on purpose.

As we lurch every day closer to dystopia, as I watch protesters in Portland being beaten and gassed and arrested by shadow figures without identification, as I think back to the awe I felt in 1989 at the bravery of the thin man standing straight and determined in front of the tanks at Tiananmen Square, questions of fear and courage feel urgent. I know from Aristotle that courage is not fearlessness—I can only hope that this strange time of playing with fear will help me somehow, someday when it actually counts.


To White People, on Discomfort

“People are going to do what they’re going to do until it gets too uncomfortable. Then, they change.”

—Wise mom of someone I used to know

I don’t like seeing violence or suffering. I don’t go to violent movies, and if I accidentally end up at one, I look away during the violent parts, or sometimes I leave. When I see real violence, I get upset—an anxious physical response and often crying. But the most painful kind of violence for me to witness is violence in which I’m complicit, because then it’s all the physiology of my individual experience of violence, plus deep moral guilt and shame.

One thing that became slightly less painful in my life after I went vegan was that I released myself from ever again having to watch video footage of animals suffering in factory farms. This was an actual deal I made with myself: in exchange for giving up these things that have given me pleasure in the past, I don’t ever have to watch a video of a downed cow being prodded to stand up and walk to slaughter . . . because I’m doing something real to not participate in that system, I don’t have to endure that distress again. Before, when I was eating animals and animal products, I felt that it was hypocritical to turn away, so I made myself keep my eyes open when such videos crossed my path.

Last Friday, in a conversation with some friends in which I confessed that I had not yet watched the video footage of George Floyd’s murder (I have since watched it), I saw the connection—I don’t want to watch video footage of unarmed, unresisting black people being killed by the police not just because it distresses me to watch any violence or suffering. In addition to those feelings I have from watching any violence, when I watch these killings, which happen over and over and over again, I feel personally guilty, personally complicit, personally implicated. And that’s why I have to watch, because I have done nothing other than have anti-racist opinions and an anti-racist voting record. That’s it. That’s all I’ve done, and that’s near enough to nothing that it’s no free pass.

When people change, for it to stick, I think it usually has to be at least a little bit selfish. When I went vegan, much of the decision was related to just being tired of the internal conflict. I was tired of almost three decades of feeling morally conflicted about the food I ate, tired of going back and forth and changing my mind and whiplashing between following my beliefs and ignoring my beliefs. Just tired. The discomfort of continuing to be that person made it possible for me to change.

And now I’m tired of being someone whose silence made me complicit in racist policing and institutionalized racism in general.  Before last weekend, I think I’d been to one or maybe two Black Lives Matter protests in the previous five years. I read things and I watch things and I feel things and I hold a sign and then I go back to my life and then it all happens again too, too soon. I’m tired of being a person so afraid of police brutality that I will only stand with my black siblings in the bright afternoon at the courthouse instead of staying into the night when people need help and need all the solidarity they can get. I’m tired of the repetitiveness and monotony of it, of seeing the same thing happen again and again and providing not much more than the Buddhist atheist’s equivalent of thoughts and prayers. I’m tired of being this person. Being this person has become uncomfortable enough that it’s worse than the discomfort of my fears of being teargassed, of seeing conflict and violence, of getting seriously hurt. I want to change.

It’s not just me, though—part of making any change happen is making people uncomfortable. People hate being uncomfortable. Right now I’m out of town, not able to participate in the ongoing protests in my city this coming weekend, but I’m watching from a distance as one group of protesters wants to have a unity march with the mayor and the police, and another group thinks that a unity march is premature, when the people arrested last weekend for peacefully-but-angrily protesting are still charged with crimes, when the young man whose eye was destroyed by a teargas canister hasn’t received an apology, and when the mayor says in a written statement that the police’s use of force last weekend was necessary. It’s not time to be comfortable yet. It’s not time to let others be comfortable.

Arresting four officers in Minneapolis—which took nine days, protests in every state in the country and in nations around the world, and widespread police and military overreaction that proved the point of the protests—is the easy part. The hard part is reforming the entire nation’s approach to policing: Whom do they serve? To whom are they accountable? How can policies be created and revised to shift the focus of policing to service to all communities and all Americans, with transparency and accountability? This work starts now, and it won’t happen without continual and ongoing pressure from all people of goodwill—because cultural change is uncomfortable, and people don’t want to be uncomfortable. Fellow white people, please embrace your discomfort, sit with it, talk to it—let it change you. And I will try to keep doing the same.

* Note that I am absolutely not comparing factory farms to US slavery, a rhetorical move that vegans sometimes make that rightly infuriates black people. I’m focusing on myself and discussing these two things because these are two times I have felt convicted regarding my own complicity in systems of suffering.

Tornado

It happened the first time the night of the tornado warning. Huddled on the floor of the pantry, Clare looked at the linoleum floor and breathed carefully—this was not social-distancing distance. A few minutes earlier, Clare had started to panic thinking about the tornado and how the cans would fall off the shelves onto them.  So Jeff went to the garage and found a big piece of cardboard, and together, Clare and Jeff held it as a little roof over the family, making the small space even smaller, even more stuffy. She was the only one worried about the cans; she could sense the impatience of all three of them, and she could see it in Sadie and Trent’s faces. But there were too many cans, so many cans, because now she was shopping for two weeks at a time instead of a few days or a week at most, and stocking up when she could. Pooky-cat was mellow, curled up in Sadie’s lap under the cardboard roof, but the other one, Bartholomew, was losing his mind, yowling and scratching at the door to get out. Trent smelled like baseball practice and teenage boy sweat, even though all he had done was to practice alone in the back yard before the storm blew in.

Clare had grown up in Kansas and spent her anxious childhood terrified of tornadoes, among other things. Clare sat, too close to Trent’s armpits (everywhere was too close), and remembered the fears that had never come to pass: her house had never burned down, a dog had never bitten through her cheek, her house had never been ripped apart by a tornado, her parents had never gotten divorced. An eight-year-old who understood the difference between a tornado watch and a warning, she would go to the basement before anyone else in her family was worried, before the sirens, up and down the stairs several times so she could save everything she needed to before the wailing of the sirens began. Then firefighters came to her fifth-grade classroom and scared her about fires. At least with tornadoes you had warning—and there was a season for tornadoes. So then every night it became necessary to lay out a curated set of Important Things by her bed: her diary, her charm bracelet, a stuffed animal that used to sleep in bed with her but had to stop for fear of being unfindable in the event of a midnight fire. The collection changed over the years of the ritual, but it had to happen every night. Thinking all the time about tornadoes and fires and how a balcony could collapse onto your head if you happened to be sitting underneath it at the wrong moment didn’t feel weird at the time. She didn’t know the word anxiety.

The anxiety mostly went away for a while and then reemerged, transformed, when she became a mother. It was now the Worst-Case Scenario Generator™ (WCSG), a mental simulation application that allowed her to see a straight line of events connecting whatever her children were doing in the present to disaster in the future. She collected worries, here and there, from books and newspapers and friends of friends. She had read A Prayer for Owen Meany and had John Irving to thank for the awareness that it is within the realm of possibility for a baseball to go astray and kill someone instantly. Of course her son had to love baseball—whom did she have to thank for that? Trent and Jeff believed that the WCSG wasn’t real, but poor Sadie had inherited her own, one that managed to generate new and unique fears—powerful magic. Clare knew that other people had them, too.

And then a strange thing happened—after a lifetime of telling herself that the WCSG was overactive, was false, was really just her anxiety; after a lifetime of houses not burning down, not blowing away, never a balcony crushing her in a mountain of rubble at the theater, never raped and murdered in a dark alley or anywhere else or her face ripped to shreds by an angry dog—suddenly everything went to shit, and here she was in April 2020 breathing in the carbon dioxide and aerosolized spit particles of these people that she was already sharing too much space with.

“This sucks,” Trent announced, scooting out from under the cardboard to sit slouched against the pantry shelves. Clare darted a look at him and took a breath to speak, but he cut her off. “No! I’m not going to hunch under there for the next hour because you think the cans are going to fall off the shelves. That’s stupid and I’m not going to do it. If the storm gets worse or if the house starts to blow away, I’ll get back under there and we’ll die together with the cardboard over our heads!”

Not going to fight not going to fight not going to fight. Clare put her forehead against her knees and wrapped her arms around her ankles. Jeff and Sadie said nothing, and then Sadie tried to lighten the mood by talking to Pooky-cat. “Tell your brother to calm down, Pooky-cat. Tell him to stop scratching!”

“I don’t need you to tell me to calm down, Sadie,” Trent snarled.

Sadie started to cry. “I’m talking about the cats. I’m not talking about you!” she wailed.

A can of tomato paste fell off a shelf, glanced off the side of Trent’s head, and bounced in the direction of Bartholomew. The cat startled dramatically—six inches straight up into the air and then down again—and then bolted for the back corner of the pantry, digging his rear claws into Clare’s thigh as he scrambled past. Trent gave his mother a dark look and then moved back under the cardboard. After a few minutes, the cat scratch began to sting. This would be a terrible time to get a blood infection, Clare fretted silently. The house didn’t blow away that night, but the WCSG had never seemed so powerful.

***

By the next morning, the skin around the scratch was puffy and red, even though Clare had washed the scratch thoroughly before bed. She thought about the bacteria that had had a chance to get a foothold there in her skin while she had waited and waited in the pantry.

Her trip to the grocery store was an unexpected success—she found yeast! And everything else on the list, too. But as the trip progressed, as the cart became fuller, Clare grew anxious about how much the total would be. She knew they were spending less money overall—nothing to do, nowhere to go—but spending more than she was used to on any given purchase always made her nervous. By the end of each grocery trip, after an hour or more wearing the mask that made breathing difficult and fogged up her glasses, the astonishing total at the checkout stand compounded with the anxiety of leaving the house at all, a double-whammy of fear of viral shedding and queasiness at the thought of all that money leaving their bank account. It didn’t matter that they could afford it. Could they? Either of them could lose their job, get furloughed. She thought about it all the way home, whether it would help to refinance the house, whether other accounting firms whose clients were less affected by the pandemic might be hiring.

The trip was exhausting, and by the time she got home, Clare was running a low fever, and the cat scratch looked worse: oozing stuff already. If she went to the hospital to be treated for a septic wound, she would catch the virus. She washed it again, dabbed some antibiotic ointment on it, and covered it with two bandaids. She was an hour later getting to her desk than she had planned to be, and just as she was booting up the computer to start working, she got a text from her coworker Moira that read simply, “Fuck.”

She replied, “???”

“Check your email.” Clare’s heart started pounding.

A five-minute video from the company’s owner was embedded in an email. She was sorry she couldn’t tell them face to face, she said, so a video seemed better than a written message. Business was down, way down, as they all knew from having less work to do for the past month since they had started working remotely. She kept her voice upbeat, her mouth in a smile as she ran through the plans for the new normal. When the owner was done, she pressed a button to stop recording, but there was a delay: her face sagged into nothingness, and she closed her eyes. Then the video ended.

***

Some time later, on a break, Jeff walked past the room Clare was working in. The sight of her stopped him. She looked up from her desk and met his eyes.

“What’s wrong?” he asked. Carefully. Neutrally.

She stared at him, eyes wide open. “I can’t think. I can’t think.”

“What’s wrong?” he repeated.

“Furloughed. Half time. Half pay. Starting Monday.” She started to cry, a low, keening moan Jeff hadn’t heard before.

“It’s OK. You don’t have to think now. We can take a walk or have lunch and you can think about it later.”

“NO!” she yelled, glaring at him, before catching herself and regaining her composure. “Don’t you see it?” she implored. “I’m making it happen. It’s me. I have to stop thinking.” She slid out of her work chair and onto the floor.

Jeff watched his wife slowly rocking back and forth, forehead pressed to her knees, whispering, “I can’t think I can’t think I can’t think.” Slowly, somewhere deep inside him, his own WCSG, which had lain dormant since he was a teenager, hummed to life again.

Rachel Hile

On Staying Put

In the copy of the Dao De Jing I was reading in the early 1990s, next to the lines “He who goes to a distant land / in search of the Truth / Will only distance himself from the Truth” was written, in my then-boyfriend’s cramped handwriting, “Or Disneyland.” I remember this every time I reread those lines and mentally edit the passage:

He who goes to a distant land—or Disneyland—

in search of the Truth

Will only distance himself from the Truth.

A few days ago, a friend of mine wrote a good piece about the class injustice in coronavirus testing, given that the current testing criteria in the United States privilege those who have traveled and ignore those who have encountered the virus through community spread.

So I’ve been thinking about travel. Last month, an age ago, I saw performances of both parts of Tony Kushner’s play Angels in America, and as the pandemic has become an increasingly urgent concern in the United States, I have remembered the voice of the Angel, transmitting to Prior the angels’ plea that we should “Forsake the Open Road / . . . . / HOBBLE YOURSELVES! / . . . . / STASIS!” In a time of plague, the sensible nurse-practitioner’s advice to Prior, who has AIDS, is the same as the Angel’s mad request: “Keep breathing. Stop moving. STAY PUT.”

But while we are staying home, we should reckon with the role that travel has acquired in the age of social media. Twenty years ago, if you went on a cool vacation and wanted people to see your amazing photos, it took some doing. You would have to meet face to face, and your person would have to sit still and listen to you narrate the story of each photo. Maybe, if you were lucky, you had a screen and a projector and a lot of family and friends who loved you very much and had extremely long attention spans. There were limits on the extent to which travel could be CONSPICUOUS consumption, and it was easier to show off with consumer goods.

Now, not only is travel as easy to show off as consumer goods, it’s more socially acceptable. It might seem crass to post a photo of your new $80,000 car, but travel, even for anti-capitalist leftists, is seen as morally neutral or positive. In the early years of social media, I spent hours upon hours creating photo albums of two big trips, one to Italy in 2011 and one to India and the UK in 2013. I eventually started to feel queasy about social media personae, mine and others’. I felt very alone, and very lonely, in the early years of social media, but the persona I projected was “alone but not lonely.” The individual words and photos that I posted were not false, but they were curated, and so the overall persona was false.

But trying to step away from documenting my travel online has been hard sometimes, because travel has risen so much as a marker of status and prestige—if you have to travel for your job, your job must be important. You must be important. If you take amazing vacations, you must be an interesting person. There aren’t many voices exhorting people to travel less . . . or at least there weren’t before two months ago. Last fall, Henry Wismayer wrote a wistful piece on his moral misgivings about being a travel writer in a time of climate emergency. His opening anecdote recounts a visit to Lake Abbe, a soda lake that is vanishing:

I’d gone to the Horn of Africa in search of timeless landscapes, but there was no respite from humanity’s penchant for remaking geography. It was hard to avoid a sense of complicity. After all, I was there to write a travel story, exhorting other English-speakers to visit the region as if all was well with the world. On that excursion, I would clock up 8,000 miles in flights, exceeding the annual carbon footprint per person recommended by the Swiss nonprofit MyClimate.org by a factor of four in the space of one long-haul round trip. As I stood where Lake Abbe was surrendering to the Grand Barra desert, the newly exposed ground appeared as a premonition of an uninhabitable Earth. A hot breeze blew eddies of dust around my ankles, scolding me. You shouldn’t have come.

If we remake our relationship to travel in the face of a pandemic, will we then return as quickly as possible to how things used to be as soon as this threat passes? Yascha Mounk summarizes Peter Singer’s famous thought experiment on empathy and proximity with reference to COVID-19 as one possible explanation for why so many people are doing such a lousy job of social distancing: until someone near me is sick, I can’t empathize, and I won’t do anything.

But Singer, best known for his philosophical work on animal rights, always wants us to broaden the circle of our compassion and empathy. What is now causing massive cuts to our carbon footprint—as has already happened in China and will increase as people follow Kushner’s nurse’s directive to “STAY PUT”—is concern for human beings, not concern for the planet. When this is all over, whenever that may be—after we will have learned how many people really can do their jobs from home, after the meat-eaters who bought up all the canned beans will have learned how good a vegetarian meal can be, after people will have had some experience with broadening their circle of compassion to people they don’t know—can we on the left try to keep some focus on unnecessary travel as a problem?

Alt-Ac/Post-Ac: An Update

I looked at my boss blankly. “You have to understand,” I said. “As an academic, I’m used to writing a lot of things that no one reads.” She laughed. “No, really,” I said, all earnest. “Not just like humanities publications, but also assessment reports . . . all sorts of reports that you have to write but that no one reads. The comments on papers I’ve graded. I think maybe only one student in each class ever reads those.”

She had just told me that I should add a footnote explaining how I got the percentage I reported, since I had to do some simple math to derive it from the data presented in the medical article I was summarizing. “When the reviewer for the Notified Body reads these, you want to make their job as easy as possible,” she told me.

People ask me what the heck I’m doing these days, as “medical writer” covers an awful lot of territory. To be more specific, I’m doing what’s known in the field as “regulatory writing.” To be even more specific, I’m writing the literature review sections for Clinical Evaluation Reports (CERs) to be submitted to the European Union as part of the continued post-market follow-up that is necessary to keep medical devices on the market in the EU. For each medical device, at intervals that depend upon the riskiness of the device (e.g., more frequent reports are required for implantable devices that stay in the patient permanently), someone needs to determine what clinical evidence has been published about the device since the last report was submitted, research the “state of the art” (“SOTA”; i.e., the standard of care) of that particular field, and contextualize the clinical data from the published literature with regard to the state of the art.

I laughed in an earlier blog post about how my job title includes the root for “science” not just once but twice (my official job title is “Scientific Communications Scientist”). And yet, now that I’ve been in this job for six months, I can see that regulatory writing comes as close as I can imagine to writing-as-science. It is methodical to the point of plodding, detailed to the point of OCD (and I kind of love it). Searches of medical databases must be replicable and must be documented. Everything that comes up in a search must either be included in the final report or excluded, with a reason for exclusion provided. If my report includes a percentage or figure that is mentioned in four of the publications I am including, then I must cite all four publications, not only three of them. This is writing to prove, not to persuade (although there is as always an element of rhetoric even in these reports) . . . writing as science.

What I like best about the job, in comparison to being an English professor:

  1. The sense of teamwork. A couple of months ago, the CER for a guidewire had a tight deadline. My colleague T modified and updated the SOTA that my colleague D had done for a different guidewire a few months earlier, and M, the newest medical writer, did the quality check (i.e., technical editing, which involves not only proofreading-type stuff but also checking the accuracy of claims against the original articles cited). I did the clinical evidence section, and D did the quality check. All four of the medical writers lined up in our little row of cubicles worked on the project, and it was . . . weirdly fun to pull together to meet a tight deadline. And this is just a more extreme example of the everyday teamwork on any of our projects, which involve a lot of back-and-forth between me as the writer, the medical reviewer (who has an MD, provides medical context, and serves as the overall decider), and the editor.
  2. Not having to defend the value of my work. When I was job-hunting last summer, I was doing searches for any job that had the words “writer” or “editor” in the name . . . but I ignored all jobs for advertising writing. Although I was tired of the general public thinking that the work I did as an English professor wasn’t valuable, I wasn’t ready to take a (possibly better remunerated) job that *I* didn’t think was valuable. Regulatory writing, on the other hand—writing reports aimed at proving the continuing performance and safety of medical devices with reference to the state of the field as it is today, participating in a regulatory system that ensures that medical device makers fulfill their obligations to create and market products the benefits of which outweigh the risks—hell yes, I can get on board with that.

I’m writing this blog post in response to a friend who asked about “the advantages and adjustments” of this post-academic job. The main advantage is that I feel less ambivalent about the necessity of my work—the fact that manufacturers have to write the kind of reports that I work on now keeps them honest, keeps them focused on their responsibilities to the public, and I feel good about being part of that. [If you’re curious about why the EU is upping its regulatory requirements for medical devices, check out the story of the Poly Implant Prothèse scandal, or else listen to the Swindled podcast episode titled “The Implants.”]

I haven’t experienced the main disadvantage yet, but I’m already sniffling about not having freedom and autonomy in the summer. Everyone who knows anything knows that academics work hard in the summer . . . but they don’t have to show up at a particular place at a particular time, and that’s heaven. And just in general, yes, I have less freedom and autonomy, because I work in a cube now instead of an office with a door that closes. The Panopticon is real—I do a lot less messing around on the Internet now than I did when I had an office with a door that closed . . . but on the other hand, the expectation in that job with the door that closed was that I’d be working evenings and weekends, and therefore I could do whatever I wanted with my time in my office, because I’d be making it up later no matter what. I don’t work evenings and weekends now, and that’s another big advantage.

As I’ve spent the past year or two contemplating an exit from academia, I’ve veered between two different thoughts about being an English professor. I’d think of the coal miners that Donald Trump said he wanted to prop up in order to get their votes. “Am I a coal miner?” I’d ask myself, wondering what it means that this nation’s educational system has decided that the majority of the people who teach our kids English language, writing, and literature don’t deserve tenure . . . or benefits . . . or job security.

But then, goddess help me, I think of Ayn Rand. I went through an Ayn Rand phase as a 15-year-old, as many dumb smart kids do. I think about the characters in Atlas Shrugged, who are like “Fuck y’all—you think you have a right to my work, my mind, my passion for this thing I do, but you want to just take it, and you don’t value it,” and, more so than at any other time in the 30+ years since I got over my Ayn Rand phase, I think, “Yeah.”

[Part 1 of this topic: “Leaving Academia: My Prehistory”]

Leaving Academia: My Prehistory

I started a version of this essay a couple of months ago, but admitting the ways that academia was, or is, like a drug to me felt too embarrassing, and I didn’t want to admit it. So I never finished writing then about the prehistory of my decision to leave academia (a decision that I haven’t made final yet—I’m still on a leave of absence from my tenured professor position). But the story doesn’t make sense to me without the prehistory, so when people ask how I’m doing or inquire about an update, I feel stuck, because I have to tell the prehistory first.

In spring 2018, on a trip to New Orleans, I took a substance that altered my perceptions, and I realized two things: (1) I was very cut off from my emotions, and (2) I cared too much about my professional reputation—I had been promoted to full professor the previous summer, and I was coming to suspect that I would never achieve enough to be satisfied professionally.

In the draft of this essay that I wrote before, I wrote about reconnecting with my emotions by beginning a meditation practice, and then I waved a verbal magic wand and wrote that the combination of feeling my feelings and meditation loosened the hold that others’ opinions had on me . . . but it seems that that was leaving out a lot.

It’s embarrassing to admit how much my emotional experience of being in academia can be summed up as wanting more of everything—more respect, more recognition, more opportunities, more achievements, more, more, more—a desire that was literally insatiable. I wished that the sorting hat had placed me somewhere with more intellectual excitement, somewhere I could work with PhD students to help them improve as writers, somewhere I would receive more invitations to do this or that thing. But the sorting hat placed me where it did, and by the time I had become a full professor, it was clear what was available, the good and the bad, at the place I was. I wished that people would invite me to do this or that thing, but I could also perceive from talking with friends at more prestigious Research I universities that the cost of being higher up in the field is in fact a constant barrage of invitations to do this or that thing . . . and each of those things takes a heap of time, and cumulatively, looked at from the outside, and probably from the inside, appears fucking exhausting.

I was dissatisfied not to be in The Room Where It Happens, but I could also see those people in the room dancing as hard as they could to stay relevant and to say yes to every invitation, because if you say no, people will stop asking.

That night 2 years ago, I realized not only how much of my identity was wrapped up in being a successful academic, but also how the definition of “successful academic” depended almost entirely on how other people thought about me.

I resolved to change, and meditation and trying to cultivate self-awareness did help me begin to change, but I began to question whether it was possible for me to become the person I’d like to continue becoming while staying in academia. It seemed crazy to think about leaving academia in order to have mental space to change as a human being, because academia provides so much freedom over one’s schedule. Couldn’t I be one of those professors who mentally checks out of their paid job to follow their bliss and continues pulling a paycheck and keeping those sweet, sweet, sweet summers off?

It felt like I couldn’t. I like to be all in; if I’m not all in, I’m not likely to stay in at all—this makes me, to be precise, a quitter, but I want to be all in. I believe in this.

What I learned from leaving academia is that I wasn’t the reason why I cared so much about what other people think about my professional merit—that was academia, not me. I traded out the Vocation of academia for a new career, not a job, and I do think about medical writing in career-focused ways—my goals for training, for expertise, for things I want to learn how to do and to do better—but it doesn’t have anything like the hold over my mind and my sense of self that academia had. All I had to do to care less about what people thought about me professionally was to leave academia—it didn’t require therapy or more meditation or any special effort: just leaving the world where the logic of the place dictates that you are your position in the academic pecking order, the world that holds you in line with the knowledge that your place in the pecking order can change, for the better or for the worse.

Two movie moments have come to emblematize this process for me. Some people can be in academia without being of it, and I salute them. For me, though, what has resonated most strongly is the closing scene of WarGames: “A strange game. The only winning move is not to play.”

Another image, for the moments when it seems crazy to give up prestige and so much freedom and autonomy to take a job where I have to show up in the office 5 days a week, 12 months a year (which sounds like simply, er, *LIFE* to the non-academics reading this), is the scene at the end of James Cameron’s Titanic, when Rose gives a tiny laugh and drops the diamond necklace into the ocean:


What? How can you do that? How can you throw a diamond necklace away? It’s absurd. It calls into question the value of a diamond necklace!

From the first day of graduate school, when they sat us down and told us that half of us would never get tenure-track jobs, through the years after that, when I learned that women, and mothers in particular, would drop out of the academic pipeline at disproportionate rates at every step on the way to full professor, the very scarcity of good academic jobs made them into diamond necklaces: rare, precious, and beautiful. I worked so hard, specifically as a feminist project, not to fall out of the pipeline, not to get stuck at the associate professor level—always wanting more, both for myself and because women are underrepresented more and more the higher up you get in academia. To drop that necklace, to watch it swirl and spiral through the water as it sinks down, down, down—what an odd thing to do. And yet how freeing.

Heroic Virtue and Ordinary Virtue

Although I have really been enjoying The Ezra Klein Show’s series on moral philosophy, it took me a few weeks to listen to the episode with Wayne Hsiung, the founder of Direct Action Everywhere who will go to trial soon, facing up to 60 years in prison for rescuing dying animals from a factory farm and documenting it. Because I am a vegan who does not break into factory farms to rescue animals and document animal abuse, I kind of didn’t want to listen: my life would be harder if I were an extremist, and so I don’t want to want to be an extremist, and I don’t want anyone to seduce me into becoming an extremist. But that wasn’t the impact the show had on me. One thing I continue to learn is that I don’t have to react to other people’s moral choices from a dichotomous mindset: either “I should be doing what that person is doing (but I’m not, so what’s wrong with me?)”, or “that person should be doing what I’m doing (but they’re not, so what’s wrong with them?).” I felt I could recognize something of this mindset in Klein’s repeated attempts to figure out what makes Hsiung tick: “Why are you the way you are (and why aren’t I?)?”

As I was listening to the show, and connecting parts of the Hsiung episode with the earlier episode with philosopher Peter Singer, the Roman Catholic concept of “heroic virtue” kept coming to mind. Singer’s philosophical arguments in favor of kidney donation to strangers made Klein a little uncomfortable. He admitted that, even though his 9-month-old son’s kidneys are fine, he would not be willing to donate a kidney to a stranger, because someday his own son might need that kidney.

Much of Singer’s philosophical project is rightly concerned with encouraging individuals to widen their circles of compassion, to try to get people to care as much for distant creatures as they do for their family. And yet something starts to niggle at me when I consider the exemplars of what I consider secular heroic virtue—someone such as Hsiung, who will likely do some jail time for his crime of saving a near-death piglet, or someone like Zell Kravinsky, who donated one of his kidneys to a stranger. They must be childless, mustn’t they? (In fact, no, but more on that later.)

What is heroic virtue, anyway? Joseph Wilhelm, in the 1917 Catholic Encyclopedia, explains that an ordinary virtue can “attain the grade of heroicity when practiced with unflagging perseverance, during a long period of time, or under circumstances so trying that by them men of but ordinary perfection would be deterred from acting.” Wilhelm provides three examples: “Martyrs dying in torments for the Faith, missionaries spending their lives in propagating it, the humble poor who with infinite patience drag out their wretched existence to do the will of God and to reap their reward hereafter, these are heroes of the Faith.” But it’s notable that the rolls of Catholic saints are filled with martyrs and missionaries, not with “the humble poor who . . . drag out their wretched existence.” Some heroic virtue is more equal than others.

In any religion with a tradition of monasticism, family life is seen as antithetical to true progress in religion . . . in practice, historically, this is sexist. Basically, whatever sacrifices I make for my children are examples of “ordinary virtue” . . . we only get to “heroic virtue” when the sacrifices are made for strangers (or “god,” or animals). There is something to this, because it’s good to get away from tribalism, but to the extent that this idea of virtue shuts out people with significant caregiving responsibilities, we need to watch out for it. People who volunteer a few hours a week and post it on Instagram can get lots and lots of praise nuggets, but someone who spends every waking hour that is not devoted to paid work taking care of a child or an elderly parent gets nothing (in terms of social validation, which we can’t pretend isn’t important). Something is amiss here—a devaluing of the sacrifices involved in “ordinary virtue.”

Zell Kravinsky, the kidney-donation evangelist and philanthropist, does in fact have a wife and four children, and they are disturbed by his desire to give everything away, up to and possibly including his one remaining kidney, if someone needed it who could do more good in the world than he can. Jason Fagone’s story gives a sense of the incompatibility of heroic virtue with family ties—Kravinsky’s wife, for one, appears thoroughly sick of her husband’s eccentricities. Siddhartha Gautama abandoned his wife and family; Jesus denied his mother and his brothers in Mark 3 and Matthew 12; that same Catholic Encyclopedia entry says that Abraham’s willingness to slaughter his own son out of obedience to God was heroic virtue.

Very showy examples of virtue attract attention, and I am 100% impressed by Hsiung’s commitment to justice and compassion for animals. But I’m not willing to feel bad that I myself am no exemplar of heroic virtue (though who knows? donating a kidney does sound good). Much of my very ordinary virtue derives from the care that I give to others; the world of philosophy needs more attention to the ethics of care and feminist philosophers’ attempts to tilt the balance from ideas of virtue and morality that assume a privileged male actor to the many ways others and Others demonstrate virtue. In other words, Ezra Klein, I don’t think you should feel bad that you want to hold onto that kidney in case your son needs it—it means you’re thinking like a mom.


No Small Roles, Only Small Actors

The first time I ever cleaned a toilet was at Camp Towanyak in the summer of nineteen eighty something. Each cabin rotated through various chores, and on this day my cabin was cleaning the Rec Hall. I was maybe 12? I had never cleaned a toilet before, but I knew that toilet brushes existed, which is how I knew that there wasn’t one. Maybe there was one somewhere, locked inside a supply closet? I will never know . . . but it’s hard to believe that there wasn’t a toilet brush for a toilet that, evidently, got some pretty heavy use, given the condition it was in that day, and given the fact that logically, this should have been a different cabin’s job only the day before.

(Only now, tonight, all these years later, does it occur to me that maybe the reason the toilet was so dirty was that previous child-workers, finding no toilet brush, had simply not cleaned it.)

It was utterly and completely filthy, the worst toilet I have ever cleaned (and I worked as a janitor during graduate school, and I cleaned at a *factory*, so . . . ), and I cleaned it with a sponge and no rubber gloves. For some reason (I was a total slob as a child, so it’s not that I liked cleaning), I approached the task matter-of-factly and even derived a certain amount of satisfaction from seeing just how much improvement I made in a short amount of time.

What I’m trying to say is that you can’t scare me with toilet cleaning. It’s a job.

I was cleaning someone else’s toilets again today—the company I work for does this thing each year where all of the employees spend the whole afternoon in groups volunteering at places around the city. I was at the homeless shelter where, among other things, I cleaned every toilet, every shower, and both urinals. And I was thinking about this oldie-but-goodie from Jacobin that a friend posted yesterday about the insidiousness of the “do what you love” (DWYL) mantra. Miya Tokumitsu writes:

By keeping us focused on ourselves and our individual happiness, DWYL distracts us from the working conditions of others while validating our own choices and relieving us from obligations to all who labor, whether or not they love it. It is the secret handshake of the privileged and a worldview that disguises its elitism as noble self-betterment. According to this way of thinking, labor is not something one does for compensation, but an act of self-love. . . . Its real achievement is making workers believe their labor serves the self and not the marketplace.

She talks about the class privileges of those who get to do “lovable” work and how “For those forced into unlovable work, it’s a different story. Under the DWYL credo, labor that is done out of motives or needs other than love (which is, in fact, most labor) is not only demeaned but erased.”

She doesn’t mention another mindset for approaching unlovable work, which is to “do it out of love” for something other than the work. Back when I used to be Catholic, I would clean the church sometimes with this other woman, and she used to irritate me by making all these noises about how this was degrading work that we were doing for love of Christ, or some such nonsense. I did love Christ, but I thought it was silly to need to do these mental gymnastics to try to make cleaning toilets into “lovable work”—cleaning toilets is simply work, and I didn’t see it as degraded and I also didn’t want to exalt it.

The US Bureau of Labor Statistics provides some information on jobs that might be “lovable” or “unlovable” in its Occupational Outlook Handbook. There we can learn, for example, that of the 808 job categories the OOH tracks, 104 require less than a high school diploma (janitor is one of them) and 333 require a high school diploma. Whether these jobs are lovable or not, the 1.8 million people working for minimum wage or less in 2018 have seen that wage’s purchasing power erode over the past ten years, the longest period Congress has ever gone without raising the minimum wage.

In a terrific essay from 2016 that unfortunately is not freely available online, Heather Howley criticizes President Obama’s September 2015 remarks presenting education as giving young Americans “a shot at success.” Howley writes:

It is difficult to pinpoint when the American dream transformed from something in which everyone was a potential participant into a lottery-like “shot at success.” Although always a rhetorical trope used to minimize social inequality in the United States, the American dream has been downgraded by both political parties, suggesting that a middle-class life is now reserved for the semielite—the ones who complete college and choose an in-demand career. Plans to eliminate tuition at community colleges or four-year public institutions will benefit the students who attend as well as their communities. Educated populaces are more economically productive, healthier (physically and relationally), and more engaged in the democratic process. However, these benefits, as great as they are, do not provide a solution to the destructive inequalities that affect low-wage workers.

Cleaning is important work. I know this from common sense, but I also know it because the university where I used to work cut its janitorial budget in 2013, which has led to problems with fruit flies and mice ever since. Cleaning toilets is not degrading work; if you think it is, you’re wrong. But the lopsidedness of how we think about work means that workers in Tokumitsu’s “unlovable jobs” are treated as degraded, and those in “lovable jobs” (some subset of the 272 out of 808 jobs in the BLS’s Occupational Outlook Handbook that require a bachelor’s degree or higher) are exalted.

This is why a lot of people who hate school take out loads of student debt to go to college: it used to be that you could have a good life without a college degree, but now you can’t. This is part of the ambivalence I developed about teaching at a non-elite university: students take out tens of thousands of dollars of debt, often leave without a degree (but with the debt, of course), and would have been perfectly happy working in an “unlovable” job if only we as a society would decide to treat all workers as performing valuable work and as deserving stability and a decent wage.


Allegory Stuff—A Series: Where the Project Came From

After more than a year away from it, I’m finally back to work on my book about allegory. During my Spring 2018 sabbatical, I withdrew from blogging and social media and didn’t post anything online the whole semester. I got a lot done, but there was something pretty monastic about that whole semester that isn’t possible or even desirable right now. Because I’m not working in academia this year, I don’t have the kind of built-in support for this kind of scholarly work that I’ve had in the past; because I’m working on the book evenings and weekends around a full-time and demanding non-academic job, I need to think about how to stay motivated. I’m working on the book proposal right now, and part of that document is writing brief summaries of each chapter. It’s hard to boil down a whole chapter into a paragraph, and that made me think I might enjoy writing informally here in this blog about some of the things I’m working through in each chapter of the book.

So this first post is about the pre-history of the project, and then I’ll occasionally post some chapter summaries and—who knows?—maybe other things about or from the book as well.

How I became interested in allegory: When I was working on my book on Spenserian satire (free download in hyperlink), I ended up naming Edmund Spenser’s way of writing satire “indirect,” in contrast to the more direct style of satire that became possible when press censorship became less extreme in England (thank you, John Milton!). This more direct style flourished in the eighteenth century and became what we think of when we think about satire: sharp, vituperative, naming-names kinds of satirical attacks. But in the late sixteenth or early seventeenth century, it was quite a bit more dangerous to speak your mind in print: you might have your hand chopped off for publishing a comment on Queen Elizabeth’s love life (John Stubbs), you might be put to death for having printed the Martin Marprelate tracts (John Penry), or you might be imprisoned for seven years and have your ears cut off for writing a book against plays at a time when Queen Henrietta Maria had recently appeared in a play (William Prynne). So the most prominent theorists of satire, who not surprisingly tend to be specialists in eighteenth-century British literature, will look at the kind of satires written during Spenser’s period and think these writers were a bunch of milquetoasts . . . but it was a different censorship situation.

While working on that book, I read a lot of satirical poetry written during the sixteenth and early seventeenth centuries, some more “direct” and some more “indirect”—poems by Edmund Spenser, of course, but also, in no particular order, John Marston, Thomas Middleton, Thomas Nashe, Joseph Hall, Tailboys Dymoke, Everard Guilpin, “Martin Marprelate,” William Shakespeare, John Skelton, John Donne, Michael Drayton, George Wither, “Peter Woodhouse,” Richard Niccols, John Hepwith, anonymous libels aimed at Robert Cecil, and more. When I first started the project, I was hunting foxes . . . really. Spenser used fox imagery to criticize Robert Cecil’s father, William Cecil, Lord Burghley, in his Mother Hubberds Tale, and my beginning to the project was to look for places where a fox reference might actually be an allusion to Spenser and might therefore be subtly satirical. In its essence, indirect satire is allegorical, because it criticizes without naming who or what is criticized.

What happened as I continued to work on the satire book was that I shifted from hunting foxes and other obvious symbols to something more subtle: the more I read, the more I came to sense intuitively when a poem had shifted from straight narrative or exposition to a passage of allegorical satire. I believe that by immersing myself in so much satire of that time period, I picked up the same sort of habits of attentive reading that led Spenser’s contemporaries—and, unfortunately, also people like Queen Elizabeth and Lord Burghley—to find critical references to famous and powerful people even when no one was mentioned by name. This is “the allegorical intuition,” and it is not exclusive to satirical works but is, I think, what makes allegory recognizable. Medieval and Renaissance allegories tend to announce themselves pretty clearly as allegories, but I started to think about how it is that a reader can recognize a text as allegory when the author doesn’t explicitly proclaim, as Spenser does about The Faerie Queene, that the poem is “cloudily enwrapped in Allegorical devices.” Even when an author does announce an allegory, the nature of the beast is that the work doesn’t refer directly to the “hidden meaning” (that’s what makes it hidden, after all), so a reader is clearly doing some cognitive heavy lifting in order to make sense of any allegory.

I started to think that the brain is doing something very particular when a person recognizes and interprets allegory, so it seemed likely that cognitive metaphor theorists would have already addressed this . . . but they hadn’t. Only one scholar, Peter Crisp, had tried to develop a thorough theory of allegory through the lens of cognitive metaphor theory, and he had not—in my opinion—fully succeeded, because he went down the wrong path when he rejected the idea that allegory has anything to do with what Gilles Fauconnier and Mark Turner call “blended spaces”: mental spaces where ideas, images, and concepts can blend together.

I looked at other theorists of allegory, and I saw over and over again what I think is another mistake: definitions of allegory that require it to be a narrative. The thing is, when you have a term as old as allegory—a term that names a remarkable feat of the human mind—you end up with a lot of problematic definitions, because allegory really is very hard to explain. Much of the erroneous belief that allegory must be narrative in form stems from the very old conflation of allegory with extended metaphor. I came to believe that the two are distinct.

And then, during my sabbatical, I was reading a lot of philosophy and theory—posthumanist theory, object-oriented ontology, ideas about vital materialism—all of which share a general desire to topple humans from the perch we have occupied, unquestioned, for millennia: above everything else on earth, second only to God and then, to the extent that we have moved into a post-religious present, second to no one. Immersing myself in these ideas changed my perspective on a lot of things. One day as I was rereading Angus Fletcher’s “Allegory without Ideas,” in which he mourns the loss of a belief in allegory’s “chief traditional claim,” which is “to be able to project permanent truths,” I suddenly really got that Fletcher, along with C. S. Lewis, Walter Benjamin, and others, attributes something divine or magical to allegory, because yeah, how does it work to get your reader to project out from what is there on the page to other meanings? It’s mysterious AF. So though I knew that this was part of the history of thinking about allegory, it was in that moment that I realized that this view is incompatible both with posthumanist philosophy and with insights from cognitive science.

So this was the genesis of the project. That seems like a good place to stop for now.


A Tiny Trauma

A year or two ago, I was at the dentist having a filling done. They put something in my mouth so that I couldn’t close it, probably something like this:

[Image: plastic mouth opener]

I absolutely hated this experience. Tears were rolling out of my eyes, one after the other, dripping down into my hair, and instead of asking them to remove the thing, I tried to talk myself through it. They’re not hurting me, it makes their job easier, I can get through this, this is OK, this won’t last very long. I didn’t want to ask for something special, didn’t want to make their jobs harder, didn’t want to be difficult. I got through the experience and went along with my life.

But now when I’m in a dentist’s chair, the same thing happens. I tremble while big fat tears roll down my face into my hair. And I’ve had a couple of particularly crappy dental procedures this year: I had a root canal in March, and last week I broke a tooth, so there was a long appointment to prepare for a crown. For both of those, I took a Xanax and listened to music very loudly through my earbuds—but needing to do that made sense, because those were procedures that everyone agrees are unpleasant dental experiences. But this past Thursday, I had to admit that something is now deeply messed up with my relationship with dentists.

My temporary crown fell off Thursday morning, so I scheduled a trip back to the dentist for the lunch hour. I forgot to bring a Xanax to work, but I wasn’t too worried, because this was just not going to be a big deal at all. When I got there, the technician said she didn’t think there was any need to numb me, because creating a new temporary crown was going to be pretty quick and thoroughly painless. And it was.

But it didn’t matter. Same tears, same trembling, same feelings of utter abjection and helplessness. I think this is what trauma is, right—an experience that doesn’t leave you, the afterlife of which you can’t control? And this is innocuous—something that is not secret or embarrassing or humiliating or shameful, as so many traumas are; something that is not so horrific that I can’t even think or talk or write about it. So if it seems dumb that I’m saying I was traumatized by ten minutes in a dentist’s chair with a piece of plastic in my mouth . . . well, I think it’s dumb, too, but I think it helps me to understand trauma better.

I did a lot of emotional work in my late teens and twenties, mostly because I had to, and I learned enough through four years of weekly therapy to have a pretty stable adult life. Eventually, though, in the past couple of years, I figured out that a lot of what I had learned was about control and clamping down and avoiding anything that looked like a potential slippery slope to depression. I had learned to “stop it,” as the Bob Newhart skit advises:

I really hate that skit.

I became adept at “stopping it” through the kind of mind-body split exemplified by my self-talk when I was trying to get through that experience at the dentist: trying to *think* enough that I wouldn’t *feel*, using thoughts to deny my emotional experience, which was also necessarily my physical experience.

And I don’t think that these were bad skills to develop when I was in my early twenties. I just think I relied on them for too long and didn’t realize that they were tools, not truth. I’ve been working for the past couple of years on doing a better job of welcoming my difficult emotions, and that means also accepting the ways those feelings are in my body.

I don’t really know how to get to where I can be calm in a dentist’s chair again. By not advocating for myself in the moment when I needed to, I now have a problem that I have to figure out how to solve (preferably before I get the permanent crown placed in a couple of weeks, hahaha). It’s helpful for me to have had this experience, though, because it confirms so much of what I’ve been thinking about the emotional work I need to do now, all these decades after I felt like I had figured things out.