All I can believe in now is change

Published 18 Sep 2020, last modified 19 Sep 2020

This is the story of my life, but here I have tried to tell it not so much from my own perspective as much as from the perspective of history and broad social trends. This essay grew out of an assignment for an introductory sociology course I took as part of my bachelor’s degree program. I have significantly expanded it and am sharing it here not because my life is particularly interesting, on its own, but because it’s a good exercise for any of us to take a step back and view our lives from an outside perspective, to ask ourselves what experiences shaped us, how we came to believe what we believe, and what factors influenced the way we view the world today.

Greycourt State Park, Methuen, MA

From the early 1990s to the present day, my life has followed, in many ways, the convoluted post-Cold War trajectory of the social, economic, and political development of the United States. I have witnessed the deindustrialization of the U.S. economy, the birth of the “War on Terror,” and new developments in the social movements for the rights of LGBTQ and disabled people. All of these events have weighed upon the timeline of my personal life and even shaped what’s on my résumé.

My mother was a freshly educated nurse from Lawrence, Massachusetts; she had grown up the oldest of five children in a working-class household. Her father, who was from the French-Canadian community around Lowell, had dropped out of college after what my grandmother describes as a nervous breakdown and spent his entire career working for the local electric company, Mass Electric. Her mother was from an Irish-American family of Lawrence textile mill workers, and she herself had taken secretarial work in one of the city’s remaining mill businesses before she married. Both of my mother’s parents were steeped in a history of labor struggle. The Lawrence textile mills, as exploitative as any such work sites of the era, were at the center of several of the earliest large-scale industrial strikes in the United States, most famously the Bread and Roses Strike of 1912, an early success story for the Industrial Workers of the World (IWW). My maternal grandfather was himself a career union worker, and my mother recalls the tension and dietary changes that occurred in the family whenever the Mass Electric workers’ union called a strike; the cheapest hot dogs from DeMoulas would become their primary source of protein for extended periods of time, and her mother struggled to navigate the byzantine restrictions that were placed on food stamps (which, she reminds me, were literal stamps at the time). Yet, amongst these privations, my mother’s family enjoyed certain remarkable economic opportunities that were extended to white working families in the U.S. at the time. My mother was born into a drafty public housing unit, but her family was soon able to obtain a mortgage on a sturdy, amply sized, recently constructed three-bedroom house in a comfortable neighborhood with easy access to the center of town; my grandparents still live in that house now. After all those strikes and negotiations with the company, my grandfather was able to retire with a decent pension.

My father was from a large French-Canadian farming family in northeastern Vermont, and had moved to the Merrimack Valley region for work, after living with his sister in Alabama for a time to escape the control of his physically abusive father. Like all but one of his nine siblings who lived to adulthood, my father found a career that did not involve operating a farm; he became an electronics technician after attending community college and working a number of “unskilled” jobs, including in security at Wang Laboratories. He met my mother in a faith-based social group for young singles at a Catholic church. Both of my parents were able to essentially pay for their own college education without significant loans, through a combination of tuition costs at community and public colleges that were unimaginably low by today’s standards; steady, easy-to-come-by employment throughout their student years; and government grants (like the Pell Grant), which at the time covered a significant portion of the cost.

I was born in 1991, not long after the fall of the Berlin Wall, at a time when the United States had emerged from the long Cold War as the world’s only superpower. My parents were in their early twenties, and owned a somewhat cramped two-family house in a small city in the Lawrence-Lowell metropolitan area north of Boston, Massachusetts; I was their first child to survive infancy. From a very early age my parents noticed that I exhibited certain developmental atypicalities: that I stacked my toys in neat piles in my crib and sorted them by size, that I would hold my arms out horizontally when picked up, and that my early verbal development progressed much more quickly than for most children. A friend of my parents who specialized in early childhood development expressed concern that I was speaking in full sentences far earlier than expected, but my parents were unperturbed. As I grew into toddlerhood and elementary school, I developed a pattern of behavior, including an intense interest in identifying the composers of music on the classical radio stations and frequent meltdowns when faced with minor changes in plan, that would likely have resulted in a diagnosis of Asperger’s syndrome (now essentially rolled into the autism spectrum disorder diagnosis) at a somewhat later point in time, or if my parents had seen this behavior as worrying from the start. As it was I spent the first decade of my life without this being pathologized. (I was eventually brought to psychological counseling at age 11, but was treated for anxiety, not for a developmental disorder.)

In 1993, we moved a mile or so to a larger house, in anticipation of the birth of my younger brother; at the time this seemed like a completely world-altering experience, but we had a much bigger change in store. In 1997, when I was six years old, my father’s company relocated to Texas. My father struggled to find a similar job near our home; eventually a friend recommended him for a position at a microwave communications company in the Puget Sound area of Washington. The four of us moved rather abruptly; I had already been measured for a uniform so I could start first grade at the grammar school affiliated with our church, but soon enough we found ourselves moving 3,000 miles away, far from my entire extended family. Living in Washington was not hard for us; we had an enormous yard and an amount of living space that seems almost absurd by New England standards, and the public school faculty were kind and accommodating of my (again, mostly not pathologized) developmental atypicalities. But the suddenness of the change was almost earth-shattering for me emotionally, and in the long term it gave me an unsettling first impression of what “the economy” was. As a child, I developed a sense that the inscrutable vicissitudes of the market could uproot entire families without warning and send my life in a completely different direction. And even after we moved to Washington, we didn’t find long-term stability in my parents’ employment. My mother quit nursing for the first few years after we moved, but then everyone at my father’s workshop was laid off without warning as the dot-com bubble burst, circa 2001. For a while, my mother returned to nursing while my father played at homemaking, until he took a job in flight recorder repair. Between the sluggishness of the aviation industry in the early 2000s and the departure of Boeing manufacturing from the Seattle area, this was not to last.
We finally moved back to Massachusetts in 2003, living in my grandparents’ basement for a few months until my parents could take out a mortgage on a new house.

If it was abruptly moving far from home in 1997 that made me aware of the economy, it was the Seattle WTO conference protests of 1999, the “hanging chad” U.S. presidential election of 2000, and ultimately the airplane hijacking attacks of 11 September 2001 that brought politics home into my conscious daily reality. When the attacks began I was asleep, as I lived on Pacific Time and had a late-starting school day—but my mother woke me up early and compelled me to watch the news on her bedroom television when the story broke that a plane had struck one of the World Trade Center towers. She was crying, but she told me she thought it was important for me to see what was happening. When she was a child, her own parents had hidden from her the very existence of the war in Vietnam until it was over, and I suppose she recognized that she would not be able to do the same for me. So I watched the collapse of the Twin Towers in real time as my mother recalled aloud that taking a public elevator to the top of one of those buildings on a family vacation was one of the most exciting moments of her childhood.

It was how things developed after the attacks, as much as the attacks themselves, that changed how I saw the world. In particular, the proliferation of jingoistic imagery and rhetoric, the “These colors don’t run” bumper stickers and extraneous decorative flags, and the anti-Muslim rhetoric from Michael Savage on the radio, struck me at the time as a baffling and concerning response to the horrifying loss of life we had witnessed on television. If the immediate rush of firefighters into collapsing buildings had represented the best of humanity, this enduring urge to project vindictive American power seemed to smack of something almost unspeakable, of humanity’s less charitable impulses. Moreover, adults around me seemed unable or unwilling to address my biggest question about the event. Why? Why did this happen, and more to the point, why did people choose to do this, ending not only so many others’ lives but also their own? “Because they were evil men, because they hate America…” As a child interested in storytelling and theater, I found these answers unsatisfying; they couldn’t adequately describe the flawed motivations of human beings. My lingering distrust of the hawkish line of public rhetoric simmered, largely unexpressed, and carried me right through the U.S. invasion of Iraq, until I began to see my private nightmares reflected on the television screen in photos from Abu Ghraib. Until 2001 I was made to believe that I lived in an especially enlightened corner of an enlightened world more or less at peace, and after 2001 it seemed I lived in a country forever at war, where the only common ground uniting a majority of people was a vague sentiment of “support” for the disembodied “troops.”

Though my educators up to that point had made some effort at promoting multicultural values, and had taught a somewhat sanitized version of the history of the American Civil Rights Movement of the 1960s, the first time I found myself regularly spending time in an environment that wasn’t predominantly, perhaps even overwhelmingly white and English-speaking was when we had moved back to New England and my parents sent me to a Catholic high school in Lawrence, Massachusetts. In my mother’s youth, Lawrence was largely dominated by ethnic Irish and Italian communities, but by the time I started high school there in 2005, white flight had weathered the infrastructure and institutions of the town, and new communities had been forming amidst the mess left behind, composed especially of new arrivals from Puerto Rico and the Dominican Republic. Classmates at my previous school, a combined Catholic middle-and-high-school in a wealthier town closer to Boston, where my parents couldn’t really afford the tuition, acted surprised when I told them where I would be attending high school—“Central Catholic?” they’d ask, “Isn’t that, like, in the ghetto?” Sadly, neither the faculty of Central Catholic High School nor the student body was entirely reflective of the ethnic and economic diversity of the surrounding community, but it did turn out to be a much more diverse environment in those respects than my previous school, and I found myself more at ease there for that reason. Although, being white, I did not immediately stand out in the student body at my previous school, I had a distinct sense there of being somehow apart from the student culture; I felt that far less acutely in this new environment, where the student culture seemed much more loosely defined.

It was at this time, as I entered high school, that my parents divorced. There were a lot of factors in the divorce, but one of the most important was my parents’ differing conceptions of the institution of marriage. Over time my mother came to understand that my father held a more traditionalist view of marriage, believing that their respective gender roles naturally conferred upon them different sets of responsibilities. My father resented that my mother earned more money at work than he did, expected a degree of control over my mother’s life and household affairs that my mother found unreasonable, and was often dismissive of her work at home, maintaining the household, while refusing to share the burden of this work. Lifelong marriages are a cornerstone of Catholic tradition, and at an earlier time in history, my mother would likely have remained married to my father for the rest of her life, despite their differences. As it was, they remained married for well over a decade, though my mother will confess to having doubts in their very first week of marriage.

The meaning of marriage was a very timely topic in my youth. Same-sex marriage became very much a hot-button issue in my social milieu in 2004, when I was in middle school, as my home state of Massachusetts became the first state in the U.S. to extend marriage to same-sex couples. Most of the rhetoric on marriage that I was exposed to was influenced by Catholic doctrine (which is governed by a central authority that does not recognize same-sex marriage) or by the reactionary “social conservative” politics espoused by figures like Howie Carr, whose radio show I then heard often in my mother’s clunky Saturn L300. At first I learned to parrot these ideas when probed, but as the limitations of state-recognized marriage evolved into an enduring national issue over the next several years, and as I became more capable of independent thought, I began to mistrust this line of thinking just as I had distrusted the wave of militaristic nationalism I had witnessed beginning in 2001. It was to become an increasingly personal issue for me. In high school I was increasingly aware that I didn’t possess the sexual attraction to women that was supposed to be a defining feature of my adolescence. This is to say, I had some sense that I wasn’t straight, but neither did I have what seemed a defining experience for gay and bisexual men—I couldn’t really say that I was attracted to men either. This became a source of constant, private distress to me until 2011, when I was in college for the first time and first discovered a community of asexual people. It was only then that I gained the ability to express my personal connection with the broader national and global social movement for LGBTQ rights. It wasn’t only Howie Carr’s opposition to same-sex marriage that increasingly bothered me as I came of age; he also expressed ideas about the political economy that seemed to align less with reality the more I learned about the world.
At the time, what my mother seemed to find most agreeable about Howie Carr’s ideas was his praise of American meritocracy. Howie Carr and my mother agreed that people who had a lot of money had it because they worked hard for it, and that upward social mobility was open to anyone diligent enough to pursue it. None of this made sense to me, because the evidence in my own life didn’t bear it out. My mother was just about the hardest-working person I could imagine; at the time she was a full-time nurse in a chronically understaffed medical-surgical unit at Saints Memorial Medical Center in Lowell. (The hospital is now a campus of Lowell General Hospital.) She worked long hours, often clocking out and continuing to work, or working straight through her unpaid lunch breaks, because she could lose her job if she took overtime or skipped breaks, but patients would be left unsupervised if she did not continue to work anyway. Yet this hard work never seemed to help her get ahead; she continuously struggled to balance the household budget, and when the hospital began to have financial trouble, she lost her accrued vacation time. And when it came to my own ability to participate in the fabled American meritocracy, I didn’t seem to have much going for me. My parents first started cajoling me to find my first job in the immediate wake of the 2008 Great Recession; many of my peers said they had gone looking for part-time work in entry-level service positions only to find themselves competing with struggling middle-aged homemakers. Still, I did what my parents suggested and applied for work at any store or fast-food business I thought I could regularly get to, and a job never materialized for me. All of this eventually brought me to a position of sympathy for the Occupy Wall Street movement when I saw it materialize in person in New York, in 2011.

In 2009 I graduated high school and went to college for the first time; my parents co-signed enormous student loans so I could attend a small liberal arts college in Manhattan. I initially planned to study theater, but my mother told me in absolute terms that I would not be able to financially support myself in the long term if I studied theater, and I believed her, so I chose what I thought was the closest viable field: psychology. I devoted myself very seriously to pursuing my understanding of my chosen field and earning my degree, but by the middle of my third year in college I started to realize that I had a problem I didn’t know how to solve: I couldn’t complete many of the large writing assignments I was given in my required literature and history courses. I sought help from my school’s psychological counseling office and was told I “probably” had ADHD, but the school’s tutoring services for students with ADHD were contingent on presenting an official diagnosis of the condition, and I could not find anyone who could screen me for ADHD at a price I could afford with my health insurance. Some of my friends who were involved in the autistic self-advocacy movement had also begun to explain to me why they felt I might be autistic. (Eventually, after I left school, I brought this up with psychological counselors who agreed I was “probably” on the autism spectrum, but as with ADHD, I have never obtained a formal screening for autism spectrum disorder.) With or without any official diagnosis, however, the behavioral or neurological atypicalities that had made me hopelessly “weird” throughout my youth, that had been dismissed as side-effects of being a “gifted” student or medicable symptoms of anxiety, were standing in the way of my finishing my degree, and I had never developed effective coping strategies or mitigations for them.
In the end I flunked a few classes in one semester and went into a tailspin of depression and secret suicidal ideation and finally dropped out of school in 2012.

Shortly before I dropped out of college, my extended family gave me an opportunity I never had before or since, to travel outside the United States and Canada. Specifically, I was given a ticket to visit my uncle, who was then working as a director at a school in Panajachel, Guatemala, and his wife, who was writing grants and performing other volunteer work for an indigenous women’s organization called Oxlajuj B’atz’. It was an eye-opening moment for me in a number of ways. It was the first time I saw the American empire from the outside—in a country where the ongoing economic and social effects of neoliberal U.S. foreign policy were still very apparent, and sometimes led to open hostility. My uncle later told me that mere weeks after we had spent a night in Antigua Guatemala, and had stopped on a park bench under the street lamps (where I smoked the only cigar I’ve ever had in my life, because my uncle thought I ought to have a Cuban cigar while I was in a country where one could openly buy them), two American tourists were stabbed to death, apparently because they were visibly American. And this didn’t shock me, perhaps, as much as it should have, because that night in Antigua Guatemala we had snuck into the courtyard of an opulent hotel frequented by American industry executives, one we could never have afforded ourselves, to see some of the city’s historic ruins that are surrounded by the hotel. Seeing what it cost to stay there in comparison to the going rates for various simple services, I could only imagine the resentment that might build amongst people who lived in great poverty – the people I saw carrying enormous bundles of wood on foot, using straps around their foreheads to balance the load, alongside the country’s long, dusty highways, or the tuk-tuk drivers in Panajachel who could expect no more than a 5-quetzal fare for driving someone clear across town.
These people lived always in the shadow of ostentatious wealth that has been controlled by institutions like the IMF on behalf of American companies like Chiquita, which instigated a coup in Guatemala in 1954, when it was known as the United Fruit Company.

In Guatemala, I also witnessed the run-up to the country’s 2011 general election. There were unconscionably loud rallies and dance parties in the street organized by supporters of the two presidential front-runners, including alleged war criminal Otto Pérez Molina, who ultimately won the presidency. I heard rumors that both of these campaigns had already begun discreetly advertising the prices they would pay for individual votes. Around Panajachel, I saw many hand-painted campaign advertisements for the leftist URNG-MAIZ coalition and the presidential candidate it was endorsing, Rigoberta Menchú, who had won a Nobel Peace Prize for documenting war crimes committed against indigenous Guatemalans in the Guatemalan Civil War. I even attended an event at Oxlajuj B’atz’ at which indigenous women spoke about the upcoming election. My Spanish was poor, and their speeches were peppered with K’iche’ loan words I didn’t know, but I picked out a recurring theme: “Every time an election is coming, we hear so many promises, and as soon as we have voted these people into office, they forget about us.” I see shades of the 2011 elections in Guatemala whenever I look at U.S. electoral politics.

Back in the U.S., the proliferation of smartphones—and, with them, mobile social media usage and citizen video journalism—was about to bring about a groundswell of awareness of the frequent violent deaths of Black Americans during encounters with law enforcement and adjacent organizations. I first became really aware of this as a contemporary problem in March 2012, after a civilian neighborhood watch member in Florida pursued and murdered unarmed 17-year-old Trayvon Martin, who was walking home after buying snacks from a convenience store. Over the coming months and years, the names of more lives cut short like this came to national attention. Just a few of them were Rekia Boyd, Michael Brown, Eric Garner, Tamir Rice, Freddie Gray, Sandra Bland, Philando Castile… The public demonstrations that followed these deaths were not physically close to me, as I had dropped out of school and left the city, but the Black Lives Matter movement that arose organically from them was always present in the culture around me. And so, too, was the movement’s opposition, a jingoistic “pro-police” movement that eventually adopted the slogan “Blue Lives Matter.”

Having seen up close what poverty looks like in the developing world, I soon got a taste of poverty as a young adult in the U.S. When I finally found work after leaving school, I began a series of minimum-wage or near-minimum-wage retail jobs, and moved in with two people I had initially met through the asexual community in Boston and through their occasional involvement in a local chapter of the Autistic Self-Advocacy Network (ASAN); the three of us became committed partners, and, for reasons of disability, I was the only one of us consistently capable of maintaining a job. Some of the jobs I held during this period were stable positions that gave me 40 hours of work per week with the potential for overtime, but for about three years I was stuck in a job at Target where hours varied wildly in quantity and scheduling, ranging from 35 hours per week during the busiest parts of the year to 12 hours per week at other times, leaving us often unable to pay our rent in full when my hours dropped without warning. This, combined with frequent threats that I would be fired for underperformance, a state of disrepair in our Fitchburg, Massachusetts apartment, and our inability to continuously heat our apartment and maintain hot water during the winter months, caused an enduring strain in our lives that exacerbated anxiety in all three of us and seemed in particular to worsen the debilitating chronic pain one of my partners experienced. Eventually, the three of us moved into an apartment in my mother’s two-family house, and as the labor market loosened, I finally found a steady job at the local wholesale club that guaranteed 40 hours of work per week—though, as in all retail jobs I’ve worked, I was strongly warned not to participate in any attempt to unionize the workplace. It was only when we reached this level of material stability that we decided to have a child, and that I could finally return to college and study again.

In 2016, while I was preparing to become a college student again—but as a working parent, this time, and with more specific plans for using my coursework to land a better-paying job—the national political landscape seemed to implode around me. In the early stages of the 2016 U.S. presidential primary elections I had followed the Bernie Sanders campaign with great interest. Here was someone who seemed capable of at least acknowledging the difficulties of everyday people’s lives, of the struggle to balance unreliable working conditions with a difficult housing market and financial obstacles that prevented many people from obtaining even basic healthcare, who at least made some attempt to address the shameful police killings of ordinary Black people like Eric Garner and Sandra Bland respectfully, as a policy issue. It wasn’t to be; even early on, the Democratic party primary race in which Sanders was running came to be dominated not so much by any kind of policy proposals as much as by pure opposition to one of the contenders in the Republican primary, Donald Trump, who launched a campaign founded on a fear-based proposal to effectively halt immigration from Latin America and from majority-Muslim countries. The Democratic party leadership coalesced around a moderate, established political figure, former Secretary of State Hillary Clinton, as the only viable candidate to oppose Trump, as Trump was proving that he was the only candidate in a large Republican slate who could generate real enthusiasm for the party. Thus, with these two parties having long held an effective duopoly over U.S. government, the enormously expensive and furiously contested 2016 U.S. presidential election was framed to the public as a choice between a perhaps mildly reformed status quo and a political paradigm rooted in overt anti-immigrant sentiment.
Ultimately, a plurality of voters would back the status quo represented by Hillary Clinton, while Trump, who received 2.8 million fewer votes than his main opponent, nonetheless won the presidency due to the country’s Electoral College system, which awarded the office on the basis of 538 “electors” allotted to the states, with greater per-capita weight given to less populous states. It was the second time in my life that the U.S. presidential candidate who won the most votes from the general population was not elected to the office.

In the run-up to this election I had watched the bounds of acceptable political discourse in the United States change wildly. The fight for a minimum wage commensurate with cost of living and a single-payer healthcare system, the hallmark demands of Bernie Sanders’ scuttled 2016 campaign, were largely abandoned in the national political scene as, at best, demands that would have to be delayed until Trump’s overt appeals to racism and chauvinism were defeated in the election. Meanwhile, increasingly bizarre displays of a fascistic cult of personality emerged in and around Trump’s campaign events and gradually gained acceptance in the political mainstream. “Roman salutes” most familiar from World War II-era Nazi propaganda appeared amongst the candidate’s supporters at rallies, journalists from mainstream news outlets were relentlessly harassed and physically intimidated in their press pens, and an emerging group of “alt-right” organizers were credulously interviewed about their commitment to a “white ethnostate.” I knew that this fascist cultural undercurrent would not simply dissipate if Hillary Clinton won the presidency, but it only accelerated when Trump was elected instead. I had a shift at work before dawn the day after that election day, counting inventory for all the jewelry stocked in the store. It was obvious which of us were concerned… bleary-eyed, groggy, distracted, as if we had all caught a bad cold at once. Those among us who had been cheering for this new president were quite chipper.

Determined to at last finish a bachelor’s degree while avoiding long-form writing assignments in history or literary analysis, and to move as quickly as possible into a full-time job that could financially support my growing family, I became a part-time computer science student at UMass Lowell in the Spring 2017 semester, cobbling together an awkward schedule of work shifts at 3am or 4am in Salem, New Hampshire followed by afternoon classes on campus in Lowell; I continued my studies through the summer of that year. In the following autumn I was already beginning to search somewhat desperately for an internship I could use to build my résumé and ultimately get out of retail work. None of the applications I made seemed to get any interest for months until, out of the blue, one of the first companies I had applied to asked to interview me for a different internship position.

That internship began in the summer of 2018, immediately paid almost double the hourly wage I had been paid at the wholesale club, and was ultimately extended through the rest of my studies. Working for this software company, I had to adjust to a lot of things, like not being expected to perform obvious physical labor at all times, being encouraged to casually chat with coworkers throughout the day, having the opportunity to just step away from my work and grab a quick company-provided snack or some water pretty much whenever I thought it could help me clear my head. Wearing comfortable clothes that didn’t have to be in identifiable company colors. Being invited to lunch on the company dime sometimes. This last practice was especially bewildering to me at first; I remember wondering nervously whether I would be expected to cover my tab after being trotted out to some pizza-and-pasta restaurant that was a little more upscale than anyplace I would usually visit.

Another thing that distinguished this internship from the jobs I’d had before was that it was extremely flexible in its scheduling; I was able to become a full-time student after that initial summer and work sporadic part-time schedules with the company between classes. Things accelerated at work and at school until I realized I’d be able to complete my degree even sooner than I’d hoped, in May 2020, and my manager offered me a full-time position. My commencement was scheduled for a Saturday; the following afternoon I had been booked on a flight to North Carolina in order to attend a two-day employee orientation at corporate headquarters.

Or, that was the plan as of February 2020. Early in March I had a week of spring break at home, during which the governor of Massachusetts declared a state of emergency as it became clear that many Massachusetts residents who had been denied testing for the new coronavirus disease COVID-19 by the CDC did in fact have the disease, and that it had already spread to countless others who didn’t even know they had been exposed. That week many universities announced that they would abruptly make all instruction online-only, something they’d never done before. I haven’t been back to UMass Lowell’s campus since then, and six months later life in the United States is still shaped by its enduring COVID-19 outbreaks. But I’ve been lucky. Those close to me who have had the disease are in good health again, and unlike millions of people in this country I still have my job; they’ve even allowed me to continue working from home all this time.

The public reaction to the COVID-19 pandemic in the US has a markedly different character from what I saw following the events of September 2001. For one thing, the memorializing of the dead is hampered because we aren’t watching them die in a fiery inferno on live television, because their deaths are happening all around us, constantly, and not all in a terrifying instant, and because we don’t know when the deaths will finally stop. This detachment from the reality of the crisis also allows bizarre new heights of denialism. Whereas conspiracy theories about the 11 September attacks posited that they were an “inside job,” and that elements of the events we saw on television were faked in order to cover the true identity of the perpetrators, COVID-19 conspiracy theories often question whether the pandemic is happening at all, suggesting that its enormous death toll is a statistical fabrication. At this point, the extent to which one believes the virus exists and that public health precautions are necessary is a matter of intense political rhetoric. It will be hard for us as a society to heal from a disease when we can’t even agree that it exists. And this dialectic of opposing conceptions of basic reality extends, too, to the ongoing Black Lives Matter movement, which reached new heights of public attention amidst the pandemic lockdowns of the summer of 2020, as demonstrations followed the murders of Elijah McClain, Breonna Taylor, and George Floyd.

We learned something else during that week in March: we’re expecting to welcome a new member of our family before December. There’s a lot we don’t know about how things will unfold then, like which of us will be allowed in the hospital at the time of the birth, or whether this pandemic will persist here long enough that the child will remember it later in life. All we can do is make near-future plans: plans to prepare our living space for a newborn, to save up and hopefully buy a house for the first time, perhaps a year from now.

The convoluted story of my life thus far is nothing I would have planned for myself; it is as much a story of accidents and unexpected events as it is the story of my own choices. History, with its absurd social, political, and economic upheavals and fluctuations, has shaped my life as much as, or more than, my own character. Thankfully, however, historical progress and collective action for positive change have also had an impact on my life, allowing my family and me to live in relative peace today. But in 2020, looking back over the course of my life, there is very little I can say with confidence about our shared future. All I can believe in now is change.

