Published 26 Oct 2020, last modified 27 Oct 2020
At some point in 2018 I was trawling YouTube looking for covers of the Sonny Bono composition “Bang Bang (My Baby Shot Me Down)” in different languages when I found a bilingual French-and-Vietnamese performance by pop singer and actor Thanh Lan. From there, by copying search terms from the Vietnamese video description I didn’t understand, I found Nhạc Trẻ 6 (Youth Music 6 in English), the sixth entry in a series of audiocassettes featuring various artists in the emerging “youth music” genre and produced in Saigon (now Ho Chi Minh City) during what would turn out to be the final years of South Vietnamese rule.
Nhạc Trẻ 6 itself was released in 1975, the same year that the People’s Army of Vietnam took control of the city, bringing a decisive end to a twenty-year war against the U.S.-backed forces of South Vietnam. So it’s already apparent just from the cassette sleeve that this is a relic of a failed state, an enclave that would not exist much longer. This is reinforced by the live introduction on the first track, featuring concert-hall applause and two announcers speaking in multiple languages. Just before the first song begins, one of the announcers concludes in English:
And remember when we air from South Vietnam… composed by Ngọc Chánh and Phạm Duy, singing by Thanh Lan. “I Have Learned Sorrow.”
This brooding, bluesy number is performed entirely in Vietnamese and ends in more applause; no live-concert noise occurs on the rest of the album, but with a murky reverb on the vocals and the same consistent sound from the acoustic backing band (strings, guitar, piano, and organ) it’s not hard to imagine the cassette as the well-edited product of a single live performance—though I don’t believe it is, as some of its tracks appeared on previous tapes.
The next number is a sentimental pop ballad, perhaps a Vietnamese reworking of a J-pop song since the cassette sleeve notes “Anata” as an alternate name. But the third track starts a pattern that characterizes most of the songs on the album. The song is “Après toi,” a French hit from 1972, and Thanh Lan sings the original lyrics all the way through with the diction of a lifelong Francophone and the passion of a Method actor; then, when the song seems to be reaching its musical conclusion, the band keeps playing and she sings it all again, just as capably and passionately, in Vietnamese. Twelve of the album’s sixteen songs are performed bilingually like this. On all but one of these, the Vietnamese lyrics are credited to Phạm Duy, who would go on to become the preeminent songwriter of the Vietnamese diaspora.
I imagine that simply to sing so much in French would have been quite a political act in Vietnam in 1975; the government of North Vietnam was born from armed resistance to French colonization of the region, and the South Vietnamese government was seen in a number of ways as heir to the legacy of French colonialism. Especially notable, given South Vietnam’s status as a sort of de facto Catholic theocracy in a majority-Buddhist region, is the inclusion of “Oui devant Dieu,” whose original French lyrics echo traditional Catholic wedding vows.
But for all the political portent surrounding the album’s performances, they don’t strike me as especially ideological. Granted, knowing no Vietnamese, I can’t evaluate the songs’ lyrical content in that language. But on an album that contains mostly international pop standards of its era, these are all thematically quite inoffensive—basically, they’re just bog-standard love songs. The political element here (again, as someone who understands no Vietnamese) sounds more like a matter of ethnic/cultural signifiers (the French language, Catholic overtones). This is in contrast to other songs recorded during the war by musicians like Elvis Phương and Carol Kim that explicitly decry socialism or romanticize the Army of the Republic of Vietnam.
One interesting inclusion is a French/Vietnamese performance of the traditional folk song “Scarborough Fair.” The lyrics of the song itself are not especially relevant to 1975 politics, but it was internationally fashionable at that time because of a late-1960s recording by Simon & Garfunkel that turned it into a thinly-veiled protest of the American war in Vietnam by setting it in counterpoint to an original song about a wartime massacre. Only the traditional tune and its lyrical themes are performed on Nhạc Trẻ 6.
What really gets to me about the whole album is just the depth of pure feeling throughout the whole thing, from the opening right through Thanh Lan’s duet with charismatic baritone Elvis Phương on the final number, the only song to feature a second vocalist. I’ve listened to a smattering of other music from the Nhạc Trẻ series and from Thanh Lan’s post-1975 career, and it’s all musically good, but none of it quite matches the emotional resonance of this tape. Part of it is simply the quality of the band—many of Thanh Lan’s later recordings use more mechanical sequenced synth instrumentals belying her powerful voice—but much of it is just the ineffable sincerity of the musicians and the mystique of the moment in time it captures. For me this is most present in the virtuosic and highly emotionally charged performance of Christophe’s “Oh ! mon amour,” a rapturous and plaintive song about love’s power to uncover the value and meaning in life.
The people who made this tape had to know that the social order that defined their lives was about to drastically change. They must have understood that the cruel war that had long surrounded them, with its constant uncertainty and massacres of civilians on all sides, was coming to a close. And at the same time they must have feared (and rightly so) that there would still be hard times ahead; there would still be displaced refugees and old scores to settle after the war. There’s something interesting in the choice to enjoy all these beautifully naïve love songs in the middle of that fraught context.
I guess this is why I enjoy the tape so much, as someone with no personal connection to the world it represents. It is obvious from the way this music is preserved online, categorized explicitly as pre-1975 music and mixed into YouTube slideshows of photographic street scenes from early 1970s Saigon, that it has a much more personal significance for much of the Vietnamese diaspora; it’s a touchstone that connects them to a period of their lives, a whole world of experiences, that had a definite and non-negotiable end date in 1975.
Maybe it’s just because I haven’t found the right Vietnamese web search terms yet, but I wasn’t able to find Nhạc Trẻ 6 as a collection of downloadable audio files, so I ended up making mp3 files from a YouTube video of the whole tape, and I’ve put them in a ZIP archive to share here.
Published 25 Sep 2020, last modified 25 Sep 2020
Why does anyone make art, if they aren’t personally expecting to “earn a living” from it? Why invest the time, make yourself emotionally vulnerable, “bare your soul,” study the masters, take creative risks, and submit your work to members of a judgmental public, if no one is paying you for it? Why, especially, do those of us who are part of the proletarian majority, who have to work to feed and house ourselves and our families, make art that doesn’t pay? To quote The Coming Insurrection:
We know that individuals are possessed of so little life that they have to earn a living, to sell their time in exchange for a modicum of social existence.
So why do we invest some of that time we have to sell for no pay at all? I suspect the answer is that we are cultivating experiences—for ourselves and for others. We are gardeners of our lives, and any gardener knows that no seed you plant is guaranteed to sprout. Those that do sprout don’t always wind up feeding you. We make art to cultivate the gardens of our lives. Not all gardens are for food. Some are for medicine, or for spices, or for private æsthetic enjoyment, or for public recreation… Every thoughtfully written letter you send, every doodle you share, every craft you make as a gift is a seed that may grow untold human experiences.
I recently discovered this little game of sorts called The Endling Archive I made as a high school student. I guess the word game isn’t entirely right; it’s more like hypertext fiction. The conceit is that you the player/reader have connected to this text console that serves heavily abbreviated encyclopedic articles about the world, left by a small post-apocalyptic human space colony. These little article snippets are arranged in a hierarchical menu and the more articles you open, the deeper this menu expands, opening previously hidden bits of lore. What strikes me about it now, more than a decade later, is how earnest and original it is in some ways, how many creative risks I was taking. This isn’t to say that it’s all well-executed or that it’s not embarrassing in any way. Some of the writing, the weird angsty selfie I inserted into the game, and this Japanese pseudonym that somehow seemed totally fine for me to use at the time (even as I would explain quite often that I was not actually Japanese) are all pretty embarrassing to revisit. But I think as a parent I’ve become somewhat more patient with these artifacts of my own clumsy youth, and I can also just see that these artistic themes I was working on at the time were very much worth exploring, even if I wasn’t especially adept at exploring them.
The Endling Archive is about the experience of being an endling—that is, the last living member of one’s species—and that means facing the self-destruction of humanity that’s going on all around us, but also the age-old dilemma that comes from us being such desperately social animals while also being really bad at forming and keeping social bonds, especially in our atomized contemporary capitalist mode of socioeconomic organization. I would find it daunting to write about these things now, and it’s so strange that it was the most natural thing to me in high school, that I would naturally spend my time making not-especially-noticed art about this stuff just because I intuitively felt there was something important underneath it all that I should share with the world. And as embarrassing as I find my work from that time, a few people did tell me they found it interesting or even inspiring in some way. That’s really all I want as an amateur artist.
The more professional our internet spaces become, the more we’re surrounded by the slick handiwork of career influencers, the more I long for people to get out there and make their embarrassing amateur art. I’ve said every piece of art you make is like a seed, but I could also say that on the internet everything you do is a balloon; it floats away from you and becomes a part of something you can’t control, something that could touch other people’s lives. And I think if more of that were earnest, genuine, amateur art rather than advertising, it would enrich all of us.
Also I’m a huge Boards of Canada fan and just really wanted to name this essay after one of their songs.
Published 18 Sep 2020, last modified 19 Sep 2020
This is the story of my life, but here I have tried to tell it not so much from my own perspective as much as from the perspective of history and broad social trends. This essay grew out of an assignment for an introductory sociology course I took as part of my bachelor’s degree program. I have significantly expanded it and am sharing it here not because my life is particularly interesting, on its own, but because it’s a good exercise for any of us to take a step back and view our lives from an outside perspective, to ask ourselves what experiences shaped us, how we came to believe what we believe, and what factors influenced the way we view the world today.
From the early 1990s to the present day, my life has followed, in many ways, the convoluted post-Cold War trajectory of the social, economic, and political development of the United States. I have witnessed the deindustrialization of the U.S. economy, the birth of the “War on Terror,” and new developments in the social movements for the rights of LGBTQ and disabled people. All of these events have weighed upon the timeline of my personal life and even shaped what’s on my résumé.
My mother was a freshly educated nurse from Lawrence, Massachusetts; she had grown up the oldest of five children in a working-class household. Her father, who was from the French-Canadian community around Lowell, had dropped out of college after what my grandmother describes as a nervous breakdown and spent his entire career working for the local electric company, Mass Electric. Her mother was from an Irish-American family of Lawrence textile mill workers, and she herself had taken secretarial work in one of the city’s remaining mill businesses before she married. Both of my mother’s parents were steeped in a history of labor struggle. The Lawrence textile mills, as exploitative as any such work sites of the era, were at the center of several of the earliest large-scale industrial strikes in the United States, most famously the Bread and Roses Strike of 1912, an early success story for the Industrial Workers of the World (IWW). My maternal grandfather was himself a union worker for his entire career, and my mother recalls the tension and dietary changes that occurred in the family whenever the Mass Electric workers’ union called a strike; the cheapest hot dogs from DeMoulas would become their primary source of protein for extended periods of time, and her mother struggled to navigate the byzantine restrictions that were placed on food stamps (which, she reminds me, were literal stamps at the time). Yet, amongst these privations, my mother’s family enjoyed certain remarkable economic opportunities that were extended to white working families in the U.S. at the time. My mother was born into a drafty public housing unit but her family was soon able to obtain a mortgage on a sturdy, amply sized, recently constructed three-bedroom house in a comfortable neighborhood with easy access to the center of town; my grandparents still live in that house now. After all those strikes and negotiations with the company, my grandfather was able to retire with a decent pension.
My father was from a large French-Canadian farming family in northeastern Vermont, and had moved to the Merrimack Valley region for work, after living with his sister in Alabama for a time to escape the control of his physically abusive father. Like all but one of his nine siblings who lived to adulthood, my father found a career that did not involve operating a farm; he became an electronics technician after attending community college and working a number of “unskilled” jobs, including in security at Wang Laboratories. He met my mother in a faith-based social group for young singles at a Catholic church. Both of my parents were able to essentially pay for their own college education without significant loans, through a combination of tuition costs at community and public colleges that were unimaginably low by today’s standards, steady, easy-to-come-by employment throughout their student years, and government grants (like the Pell Grant) which at the time covered a significant portion of the cost.
I was born in 1991, not long after the fall of the Berlin Wall, at a time when the United States had emerged from the long Cold War as the world’s only superpower. My parents were in their early twenties, and owned a somewhat cramped two-family house in a small city in the Lawrence-Lowell metropolitan area north of Boston, Massachusetts; I was their first child to survive infancy. From a very early age my parents noticed that I exhibited certain developmental atypicalities: that I stacked my toys in neat piles in my crib and sorted them by size, that I would hold my arms out horizontally when picked up, and that my early verbal development progressed much more quickly than for most children. A friend of my parents who specialized in early childhood development expressed concern that I was speaking in full sentences far earlier than expected, but my parents were unperturbed. As I grew into toddlerhood and elementary school, I developed a pattern of behavior, including an intense interest in identifying the composers of music on the classical radio stations and frequent meltdowns when faced with minor changes in plan, that would likely have resulted in a diagnosis of Asperger’s syndrome (now essentially rolled into the autism spectrum disorder diagnosis) at a somewhat later point in time, or if my parents had seen this behavior as worrying from the start. As it was I spent the first decade of my life without this being pathologized. (I was eventually brought to psychological counseling at age 11, but was treated for anxiety, not for a developmental disorder.)
In 1993, we moved a mile or so to a larger house, in anticipation of the birth of my younger brother; at the time this seemed like a completely world-altering experience, but we had a much bigger change in store. In 1997, when I was six years old, my father’s company relocated to Texas. My father struggled to find a similar job near our home; eventually a friend recommended him for a position at a microwave communications company in the Puget Sound area of Washington. The four of us moved rather abruptly; I had already been measured for a uniform so I could start first grade at the grammar school affiliated with our church, but soon enough we found ourselves moving 3,000 miles away, far from my entire extended family. Living in Washington was not hard for us; we had an enormous yard and an amount of living space that seems almost absurd by New England standards, and the public school faculty were kind and accommodating of my (again, mostly not pathologized) developmental atypicalities, but the suddenness of the change was almost earth-shattering for me emotionally, and in the long term it gave me an unsettling first impression of what “the economy” was. As a child, I developed a sense that the inscrutable vicissitudes of the market could uproot entire families without warning and send my life in a completely different direction. And even after we moved to Washington, we didn’t find long-term stability in my parents’ employment. My mother quit nursing for the first few years after we moved, but then everyone at my father’s workshop was laid off without warning as the dot-com bubble burst, circa 2001. For a while, my mother returned to nursing while my father played at homemaking, until he took a job in flight recorder repair. Between the sluggishness of the aviation industry in the early 2000s and the departure of Boeing manufacturing from the Seattle area, this was not to last.
We finally moved back to Massachusetts in 2003, living in my grandparents’ basement for a few months until my parents could mortgage a new house.
If it was abruptly moving far from home in 1997 that made me aware of the economy, it was the Seattle WTO conference protests of 1999, the “hanging chad” U.S. presidential election of 2000, and ultimately the airplane hijacking attacks of 11 September 2001 that brought politics home into my conscious daily reality. When the attacks began I was asleep, as I lived on Pacific Time and had a late-starting school day—but my mother woke me up early and compelled me to watch the news on her bedroom television when the story broke that a plane had struck one of the World Trade Center towers. She was crying, but she told me she thought it was important for me to see what was happening. When she was a child, her own parents had hidden from her the very existence of the war in Vietnam until it was over, and I suppose she recognized that she would not be able to do the same for me. So I watched the collapse of the Twin Towers in real time as my mother recalled aloud that taking a public elevator to the top of one of those buildings on a family vacation was one of the most exciting moments of her childhood.
It was how things developed after the attacks, as much as the attacks themselves, that changed how I saw the world. In particular, the proliferation of jingoistic imagery and rhetoric, the “These colors don’t run” bumper stickers and extraneous decorative flags, and the anti-Muslim rhetoric from Michael Savage on the radio, struck me at the time as a baffling and concerning response to the horrifying loss of life we had witnessed on television. If the immediate rush of firefighters into collapsing buildings had represented the best of humanity, this enduring urge to project vindictive American power seemed to smack of something almost unspeakable, of humanity’s less charitable impulses. Moreover, adults around me seemed unable or unwilling to address my biggest question about the event. Why? Why did this happen, and more to the point, why did people choose to do this, ending not only so many others’ lives but also their own? “Because they were evil men, because they hate America…” As a child interested in storytelling and theater, these answers did not satisfy me; they couldn’t adequately describe the flawed motivations of human beings. My lingering distrust of the hawkish line of public rhetoric simmered, largely unexpressed, and carried me right through the U.S. invasion of Iraq, until I began to see my private nightmares reflected on the television screen in photos from Abu Ghraib. Until 2001 I was made to believe that I lived in an especially enlightened corner of an enlightened world more or less at peace, and after 2001 it seemed I lived in a country forever at war, where the only common ground uniting a majority of people was a vague sentiment of “support” for the disembodied “troops.”
Though my educators up to that point had made some effort at promoting multicultural values, and had taught a somewhat sanitized version of the history of the American Civil Rights Movement of the 1960s, the first time I found myself regularly spending time in an environment that wasn’t predominantly, perhaps even overwhelmingly white and English-speaking was when we had moved back to New England for some time and my parents sent me to a Catholic high school in Lawrence, Massachusetts. In my mother’s youth, Lawrence was largely dominated by ethnic Irish and Italian communities, but by the time I started high school there in 2005, white flight had weathered the infrastructure and institutions of the town and new communities had been forming amidst the mess left behind, composed especially of new arrivals from Puerto Rico and the Dominican Republic. Classmates at my previous school, a combined Catholic middle-and-high-school in a wealthier town closer to Boston, where my parents couldn’t really afford the tuition, acted surprised when I told them where I would be attending high school—“Central Catholic?” they’d ask, “Isn’t that, like, in the ghetto?” Sadly, neither the faculty of Central Catholic High School nor the student body was entirely reflective of the ethnic and economic diversity of the surrounding community, but it did turn out to be a much more diverse environment in those respects than my previous school, and I found myself more at ease there for that reason. Although, being white, I did not immediately stand out in the student body at the previous school, I had a distinct sense of being somehow apart from the student culture; I felt that far less acutely in this new environment, where the student culture seemed much more loosely defined.
It was at this time, as I entered high school, that my parents divorced. There were a lot of factors in the divorce, but one of the most important was my parents’ differing conceptions of the institution of marriage. Over time my mother came to understand that my father held a more traditionalist view of marriage, believing that their respective gender roles naturally conferred upon them different sets of responsibilities. My father resented that my mother earned more money at work than he did, expected a degree of control over my mother’s life and household affairs that my mother found unreasonable, and was often dismissive of her work at home, maintaining the household, while refusing to share the burden of this work. Lifelong marriages are a cornerstone of Catholic tradition, and at an earlier time in history, my mother would likely have remained married to my father for the rest of her life, despite their differences. As it was, they remained married for well over a decade, though my mother will confess to having doubts in their very first week of marriage.
The meaning of marriage was a very timely topic in my youth. Same-sex marriage became very much a hot-button issue in my social milieu in 2004, when I was in middle school, as my home state of Massachusetts became the first state in the U.S. to extend marriage to same-sex couples. Most of the rhetoric on marriage that I was exposed to was influenced by Catholic doctrine (which is governed by a central authority that does not recognize same-sex marriage) or by the reactionary “social conservative” politics espoused by figures like Howie Carr, whose radio show I then heard often in my mother’s clunky Saturn L300. At first I learned to parrot these ideas when probed, but as the limitations of state-recognized marriage evolved into an enduring national issue over the next several years, and as I became more capable of independent thought, I began to mistrust this line of thinking just as I had distrusted the wave of militaristic nationalism I had witnessed beginning in 2001. It was to become an increasingly personal issue for me. In high school I was increasingly aware that I didn’t possess the sexual attraction to women that was supposed to be a defining feature of my adolescence. This is to say, I had some sense that I wasn’t straight, but neither did I have what seemed a defining experience for gay and bisexual men—I couldn’t really say that I was attracted to men either. This became a source of constant, private distress to me until 2011, when I was in college for the first time and first discovered a community of asexual people. It was only then that I gained the ability to express my personal connection with the broader national and global social movement for LGBTQ rights. It wasn’t only Howie Carr’s opposition to same-sex marriage that increasingly bothered me as I came of age; he also expressed ideas about the political economy that seemed to align less with reality the more I learned about the world.
At the time, what my mother seemed to find most agreeable about Howie Carr’s ideas was his praise of American meritocracy. Howie Carr and my mother agreed that people who had a lot of money had it because they worked hard for it, and that upward social mobility was open to anyone who was diligent enough to pursue it. None of this made sense to me, because the evidence in my own life didn’t bear it out. My mother was just about the hardest-working person I could imagine; at the time she was a full-time nurse in a chronically understaffed medical-surgical unit at Saints Memorial Medical Center in Lowell. (The hospital is now a campus of Lowell General Hospital.) She worked long hours, often clocking out and continuing to work, or working straight through her unpaid lunch breaks, because she could lose her job if she took overtime or skipped breaks, but patients would be left unsupervised if she did not continue to work anyway. Yet this hard work never seemed to help her get ahead; she continuously struggled to balance the household budget, and when the hospital began to have financial trouble, she lost her accrued vacation time. And when it came to my own ability to participate in the fabled American meritocracy, I didn’t seem to have much going for me. My parents first started cajoling me to start my first job in the immediate wake of the 2008 Great Recession; many of my peers had said they went looking for part-time work in entry-level service positions only to find themselves competing with struggling middle-aged homemakers. Still, I did what my parents suggested and applied for work at any random store or fast-food business I thought I could regularly get to, and a job never materialized for me. All of this eventually brought me to a position of sympathy for the Occupy Wall Street movement when I saw it materialize in person in New York, in 2011.
In 2009 I graduated high school and went to college for the first time; my parents co-signed enormous student loans so I could attend a small liberal arts college in Manhattan. I initially planned to study theater, but my mother told me in absolute terms that I would not be able to financially support myself in the long-term if I studied theater, and I believed her, so I chose what I thought was the closest viable field: psychology. I devoted myself very seriously to pursuing my understanding of my chosen field and earning my degree, but by the middle of my third year in college I started to realize that I had a problem I didn’t know how to solve: I couldn’t complete many of the large writing assignments I was given in my required literature and history courses. I sought help from my school’s psychological counseling office and was told I “probably” had ADHD, but the school’s tutoring services for students with ADHD were contingent on presenting an official diagnosis of the condition, and I could not find anyone who could screen me for ADHD at a price I could afford with my health insurance. Some of my friends who were involved in the autistic self-advocacy movement had also begun to explain to me why they felt I might be autistic. (Eventually, after I left school, I brought this up with psychological counselors who agreed I was “probably” on the autism spectrum, but as with ADHD, I have similarly never obtained a formal screening for autism spectrum disorder.) With or without any official diagnosis, however, the behavioral or neurological atypicalities that had made me hopelessly “weird” throughout my youth, that had been dismissed as side-effects of being a “gifted” student, or medicable symptoms of anxiety, were standing in the way of me finishing my degree, and I had never developed effective coping strategies or mitigations for them.
In the end I flunked a few classes in one semester and went into a tailspin of depression and secret suicidal ideation and finally dropped out of school in 2012.
Shortly before I dropped out of college, my extended family gave me an opportunity I never had before or since, to travel outside the United States and Canada. Specifically, I was given a ticket to visit my uncle, who was then working as a director at a school in Panajachel, Guatemala, and his wife, who was writing grants and performing other volunteer work for an indigenous women’s organization called Oxlajuj B’atz’. It was an eye-opening moment for me in a number of ways. It was the first time I saw the American empire from the outside—in a country where the ongoing economic and social effects of neoliberal U.S. foreign policy were still very apparent, and sometimes led to open hostility. My uncle later told me that mere weeks after we had spent a night in Antigua Guatemala, and had stopped on a park bench under the street lamps (where I smoked the only cigar I’ve ever had in my life, because my uncle thought I ought to have a Cuban cigar while I was in a country where one could openly buy them), two American tourists were stabbed to death, apparently because they were visibly American. And this didn’t shock me, perhaps, as much as it should have, because that night in Antigua Guatemala we had snuck into the courtyard of an opulent hotel frequented by American industry executives, that we could never afford ourselves, to see some of the city’s historic ruins that are surrounded by the hotel. Seeing what it cost to stay there in comparison to the going rates for various simple services, I could only imagine the resentment that might build amongst people who lived in great poverty—the people I saw carrying enormous bundles of wood on foot, using straps around their foreheads to balance the load, alongside the country’s long, dusty highways, or the tuk-tuk drivers in Panajachel who could expect no more than a 5-quetzal fare for driving someone clear across town.
These people lived always in the shadow of ostentatious wealth that has been controlled by institutions like the IMF on behalf of American companies like Chiquita, which instigated a coup in Guatemala in 1954, when it was known as United Fruit Company.
In Guatemala, I also witnessed the run-up to Guatemala’s 2011 general election. There were unconscionably loud rallies and dance parties in the street organized by supporters of the two presidential front-runners, including alleged war criminal Otto Pérez Molina, who ultimately won the presidency. I heard rumors that both of these two campaigns had already begun discreetly advertising the prices they would pay for individual votes. Around Panajachel, I saw many hand-painted campaign advertisements for the leftist URNG-MAIZ coalition and the presidential candidate it was endorsing, Rigoberta Menchú, who won a Nobel Peace Prize for documenting war crimes committed against indigenous Guatemalans in the Civil War. I even attended an event at Oxlajuj B’atz’ in which indigenous women spoke about the upcoming election. My Spanish was poor, and their speeches were peppered with K’iche’ loan words I didn’t know, but I picked out a recurring theme: “Every time an election is coming, we hear so many promises, and as soon as we have voted these people into office, they forget about us.” I see shades of the 2011 elections in Guatemala whenever I look at U.S. electoral politics.
Back in the U.S., the proliferation of smartphones—and, with them, mobile social media usage and citizen video journalism—was about to bring about a groundswell of awareness of the frequent violent deaths of Black Americans during encounters with law enforcement and adjacent organizations. I first became really aware of this as a contemporary problem in March 2012, after a civilian neighborhood watch member in Florida pursued and murdered unarmed 17-year-old Trayvon Martin, who was walking home after buying snacks from a convenience store. Over the coming months and years the names of more lives cut short like this came to national attention. Just a few of them were Rekia Boyd, Michael Brown, Eric Garner, Tamir Rice, Freddie Gray, Sandra Bland, Philando Castile… The public demonstrations that followed these deaths were not physically close to me, as I had dropped out of school and left the city, but the Black Lives Matter movement that arose organically from them was always present in the culture around me. And so, too, was the movement’s opposition, a jingoistic “pro-police” movement that eventually adopted the slogan “Blue Lives Matter.”
Having seen up close what poverty looks like in the developing world, I soon got a taste of poverty as a young adult in the U.S. When I finally found work after leaving school, I began a series of minimum-wage or near-minimum-wage retail jobs, and moved in with two people I had initially met through the asexual community in Boston and through their occasional involvement in a local chapter of the Autistic Self Advocacy Network (ASAN); the three of us became committed partners, and, for reasons of disability, I was the only one of us consistently capable of maintaining a job. Some of the jobs I held during this period were stable positions that gave me 40 hours of work per week with the potential for overtime, but for about three years I was stuck in a job at Target where hours varied wildly in quantity and scheduling, ranging from 35 hours per week during the busiest parts of the year to 12 hours per week at other times, leaving us often unable to pay our rent in full when my hours dropped without warning. This, combined with frequent threats that I would be fired for underperformance, a state of disrepair in our Fitchburg, Massachusetts apartment, and our inability to continuously heat our apartment and maintain hot water during the winter months, caused an enduring strain in our lives that exacerbated anxiety in all three of us and seemed in particular to worsen the debilitating chronic pain one of my partners experienced. Eventually, the three of us moved into an apartment in my mother’s two-family house, and as the labor market loosened, I finally found a steady job at the local wholesale club that guaranteed 40 hours of work per week—though, as in all retail jobs I’ve worked, I was strongly warned not to participate in any attempt to unionize the workplace. It was only when we reached this level of material stability that we decided to have a child, and that I could finally return to college and study again.
In 2016, while I was preparing to become a college student again—but as a working parent, this time, and with more specific plans for using my coursework to land a better-paying job—the national political landscape seemed to implode around me. In the early stages of the 2016 U.S. presidential primary elections I had followed the Bernie Sanders campaign with great interest. Here was someone who seemed capable of at least acknowledging the difficulties of everyday people’s lives, of the struggle to balance unreliable working conditions with a difficult housing market and financial obstacles that prevented many people from obtaining even basic healthcare, who at least made some attempt to address shameful police killings of ordinary black people like Eric Garner and Sandra Bland respectfully, as a policy issue. It wasn’t to be; even early on, the Democratic party primary race in which Sanders was running came to be dominated not so much by any kind of policy proposals as much as by pure opposition to one of the contenders in the Republican primary, Donald Trump, who launched a campaign founded on a fear-based proposal to effectively halt immigration from Latin America and from majority-Muslim countries. The Democratic party leadership coalesced around a moderate, established political figure, former Secretary of State Hillary Clinton, as the only viable candidate to oppose Trump, as Trump was proving that he was the only candidate in a large Republican slate who could generate real enthusiasm for the party. Thus, with these two parties having long held an effective duopoly over U.S. government, the enormously expensive and furiously contested 2016 U.S. presidential election was framed to the public as a choice between a perhaps mildly reformed status quo and a political paradigm rooted in overt anti-immigrant sentiment. 
Ultimately, a plurality of voters would back the status quo represented by Hillary Clinton, while Trump, who received 2.8 million fewer votes than his main opponent, nonetheless won the presidency due to the country’s Electoral College system, which awarded the office on the basis of 538 “electors” allotted to the 50 states, with greater statistical weight given to less populous states. It was the second time in my life that the U.S. presidential candidate who won the most votes from the general population was not elected to the office.
In the run-up to this election I had watched the bounds of acceptable political discourse in the United States change wildly. The fight for a minimum wage commensurate with cost of living and a single-payer healthcare system, the hallmark demands of Bernie Sanders’ scuttled 2016 campaign, were largely abandoned in the national political scene as, at best, demands that would have to be delayed until Trump’s overt appeals to racism and chauvinism were defeated in the election. Meanwhile, increasingly bizarre displays of a fascistic cult of personality emerged in and around Trump’s campaign events and gradually gained acceptance in the political mainstream. “Roman salutes” most familiar from World War II-era Nazi propaganda appeared amongst the candidate’s supporters at rallies, journalists from mainstream news outlets were relentlessly harassed and physically intimidated in their press pens, and an emerging group of “alt-right” organizers were credulously interviewed about their commitment to a “white ethnostate.” I knew that this fascist cultural undercurrent would not simply dissipate if Hillary Clinton won the presidency, but it only accelerated when Trump was elected instead. I had a shift at work before dawn the day after that election day, counting inventory for all the jewelry stocked in the store. It was obvious which of us were concerned… bleary-eyed, groggy, distracted, as if we had all caught a bad cold at once. Those among us who had been cheering for this new president were quite chipper.
Determined to at last finish a bachelor’s degree while avoiding long-form writing assignments in history or literary analysis, and to move as quickly as possible into a full-time job that could financially support my growing family, I became a part-time computer science student at UMass Lowell in the Spring 2017 semester, cobbling together an awkward schedule of work shifts at 3am or 4am in Salem, New Hampshire followed by afternoon classes on campus in Lowell; I continued my studies through the summer of that year. In the following autumn I was already beginning to search somewhat desperately for an internship I could use to build my résumé and ultimately get out of retail work. None of the applications I made seemed to get any interest for months until, out of the blue, one of the first companies I had applied to asked to interview me for a different internship position.
That internship began in the summer of 2018, immediately paid almost double the hourly wage I had been paid at the wholesale club, and was ultimately extended through the rest of my studies. Working for this software company, I had to adjust to a lot of things, like not being expected to perform obvious physical labor at all times, being encouraged to casually chat with coworkers throughout the day, having the opportunity to just step away from my work and grab a quick company-provided snack or some water pretty much whenever I thought it could help me clear my head. Wearing comfortable clothes that didn’t have to be in identifiable company colors. Being invited to lunch on the company dime sometimes. This last practice was especially bewildering to me at first; I remember wondering nervously whether I would be expected to cover my tab after being trotted out to some pizza-and-pasta restaurant that was a little more upscale than anyplace I would usually visit.
Another thing that distinguished this internship from the jobs I’d had before was that it was extremely flexible in its scheduling; I was able to become a full-time student after that initial summer and work sporadic part-time schedules with the company between classes. Things accelerated at work and at school until I realized I’d be able to complete my degree even sooner than I’d hoped, in May 2020, and my manager offered me a full-time position. My commencement was scheduled for a Saturday; the following afternoon I had been booked on a flight to North Carolina in order to attend a two-day employee orientation at corporate headquarters.
Or, that was the plan as of February 2020. Early in March I had a week of spring break at home, during which the governor of Massachusetts declared a state of emergency as it became clear that many Massachusetts residents who had been denied testing for the new coronavirus disease COVID-19 by the CDC did in fact have the disease, and that it had already spread to countless others who didn’t even know they had been exposed. That week many universities announced that they would abruptly make all instruction online-only, something they’d never done before. I haven’t been back to UMass Lowell’s campus since then, and six months later life in the United States is still shaped by its enduring COVID-19 outbreaks. But I’ve been lucky. Those close to me who have had the disease are in good health again, and unlike millions of people in this country I still have my job; they’ve even allowed me to continue working from home all this time.
The public reaction to the COVID-19 pandemic in the US has a markedly different character from what I saw following the events of September 2001. For one thing, the memorializing of the dead is hampered because we aren’t watching them die in a fiery inferno on live television, because their deaths are happening all around us, constantly, and not all in a terrifying instant, and because we don’t know when the deaths will finally stop. This detachment from the reality of the crisis also allows bizarre new heights of denialism. Whereas conspiracy theories about the 11 September attacks posited that they were an “inside job,” and that elements of the events we saw on television were faked in order to cover the true identity of the perpetrators, COVID-19 conspiracy theories often question whether the pandemic is happening at all, suggesting that its enormous death toll is a statistical fabrication. At this point, the extent to which one believes the virus exists and that public health precautions are necessary is a matter of intense political rhetoric. It will be hard for us as a society to heal from a disease when we can’t even agree that it exists. And this dialectic of opposing conceptions of basic reality extends, too, to the ongoing Black Lives Matter movement, which reached new heights of public attention amidst the pandemic lockdowns of the summer of 2020, as demonstrations followed the murders of Elijah McClain, Breonna Taylor, and George Floyd.
We learned something else during that week in March: we’re expecting to welcome a new member of our family before December. There’s a lot we don’t know about how things will unfold then, like which of us will be allowed in the hospital at the time of the birth, or whether this pandemic will persist here long enough that the child will remember it later in life. All we can do is make near-future plans. Plans to prepare our living space for a newborn, for saving up to hopefully buy a house for the first time, perhaps a year from now.
The convoluted story of my life thus far is nothing I would have planned for myself; it is as much a story of accidents and unexpected events as it is the story of my own choices. History, with its absurd social, political, and economic upheavals and fluctuations, has shaped my life as much as, or more than, my own character. Thankfully, however, historical progress and collective action for positive change have also had an impact on my life, allowing my family and me to live in relative peace today. But in 2020, looking back over the course of my life, there is very little I can say with confidence about our shared future. All I can believe in now is change.
Published 17 Sep 2020, last modified 17 Sep 2020
Some things I want to do with hamster.dance soon, as a reminder to myself more than anything:
In the longer term I’m considering migrating the site to a NixOS VPS, implementing a collaborative hypertext gamebook, and hosting some of my own interactive fiction here. I’d also like to have some kind of Gopher or Gemini mirror of my blog, but I don’t know if I currently have the time to maintain such a thing in addition to the HTML-over-HTTP version.
Published 6 Aug 2020, last modified 15 Sep 2020
It’s late summer here, time to think about school again. I finished my bachelor’s degree at long last in May; otherwise it would be about time to set up my personal calendar and start letting the folks at work know what times of week are best for reaching me between classes. If we hadn’t already decided long ago to homeschool through the early years, it would be time for us to prepare our child for her first days away at kindergarten. There were times in my youth when, for reasons related to my neurodivergence and the sometimes unforgiving expectations of the schools I attended, anticipation of the new school year could make me feel physically unwell. As I have gained more autonomy in my life and grown into adulthood (I can hardly believe I’m nearly 30 years old), I have learned to cherish this back-to-school season as an annual moment of new plans, of relative calm with Halloween as the only major commercial holiday on the horizon, and of uncharacteristically gentle weather in New England. The appearance of back-to-school merchandise in stores, then, is an obnoxious reminder of misgivings about school as an institution, but also a harbinger of decent times and a source of cheap office supplies when it all goes on clearance.
This year, 2020, is, of course, different. This year the very idea of school as a place, of children crowding into classrooms for hours at a time—even if schools succeed in making them wear masks throughout the day—strikes fear into my heart.
Yes, I’m in Massachusetts, where COVID-19 deaths & cases have come down from their late April peak and settled into a relatively low statistical valley. And yes, we know that children under 10 are far less likely than other people to actually become ill with the disease. But in the US, where state lines exist more in theory than in the physical realm, a plague anywhere in the country means a potential plague everywhere else. We don’t really know yet with empirical certainty whether children under 10 are as unlikely to spread the disease, compared to adults, as they are to develop plainly obvious symptoms. And in places like the suburbs of Atlanta, Georgia, students, including many older than 10, have already started a new school year, leading to social media alarm over pictures of mostly maskless adolescents packed tightly into school hallways, rushing to class as if this were the Before Times. If that’s how school looks here when it resumes in September, I do not expect that we will remain in this peaceful statistical valley of COVID-19 cases.
The reopening of American schools at a time when this disease is out of control in so many areas is a sort of vast, uncontrolled experiment. A lot of parents, back to working outside the house out of necessity, cannot supervise their children at home during the workday, and the fact is that, culturally, the American school is just as much (if not more) a place to put children during the business week when their parents are expected to focus on something else as it is an institution of training or learning. We do not know, as a market-driven society, how to have children around, participating in our lives all through the work week, and it seems that on a national level we’d rather experiment with putting our children back into these crowded institutions in the midst of a plague than with rearranging the dynamics of labor and the family unit to allow children to stay at home and learn in the home environment until all the metaphorical fires have been put out.
It was in early March, during the last spring break of my academic career (which I spent quietly at home) that a lot of US universities started abruptly moving all instruction online. My university eventually did the same. What was at first a temporary delay in returning to campus quickly became the rest of the semester. It was awkward as hell; generally, instructors who teach well in person do not have a universal knack for teaching their classes online, nor could they be expected to have online-friendly curricula prepared without notice. But we all did this—the awkward videoconference class sessions, scanning exam papers with our phones to submit them, the bizarre Commencement ceremony video with long speeches from deans and pre-recorded well-wishes from assorted television personalities I mostly failed to recognize as I watched in my cap and gown amongst my family at home. We did it all with an understanding that we were operating within a social contract: “stay at home and flatten the curve.” The promise was that we were buying time to make a return to physical social spaces safe, to get things under control. And that pretty much happened here, in Massachusetts. We had an awful crisis in long-term care facilities, where chronic understaffing and inconsistent funding & oversight exacerbated the effects of a disease that is already devastating amongst elders, but overall the statewide effort to curb the virus, the closure of businesses & public places eventually combined with a loosely enforced mask mandate, had a noticeable impact on public health data. We are doing relatively okay here—for now.
But “relatively okay” does not mean “back to normal,” nor does it even mean we’ve reached the level of control in which I’d happily return to doing my office job in the actual office, whatever guarantees are made about face masks, reduced occupancy and improved airflow. When I realized, back in March, that I had already gone to class on campus for the last time without even knowing it, I told myself maybe I’d visit sometime in the fall, just to properly say goodbye. Now the university, surely cognizant of the budgetary risks of doing otherwise, has elected to reopen its campus for the fall semester, but I’d feel irresponsible setting foot in the area. Just the act of existing there as an uninvited human body seems wrong now.
When I go out for exercise these days, often the only person on the sidewalks or the local bike trail wearing a mask, and I see these groups of kids enjoying the last hurrah of summer before school resumes, I have to wonder if we’re about to undo all the painful work we did.
Note: As of 11 August 2020, in light of a slow but worrying upward trend in Massachusetts COVID-19 cases, my university has chosen to scale back the reopening of its campus, and will only be hosting a minimal number of in-person lab and studio classes, along with those students who have demonstrated a particular need to be on campus; nearly all instruction will be remote. The public school district in my town is currently still planning to employ a “hybrid” approach of in-person and virtual classroom instruction.