From my point of view, there is no fundamental reason that machines could not, in principle, someday think, be creative, funny, nostalgic, excited, frightened, ecstatic, resigned, hopeful, and, as a corollary, able to translate admirably between languages. There’s no fundamental reason that machines might not someday succeed smashingly in translating jokes, puns, screenplays, novels, poems, and, of course, essays like this one. But all that will come about only when machines are as filled with ideas, emotions, and experiences as human beings are. And that’s not around the corner. Indeed, I believe it is still extremely far away. At least that is what this lifelong admirer of the human mind’s profundity fervently hopes.
When, one day, a translation engine crafts an artistic novel in verse in English, using precise rhyming iambic tetrameter rich in wit, pathos, and sonic verve, then I’ll know it’s time for me to tip my hat and bow out.
A day later, we drove to a diner that multiple local guides said had the best deal in town, warning patrons to arrive early to fight for a seat. The parking lot was straight-up empty. Where were all of the old people? What of the need for an $8.99 chicken breast with a pair of watery, steamed-vegetable sides? What happened to the early bird special?
The short answer, I learned, is that the retirees who heralded the early bird are going away, and that their replacements, while burdened by the overall decline of the middle class, have different expectations about what retired life should look like — mostly, they do not want to be reminded in any way that they’re old now, especially if they can afford that luxury. Millennials might be killing chains, but boomers are driving the early bird to extinction.
At some point it dawned on me why I felt so connected to the show: it is, emotionally and often structurally, exactly like a writing workshop or, more loosely, like the art of writing as a whole. A cookie in place of a poem, a cake in place of a story. All day, the bakers stand at their little islands, feverishly attempting to create something that is both beautiful and tempting, that others might enjoy. At the end of each challenge, they’re covered in flour and chocolate, their cooking areas a mess of dirtied spoons and orange peels. Then, one by one, they are forced to approach the judges bearing the fruits of their labors, vulnerable to ridicule and eager for praise. They then wait patiently as their superiors literally tear their creations into pieces before determining their worth as artists. Whatever the contestants have baked, it’s the best they can do, and yet they understand that sometimes the best is still not enough.
Unlike the hours one has during air travel to while away on, say, an unexpected Mack Bolan, my time on the subway (probably 20 to 30 minutes, tops) feels constrained and precious, so I demand high-reward reading. While I envy the razor-focused commuters who crouch over a dog-eared Dostoyevsky, I’ve learned that my subway-brain, addled by constant announcements and the overheard conversations of my neighbors, can’t give a dense classic the close attention it requires. Nor can I abide playful meta-fiction, digressive autofiction or anything that’s coy with its charms. If the perfect “beach read” is like a slowly melting margarita, to be kept close at hand and sipped at lazily, then the perfect “subway read” is like the hypodermic needle that gets jabbed through Uma Thurman’s breastplate in “Pulp Fiction.” It delivers a jolt, stat.
He promises no happy ending to the tensions that still plague France, but the book manages to thrill and entertain, while never losing the sharp political edge that also makes it important.
And suddenly, I was back. I unboxed my archives of maps and notes, all of it carefully annotated in a fourteen-year-old’s attempt at calligraphic hand. Drawings, stories, rules, maps; it was all there, waiting. And to my surprise, everyone loved it. Even my wife. The former track star was now an elven assassin. In fairness, she played mostly out of love for everyone at the table, but she played and had a great time. We all did.
Why had I stayed away so long? It was ideal family time—everyone looked directly at each other over the table, eyes bright, describing their next move in detail, moving their miniature warriors around the grid map on the table, engaging with the story, building powerful avatars of good. No phones, no screens, no video games, no earbuds—just family talking and laughing for hours at a time.
Older writers find younger ones irritating, Martin Amis writes in “The Rub of Time,” his fourth nonfiction miscellany, because their emergence is like a series of telegrams from the boneyard. “They are saying, ‘It’s not like that anymore. It’s like this.’”
Zadie Smith must be particularly galling to him. It’s not just that she was born in 1975, he in 1949. Her novels, beginning with “White Teeth” at the turn of the century, have deactivated many of the power instruments of Amis and his literary generation.
It is Genevieve Fox’s misfortune – not ours – that she is joining the ranks of those writing about cancer. As an accomplished journalist, she could write about anything and make it interesting. This exceptionally involving memoir doubles as a narrative about growing up as an orphan. And the strands dovetail – she lost her mother to cancer when she was nine years old (her father died of a heart attack when she was younger still). Her book considers orphans in literature (Harry Potter, Jane Eyre, Caliban) and she reflects upon this theme partly because, as a mother, she is steeling herself against history repeating itself. In her narrative, a ramshackle past and fraught present collide. Her account of her outlandish upbringing (to the limited extent that she was brought up at all) astonishes with its mix of privilege and neglect.
McGregor is a beautiful, controlled writer, who can convey the pathos of a life in a few lines. Despite the large cast of characters, each feels specific and real.
At any given moment, thousands of trials are taking place all around the globe, geared toward eradicating Alzheimer’s disease, Lewy body dementia, frontotemporal dementia, vascular dementia and all the other dementias, disorders and syndromes that gradually, inexorably turn once-intact brains into hollowed-out husks. Each trial usually includes scores of patients, whom researchers call enrollees, and each enrollee has at least one caregiver, and sometimes two or three. Before enrollment, and before the caregiver signs the paperwork and commits to monthly visits, there are all the conversations between the caregiver and other concerned parties (the enrollee’s children, for example) – the strategy sessions. The thinking is: we can only take part in so many studies; if we pick the right study, we get first dibs on the magic potion, before anyone else, at a reduced rate. So you do your due diligence. You follow developments.
It takes a long time to see things clearly. First, you must convince yourself that nothing can be done. So you scour, hunt, email, call and text doctors and PhDs in New York, Washington, Baltimore, Phoenix, Los Angeles, Tel Aviv, Stockholm and Paris. You Google “Alzheimer’s groundbreaking research”. You persist with a crazy lust for answers. Everyone you speak to is exceedingly kind. They have had this conversation many times. You start by saying, almost apologetically, that you want to know what the most promising trials are, where they are taking place, who is in charge. A few years ago, there was a lot of buzz surrounding solanezumab. You ask questions about timeframes, conventional wisdoms, odds and percentages. You lapse, unwittingly, into descriptions of your father, which are really synopses of his curriculum vitae (“then there was his PhD, then med school, then his residency at Mass General”), which are fuelled by a deeply irrational subtext: certainly, you can’t let him fade away.
Fifty years on, though my father has long since retired from regular practice both as a doctor and as a father, I’m still chasing after that recipe for life and still, four times a father myself, doing part-time work as a son. At this point, to be honest, being my father’s son is less than a sideline; it’s more like a hobby, one of a number of pastimes acquired early, pursued with intensity, laid aside, and then only intermittently, over the years, resumed—origami, cartooning, model building, being a baseball fan, being a son. I think of my father at least once a day, try (but fail) to call him once a week, and, as required, afford him regular access to his grandchildren. Beyond that, the contours of the job turn vague and history-haunted. Outside the safe zone of our telephone calls, with their set menu of capsule film and book reviews, amateur political punditry, and two-line status reports on the other members of our respective households, the territory of our father-and-son-hood is shadowed by the usual anger, disappointment, and failure, strewn with the bones of old promises and lies.
One of the terrors of damnation is that there is no course correcting. Final damnation is, well, final. The punishment outlasts the crime by an unimaginable margin, and no one gets to repent. In Hell, you know you’ve done wrong but you can do nothing about it, forever. In the Divine Comedy, the damned do glean some facts from the Pilgrim here and there, but, as in the case of Farinata, what they do learn only intensifies their suffering. For Dante, as for much of the Christian tradition, Hell is defined by stasis. Satan is characterized by his stillness, kept stagnant in the middle of a frozen lake at the bottom of creation. Sisyphus doesn’t build a pulley or refine his rock-pushing technique. As the Silver Jews put it in their song, “Tennessee”: “The dead do not improve.”
But in Michael Schur’s series The Good Place, the education of the dead drives the plot.
Mitchell is 61. Three years ago, she was diagnosed with Alzheimer’s; she also has vascular dementia. She is young to have an illness that is associated with age: in the UK, about one in six people over 80 will get dementia. Yet early-onset dementia represents approximately 5% of the 850,000 people living with the condition in the UK, and it may be significantly under-diagnosed. This means that at least 43,000 people of working age have it, some in their 30s and 40s. Mitchell is cruelly unlucky but not unique.
What marks her out, however, is that she has written a book about her experience, making a narrative about the loss of narrative; finding words for the failure of language; giving a voice to emotions that are usually unspoken. Somebody I Used to Know, written with the help of the journalist Anna Wharton, is a lucid, candid and gallant portrayal of what the early stages of dementia feel like, from the days of fog and exhaustion, through the bewilderment of medical examinations and psychological tests, into the certainty and fear of knowing what was wrong – and then into fear’s aftermath, which for her meant finding a new purpose, a way to be optimistic and valuable in the world in the face of her own unravelling. This memoir, with its humour and its sense of resilience, demonstrates how the diagnosis of dementia is not a clear line that a person crosses; they are no different the day after the word is attached to their sense of confusion and the vague but insistent sense that something is wrong than they were the day before. They have an illness for which at the moment there is no cure, but they are not other; they are still one of us. It is the stigma and the loneliness surrounding the disease that turns something that is painfully hard into something that is barely endurable.
If you watch Danez Smith’s poem dear white america on YouTube – where it has racked up more than 300,000 viewings (not the sort of figures poetry usually attracts) – it is easy to see why Smith is becoming a phenomenon. The video is a powerful introduction to the collection Don’t Call Us Dead (a finalist for the US’s National Book Award for poetry), which is about to be published in the UK. Smith has a colossal gift for performance. You are moved – shaken – as if you had been involved in an argument you couldn’t win. And, in a sense, if you are white, that describes the position. The poem – set out like prose – is a raging, calculated polemic that needs no critic (though the New Yorker has devoted pages to Smith), and that contains its own commentary. It imagines leaving Earth in search of somewhere black people can uncomplicatedly reside. It builds quickly, turns emotion inside out, presents valediction as protest. Smith has the first and last word, and all those in between.
For three months, the Spanish and French negotiated the end to their long war on the island, as it was considered neutral territory. Wooden bridges were extended from both sides. The armies stood ready as the negotiations began.
A peace agreement was signed: the Treaty of the Pyrenees. Territory was swapped and the border demarcated. And the deal was sealed with a royal wedding, as the French King Louis XIV married the daughter of the Spanish King Philip IV.
A grinning toddler is bundled in a creamy quilted blanket and bear-eared hat. Next to him, an iPhone atop a wicker basket displays a Winnie-the-Pooh audiobook. The caption accompanying the Instagram shot explains, “i am quite excited to have partnered with @audible_com.... i’m not sure who loves it more, this little bear or his mama!?”
More than 260,000 people follow Amanda Watters, a stay-at-home mom in Kansas City, Mo., who describes herself on Instagram as “making a home for five, living in the rhythm of the seasons.” Her feed is filled with pretty objects like cooling pies and evergreen sprigs tucked into apothecary vases, with hardly any chaos in sight.
This is the “mommy Internet” now. It’s beautiful. It’s aspirational. It’s also miles from what motherhood looks like for many of us — and miles from what the mommy Internet looked like a decade ago.
Yet beneath all of the advice, recipes, and techniques runs a subtle, political message about the power of domestic happiness to effect social change. Colwin’s focus on domesticity is about reclaiming the home as a locus of connection with family and with the outside world. Home is both a place to recuperate from the outside world and a place to let the outside in. Home is a safe haven for experimentation in the kitchen and in conversation with others. Colwin’s essays present the case that domestic happiness is not a luxury, but is instead vital to both the personal and the political. For those who might not be comfortable protesting for social change in the streets, Colwin suggests beginning in the kitchen.
The poetics of witnessing give the text a voyeuristic look, like readers are watching the story unfold from the outside looking into an isolated, atemporal, eerie space: “[F]or a strange moment it did feel like we were alone together on the edge of the world, and that I was somehow both vulnerable and entirely safe,” says the narrator. Winnette’s book is in fact somewhere on the edge of the world, away from what is familiarly human. As the children hang out on this edge, alone, haunted, and threatened, the events that punctuate their lives — however unlikely or long-winded some of them might be — puncture the reader at her core. It’s the unheimlich, the uncanny nature of Winnette’s story that makes each narrative occurrence visceral and creepily familiar. Against all odds, we end up believing the child, and we might be the only ones. “I felt I could see all of humanity in that progression of faces. I wept too, and for all to see.”
For one girl, her seriously ill sister posed a life-threatening danger: if the sick twin died, the healthy one would live only a few more hours. For the sick girl, however, her healthy sister was a life support. The 3-D model we made of the skeleton and the blood vessels of the twins clearly shows the artery running from one body to the other right across the lower chest, supplying it with oxygen-rich blood. We knew that if we separated them, we would have to cut that lifeline.
We sought advice from the pediatric ethics committee of our hospital. In many extensive conversations, I learned how much depends on how such a situation is framed: Our intent was not to end the life of one girl, but rather to save the other's. The difference is subtle, because the result would be the same: We would wheel two living children into the operating room and leave it with only one.
The distinction between the two awards is suggested by the way in which the respective Oscars are bestowed. For Best Picture, the people whose names are mentioned when the envelope is opened and who go onstage to receive the statuette and give a speech are the producers. They are, for starters, the business people, who arrange for the funding, oversee the spending, and organize the shoot. What’s more, in big-budget studio filmmaking, whether classic or contemporary, they’re all-powerful, overruling the director on final decisions about such matters as casting and editing, ordering reshoots (even by other directors), and maintaining ultimate control over the movie that’s released.
It’s in that gap, as it played out in the classic age of Hollywood—in the distinction between the total and independent creators, who oversee every aspect of their own productions (such as Charlie Chaplin), and studio filmmakers, who were employees directing at the sufferance of studio bosses (such as Nicholas Ray)—that the idea of the modern cinema, of the “auteur,” the director as the actual artistic creator of a movie, came in. It’s a uniquely powerful idea because it’s often an accurate one—it speaks to the experience of viewing, the understanding that the unifying factors of Ray’s films from studio to studio are far stronger (and far more original) than the studio’s imprint on each of them.
Surely some people can read a book or watch a movie once and retain the plot perfectly. But for many, the experience of consuming culture is like filling up a bathtub, soaking in it, and then watching the water run down the drain. It might leave a film in the tub, but the rest is gone.
“Memory generally has a very intrinsic limitation,” says Faria Sana, a professor of psychology at Athabasca University, in Canada. “It’s essentially a bottleneck.”
O.K., granted I’m a sucker for anything to do with Berlin, so you may want to weigh my opinion with that in mind. But it strikes me that Cristina García’s “Here in Berlin” is one of the most interesting new works of fiction I’ve read.
But, let’s say you aren’t machine-gunned or beheaded or hacked to pieces with a machete just on your way through the airport to (so-called) security. Let’s say you survive the full-body scan as well as the obligatory two-hour, duty-free-spangled dwalm afterwards, searching the departure area for a decent bar. Let’s go crazy and suppose that you make it to your seat on the plane without being publicly shamed, involuntarily ousted, or socked in the jaw by airline staff or a fellow passenger.
Your reward is that you then must fly. During the airless, comfortless journey that follows (for which you more and more wondrously have to pay), amid air contaminated by engine oil and other toxic substances, you will also be at risk of radiation, congestion, constipation, nausea, dizziness, headaches, hypoxia, jet lag, flatulence, the flatulence of all (and I mean all) the people around you, deep vein thrombosis, fleas, bedbugs, and whooping cough. No one delays a flight because of illness anymore—that would be costly and cowardly. Instead, they leap on board in the service of their microbes, dutifully coughing, sniffing and exuding right next to you for hours. If you’re very unlucky, you may catch Ebola or TB while innocently trying to untangle your gratis audio set; to be capped by Montezuma’s revenge on arrival at your destination.
And what a plan it was. Eight days after Otto’s death, my putative guide Shane, an Irishman one year my senior, was responding to my visa-application queries with answers like “You can use fake hotels” and “Handle that independently.” He enjoined me to present the Russian embassy with a bogus itinerary centered around bus tours in Estonia. At no point during this process was any peep made as to the State Department’s warning: “Do not travel to Chechnya or any other areas in the North Caucasus region. If you reside in these areas depart immediately.” Emphasis Foggy Bottom’s. As for the South Caucasus, well: “U.S. Embassy personnel are restricted from travel to Abkhazia or South Ossetia, even in the case of emergencies involving U.S. citizens … There are no commercial airports in either region making air ambulance evacuations impossible during medical emergencies.”
In other words, it was profoundly stupid—nay, monumentally irresponsible—for an American to go traipsing along these geopolitical fault lines. I knew this journey was selfish to the point where I could very well affect international relations. That I could be justly portrayed on TV as one more callous and/or terminally privileged dingus who had viewed the prospect of his death as a feature and not a bug. I knew this—as I suppose Otto knew, on some level, the risks that went along with his own YPT trip.
Yet the moment my plane swayed gently into its runup to liftoff, I felt the purest of pre-journey elation. As the YPT literature in my hand suggested, “It does not get much more adventurous and off the beaten path than this.”
Setiya is one of the very best moral philosophers working in the broadly analytic tradition. And, like the work of Bernard Williams and Thomas Nagel from an earlier generation, his work is characterized by a broad humanistic concern that reminds us that analytic philosophy can be something more than a pedantic accumulation of distinctions. So Setiya is ideally suited to exploring the philosophical dimensions of the midlife crisis — and not just because he is plainly someone with extensive personal experience of his topic. But popular philosophy is a strange beast — invariably looked upon by professional philosophers with some mix of disdain and poorly concealed envy. And the idea of a “philosophical self-help guide” — as Midlife bills itself — may seem especially suspect.
After all, the purpose of self-help literature is, well, self-improvement. But philosophy, one might think, should seek to understand the truth, without concern for its positive or negative consequences for the reader. While politicians and self-help gurus may be justified in peddling “noble lies,” the philosopher must follow her arguments to their conclusions, however debasing or ignoble these might be. Or so the thought goes.
Doctors today often complain of working in an occupational black hole in which patient encounters are compressed into ever smaller allotments of space and time. You can do a passable job in a 10-minute visit, they say, but it is impossible to appreciate the subtleties of patient care when you are rushing.
Enter “Slow Medicine: The Way to Healing,” a wonderful new memoir by Dr. Victoria Sweet. The term “slow medicine” has different interpretations. For some it means spending more time with patients. For others it means taking the time to understand evidence so as to avoid overdiagnosis and overtreatment. For Sweet, it means “stepping back and seeing the patient in the context of his environment,” and providing medical care that is “slow, methodical and step-by-step.”
In these potty-mouthed times, when certain world leaders sling profanity about with abandon, many observers naturally lament the debasement of speech. But instead of clutching pearls, why not find a silver lining? Learning more about when, how and why people swear offers insight into everything from the human brain to a society’s taboos. Trash talking even affords some real physical and social benefits, as Emma Byrne argues in “Swearing Is Good for You”.
On a warm morning last September, a dozen Herero men and women paid a visit to the American Museum of Natural History, in Manhattan. The men wore dark suits and ties, like guests at a funeral. The women wore colorful dresses and hats, following a tradition from Namibia, their home country, in southern Africa. They had come to view relics of a tragic episode in their nation’s history, and to ask the museum, after almost a century, to give them back.
Kavemuii Murangi, an education researcher who lives in Maryland, arrived wearing a gray suit and dark glasses that hid his gentle eyes. Inside the museum, several curators led Murangi and his companions to a private room upstairs. A table was covered with cardboard boxes, which the curators invited them to open when they felt ready. Inside the boxes were human skulls and skeletons. On many of the skulls, four-digit numbers had been scrawled above the eye sockets. Many of the visitors wept at the sight. “We looked at each other, we talked to each other, we hugged each other,” Murangi told me afterward. They were staring at remains of their own people.
Two memoirs of the past year left me spellbound. They have much in common—Roxane Gay’s Hunger and Melissa Febos’s Abandon Me both deal with longing to be understood and fighting the instinct to try to disappear. They both explore themes of love, desire, and identity, and, to varying degrees, trauma and its aftermath. They are both achingly honest and vulnerable.
But the spellbinding nature of these books has everything to do with language; both use repetition as a literary device to achieve a lyricism, rhythm, and resonance that build power.
When James Hyman was a scriptwriter at MTV Europe, in the 1990s, before the rise of the internet, there was a practical — as well as compulsive — reason he amassed an enormous collection of magazines. “If you’re interviewing David Bowie, you don’t want to be like, ‘O.K., mate, what’s your favorite color?’,” he said. “You want to go through all the magazines and be able to say, ‘Talk about when you did the Nazi salute at Paddington Station in 1976.’ You want to be like a lawyer when he preps his case.”
Whenever possible, Mr. Hyman tried to keep two copies of each magazine he acquired. One pristine copy was for his nascent magazine collection and the other was for general circulation among his colleagues, marked with his name to ensure it found its way back to him. The magazines he used to research features on musicians and bands formed the early core of what became the Hyman Archive, which now contains approximately 160,000 magazines, most of which are not digitally archived or anywhere on the internet.
If you are, like me, entirely paranoid about preserving the integrity of your hearing, and if you also live in a midsize or large city, you have likely realized that true quiet—much like true darkness—is in horrifyingly short supply. Sirens, idling trucks, other people’s irate phone calls, crying babies, construction crews cracking up the pavement, lonesome dogs tied to fence posts, buskers whacking upturned plastic buckets, the gruesome screech of subway brakes: it is an ugly and relentless symphony. Some days, I calm down only by locking the door to my apartment, slapping on my noise-cancelling headphones, and comparing prices for plane tickets to the quietest places on Earth (a nature reserve in Russia, a cenote on the Yucatán Peninsula, a national park in Botswana).
But as a vision of the collective that is carefully attuned to the importance of the ties that bind us to each other, and to our world, it is nonetheless moving and compelling.
All parents remember the moment when they first held their children—the tiny crumpled face, an entire new person, emerging from the hospital blanket. I extended my hands and took my daughter in my arms. I was so overwhelmed that I could hardly think. Afterward I wandered outside so that mother and child could rest. It was three in the morning, late February in New England. There was ice on the sidewalk and a cold drizzle in the air. As I stepped from the curb, a thought popped into my head: When my daughter is my age, almost 10 billion people will be walking the Earth. I stopped midstride. I thought, How is that going to work?
Many years ago, though not so many years, I sat in a room and listened to a writer speak. I considered him old; I was not yet 30. The writer was Barry Hannah, and he was somewhere in his sixties—an age far, far over my horizon. He was meant to deliver a craft lecture. As far as I can remember, he spoke mostly of his recent treatment for colon cancer. I can see him vividly still: certain moments, the way he sat sidelong in his chair in a toppled column of sunlight, describing how one morning he woke from a dream, a vision really, of Jesus at the foot of his hospital bed.
I can’t quote a word of that lecture. What I remember was how that day, those moments, shook me deeply. Made me feel embarrassed—for what? For him? Me? I was awake. I was scared. I wondered, Is this a craft lecture? Now I know it was.
The change of perception that occurs then leads to a realignment. That is, it has to do with a reordering of ideas you already have about what’s unimportant and what’s consequential. So it was with these houses. For years you don’t see them because they are not monuments. You don’t already know what to think about them, as you do when viewing a landmark. And, being modern, they belong to the realm of the everyday; they don’t embody heritage or a glorious past. So you don’t study them. Then a moment occurs when you do see them. This moment introduces an inversion. The everyday now seems unfamiliar; famous landmarks appear, in comparison, tedious. The moment in which radical realignment takes place is, then, akin to a work of the imagination. In every domain of knowledge—in the sciences, in history, in the news—there is relative consensus about what constitutes significance. Only in imaginative works—in poetry and fiction—is there a reformulation of what we think to be important or unimportant.
All in all, Schwartz’s biography is an important addition to the literature on the utterly remarkable men and women who opened up nuclear physics to the world.
I didn’t know who William Kelley was when I found that book but, like millions of Americans, I knew a term he is credited with first committing to print. “If You’re Woke, You Dig It” read the headline of a 1962 Op-Ed that Kelley published in the New York Times, in which he pointed out that much of what passed for “beatnik” slang (“dig,” “chick,” “cool”) originated with African-Americans.
A fiction writer and occasional essayist, Kelley was, himself, notably woke. A half century before the poet Claudia Rankine used her MacArthur “genius” grant to establish an institute partly dedicated to the study of whiteness, Kelley turned his considerable intellect and imagination to the question of what it is like to be white in this country, and what it is like, for all Americans, to live under the conditions of white supremacy—not just the dramatic cross-burning, neo-Nazi manifestations of it common to his time and our own but also the everyday forms endemic to our national culture.
We call her Upstairs; she calls us Downstairs.
From our ground-floor apartment in Paris, my husband and I can look across the courtyard to her apartment on the top floor, with its large, curved windows.
“Downstairs,” she writes, “before drawing the curtain for the night, stepped out on the balcony, and saw your light on; which was good news.”
Each message from her is a treasure: “When next we meet, we’ll salute each other like two lamp-posts, lighting up at the same time. Have a lovely day without rain.”
Bad coffee is the best coffee. Or less cryptically: The lower you can set your standard for acceptable coffee, the happier you’ll be.
Look at this extraordinary American, Eggers’s attention says. And more to the point, look at him at this particular moment; give him some proper time; no story is more urgent.
If you are tired of puns, are you tired of life? Puns are easy to disdain. They are essentially found, not made; discovered after the fact rather than intended before it. Puns are accidental echoes, random likenesses thrown out by our lexical cosmos. They lurk, pallidly hibernating, inside fortune cookies and Christmas crackers; the groan is the pun’s appropriate unit of appreciation. On the other hand, everyone secretly loves a pun, and, wonderfully, the worst are often as funny as the best, as the great punster Nabokov knew, because the genre is so democratically debased. Puns are part of the careless abundance of creation, the delicious surplus of life, and, therefore, fundamentally joyful. Being accidental, they are like free money—nature’s charity. There’s a reason that the most abundant writer in the language was so abundant in puns: words, like Bottom’s dream, are bottomless.
The Scottish writer Ali Smith is surely the most pun-besotted of contemporary novelists, edging out even Thomas Pynchon. It’s not simply that she loves puns; it’s that she thinks through and with them; her narratives move forward, develop and expand, by mobilizing them. She is an insistently political writer, and her most recent work can be seen as an urgent, sometimes didactic intervention into post-Brexit British animosities, into a world that could be called, to borrow from one of her many punning characters, “nasty, British and short.” Since that calamitous referendum, in June, 2016, Smith has quickly published two novels, “Autumn,” in October of that year, and now “Winter” (Pantheon), the second of a projected seasonal quartet. But, for all the sense of bitter urgency, her work remains essentially sunny (pun-drenched, pun-kissed). “Autumn” and “Winter,” novels full of political foreboding, are also brief and almost breezy—topical, sweet-natured, something fun to be inside. The last page of “Winter” bears a baleful reference to President Trump’s hideous speech to the Boy Scouts in West Virginia, and the book contains a fair amount of family strife; yet the novel ends more like a Shakespearean comedy than like a political tragedy, with an air of optimistic renaissance and familial unity. One of the characters makes a reference to “Cymbeline” that might also function as a description of the novel we have just read: “Cymbeline, he says. The one about poison, mess, bitterness, then the balance coming back. The lies revealed. The losses compensated.” And much of the comedy and the fundamental cheerfulness in Smith’s work has to do, I think, with the figurative consolations the pun embodies: that life is generative, and that, even as things split apart, they can be brought together. For the pun is essentially a rhyme, and rhyme unites.
In our current historical moment, STEM disciplines, with their experimental-mathematical methods and measurable results, are central in educational practices, and humanistic education is in decline. At my own elite liberal arts college, Swarthmore, only 15 percent of the students now major in the Humanities or the Arts, and 75 percent major in Computer Science, Engineering, Biology, Economics, or Political Science. To some extent, this is natural. After all, in a difficult world like ours, why should anything as vague and unmeasurable as cultivation be taken seriously? Why should one learn Greek or art history or music composition, unless one just happens to enjoy such things? And why should the public or parents pay for these private enjoyments that seemingly lack significant public effect and value for the conduct of life?
Yet education is a historically evolved and evolving ensemble of practices, and it is also possible to wonder whether we might have lost our collective way. Do we really know what we’re doing in turning so strikingly toward STEM and away from the humanities? And are there good reasons for this turn?
Dystopias tend toward fantasies of absolute control, in which the system sees all, knows all, and controls all. And our world is indeed one of ubiquitous surveillance. Phones and household devices produce trails of data, like particles in a cloud chamber, indicating our wants and behaviors to companies such as Facebook, Amazon, and Google. Yet the information thus produced is imperfect and classified by machine-learning algorithms that themselves make mistakes. The efforts of these businesses to manipulate our wants lead to further complexity. It is becoming ever harder for companies to distinguish the behavior they want to analyze from their own and others’ manipulations.
We live in Philip K. Dick’s future, not George Orwell’s or Aldous Huxley’s.
It has become easier to live longer, but harder to die well. Most people want to die at home; most die in hospital. Most want to be with family; often they are alone or with strangers. “Their death has been stolen from them,” writes Seamus O’Mahony in his bracing and unsentimental account of dying, The Way We Die Now, which charts how something that used to be public and acknowledged, with a common script, has become an aggressively medicalised and bureaucratic process placed in the hands of experts; sometimes banal, sometimes farcical, sometimes painful or undignified. Modern, sanitised death becomes a dirty little secret, almost embarrassing: our language circles round it, we don’t like to name it, cross the road to avoid those recently touched by it, and shy away from the physical, squeamish fact of it, so that the dead body is whisked away, frequently embalmed (for fear of its smell), cremated in “facilities” that are often in industrial zones.
The vast majority of Americans have never eaten proper Camembert cheese. Sure, there are plenty of little wheels stacked in shops nationwide, labeled “Camembert.” They’re creamy, earthy, and just a little pungent; they taste fine. But they lack the subtle microbial punch and complexity of the Camembert found in France. And it’s not just Camembert: Most Americans have also never tasted the full potential of proper Brie, Epoisses, or Roquefort, among the highlights of any cheese snob’s must-have list, or even the better types of mozzarella, the most popular cheese in America.
This isn’t an elitist “most people think they’re drinking champagne, but it’s only real if it comes from Champagne, France” critique. The opinions of protectionist European regulators aside, there are plenty of legitimate varieties of Brie or Camembert or what have you in the States. Still, you cannot find their most desirable variants, even at the most cultured American cheesemonger’s shop, because many of the world’s most fascinating and sought-after cheeses are made with unpasteurized milk and aged less than 60 days, which the feds deem so dangerous that it’s illegal to make them in or import them into the United States. And this rule is not arcane. It is enforced.
The reference-book approach has had its critics from the start. The 17th-century philosopher John Locke complained that verse divisions so “chop’d and minc’d” the sacred text that readers “take the Verses usually for distinct Aphorisms” rather than reading them as a whole. But only now have publishers begun in earnest to offer a radically different kind of Bible consumption experience. The concept is to present the Bible in a form that is meant to be consumed as a multi-course meal, not a “chop’d and minc’d” sampler platter. These “Reader’s Bibles” strip away footnotes, sidebars, and chapter and verse divisions; text typically appears in a single column and is formatted to represent each passage’s true genre, rather than making all the books look the same.
When someone asks you to perform a task, there are many ways to say yes. Yes, for one. There’s also yep, yeah, yea, yup, ya, yessir, you bet, alright, alrighty, absolutely, of course, gladly, sounds good, will do, no problem, aye aye, roger, totally, definitely, and, if you are a trucker, 10-4.
Then, there is my absolute least favorite affirmative phrase: sure. Not to be confused with “sure thing” (folksy, casual) or “for sure” (loose, stoned), sure is a word that makes my skin prick, my eye twitch. Sure is used as “yes,” though it never means “yes.” Sure is a thumbs up to your face, and a jerkoff motion behind your back. Sure says “if I must.” Sure is the Mars Rover of passive aggression — an envoy to see how far you can really go before the other person snaps and says, “You know what, you’re being an asshole.”
The Devil’s Highway accounts for some thousands of years of human history in very slightly more than 200 pages, a feat of compression managed by three interwoven timelines, alternating chapter by chapter and linked through the presence of a real Roman road – the titular highway – which in our day can still be followed from Sunningdale in Berkshire, across the Blackwater river, to Silchester and beyond.
The playful seriousness of Morton’s prose mixes references to Blade Runner and Tibetan Buddhism with lyrics from Talking Heads and concepts from German philosophers. He doesn’t offer a plan to make society more environmentally friendly; instead, in what is an inspiringly idealistic book, he calls for a paradigm shift in our relationship to the world.
By September 2018, construction on one of the country’s most famous civil-engineering projects will finally be complete, six decades after work on it began.
Interstate 95, the country’s most used highway, will finally run as one continuous road between Miami and Maine by the late summer. The interstate’s infamous “gap” on the Pennsylvania and New Jersey border will be closed, turning I-95 into an unbroken river of concrete more than 1,900 miles long. Closing the gap will also mark a larger milestone, transportation officials say: the completion of the original United States interstate system.
Construction to fix the I-95 gap began more than eight years ago in Pennsylvania, but it has now reached its final stage. This week, the New Jersey Department of Transportation began switching out road signs in preparation for the change.
When my son was born, 15 months ago, I was under no illusion that I had any idea what I was doing. But I did think I understood self-help books. For longer than I’d like to admit, I’ve written a weekly column about psychology and the happiness industry, in the course of which I have read stacks and stacks of books on popular psychology. I even wrote one myself, specifically aimed at readers who – like me – distrusted the hyperbolic promises of mainstream self-help. Midway through my partner’s pregnancy, when I first clicked “Bestsellers in Parenting: Early Childhood” on Amazon, I naively assumed it would be easy enough to pick up two or three titles, sift the science-backed wheat from the chaff, apply it where useful, and avoid getting too invested in any one book or parenting guru.
Where the Past Begins is subtitled “a writer’s memoir”, and it’s worth mentioning what Tan doesn’t include. There is very little mention of published books; instead, she elaborates on the act of writing, the mechanics and results of her own imagination. She explains the central importance of metaphors, the stories her mind spins while she listens to music. “Spontaneous epiphanies always leave me convinced once again that there is no greater meaning to my life than what happens when I write,” she says. Tan’s epiphanies and revelations often revive suppressed memories: “as if I were seeing the ghost of my mother, bringing me a sweater she had knit for me when I was five”.
In a world full of widebody airliners, including the Airbus A380, people forget the 747’s mammoth size and its status as a prestige aircraft. Dubbed the “Jumbo Jet” by the media, the 747-100 was about 1.5 times as large as a Boeing 707 and could carry 440 passengers, compared with the 707’s modest 189. In fact, the airplane was so large that Boeing had to build a new factory in Everett, Washington, just to assemble it; that factory remains the largest building in the world by volume.
While the supersonic dream was ultimately a commercial failure (for now), the 747 became an icon of industrial design. Along with numerous aerodynamic innovations, it was the first commercial aircraft to incorporate high-bypass turbofan engines like those developed for the C-5. The Jumbo also pioneered commercial autopilot for landing and quadruple main landing gear.
Maybe it's time for new resolutions, new experiments. My goal going forward, if I am to retain my sanity, seems clear enough: to try to avoid imposing fixity on an increasingly fluid world, and to surrender in good faith to the flow, even when I struggle to find good reasons to embrace it. Less stockpiling, more listening? Sure. But I don't believe I will ever pass a stack of dusty CDs in a Goodwill and not feel a pang of excitement, an insatiable curiosity, a compulsive need to rifle, touch, and understand. My old behavior is simply too enjoyable, too integral to my identity to give up completely.
My father once challenged me on what he perceived as the senselessness of my record-buying habit, and I explained to him that the happiest feeling in the world, for me, was walking into a record store with a few dollars to spend; few things have ever made me feel as good, and I suspect few things ever will. There is, in any obsession, a kind of helplessness, but as addictions go, this one has always seemed to me pretty harmless. The modern world, however, has issued a new and terrifying challenge: Try and keep up.
And now, in 2018, the economics of online publishing are running everyone off the map. I sometimes think, with some regretful wonder and gratitude, about an Awl chat-room conversation that took place in 2013. Some annoying mini-scandal had transpired on the Internet, and everyone else who worked for the little network—they all had years of experience on me—was typing out lively scenarios of what they would do if our online infrastructure magically burned down. Sitting in my little blue house in Ann Arbor, I kept quiet for a while, and then typed something like, “Aww guys, no, the Internet is great.” I meant it, though the sentiment now feels as distant as preschool. Reading the Awl and the Hairpin, and then working with the people that ran them, had actually convinced me that the Internet was silly, fun, generative, and honest. They all knew otherwise, but they staved off the inevitable for a good long while.
When we think of reading, we think of scrolling, clicking, and pushing screens, seeing these as replacements for the analog method of turning, flipping, and folding pages. But the book, made through mechanical processes, engineered as an appliance for leisure and instruction, can also be seen as a machine, a tool for use, a technology developed to serve a need, and one with a long and rich history. We, of a certain generation, can remember learning to type and swipe, touching screens as we once touched paper pages, forgetting that previously the book, too, was a form to be learned. In 2001, a Norwegian television show spoofed this very idea in a skit called “The Medieval Help Desk” in which a monk, distraught, unable to use this new thing called “the book,” goes for help. The aide at the desk then teaches him how to open the cover, assuaging the monk’s fear that “some of the text would disappear” upon turning the page. Curator John Roach cites this skit in his introduction to The Internal Machine, an exhibition, at the Center for Book Arts in New York, of more than a dozen artworks that explore and reimagine the mechanical aspects of the analog book’s status as both a sculptural and use-value object. From flipping pages to creasing spines, these artists present the book at the intersection of form and function, wedging open the space between its intended purpose as tactile tool for research and the acquisition of knowledge, and the vast possibilities for books as objects with myriad surfaces, planes, and bindings.
The sidewalk line is a beast of its own kind, native to the space outside whatever latest bakeshop or store selling limited-edition streetwear. Within the broader genus of lines, it differs from those inside the post office or Starbucks. (I’ll call those normal lines “normal lines.”) All types of lines are a product of math that expresses the rate at which people arrive and how fast a cashier can distribute some stuff. Normal lines are born of a solvable fluke: too many people, too few cashiers. Sidewalk lines do not want to be solved. They are intentional — cultivated, managed, bred like show dogs. In certain types of luxury transactions, we’ve come to accept them as a predestined fact.
I used to believe that standing in line was a natural arrangement of human bodies, much like geese flying south in a “V.” But queuing is a recent and man-made invention. The first historical description of the line only appeared in 1837, in Thomas Carlyle’s The French Revolution. Describing a postwar scarcity of bread, he wrote: “If we look now at Paris, one thing is too evident: that the Bakers’ shops have got their Queues, or Tails; their long strings of purchasers, arranged in tail, so that the first come be the first served.” According to Carlyle, lining up was a uniquely French eccentricity. How earlier peoples distributed their bread is a fact that I’ve not yet been able to suss out. Before self-serve supermarkets, most stores relied on a deli-counter model. I can only assume that shoppers massed around a vendor, who granted his attention to the squeakiest wheel.
I am grateful that we are increasingly careful about how we talk about and report on suicide, and I respect the intention behind the ban on “commit suicide.” But I can’t support it. I don’t begrudge those who are more comfortable with “died by suicide” or “killed themselves,” but I bristle at the prescriptive nature of their objections, as though the rest of us who prefer “committed suicide” are wrong and need to catch up.
“Why Liberalism Failed” is a book that reads like an attempt to enunciate a primal scream, a deeply exasperating volume that nevertheless articulates something important in this age of disillusionment.
Sourdough is a soup of skilfully balanced ingredients: there’s satire, a touch of fantasy, a pinch of SF, all bound up with a likeable narrator whose zest for life is infectious. The novel opens a door on a world that’s both comforting and thrillingly odd.
Today, if there's traffic in the area and you want to follow the law, you need to find a crosswalk. And if there's a traffic light, you need to wait for it to change to green.
To most people, this seems part of the basic nature of roads. But it's actually the result of an aggressive, forgotten 1920s campaign led by auto groups and manufacturers that redefined who owned the city streets.
"In the early days of the automobile, it was drivers' job to avoid you, not your job to avoid them," says Peter Norton, a historian at the University of Virginia and author of Fighting Traffic: The Dawn of the Motor Age in the American City. "But under the new model, streets became a place for cars — and as a pedestrian, it's your fault if you get hit."
“The New Colossus” emerges at a pivotal moment in history. The year before Lazarus’s poem was read at the Bartholdi Pedestal Fund Art Loan Exhibition in New York, in 1883, the Chinese Exclusion Act became the first federal law to limit immigration from a particular group. Though set to last for 10 years, various extensions and additions made the law permanent until 1943. The year after Lazarus’s poem was read, European powers met in Berlin to divide up the African continent into colonies. “The New Colossus” stands at the intersection of U.S. immigration policy and European colonialism, well before the physical Statue of Liberty was dedicated. The liberal sentiments of Lazarus’s sonnet cannot be separated from these developments in geopolitics and capitalism.
The poem’s peculiar power comes not only from its themes of hospitality, but also from the Italian sonnet form that contains them. A Petrarchan sonnet is an awkward vehicle for defenses of American greatness. Historically, the epic poem has been the type of poetry best suited to nationalist projects, since its narrative establishes a “storied pomp” in literature that has yet to exist in the world. The sonnet, in contrast, is a flexible, traveling form, one that moved from Italy to England. It is more at home in the conversations, translations, and negotiations between national literatures than in the creation or renewal of national eminence.
The question of how to make a living as a writer is at its surface very simple. The answer is, you write whenever you’re not doing your real, proper job. The proper job, where you earn your proper living. The answer is, you feel grateful to have a job at all. The answer is, you tuck your writing away, like a cyclist rolling up one trouser leg so the cuff doesn’t get caught up in the chain. The answer is, you have reasons to write other than to make any money—some of them banal and maybe even embarrassing, like wanting to be seen, wanting to be someone. Some of them grander and easier to own up to, like trying to understand what it means to be in this world when so many of us feel we are outside of it. Whatever your reasons, they push you forward.
Why did Sidney Lumet in 1974 and Kenneth Branagh in 2017 go to the trouble of assembling star-studded casts to revisit Agatha Christie’s ingenious but very creaky novel of the 1930s, Murder on the Orient Express? One try was understandable. After all, the novel draws on at least two compelling genres: 12 suspects confined to their elegant sleeping and dining cars on a train delayed by a snow drift offers not just the bafflement of the closed room mystery, but also the inescapable microcosm of the “ship of fools” (Narrenschiff), which goes back to the Middle Ages and leaves traces on the masterpieces of Boccaccio and Chaucer. Christie’s fools, too, are obliged to tell stories that we may or may not accept at face value. Even before Christie’s novel, both genres were made popular in star-studded films like Grand Hotel (1932) and The Kennel Murder Case (1933). But why did Branagh try what Lumet had tried already? Yes, remakes offer a certain kind of artistic challenge, and there are lots of remakes. And the choice of this Christie novel is peculiar for any form of ambition that expects to be taken seriously.
Of course, even the permanent sabbatical must come to an end. There was a time when I feared that I would not make it. Now I am distressed that I will have to give it all up. Better not to linger on the thought. According to Freud, we are really unable to believe that we are mortal, for we cannot conceive of ourselves as being absent.
The arts of dying well and ceremoniously, the artes moriendi, were cultivated when there was a commonly held belief in the afterlife. These so-called arts were part of a larger religious mystery. To an unbeliever, the ineluctable moment is a mystery of another kind: When and where will the grim terrorist strike?
Child prodigies are exotic creatures, each unique and inexplicable. But they have a couple of things in common, as Ann Hulbert’s meticulous new book, “Off the Charts,” makes clear: First, most wunderkinds eventually experience some kind of schism with a devoted and sometimes domineering parent. “After all, no matter how richly collaborative a bond children forge with grown-up guides, some version of divorce is inevitable,” Hulbert writes. “It’s what modern experts would call developmentally appropriate.” Second, most prodigies grow up to be thoroughly unremarkable on paper. They do not, by and large, sustain their genius into adulthood.
I don’t know if this strategy is effective. I still think about the things I should be doing instead, but moving feels better than staying still, so I keep riding. When you’re broke, your body becomes your last resort, a mostly reliable means to make money that also comes with great precarity. If you get injured in a low-wage job with no employment insurance, there’s nothing to fall back on. You pay with your health.
I feel this job in my body. My neck cracks, my shoulders pop, my ankles creak. Some nights, I ride until my legs turn numb and the wind whips tears in my eyes and the world becomes fuzzy at the edges. Then I have a choice. I can keep riding or I can stop and wait until my path becomes clear again.
But just as often, we allow ourselves to be borne along by the currents of what’s swirling around us without abstracting away from it. Getting swept up in a musical performance is just one among a whole host of familiar activities that seem less about computing information, and more about feeling our way as we go: selecting an outfit that’s chic without being fussy, avoiding collisions with other pedestrians on the pavement, or adding just a pinch of salt to the casserole. If we sometimes live in the world in a thoughtful and considered way, we go with the flow a lot, too.
I think it’s a mistake to dismiss these sorts of experiences as ‘mindless’, or the notion of a merely visceral grasp of something as oxymoronic. Instead, I think that the lived reality of music puts pressure on philosophers to broaden their conception of what the mind is, how it works, and to embrace the diversity of ways in which we can begin to grapple with the world around us.
The delusion isn’t that criticism is important; it is important, the more so as discourse increasingly takes the form of people screaming at each other on the internet. The delusion is that critics can ever transcend the subjectivity that makes good criticism so interesting in the first place. And if a certain negativity, even a certain schadenfreude, attaches to that subjectivity, well, would you rather have a pretended objectivity that observes all the proprieties and never risks giving offense?
In these times, the most important task of game journalism isn’t to serve a public interest but to ensure that fans can continue to identify some version of themselves in the games they have played, and ensure future releases will allow them access to even deeper levels of self-expression and understanding. In playing the next game, owning the newest console, having an opinion on the latest patch, we feel like we can become stabler versions of ourselves, all at the cost of clearing out space—both mental and financial—for open-ended consumption of a form without any purpose beyond this increasingly tautological pleasure. This process is necessarily dehumanizing. Games matter because you are here to play them, and you remain here to play them because they matter.
Maybe this is why, with video games, we break from the tradition of identifying people with particular pastimes as lovers—bibliophile, cinephile, audiophile. To love video games is to become not a ludophile but a gamer, a claim on identity rather than a statement of personal interest. Every fact or feeling in our lives that doesn’t relate to games is an extraneous detail, so much so that it can feel like one’s whole life might be beside the point.
While Johnson treads much of the same territory here as in his 1992 cult classic—addiction, crime, obsession, psychedelic transcendence—his final book is overall more concerned with what follows such madness. If Jesus’ Son proclaims, “Holy shit, after all the drugs and alcohol and violence, I can’t believe I’m still alive,” then Largesse responds, “Yes, but you’re still going to die anyway.”
As I looked through a big window in the living room, I could see most of Honolulu below. On the edge of the city is Diamond Head, the lush extinct volcanic crater. Beneath it lie rows and rows of homes that seem to melt together. The suburban boxes feel like they would just fall into the sea, if not for the skyscrapers that dot Waikiki and Downtown Honolulu and form a wall that stops them from flowing straight into the ocean.
It was the view I had looked at my whole life. But for a split second, I saw it all gone — just gray dust and rubble, like a photo I might have seen of a bombed-out city in World War II.
Working artists who have fouled their reputations will have to fend for themselves. Directors who have taken advantage of the casting couch, actors who have grotesquely exploited their stardom, conductors who have preyed on their young charges deserve to have the rug pulled out from under them. If the work they've done lives on, it will do so apart from the memory of their shameful deeds. This will take time.
I hear from my former students occasionally. A few have gone on to accomplish remarkable work. Hear equally from the ordinary and the remarkable. Requests for recommendations, announcements of new jobs, marriages, children, a photo, copy of a book or film script, story in a magazine or anthology, perhaps inscribed personally to me or sent directly from the publisher. The gift of a snapshot, book, or story meant to break silence that settles in after they leave the university, the silence that being here, a student for a semester in my fiction-writing class, doesn’t break, silence of living ordinary lives we all endure whether our writing is deemed remarkable by others or not.
A current student, Teresa McConnell, wants to help other people. The story she submits to my fiction-writing class, though not very long, is quite ambitious. It wishes to save the life of its main character, a young woman of color, a few years out of high school, single, child to support, no money, shitty job, living with her mother who never misses an I-told-you-so chance to criticize her daughter’s choices. Voice of the character my student invents to narrate the story reveals the young colored woman to be bright, articulate, thoughtful, painfully aware of how race, gender, age, poverty trap her. Worse now because a baby daughter is trapped with her. Lack of understanding not the narrator’s problem. She’s stifled by lack of resources, options.
Miéville’s generic boundary-crossing is more than simply a stylistic approach; it is also a political commitment “to think the world, and to change it.” What distinguishes Miéville as a writer is the way that his novels utilize the imaginative potential of fantastic fiction to engage with social and political reality. In interviews, Miéville has situated his novels as post-Seattle literature, and his writing responds to what the late Mark Fisher has termed capitalist realism, “the widespread sense that not only is capitalism the only viable political and economic system, but also that it is now impossible to even imagine a coherent alternative to it.” For Miéville, the radical potential of the fantastic lies in its ability to conceptualize a world beyond reality as presently constructed. Miéville uses fantasy in a way that makes the familiar appear strange and that challenges the stability of the present. By constructing fantastical worlds that continually thwart established rules and expectations, Miéville’s novels unmask the limitations of social imagination and hold open the utopian possibilities of imagining the world otherwise — of conceptualizing “the not-this-ness of this” (as he puts it in Iron Council).
The recent publication of two books on Miéville testifies not only to his relevance to modern fiction, but also to the political importance of fantastic literature to contemporary culture. Carl Freedman’s Art and Idea in the Novels of China Miéville and Caroline Edwards and Tony Venezia’s edited collection China Miéville: Critical Essays both draw attention to Miéville as an important contemporary literary figure and as a critical thinker whose novels engage with wider concerns of genre, politics, and the imagination. Choosing to write about a figure like Miéville is no easy task. As Edwards and Venezia note in their introduction, given Miéville’s rapid rate of productivity (since the publication of these books, Miéville has released two novellas, a short story collection, and a nonfiction account of the Russian Revolution), as well as his own theorization of the genre, Miéville “always seems to be two steps ahead of his critics.” However, through their exploration of the literary and political significance of Miéville’s fiction, both Art and Idea and China Miéville: Critical Essays provide fascinating and engaging analyses of Miéville’s novels that remarkably integrate their textual and theoretical elements. Together, both works point to the ways that fantastic literature can help to imagine alternatives to the enclosing realities of contemporary capitalism.
Sure, some comedians get lucky and get seven-figure book deals, but many comedians are not selling books just for the money. Moreover, the relationship between a book publisher and a comedian—even in the absence of these large-figure transfers—is still appealing on both sides.
So what is the appeal of books, and what does the book industry offer that comedians can’t get elsewhere?
From time to time, we have an experience that makes us stop and really think about what matters. Often, it takes something jarring to put everything into perspective. This happened to me recently, when I was wandering around a construction site at night and I bumped my head on a steel girder. As soon as it happened, I knew this wasn’t one of those regular, run-of-the-mill bumps on the head you get from keeping your bowling ball on a high shelf or from operating a scissor lift in your basement. This was the kind of bump on the head that opens your eyes and gives you a new outlook on life.
All at once, I understood that life isn’t about our own selfish needs, but about what we do for others. I see now, too, that I do a lot of things that put my head in harm’s way, and that I should limit how often I do that, with the goal of never having it happen. In short, I’m a changed man.
Ultimately, Medoff's book is about finding oneself — and satisfaction — in a combination of absorbing work and personal relationships. In addition to kindness, Rosa ingrains this idea in her acolytes: "The key is to be the same person at home and at work."
Joe Duff, CEO of Operation Migration, is not the only conservationist to wear a uniform to work. But instead of the khakis and polos that serve to show that humans are all part of the same team, his uniform helps him blend in among a flock of whooping cranes. It’s not a bird costume, per se. Rather than making the wearer look like something else, its purpose is to conceal what they are — a human being who’s trying to teach these cranes how to be wild.
Most of the suit is nothing more than an amorphous white bag that covers the wearer’s arms and everything from head to mid-calf. A volunteer makes every part specially for the program. To hide their faces, they use white plastic construction helmets covered in a layer of white fabric, except for a small plate made out of reflective mylar that they use to see and a strip of mesh to help them breathe. The costumes are neither stylish nor, in the hot summer months, particularly comfortable. (“Whooping cranes can spend their life in the marsh and mud and they’re still pure white; we can’t spend 10 minutes,” Duff says.) They use the same outfits year-round and have to make them from a material thick enough that when the light shines through, there’s no chance of a crane making out the human silhouette underneath. One hand is covered by a black fabric mitten stitched to the costume so the birds never catch a glimpse of skin. In the other, they carry a puppet meant to look like the head of a whooping crane. It’s this, not the blob of white human attached to it, that the birds interact with.
The process helped Oxford’s editors study all of the shades of meaning expressed by a single word, but it was also tedious and messy. With thousands of slips pouring into the OED’s offices every day, things could often go wrong.
And they did.
If one is to appreciate Coetzee’s essays, one must recognize how precisely, with what concentration, he lays bare a terrifically complex psychology, giving readers a handhold on which to build even more complex readings of their own.
It's a deeply generous, compassionate book that asks its readers to open their hearts and treat one another with understanding, even as the world grows more complicated, and more unknowable, every day.
MyAppleMenu Reader will be taking a break tomorrow, and will return on Saturday, 13 Jan, 2018.
Over the past century, the quest to describe the geometry of space has become a major project in theoretical physics, with experts from Albert Einstein onwards attempting to explain all the fundamental forces of nature as byproducts of the shape of space itself. While on the local level we are trained to think of space as having three dimensions, general relativity paints a picture of a four-dimensional universe, and string theory says it has 10 dimensions – or 11 if you take an extended version known as M-Theory. There are variations of the theory in 26 dimensions, and recently pure mathematicians have been electrified by a version describing spaces of 24 dimensions. But what are these ‘dimensions’? And what does it mean to talk about a 10-dimensional space of being?
In the 1960s, Richard Feynman and Bryce DeWitt set out to quantize gravity using the same techniques that had successfully transformed electromagnetism into the quantum theory called quantum electrodynamics. Unfortunately, when applied to gravity, the known techniques resulted in a theory that, when extrapolated to high energies, was plagued by an infinite number of infinities. This quantization of gravity was thought incurably sick, an approximation useful only when gravity is weak.
Since then, physicists have made several other attempts at quantizing gravity in the hope of finding a theory that would also work when gravity is strong. String theory, loop quantum gravity, causal dynamical triangulation and a few others have been aimed toward that goal. So far, none of these theories has experimental evidence speaking for it. Each has mathematical pros and cons, and no convergence seems in sight. But while these approaches were competing for attention, an old rival has caught up.
There is nothing quite like the scent of a library. The aroma of old paper when you walk through the door is the smell of thought itself, of memory and time. For years, I bought used books I could display on a shelf because being an English major and aspiring writer who didn’t own books made me feel like more of an imposter than I already did. Occasionally I loaned them to people, letting the recipient assume that this copy I was giving them, this paperback or hardcover and not the four-track cassettes in the green plastic container, was the one I had read. Occasionally, when no one was around, I pulled one from the shelf and turned the pages. Even without a magnifier, my eyes can tell where the text lies, locate the little black wings on the otherwise blank page that must be the dedication. A few times I would hold the hardcover in my hand while the cassette played, guessing when to turn the page.
In graduate school, I’ve feared that I might be missing some element of the reading experience, that I might never have been reading at all. Even today, I continue to worry that the stories and books I write are not organic, authentic creations because all the books that have inspired and educated me were consumed through secondary media, replications of the original text. A scholar of the humanities might point out the Homeric tradition of oral storytelling, noting that once upon a time writing and publishing didn’t even exist. For years I tried to write such an essay, defending the way I read by describing the different languages of the world, the unique alphabets with their own characters incomprehensible to other cultures. But there is no defense quite like the feeling that you have nothing to defend.
Here’s the heart of the problem: The set of critics’ and audiences’ interests do not perfectly overlap but rather form a Venn diagram. In the audience circle, the pressing question is, “Should I spend some number of the dollars I have to my name and the hours I have left on Earth on this thing?” Critics get in for free and by definition have to read or watch or listen to whatever’s next up. So their circle is filled with relativistic questions about craft and originality and wallet quality and the often unhelpfully general “Is it good?” (Some of them even have an idea of what they mean by “good”; the rest are winging it.)
Most of our science, philosophy and religion starts from the assumption that there are humans and there are animals – and there could never at any point be any common ground between them. To call someone an animal is as bad an insult as you can offer, and yet we’re all mammals. For centuries, the notion of human uniqueness was the most fundamental orthodoxy. Now it is being challenged. Book after book ventures into the no-man’s-land – the no-animal’s-land – that lies between our species and the other ten million or so in the animal kingdom. As often as not, they reveal more of ourselves than of our fellow animals.
With every page we turn, we can feel the resistance to any suggestion that non-human animals are even remotely like ourselves. Of course animals can’t think, can’t feel, can’t talk. We resist this not because such things are impossible but because they are unthinkable. Our lives would be horribly compromised if we accepted that we humans were just one more species of animal.
The village of Hobart, New York, is home to two restaurants, one coffee shop, zero liquor stores, and, strangely enough, five independent bookstores. “The books just show up,” Barbara Balliet, who owns Blenheim Hill Books, says. “I’ve come to the store and bags of books are waiting for me.” Fewer than 500 people live in Hobart. Yet from Main Street, in the center of town, you’re closer to a copy of the Odyssey in classical Greek, or a vintage collection of Jell-O recipes, than a gas station.
This literature-laden state of affairs emerged just after the turn of the millennium, when two residents of Manhattan, Diana and Bill Adams, stopped in Hobart during a trip through the Catskills. “We were both intrigued,” says Bill, who worked as a physician for 40 years. “I saw what a charming, and somewhat rustic, but civilized, area it was.” He and his wife Diana, a former lawyer, were looking for retirement activities that they could pursue into their old age.
Of all the mistakes made by city planners in the postwar era, the passion for highway construction has to be one of the most foolhardy. After the early success of systems like the autobahn and freeways, cities everywhere were carved up to make way for giant roads, crashing through neighbourhoods and creating opportunities for “comprehensive redevelopment”.
This was considered progress, a necessary part of entering the modern world. But some strange things happened – the most damning being that these new roads didn’t reduce traffic at all. Instead, they induced demand, clogging up almost as quickly as they were built. As time went on, communities began rejecting the plans and fighting back against the bulldozers, halting development in its tracks and kickstarting the modern conservation movement.
Finally, the knockout blow was the oil crisis of the 1970s, which put an end to many big plans. Looking back, we can marvel at how outrageous some of these and later schemes were, and the traces they left behind.
Part love story and part speculative sci-fi, it’s a meandering, albeit meaningful, look at marriage, technology and ghosts — those of the otherworldly type that may exist but also specters of our past that influence our present.
These debates point to an apparent paradox in our understanding of autism: is it a disorder to be diagnosed, or an experience to be celebrated? How can autism be something that must be ‘treated’ at one level, but also praised and socially accommodated at another? Many people in the neurodiversity community say that autism is just a natural variant in the human condition. But should autistic individuals have the same legal rights as everyone else? Or are their needs different, and if so, how? If we are invited to be skeptical of clinical approaches, how can we decide who qualifies for additional support? The fundamental conundrum is that, over its troubled history, views have shifted about whether autism is part of a narrative description of an individual’s developing life, or whether it’s a measurable category that others have the right to count, demarcate and define.
When the nature of work changes, companies reward new ways of feeling about it. The rise of white-collar work in the 1950s birthed the risk-averse organization man, whose highest values were loyalty and orderly conduct. The deregulation of the 1980s made virtues of aggression and ruthless competition. The new economy is characterized by instability and disruption; its ideal worker is calm in the midst of it all, productive and focused. The mindfulness training his company offers isn’t so much a perk as it is the means of turning him into a new type of person. He can work all weekend, multitask frantically on Monday morning, and reset mentally just by breathing in and out on his lunch break. His employer is not just the center of his overflowing, unpredictable working life but the center of his simple, dependable spiritual life too.
Mornings are normal, but they are not regular. While they happen daily, each morning of course starts and ends at a slightly different time, as the sun rises earlier in the summer and later in the winter. Our society follows a regular 24-hour cycle that glosses over each day’s variations. Morning shows are one way that mornings become regulated. Beyond merely establishing a set schedule, morning shows provide a sense of continuity. The same hosts appear every morning, continuing a conversation that picks up on the previous day’s stories and events. A morning show points to the newly risen sun in the sky. It says that today’s morning is just like yesterday’s.
Chronological decades have little material significance. To a biologist or physician, the physiological differences between, say, 39-year-old Fred and 44-year-old Fred aren’t vast—probably not much different than those between Fred at 38 and Fred at 39. Nor do our circumstances diverge wildly in years that end in nine compared with those that end in zero. Our life narratives often progress from segment to segment, akin to the chapters of a book. But the actual story doesn’t abide by round numbers any more than novels do. After all, you wouldn’t assess a book by its page numbers: “The 160s were super exciting, but the 170s were a little dull.” Yet, when people near the end of the arbitrary marker of a decade, something awakens in their minds that alters their behavior.
Taylor has journeyed deep into the human psyche, and if you accompany him on the pilgrimage afforded by these Complete Stories, you will emerge transformed.
“Winter” is an insubordinate folk tale, with echoes of the fiction of Iris Murdoch and Angela Carter, that plays out against a world gone wrong.
Around 500 BC, the Carthaginian explorer Hanno the Navigator guided a fleet of sixty oared ships through the Strait of Gibraltar and along the northwest lobe of the great elephant ear that is the African continent. Toward the end of his journey, on an island in a lagoon, he encountered a “rude description of people”—rough-skinned, hairy, violent. The local interpreters called them Gorillae. Hanno and his crew attempted to capture some of them, but many climbed up steep elevations and hurled stones in defense. Eventually, the Carthaginians caught three female Gorillae, flayed them, and brought their skins back home, where they hung in the Temple of Tanit for several centuries.
Though scholars dispute whether the Gorillae were gorillas, chimpanzees, or an indigenous tribe of humans, many regard Hanno’s account as the oldest surviving record of humans encountering another species of great ape. The ambiguity of Hanno’s early descriptions—are the Gorillae human or beast, people or apes?—is not just an artifact of translational difficulties; it is exemplary of a profound misunderstanding in historical attitudes about our closest animal cousins, a confusion that is still being resolved today.
On a windy spring morning two years ago, at an ape sanctuary and research facility in the heart of North America, I had an encounter of my own. Spread over six acres of forest, fields, and lakes in Des Moines, Iowa, the Ape Cognition and Conservation Initiative is home to a clan of five bonobos, including the renowned Kanzi, perhaps the most linguistically talented ape ever studied. When Kanzi was an infant, researchers tried to teach his adoptive mother to communicate using an array of lexigrams on a keyboard. She never made much progress, but Kanzi, like a human child exposed to language, began to use the symbols on his own. Today, he knows the meanings of hundreds of lexigrams and understands many phrases of spoken English as well.
A few months back, I landed in Narita Airport, and from there I took a rail to the center of Tokyo. The plan was to spend a few weeks in Japan. Partly because I’d always wanted to, but mostly because the chance had presented itself. The island lived in my imagination the way it does for gaijin all over the world. There are millions of us, and we never think we’ll go—but my boyfriend Dave and I were going; we’d bought the tickets, booked the Airbnbs, and one night before we left, a whiteboy in a bar asked what any of that had to do with me.
My parents used to travel. They’d made their way all over. We had a cupboard full of mugs from Sweden, and some salt shakers from Peru, and, some weekends, my father wore a kimono around the living room, mumbling after shitty NFL calls in German. Growing up in Houston, I studied Japanese in school. None of it was practical. It had absolutely nothing to do with my life.
In an essay about translating “Human Acts,” published in the online magazine Asymptote, Deborah Smith describes reading Han’s work and being “arrested by razor-sharp images which arise from the text without being directly described there.” She quotes a couple of her “very occasional interpolations,” including the striking phrase “sad flames licking up against a smooth wall of glass.” Charse Yun, in his essay about “The Vegetarian,” declares his admiration for Smith’s work but argues that it is a “new creation.” Smith insists that the phrases she added are images “so powerfully evoked by the Korean that I sometimes find myself searching the original text in vain, convinced that they were in there somewhere, as vividly explicit as they are in my head.”
Once you’ve made a reservation at Paris’s first nudist restaurant, you find yourself neurotically broadcasting this bit of news to anyone who will listen. While vacationing in France’s capital recently, a visitor from New York City approached the front desk of his hotel and told the thoughtful-looking employee seated there, “Tonight, we will be eating at the naturist restaurant, O’Naturel. In addition to our clothing, we will also be surrendering our phones, so between eight-forty-five and eleven o’clock we will be unreachable.” The desk clerk nodded gravely.
O’Naturel is situated on a residential street in the Twelfth Arrondissement, a stone’s throw from a nursery school. The restaurant’s co-proprietor, smiling and fully dressed, buzzed the visitor and a friend into a tiny, curtained-off lobby. “New York City!” the co-proprietor said, glancing at his reservation book. “A woman from there is eating with us tonight as well!” The visitor murmured to his friend, “Probably Maureen Dowd.”
Gnomon is a big, ambitious book that sometimes trips over its own bigness, but reads like some kind of game of literary telephone played by Philip K. Dick, Arthur Conan Doyle and William Gibson.
Thirteen years after Michener left Alcatraz, the prison was shuttered. The beds became overgrown and birds established nesting colonies there. Plants, including nine rose bushes, did their own hard time, surviving austere conditions and neglect.
In 2003, the Garden Conservancy, Golden Gate National Parks Conservancy and the National Park Service combined efforts to restore the gardens. More work remains, says Shelagh Fritz, the Golden Gate National Parks Conservancy’s project manager for Alcatraz. As they have toiled, park staffers and volunteers have unearthed evidence of inmate life — including 100 fugitive handballs, escapees from the prison’s rec yard.
Low-power nonprofit FM stations are the still, small voices of media. They whisper out from basements and attics, and from minuscule studios and on-the-fly live broadcasts like KBFG’s. They have traditionally been rural and often run by churches; many date to the early 2000s, when the first surge of federal licenses was issued.
But in the last year, a diverse new wave of stations has arrived in urban America, cranking up in cities from Miami to the Twin Cities in Minnesota, and especially here in the Northwest, where six community stations began to broadcast in Seattle. At least four more have started in Portland. Some are trying to become neighborhood bulletin boards, or voices of the counterculture or social justice. “Alternative” is the word that unites them.
“New year, new me” is one of those lies we tell ourselves like “my parents did the best they could’’ or “wearing sweatpants in public is acceptable.” I should know: I usually made it to about January 9 before I was up to my neck in deep-fried ice cream covered in Jack Daniels with my new gym membership on fire in a paper shredder. But I’ve learned that beating an addiction is not impossible. In 2009 I was a 240-pound, alcoholic cocaine addict who lived in my mom’s basement. I fully expected my addictions to kill me before I turned 30. And they almost did.
Their designs recall radical pamphlets of yore. Their titles suggest what might have once been unfashionable didacticism or naïve breadth: Twenty Lessons from the Twentieth Century, Demagoguery and Democracy, A Rumination on Moral Panic in Our Time. They are reasonably priced: either to be accessible to the people or to be impulse buys — it’s not clear which.
The books are on display at my local bookstore in the front section by the registers. It is the section I like to call Progressive Identity Items, which also features reusable shopping bags with clever slogans and pretty designs, handwoven baby slings, a variety of leftist lawn signs, political coffee mugs, and bumper stickers.
I buy them all. I can’t help it. They’re so cute.
Over the past two decades, the U.S. labor market has undergone a quiet transformation, as companies increasingly forgo full-time employees and fill positions with independent contractors, on-call workers or temps—what economists have called “alternative work arrangements” or the “contingent workforce.” Most Americans still work in traditional jobs, but these new arrangements are growing—and the pace appears to be picking up. From 2005 to 2015, according to the best available estimate, the number of people in alternative work arrangements grew by 9 million and now represents roughly 16 percent of all U.S. workers, while the number of traditional employees declined by 400,000. A perhaps more striking way to put it is that during those 10 years, all net job growth in the American economy was in contingent jobs.
Around Washington, politicians often talk about this shift in terms of the so-called gig economy. But those startling numbers have little to do with the rise of Uber, TaskRabbit and other “disruptive” new-economy startups. Such firms actually make up a small share of the contingent workforce. The shift that came for Borland is part of something much deeper and longer, touching everything from janitors and housekeepers to lawyers and professors.
After arriving in Tromsø, I was terrified at the thought of the impending winter. Months of friends and family telling me how they could “never move some place so cold and dark” because the winter makes them “so depressed” or “so tired” had me bracing for the worst-case scenario.
But it didn’t take long for me to realize that most residents of Tromsø weren’t viewing the upcoming winter with a sense of doom. In fact, to many locals, the original question I’d planned to ask—“Why aren’t people in Tromsø more depressed during the winter?”—didn’t make sense. Most people I spoke to in Tromsø were actually looking forward to the winter. They spoke enthusiastically about the ski season. They loved the opportunities for coziness provided by the winter months.
If novelists are relinquishing the very things that are exclusively the province of the novel, then they are complicit in the demise of the novel. If they don’t want to save the novel, why should anyone else?
Sixty years ago, Anglican children used to sing, with some gusto, a hymn extolling the beauties of the Earth, from “Greenland’s icy mountains” and “India’s coral strand” to other examples of the Creator’s artistry. Two lines, however, suggested a glitch in the divine plan: our planet is a place where “every prospect pleases, / And only man is vile.” Caught up in our singing, we paid little attention. Few of us were budding deep ecologists.
If humanity were originally charged with the stewardship of a wonderfully designed world, as the story in Genesis claims, then it is easy to think we have failed in our responsibilities. We have modified the Earth’s surface — as well as the oceans and the atmosphere — in all manner of unattractive ways. But then, so have other species. Beetles have devastated elm trees around the globe. Ants have altered the vegetation and topography of regions they have invaded. Ivy, gypsy moths, and beavers have wrought their own kinds of devastation. Perhaps we have acted on a vaster scale than other species, but it seems unfair to charge Homo sapiens as uniquely vile.
If the idea of stewardship is taken seriously, it must be rethought.
Where do new words come from? Few are purely invented, in the sense of being coined from a string of sounds chosen more or less at random. Most tend to be existing words given new meaning (“to tweet”). In other cases, a word changes its parts of speech (“to Photoshop”, “to Facebook”). And in some of the most creative instances, people chop words and recombine them to make new ones (as in “sexting”).
New words mostly become embedded through use. A few countries have official academies that declare when a word has been accepted, but they have little actual influence over how people speak. Which words make their way into a language says a lot about where phrasing comes from today.
Some writers believe that they have to ease their readers into darkness. It's a popular gambit, and to an extent, it makes sense — you don't want to lose the reader by plunging them instantly into misery; there has to be some glimmer of hope at the beginning, even if you plan to extinguish it eventually.
Neel Mukherjee, thankfully, is not one of those writers, as his stunning third novel, A State of Freedom, proves. His latest book starts off benignly enough, but it doesn't take long at all for him to twist the knife, letting the reader know that this isn't going to be a saccharine, feel-good story. It's a brutal novel that gets darker and darker, and it's as breathtakingly beautiful as it is bleak.
In 1835, Mary reflected on how “the true end of biography” was to deduce “the peculiar character of the man” from the “minute, yet characteristic details” that punctuated the life: from the specifics of place and clothing and bodily experience in which Sampson’s biography excels. And it is their shared faith in biography as a valuable exploration of character, despite the imperfections of the genre, that is perhaps what brings Sampson closest in her search for Mary Shelley.
As New York evolved over the decades, the subway was the one constant, the very thing that made it possible to repurpose 19th-century factories and warehouses as offices or condominiums, or to reimagine a two-mile spit of land between Manhattan and Queens that once housed a smallpox hospital as a high-tech university hub. When the city is in crisis — financial or emotional — the subway is always a crucial part of the solution. The subway led the city’s recovery from the fiscal calamity of the 1970s. The subway was at the center of the rebuilding of Lower Manhattan after the Sept. 11 attacks. The subway got New York back to work after the most devastating storm in the city’s history just five years ago.
The questions we are facing today are not so different from the ones our predecessors faced 100 years ago. Can the gap between rich and poor be closed, or is it destined to continue to widen? Can we put the future needs of a city and a nation above the narrow, present-day interests of a few? Can we use a portion of the monumental sums of wealth that we are generating to invest in an inclusive and competitive future? The answer to all of these questions is still rumbling beneath New York City.
One day in the nineteen-eighties, a woman went to the hospital for cancer surgery. The procedure was a success, and all of the cancer was removed. In the weeks afterward, though, she felt that something was wrong. She went back to her surgeon, who reassured her that the cancer was gone; she consulted a psychiatrist, who gave her pills for depression. Nothing helped—she grew certain that she was going to die. She met her surgeon a second time. When he told her, once again, that everything was fine, she suddenly blurted out, “The black stuff—you didn’t get the black stuff!” The surgeon’s eyes widened. He remembered that, during the operation, he had idly complained to a colleague about the black mold in his bathroom, which he could not remove no matter what he did. The cancer had been in the woman’s abdomen, and during the operation she had been under general anesthesia; even so, it seemed that the surgeon’s words had lodged in her mind. As soon as she discovered what had happened, her anxiety dissipated.
For me, there is no punctuation mark as versatile and appealing as the em dash. I love the em dash in a way that is difficult to explain, which is, probably, the motivation of this essay. And my love for it is emphasized by the fact that many writers never, or rarely, use it—even disdain it. It is not, so to speak, an essential punctuation mark, the same way commas or periods are essential. You can get along without it and most people do. I don’t remember being taught to use it in elementary, middle, or high school English classes; I’m not even sure I was aware of it then, and I have no clear recollection of when or why I began to rely on it, yet it has become an indispensable component of my writing.
“When Breath Becomes Air,” Paul Kalanithi’s memoir of his final years as he faced lung cancer at age 37, was published posthumously, in 2016, to critical acclaim and commercial success. “The Bright Hour,” Nina Riggs’s memoir of her final years as she faced breast cancer at age 39, was published posthumously, in 2017, to critical acclaim and commercial success. The two books were mentioned together in numerous reviews, lists and conversations.
Perhaps less inevitable was that the late authors’ spouses would end up together, too.
Storytelling is certainly reductive, but its simplifications are the means by which human beings make sense of themselves and of each other. It’s not until the book’s brilliant final act that De Kretser allows the reader to fall in love with a character, Christabel, whose particularity grips and moves, and who achieves the ultimate revenge against the writers who have wounded her, by throwing their novels in the bin.
Over the course of an 18-month investigation, officials in the county’s Office of Children, Youth and Families (C.Y.F.) offered me extraordinary access to their files and procedures, on the condition that I not identify the families involved. Exactly what in this family’s background led the screening tool to score it in the top 5 percent of risk for future abuse and neglect cannot be known for certain. But a close inspection of the files revealed that the mother was attending a drug-treatment center for addiction to opiates; that she had a history of arrest and jail on drug-possession charges; that the three fathers of the little girl and her two older siblings had significant drug or criminal histories, including allegations of violence; that one of the older siblings had a lifelong physical disability; and that the two younger children had received diagnoses of developmental or mental-health issues.
Finding all that information about the mother, her three children and their three fathers in the county’s maze of databases would have taken Byrne hours he did not have; call screeners are expected to render a decision on whether or not to open an investigation within an hour at most, and usually in half that time. Even then, he would have had no way of knowing which factors, or combinations of factors, are most predictive of future bad outcomes. The algorithm, however, searched the files and rendered its score in seconds. And so now, despite Byrne’s initial skepticism, the high score prompted him and his supervisor to screen the case in, marking it for further investigation. Within 24 hours, a C.Y.F. caseworker would have to “put eyes on” the children, meet the mother and see what a score of 19 looks like in flesh and blood.
But what if an algorithm could predict death? In late 2016 a graduate student named Anand Avati at Stanford’s computer-science department, along with a small team from the medical school, tried to “teach” an algorithm to identify patients who were very likely to die within a defined time window. “The palliative-care team at the hospital had a challenge,” Avati told me. “How could we find patients who are within three to 12 months of dying?” This window was “the sweet spot of palliative care.” A lead time longer than 12 months can strain limited resources unnecessarily, providing too much, too soon; in contrast, if death came less than three months after the prediction, there would be no real preparatory time for dying — too little, too late. Identifying patients in the narrow, optimal time period, Avati knew, would allow doctors to use medical interventions more appropriately and more humanely. And if the algorithm worked, palliative-care teams would be relieved from having to manually scour charts, hunting for those most likely to benefit.
Every two years, the American valet-parking industry sends its best parkers—optimistically described as athletes—to compete in a head-to-head battle known as the National Valet Olympics. True to their Olympian namesake, the games push participants to the limit. Competitors sort keys. They pack trunks. They slalom through orange cones. They sprint across parking lots. Organized into corporate teams, they also dress in the snazzy uniforms of their trade.
At first glance, an Olympics organized entirely around valet parking seems absurd: a luxury service treated as a decathlon. Yet the Valet Olympics draw attention to a line of work—or, as some would say, an emerging motorsport—that few ever pause to consider. Successful valets boast automotive skills unappreciated outside the parking lot. And valet parking is a hidden vein of economic opportunity that provides full-time work, first jobs, and summer employment to thousands. For immigrants from Nigeria, India, or Ecuador, or those displaced by war in Iraq, the industry can supply a much-needed foothold in the United States, even launching a lifelong career. What’s more, as cities grow in size and complexity, America’s urban centers are becoming harder to navigate—with byzantine parking laws, dense downtowns that require real-life Tetris skills to park, and massive lots located blocks from the venues they serve. All of this makes valets, as they invisibly rearrange streets, the set designers of every busy cityscape. Giving them an arena to demonstrate their talents is, in this sense, a no-brainer.
I recall Noel Annan, the provost of University College London, declaring in the 1970s that the English literature department, historically the first such in England, was the “very heart” of the school. Any college president making such a claim as Annan’s today could await the men in white coats.
It’s with exhilaration, then, that one hails Martin Puchner’s book, which asserts not merely the importance of literature but its all-importance.
And precision — of observation, of language — is Hoby’s gift. Her sentences are sleek and tailored. Language molds snugly to thought.
So if you’re a Silicon Valley brogrammer who wants to drive women out of your manspace, or a Bernie Bro who thought that ‘Bern the witch’ was the summit of wit, then you’re more likely to rush to the rationale that society and culture – ie, things that we create outside of biology – don’t shape the male spaces that exclude women. No, you desperately want Nature to be responsible.
Appealing to a higher power and cherrypicking ‘evidence’ to support a convenient claim of superiority over others of your species is as human as scratching your butt. In this false iteration of that desperate measure of a failing privileged class, Nature created men to like and do certain man things, and naturally, therefore, men are simply better at these things. In complement, goes this wish-fulfilment rationale, Nature created women to like and do certain lesser things, and women are condescendingly told how great they are at it, especially the talking part, and please stay in the kitchen and make a sandwich because all of this analysis is over your head, Sweetie.
But Nature made me. And I am not rare. Indeed, an entire category of girl and woman exists that is large enough to warrant the now-archaic category of tomboy. We are legion, and most of us likely wished fervently at some point that we could be boys so we could simply gain access to what interested us most. How could Nature create an entire legion of girls whose interests and abilities cross into manworld, yet somehow be capable of producing brains only on a binary?
For a writer like Langston Hughes, who made a name for himself as a poet before the age of 21, his debut novel, “Not Without Laughter,” feels like an effort to stake out a bigger claim on his abilities, to create artistic and thematic breathing room. Arna Bontemps, celebrated poet and friend to Hughes, described “Not Without Laughter” as the novel that both Hughes and his readers knew he had to write, coming as it did on the heels of Hughes’s two well-received poetry collections, “The Weary Blues” (1926) and “Fine Clothes to the Jew” (1927). Hughes published these collections while a student at Lincoln University, and he released “Not Without Laughter” in 1930, shortly after graduating. “By the date of his first book of prose Hughes had become for many a symbol of the black renaissance,” Bontemps writes. The stakes were high, then, for the young man born in Joplin, Mo. He had to deliver.
“Not Without Laughter” crystallizes some of the themes introduced in Hughes’s first two poetry collections and examines in detail subjects he would return to throughout his decades-long career, among them the experiences of working-class and poor blacks, the importance of black music to black life, the beauty of black language and the trap of respectability. It begins as a tale of family life, following the Williamses — the matriarch, Aunt Hager; her daughters, Harriet, Annjee (Annjelica) and Tempy; and Annjee’s husband, Jimboy — in the small Kansas town of Stanton. After establishing the conflicts and desires of the adults, the narrative becomes a bildungsroman. Here it finds its true purpose: chronicling the upbringing of Sandy, the son of Jimboy and Annjee, as he struggles to forge an identity outside of the boxes the white and black worlds have put him in, and tries to find stability within his increasingly unstable home.
I may be starting to sound like a stereotypical radical leftist Marxist English professor, influencing my innocent students and corrupting their minds. Two defences: First, my own college education happened at Hillsdale College, a bastion of free market libertarianism and conservative politics; Hillsdale is where I first learned to read closely for economic dynamics (if not exactly with the intended grain, there). Second, my students led me to at least half of the epiphanies in the present Bloomberg article. And these epiphanies took place right on the surface: we were not ‘reading into’ this piece. It’s all right there.
The consequence of this vast gambit for our attention is that we have been drawn into a kind of mental slavery. Masters of profits and propaganda are farming our minds, doing cumulative damage that may go to the very core of our humanity. As a result, our attention is becoming locked into a low level of living and functioning.
A recurring theme in “The King Is Always Above the People” is the need to explore how leaving home, and returning to it, changes you irremediably. Alarcón manages to offer a fresh look at migration, the oldest story of all. “The place you are born,” he writes, “is simply the first place you flee.”
The next day finds you lying naked in a dumpster in a different state, smeared from head to toe with a mixture of Sriracha sauce and glitter. At first you remember nothing. But then, as your throbbing brain slowly reboots, memories of the night before, disturbing memories, begin creeping into your consciousness. As the full, hideous picture comes into focus, you curl into a ball, whimpering, asking yourself over and over: Did that really happen?
That’s how we here at the Year in Review feel about 2017. It was a year so surreal, so densely populated with strange and alarming events, that you have to seriously consider the possibility that somebody — and when we say “somebody,” we mean “Russia” — was putting LSD in our water supply. A bizarre event would occur, and it would be all over the news, but before we could wrap our minds around it, another bizarre event would occur, then another and another, coming at us faster and faster, battering the nation with a Category 5 weirdness hurricane that left us hunkering down, clinging to our sanity, no longer certain what was real.
For literary memoirs, a brisk survey of the genre insists, are hardly ever about what really happens to the people whose names appear on their jackets. They are far more likely to be about what those people think happens to them or how they wish to be regarded by the readers who buy their work. Anthony Powell, for example, and despite compelling evidence to the contrary, always imagined himself to be “a poor boy made good”. Lodge, on the other hand, offered the highly unusual spectacle of a creative writer simply setting down, with sometimes disarming lack of guile, how he had come to be the person he was.
Clause by clause, word by word, anything becomes plausible. Control is achieved through willing proximity to its loss. It seems he’s “just filling a notebook with jazz”, but then these directionless improvisations acquire the weight of stories. Sideways drift gives way to narrative. So let’s hand the wheel back to the narrator at the Starlight who has “nothing to show for 36 years on this earth. Except that God is closer to me than my next breath. And that’s all I’ll ever need or want. If you think I’m bullshitting, kiss my ass. My story is the amazing truth.”