The sidewalk line is a beast of its own kind, native to the space outside whatever latest bakeshop or store selling limited-edition streetwear. Within the broader genus of lines, it differs from those inside the post office or Starbucks. (I’ll call those normal lines “normal lines.”) All types of lines are a product of math that expresses the rate at which people arrive and how fast a cashier can distribute some stuff. Normal lines are born of a solvable fluke: too many people, too few cashiers. Sidewalk lines do not want to be solved. They are intentional — cultivated, managed, bred like show dogs. In certain types of luxury transactions, we’ve come to accept them as a predestined fact.
I used to believe that standing in line was a natural arrangement of human bodies, much like geese flying south in a “V.” But queuing is a recent and man-made invention. The first historical description of the line only appeared in 1837, in Thomas Carlyle’s The French Revolution. Describing a postwar scarcity of bread, he wrote: “If we look now at Paris, one thing is too evident: that the Bakers’ shops have got their Queues, or Tails; their long strings of purchasers, arranged in tail, so that the first come be the first served.” According to Carlyle, lining up was a uniquely French eccentricity. How earlier peoples distributed their bread is a fact that I’ve not yet been able to suss out. Before self-serve supermarkets, most stores relied on a deli-counter model. I can only assume that shoppers massed around a vendor, who granted his attention to the squeakiest wheel.
I am grateful that we are increasingly careful about how we talk about and report on suicide, and I respect the intention behind the ban on “commit suicide.” But I can’t support it. I don’t begrudge those who are more comfortable with “died by suicide” or “killed themselves,” but I bristle at the prescriptive nature of their objections, as though the rest of us who prefer “committed suicide” are wrong and need to catch up.
“Why Liberalism Failed” is a book that reads like an attempt to enunciate a primal scream, a deeply exasperating volume that nevertheless articulates something important in this age of disillusionment.
Sourdough is a soup of skilfully balanced ingredients: there’s satire, a touch of fantasy, a pinch of SF, all bound up with a likeable narrator whose zest for life is infectious. The novel opens a door on a world that’s both comforting and thrillingly odd.
Today, if there's traffic in the area and you want to follow the law, you need to find a crosswalk. And if there's a traffic light, you need to wait for it to change to green.
To most people, this seems part of the basic nature of roads. But it's actually the result of an aggressive, forgotten 1920s campaign led by auto groups and manufacturers that redefined who owned the city streets.
"In the early days of the automobile, it was drivers' job to avoid you, not your job to avoid them," says Peter Norton, a historian at the University of Virginia and author of Fighting Traffic: The Dawn of the Motor Age in the American City. "But under the new model, streets became a place for cars — and as a pedestrian, it's your fault if you get hit."
“The New Colossus” emerges at a pivotal moment in history. The year before Lazarus’s poem was read at the Bartholdi Pedestal Fund Art Loan Exhibition in New York, in 1883, the Chinese Exclusion Act became the first Federal law that limited immigration from a particular group. Though set to last for 10 years, various extensions and additions made the law permanent until 1943. The year after Lazarus’s poem was read, the European countries met in Berlin to divide up the African continent into colonies. “The New Colossus” stands at the intersection of U.S. immigration policy and European colonialism, well before the physical Statue of Liberty was dedicated. The liberal sentiments of Lazarus’s sonnet cannot be separated from these developments in geopolitics and capitalism.
The poem’s peculiar power comes not only from its themes of hospitality, but also from the Italian sonnet form that contains them. A Petrarchan sonnet is an awkward vehicle for defenses of American greatness. Historically, the epic poem has been the type of poetry best suited to nationalist projects, since its narrative establishes a “storied pomp” in literature that has yet to exist in the world. The sonnet, in contrast, is a flexible, traveling form, one that moved from Italy to England. It is more at home in the conversations, translations, and negotiations between national literatures than in the creation or renewal of national eminence.
The question of how to make a living as a writer is at its surface very simple. The answer is, you write whenever you’re not doing your real, proper job. The proper job, where you earn your proper living. The answer is, you feel grateful to have a job at all. The answer is, you tuck your writing away, like a cyclist rolling up one trouser leg so the cuff doesn’t get caught up in the chain. The answer is, you have reasons to write other than to make any money—some of them banal and maybe even embarrassing, like wanting to be seen, wanting to be someone. Some of them grander and easier to own up to, like trying to understand what it means to be in this world when so many of us feel we are outside of it. Whatever your reasons, they push you forward.
Why did Sidney Lumet in 1974 and Kenneth Branagh in 2017 go to the trouble of assembling star-studded casts to revisit Agatha Christie’s ingenious but very creaky novel of the 1930s, Murder on the Orient Express? One try was understandable. After all, the novel draws on at least two compelling genres: 12 suspects confined to their elegant sleeping and dining cars on a train delayed by a snow drift offers not just the bafflement of the closed room mystery, but also the inescapable microcosm of the “ship of fools” (Narrenschiff), which goes back to the Middle Ages and leaves traces on the masterpieces of Boccaccio and Chaucer. Christie’s fools, too, are obliged to tell stories that we may or may not accept at face value. Even before Christie’s novel, both genres were made popular in star-studded films like Grand Hotel (1932) and The Kennel Murder Case (1933). But why did Branagh try what Lumet had tried already? Yes, remakes offer a certain kind of artistic challenge, and there are lots of remakes. And the choice of this Christie novel is peculiar for any form of ambition that expects to be taken seriously.
Of course, even the permanent sabbatical must come to an end. There was a time when I feared that I would not make it. Now I am distressed that I will have to give it all up. Better not to linger on the thought. According to Freud, we are really unable to believe that we are mortal, for we cannot conceive of ourselves as being absent.
The arts of dying well and ceremoniously, the artes moriendi, were cultivated when there was a commonly held belief in the afterlife. These so-called arts were part of a larger religious mystery. To an unbeliever, the ineluctable moment is a mystery of another kind: When and where will the grim terrorist strike?
Child prodigies are exotic creatures, each unique and inexplicable. But they have a couple of things in common, as Ann Hulbert’s meticulous new book, “Off the Charts,” makes clear: First, most wunderkinds eventually experience some kind of schism with a devoted and sometimes domineering parent. “After all, no matter how richly collaborative a bond children forge with grown-up guides, some version of divorce is inevitable,” Hulbert writes. “It’s what modern experts would call developmentally appropriate.” Second, most prodigies grow up to be thoroughly unremarkable on paper. They do not, by and large, sustain their genius into adulthood.
I don’t know if this strategy is effective. I still think about the things I should be doing instead, but moving feels better than staying still, so I keep riding. When you’re broke, your body becomes your last resort, a mostly reliable means to make money that also comes with great precarity. If you get injured in a low-wage job with no employment insurance, there’s nothing to fall back on. You pay with your health.
I feel this job in my body. My neck cracks, my shoulders pop, my ankles creak. Some nights, I ride until my legs turn numb and the wind whips tears in my eyes and the world becomes fuzzy at the edges. Then I have a choice. I can keep riding or I can stop and wait until my path becomes clear again.
But just as often, we allow ourselves to be borne along by the currents of what’s swirling around us without abstracting away from it. Getting swept up in a musical performance is just one among a whole host of familiar activities that seem less about computing information, and more about feeling our way as we go: selecting an outfit that’s chic without being fussy, avoiding collisions with other pedestrians on the pavement, or adding just a pinch of salt to the casserole. If we sometimes live in the world in a thoughtful and considered way, we go with the flow a lot, too.
I think it’s a mistake to dismiss these sorts of experiences as ‘mindless’, or the notion of a merely visceral grasp of something as oxymoronic. Instead, I think that the lived reality of music puts pressure on philosophers to broaden their conception of what the mind is, how it works, and to embrace the diversity of ways in which we can begin to grapple with the world around us.
The delusion isn’t that criticism is important; it is important, the more so as discourse increasingly takes the form of people screaming at each other on the internet. The delusion is that critics can ever transcend the subjectivity that makes good criticism so interesting in the first place. And if a certain negativity, even a certain schadenfreude, attaches to that subjectivity, well, would you rather have a pretended objectivity that observes all the proprieties and never risks giving offense?
In these times, the most important task of game journalism isn’t to serve a public interest but to ensure that fans can continue to identify some version of themselves in the games they have played, and ensure future releases will allow them access to even deeper levels of self-expression and understanding. In playing the next game, owning the newest console, having an opinion on the latest patch, we feel like we can become stabler versions of ourselves, all at the cost of clearing out space—both mental and financial—for open-ended consumption of a form without any purpose beyond this increasingly tautological pleasure. This process is necessarily dehumanizing. Games matter because you are here to play them, and you remain here to play them because they matter.
Maybe this is why, with video games, we break from the tradition of identifying people with particular pastimes as lovers—bibliophile, cinephile, audiophile. To love video games is to become not a ludophile but a gamer, a claim on identity rather than a statement of personal interest. Every fact or feeling in our lives that doesn’t relate to games is an extraneous detail, so much so that it can feel like one’s whole life might be beside the point.
While Johnson treads much of the same territory here as in his 1992 cult classic—addiction, crime, obsession, psychedelic transcendence—his final book is overall more concerned with what follows such madness. If Jesus’ Son proclaims, “Holy shit, after all the drugs and alcohol and violence, I can’t believe I’m still alive,” then Largesse responds, “Yes, but you’re still going to die anyway.”
As I looked through a big window in the living room, I could see most of Honolulu below. On the edge of the city is the lush extinct volcanic crater of Diamond Head. Beneath that lie rows and rows of homes that seem to melt together. The suburban boxes feel like they would just fall into the sea, if not for the skyscrapers that dot Waikiki and Downtown Honolulu and form a wall that stops them from flowing straight into the ocean.
It was the view I had looked at my whole life. But for a split second, I saw it all gone — just gray dust and rubble, like a photo I might have seen of a bombed-out city in World War II.
Working artists who have fouled their reputations will have to fend for themselves. Directors who have taken advantage of the casting couch, actors who have grotesquely exploited their stardom, conductors who have preyed on their young charges deserve to have the rug pulled out from under them. If the work they've done lives on, it will do so apart from the memory of their shameful deeds. This will take time.
I hear from my former students occasionally. A few have gone on to accomplish remarkable work. Hear equally from the ordinary and the remarkable. Requests for recommendations, announcements of new jobs, marriages, children, a photo, copy of a book or film script, story in a magazine or anthology, perhaps inscribed personally to me or sent directly from the publisher. The gift of a snapshot, book, or story meant to break the silence that settles in after they leave the university, the silence that being here, a student for a semester in my fiction-writing class, doesn’t break, silence of living ordinary lives we all endure whether our writing is deemed remarkable by others or not.
A current student, Teresa McConnell, wants to help other people. The story she submits to my fiction-writing class, though not very long, is quite ambitious. It wishes to save the life of its main character, a young woman of color, a few years out of high school, single, child to support, no money, shitty job, living with her mother who never misses an I-told-you-so chance to criticize her daughter’s choices. Voice of the character my student invents to narrate the story reveals the young colored woman to be bright, articulate, thoughtful, painfully aware of how race, gender, age, poverty trap her. Worse now because a baby daughter is trapped with her. Lack of understanding not the narrator’s problem. She’s stifled by lack of resources, options.
Miéville’s generic boundary-crossing is more than simply a stylistic approach; it is also a political commitment “to think the world, and to change it.” What distinguishes Miéville as a writer is the way that his novels utilize the imaginative potential of fantastic fiction to engage with social and political reality. In interviews, Miéville has situated his novels as post-Seattle literature, and his writing responds to what the late Mark Fisher has termed capitalist realism, “the widespread sense that not only is capitalism the only viable political and economic system, but also that it is now impossible to even imagine a coherent alternative to it.” For Miéville, the radical potential of the fantastic lies in its ability to conceptualize a world beyond reality as presently constructed. Miéville uses fantasy in a way that makes the familiar appear strange and that challenges the stability of the present. By constructing fantastical worlds that continually thwart established rules and expectations, Miéville’s novels unmask the limitations of social imagination and hold open the utopian possibilities of imagining the world otherwise — of conceptualizing “the not-this-ness of this” (as he puts it in Iron Council).
The recent publication of two books on Miéville testifies not only to his relevance to modern fiction, but also to the political importance of fantastic literature to contemporary culture. Carl Freedman’s Art and Idea in the Novels of China Miéville and Caroline Edwards and Tony Venezia’s edited collection China Miéville: Critical Essays both draw attention to Miéville as an important contemporary literary figure and as a critical thinker whose novels engage with wider concerns of genre, politics, and the imagination. Choosing to write about a figure like Miéville is no easy task. As Edwards and Venezia note in their introduction, given Miéville’s rapid rate of productivity (since the publication of these books, Miéville has released two novellas, a short story collection, and a nonfiction account of the Russian Revolution), as well as his own theorization of the genre, Miéville “always seems to be two steps ahead of his critics.” However, through their exploration of the literary and political significance of Miéville’s fiction, both Art and Idea and China Miéville: Critical Essays provide fascinating and engaging analyses of Miéville’s novels that remarkably integrate their textual and theoretical elements. Together, both works point to the ways that fantastic literature can help to imagine alternatives to the enclosing realities of contemporary capitalism.
Sure, some comedians get lucky and get seven-figure book deals, but many comedians are not selling books just for the money. Moreover, the relationship between a book publisher and a comedian—even in the absence of these large-figure transfers—is still appealing on both sides.
So what is the appeal of books, and what does the book industry offer that comedians can’t get elsewhere?
From time to time, we have an experience that makes us stop and really think about what matters. Often, it takes something jarring to put everything into perspective. This happened to me recently, when I was wandering around a construction site at night and I bumped my head on a steel girder. As soon as it happened, I knew this wasn’t one of those regular, run-of-the-mill bumps on the head you get from keeping your bowling ball on a high shelf or by operating a scissor lift in your basement. This was the kind of bump on the head that opens your eyes and gives you a new outlook on life.
All at once, I understood that life isn’t about our own selfish needs, but about what we do for others. I see now, too, that I do a lot of things that put my head in harm’s way, and that I should limit how often I do that, with the goal of never having it happen. In short, I’m a changed man.
Ultimately, Medoff's book is about finding oneself — and satisfaction — in a combination of absorbing work and personal relationships. In addition to kindness, Rosa ingrains this idea in her acolytes: "The key is to be the same person at home and at work."
Joe Duff, CEO of Operation Migration, is not the only conservationist to wear a uniform to work. But instead of the khakis and polos that serve to show that humans are all part of the same team, his uniform helps him blend in among a flock of whooping cranes. It’s not a bird costume, per se. Rather than making the wearer look like something else, its purpose is to conceal what they are — a human being who’s trying to teach these cranes how to be wild.
Most of the suit is nothing more than an amorphous white bag that covers the wearer’s arms and everything from head to mid-calf. A volunteer makes every piece specially for the program. To hide their faces, they use white plastic construction helmets covered in a layer of white fabric, except for a small plate made out of reflective mylar that they use to see and a strip of mesh to help them breathe. The costumes are neither stylish nor, in the hot summer months, particularly comfortable. (“Whooping cranes can spend their life in the marsh and mud and they’re still pure white; we can’t spend 10 minutes,” Duff says.) They use the same outfits year-round and have to make them from a material thick enough that when the light shines through, there’s no chance of a crane making out the human silhouette underneath. One hand is covered by a black fabric mitten stitched to the costume so the birds never catch a glimpse of skin. In the other, they carry a puppet meant to look like the head of a whooping crane. It’s this, not the blob of white human attached to it, that the birds interact with.
The process helped Oxford’s editors study all of the shades of meaning expressed by a single word, but it was also tedious and messy. With thousands of slips pouring into the OED’s offices every day, things could often go wrong.
And they did.
If one is to appreciate Coetzee’s essays, one must recognize how precisely, with what concentration, he lays bare a terrifically complex psychology, giving readers a handhold on which to build even more complex readings of their own.
It's a deeply generous, compassionate book that asks its readers to open their hearts and treat one another with understanding, even as the world grows more complicated, and more unknowable, every day.
MyAppleMenu Reader will be taking a break tomorrow, and will return on Saturday, 13 Jan, 2018.
Over the past century, the quest to describe the geometry of space has become a major project in theoretical physics, with experts from Albert Einstein onwards attempting to explain all the fundamental forces of nature as byproducts of the shape of space itself. While on the local level we are trained to think of space as having three dimensions, general relativity paints a picture of a four-dimensional universe, and string theory says it has 10 dimensions – or 11 if you take an extended version known as M-Theory. There are variations of the theory in 26 dimensions, and recently pure mathematicians have been electrified by a version describing spaces of 24 dimensions. But what are these ‘dimensions’? And what does it mean to talk about a 10-dimensional space of being?
In the 1960s, Richard Feynman and Bryce DeWitt set out to quantize gravity using the same techniques that had successfully transformed electromagnetism into the quantum theory called quantum electrodynamics. Unfortunately, when applied to gravity, the known techniques resulted in a theory that, when extrapolated to high energies, was plagued by an infinite number of infinities. This quantization of gravity was thought incurably sick, an approximation useful only when gravity is weak.
Since then, physicists have made several other attempts at quantizing gravity in the hope of finding a theory that would also work when gravity is strong. String theory, loop quantum gravity, causal dynamical triangulation and a few others have been aimed toward that goal. So far, none of these theories has experimental evidence speaking for it. Each has mathematical pros and cons, and no convergence seems in sight. But while these approaches were competing for attention, an old rival has caught up.
There is nothing quite like the scent of a library. The aroma of old paper when you walk through the door is the smell of thought itself, of memory and time. For years, I bought used books I could display on a shelf because being an English major and aspiring writer who didn’t own books made me feel like more of an imposter than I already did. Occasionally I loaned them to people, letting the recipient assume that this copy I was giving them, this paperback or hardcover and not the four-track cassettes in the green plastic container, was the one I had read. Occasionally, when no one was around, I pulled one from the shelf and turned the pages. Even without a magnifier, my eyes can tell where the text lies, locate the little black wings on the otherwise blank page that must be the dedication. A few times I would hold the hardcover in my hand while the cassette played, guessing when to turn the page.
In graduate school, I’ve feared that I might be missing some element of the reading experience, that I might never have been reading at all. Even today, I continue to worry that the stories and books I write are not organic, authentic creations because all the books that have inspired and educated me were consumed through secondary media, replications of the original text. A scholar of the humanities might point out the Homeric tradition of oral storytelling, noting that once upon a time writing and publishing didn’t even exist. For years I tried to write such an essay, defending the way I read by describing the different languages of the world, the unique alphabets with their own characters incomprehensible to other cultures. But there is no defense quite like the feeling that you have nothing to defend.
Here’s the heart of the problem: The set of critics’ and audiences’ interests do not perfectly overlap but rather form a Venn diagram. In the audience circle, the pressing question is, “Should I spend some number of the dollars I have to my name and the hours I have left on Earth on this thing?” Critics get in for free and by definition have to read or watch or listen to whatever’s next up. So their circle is filled with relativistic questions about craft and originality and wallet quality and the often unhelpfully general “Is it good?” (Some of them even have an idea of what they mean by “good”; the rest are winging it.)
Most of our science, philosophy and religion starts from the assumption that there are humans and there are animals – and there could never at any point be any common ground between them. To call someone an animal is as bad an insult as you can offer, and yet we’re all mammals. For centuries, the notion of human uniqueness was the most fundamental orthodoxy. Now it is being challenged. Book after book ventures into the no-man’s-land – the no-animal’s-land – that lies between our species and the other ten million or so in the animal kingdom. As often as not, they reveal more of ourselves than of our fellow animals.
With every page we turn, we can feel the resistance to any suggestion that non-human animals are even remotely like ourselves. Of course animals can’t think, can’t feel, can’t talk. We resist this not because such things are impossible but because they are unthinkable. Our lives would be horribly compromised if we accepted that we humans were just one more species of animal.
The village of Hobart, New York, is home to two restaurants, one coffee shop, zero liquor stores, and, strangely enough, five independent bookstores. “The books just show up,” Barbara Balliet, who owns Blenheim Hill Books, says. “I’ve come to the store and bags of books are waiting for me.” Fewer than 500 people live in Hobart. Yet from Main Street, in the center of town, you’re closer to a copy of the Odyssey in classical Greek, or a vintage collection of Jell-O recipes, than a gas station.
This literature-laden state of affairs emerged just after the turn of the millennium, when two residents of Manhattan, Diana and Bill Adams, stopped in Hobart during a trip through the Catskills. “We were both intrigued,” says Bill, who worked as a physician for 40 years. “I saw what a charming, and somewhat rustic, but civilized, area it was.” He and his wife Diana, a former lawyer, were looking for retirement activities that they could pursue into their old age.
Of all the mistakes made by city planners in the postwar era, the passion for highway construction has to be one of the most foolhardy. After the early success of systems like the autobahn and freeways, cities everywhere were carved up to make way for giant roads, crashing through neighbourhoods and creating opportunities for “comprehensive redevelopment”.
This was considered progress, a necessary part of entering the modern world. But some strange things happened – the most damning being that these new roads didn’t reduce traffic at all. Instead, they induced demand, clogging up almost as quickly as they were built. As time went on, communities began rejecting the plans and fighting back against the bulldozers, halting development in its tracks and kickstarting the modern conservation movement.
Finally, the knockout blow was the oil crisis of the 1970s which put an end to many big plans. Looking back, we can marvel at how outrageous some of these and later schemes were, and the traces they left behind.
Part love story and part speculative sci-fi, it’s a meandering, albeit meaningful, look at marriage, technology and ghosts — those of the otherworldly type that may exist but also specters of our past that influence our present.
These debates point to an apparent paradox in our understanding of autism: is it a disorder to be diagnosed, or an experience to be celebrated? How can autism be something that must be ‘treated’ at one level, but also praised and socially accommodated at another? Many people in the neurodiversity community say that autism is just a natural variant in the human condition. But should autistic individuals have the same legal rights as everyone else? Or are their needs different, and if so, how? If we are invited to be skeptical of clinical approaches, how can we decide who qualifies for additional support? The fundamental conundrum is that, over its troubled history, views have shifted about whether autism is part of a narrative description of an individual’s developing life, or whether it’s a measurable category that others have the right to count, demarcate and define.
When the nature of work changes, companies reward new ways of feeling about it. The rise of white-collar work in the 1950s birthed the risk-averse organization man, whose highest values were loyalty and orderly conduct. The deregulation of the 1980s made virtues of aggression and ruthless competition. The new economy is characterized by instability and disruption; its ideal worker is calm in the midst of it all, productive and focused. The mindfulness training his company offers isn’t so much a perk as it is the means of turning him into a new type of person. He can work all weekend, multitask frantically on Monday morning, and reset mentally just by breathing in and out on his lunch break. His employer is not just the center of his overflowing, unpredictable working life but the center of his simple, dependable spiritual life too.
Mornings are normal, but they are not regular. While they happen daily, each morning of course starts and ends at a slightly different time, as the sun rises earlier in the summer and later in the winter. Our society follows a regular 24-hour cycle that glosses over each day’s variations. Morning shows are one way that mornings become regulated. Beyond merely establishing a set schedule, morning shows provide a sense of continuity. The same hosts appear every morning, continuing a conversation that picks up on the previous day’s stories and events. A morning show points to the newly risen sun in the sky. It says that today’s morning is just like yesterday’s.
Chronological decades have little material significance. To a biologist or physician, the physiological differences between, say, 39-year-old Fred and 44-year-old Fred aren’t vast—probably not much different than those between Fred at 38 and Fred at 39. Nor do our circumstances diverge wildly in years that end in nine compared with those that end in zero. Our life narratives often progress from segment to segment, akin to the chapters of a book. But the actual story doesn’t abide by round numbers any more than novels do. After all, you wouldn’t assess a book by its page numbers: “The 160s were super exciting, but the 170s were a little dull.” Yet, when people near the end of the arbitrary marker of a decade, something awakens in their minds that alters their behavior.
Taylor has journeyed deep into the human psyche, and if you accompany him on the pilgrimage afforded by these Complete Stories, you will emerge transformed.
“Winter” is an insubordinate folk tale, with echoes of the fiction of Iris Murdoch and Angela Carter, that plays out against a world gone wrong.
Around 500 BC, the Carthaginian explorer Hanno the Navigator guided a fleet of sixty oared ships through the Strait of Gibraltar and along the northwest lobe of the great elephant ear that is the African continent. Toward the end of his journey, on an island in a lagoon, he encountered a “rude description of people”—rough-skinned, hairy, violent. The local interpreters called them Gorillae. Hanno and his crew attempted to capture some of them, but many climbed up steep elevations and hurled stones in defense. Eventually, the Carthaginians caught three female Gorillae, flayed them, and brought their skins back home, where they hung in the Temple of Tanit for several centuries.
Though scholars dispute whether the Gorillae were gorillas, chimpanzees, or an indigenous tribe of humans, many regard Hanno’s account as the oldest surviving record of humans encountering another species of great ape. The ambiguity of Hanno’s early descriptions—are the Gorillae human or beast, people or apes?—is not just an artifact of translational difficulties; it is exemplary of a profound misunderstanding in historical attitudes about our closest animal cousins, a confusion that is still being resolved today.
On a windy spring morning two years ago, at an ape sanctuary and research facility in the heart of North America, I had an encounter of my own. Spread over six acres of forest, fields, and lakes in Des Moines, Iowa, the Ape Cognition and Conservation Initiative is home to a clan of five bonobos, including the renowned Kanzi, perhaps the most linguistically talented ape ever studied. When Kanzi was an infant, researchers tried to teach his adoptive mother to communicate using an array of lexigrams on a keyboard. She never made much progress, but Kanzi, like a human child exposed to language, began to use the symbols on his own. Today, he knows the meanings of hundreds of lexigrams and understands many phrases of spoken English as well.
A few months back, I landed in Narita Airport, and from there I took a rail to the center of Tokyo. The plan was to spend a few weeks in Japan. Partly because I’d always wanted to, but mostly because the chance had presented itself. The island lived in my imagination the way it does for gaijin all over the world. There are millions of us, and we never think we’ll go—but my boyfriend Dave and I were going; we’d bought the tickets, booked the Airbnbs, and one night before we left, a whiteboy in a bar asked what any of that had to do with me.
My parents used to travel. They’d made their way all over. We had a cupboard full of mugs from Sweden, and some salt shakers from Peru, and, some weekends, my father wore a kimono around the living room, mumbling after shitty NFL calls in German. Growing up in Houston, I studied Japanese in school. None of it was practical. It had absolutely nothing to do with my life.
In an essay about translating “Human Acts,” published in the online magazine Asymptote, Deborah Smith describes reading Han’s work and being “arrested by razor-sharp images which arise from the text without being directly described there.” She quotes a couple of her “very occasional interpolations,” including the striking phrase “sad flames licking up against a smooth wall of glass.” Charse Yun, in his essay about “The Vegetarian,” declares his admiration for Smith’s work but argues that it is a “new creation.” Smith insists that the phrases she added are images “so powerfully evoked by the Korean that I sometimes find myself searching the original text in vain, convinced that they were in there somewhere, as vividly explicit as they are in my head.”
Once you’ve made a reservation at Paris’s first nudist restaurant, you find yourself neurotically broadcasting this bit of news to anyone who will listen. While vacationing in France’s capital recently, a visitor from New York City approached the front desk of his hotel and told the thoughtful-looking employee seated there, “Tonight, we will be eating at the naturist restaurant, O’Naturel. In addition to our clothing, we will also be surrendering our phones, so between eight-forty-five and eleven o’clock we will be unreachable.” The desk clerk nodded gravely.
O’Naturel is situated on a residential street in the Twelfth Arrondissement, a stone’s throw from a nursery school. The restaurant’s co-proprietor, smiling and fully dressed, buzzed the visitor and a friend into a tiny, curtained-off lobby. “New York City!” the co-proprietor said, glancing at his reservation book. “A woman from there is eating with us tonight as well!” The visitor murmured to his friend, “Probably Maureen Dowd.”
Gnomon is a big, ambitious book that sometimes trips over its own bigness, but reads like some kind of game of literary telephone played by Philip K. Dick, Arthur Conan Doyle and William Gibson.
Thirteen years after Michener left Alcatraz, the prison was shuttered. The garden beds became overgrown and birds established nesting colonies there. Plants, including nine rose bushes, did their own hard time, surviving austere conditions and neglect.
In 2003, the Garden Conservancy, Golden Gate National Parks Conservancy and the National Park Service combined efforts to restore the gardens. More work remains, says Shelagh Fritz, the Golden Gate National Parks Conservancy’s project manager for Alcatraz. As they have toiled, park staffers and volunteers have unearthed evidence of inmate life — including 100 fugitive handballs, escapees from the prison’s rec yard.
Low-power nonprofit FM stations are the still, small voices of media. They whisper out from basements and attics, and from minuscule studios and on-the-fly live broadcasts like KBFG’s. They have traditionally been rural and often run by churches; many date to the early 2000s, when the first surge of federal licenses was issued.
But in the last year, a diverse new wave of stations has arrived in urban America, cranking up in cities from Miami to the Twin Cities in Minnesota, and especially here in the Northwest, where six community stations began to broadcast in Seattle. At least four more have started in Portland. Some are trying to become neighborhood bulletin boards, or voices of the counterculture or social justice. “Alternative” is the word that unites them.
“New year, new me” is one of those lies we tell ourselves like “my parents did the best they could” or “wearing sweatpants in public is acceptable.” I should know: I usually made it to about January 9 before I was up to my neck in deep-fried ice cream covered in Jack Daniel’s with my new gym membership on fire in a paper shredder. But I’ve learned that beating an addiction is not impossible. In 2009 I was a 240-pound, alcoholic cocaine addict who lived in my mom’s basement. I fully expected my addictions to kill me before I turned 30. And they almost did.
Their designs recall radical pamphlets of yore. Their titles suggest what might have once been unfashionable didacticism or naïve breadth: Twenty Lessons from the Twentieth Century, Demagoguery and Democracy, A Rumination on Moral Panic in Our Time. They are reasonably priced: either to be accessible to the people or to be impulse buys — it’s not clear which.
The books are on display at my local bookstore in the front section by the registers. It is the section I like to call Progressive Identity Items, which also features reusable shopping bags with clever slogans and pretty designs, handwoven baby slings, a variety of leftist lawn signs, political coffee mugs, and bumper stickers.
I buy them all. I can’t help it. They’re so cute.
Over the past two decades, the U.S. labor market has undergone a quiet transformation, as companies increasingly forgo full-time employees and fill positions with independent contractors, on-call workers or temps—what economists have called “alternative work arrangements” or the “contingent workforce.” Most Americans still work in traditional jobs, but these new arrangements are growing—and the pace appears to be picking up. From 2005 to 2015, according to the best available estimate, the number of people in alternative work arrangements grew by 9 million and now represents roughly 16 percent of all U.S. workers, while the number of traditional employees declined by 400,000. A perhaps more striking way to put it is that during those 10 years, all net job growth in the American economy has been in contingent jobs.
Around Washington, politicians often talk about this shift in terms of the so-called gig economy. But those startling numbers have little to do with the rise of Uber, TaskRabbit and other “disruptive” new-economy startups. Such firms actually make up a small share of the contingent workforce. The shift that came for Borland is part of something much deeper and longer, touching everything from janitors and housekeepers to lawyers and professors.
After arriving in Tromsø, I was terrified at the thought of the impending winter. Months of friends and family telling me how they could “never move some place so cold and dark” because the winter makes them “so depressed” or “so tired” had me bracing for the worst-case scenario.
But it didn’t take long for me to realize that most residents of Tromsø weren’t viewing the upcoming winter with a sense of doom. In fact, to many locals, the original question I’d planned to ask—“Why aren’t people in Tromsø more depressed during the winter?”—didn’t make sense. Most people I spoke to in Tromsø were actually looking forward to the winter. They spoke enthusiastically about the ski season. They loved the opportunities for coziness provided by the winter months.
If novelists are relinquishing the very things that are exclusively the province of the novel, then they are complicit in the demise of the novel. If they don’t want to save the novel, why should anyone else?
Sixty years ago, Anglican children used to sing, with some gusto, a hymn extolling the beauties of the Earth, from “Greenland’s icy mountains” and “India’s coral strand” to other examples of the Creator’s artistry. Two lines, however, suggested a glitch in the divine plan: our planet is a place where “every prospect pleases, / And only man is vile.” Caught up in our singing, we paid little attention. Few of us were budding deep ecologists.
If humanity were originally charged with the stewardship of a wonderfully designed world, as the story in Genesis claims, then it is easy to think we have failed in our responsibilities. We have modified the Earth’s surface — as well as the oceans and the atmosphere — in all manner of unattractive ways. But then, so have other species. Beetles have devastated elm trees around the globe. Ants have altered the vegetation and topography of regions they have invaded. Ivy, gypsy moths, and beavers have wrought their own kinds of devastation. Perhaps we have acted on a vaster scale than other species, but it seems unfair to charge Homo sapiens as uniquely vile.
If the idea of stewardship is taken seriously, it must be rethought.
Where do new words come from? Few are purely invented, in the sense of being coined from a string of sounds chosen more or less at random. Most tend to be existing words given new meaning (“to tweet”). In other cases, a word changes its part of speech (“to Photoshop”, “to Facebook”). And in some of the most creative instances, people chop words and recombine them to make new ones (as in “sexting”).
New words mostly become embedded through use. A few countries have official academies that declare when a word has been accepted, but they have little actual influence over how people speak. Which words make their way into a language says a lot about where phrasing comes from today.
Some writers believe that they have to ease their readers into darkness. It's a popular gambit, and to an extent, it makes sense — you don't want to lose the reader by plunging them instantly into misery; there has to be some glimmer of hope at the beginning, even if you plan to extinguish it eventually.
Neel Mukherjee, thankfully, is not one of those writers, as his stunning third novel, A State of Freedom, proves. His latest book starts off benignly enough, but it doesn't take long at all for him to twist the knife, letting the reader know that this isn't going to be a saccharine, feel-good story. It's a brutal novel that gets darker and darker, and it's as breathtakingly beautiful as it is bleak.
In 1835, Mary reflected on how “the true end of biography” was to deduce “the peculiar character of the man” from the “minute, yet characteristic details” that punctuated the life: from the specifics of place and clothing and bodily experience in which Sampson’s biography excels. And it is their shared faith in biography as a valuable exploration of character, despite the imperfections of the genre, that is perhaps what brings Sampson closest in her search for Mary Shelley.
As New York evolved over the decades, the subway was the one constant, the very thing that made it possible to repurpose 19th-century factories and warehouses as offices or condominiums, or to reimagine a two-mile spit of land between Manhattan and Queens that once housed a smallpox hospital as a high-tech university hub. When the city is in crisis — financial or emotional — the subway is always a crucial part of the solution. The subway led the city’s recovery from the fiscal calamity of the 1970s. The subway was at the center of the rebuilding of Lower Manhattan after the Sept. 11 attacks. The subway got New York back to work after the most devastating storm in the city’s history just five years ago.
The questions we are facing today are not so different from the ones our predecessors faced 100 years ago. Can the gap between rich and poor be closed, or is it destined to continue to widen? Can we put the future needs of a city and a nation above the narrow, present-day interests of a few? Can we use a portion of the monumental sums of wealth that we are generating to invest in an inclusive and competitive future? The answer to all of these questions is still rumbling beneath New York City.
One day in the nineteen-eighties, a woman went to the hospital for cancer surgery. The procedure was a success, and all of the cancer was removed. In the weeks afterward, though, she felt that something was wrong. She went back to her surgeon, who reassured her that the cancer was gone; she consulted a psychiatrist, who gave her pills for depression. Nothing helped—she grew certain that she was going to die. She met her surgeon a second time. When he told her, once again, that everything was fine, she suddenly blurted out, “The black stuff—you didn’t get the black stuff!” The surgeon’s eyes widened. He remembered that, during the operation, he had idly complained to a colleague about the black mold in his bathroom, which he could not remove no matter what he did. The cancer had been in the woman’s abdomen, and during the operation she had been under general anesthesia; even so, it seemed that the surgeon’s words had lodged in her mind. As soon as she discovered what had happened, her anxiety dissipated.
For me, there is no punctuation mark as versatile and appealing as the em dash. I love the em dash in a way that is difficult to explain, which is, probably, the motivation of this essay. And my love for it is heightened by the fact that many writers never, or rarely, use it—even disdain it. It is not, so to speak, an essential punctuation mark, the way commas or periods are essential. You can get along without it and most people do. I don’t remember being taught to use it in elementary, middle, or high school English classes; I’m not even sure I was aware of it then, and I have no clear recollection of when or why I began to rely on it, yet it has become an indispensable component of my writing.
“When Breath Becomes Air,” Paul Kalanithi’s memoir of his final years as he faced lung cancer at age 37, was published posthumously, in 2016, to critical acclaim and commercial success. “The Bright Hour,” Nina Riggs’s memoir of her final years as she faced breast cancer at age 39, was published posthumously, in 2017, to critical acclaim and commercial success. The two books were mentioned together in numerous reviews, lists and conversations.
Perhaps less inevitable was that the late authors’ spouses would end up together, too.
Storytelling is certainly reductive, but its simplifications are the means by which human beings make sense of themselves and of each other. It’s not until the book’s brilliant final act that De Kretser allows the reader to fall in love with a character, Christabel, whose particularity grips and moves, and who achieves the ultimate revenge against the writers who have wounded her, by throwing their novels in the bin.
Over the course of an 18-month investigation, officials in the county’s Office of Children, Youth and Families (C.Y.F.) offered me extraordinary access to their files and procedures, on the condition that I not identify the families involved. Exactly what in this family’s background led the screening tool to score it in the top 5 percent of risk for future abuse and neglect cannot be known for certain. But a close inspection of the files revealed that the mother was attending a drug-treatment center for addiction to opiates; that she had a history of arrest and jail on drug-possession charges; that the three fathers of the little girl and her two older siblings had significant drug or criminal histories, including allegations of violence; that one of the older siblings had a lifelong physical disability; and that the two younger children had received diagnoses of developmental or mental-health issues.
Finding all that information about the mother, her three children and their three fathers in the county’s maze of databases would have taken Byrne hours he did not have; call screeners are expected to render a decision on whether or not to open an investigation within an hour at most, and usually in half that time. Even then, he would have had no way of knowing which factors, or combinations of factors, are most predictive of future bad outcomes. The algorithm, however, searched the files and rendered its score in seconds. And so now, despite Byrne’s initial skepticism, the high score prompted him and his supervisor to screen the case in, marking it for further investigation. Within 24 hours, a C.Y.F. caseworker would have to “put eyes on” the children, meet the mother and see what a score of 19 looks like in flesh and blood.
But what if an algorithm could predict death? In late 2016 a graduate student named Anand Avati at Stanford’s computer-science department, along with a small team from the medical school, tried to “teach” an algorithm to identify patients who were very likely to die within a defined time window. “The palliative-care team at the hospital had a challenge,” Avati told me. “How could we find patients who are within three to 12 months of dying?” This window was “the sweet spot of palliative care.” A lead time longer than 12 months can strain limited resources unnecessarily, providing too much, too soon; in contrast, if death came less than three months after the prediction, there would be no real preparatory time for dying — too little, too late. Identifying patients in the narrow, optimal time period, Avati knew, would allow doctors to use medical interventions more appropriately and more humanely. And if the algorithm worked, palliative-care teams would be relieved from having to manually scour charts, hunting for those most likely to benefit.
Every two years, the American valet-parking industry sends its best parkers—optimistically described as athletes—to compete in a head-to-head battle known as the National Valet Olympics. True to their ancient namesake, the games push participants to the limit. Competitors sort keys. They pack trunks. They slalom through orange cones. They sprint across parking lots. Organized into corporate teams, they also dress in the snazzy uniforms of their trade.
At first glance, an Olympics organized entirely around valet parking seems absurd: a luxury service treated as a decathlon. Yet the Valet Olympics draw attention to a line of work—or, as some would say, an emerging motorsport—that few ever pause to consider. Successful valets boast automotive skills unappreciated outside the parking lot. And valet parking is a hidden vein of economic opportunity that provides full-time work, first jobs, and summer employment to thousands. For immigrants from Nigeria, India, or Ecuador, or those displaced by war in Iraq, the industry can supply a much-needed foothold in the United States, even launching a lifelong career. What’s more, as cities grow in size and complexity, America’s urban centers are becoming harder to navigate—with byzantine parking laws, dense downtowns that require real-life Tetris skills to park in, and massive lots located blocks from the venues they serve. All of this makes valets, as they invisibly rearrange streets, the set designers of every busy cityscape. Giving them an arena to demonstrate their talents is, in this sense, a no-brainer.
I recall Noel Annan, the provost of University College London, declaring in the 1970s that the English literature department, historically the first such in England, was the “very heart” of the school. Any college president making such a claim as Annan’s today could expect a visit from the men in white coats.
It’s with exhilaration, then, that one hails Martin Puchner’s book, which asserts not merely the importance of literature but its all-importance.
And precision — of observation, of language — is Hoby’s gift. Her sentences are sleek and tailored. Language molds snugly to thought.
So if you’re a Silicon Valley brogrammer who wants to drive women out of your manspace, or a Bernie Bro who thought that ‘Bern the witch’ was the summit of wit, then you’re more likely to rush to the rationale that society and culture – ie, things that we create outside of biology – don’t shape the male spaces that exclude women. No, you desperately want Nature to be responsible.
Appealing to a higher power and cherrypicking ‘evidence’ to support a convenient claim of superiority over others of your species is as human as scratching your butt. In this false iteration of that desperate measure of a failing privileged class, Nature created men to like and do certain man things, and naturally, therefore, men are simply better at these things. In complement, goes this wish-fulfilment rationale, Nature created women to like and do certain lesser things, and women are condescendingly told how great they are at it, especially the talking part, and please stay in the kitchen and make a sandwich because all of this analysis is over your head, Sweetie.
But Nature made me. And I am not rare. Indeed, an entire category of girl and woman exists that is large enough to warrant the now-archaic category of tomboy. We are legion, and most of us likely wished fervently at some point that we could be boys so we could simply gain access to what interested us most. How could nature both create an entire legion of girls whose interests and abilities cross into manworld, yet somehow be capable of producing brains only on a binary?
For a writer like Langston Hughes, who made a name for himself as a poet before the age of 21, his debut novel, “Not Without Laughter,” feels like an effort to stake out a bigger claim on his abilities, to create artistic and thematic breathing room. Arna Bontemps, celebrated poet and friend to Hughes, described “Not Without Laughter” as the novel that both Hughes and his readers knew he had to write, coming as it did on the heels of Hughes’s two well-received poetry collections, “The Weary Blues” (1926) and “Fine Clothes to the Jew” (1927). Hughes published these collections while a student at Lincoln University, and he released “Not Without Laughter” in 1930, shortly after graduating. “By the date of his first book of prose Hughes had become for many a symbol of the black renaissance,” Bontemps writes. The stakes were high, then, for the young man born in Joplin, Mo. He had to deliver.
“Not Without Laughter” crystallizes some of the themes introduced in Hughes’s first two poetry collections and examines in detail subjects he would return to throughout his decades-long career, among them the experiences of working-class and poor blacks, the importance of black music to black life, the beauty of black language and the trap of respectability. It begins as a tale of family life, following the Williamses — the matriarch, Aunt Hager; her daughters, Harriet, Annjee (Annjelica) and Tempy; and Annjee’s husband, Jimboy — in the small Kansas town of Stanton. After establishing the conflicts and desires of the adults, the narrative becomes a bildungsroman. Here it finds its true purpose: chronicling the upbringing of Sandy, the son of Jimboy and Annjee, as he struggles to forge an identity outside of the boxes the white and black worlds have put him in, and tries to find stability within his increasingly unstable home.
I may be starting to sound like a stereotypical radical leftist Marxist English professor, influencing my innocent students and corrupting their minds. Two defences: First, my own college education happened at Hillsdale College, a bastion of free market libertarianism and conservative politics; Hillsdale is where I first learned to read closely for economic dynamics (if not exactly with the intended grain, there). Second, my students led me to at least half of the epiphanies in the present Bloomberg article. And these epiphanies took place right on the surface: we were not ‘reading into’ this piece. It’s all right there.
The consequence of this vast gambit for our attention is that we have been drawn into a kind of mental slavery. Masters of profits and propaganda are farming our minds, doing cumulative damage that may go to the very core of our humanity. As a result, our attention is becoming locked into a low level of living and functioning.
A recurring theme in “The King Is Always Above the People” is the need to explore how leaving home, and returning to it, changes you irremediably. Alarcón manages to offer a fresh look at migration, the oldest story of all. “The place you are born,” he writes, “is simply the first place you flee.”
The next day finds you lying naked in a dumpster in a different state, smeared from head to toe with a mixture of Sriracha sauce and glitter. At first you remember nothing. But then, as your throbbing brain slowly reboots, memories of the night before, disturbing memories, begin creeping into your consciousness. As the full, hideous picture comes into focus, you curl into a ball, whimpering, asking yourself over and over: Did that really happen?
That’s how we here at the Year in Review feel about 2017. It was a year so surreal, so densely populated with strange and alarming events, that you have to seriously consider the possibility that somebody — and when we say “somebody,” we mean “Russia” — was putting LSD in our water supply. A bizarre event would occur, and it would be all over the news, but before we could wrap our minds around it, another bizarre event would occur, then another and another, coming at us faster and faster, battering the nation with a Category 5 weirdness hurricane that left us hunkering down, clinging to our sanity, no longer certain what was real.
For literary memoirs, a brisk survey of the genre insists, are hardly ever about what really happens to the people whose names appear on their jackets. They are far more likely to be about what those people think happens to them or how they wish to be regarded by the readers who buy their work. Anthony Powell, for example, despite compelling evidence to the contrary, always imagined himself to be “a poor boy made good”. Lodge, on the other hand, offered the highly unusual spectacle of a creative writer simply setting down, with a sometimes disarming lack of guile, how he had come to be the person he was.
Clause by clause, word by word, anything becomes plausible. Control is achieved through willing proximity to its loss. It seems he’s “just filling a notebook with jazz”, but then these directionless improvisations acquire the weight of stories. Sideways drift gives way to narrative. So let’s hand the wheel back to the narrator at the Starlight who has “nothing to show for 36 years on this earth. Except that God is closer to me than my next breath. And that’s all I’ll ever need or want. If you think I’m bullshitting, kiss my ass. My story is the amazing truth.”