Alex Weber takes a deep breath through her snorkel and dives to the bottom of Carmel Bay, a calm coastal cove several kilometers south of Monterey, California. Just meters away, atop the small cliffs that drop into the waves, golfers tread on the emerald greens of Pebble Beach Golf Links. It’s early December—the sky’s blue, the weather T-shirt warm. The golfers swing their way from hole to hole on the famed golf course. Unfortunately, their aim is rarely perfect.
Weber surfaces 45 seconds later and drops nearly a dozen golf balls into a yellow mesh bag held open by her father, Mike, who is also in a wetsuit and snorkeling gear. The pair have been in the water for several hours and have collected more than 1,500 golf balls—the fallout of a sport that has unseen, and probably significant, consequences for the ocean.
In America, Death is a whisper. Instinctively we feel we should dim the lights, lower our voices and draw the screens. We give the dead, the dying and the grieving room. We say we do so because we don’t want to intrude. And that is true but not for these reasons.
We don’t want to intrude because we don’t want to look at the mirror of our own mortality. We have lost our way with death.
On the Irish island off the coast of County Mayo, where my family have lived in the same village for the last 200 years, death speaks with a louder voice.
I was a bookseller when I first encountered Kristin Newman’s travel memoir nestled among the morning delivery. Squinting for a moment, I recognized the red blob beneath the title — What I Was Doing While You Were Breeding — as a lipstick kiss on an airplane window. The jacket copy not only summed up the countries she visited, but also the men she met: “Israeli bartenders, Finnish poker players, sexy Bedouins, and Argentinean priests.” My throat constricted, heartbeat erratic, as I slipped a copy in my bag. I chalked up this difficulty swallowing and sweaty palms to disagreeing with Newman’s central argument: that by travelling alone instead of settling down, she found herself, and only then could she walk off into the sunset she’d been destined for. I couldn’t read past the first chapter, instead wanting to take to the streets like an Evangelical grasping a paint paddle and duct tape sign: this book is a lie; that’s not how the story goes; repent! What I meant was: this is not how my story goes.
“Blue Dreams,” like all good histories of medicine, reveals healing to be art as much as science. Slater doesn’t demonize the imperfect remedies of the past or present — even as she describes their costs with blunt severity. And, improbably perhaps, she ends on a note of hope, calling these early efforts to address mental illness “the first golden era.” If the story of the magic bullet of psychopharmacology is coming to an end, another story — a potentially better one, Slater believes — is coming to take its place.
Bob and Robin made three documentaries in the highlands, two of them about Joe Leahy and his neighbors. Each was a triumph, and they still are recognized as such, icons of a genre, touchstones of both anthropology and film. The initial one, First Contact, was nominated for an Academy Award, and the last, Black Harvest, had “extraordinary historical resonance,” the New York Times wrote, “so rich that watching it feels like taking an inspired crash course in economics and cultural anthropology.” Newsweek said it had “the scale and richness of classical tragedy.” Which was true, because everything ended so badly.
On a clear, bright summer morning more than two decades later, Bob inhales a great gulp of crystalline air. Away from Mount Hagen, a chaotic burgh wrapped in barbed wire and wood smoke, the highlands are pristine. Bob always was struck by that, the clarity of the place. As for the rest—Kilima, Joe, the Ganiga who have lived on this land for untold generations—Bob isn’t sure how it all turned out, and that’s why he’s back for the first time in more than 25 years. Maybe he’ll even find enough story for a fourth film.
The Dutch-born historian and New York Review of Books editor Ian Buruma spent several years in Tokyo when he was in his early 20s, and he picked up a nickname. Because “Buruma” sounds a bit like the word “bloomers” in Japanese, at least one acquaintance — Hijikata Tatsumi, the father of Butoh dance — called him Underpants or, for short, Pants.
It was an affectionate sobriquet. The Japanese playwrights, dancers, film directors and actors who made up Buruma’s milieu in the mid-1970s adored him. He was exotic, a gaijin, as white as tofu. He was tall and good-looking and possessed a trace of bohemian glamour because it was known that his uncle was John Schlesinger, the director of “Midnight Cowboy.”
“Freshwater” is a poetic and disturbing depiction of mental illness as it haunts the protagonist from birth to adulthood: “the brief insanities that are in you, not just the ones that blossomed as you grew taller … but the ones you were born with, tucked behind your liver.” It is an unflinching account of the way mental illness can grow, transform and destroy not just relationships, but one’s sense of self as well.
Fifty-five years ago, I built a house (that is, paid for the building of it) in the northwest corner of Princeton Township, in New Jersey. It was on an unpaved road, running through woods and past an abandoned cornfield that had become a small meadow. My house looks out through trees and down that meadow.
Improbably, I developed a yearning, almost from the get-go, to see a bear someday in the meadow. While I flossed in the morning, looking north through an upstairs bathroom window, I hoped to see a bear come out of the trees. If this seems quixotic, it was. This was four miles from the campus of Princeton University, around which on all sides was what New Yorkers were calling a bedroom community. Deer were present in large familial groups, as they still are in even larger families. They don’t give a damn about much of anything, and when I walk down the driveway in the morning to pick up the newspaper I all but have to push them out of the way. Beforehand, of course, I have been upstairs flossing, looking down the meadow. No bears.
More often than I’d like to admit, something seemingly inconsequential will cause the same feeling to rear its head again. Something as small as accidentally squashing the panettone I was bringing my boyfriend’s family for Christmas can tumble around in my mind for several days, accompanied by occasional voices like “How stupid!” and “You should have known better”. Falling short of a bigger goal, even when I know achieving it would be near-impossible, can temporarily flatten me. When an agent told me that she knew I was going to write a book someday but that the particular idea I’d pitched her didn’t suit the market, I felt deflated in a gut-punching way that went beyond disappointment. The negative drowned out the positive. “You’re never going to write a book,” my internal voice said. “You’re not good enough.” That voice didn’t care that this directly contradicted what the agent actually said.
That’s the thing about perfectionism. It takes no prisoners.
“I understand your frustration,” I replied, “and wish I could help to change the situation.”
I may have been only a lowly intern, but even I knew it was a feeble reply. And he knew it. “Understanding is not enough,” he said. “You should be doing something to help fix this system.”
The hospital, he lamented, is more like a factory — “it tests every ache and treats every laboratory abnormality, but it does little to heal its patients.” Treating and healing are both necessary, but modern health care too often disregards the latter.
It doesn't seem like anyone remembers Titanic as a great movie, despite the fact that it won 11 goddamned Oscars. Maybe it's because we've decided that anything teen girls like is terrible. Or maybe it's that every line sounds like it was directly copied and pasted from some other period romance movie. But what if there was something much more daring going on under the surface? What if all of that fanfiction positing that Jack was short for "Jacqueline" is in fact onto the real story being told ... or at least the one that James Cameron wanted to tell, but chickened out of at some point? If you think I'm just screwing with you, give me a chance to make my case.
A moan begins in the back of his throat, lower pitched than a whine, higher than a groan, and grows. His head tips back. His eyes close. The moan escapes in a rush of vowels, louder and louder and louder, and now he is howling. It’s the sound he made in his youth whenever he heard a siren passing on the big road at the edge of the neighborhood, but he can’t hear that far any more.
In this wide-ranging study spanning from the Oscar Wilde trials in the 1890s to the gay liberation movement of the late 20th century, Woods demonstrates that this paranoid fantasy of a clandestine queer underground has been a persistent feature of the modern heterosexual imagination. Yet, daring to take it seriously, Woods tells a history of cultural modernity that focuses on interconnected queer cliques and coteries that, taken together, formed the backbone of the modernist movement that revolutionized the visual, literary, and performing arts. Rather than presenting it as an organized conspiracy against hetero hegemony, though, he imagines the 20th-century gay avant-garde as a single transnational network, one with a “consistency of purpose” that “cohere[s] as a single narrative of lives lived against the grain.” Dubbing this the “Homintern,” a play on the “Comintern,” or Communist International organization founded by Vladimir Lenin in 1919, Woods grounds his history in a simple yet subtle claim: that ever since the invention of “homosexuality” as an identity category in the late 19th century, and the simultaneous rise of individuals who began to identify themselves as “homosexuals,” those who desire their own sex have been obliged to keep those desires hidden from public view. This forced them to create clandestine connections for intimacy, support, and comradeship. Yet because of this socially enforced secrecy, many straight people came to see homosexuals as deliberately and inherently disingenuous. They seemed to present a deceptively “normal” public image that masked the perverse pleasures they indulged behind closed doors. Much like the communists with whom they were often associated, the Homintern was believed to have no allegiance to any nation or culture beyond itself, and hence was inherently opposed to the State and to the commonly held values that hold society together. 
“The willingness of gay men and lesbians to associate across national boundaries throughout the last century,” Woods states, “led to extraordinary encounters, some fleeting, others more enduring; some sexual, some social, many creative.”
“It’s all happened so quickly,” says Oliver, who feels unhappy not only about the poster but, in the British style of ever-decreasing circles of self-consciousness, unhappy about the ingratitude his dislike of the poster might be said to show his employers. “Which I know is bizarre to say, 11 years later, but I don’t feel I’ve come up for breath on any of this yet.” He grins with the boyish incredulity that has become a large part of the appeal of his show. “The fact there is a poster of me in Times Square is absurd.”
It has been a strange thing, particularly for British observers, to watch Oliver transform from Jon Stewart’s vaguely Beatles-like young sidekick into a middle-aged man with the heft – both figuratively and literally – to drive his own hit series. Last Week Tonight, in which he and his team use comedy to animate stories either too complicated or too dull to excite rolling news interest, is a relatively small product in the HBO canon, but clearly a big source of prestige. Watching Oliver land jokes about nefarious town planning, or tease from Edward Snowden confirmation of when the National Security Agency can look at pictures of his penis, not only makes the viewer feel smart, it has the giggly sense of laughing at things we’re not supposed to find funny. At its best – and when it most often goes viral – Last Week Tonight is that rare thing, a highly entertaining show with a measure of social utility (at its worst, it has the over-anxious air of someone trying to tap-dance life into the unmentionably tedious). “We’ve done some really boring things,” Oliver says, with the delight of a man who has bucked every commercial principle in his industry and still come out victorious.
This is how I found myself crying early one morning in January: For the three weeks I had been living in a studio apartment in Daytona Beach, Fla., I had gone running four times a week. On that particular day, I ran down to the beach, turning right when I hit the sand so that I would see the sun rise.
The sight wasn’t new, but on that day I felt so good and warm and light that, just as the sun crested over the waves, that well-worn line from “Annie” blasted through my mind: “The sun will come out tomorrow.” And I burst into tears.
In an effort to outrun more than a year of depression and grief, and the seasonal affective disorder that swamped me most winters since childhood, I had become a 37-year-old snowbird. The tears that morning were a mixture of joy and relief because, in pointing my car — packed with three duffel bags of clothes and a box of books — south for most of January and February after the worst year of my life, I had finally started to feel whole again.
What Langlands is advocating for in his book is more widespread knowledge about the time when craft was integral to daily life. In the era he studies, activities like beekeeping weren’t escapes from reality, but essential to it. He also smartly notes that neither “craft” nor “cræft” is a synonym for “working with one’s hands.” At its root, the word “manufacture,” which is associated with mass production, means “to make by hand.” Most of the cheap goods we buy are made at least in part by people. The reason assembly isn’t “cræft,” to follow his logic, is that the final form of an assembled object is predetermined, requiring no ingenuity or material wisdom.
Lagerspetz’s book is an investigation into what we mean by “dirt” and whether it is an actual quality of the world or, as most current theoretical work would have us believe, a subjective idea projected on to reality. Lagerspetz deconstructs the easy reductionism of theorists for whom “dirt is not really dirt but something else”. This includes such luminaries as Mary Douglas (for whom dirt was merely “matter out of place”) and Julia Kristeva, who he says has banished dirt “to the misty regions of symbolism”. He believes this is a mistake, one that has arisen because “matter is not perceived as strange enough”.
But much of the story, as insiders tell it, will ring disconcertingly familiar to anyone involved in the modern news industry. It’s a tale of a precarious business model, a roller coaster of explosive growth and cruel contraction, mercurial corporate ownership, and journalists forced to produce work so shoddy and craven that they were embarrassed to attach their name to it, all in the name of “saving the company”—and their jobs. At a time when Google and Facebook have become the prime conduits to online news, Newsweek’s downfall highlights the existential vulnerability of even the best-known media brands to the whims of tech companies’ algorithms. It also suggests something more chilling: how quickly a reputable news organization can disintegrate in the hands of the wrong owners.
It's hard to think of a writer of his generation who has more defined the American male's perception of himself, for better or worse, than David Mamet. This is perhaps not the best period in our history to be that person, but it makes his writing particularly noteworthy. The 1992 film adaptation of his play, "Glengarry Glen Ross," has become the fountainhead of so much male bonding vocabulary that we've wearied of the "Coffee's for closers," "Third prize is you're fired," and, of course, "Always Be Closing," that each generation of fraternity boys and young financial service professionals seems to discover anew. By now, aided by the ascension of a salesman absurdum to the highest office in the land, generations of young men speak these lines as gospel instead of satire. Mamet's new novel, "Chicago," is as linguistically rich as "Glengarry Glen Ross" — in fact, as any of his previous work in any medium.
Pinker speaks fondly of the gifts of the Enlightenment. And there were many. But the Enlightenment also gave us habits of mind, among them the habit of thinking that the human experience can be aligned along a single axis of progress. Pinker’s gift is to challenge us not only to update the Enlightenment but to think beyond it.
Even so, the quest to capture “the meaning of everything” – as the writer Simon Winchester described it in his book on the history of the OED – has absorbed generations of lexicographers, from the Victorian worthies who set up a “Committee to collect unregistered words in English” to the OED’s first proper editor, the indefatigable James Murray, who spent 36 years shepherding the first edition towards publication (before it killed him). The dream of the perfect dictionary goes back to the Enlightenment notion that by classifying and regulating language one could – just perhaps – distil the essence of human thought. In 1747, in his “Plan” for the English dictionary that he was about to commence, Samuel Johnson declared he would create nothing less than “a dictionary by which the pronunciation of our language may be fixed, and its attainment facilitated; by which its purity may be preserved, its use ascertained, and its duration lengthened”. English would not be merely listed in alphabetical order; it would be saved for eternity.
Ninety years after the first edition appeared, the OED – a distant, far bulkier descendant of Johnson’s Dictionary – is currently embarked on a third edition, a goliath project that involves overhauling every entry (many of which have not been touched since the late-Victorian era) and adding at least some of those 30,000 missing words, as well as making the dictionary into a fully digital resource. This was originally meant to be completed in 2000, then 2005, then 2010. Since then, OUP has quietly dropped mentions of a date. How far had they got, I asked Proffitt. “About 48%,” he replied.
The films, in addition to having diminishing returns, were causing a physical toll: He was a big man doing stunts, running around in front of green screens, going from set to set. His body began to fall apart. “By the time I did the third Mummy picture in China,” which was 2008, “I was put together with tape and ice—just, like, really nerdy and fetishy about ice packs. Screw-cap ice packs and downhill-mountain-biking pads, 'cause they're small and light and they can fit under your clothes. I was building an exoskeleton for myself daily.” Eventually all these injuries required multiple surgeries: “I needed a laminectomy. And the lumbar didn't take, so they had to do it again a year later.” There was a partial knee replacement. Some more work on his back, bolting various compressed spinal pads together. At one point he needed to have his vocal cords repaired. All told, Fraser says, he was in and out of hospitals for almost seven years.
He laughs a small, sad laugh. “This is gonna really probably be a little saccharine for you,” Fraser warns. “But I felt like the horse from Animal Farm, whose job it was to work and work and work. Orwell wrote a character who was, I think, the proletariat. He worked for the good of the whole, he didn't ask questions, he didn't make trouble until it killed him.… I don't know if I've been sent to the glue factory, but I've felt like I've had to rebuild shit that I've built that got knocked down and do it again for the good of everyone. Whether it hurts you or not.”
Specifically, I have deuteranomaly, or red-green color blindness. It was diagnosed in elementary school, and for a while it terrified me. I remember exactly what the room looked like; I was getting an annual eye exam (I’d worn glasses since about the third grade), and I guess they’d just started regularly screening for color blindness. They asked me to identify a number in a group of colored dots, and of course I couldn’t see any numbers at all. The lady giving the test turned and hollered to my mother, on the other side of the waiting room, “Did you know she’s color-blind?”
My mom was shocked, and not in a good way, and I thought, “Blind? Did she just say the word blind?”
A lot of things don’t have meaning until there’s some basis of comparison. When you’re a kid, your world is just your world, and you don’t know it’s different from anyone else’s until that’s pointed out to you. When I was a kid, people pointed to red and called it red, so I called it red; but what I was seeing wasn’t the red they saw at all. It took me a long time to understand what it meant to be color-blind. All I knew then was it felt like a dark shadow following me around, and I just wanted it to go away.
White folks would ask Vertamae Smart-Grosvenor the weirdest questions.
They noticed that she called herself a "Geechee girl" in her 1970 cookbook-memoir, Vibration Cooking: or, the Travel Notes of a Geechee Girl. But what the hell was a Geechee girl? Was that anything like a "geisha girl," they wondered? If she was black, then why didn’t she consider herself a “soul food” writer like most other black food writers? And what was this concept of vibration cooking, anyway? Was it cooking with a vibrator?
You’d think that she was speaking gibberish when she wrote the book. In it, she’d introduced America to a radical concept: Cooking isn’t rocket science as much as it is an outgrowth of feeling.
Crammed with feverish, hallucinatory imagery—a “glutinous” swimming pool, “smears of curry and iodine” in a sky that “looks like a massacre”—these are gynocentric tales of angsty adolescent girls, anomic wives, adult women with difficult mothers, and elderly women with lost daughters. Braverman’s feminism can be hot and oleaginous, like the burning oil of medieval punishment, as when a domestic helpmate transforms into a witchy Medea figure to exact vengeance on her husband. But elsewhere the book’s politics, gender or otherwise, seem threaded with a gentle mysticism.
We all struggle with how to deal with our shortcomings, but few of us seem as comfortably wrapped in the question’s delicious, pig-in-a-blanket anxiety as Knapp, whose perpetual handwringing serves as the book’s unfocused narrative spine.
Negative news is one reason why people consistently underestimate the progress humanity is making, complains Steven Pinker. To discern the true state of the world, he says, we should use numbers. In “Enlightenment Now”, he does just that. The result is magnificent, uplifting and makes you want to rush to your laptop and close your Twitter account.
Citizenship and its varying legal definition has become one of the key battlegrounds of the 21st century, as nations attempt to stake out their power in a G-Zero, globalized world, one increasingly defined by transnational, borderless trade and liquid, virtual finance. In a climate of pervasive nationalism, jingoism, xenophobia, and ever-building resentment toward those who move, it’s tempting to think that doing so would become more difficult. But alongside the rise of populist, identitarian movements across the globe, identity itself is being virtualized, too. It no longer needs to be tied to place or nation to function in the global marketplace.
Hannah Arendt called citizenship “the right to have rights.” Like any other right, it can be bestowed and withheld by those in power, but in its newer forms it can also be bought, traded, and rewritten. Virtual citizenship is a commodity that can be acquired through the purchase of real estate or financial investments, subscribed to via an online service, or assembled by peer-to-peer digital networks. And as these options become available, they’re also used, like so many technologies, to exclude those who don’t fit in.
Mr. Hughes is one of the co-founders of Facebook, for which he did “three years’ worth of work for nearly half a billion dollars,” as he puts it, emphasizing the extreme nature of his success. He and Mark Zuckerberg were roommates at Harvard, and early on, Mr. Hughes ran the company’s communications and marketing department. The social network’s colossal success fast-tracked Mr. Hughes’s career. In 2008, he joined Barack Obama’s first presidential campaign to launch and manage My.BarackObama.com, a robust system that organized Obama supporters and was viewed as instrumental to his victory. In 2012, when Facebook went public and The New Republic came up for sale, he bought it, hoping to usher the publication into a digital future and expand its reach. His tumultuous ownership ended in 2016, when he sold the magazine. Later that year, he partnered with Mr. Warren and Ms. Foster to form the Economic Security Project.
In his new book, “Fair Shot: Rethinking Inequality and How We Earn,” out this week, Mr. Hughes, 34, traces his ascent to show how the forces that influenced his and Facebook’s success — technological advancements, globalization and the rise of private equity firms — have created a “winner takes all” economy in which only a small group of people succeed.
Modern retellings bring the horror into sharp focus. Drawing upon the fantastic, otherworldly logic and familiar narratives of the fairy-tale genre, Machado’s stories dwell upon the inescapable queerness of embodied life for women in a patriarchal world—where queerness describes not only unruly sexual desire, but also a whole spectrum of peculiar, delightful, and devastating things that can happen to a body. If living in a feminine body is a party, as the book’s ambivalent title may suggest, the precise nature of the occasion might be more Donner Party than Cinderella Ball. Some of these conditions are mundane in their horror: the pressure to conform to beauty norms, the difficulties of pregnancy and childbirth, the lasting trauma of sexual assault.
What we can know of our bodies, ourselves, or each other is the subject of Jessie Greengrass’s debut novel. The author of an award-winning short story collection, An Account of the Decline of the Great Auk, According to One Who Saw It, Greengrass adapts to the novel format with enviable flexibility. Or rather, she adapts it to her own specific literary sensibilities: ruminative, taking a judicious distance from things, just far enough to see the links between them, but not so far she misses their textures.
For years, our most important records have been committed to specialized materials and technologies. For archivists, 1870 is the year everything begins to turn to dust. That was the year American newspaper mills began replacing rag-based paper with wood pulp, ensuring that newspapers printed after would be known to future generations as delicate things, brittle at the edges, yellowing with the slightest exposure to air. In the late 1920s, the Kodak company suggested microfilm was the solution, neatly compacting an entire newspaper onto a few inches of thin, flexible film. In the second half of the century, entire libraries were transferred to microform, spun on microfilm reels, or served on tiny microfiche platters, while the crumbling originals were thrown away or pulped. To save newspapers, we first had to destroy them.
Then came digital media, which is even more compact than microfilm, giving rise, initially at least, to fantasies of whole libraries preserved on the head of a pin. In the event, the new digital records degraded even more quickly than did newsprint. Information’s most consistent quality is its evanescence. Information is fugitive in its very nature.
“It would be a D’Anjou or a Bosc, something run-of-the-mill,” he said on a recent afternoon as snowflakes feathered down outside the bakery window. “I would touch it and it would just kind of yield to my thumb, and then I would take a bite of it right there in the store and I would have to make a little slurping sound, so not all the juice would run down my chin, but some would.”
A feeling of deprivation is part of the psychological cycle of life in Alaska. In summertime, farmers’ market tables overflow with greens, and garden zucchinis swell to the size of small dogs in the all-night light. But now is the season of austerity and anticipation. Though each day brings a few more minutes of light, the root-cellar stock has dwindled down to last summer’s dry-skinned beets, gnarled carrots and potatoes with eyes. All those blocks of frozen sockeye in the chest freezer don’t have the same appeal that they did in November.
And yet people who actually like cooking tend to crave boundaries—to want to be, as Julia Child assured us we could be, “alone in the kitchen.” What if you wish to preserve a kitchen secret—to slip, say, the odd, shameful envelope of Lipton’s onion-soup mix into your meat loaf, à la Ann Landers? Radical transparency becomes kitchen exhibitionism: we are all on cooking shows now. The food writer Sierra Tishgart, whose kitchen opens onto her Greenwich Village junior one-bedroom, told me that she dreams of a closed kitchen. “I also don’t have a dishwasher, and part of the horror of my open kitchen (which is basically inside my living room) is that there’s nowhere to hide dirty plates.” One friend told me that she has developed the habit of hiding dirty pots in her oven. We all know, from experience, that the open kitchen is an invitation to guests to hop up, one by one, like whack-a-moles, with their dirty dishes.
As I sat in the Library of Congress’s reading room poring over drafts swamped with marginalia, paragraphs for episodes that never materialized, and ephemera scribbled on the backs of grocery store receipts and old envelopes, I was alternately entranced and dismayed. Amidst this thicket of sentences and ideas, I had hoped to discover a plan, an ending, or—better yet—an explanation for why this writer of the first order hadn’t completed what he was certain would be his magnum opus. I never found any of these. Instead, I was given an inside view of artistic struggle stretched across decades that had resulted not in the conquest of an author over form but in a sprawling curiosity cabinet of literary possibilities.
“If I have any gift at all,” Zadie Smith admits in one of the essays in Feel Free, “it’s for dialogue—that trick of breathing what-looks-like-life into a collection of written sentences.” Smith does voices. Sometimes literally: an audio recording of her reading her story “Escape from New York” includes the treat that is impressions of its three characters, Michael Jackson, Marlon Brando and Elizabeth Taylor. Her fiction, of course, is full of voices, but the rendering of this familiar trio and their escape occupies that fertile gray area somewhere between entirely real and entirely fabricated. It isn’t mimicry, which leads nowhere, but a curious sort of imaginary impersonation, which leads everywhere.
It is one thing to read “The Old Man and the Sea,” for instance, when you are 15 and the world lies ahead of you in all its endless possibility. It is another to read it in middle age, when a few big dreams may have died, and by “a few dreams” I don’t just mean catching a big honking marlin off the coast of Cuba, although sure: that too.
It works the other way as well. There are some books you should read only when you are young.
In exploring the creation of art and its purpose, and the authorial risks of cultural appropriation, Halliday has produced a skillfully executed, layered work — a novel about writers engaged in, and contemplative about, the act of writing — without the self-consciousness and overt intention that burdens so much metafiction.
Most people relish voyages, can’t wait to get out of town to lighten winter blues. But the impulse to hit the road never seized me. I gulped my sanity in doses close to home. Travel: I saw it as the contrary of place-based affections. Journeys often unmoored my ship of state. My body could draw to a halt in the face of racing contemplations. I was a thing that kept on thinking.
“I have traveled widely in Concord,” Henry David Thoreau wrote, a wry lift of his lip curling the page. Explore “your own streams and oceans,” he advised. Learn the joys of voyaging at home. His peer, Margaret Fuller, drowned at sea when she blundered from her homeland soil. She shipwrecked with her Italian family. She never learned the benefits of staying put.
Recently, though, I’ve been increasingly interested in reading against my tendencies, rather than solely into them. Whether or not I love a piece, I’m always invited into a previously unconsidered perspective; it’s not hyperbole to say it enlarges my view of humanity.
And then a conversation on Twitter sparked an epiphany: I saw a poetry conversation (fostered by Paige Lewis and Kaveh Akbar) about the way long poems insist on taking up space, taking up room, and can thus be seen as powerful political gestures, especially for writers of color, queer writers, and women writers. This notion struck me powerfully. I’ve been wired since girlhood, by factors ranging from my Catholic upbringing to low self-esteem, to shrink, to make myself smaller, to avoid bringing attention to myself. Perhaps, I decided, my comfort with small poems had underpinnings I should interrogate.
Laura Freeman was first diagnosed with anorexia aged 14. A decade later she had begun to rebuild her life but still struggled with her attitude to food, eating small portions of the same thing for months on end. “At 24, I’d got to the point where I was recovered enough that I could eat, but only in a very formulaic way,” she says. “I had a pretty boring diet. It was more about getting through each day.”
Then one day she read a passage in Siegfried Sassoon’s 1928 Memoirs of a Fox-Hunting Man describing “a breakfast of boiled eggs eaten in winter”. It changed everything.
Young's story is one of three detailed pictures from across the country that Eubanks draws to illustrate that automated systems used by the government to deliver public services often fall short for the very people who need them most: an effort to automate welfare eligibility in Indiana, a project to create an electronic registry of the homeless in Los Angeles, and an attempt to develop a risk model to predict child abuse in Allegheny County, Penn.
Behind the examples in this thoughtful book lies the realization that the relationship between the spectator and art is inevitably complex. After all, we can’t return art or history to a lost past; as the architectural historian Sigfried Giedion observed: “The backward look transforms its object. ... History cannot be touched without changing it.”
Vaughan’s technique is absurdly simple: He uses a saw to chop healthy hard coral pieces into much smaller fragments; these grow back extremely quickly atop small concrete plugs and are then replanted in the sea. In essence, he’s created a sea-life version of Mickey Mouse’s broomsticks in the Sorcerer’s Apprentice. Smash them up, then watch them come roaring back with a vengeance.
The technique is a vital one for the field. Coral’s biggest problems might be warming seas and rising acid levels, but those are magnified by a sad fact of life for corals: They aren’t very good breeders. “We actually didn’t know how corals reproduced until the 1980s,” Vaughan said. That’s because, as if adhering to some dirty fairy tale, corals breed only a few days a year, en masse, for around 30 minutes, shortly after the full moon in August, when they simultaneously fill the sea with their white, snowy-looking gametes in a single, very unkinky orgy.
The only sure thing about covering North Korea is this: The lead is always clear, the second sentence isn’t.
Over the past week, the latest twist in seven decades of inter-Korean conflict unfolded before thousands of journalists. After sharply accelerating its pursuit of nuclear weapons last year, North Korea seized an offer from South Korea to show a different side at the Winter Olympics, one of the globe’s biggest media events. It sent a handful of athletes, 229 cheerleaders, a music group, dozens of coaches and minders and Kim Yo Jong, the sister of dictator Kim Jong Un and the first member of the Kim family to ever visit South Korea.
And suddenly, the dominant narrative flipped. The tyrannical North Korean regime was…okay? But was it really? Was everyone, including the international media, being outsmarted? What was the regime really doing?
But the hunger for that older way of doing things persists. Movie studios and television networks are soulless, monstrous entities, ravenous heads of a corporate hydra. A Broadway theater is an empty shell. There is not much we see that commands our loyalty, or inspires our solidarity. The unsatisfied, atavistic part of ourselves that harbors a dim memory of those wondrous nights in the village square experiences a special frisson, a jolt of recognition and excitement, when we witness the work of players who seem loyal to one another.
This is why we react with a special kind of excitement when we encounter what looks like the work of a genuine troupe in the crowded, highly mediated, aggressively monetized postmodern landscape where we scavenge for beauty, fun and enlightenment. What connects certain television shows, movies and stage productions to ancient folkways is a particular blend of novelty and familiarity. You see the same faces again and again in new disguises. The afternoon’s clown is the evening’s tragic hero; yesterday’s princess is tomorrow’s wicked stepmother.
Among Irish people alive today, my mother is rare in that English was not her native tongue. She was born in the Gaeltacht, the ever-waning sliver of the country where Irish is spoken daily, in a place whose name translates to “Step of the Deer,” after the moss-obscured legend of a stag’s miraculous leap in flight from the hunting hordes of ancient days. In the village of her birth, speaking Irish was unremarkable, as water is unremarkable to the fish that breathe it.
But when as an adult my mother moved to America, there was no one to speak her native language with, and a curtain fell across this part of her life; slowly, by degrees, the words faded from her mind as dew goes from the grass. By the time my siblings and I were born, I doubt she ever seriously entertained the notion of teaching it to us. As I grew older, old enough to register a lack, I would ask her about this and receive a fatalistic response: “There’s no point in learning it,” she would answer. “It’s going extinct.” Our mother’s first language became to us like an heirloom once treasured but now lost, or like a member of the family who remained now only in sepia pictures: a wedding photo, a grainy beachside snapshot, a picture of childhood sport, the laughter and movement rendering their features nothing more than a blur.
Ground Work is an extraordinary and life-affirming book. Perhaps its greatest value lies in the multiplicity of ways in which its contributors connect and communicate with the natural world and with the places and people about them. One doesn’t need to be a farmer, or a conservationist, to justify a relationship with the wild. We just need to learn to look properly, and to find the common ground.
If the urbanistic publishing-academic complex substituted the words “the world” for “the city”, the grandiosity of their projects would immediately be exposed as preposterous. But writing about cities has purpose, if it stays close to specific cases. Building and Dwelling, subtitled Ethics for the City, doesn’t escape all the traps that go with writing on “the city”. It becomes so general that the complexity it champions gets simplified. It’s when he gets specific, intellectually and observationally, that Sennett’s insights are worth reading.
She’s careful not to suggest that it’s all nature over nurture. But her view that women are generally more conscientious and collaborative than men, if less openly ambitious and confident, feels surprisingly conservative in a book about turning conventional thinking upside down. For women who want to smash down the boardroom door, this is a terrific read. But smashing the system? That, it seems, would be a step too far.
What happened, in a prominent strain of American crime films beginning in the 1940s, was a coalescing of narrative and stylistic influences that included the Anglo thriller, the American vernacular school of hard-boiled crime writing (whose best-known representatives were Dashiell Hammett and Raymond Chandler), and a chiaroscuro visual style that owed no small debt to the German Expressionism of the 1920s — a legacy that could be accessed with relative ease, as many of the leading lights of the German film industry had decamped to Hollywood in advance of or during the European political catastrophe of the 1930s. (Hitchcock, it should be mentioned, had himself been profoundly shaped by the time he spent around German studios in the 1920s.) In his 1972 essay “Notes on Film Noir,” Paul Schrader defines the emergent noir style through its propensity for nocturnal settings, “oblique and vertical lines,” flashback-heavy narratives, and “compositional tension,” as well as that oft-cited, catch-all symptom of “postwar disillusionment.”
None of this is accidental. K-pop has become the international face of South Korea thanks to an extremely regimented, coordinated production system. More than any other international music industry, K-pop has been strategically designed to earworm its way into your brain — and to elevate South Korea and its culture onto the world stage.
How did we get here? Through a combination of global political changes, savvy corporatization and media management, and a heck of a lot of raw talent being ground through a very powerful stardom mill.
Don’t Skip Out on Me shines a light on the broken-down and the drifters; it is a bruising yet surprisingly tender study of the need for human connection, and the way that urban landscapes can be more isolating than any wilderness.
The Apple Store captures everything I don't like about today’s mall. A trip here is never easy—the place is packed and chaotic, even on weekdays. It runs by its own private logic, cashiers and help desks replaced by roving youths in seasonally changing, colored T-shirts holding iPads, directing traffic.
Apple operates some stand-alone retail locations, including a glass cube entrance in midtown Manhattan and a laptop-shaped location on Chicago’s Michigan Avenue. But a lot of the stores are located in shopping malls. The Apple Store is one of the only reasons I go to the mall anymore. Usually I get in and out as fast as I can. But today I’m stuck.
When all is said and done, it turns out to be a strange relief. Contrary to popular opinion, malls are great, and they always were.
“You instigate valiantly, and then second guess. … You know your problem? No follow-through.” Emperor Georgiou (Michelle Yeoh) hisses these words to Michael Burnham (Sonequa Martin-Green) in the final episode of the first season of Star Trek: Discovery, in a moment so meta it nearly leaps off the screen. What has happened to the series that, when it reached its midseason finale last November, was not only the best opening season of any Trek series, but easily the best run of back-to-back episodes the Star Trek franchise has ever produced? What went wrong? How did the promise of those early episodes land us here? Discovery, like Star Trek: Beyond before it, is fine, I guess — but when Captain Gabriel Lorca (Jason Isaacs) typed those bogus coordinates and hijacked the ship last November, he hijacked the series too. Nothing has been quite right since.
The problem truly is follow-through. Since November, Discovery has careened from one ill-advised “shocking” plot twist to another — Mirror Universe! Secret Klingons! Neck Snap! Evil Captain! Another Evil Captain! Oops, Genocide! Uh, Nevermind! — without doing the deeper work necessary to make each twist sensible either on its own terms or in terms of the larger Star Trek mythos of which this is, ostensibly, an important but heretofore hidden chapter. In fact, what has typically happened is that the Big Twist has served as the opportunity to summarily eject a previously established plot line from the series. The discovery that Ash Tyler is really Voq, or that Lorca is really Mirror Lorca, or that Emperor Georgiou’s plan is to destroy the Klingon home world — each of these revelations serves not as the springboard for new narrative developments but rather as that story arc’s unnaturally abrupt conclusion.
On my last visit, I saw students studying, a retiree reading Shakespeare out of a big leather-bound edition, a family filling out visa applications they printed, and a kid in headphones making beats in Ableton. This is a nice representation of the world as I wish it to be—all creation, appreciation, education, and exploration. The library is what brought them together, and it asks for nothing back. Its purpose is fulfilled by all of us using it. That means, I think, that the library is one of the best places to get a real and generous sense of the city. How does a city wish to be? Look to the library. A library is the gift a city gives to itself.
This all sounds very rosy and pure, but the first visit to the library as an adult can be a little unnerving. It feels like you are doing something wrong by being there. What’s the catch? Is it a trap? How often is nothing expected of us? It is so rare in New York (and in many other places), because our presence is expected to be the start of a transaction. New York City has some of the most expensive real estate in the world, so every square inch must be monetized. That is why it is so special that this big, beautiful building is plopped right in the center of everything. Just a few blocks away from the mania of Times Square, Samuel Tilden’s gift sits waiting for anyone who wants to open it.
Despite the abundance of quit-lit out there, we’re still not, as a community of scholars, doing a great job dealing with this thing that happens to us all the time. The genre is almost universally written by those leaving, not those left behind, a reflection of the way we insulate ourselves from grappling with what it means for dozens, hundreds, thousands of our colleagues to leave the field.
Quit-lit exists to soothe the person leaving, or provide them with an outlet for their sorrow or rage, or to allow them to make an argument about what needs to change. Those left behind, or, as we usually think of them, those who “succeeded”, don’t often write about what it means to lose friends and colleagues. To do so would be to acknowledge not only the magnitude of the loss but also that it was a loss at all. If we don’t see the loss of all of these scholars as an actual loss to the field, let alone as the loss of so many years of people’s lives, is it any wonder I felt I had no right to grieve? Why should I be sad about what has happened when the field itself won’t be?
When I first mentioned my idea of writing a memoir to David Carr, he told me that I needed to “visit a foreign land where writers live.”
“That bag of tricks of journalism — anecdote, blah bidy blah, flick of the smarty pants here and there, juicy quote, more blah bidy blah, which you and I own and know, is no help here,” The Times’s beloved media columnist emailed me weeks before he died.
Three years later when I sat down to write my first book — a memoir about how covering Hillary Clinton’s presidential campaigns consumed the formative years of my 20s and 30s — I wished I could ask David what he meant by that foreign land. How did I get there?
“The first duty of love is to listen,” Paul Tillich advised, and the attendant duties of love are also the real subject of Li-Young Lee’s stunning sixth book, The Undressing. No surprise then that “Listen” should be the first word of this collection. The lengthy, opening title poem begins with the speaker apparently more interested in undressing his lover than he is interested in listening to her: “I unbutton the top button of her blouse/and nibble her throat with kisses/Go on, I say, I’m listening.”
It is a bold move to open with such a sensual poem, but it is quickly clear this poem is not “just” a love poem. It’s also about the personal relationships between author and speaker(s) and audience, between the present and the past, and even between human beings and language, since “The world/is a story that keeps beginning” and every word “has many lives.” Because this collection is obsessed with language—its necessity and its limits—Lee brilliantly engages with a polyphony of voices to undress (and address and redress) words. It is an intensely personal investigation, a truly great mind in dialogue with itself and with the universe.
Why does the experience of being lost feel so valuable? It seems to me that this is one more myth about how people are supposed to engage with the city: to give over to discovery, imagination, and self-reliance. All of these, the myth assumes, lurk in the unknown and unplanned, not in the daily commute from A to B and back again.
And yet, individuals’ different understandings of geography are so subjective, founded on observation and experience and desire. My map of Paris may look nothing like yours, or like Google’s. But when I’m following it I’m neither lost nor found. I’m simply attuned to my city. Getting lost is not as important as being alert to the world, even when you know where you’re going.
The crimes and misdemeanors language perpetrates against music are many and various, but one offense is more insidious than most, simply for being so insignificant. It’s a preposition. In English, invariably, we listen to a piece of music. Never with a piece of music.
That little rut of syntax conceals a speed bump on what seemingly should be a musical express lane: the generation of empathy. Empathy is something music can and ought to steadily, even effortlessly create. Performing music, particularly in any sort of ensemble, large or small, exercises the muscles of empathy like no other. But even just listening to it should give empathy a boost, one would think. Name another art form that so regularly launches even its most historically, culturally, and ethnologically distant artifacts into newly immediate vitality, again and again.
The new American man, in other words, is more likely than ever before to be a capable home cook; maybe you’ve even read about him, in Jessica Pressler’s memorable 2015 introduction to the sous vide-loving dude foodie—the “doodie”—or perhaps in stories about how men’s increasing interest in cooking is making the kitchen the new man cave.
As men discover kitchens, kitchens have been quietly discovering men. Take a look at any roundup of the kitchenwares every man should own—the kitchen “tools” and “gadgets,” that is, or “essentials,” a favorite man-brand euphemism for “accessories.” For one thing, you’ll notice a lot of kitchenwares now have the stark, clean, neutral-masculine palette of brushed chrome and matte black as a default. (If there’s a dudely analog to “shrink it and pink it,” it’s something like “steel it, matte-black it, and make it heavier.”) Both appliances and the kitchens they fill have evolved around the men who now inhabit them—even if appliance brands often would prefer not to talk about it.
Few novels have had such mythical beginnings, and few have themselves achieved the status of myths, as “Frankenstein” has. It was the founding text of modern science fiction. It has been endlessly retold in different forms—perhaps only Emily Brontë’s “Wuthering Heights” and Bram Stoker’s “Dracula” have proved as fertile. Each generation of its readers finds new allegories for the anxieties and ambitions of what they take for modernity; the monster each sees is a reflection of themselves. Yet at the heart of the story, as of Mary’s biography, were primeval sadnesses and fears.
An amateur-investigator story, a black comedy, a family saga, The Hoarder knots together a number of genres, but with Bridlemere at its centre – part Bluebeard’s castle, part fly-tipped Manderley – its roots lie in the gothic tradition.
Twenty years ago, in the parking lot of a Cirque du Soleil show at Santa Monica Beach, I saw in the dust an antique diamond engagement ring. Of course I picked it up, all tiny diamond and huge ring size, but the mystery took hold of me: who was its owner, what was her story, and did she mean to throw away her marital promise ring?
“Look at this!” I said to my new husband James. We’d only recently found each other, were instantly simpatico, and had married at nearly first sight.
“Are you sure you want to mess with that?” he asked. “That’s somebody’s magic, you know, sitting in the dirt.” He was always talking about somebody’s magic, and messing with it.
“I do!” I gleaned, and pocketed the sweet thing.
Some readers think fiction writers garnish the truth with a sprig of parsley and pass it off as a story entrée. I’ve heard it all. Fiction is roman à clef. A first novel is always autobiographical. The protagonist’s problem is a stand-in for the author’s neurosis. Of course, novelists routinely protest too much. Fiction writers make things up, and it can be difficult work, we claim, because these things must be summoned from somewhere beyond memory and history and then shaped into narrative. However, somewhere between the accusation and the defense, there’s the story, and sometimes a writer’s biography may need to be considered. When it comes to reading Junichiro Tanizaki’s 1928 novel “In Black and White,” translated beautifully by Phyllis I. Lyons, an emeritus professor of Japanese language and literature at Northwestern, it may help the reader to know that there was an actual death associated with Tanizaki’s literary murder story.
Certain issues have become so noisy and stigmatized that they seem to be all-consuming and invisible at once. Abortion is one of them, and Katie Watson wants to change how Americans talk about it — when, that is, they deign to truly talk about it at all.
Watching my son refuse food sometimes feels like payback for the trouble I caused my family. He is not polite in letting us know how revolting he finds a dish he has not even deigned to taste. I have lost much of the pleasure I used to take in cooking, frustrated by having my efforts in the kitchen treated with reliable disdain. His kindergarten teachers rave about his creativity and kindness, but then, with a lowering of the voice, remark on how poorly he eats compared with the other children. His grandparents prepare him meals out of special children’s cookbooks, and look on with barely disguised concern as he rejects the spinach lasagna or broccoli bake the author assured them would be a hit. My husband and I have taken to opening kids’ cookbooks, staring at the photos of Things That Are Not Plain Pasta, and laughing the hollow laugh of the defeated.
Still, the boy grows. He has boundless energy. He is clever and fun and loving. There is nothing visibly wrong with him. His doctor is unconcerned. When I see people try to cajole him into acting like a normal hungry child, I feel like I am the only person who really understands him, his one ally in a world of robust and unquestioning eaters. I know the frustration of being browbeaten into eating something with a texture or smell I couldn’t bear, of staring down a plate of unfinished food for hours. I recognize his stubbornness, the way he turns down even a food he loves if he feels he is being coerced. I resent that his eating habits so often overshadow his many good qualities, as though this one flaw weighed heavier in the balance than his curiosity, empathy, or devilish grin.
Their story was clear, and it was a strange and compelling tale. Around 600 A.D., Pope Gregory the Great decreed that fetal rabbits, or laurices, were not meat, and could be eaten during Lent, when meat was not allowed. Monks in France — where else? — quickly saw an opportunity and began to keep and breed rabbits as a meaty non-meat to nourish them through a cold and fishy Lent.
Lent, a period of penance and self-denial for many Christians, begins Wednesday, but anyone to whom this story suggests a new menu should stop right there. Apart from the scarcity of laurices at the supermarket these days, the whole story is wrong, according to a new scientific report. “None of it is even close to being true,” said Greger Larson, one of the main authors of the report debunking the myth.
One cloudless Saturday morning, last February, I sat alone on a bench, in Long Island City, waiting for the bus. The street felt artificially still, like a stage set. I had forgotten to listen to my boyfriend tell me where he was going—to the office, maybe? I was distracted by the book I was reading: “The Idiot,” by Elif Batuman, in which the main character, Selin, is in her freshman year at Harvard and in love. It’s “an amazing sight, someone you’re infatuated with trying to fish something out of a jeans pocket,” Selin thinks at one point. And later: “It felt insane to make a plan to do something after I was going to meet with Ivan—like making plans for after my own death.”
Capitalism’s genius for absorbing and integrating every challenge to it is on vivid display in this thoroughly absorbing history. Behind familiar brands like Stonyfield (now a subsidiary of Danone) or Cascadian Farm (now part of General Mills) stand hippie ideals as well as pioneering organic farmers. As Gene Kahn, the hippie-farmer founder of Cascadian Farm, told me, somewhat ruefully, after he sold his company to General Mills, “Everything eventually morphs into the way the world is.” True enough, and yet the world is also changed in the process. Hippie foods may have been absorbed into the mainstream, and to an extent hippie farming too, but the big hippie idea about food — that our eating has moral, ethical and political implications — has lost none of its power, and continues to feed a movement.
For all that this sort of thing keeps you on your toes, The Melody sometimes threatens to become (almost literally) a shaggy dog story, with the novel’s central conflict between profit and justice settled offstage rather than in the hinted-at grandstand finish. Yet the book retains a lingering power – not least in Crace’s gentle reminder that, although the personal may well be political, it’s often easier to pretend otherwise.
Unlike Kant and the other high priests of the Enlightenment, today’s rationalist somehow has to make do without God or unfolding historical logic. This (as Nietzsche noted) makes science harder, not easier. The heroic ethos of science, of progress, is to carry on regardless, even in the knowledge that entropy will eventually win. Perhaps making this argument makes me a “leftist intellectual”, but I couldn’t help finding it a more appealing – even affecting – ethical pitch than the triumphalism that announces that the good guys have already won.
If you live in a city today, you will have to think about malls. Urbanization will go on for some decades before it flattens off, “once Africa completes the industrialization process started by Britain in the 19th century,” per Stephen Cairns of the Future Cities Laboratory. At least until then, the planet will have new malls.
Dealing with this new urban form requires vigilance. Even if you never experienced the Jane Jacobs fantasy of a neighborhood street ballet, it’s draining to give in to city life predicated on spending. Resistance is difficult: spend more time in houses (although city homes tend to shrink or drift further from the center over time)? Seek out parks (even as the climates of many cities get more inhospitable by the day)? But the efforts count, especially in scale: “the freedom to make and remake our cities and ourselves is […] one of the most precious yet most neglected of our human rights,” as the leftist geographer David Harvey once said.
Without rejecting the language claim outright, I’d like to venture a new defining feature of humanity – wary as I am of ink spilled trying to explain the folly of such an effort. Among all these wins for animals, and while our linguistic differences might define us as a matter of degree, there’s one area where no other animal has encroached at all. In our era of Teslas, Uber and artificial intelligence, I propose this: we are the beast that automates.
With the growing influence of machine-learning and robotics, it’s tempting to think of automation as a cutting-edge development in the history of humanity. That’s true of the computers necessary to produce a self-driving car or all-purpose executive assistant bot. But while such technology represents a formidable upheaval to the world of labour and markets, the underlying point of these inventions is to achieve a goal nearly as old as our very first tools: exporting a task to an autonomous system that can complete it without continued human input. This might seem like something far beyond the reach of all but the most modern humans, but it’s actually a thread that runs through human history.
Lord Chesterfield called the novel “a kind of abbreviation of a Romance.” Ian McEwan described the more compact novella as “the beautiful daughter of a rambling, bloated, ill-shaven giant.” William Trevor considered the short story “essential art.” Writing a story, he said, is infinitely harder than writing a novel, “but it’s infinitely more worthwhile.” And now we have the even shorter story, a form that was validated, if it needed to be, when Lydia Davis, whose stories are sometimes a sentence long, was awarded the 2013 Man Booker International Prize. In their citation, the judges said of Davis’s works: “Just how to categorize them? They have been called stories but could equally be miniatures, anecdotes, essays, jokes, parables, fables, texts, aphorisms or even apothegms, prayers or simply observations.”
The short-short story is narrative (or it’s not) that is distilled and refined, concentrated, layered, coherent, textured, stimulating, and resonant, and it may prove to be the ideal form of fiction for the 21st century, an age of shrinking attention spans and busy and distracted lives, in which our mobile devices connect us to the world as they simultaneously divert us from it. And on the screens of our smartphones and our iPads and our laptops, we can fit an entire work of flash fiction. It’s short but not shallow; it’s a reduced form used to represent a larger, more complex story; it’s pithy and cogent, brief and pointed, and like the gist of a recollected conversation, it offers the essential truth, if not all the inessential facts.
My first approach to a cover always begins in my sketchbook. After reading the manuscript and taking brief notes, I begin drawing small thumbnail versions of loose cover ideas to work out composition and placement. This helps me to avoid getting mired in unnecessary details too early in the process. There was so much that I wanted to capture on this jacket; this was definitely an instance where I had to break it down to one overarching mood: I chose fa-bu-lo-si-ty.
Voguing obviously came to mind—could I represent the graphic hand movements cleanly? Would it be readable and tell the reader exactly what he or she needed to know right away? I found the work of an illustrator named Blake Kathryn who creates these dreamlike 3D illustrations, and while I loved that these figures appeared to be voguing, something about the digital rendering looked a bit too sci-fi.
The hamburger, long the all-American meal, has always contained an element of instability, not only because it can rot. From references in popular culture to investors like Bill Gates seeking to find the non-animal burger that can feed the world, the burger’s identity is as malleable as that patty of protein itself before it is thrown on a grill. Perhaps both the burger and the citizens it feeds are changing.
Curling is absolutely the best sport to watch on television, particularly for viewers looking for an escape from the frantic "more, faster, bigger, higher" grind of most televised games. Watching basketball or hockey can get you so hyped up, you feel like drinking a Red Bull and doing jumping jacks. Watching curling makes you want to drink a glass of red wine and lie down on the shag carpet. Curling is deliberate. Thoughtful, even. The games move very slowly. The players spend a lot of time talking strategy. There are nods and quiet words of encouragement; rarely are there disagreements. When it comes time for a team member to play their turn by sliding a stone down the ice, the moves are elegant. There's a wind up, a push-off, a slide, and a gentle release. Such poise and finesse!
In death there is no longer any control over your life and your work; this control we call privacy. For the famous, this means everything is published: the writer’s journals, the director’s cuts, and even the secrets of the actress’s internal organs, which contained, we are told, traces of cocaine, heroin, and MDMA.
Autopsy: literally, to see for one’s self. The dead are pried apart, defenseless before our judgment.
For the ordinary, what is flayed open is of much more private interest: words written down, things squirreled away, photographs, possessions, browsing history, texts, unposted selfies. These too are published, but only in the sense that there are no more additions, no changes. The work of living has stopped, and only these impressions remain.
As in the work of Anne Tyler (a writer Jones loves, and who also favors one city, in her case Baltimore, above others in her work) the apparently “gentle” themes of Jones’ novels — the interior lives of women, explorations of the American family and relationships — can be deceptive, as can the simplicity of her prose. Jones places black people at the center of her stories (without making them elaborate race ambassadors) while doing subtle work in parsing and interrogating those intimate themes.
And now An American Marriage, with its ruminations on masculinity, married life, and what constitutes marital debt, manages the trick of arriving at the right time while also feeling utterly untethered to just one era. If Silver Sparrow was mainstream America’s introduction to Tayari Jones, her newest novel could be the one to propel her into its heart.
It’s safe to say that few of us stop and marvel at the extraordinary progress that humankind has made in the past couple of hundred years – a mere blink of the eye in evolutionary terms. Instead we’re more likely to lament the state of the world, deplore the ravenous nature of humanity, rage at the political and financial elites and despair at the empty materialism of consumer society.
But for Pinker, that’s an indulgence we can no longer afford. His book is a sustained, data-packed argument in favour of the principles promoted by the Enlightenment, “The Case,” as its subtitle puts it, “for Reason, Science, Humanism, and Progress.”
Halliday has written, somehow all at once, a transgressive roman à clef, a novel of ideas and a politically engaged work of metafiction. “Asymmetry” is extraordinary, and the timing of its publication seems almost like a feat of civics. The effect on the reader feels identical to the way Ezra describes a piano suite by Isaac Albéniz, which he selects near the program’s end. “Each of the pieces builds on the last,” he says. “They’re discrete and yet all the richer for being heard together, and you just ache with the mounting intensity of it.”
“How to Turn Down a Billion Dollars” ably if uncritically chronicles the short history of a young company catering to young users, with a young chief executive, and reveals, intentionally or not, the limitations that come with that combination.
Sure, PowerPoint presentations have displaced chalkboards; enrollments in massive open online courses often exceed 100,000 (though the number of engaged students tends to be much smaller); and “flipped classrooms” replace homework with watching taped lectures, while class time is spent discussing homework exercises. But, given education’s centrality to raising productivity, shouldn’t efforts to reinvigorate today’s sclerotic Western economies focus on how to reinvent higher education?
One can understand why change is slow to take root at the primary and secondary school level, where the social and political obstacles are massive. But colleges and universities have far more capacity to experiment; indeed, in many ways, that is their raison d’être.
I had assumed that Terayama’s spectacles were the madly exaggerated, surreal fantasies of a poet’s feverish mind. But what astonished me about Tokyo, on visiting it for the first time, in the fall of 1975, was how much it resembled a Tenjo Sajiki theatre set. There was something theatrical, even hallucinatory, about the cityscape itself, where nothing was understated: products, places, entertainment, and fashion screamed for attention. In Tokyo, it seemed, very little was out of sight.
I never thought that I could be Japanese, nor did I wish to be. But I was open to change. This meant, in the early stages of my life in Japan, almost total immersion. In the first few weeks, I walked around in a daze, a lone foreigner bobbing along in crowds, taking everything in. I walked and walked, often losing my way in the maze of streets in Shinjuku or Shibuya. Much of the advertising was in the same intense hues as the azure skies of early autumn. And I realized now that the colors in old Japanese woodcuts were not stylized at all, but an accurate depiction of Japanese light.
Change, then, isn’t just a constant — it’s a source of hope. Horn’s all-encompassing vision embraces the potential of technological advancement, the bugbear of How to Stop Time, and while Haig’s novel fetishizes the past, Eternal Life is resolutely forward-looking — it even features a crucial plot point that involves a cryptocurrency mining rig. At the end of the book, Rachel finds herself holding a newborn in one hand and a smartphone — that symbol of our age — in the other, awash in an unusual sense of peace and possibility. And she awaits a man who “had run out to pick up a few necessities, a strange thing that young men now seemed to routinely do for their wives and children, along with dozens of other tasks she had never seen any man do, like vacuuming a rug or emptying a dishwasher.” And that, if nothing else, is a hopeful indication indeed that things can change for the better.
For most of my life, I knew of just one photograph from my parents’ wedding. It wasn’t even from the actual day. A few weeks after they’d gone to Cambridge, Massachusetts, City Hall, a friend hosted a small celebratory dinner party, which included a cake decorated with edible nasturtium flowers. They are pictured there, my mother, seated, laughing in white, a crown of flowers in her dark hair. My father stands behind her, smiling in a dress shirt, vest, and tie, and wearing tinted glasses, his curly hair combed out like a face-framing halo in a fourteenth-century fresco. Their clasped hands hover in the space between them like a third presence.
The photograph is washed out from over-exposure and many years of handling. That it was the sole memento of the occasion was, to me, a source of pride—the preciousness imbued by its singularity, the lack of egotism it implied. They must have been so happy they forgot to take pictures! Also, they are gorgeous, like late-Seventies movie stars. I imagined the snapshot born of a briefly remembered duty to document the event before they returned to celebrating.
“Unhappy individuals who hope never have the same pain as those who remember,” wrote Søren Kierkegaard, the Danish philosopher of whom Chidi is so fond, in Either/Or. In The Good Place, there is no memory — or, rather, memory is limited to whatever Michael chooses for you, as he can reboot the characters at any time, entirely erasing their memories. There is a kind of pain that they get to escape by having no memories — and one of the great questions of Judeo-Christian theology is how, exactly, God remembers. God is a God of remembrance — He remembers His people, their trials, their actions, their sins. Memory brings pain. There is no getting around the fact that an omniscient God knows and recalls all that we do, which is a terrifying prospect.
What are we to make of that? Does The Good Place get it right? It might, although in the book of Isaiah it is God, not people, who forgets in heaven: “I, I am He who blots out your transgressions for my own sake, and I will not remember your sins.” In this way, God’s forgetting — His choice not to remember — is the very thing that saves people from their sins and allows them into heaven.
One question I’ve often been asked since I started learning Korean is: do the two halves of the peninsula speak the same language? The answer is yes and not quite. Yes, because division happened only in the previous century, which isn’t enough time for mutual unintelligibility to develop. Not quite, because it is enough time for those countries’ vastly different trajectories to impact on the language they use, most noticeably in the case of English loanwords – a veritable flood in the South, carefully dammed in the North. The biggest differences, though, are those of dialect, which have pronounced regional differences both between and within North and South. Unlike in the UK, a dialect doesn’t just mean a handful of region-specific words; conjunctions and sentence endings, for example, are pronounced and thus written differently. That’s a headache until you crack the code.
At the heart of Winterfolk is the hope that Rain, through seeing the greater world, will gain the ability to make her own choices and move forward with her life. In this sense, it is not dissimilar from the fairy tales that Rain reads again and again. Her journey leaves her changed, and more aware of who she truly is. It's rare to find a book that is so gentle and so brutal at once, but Rain will take your hand and show you the way through.
By adopting the trappings of science, the forensic disciplines co-opted science’s authority while abandoning its methods.
These overblown and largely imaginary numbers—and forensic testimony offered with the certainty O’Neil claimed—are dangerous, because they give a false sense of scientific precision to juries and contribute to wrongful convictions. When examiners testify that they can make a match “to a reasonable degree of scientific certainty,” they are making what sounds like a statistical statement. Dr. Searls’s point was that they hadn’t done the studies required to back up such a statement, so there was no way for O’Neil to support his claims—the “match” was simply what he subjectively judged to be true.
The fact is that it takes time to measure time; the challenge of Olympic timing through the decades has been to make that measurement as quickly as possible. Watches capable of discerning hundredths of a second were in regular use in the Olympics by 1948. But what good is such refinement if, when an athlete crosses the finish line, the judge drops a tenth of a second or more merely clicking the stopwatch? (Human thought takes time to propagate and enact, too.) The weakness of this link became terribly apparent during the 1960 Summer Olympics, in Rome, when two swimmers, the American Lance Larson and the Australian John Devitt, seemingly tied in the hundred-metre freestyle. A half-dozen judges, peering through the waves at the finish, reached a stalemate: three declared Larson the winner, the other three Devitt. Though Omega’s stopwatches indicated that Larson had the faster time, by at least a tenth of a second, a referee broke the tie and awarded Devitt gold.
Some observations are unsubtle and the metaphors are occasionally overcooked. But these are forgivable blips in a book with the compassion to capture the loneliness of a trans woman with AIDS who rides the subway at rush hour to feel the warmth of “human bodies all against her”, and the sensuousness to convey the beauty of young gay lovers mimicking Fred and Ginger on a hot rooftop as the sun sets. The New York of “The House of Impossible Beauties” may not warrant much nostalgia, but it is a moving place to visit.
People move in and out of the narrative with their own baggage and preoccupations. What they choose to tell us is very subjective and not always directly relevant, and this clamor of voices gives the novel satisfying depth and texture. There’s a sense here that we’re brushing up against many lives, many versions of the truth.
In this last subset of moments—the quick ones, the instants, the markers of some kind of change—perhaps there is something to understand about how we perceive time. The moment the wheel began rolling down a hill. The moment the woman finished her drink. The moment I woke up this morning. What are these moments? Are they singular, discrete? Are they all the same? When do they begin and end? How long is a moment?
Time, even in its most fiat-based forms, is a slippery beast that seems to defy our attempts to define, cage, contain, and regulate it. Bringing into focus something that is, at once, as precise and fuzzy as a moment means taking a closer look at what time is and how we understand it, through linguistic, scientific, neurological, and philosophical lenses.
Of all the wretched places to visit in New York, Planet Hollywood is king. The moldering eatery’s main entryway—beneath a colossal, glitzy sign that jostles for attention with Times Square’s other lurid neons—leads you to one of two elevators, their doors designed to mimic a subway car’s (as though the real and grimy thing were not a block or two away). One need not feast at a Planet Hollywood to know that the experience will be underwhelming and too expensive, that the earsplitting soundtrack will consist only of pop anthems and Disney theme songs, that there will be a weekly changing burger named the OMG! Burger, and that a visit to the gift shop will make you want to cry. A cursory search of online reviews confirms Planet Hollywood’s status as a dwindling brasserie chain attached to a substandard museum—a place that should no longer exist and yet seems to defy market logic. To quote a recent note on TripAdvisor, “The threat from dust falling from the above decorations was enough to put you off.”
But shortly after moving to America, and for reasons that now evade me, I began dining regularly—and with near-evangelical enthusiasm—at Planet Hollywood Times Square. (This is the city’s only branch, and it has lived here since 2000, after relocating from its original 1991 location on West 57th Street.) I have noshed on spinach dip served in a cocktail glass, and on a pizza whose pepperoni is glistening and wet. I have stopped in for drinks—some of the cocktails, by the way, involve bacon, some chocolate milk, and most have vaguely clever names like Eternal Sunshine, Hawaii Five Ohhh, There’s Something About Mary and Pineapple Express. A couple of titles are less divinely inspired, such as the Red Carpet Margarita. (Also available, for forty-two dollars a bottle, is Vanderpump Rosé, one of Lisa Vanderpump’s wines. If you actually want to get drunk, I recommend that—or a beer.)
Empty Set has important visual components, but first and foremost it displays Gerber Bicecci’s talent as a writer. The characters are rich and well developed, the mood is contagious, and the plot is simple yet intriguingly complex. The novel, which is achronological (although the shifts in time are so subtle that, at first, we barely notice them), unfolds in short, fragmented sections that are frequently punctuated by drawings, puzzles, and letters. But as Juan Pablo Villalobos, the talented author of Down the Rabbit Hole and Quesadillas, says in a review, Gerber Bicecci “does not shirk the narrator’s responsibility toward plot.”
Perhaps all this makes it seem like the novel is a political treatise, but it is Richard’s subtle transformation and Erpenbeck’s liquid prose style that make this book glow above and beyond the content. Erpenbeck’s mastery of language and image ripples through her pages.
The man felt like a speck in the frozen nothingness. Every direction he turned, he could see ice stretching to the edge of the Earth: white ice and blue ice, glacial-ice tongues and ice wedges. There were no living creatures in sight. Not a bear or even a bird. Nothing but him.
It was hard to breathe, and each time he exhaled the moisture froze on his face: a chandelier of crystals hung from his beard; his eyebrows were encased like preserved specimens; his eyelashes cracked when he blinked. Get wet and you die, he often reminded himself. The temperature was nearly minus forty degrees Fahrenheit, and it felt far colder because of the wind, which sometimes whipped icy particles into a blinding cloud, making him so disoriented that he toppled over, his bones rattling against the ground.
The man, whose name was Henry Worsley, consulted a G.P.S. device to determine precisely where he was. According to his coördinates, he was on the Titan Dome, an ice formation near the South Pole that rises more than ten thousand feet above sea level. Sixty-two days earlier, on November 13, 2015, he’d set out from the coast of Antarctica, hoping to achieve what his hero, Ernest Shackleton, had failed to do a century earlier: to trek on foot from one side of the continent to the other. The journey, which would pass through the South Pole, was more than a thousand miles, and would traverse what is arguably the most brutal environment in the world. And, whereas Shackleton had been part of a large expedition, Worsley, who was fifty-five, was crossing alone and unsupported: no food caches had been deposited along the route to help him forestall starvation, and he had to haul all his provisions on a sled, without the assistance of dogs or a sail. Nobody had attempted this feat before.
In retrospect, the months before held plenty of signs. The unusually hot spring night outside Rosa’s Pizzeria when I couldn’t finish my slice because the cheese kept stringing down my throat. Or my little sister’s twenty-third birthday dinner at an overpriced steakhouse when a bite of crab cake stuck, the crispy breadcrumbs scraping along that tender column. I knew I wasn’t choking—this was nothing like the wad of yellow rice that lodged halfway down at El Sombrero in college years before. That had been a plug, suctioned to the full circumference of my windpipe: a sudden and alien absence of air. This was stubborn grit in my esophagus.
The crab cakes were just the appetizer. When my entrée came, I nearly refused to eat. I can’t remember now what I ordered except that it had sounded simultaneously rich and refreshing, like chicken roasted in a lemon-herb cream sauce. But when placed in front of me, the dish looked the furthest thing from appetizing. My food looked dangerous.
Less than a month later, I would be unable to swallow solid food at all.
Larry is using his own body, and his ongoing struggle with Crohn’s, as an experiment. He keeps precise measures of his body’s input (what he eats and drinks) and output (the energy he burns and what he excretes—and yes, that is precisely what it sounds like). He undergoes periodic MRIs, has his blood and stool analyzed frequently, submits to annual colonoscopies, and has had his DNA sequenced. Among the things Calit2 does with all these data is create a stunning, regularly updated three-dimensional image of his insides, which he calls “Transparent Larry.” His colleague Jürgen Schulze projects it inside “The Cave,” a virtual-reality room that literally places the viewer inside the picture. Larry can not only chart the changes taking place inside his body; he can actually see them.
As a result, he arguably knows more about his own inner workings than anyone else ever has. His goal, as he puts it, is for each of us to become “the CEO of our own body.”
I write in praise of difficulty in writing—specifically difficulty in the novel form. Why? Well, not least because of the Modernist direction my own fiction has taken since I began my Umbrella Trilogy seven years ago—but also because I believe that in the contemporary era, with the novel under assault from digital text and other more compelling narrative forms, only soi-disant “difficulty” can preserve the form’s unique capability to both describe the contours of our brave new world, and enfold the reader in them. What has struck me about the reception of my own works—and those of others of the “new difficult” school of writing such as Eimear McBride—is that while critics may praise them, they feel compelled to place a health warning on our texts: “difficult,” with the minatory subtext—an echo, perhaps, of Der Steppenwolf—that “this book is not for everybody.”
The novel — told in chapters through the voices and letters of the main characters, whose regrets seep between the lines — unfolds seamlessly and naturally for the unsuspecting reader, whose perspectives on life and marriage, responsibility and survival, the challenges that break us and the ones that make us strong, are about to be disturbed, if not subverted. It asks the “Big Questions” that loom in the middle of the night when our squabbles have grown into hardened grievances and life has given us a backhanded slap, suddenly sending everything spinning out of our control and shaping it into something we never envisioned or imagined: What is marriage? What comprises a long-term relationship? How do two people weather the storms — or even trickier, how do three? What is fidelity? Responsibility? Commitment? How inescapable are the factors of race and the State, of history? How powerful is the sway of a white woman’s word over a black man’s life, and over the lives of all those he holds dear? It’s all there in this slyly intricate novel that explores not just the intersectionality of race, class, gender, and culture, but also the tenuous, unavoidable intricacies of family — intimate, extended, and newly discovered.
The story of this hateful barrier’s fall and the ensuing 28 years, two months and 27 days of German history is one of expanded individual horizons: it has meant previously unimaginable travel, enterprise, friendships and relationships (the proportion of German couples with one “Ossi” and one “Wessi” partner passed the 10% mark in around 2008). Among the touching reflections on the anniversary today have been social media posts to that effect by Germans speculating on how much poorer their lives would have been #ohneMauerfall (without the fall of the wall).
One experience is better documented than most. The wall’s construction in 1961 was Angela Merkel’s first political memory: “My father was preaching on that Sunday. The atmosphere was horrible in the church. I will never forget it. People cried. My mother cried too. We couldn’t fathom what had happened.” Just over 28 years later, working as a physicist in East Berlin, she was taking her regular Tuesday-evening sauna when travel restrictions were lifted. She later joined the crowds pouring across the border at the Bornholmer Straße bridge and, on the other side, wanted to call her aunt in Hamburg from a pay phone, but had no West German money. A woman who had dreamed of travelling to the West, perhaps to America with special permission on her retirement, would soon after plunge into the reunified republic’s politics and end up leading it. If she secures a fourth term as chancellor in the still-ongoing coalition talks, she could end up having done so for over half of its post-wall history.
One possible explanation for this is how old people are when they learn different kinds of languages. Small, in-group languages tend to be learned in infancy, when the brain is astonishingly adept at learning complicated linguistic systems and large lists of exceptions to the rules. But languages like English and Chinese are learned by huge numbers of people later in life, including adulthood, when the brain has become much less cooperative. So, one hypothesis goes, the features that are harder to learn in later life get lost from languages that are learned by plenty of second-language speakers.
But Reali, Chater, and Christiansen think we can make that explanation even simpler and not worry about when in life a language is learned. They point to evidence that vocabulary is easier to learn than fiddly grammar, and the team suggests that, when you combine this ease of learning with different population sizes, the results we see in languages around us fall out naturally.
Libraries are the last place in every town and city that people can simply exist. Every building one enters today comes with some expectation of spending money. Restaurants require paying for service. Shops require the intention of purchasing something. Houses require rent. Anyone who has lived near the poverty line, whether or not they have actually been homeless, has felt the threatening pressure toward expenditure that permeates the public spaces of modern Western culture. Even a free restroom is becoming difficult to find, especially as growing cities experience ever-increasing space constraints.
In a library, no one is asked to pay anything simply to sit. For those with few resources besides time, this is a godsend. Libraries are unofficial playgrounds for low-income families on rainy days, homeless shelters in cold months, reprieves from broken homes for grade-school-age children. They are the last bastions of quiet and calm where nothing is asked of one but to exist. Many arguments have been made about how the library is an outdated institution offering outdated services—that in the twenty-first-century how-to books on building sheds and daily newspaper copies are obsolete and the funding used for libraries ought to be reallocated to other programs. I can only assume that those who make such arguments are people who have always been comfortable with the expenditures it takes to move through the world, whose presence has never been questioned. For those people, libraries can be about books. But not everyone has the luxury of seeing past the space.
The Renovo Public Library, in North-Central Pennsylvania, isn’t a handsome wood or brick building on the town square. It certainly isn’t anything like the New York Public Library’s main branch, on Fifth Avenue, with its marble lion guards outside and palatial rooms and hallways. Instead, it’s a small, squat former auto garage built with concrete blocks painted white. The building was remodeled and opened in 1968, in a campaign led by a group of schoolteachers and local residents to obtain, for our remote end of the county, a branch library. It sits on a rise overlooking the Susquehanna river, at the end of a dead-end street, all but hidden from the currents of town life.
Frequently, I was the only person in the building, other than our librarian, Viv. I lingered there on drowsy after-school afternoons because I loved the sweet damp smell of paper and glue slowly decaying, because I loved pulling some forgotten old hardcover from the shelves, and because, simply, I loved being in a room filled with books. Two rooms, actually. There was a small-town stillness, an atmosphere of benign neglect inside our little library that suggested the great works of Western lit were mine alone to discover. A translation of the Greek epic poem the Odyssey had, according to its date-stamped card, an equally epic lending history: checked out twice in 1968, once in 1980, and again in 1992, before I came along, in March of 1994, when I was 17, and removed Homer’s masterpiece from its place for the fifth time in more than a quarter century.
I arrived at the artists’ residency late in the afternoon with a small suitcase, my computer, and my notebook. Located in the outer reaches of New Delhi, the residency offered space to eight writers and visual artists to work for up to two months. If you’d asked me, I’d have said I had big plans for my three weeks there, but the truth was my timing was off; I’d just completed a novel before coming to India to visit family and begin a new project. I needed to do research, but I wasn’t sure what that research would look like. Books? Interviews? Archival material? Uncertainty reigned. Furthermore, I hadn’t factored into my plans how helpful family members, with their insistence on driving me places or offering up their car and driver, often added time to the process, as I scheduled my outings around their busy schedules.
It was almost dusk when my cousin dropped me off at the residency. Dusk and dawn are the worst times to drive, with their shifting light and shadows, the contrast between the light sky and dark road hard on the eyes. Arriving at a new destination in this smoky light with its blurred edges and deep shadows induces a specific kind of terror. I’m not afraid of the dark, but I do have a terrible sense of direction and can’t read maps. For this reason, my first rule of travel is to arrive in the clear light of day. I orient myself by landmarks; in the dark, I grope around like a blind person with no guide, off course and confused.
Kass has long believed we face a worrisome “lack of cultural and moral confidence about what makes a life worth living.” His new book, Leading a Worthy Life: Finding Meaning in Modern Times, is a collection of essays on that theme, many of which first appeared in magazines like First Things, Commentary and the New Atlantis. Different sections of the book deal with love and friendship, human dignity (primarily in the realm of bioethics), learning and teaching, and human aspirations for freedom, justice and righteousness. If you’re anything like me, reading the essays can initially feel like subjecting yourself to an elaborate scolding. Kass takes gratuitous swipes at universities (I’m guilty by association) and liberal intellectuals (guilty by aspiration). He is against wives who don’t take their husband’s last name (guilty), conceptions of marriage that are divorced from procreation (guilty even though I now have kids), gender-neutral individualism (probably guilty), the decline of womanly modesty (probably complicit), and sports fans who focus too much on wins and stats (go Tar Heels).
But it’s worth working past the feeling that Kass is condemning half your life, and trying to understand why he thinks these issues matter. In each case, his judgment is rooted in his sense that humans have a nature that points us toward specific forms of flourishing. This view of human nature comes partly from biology, but Kass derives the majority of it from his reading of philosophy and literature—Hawthorne as well as Aristotle, Shakespeare and Austen. What makes great books great, he believes, is that they contain wisdom about our nature and therefore about the ways we might fulfill it. To make this argument with conviction, as Kass does, is so unusual in contemporary intellectual discourse that it can be quite bracing—even if sometimes unpleasant—to read, like a brisk wind coming down from the mountains. In wrestling with his ideas you may find that you end up wrestling with your own.
“If there were a union for fictionalized characters, they would insist on having sex in a book at least every Friday,” said Allan Gurganus, famous among writing students for his enthusiastic “let’s get it on” philosophy about sex in fiction. “We love our main characters and they stick with us for 350 pages and the least we can do is give them one moment of sexual pleasure.”
It’s been 90 years since Lady Chatterley adulterously wove flowers into her lover’s pubic hair in D.H. Lawrence’s book, to the scandalized delight of readers wily enough to score early samizdat copies. But now that anything goes, now that we’ve seen it all, now that we have PornHub to amuse us on demand, is there anything left to get excited about? Should novelists try to counteract the numbing aspects of porn, as Gurganus advised in an interview, by giving the characters the gift of more active sex lives?
“Frankenstein” is four stories in one: an allegory, a fable, an epistolary novel, and an autobiography, a chaos of literary fertility that left its very young author at pains to explain her “hideous progeny.” In the introduction she wrote for a revised edition in 1831, she took up the humiliating question “How I, then a young girl, came to think of, and to dilate upon, so very hideous an idea” and made up a story in which she virtually erased herself as an author, insisting that the story had come to her in a dream (“I saw—with shut eyes, but acute mental vision,—I saw the pale student of unhallowed arts kneeling beside the thing he had put together”) and that writing it consisted of “making only a transcript” of that dream. A century later, when a lurching, grunting Boris Karloff played the creature in Universal Pictures’ brilliant 1931 production of “Frankenstein,” directed by James Whale, the monster—prodigiously eloquent, learned, and persuasive in the novel—was no longer merely nameless but all but speechless, too, as if what Mary Wollstonecraft Godwin Shelley had to say was too radical to be heard, an agony unutterable.
The finest diarists are able to view themselves with the detachment they apply to others. They become, in this sense, their own sharpest biographers, dividing themselves into both observer and observed, audience and performer, hovering eagle-eyed above themselves, ever curious to record, however unfavorably, their own imperfect ways. As Claire Tomalin puts it in her biography of Samuel Pepys: “In writing it down, he detached himself from the self who acted out the scene.”
In her diaries of her years at Vanity Fair, Tina Brown is certainly adept at noting, with her unforgiving eye, the flaws in others. Revulsion brings out the best in her.
The book, through excavating ruins both literal and metaphoric, considers how the past becomes an instrument of the present, how we narrate history to give meaning to our individual lives and construct national mythologies. Accordingly, Radtke forgoes any pretense of objectivity, openly engaging with how she curates the events portrayed in her memoir for her own thematic purpose. Rather than striving for impartiality or uncontested truth, Imagine Wanting Only This presents a different path for memoirs, suggesting they can instead adopt subjectivity as their principal intent and, in turn, provide insight into how we construct personal meaning.
When asked what he thought of his son’s books, Kingsley Amis said: “Martin needs to write more sentences like ‘He put down his drink, got up and left the room.’” Amis père would approve of many of the sentences in Julian Barnes’s latest novel, The Only Story, which steps through familiar Barnesian territory, giving us the English suburbs, an aged protagonist looking back over an unfulfilled life, all told in deceptively affectless prose.
When he was thirty-five, Kieran Setiya had a midlife crisis. Objectively, he was a successful philosophy professor at the University of Pittsburgh, who had written the books “Practical Knowledge” and “Knowing Right from Wrong.” But suddenly his existence seemed unsatisfying. Looking inward, he felt “a disconcerting mixture of nostalgia, regret, claustrophobia, emptiness, and fear”; looking forward, he saw only “a projected sequence of accomplishments stretching through the future to retirement, decline, and death.” What was the point of life? How would it all end? The answers appeared newly obvious. Life was pointless, and would end badly.
Unlike some people—an acquaintance of mine, for example, left his wife and children to move to Jamaica and marry his pot dealer—Setiya responded to his midlife crisis productively. In “Midlife: A Philosophical Guide” (Princeton), he examines his own freakout. “Midlife” has a self-soothing quality: it is, Setiya writes, “a self-help book in that it is an attempt to help myself.” By methodically analyzing his own unease, he hopes to lessen its hold on him.
Eileen Chang died alone in her Los Angeles apartment in 1995. A small, quiet death for a literary celebrity who had grown ever more reclusive as she aged. Similarly, Chang’s novel Little Reunions, newly translated by Jane Weizhen Pan and Martin Merz, nearly met a quiet, small death of its own. Chang’s most autobiographical novel made a drawn-out journey to publication: though written in 1976, it wasn’t published in China until 2009, where it sold over a million copies. When she first finished it in the mid-’70s, Chang wrote to her literary executors, Stephen Soong and his wife Mae Fong Soong, asking them to read the 600-page handwritten manuscript. Later, when she wrote to discuss her will, she mentioned that she had contemplated destroying what may be the most personal of all her works, but didn’t go through with it. The Soongs would later be responsible for the novel’s appearance and wild success in China.
Little Reunions traces protagonist Julie’s life during prewar and wartime China. As the novel opens, Julie, who is from Shanghai, is studying abroad in Hong Kong. The reader soon finds out that much like Chang, Julie is the daughter of an opium-addicted, traditional father and a glamorous, globetrotting mother whom she never sees. The book’s title comes from those rare times spent with her complicated mother, Rachel, as a child and adult, as well as Julie’s long separations during and after the war from lover-then-husband Chih-yung, a suspected Japanese sympathizer in exile.
When the average consumer logs in to the Caviar app to order a Mulberry & Vine salad for the office or a grain bowl on the way home from work, she might reasonably assume that her order is benefitting the restaurant’s bottom line. But Gauthier, like many other restaurant owners I’ve spoken to in recent months, paints a more complicated picture. “We know for a fact that as delivery increases, our profitability decreases,” she said. For each order that Mulberry & Vine sends out, between twenty and forty per cent of the revenue goes to third-party platforms and couriers. (Gauthier initially had her own couriers on staff, but, as delivery volumes grew, coördinating them became unmanageable.) Calculating an order’s exact profitability is tricky, Gauthier said, but she estimated that in the past three years Mulberry & Vine’s over-all profit margin has shrunk by a third, and that the only obvious contributing factor is the shift toward delivery. “I think it’s a far bigger problem than a lot of operators realize,” she told me. “I think we are losing money on delivery orders, or, best-case scenario, breaking even.”
The problem is that entanglement violates how the world ought to work. Information can’t travel faster than the speed of light, for one. But in a 1935 paper, Einstein and his co-authors showed how entanglement leads to what’s now called quantum nonlocality, the eerie link that appears to exist between entangled particles. If two quantum systems meet and then separate, even across a distance of thousands of light-years, it becomes impossible to measure the features of one system (such as its position, momentum and polarization) without instantly steering the other into a corresponding state.
To date, most experiments have tested entanglement over spatial gaps. The assumption is that the ‘nonlocal’ part of quantum nonlocality refers to the entanglement of properties across space. But what if entanglement also occurs across time? Is there such a thing as temporal nonlocality?
To get my once-in-a-generation story down onto the page, in my laptop, though, I’m going to need your café’s Wi-Fi password. Sorry, I don’t know if you heard me: I’m going to be writing a novel here in your café today! How exciting for you!
That’s the thing about writing a novel. You think you can get by without the World Wide Web. I mean, isn’t my imagination the widest web of them all? However, I will need to connect to your Wi-Fi to do some online research in order to fill my story with historically accurate details. Sure, I’ll spend most of my time staring at my ex’s vacation photos on Facebook, but it’s all just a part of the writing process.
The Word for Woman Is Wilderness is unlike any published work I have read, in ways that are beguiling, audacious and occasionally irritating. It’s a British debut in which 19-year-old Erin leaves her Midlands home and heads for Alaska by land and sea in order to write a feminist narrative about the wilderness: a revision of the works of Jack London and John Muir for the millennial generation. Along the way, she muses on space travel, mutually assured destruction, climate change and physics.
The Adulterants, from its punning title onwards, is brilliantly knowing about its knowingness. It knows the only way we’ll tolerate a narrator as annoying as Ray is to punish him for the very virtues that make him a good narrator – nosiness and eloquence.
We live enmeshed in networks. The internet, a society, a body, an ant colony, a tumour: they are all networks of interactions, among people, ants or cells – aggregates of nodes or locations linked by some relation. The power of networks is in their local connections. All networks grow, shrink, merge or split, link by link. How they function and change depends on what forms, or disrupts, the connections between nodes. The internet dominates our lives, not because it is huge, but because each of us can make so many local links. Its size is the result, not the cause, of its impact on our communication.
Nowhere is the decisive influence of local interactions easier to see than in ants, which I study. The local is all an ant knows. A colony operates without central control, based on a network of simple interactions among ants. These are local by necessity, because an ant cannot detect anything very far away. Most ant species can’t see, and all of them rely on smell, which they do with their antennae. The important interactions are when ants touch antennae, smelling each other, or the ground, smelling chemicals deposited by other ants.
Meteorologists warned of a coming “bomb cyclone.” Satellite images showed a giant, hurricane-like weather system barreling towards land. Words like “exploding” and “slamming” and “tearing” peppered the news reports. It was enough to conjure images of The Day After Tomorrow, the 2004 environmental thriller about a superstorm that brings on a new Ice Age. Americans are highly dependent on weather forecasts. Today, most of us rely on modern technology for predictions about the weather—forecasts based on readings of countless measuring tools, fed into computer models, then analyzed and broadcast or sent straight to our smartphones. But I had other tools of weather prediction, small enough to fit in my backpack: two farmers’ almanacs. They’ve been around for hundreds of years, since before the Civil War, and have survived the advent of modern technology.
The unpacking of books, perhaps because it is essentially chaotic, is a creative act, and as in every creative act the materials employed lose in the process their individual nature: they become part of something different, something that encompasses and at the same time transforms them. In the act of setting up a library, the books lifted out of their boxes and about to be placed on a shelf shed their original identities and acquire new ones through random associations, preconceived allotments, or authoritarian labels.
Reckoning with the ongoing demands of reciprocity in human affairs requires facing up to the insufficiency of art and storytelling with respect to that effort. For the modern humanities, then, the “rise and fall of Adam and Eve” matters as a searing reminder of unmet needs for mutuality: a memento of paradise lost, long after the stories and visions of paradise themselves have stepped aside.
Historically, illiberal democracy has been a stage on the route to liberal democracy rather than the end point of a country’s political trajectory. Indeed, in the past, the experience of, or lessons learned from, flawed and even failed democratic experiments have played a crucial role in helping societies appreciate liberal values and institutions. And many of the problems that have emerged in Western democracies today are not the result of “hyperdemocratization” but of the exact opposite. Over the past decades, democratic institutions and elites have become increasingly out of touch with and insulated from the people, contributing greatly to the anger, frustration, and resentment that are eating away at liberal democracy today. Let’s examine each of these points in turn.
I am lucky not for surviving the infection, but for being a member of a shrinking class of Americans whose lives can absorb a trauma of this magnitude, and for whom being thrown, insensible, into the system is actually a good thing. When people refer to me as a “survivor,” which they do often, they’re correct, but it’s not what they think it means: It has already been decided, especially now that it’s again fashionable to claim that healthcare is not a right, who is a designated survivor in this country. It has also been decided who is not.
I know every time I’ve let go of Zelda, in fact, what’s actually happened is that she let go of me, and I simply allowed it, overcoming my natural inclinations to cling, to hold tight. I felt her pull away from me as she stood up on her fat wobbly legs to walk for the first time, and I worried that she would fall. She did, of course, fall down, and though she cried real tears of failure and frustration, and though she looked over at me, she didn’t reach for me. She didn’t need me, not right that second. She told me then what I didn’t want, couldn’t stand to hear, not yet, not yet: “Sometimes, I need you; sometimes I do not.”
And while this seasonal quartet has its angry and agonized passages—Winter includes many small but insistent notations of the way institutions of Britain’s public culture, from bus service to libraries, have been gradually privatized and downsized—its creator wants to remind us that the pendulum can swing back and that one day the sun will return.
In Thomas Pierce’s warm and inventive debut novel, “The Afterlives,” reality is slippery, time is out of joint and profound disorientation is a feature of daily existence. In other words, pretty much how the world feels to a lot of us right now.
What follows is an exceptional work, a humane book about an incendiary subject. Blending history and investigative reporting, Bergman never loses sight of the ethical questions that arise when a state, founded as a refuge for a stateless people who were targets of a genocide, decides it needs to kill in order to survive.