The taped conversation began. Harari began to describe future tech intrusions, and Pinker, pushing back, referred to the ubiquitous “telescreens” that monitor citizens in Orwell’s “1984.” Today, Pinker said, it would be a “trivial” task to install such devices: “There could be, in every room, a government-operated camera. They could have done that decades ago. But they haven’t, certainly not in the West. And so the question is: why didn’t they? Partly because the government didn’t have that much of an interest in doing it. Partly because there would be enough resistance that, in a democracy, they couldn’t succeed.”
Harari said that, in the past, data generated by such devices could not have been processed; the K.G.B. could not have hired enough agents to analyze it all. A.I. removes this barrier. “This is not science fiction,” he said. “This is happening in various parts of the world. It’s happening now in China. It’s happening now in my home country, in Israel.”
“What you’ve identified is some of the problems of totalitarian societies or occupying powers,” Pinker said. “The key is how to prevent your society from being China.” In response, Harari suggested that it might have been only an inability to process such data that had protected societies from authoritarianism. He went on, “Suddenly, totalitarian regimes could have a technological advantage over the democracies.”
Pinker said, “The trade-off between efficiency and ethics is just in the very nature of reality. It has always faced us—even with much simpler algorithms, of the kind you could do with paper and pencil.” He noted that, for seventy years, psychologists have known that, in a medical setting, statistical decision-making outperforms human intuition. Simple statistical models could have been widely used to offer diagnoses of disease, forecast job performance, and predict recidivism. But humans had shown a willingness to ignore such models.
“My view, as a historian, is that seventy years isn’t a long time,” Harari said.
When I later spoke to Pinker, he said that he admired Harari’s avoidance of conventional wisdom, but added, “When it comes down to it, he is a liberal secular humanist.” Harari rejects the label, Pinker said, but there’s no doubt that Harari is an atheist, and that he “believes in freedom of expression and the application of reason, and in human well-being as the ultimate criterion.” Pinker said that, in the end, Harari seems to want “to be able to reject all categories.”
The next day, Harari and Yahav made a trip to Chernobyl and the abandoned city of Pripyat. They invited a few other people, and hired a guide. Yahav embraced the role of half-ironic worrier about health risks; the guide tried to reassure him by handing over his own dosimeter, which measures radiation levels. When the device beeped, Yahav complained of a headache. In the ruined Lenin Square in Pripyat, he told Harari, “You’re not going to die on me. We’ve discussed this—I’m going to die first. I was smoking for years.”
Harari, whose work sometimes sounds regretful about most of what has happened since the Paleolithic era—in “Sapiens,” he writes that “the forager economy provided most people with more interesting lives than agriculture or industry do”—began the day by anticipating, happily, a glimpse of the world as it would be if “humans destroyed themselves.” Walking across Pripyat’s soccer field, where mature trees now grow, he remarked on how quickly things had gone “back to normal.”
The guide asked if anyone had heard of Call of Duty: Modern Warfare—the video game, which includes a sequence set in Pripyat.
“No,” Harari said.
“Just the most popular game in the world,” the guide said.
At dusk, Harari and Yahav headed back to Kyiv, in a black Mercedes. When Yahav sneezed, Harari said, “It’s the radiation starting.” As we drove through flat, forested countryside, Harari talked about his upbringing: his hatred of chess; his nationalist and religious periods. He said, “One thing I think about how humans work—the only thing that can replace one story is another story.”
We discussed the tall tales that occasionally appear in his writing. In “Homo Deus,” Harari writes that, in 2014, a Hong Kong venture-capital firm “broke new ground by appointing an algorithm named VITAL to its board.” A footnote provides a link to an online article, which makes clear that, in fact, there had been no such board appointment, and that the press release announcing it was a lure for “gullible” outlets. When I asked Harari if he’d accidentally led readers into believing a fiction, he appeared untroubled, arguing that the book’s larger point about A.I. encroachment still held.
In “Sapiens,” Harari writes in detail about a meeting in the desert between Apollo 11 astronauts and a Native American who dictated a message for them to take to the moon. The message, when later translated, was “They have come to steal your lands.” Harari’s text acknowledges that the story might be a “legend.”
“I don’t know if it’s a true story,” Harari told me. “It doesn’t matter—it’s a good story.” He rethought this. “It matters how you present it to the readers. I think I took care to make sure that at least intelligent readers will understand that it maybe didn’t happen.” (The story has been traced to a Johnny Carson monologue.)
Harari went on to say how much he’d liked writing an extended fictional passage, in “Homo Deus,” in which he imagines the belief system of a twelfth-century crusader. It begins, “Imagine a young English nobleman named John . . .” Harari had been encouraged in this experiment, he said, by the example of classical historians, who were comfortable fabricating dialogue, and by “The Hitchhiker’s Guide to the Galaxy,” by Douglas Adams, a book “packed with so much good philosophy.” No twentieth-century philosophical book besides “Sources of the Self,” by Charles Taylor, had influenced him more.
We were now on a cobbled street in Kyiv. Harari said, “Maybe the next book will be a novel.”
At a press conference in the city, Harari was asked a question by Hannah Hrabarska, a Ukrainian news photographer. “I can’t stop smiling,” she began. “I’ve watched all your lectures, watched everything about you.” I spoke to her later. She said that reading “Sapiens” had “completely changed” her life. Hrabarska was born the week of the Chernobyl disaster, in 1986. “When I was a child, I dreamed of being an artist,” she said. “But then politics captured me.” When the Orange Revolution began, in 2004, she was eighteen, and “so idealistic.” She studied law and went into journalism. In the winter of 2013-14, she photographed the Euromaidan protests, in Kyiv, where more than a hundred people were killed. “You always expect everything will change, will get better,” she said. “And it doesn’t.”
Hrabarska read “Sapiens” three or four years ago. She told me that she had previously read widely in history and philosophy, but none of that material had ever “interested me on my core level.” She found “Sapiens” overwhelming, particularly in its passages on prehistory, and in its larger revelation that she was “one of the billions and billions that lived, and didn’t make any impact and didn’t leave any trace.” Upon finishing the book, Hrabarska said, “you kind of relax, don’t feel this pressure anymore—it’s O.K. to be insignificant.” For her, the discovery of “Sapiens” is that “life is big, but only for me.” This knowledge “lets me own my life.”
Reading “Sapiens” had helped her become “more compassionate” toward people around her, although less invested in their opinions. Hrabarska had also spent more time on creative photography projects. She said, “This came from a feeling of ‘O.K., it doesn’t matter that much, I’m just a little human, no one cares.’ ”
Hrabarska has disengaged from politics. “I can choose to be involved, not to be involved,” she said. “No one cares, and I don’t care, too.” ♦