The Precipice argues that we need to pay attention to existential risks (Xrisks) – events that, in the author's words, would prevent humanity from achieving its long-term potential. What makes these difficult to reason about is that they have very low probability (evidenced by the fact that we are still here) but immense consequences. Humans are rather bad at reasoning about day-to-day probabilistic events, let alone events with this character.
The book is exceedingly well-written: methodical without being robotic. Ord takes us through several key Xrisks, including natural ones like the K-T meteor that wiped out the dinosaurs, as well as anthropogenic risks – an important point in the book is that the latter dominate the total risk. It's appalling how close we came to nuclear war during the Cuban Missile Crisis, and how little progress has been made since. We now also have to worry about the risks of rogue AI, bioengineered pandemics, and perhaps the scariest: risks that we haven't yet thought of. The Fifth Risk by Michael Lewis gives some nice perspective on the mishandling of Xrisk under the Trump administration.
With the detonation of the first atomic bomb, a new age of humanity began. At that moment, our rapidly accelerating technological power finally reached the threshold where we might be able to destroy ourselves.
Ord believes that the main reason Xrisk is important is that humanity has a huge amount of potential in the long run – we will eventually reach out to the stars, and accomplish wonderful things. Ord bleeds with hope and optimism; I get strong Asimov vibes: "the end of Eternity – and the beginning of Infinity" (this quote will only make sense if you've read The End of Eternity).
Maybe I am just a pessimist, but this is the part of the book that I am personally most ambivalent about. It's not self-evident to me that the long-term future of humanity is so important. Some Buddhist scholars liken Nirvana to non-existence; under that philosophy, a transition from existence to non-existence (like humanity getting instantly vaporised by a gamma-ray burst) wouldn't be a bad thing – though forgive me if I am misconstruing Buddhist philosophy. Ord mentions that he went through a similar phase but grew out of it; perhaps that will happen to me one day. For now, my belief is nicely summarised by Jan Narveson (mentioned in the appendix): I am "in favour of making people happy, but neutral about making happy people".
In any case, even if I were strongly committed to this somewhat nihilistic way of thinking, Xrisk would still be worth paying attention to. I do care about the suffering of people currently in existence, and many Xrisks would lead to gruesome levels of suffering on a global scale: not a clean mathematical transition from existence to non-existence.
This has been a long review, but my takeaway is that The Precipice is a necessary read. It is quite short and very well-written (the page count is misleading because literally half the book consists of footnotes). I'll leave you to form your own judgments about the long-term value of humanity.
"The Precipice gives our time immense meaning. In the grand course of history—if we make it that far—this is what our time will be remembered for."