Jacques Derrida. “Of an Apocalyptic Tone in Recent Philosophy.” Semeia 22 (Studies in Ancient Letter Writing). John L. White, ed. Society of Biblical Literature, 1982. 63-97.
That’s an interesting take, but how are we to explain all previous apocalyptic moments?
According to what has been written on the subject, we seem only too eager to surround such a catastrophe with avenging fury, with destructive angels and the sound of trumpets, and other no less horrifying accompaniments.
Alas, we do not need such histrionics to be destroyed; we are not worth such a funereal display, and if God wishes it he can change the whole surface of the globe without such exertion on his part.—
Brillat-Savarin on the end of the world, The Physiology of Taste
The beginning of the second sentence really rings in my head. It’s true we don’t need fanfare. Perhaps we are already subtly in motion toward the end of all things.
Expert: We’re underestimating the risk of human extinction
This is from The Atlantic’s website—an interview with Nick Bostrom, a philosophy professor at Oxford University. Well worth the read, but this passage in particular grabbed my attention:
I think the biggest existential risks relate to certain future technological capabilities that we might develop, perhaps later this century. For example, machine intelligence or advanced molecular nanotechnology could lead to the development of certain kinds of weapons systems. You could also have risks associated with certain advancements in synthetic biology.
Of course there are also existential risks that are not extinction risks. The concept of an existential risk certainly includes extinction, but it also includes risks that could permanently destroy our potential for desirable human development. One could imagine certain scenarios where there might be a permanent global totalitarian dystopia. Once again that’s related to the possibility of the development of technologies that could make it a lot easier for oppressive regimes to weed out dissidents or to perform surveillance on their populations, so that you could have a permanently stable tyranny, rather than the ones we have seen throughout history, which have eventually been overthrown.
Get that? He thinks we’ll do it to ourselves. No gods, no monsters—just our own technological hubris.