Future of Life Institute and the International Politics of AI Apocalypse
Just as the core function and main source of legitimacy of the state is its ability to secure its population against existential threats, so fear of such threats to humankind as a whole has been used to justify calls for a radical transformation of the international political order. The specter of existential threats—of human extinction or civilizational collapse—has also motivated various nonstate actors since the twentieth century, including those working with multiple governments and across societies, to address what they see as an ultimate challenge for politics.
This chapter explores this phenomenon by investigating the ideology and activities of the Future of Life Institute (FLI). Founded in 2014, FLI has focused on extreme technological risks, especially existential risks connected to the rapid development of artificial intelligence. The organization provides an entry point for a broader exploration of how networks of nonstate actors specializing in existential risk interact with the state system while engaging a challenge that transcends the logic of national interests and state survival.

AI Apocalypse and Global Governance
Many, perhaps most, of those who signed FLI's 2023 open letter calling for a pause on giant AI experiments did not in fact agree with its existential risk framing, but a large majority agreed that the development of AI systems should not be entrusted to private market actors (Struckman & Kupiec, 2023). FLI's broader governance ambitions are illustrated by its dedicated campaigns to ban deepfakes (https://bandeepfakes.org/) and to prohibit and regulate autonomous weapons systems (https://autonomousweapons.org/).
Artificial general intelligence (AGI) is a hypothetical form of AI capable of performing every cognitive task a human can perform, perhaps at an even higher level. Today's AI, in contrast, is narrow AI, capable of performing only a single task or a limited set of tasks (Mitchell, 2020, p. 45). Effective Altruism (EA) is a well-funded movement that pursues a utilitarian vision for improving the world and has recently gravitated toward a somewhat cult-like focus on "longtermism," i.e., the maximization of human utility on an astronomical timescale, and on the existential risks that threaten such utility-maximization (Torres, 2024, pp. 384–389).
Links to Tech Sector and Funding
One of the largest donors to EA was the FTX Future Fund of the now convicted former crypto-billionaire Sam Bankman-Fried, a further link between the tech sector and existential risk (Tiku, 2022).

Eliezer Yudkowsky, cofounder of the Machine Intelligence Research Institute, another organization in the AI existential risk network, would go even further: "Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike." [GPU stands for Graphics Processing Unit; GPUs are used for nongraphical purposes, including the training of neural networks.]
FAR AI is also part of the AI existential-risk funding ecosystem that includes FLI (Weiss-Blatt, 2023a).