My Objective
I love exploring the complex problems that face humanity. I am passionate about understanding ways in which we can better ourselves and the world we live in. The world is becoming ever more complicated and I endeavour to shine a light on cross-disciplinary issues and threats where data is sparse or difficult to collect.
My current research focuses on AI governance and on how artificial intelligence affects nuclear risk, with particular attention to human decision making in a crisis. AI has the potential to shape much of the world in the coming decades, and nuclear security is no exception. By weighing the benefits AI could bring to stability against its limitations and inherent dangers, I seek to help guide the integration of AI and nuclear technologies. My work combines the fields of nuclear strategy, artificial intelligence, and psychology to provide a new, cross-disciplinary analysis of human decision making around nuclear weapons.
I come at these problems through a long-term lens and the belief that the lives of future generations matter. Working to protect the world and the people who inhabit it today will safeguard it for those to come.
I am always happy to explore additional projects and look forward to hearing from you. Please reach out to me by email for any enquiries.
Publications
Rautenbach, Peter James M. (2023). Keeping humans in the loop is not enough to make AI safe for nuclear weapons. Bulletin of the Atomic Scientists. https://thebulletin.org/2023/02/keeping-humans-in-the-loop-is-not-enough-to-make-ai-safe-for-nuclear-weapons/#post-heading
Rautenbach, Peter James M. (2022). Artificial Intelligence and Nuclear Command, Control, & Communications: The Risks of Integration. EA Forum. https://forum.effectivealtruism.org/posts/BGFk3fZF36i7kpwWM/artificial-intelligence-and-nuclear-command-control-and-1
Rautenbach, Peter James M. (2022). “On Integrating Artificial Intelligence With Nuclear Control.” Arms Control Today, 52(9), 23-26. https://www.armscontrol.org/aca/2008.
Rautenbach, Peter M., Reem Rashed Alnuaimi, and Shawon Saha. (2022). “Impact of SMRs in Global Nuclear Development: To What Extent Will the Global Spread of SMRs Increase the Proliferation of Nuclear Weapons?” In 2021 NEREC Annual Report, edited by Man-Sung Yim, 221-234. NEREC: Daejeon. (Due to file size constraints, please contact me for the full report and references.)
Rautenbach, Peter. (2021). Conventional Arms Control and Nuclear Security: The Challenge of Conventional Prompt Global Strike Weapons. British American Security Information Council. https://basicint.org/conventional-arms-control-and-nuclear-security-the-challenge-of-conventional-prompt-global-strike-weapons/
Rautenbach, Peter James M. (2020). “The Threat of Conventional Weapons to Nuclear Security: A New Reality for Deterrence.” The Journal of International Analytics, 11(4), 56–71, https://doi.org/10.46272/2587-8476-2020-11-4-56-71.
Rautenbach, Peter. (2019). “The Subtle Knife: A discussion on hybrid warfare and the deterioration of nuclear deterrence.” The Journal of Intelligence, Conflict, and Warfare, 2(1).
Contributing Author
Rautenbach, Peter et al. (2021). Living with pandemics in higher education: people, place, and policy. In J. Bryson, L. Andres, A. Ersoy, & L. Reardon (Eds.), Living with Pandemics: People, Place and Policy, (pp. 47-59). Edward Elgar Publishing
Speaking Engagements
Doomed From the Start? Lessons From Nuclear Arms Control and What They Mean for Governing Military AI
Rethinking Nuclear Deterrence Research Network Arms Control and Emerging Tech Working Group DC Workshop (2023)
Spoke on the history of governing nuclear weapons and what lessons it could offer for the governance of AI, with a focus on military applications of AI technology.
Machine Learning & NC3: Risks of Integration
2022 NEREC Regional Workshop Europe (2022)
Spoke on the risks of integrating AI with nuclear weapon command systems. Explored the combination of technical aspects and the changing human-machine relationship in terms of decision making.
Machine Learning and Nuclear Command: How the technical flaws of automated systems and a changing human-machine relationship could impact the risk of inadvertent nuclear use
ISYP Third Nuclear Age Conference (2022)
Spoke on the risks of integrating AI with nuclear weapon command systems. Explored the combination of technical aspects and the changing human-machine relationship in terms of decision making.
US-China Mutual Vulnerability and How It Could Reinvigorate Non-Proliferation
2021 NEREC Regional Workshop D.C. (2021)
Spoke on mutual vulnerability and its importance for nuclear stability. Provided ways in which it could be used to improve U.S.-China relations.
Recent Activity
(December 2023) Listed in the Bulletin of the Atomic Scientists “Best of 2023: Fresh takes from ‘Voices of Tomorrow’”
(February 2023) Opinion piece on why keeping humans in the loop is not enough to ensure the safe use of AI in nuclear command systems published with the Bulletin of the Atomic Scientists
(November 2022) Spoke at the ISYP Third Nuclear Age Conference and the NEREC Regional Workshop on the risks of AI integration with NC3 systems.
(September 2022) Article on key risks associated with integrating machine learning with nuclear weapon command published with Arms Control Today.
(Summer 2022) Took part in the Cambridge Existential Risks Initiative (CERI) Summer Research Fellowship where I worked to understand both the benefits and strategic risks of integrating machine learning with nuclear command.
(April 2022) Co-authored an article exploring the impact of SMRs on proliferation
(December 2021) Presented at the 2021 NEREC Regional Workshop in DC
(November 2021) Graduated from the London School of Economics with an MSc in International Relations
(August 2021) Completed the 2021 NEREC Graduate Fellows Program on nuclear non-proliferation and the safe uses of nuclear energy
(July 2021) Published an article on conventional prompt global strike weapons & nuclear deterrence with BASIC
(November 2020) Published a paper with the Journal of International Analytics on how emerging technology could degrade nuclear security
Experience
Center for Strategic and International Studies | Commissioned Researcher
February 2023 - December 2023
Commissioned to write a research paper as part of a larger CSIS report. The research is part of the Rising Voices Arms Control and Emerging Technologies Working Group, one of the working groups in the Belfer Center’s Rethinking Nuclear Deterrence Research Network. The topic is the history of nuclear arms control and the lessons that history could hold for the governance of military artificial intelligence.
RESILIENCER Project | Commissioned Researcher
February 2023 - December 2023
Commissioned to co-write a research paper as part of the RESILIENCER Project. The research focuses on the conflict risks associated with the deployment of solar radiation modification (SRM) technology as a means of combating the detrimental effects of climate change.
Cambridge Existential Risks Initiative (CERI), Summer Research Fellowship
July 2022 - September 2022
A research fellowship exploring existential risks of various forms. My area of study was how the intersection of nuclear command and machine learning affects nuclear risk. I conducted full-time research, producing a deliverable intended both as a solid piece of academic work and as a contribution to reducing risk.
Nuclear Nonproliferation Education & Research Center at KAIST, Graduate Fellow
June 2021 - August 2021
Took part in a research fellowship where fellows worked to understand the peaceful uses of nuclear energy, the technical aspects of nuclear power, and nuclear security on the Korean Peninsula. I co-authored a paper for the NEREC conference on the proliferation potential of small modular nuclear reactors.
Millennium: Journal of International Studies, Editorial Board Member
February 2020 - April 2021
Analyzed manuscripts for publication, including assessing their relevance to the journal and to academia more broadly. Actively engaged in discussions focused on argument construction and the intellectual integrity of the research.
Canadian Association for Security and Intelligence Studies, Research Fellow
May 2019 - September 2019
Conducted research on fifth-generation warfare and right-wing extremism using verified sources to raise audience engagement. Prepared briefing notes and other timely documents to give academics, practitioners, and policymakers the insights they required. Performed careful analysis of classical and contemporary thought and research.
Canadian Association for Security and Intelligence Studies, Deputy Executive Officer
October 2018 - May 2019
Worked with strategic partners and public policymakers to discuss and resolve identified issues. Prioritized, managed, and delivered several large-scale projects to the required specifications. Devised and delivered training courses on strategic thinking, enhancing the knowledge of the members involved.
Simon Fraser University, Research Assistant
May 2017 - September 2017
Facilitated the completion of a research project exploring Canadian and NATO responses to Russia post-Crimea. Identified gaps in the academic and policy literature and proposed key suggestions to relevant departments. Focused on the NATO aspect of the research, delivering an annotated bibliography and draft sections in that area of specialization.
Education
The London School of Economics and Political Science, UK
MSc, International Relations
Graduated November 2021
Simon Fraser University, Canada
Bachelor of Arts, Political Science (Honors with Distinction)
Graduated September 2017
Certifications
Nuclear Nonproliferation Education & Research Center (NEREC) Summer Graduate Fellows Program, 2021
Certificate of Completion
Associated Skills: nuclear safeguards, the peaceful uses of nuclear power, international non-proliferation regimes, and the technical knowledge behind nuclear power.