The AI Alignment Forum is a forum for discussing technical research on AI alignment. It superseded the Agent Foundations Forum, which was established around 2015.[1]
A beta version of the site, then named simply the Alignment Forum, was announced on 10 July 2018.[2] The site was officially launched under its current name on 29 October 2018.[3] The authors describe its purpose as follows:
Our first priority is obviously to avert catastrophic outcomes from unaligned Artificial Intelligence. We think the best way to achieve this at the margin is to build an online-hub for AI Alignment research, which both allows the existing top researchers in the field to talk about cutting-edge ideas and approaches, as well as the onboarding of new researchers and contributors.
We think that to solve the AI Alignment problem, the field of AI Alignment research needs to be able to effectively coordinate a large number of researchers from a large number of organisations, with significantly different approaches. Two decades ago we might have invested heavily in the development of a conference or a journal, but with the onset of the internet, an online forum with its ability to do much faster and more comprehensive forms of peer-review seemed to us like a more promising way to help the field form a good set of standards and methodologies.
The AI Alignment Forum is built by Lightcone Infrastructure.
Further reading
Habryka, Oliver et al. (2018) Introducing the AI Alignment Forum (FAQ), AI Alignment Forum, October 29.
External links
AI Alignment Forum. Official website.
Related entries
Alignment Newsletter | LessWrong | Lightcone Infrastructure
1. LaVictoire, Patrick (2015) Welcome, new contributors, Agent Foundations Forum, March 23.
2. Arnold, Raymond (2018) Announcing AlignmentForum.org beta, AI Alignment Forum, July 10.
3. Habryka, Oliver et al. (2018) Introducing the AI Alignment Forum (FAQ), AI Alignment Forum, October 29.