Apply to CEEALAR to do AGI moratorium work
Do you have short AI timelines and/or p(doom|AGI) that is far too high for comfort? Do you want to pivot to working on directly addressing the problem and lowering all of our p(doom)s by slowing down or pausing AGI development? Is lack of funding/runway holding you back?
This is an invitation for people to apply to CEEALAR for a grant (of free accommodation, food and a stipend) to work towards getting a global moratorium on AGI implemented. Such work may take the form of organising public campaigns (such as letter writing, petitions, protests, social media posts, ads, etc.), drafting relevant policies or regulatory frameworks (e.g. how to implement caps on training runs), or meta-level work organising and fundraising for such activities.
We’ve already had one grantee stay who’s working in the space, and I (Founder and Executive Director) am very interested in the area, having recently (post-GPT-4) elevated it to a top priority of mine.
Active discussion spaces for those working in the area include the AGI Moratorium HQ Slack and the PauseAI Discord. Various projects are being organised within them.
Orgs already in the space that may have projects you can get involved with include: PauseAI, Campaign for AI Safety, Stop AGI, Centre for AI Policy, Stakeout AI, Centre for AI Safety, Future of Life Institute.
Given we (CEEALAR) are a (UK) charity, we have to be mindful of not being too overtly political, and of ensuring that any political activity furthers our charitable objects[1]. This means not being partisan by singling out individual political parties or politicians for criticism, and not being needlessly provocative[2]. Think public awareness raising, public education and encouragement of civic responsibility over the issue, similar to how many charities focused on climate campaigning operate (e.g. the Climate Coalition)[3].
I look forward to your applications and hope that together we can accelerate meaningful action toward a global moratorium on AGI[4].
Following charitable object 3, “To advance such other purposes which are exclusively charitable according to the law in England and Wales”, your work would need to fit in with the Charitable Purposes under UK law listed here. For practical purposes, preventing human extinction from AGI would come under “saving of lives”.
For example, any protests being organised should require people to abide by this code of conduct.
We reserve the right to withdraw funding from anyone who doesn’t work within our charitable objects or whose work may risk damage to our reputation.
We are also still very much accepting general applications for any EA-related work/career development.