Windfall Clause (under Global Catastrophic Risk (AI))
Justification:
Important as a wiki topic to give a short description of this policy proposal and relevant links/papers/discussion, as it seems like an important output of the AI governance literature.
Worth tagging because future posts may discuss or critique the idea (e.g. the second post below).
Posts that it could apply to:
https://forum.effectivealtruism.org/posts/iYCAoP3JgXxGAvMrr/fhi-report-the-windfall-clause-distributing-the-benefits-of
https://forum.effectivealtruism.org/posts/wBzfLyfJFfocmdrwL/the-windfall-clause-has-a-remedies-problem
https://forum.effectivealtruism.org/posts/eCihFiTmg748Mnoac/cullen-o-keefe-the-windfall-clause-sharing-the-benefits-of
Thanks for the suggestion. I don’t personally have views either way (probably because I’m not very familiar with the proposal), but since you think it’s a good idea, I went ahead and created it. I’ll try to add a brief description later today.
Thanks Pablo! Looks great! I really appreciate your work on the wiki.