This is a timely post. It feels like funding is a critical obstacle for many organisations.
One idea: Given the recent calls by many tech industry leaders for rapid work on AI governance, is there an opportunity to request direct funding from them for independent work in this area?
To be very specific: Has someone contacted OpenAI and said: “Hey, we read with great interest your recent article about the need for governance of superintelligence. We have some very specific work (list specific items) in that area which we believe can contribute to making this happen. But we’re massively understaffed and underfunded. With $1m from you, we could have 10 researchers working on these questions for a year. Would you be willing to fund this work?”
What’s in it for them? Two things:
1. If they are sincere (as I believe they are), then they will want this work to happen, and some groups in the EA sphere are probably better placed to make it happen than they themselves are.
2. We can offer independence (any results will be from the EA group, not from OpenAI, and not influenced or edited by OpenAI), but at the same time we can openly credit them with funding this work, which would be good PR and a show of good faith on their part.
Forgive me if this is something that everyone is already doing all the time! I’m still quite new to EA!
Given the (accusations of) conflicts of interest in OpenAI’s calls for regulation of AI, I would be quite averse to relying on OpenAI for AI governance funding.