Thanks for doing that, and I look forward to hopefully seeing your findings published. It would be valuable, at least to me, if the doc could show clearly (time permitting) whether there might be biases in funding: what is not funded may be as important as what is funded. For example, if some collection of smaller donors put 40% of their funding towards considering slowing down AI, while a larger donor spends less than 2%, that might be interesting, at least as a pointer towards investigating such disparities in more detail (I noticed that Pause AI was a bit higher up in the donation election results, for example).
Firstly, it’s not really me you should be thanking; it’s not my project, and I’m just helping with it a bit.
Secondly, it’s just another version of the shallow review, so don’t expect any info about funding beyond an update to the funding info in this post: https://www.alignmentforum.org/posts/zaaGsFBeDTpCsYHef/shallow-review-of-live-agendas-in-alignment-and-safety