I don’t know about you, but I just learned about one of the biggest updates to OP’s grantmaking in a year on the Forum.
That said, the data does show some agreement with your and other commenters’ vibe of lowering quantity.
I agree that the Forum could be a good place for a lot of these discussions. Some of them aren’t happening at all, to my knowledge.[1] Some of those should be happening, and should be discussed on the Forum. Others are happening in private, and that’s rational, although you can probably guess my biased view: a lot more of them should be public, and if they were, they should be posted on the Forum.
Broadly: I’m quite bullish on the EA community as a vehicle for working on the world’s most pressing problems, and on open online discussion as a piece of our collective progress. And I don’t know of a better open place on the internet for EAs to gather.
Yep—I liked the discussion in that post a lot, but the actual post seemed fairly minimal, and it was written primarily outside of the EA Forum (it was a link post, and the linked post was only 320 words total).
For those working on the Forum, I’d suggest working on bringing more of these threads into the Forum. Maybe reach out to some of the leaders in each group and see how to change things.
I think that AI policy in particular is the area most ripe for better infrastructure (there’s a lot of work happening, but no common public forums, as far as I know), though it probably makes sense for that to be separate from the EA Forum (something like the Alignment Forum), because a lot of the people involved don’t want to be associated too closely with EA, for policy reasons.
I know less about bio governance, but I’d strongly assume that a whole lot of it isn’t infohazardous. That’s definitely a field that’s active and growing.
For foundational EA work / grant discussions / community strategy, I think we might just need more content in the first place.
I assume that AI alignment is well-handled by LessWrong / the Alignment Forum, so it would be difficult, and less important, to push for it to happen here.
Part of that might be because, as EA gets older, the temperature (in the annealing sense) rationally lowers.