Would you consider reviewing the Center for Reducing Suffering? It is an organization similar to the Center on Long-Term Risk in that its main focus is reducing s-risks (risks of astronomical suffering), but it is less focused on AI. CRS is currently Brian Tomasik's top charity recommendation.
In what capacity are you asking? I’d be more likely to do so if you were asking as a team member, because the organization right now looks fairly small and I would almost be evaluating individuals.