I’ve always been surprised that there is no fund you can donate to that is dedicated solely to AI alignment. You can either donate directly to an org or project, or you can donate to a longtermist fund, which is broader than just alignment.

I’ve argued before that plenty of people are simply not that cause-neutral and would want to donate to a fund just for alignment. And now that alignment has gone much more mainstream, it is even more important that we have a legible place for people to donate.

AI safety has gone mainstream, but most people in the world wouldn’t have a clue what “longtermism” is.
FWIW, the LTFF is considering spinning off a fund focused on AI alignment (though no promises at this point in time).
Agreed. Lots of people aren’t longtermist.