The Nonlinear Fund is working on addressing this problem for people in AI Safety (my guess is that they'll start with people at orgs, then possibly expand to people on certain grants; I interned there a while ago, so I don't know the current plan).
Gavin Li is working on EA Offroad for people “not constituted for college” or who would find attending college challenging due to their financial position.
I would really like to see more EA Hubs established in more affordable cities. I think the financial challenges a lot of people are facing result from trying to support themselves in some of the world's most expensive cities. That said, there seem to be a few projects starting in this space, so I would probably encourage people to support existing projects rather than starting more.
I'm not exactly sure of the scope of Magnify Mentoring (previously WAMBAM), but it might be able to provide some support in helping people figure out their lives. If not, perhaps someone should create a mentoring service more focused on helping people improve their lives.
Further ideas:
Bountied Rationality—I'm sure there are a lot of small, useful, and accessible tasks to do. Perhaps someone should apply for funding in order to post more bounties there. (Argument against: bounties are generally winner-takes-all, so they can easily result in people burning a lot of time without receiving any money in return.)
On a similar note, the AI Safety Fundamentals course is now paying facilitators $1000. Having more of these kinds of opportunities available seems positive.
Programming bootcamps—a lot of EAs are capable of becoming programmers, and this could provide a path to financial stability.
Some kind of peer support project with group facilitators receiving training from professionals.
Something like Y Couchinator to help EAs share their free rooms.
Exit grants. In some circumstances, it might make sense to award exit grants to people who were funded or employed productively for a reasonable period but have since become unproductive. These grants should probably be awarded privately, with only the total number of grants and their dollar value reported.
Final thoughts:
Given all the excellent points you make about the challenges of such a fund, I believe it's important to have a wide variety of other means of support. Nonetheless, I suspect a more traditional assistance organisation would be valuable, so long as there was proper communication about its role: specifically, the limits on how much support it could provide, and the fact that it wouldn't be able to help everyone.
That all sounds pretty good to me. I like the idea of a wide variety of means of support, both to try out more things (it's hard to tell in advance what would work) and because it's probably a better long-term solution.