I’m not sure what the solution is—more experimentation seems generally like a good idea, but EA funders seem quite conservative in the way they operate, at least once they’ve locked in a modus operandi.
For what it’s worth, my instinct is to try a model with more ‘grantmakers’ who take a more active, product-manager/owner-style role: they make fewer grants, but the grants look more like contracts of employment, such that the grantmakers take some responsibility for the ultimate output (and can terminate a contract like a normal employer if the ‘grant recipient’ underperforms). This would need a lot more work-hours, but I can imagine it more than paying for itself through the greater security it gives grant recipients and the increased accountability it creates for both recipients and grantmakers.
What talents do you think aren’t applicable outside the EAsphere?
Community building doesn’t seem to have that much carryover—that’s not to say it’s useless, just that it’s not going to look anywhere near as good to most employers as some vaguely equivalent for-profit role, like being a consultant at a moderately prestigious firm. Research is in a comparable position: it’s unlikely to be taken seriously for academic jobs, and likely to be far too abstract for for-profits. In general, grantees and even employees at small EA orgs get little if any peer support or training budget, which will stymie their professional development even when they’re working in roles that have direct for-profit equivalents (I’ve written a little about this phenomenon for the specific case of EA tech work here).