Following on from vollmer’s point, it might be reasonable to have a blanket rule against policy/PR/political/etc. work: anything that is irreversible and difficult to evaluate. “Not being able to get funding from other sources” is definitely a negative signal, so it seems worthwhile to restrict guests to projects whose worst possible outcome is unproductively diverting resources.
On the other hand, I really can’t imagine what harm research projects could do. I guess the worst-case scenario is someone so persuasive that they convince lots of EAs of their ideas, but so bad at research that their ideas are all wrong, which doesn’t seem very likely. (Why not worry about malicious and persuasive people instead? Because the community can probably identify those more easily by the subjects they write about.)
Furthermore, guests’ ability to engage in negative-EV projects will be constrained by the low stipend and terrible location (if I wanted to engage in Irish republican activism, living at the EA Hotel wouldn’t help very much). I think the largest danger to be alert for is reputational risk, especially from bad popularizations of EA, since that can be done just as easily from a remote location (one example is Intentional Insights, the only negative-EV EA project I know of).
This basically applies to everything as a matter of degree, so it looks impossible to turn into a blanket rule. Suppose I raise £10 and send it to AMF. That’s irreversible. Is it difficult to evaluate? That depends on what you mean by ‘difficult’ and what the comparison class is.
I agree research projects are more robustly positive. Information hazards are one of the main ways in which they could do a significant amount of harm.