Be more cooperative. (There are arguments for increasing cooperation, especially from people working on reducing S-risks, but I couldn't find a suitable resource in a brief search.)
Take a strong stance against narrow moral circles.
Have a good pitch prepared about longtermism and EA broadly. Balance confidence with adequate uncertainty.
Have a well-structured methodology for getting interested acquaintances more involved with EA.
Help friends in EA/longtermism more.
Strengthen relationships with friends who have a high potential to be highly influential in the future.
I basically like all of these. I think there could be versions of them that turn out badly, but they seem like a good direction to be thinking in.
I’d love to see further exploration of these—e.g. I think any of your six suggestions could deserve a top-level post going into the weeds (& ideally reporting on experiences from trying to implement it). I feel most interested in #3, but not confidently so.
Gidon Kadosh, from EA Israel, is drafting a post with a suggested pitch for EA :)