I guess the author missed some of the details / framed things in a slightly vague or adversarial way, but the core point just seems really important to bring up.
Third, and somewhat related to the first point, I think the post could be improved by showcasing some of the preferred traits or actions that you believe others in EA should emulate.
Maybe the author doesn’t say it explicitly, but the post seems to strongly be pointing at “(a) EAs making career decisions should make quantitative estimates of all promising career paths. (b) EA organizations should have someone making (maybe public) quantitative estimates of all possible directions for the organization to explore / fund, including that the marginal dollar spent doing whatever it’s doing is better than the marginal dollar spent on philanthropic advising, or creating Stanislav Petrovs, or persuading police officers to do suicide reduction.”