You asked whether you should spend time on this book, which would mean going part time at your job; i.e. you raised the question of the opportunity cost.
In order to assess that, we need to work out a Theory of Change for your book. Is the aim to support people interested in doing good, and to help them be more effective? In that case it would be useful to see your model for this (a rough numerical sketch follows the questions below):
What’s your forecast for the number of people buying your book?
What’s the shape of your distribution on that? E.g. is there a fat tail on the possibility that it will sell very well?
What proportion of readers do you expect would change behaviour as a result of reading your book?
How should you adjust that for counterfactuals? (i.e. what proportion of those people would have ended up reading The Life You Can Save (TLYCS), Doing Good Better (DGB), or something else instead?)
How valuable is a counterfactual-adjusted reader who changes their behaviour?
How much of your time needs to be given up in order to achieve these outcomes?
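To make the shape of that model concrete, here is a minimal back-of-envelope sketch in Python. Every number in it is a made-up placeholder rather than an estimate of anything, and the fat-tail question above would really call for sampling sales from a heavy-tailed distribution rather than using a single point estimate:

```python
# Back-of-envelope expected-value sketch for the book decision.
# All inputs are hypothetical placeholders, not real estimates.

copies_sold = 20_000               # placeholder forecast of readers (point estimate)
p_behaviour_change = 0.02          # placeholder share of readers who change behaviour
p_counterfactual = 0.5             # placeholder share who would NOT otherwise have read TLYCS/DGB etc.
value_per_changed_reader = 1_000   # placeholder value (e.g. dollars to effective charities) per such reader
author_hours = 1_500               # placeholder hours diverted from the day job

expected_value = (copies_sold
                  * p_behaviour_change
                  * p_counterfactual
                  * value_per_changed_reader)
value_per_hour = expected_value / author_hours

print(f"Expected value of the book: ${expected_value:,.0f}")
print(f"Value per hour diverted from the day job: ${value_per_hour:,.0f}")
# Compare value_per_hour with the estimated value of an hour at the day job;
# if the day job wins under most plausible inputs, the opportunity cost argues against the book.
```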
I suspect the cruxiest of the above questions is the one about counterfactuals. Will you have a marketing strategy that enables you to reach people who would not have ended up reading another EA book anyway?
If not, my not-carefully-thought-through intuition is that it would be better for you to focus your time on your day job (assuming it’s high impact, which, from memory, I think it is). Which is a shame, because I would have liked to see your book!
Readers changing their altruistic behavior in ways that counterfactually improve the world is one way I see this book being valuable, but I think a larger component of its value would come from people better understanding what EA is about, and what EAs are doing and why. As above:
Existing EA writing is also generally aimed at an elite audience. I see why some people have decided to take that approach, but I also think it’s really important to have a presentation of these ideas grounded in common sense. If we ignore the general public, we leave EA’s popular conception to be defined by people who don’t understand our ideas very well.
How people who never get into EA view EA’s approach to the world still matters, and I think we’ve been neglecting this. I’m concerned about a growing dynamic where we’re increasingly misunderstood, where people who don’t actually substantively disagree with us reflexively counter EA efforts, and where people who would find EA ideas helpful don’t engage with them because their osmosis-driven impression of EA is mistaken.
I would state that last paragraph even more strongly: my hypothesis is that the views about EA held by people who will never decide to get into EA will ultimately have a larger effect on how EA impacts the world (both in magnitude and in direction) than the views of people who are already a part of EA communities today.
A group’s public perception pre-filters which kinds of people engage curiously with its ideas, and I think that force alone could be strong enough to make my prediction true. I also think a group’s public reputation affects the kinds of opportunities it can access for acting on its ideas, which may particularly shape the actual activities of a community largely focused on each person’s highest-impact opportunities.
Yes. The Life You Can Save and Doing Good Better are pretty old. I think it’s natural to write new content to clarify what EA is about.
I wonder if you could reduce the opportunity cost by farming out some of the background labor to (for lack of a better term) a research assistant? Seems like that might be a useful investment (depending on funding) to maximize your productivity and minimize time away from your object-level job.