What does ‘one dollar of value’ mean?
Alternate title: How Much A Dollar Cost
Intro
Recently, I’ve heard other EAs use some variation of the phrase “I think activity X is worth $10,000 in expected value,” or even just “X is worth $10,000.”
Frankly, I have no idea what this means.
A few possibilities:
1. A hypothetical altruistic puppetmaster of the global economy should be willing to spend up to $10k for X to happen.
2. EA as a whole should be willing to spend up to $10k for X to happen.
    - I.e., X is a good use of $10k given the current funding situation.
3. Some particular well-funded organization (say, OpenPhil) should be willing to spend up to $10k for X to happen.
4. If an EA was given $10k that they had to allocate in some way, they should spend it on X (and not on something else).
    - In other words, X is the best possible known use of $10k at the current moment.
5. The speaker personally would be willing to pay up to $10k for X to happen if there were no other funding source.
6. The speaker is literally willing to pay up to $10k for X, and has his/her Venmo app open.
7. X just generally seems like a good use of up to $10k.
8. Doing X seems morally equivalent to giving $10k to some (particular?) EA organization.
9. Doing X is morally equivalent to preventing 10,000/n malaria deaths, where n is GiveWell’s current point estimate of the cost of preventing a death in the developing world. (See the sketch after this list.)
10. Doing X is morally equivalent to whatever the speaker thinks $10k can accomplish.
11. There is no “actual” meaning, and “$1 of value” is just another way of saying ‘one util.’
12. There used to be an “actual” meaning, and maybe implicitly there still kinda is, but functionally it’s just #10 at this point.
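To make definition #9 concrete, here is a minimal sketch of the arithmetic it implies. The function name is made up for illustration, and the default cost-per-death figure is just the rough ~$5,000 estimate quoted later in this post, not an endorsement of any particular GiveWell number:

```python
# Definition #9: "$V of value" means "as good as preventing V / n malaria deaths",
# where n is GiveWell's current point estimate of the cost of preventing one death.

def equivalent_deaths_prevented(value_usd: float, cost_per_death_usd: float = 5_000) -> float:
    """Convert a dollars-of-value claim into its implied number of deaths prevented."""
    return value_usd / cost_per_death_usd

# "X is worth $10,000" would then read as "X is as good as preventing ~2 deaths."
print(equivalent_deaths_prevented(10_000))  # 2.0
```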
Consideration 1: clarity in communication
Most importantly, I think, everyone involved should understand what is being expressed. What seems bad, and what might be happening now, is that some people are using the phrase to mean one thing, some are using it to mean another, and some are just kinda going with the semi-coherent flow. For now, I’m primarily hoping and recommending that we simply state more specifically what we’re using “dollars of value” to refer to in a particular case.
For example:
“I would be willing to pay up to $60 to prevent a single chicken from being born on a factory farm”
If relevant, the speaker might even add “but I wouldn’t be willing to spend $600,000 to prevent 10,000 chickens from being born because I literally don’t have that much money.”
“Currently, OpenPhil* is willing to pay up to $1m to fund a project that we think reduces x-risk by 0.01%. We think we’d be willing to pay up to $10m for such a project, but we’d have to have a more in-depth discussion first”
*Remember, I’m making this up!
“If the US government were an altruistic, impartial, and rational agent, it would spend $500B per year on pandemic preparedness”
“Preventing one malaria death in the developing world empirically seems to cost around $5,000”
“Based on the fund’s previous grants, I think they should value this new grant proposal at $400k. However, I’d be willing to grant up to $1m if I were the only fund manager.”
“I would personally be indifferent between that Forum post getting written and having $300 magically appear in my bank account”
Consideration 2: measure (or imagine measuring) what matters
Even once expressive clarity is satisfied, it seems wise to use metrics that best approximate what we care about, which will of course vary by context. Since there is no “puppetmaster of the global economy” (to the best of my knowledge), for example, I’d guess definition #1 from the first list will rarely if ever be very useful.
Although it’s fallen out of fashion, the QALY seems clearer, more meaningful, and more useful than the “dollar of value”—in at least some circumstances. If we could find a QALY-like metric that represented a better-defined hedonic or moral quantity, and clearly allowed for suffering to count as disvalue, I’d be all for it.
A few days ago, I offered “Aaron-apple-minutes,” defined as ‘the moral value associated with my happiness arising from one minute of me eating an apple,’ as a comical example of how this might look. For my fellow hardcore hedonic utilitarians, maybe “one-minute activation of a single opioid receptor in a biological human” would be even better.
That said, this type of specificity isn’t always a virtue. I do think the “$X of value” unit has a lot going for it and should continue being used some of the time. Sometimes, it’s helpful to be able to cast aside specific predictions about how a particular action cashes out (“Aaron-apple-minutes”) and clearly state an individual’s or organization’s subjective assessment of the importance of some action or initiative. Further, money sometimes really is the action-relevant unit; we probably don’t want the Future Fund managers converting everything into opioid receptor activations when discussing whether or not to grant $10 million to some project.
Conclusion
At least in the short term, I think clarity of communication is the dominant and more pressing concern. “A dollar of value” can mean a lot of different things, so trading a bit of time, effort, and verbosity for more precise communication seems like a worthwhile exchange.
Comments
I don’t use this framing very often because I think it confuses more than enlightens, but I roughly mean something similar to #3:
13. I value this action roughly equivalently to “EA coffers” increasing by ~$10k.
Agreed, this appears to be the most neutral interpretation. Since the marginal value of increasing “EA coffers” depends on what EA as a whole spends its money on, it could function as a pretty useful metric for intuitively communicating value across cause areas imo.
A disadvantage might be that it’s not a very concrete metric, unlike, say, the QALY. Additionally, someone needs to have a somewhat accurate understanding of what the funding distribution in EA looks like (and what the funding at the margin is being used on!) for this metric to make any sense.
I’ve wondered this a lot myself and find this lack of clarity to be a recurring issue. I personally think something in the realm of #9 makes the most sense, and I define “$X of value” as “as good as shifting $X from a morally neutral use to being donated to GiveDirectly”. It helps that I roughly expect GiveDirectly to have linear returns, even in the range of billions spent. But I do try to make this explicit in a footnote or something when I discuss value.
Another good idea in the realm of #9 is how GiveDirectly defines their ROI:
The Kendrick Lamar joke at the top makes me cringe (like I’d feel uncomfortable sharing this post with my friends outside of EA). Otherwise I really like this post; I’m also confused about the precise meaning of “doing X is worth $$$.”