Agreed, I’ll edit the post.
[Question] Is asking for lower pay a good way to donate money?
Hedging Grants And Donations
This roughly lines up with what I had in mind!
In a world in which people used the ITN framework as a way to do Fermi estimates of impact, I would have written “ITN isn’t the only way to do Fermi estimates of impact”, but my experience is that people don’t use it this way. I have almost never seen an ITN analysis with a conclusion that looks like “therefore, this is roughly X lives per dollar” (which is what I care about). But I agree that “Fermi estimates vs ITN” isn’t a good title either: what I argue for is closer to “Fermi estimates (including ITN_as_a_way_to_Fermi_estimate, which is sometimes pretty useful) vs ITN_people_do_in_practice”.
That’s an ordering!
It’s mostly analyses like the ones from 80k Hours, which don’t multiply the three together, that might make you think there is no ordering.
Is there a way I can make that more precise?
How would you compare these two interventions:
1: I=10, T=1, N=1
2: I=1, T=2, N=2
I feel like the best way to do that is to multiply things together.
And if you have error bars around I, T & N, then you can probably do something more precise, but still close in spirit to “multiply the three things together”.
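For concreteness, here is a minimal sketch of both versions, using the two example interventions above. The lognormal noise and the sigma value are illustrative assumptions of mine, not something the comparison depends on:

```python
import numpy as np

rng = np.random.default_rng(0)

def score(i, t, n):
    """Point estimate: just multiply the three factors together."""
    return i * t * n

def score_samples(i, t, n, sigma=0.5, samples=100_000):
    """Same product, but each factor gets multiplicative lognormal error bars."""
    factors = np.array([i, t, n], dtype=float)
    noise = rng.lognormal(mean=0.0, sigma=sigma, size=(samples, 3))
    return (factors * noise).prod(axis=1)

# Intervention 1: I=10, T=1, N=1  vs  Intervention 2: I=1, T=2, N=2
print(score(10, 1, 1), score(1, 2, 2))  # 10 vs 4: intervention 1 wins on the point estimate
# How often intervention 1 still comes out ahead once the error bars are included:
print((score_samples(10, 1, 1) > score_samples(1, 2, 2)).mean())
```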
I don’t understand how the robustness argument works; I couldn’t steelman it.
If you want to assess the priority of an intervention by breaking down its priority Q into I, T & N:
if you multiply them together, you haven’t made your estimate any more robust than you would have with any other breakdown.
if you don’t, then you can’t say anything about the overall priority of the intervention.
What’s your strategy for producing highly robust estimates of numerical quantities? How do you ground it? (And why would it work only with the ITN breakdown of Q, and not with any other breakdown?)
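For reference, this is the decomposition I have in mind when I say “multiply them together” (this is my reading of the standard 80k-style definitions: the units telescope, so the product is exactly the overall priority Q):

$$
\underbrace{\frac{\text{good done}}{\text{\% of problem solved}}}_{I}
\times
\underbrace{\frac{\text{\% of problem solved}}{\text{\% increase in resources}}}_{T}
\times
\underbrace{\frac{\text{\% increase in resources}}{\text{extra resources}}}_{N}
=
\frac{\text{good done}}{\text{extra resources}} = Q
$$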
Should you still use the ITN framework? [Red Teaming Contest]
I talked to people who think defaults should be higher. I really don’t know where they should be.
I put “fraction of the work your org. is doing” at 5%: I was thinking about a medium-sized AGI safety organization (there are around 10 of them, so 10% each seems sensible), and since I expect there to be many more in the future, I lowered that to 5%.
I put “how much are you speeding up your org.” at 1% because there are around 10 people doing core research in each org., but you are only slightly better than the second-best candidate who would have taken the job, so 1% seemed reasonable. I don’t expect this percentage to go down, because as organizations scale up, senior members become more important: having “better” senior researchers, even with hundreds of junior researchers around, would probably speed up progress quite a lot.
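To make the combination of these two defaults concrete (this is my framing of how they multiply; the calculator may compose things a bit differently):

$$
\underbrace{5\%}_{\text{fraction of the work your org. is doing}} \times \underbrace{1\%}_{\text{how much you speed up your org.}} = 5 \times 10^{-4} \approx 0.05\% \text{ of field-wide progress attributable to you}
$$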
Where do you think the defaults should be, and why?
I added this feature!
I made the text a bit clearer. As for the bug, it didn’t affect the end result of the Fermi estimate, but the way I computed the intermediate “probability of doom” was wrong: I forgot to take into account situations where AGI safety ended up being impossible… It is fixed now.
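To illustrate the kind of correction I mean (a stripped-down sketch with made-up numbers and my own variable names, not the calculator’s actual code):

```python
# Stripped-down sketch of the fix; numbers and names are illustrative, not the calculator's.
p_impossible = 0.1          # chance that AGI safety turns out to be impossible (made-up)
p_solved_if_possible = 0.5  # chance the problem gets solved in time, given it is solvable (made-up)

# Buggy version: ignored the branch where safety is impossible.
p_doom_buggy = 1 - p_solved_if_possible

# Fixed version: doom if safety is impossible, or if it is possible but not solved in time.
p_doom_fixed = p_impossible + (1 - p_impossible) * (1 - p_solved_if_possible)
```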
Thank you for the feedback!
Thank you for the feedback. It’s fixed now!
At first, I thought this would be distracting, as there are many orders of magnitude between the lowest “lives saved if you avoid extinction” estimates and the highest ones. But given that you’re not the first to ask for it, I think it would be a good idea to add this feature! I will probably add it soon.
How would you model these effects? I have two ideas:
add a section with how much you speed up AGI (but I’m not sure how I could break this down further)
add a section with how likely it would be for you to take resources away from other actions that could be used to save the world (either through better AI safety, or something else)
Is one of them what you had in mind? Do you have other ideas?
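Purely as an illustration, here is the simplest structure I can think of for the two ideas, treating each effect as an expected-value penalty subtracted from the baseline estimate (the structure and numbers are mine, not a proposal for the actual calculator):

```python
# Illustrative only: both effects modelled as expected-value penalties on the baseline estimate.
baseline_impact = 1.0              # impact before accounting for downsides (arbitrary units)
agi_speedup_harm = 0.1             # expected harm from speeding up AGI itself (made-up number)
resource_displacement_harm = 0.05  # expected harm from pulling resources away from other
                                   # world-saving work (made-up number)

net_impact = baseline_impact - agi_speedup_harm - resource_displacement_harm
```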
What drives this huge drop? Naively, the utility would be very close to 100%. (Do you mean “aligned AIs built in 100 years if humanity still exists by that point”, which includes extinction risk before 2123?)