Let me illustrate my argument. Suppose there are two opportunities, A and B. Each of them contributes some value at each time step after they’ve been taken.
In the base timeline, A is never taken, and B is taken at time 2.
Now, it is time 1 and you have the option of taking A or B. Which should you pick?
In one sense, both are equally neglected, but in fact taking A is much better, because B will be taken very soon, whereas A will not.
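The comparison above can be sketched numerically. This is a toy model with assumed parameters (one unit of value per time step once taken, a ten-step horizon) chosen only to make the counterfactual asymmetry concrete:

```python
# Toy model: each opportunity yields 1 unit of value per time step
# once taken. Horizon and per-step value are illustrative assumptions.
T = 10  # assumed horizon (time steps)

def value(taken_at, horizon=T):
    """Total value delivered if the opportunity is taken at `taken_at`
    (None means it is never taken)."""
    if taken_at is None:
        return 0
    return max(0, horizon - taken_at)

# Base timeline: A is never taken, B is taken at time 2.
base = value(None) + value(2)

# You take A at time 1; B still gets taken at time 2 by someone else.
take_A = value(1) + value(2)

# You take B at time 1; A is still never taken.
take_B = value(None) + value(1)

print(take_A - base)  # counterfactual value of taking A: 9
print(take_B - base)  # counterfactual value of taking B: 1
```

Even though A and B look equally untaken at time 1, taking B only moves it forward by one step, so its counterfactual value is a single step of benefit, while taking A counts for the whole remaining horizon.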
The argument is that new technology is more likely to be like B, and any remaining opportunities in old technology are more likely to be like A (simply because if they were easy to do, we would have expected someone to do them already).
So even if most breakthroughs occur at the cutting edge, as long as we expect other people to make them soon anyway, and they are not so big that even a small speedup matters greatly, it can be better to find things that are more “persistently” neglected. (I used to call these concepts “persistent neglectedness” and “temporary neglectedness”, but I found that confusing.)
OK, I agree that makes sense as well—it now seems unclear which way it goes.
However, if you’re thinking from a career capital or more long-term future perspective (where transformative technologies are often the key lever), my guess is that EAs should still focus on learning about cutting-edge technologies.