Second-year electronic engineering student at the University of Southampton. I'm currently not that involved with EA, but I agree with its philosophy and I'm open to opportunities.
In my free time I like working on ambitious electronics projects, a little programming here and there, and playing the piano.
I’ve been thinking similar things. A few comments:
We might be able to justify a specifically exponential decay by assuming that the impact of our intervention will eventually be nullified by some (unknown) future event. If that event follows a Poisson process (i.e. it is equally likely to occur in any given year), then the probability that it hasn't yet occurred at a given point in the future decays exponentially.
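To make this concrete, here is a minimal sketch (the 2%/year rate is a made-up illustrative number, not a claim about any real intervention): under a constant hazard rate, the survival probability is exp(-rate * t), and a small Monte Carlo simulation agrees with that formula.

```python
import math
import random

def survival_analytic(rate, t):
    """P(nullifying event hasn't occurred by year t) under a Poisson
    process with constant rate `rate` per year: exp(-rate * t)."""
    return math.exp(-rate * t)

def survival_simulated(rate, t, trials=200_000, seed=1):
    """Monte Carlo check: draw exponential waiting times for the
    nullifying event and count how often it lands after year t."""
    rng = random.Random(seed)
    return sum(rng.expovariate(rate) > t for _ in range(trials)) / trials

# Example: a 2%/year chance of nullification gives a half-life of
# ln(2) / 0.02, i.e. roughly 35 years.
rate = 0.02
half_life = math.log(2) / rate
```

So the exponential form isn't an arbitrary modelling choice: it falls straight out of the "equal chance each year" assumption.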
The rate of decay is probably not universal, but depends on the intervention. For example, in evaluating the impact of preventing X tonnes of carbon emissions, we expect the carbon to be absorbed into the ocean over hundreds or thousands of years. By contrast, in trying to influence politics, our impact becomes very uncertain beyond a timescale of 5–10 years. We can therefore have a more certain impact further into the future by influencing climate change than by influencing politics. (There are obviously some big caveats here, but it illustrates the point.)
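One way to see why the decay rate matters so much: if impact decays exponentially at rate λ, the total expected impact integrated over all future time is 1/λ. A tiny sketch, with entirely made-up decay rates (1/500 per year for carbon, 1/7 per year for politics) chosen only to illustrate the orders of magnitude:

```python
def total_expected_impact(initial_impact, decay_rate):
    """Integral of initial_impact * exp(-decay_rate * t) for t from 0
    to infinity, which evaluates to initial_impact / decay_rate."""
    return initial_impact / decay_rate

# Illustrative (assumed) rates: carbon persists for centuries,
# political influence washes out within a decade.
carbon = total_expected_impact(1.0, 1 / 500)   # ~500 impact-years
politics = total_expected_impact(1.0, 1 / 7)   # ~7 impact-years
```

On these assumptions, the same initial impact is worth almost two orders of magnitude more in the slow-decay intervention, which is the intuition behind preferring long-persistence interventions.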
Chaos theory is relevant here: the complexity of political and social systems makes it impossible to predict the long-run outcome of actions taken now, which means the expected value of those actions decays over time. In theory, by clapping your hands you create atmospheric turbulence that will eventually drastically change weather systems on the other side of the world (e.g. causing or preventing tornadoes!). Of course, the expected impact is zero: causing a tornado is exactly as likely as preventing one.
Influencing extinction risk would have an impact over a much longer (perhaps indefinite) period of time.
I might write a separate post going into a bit more detail at some point.