I think this is actually quite a complex question. There's always some chance of value drift, so you can never put the probability of "giving up" at 0. If that probability is high enough, it may in fact be prudent to front-load your donations, so that you get as much out of yourself, with your current values, as possible.
If we take the data from here with zero grains of salt, you're actually less likely to experience value drift when donating 50% of income (~43.75% chance of value drift) than 10% (~63.64% chance of value drift). There are many possible reasons for this, such as consistency and justification effects, but the point is that the object-level question is complicated :).
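To make the comparison concrete, here's a toy expected-value sketch using the two drift figures above. The one-shot model (drifting means zero future donations, not drifting means donating at the pledged rate indefinitely) is my own simplification for illustration, not something implied by the data:

```python
# Toy comparison of expected donation share under the quoted drift rates.
# Assumption (mine, not from the data): drift is a single yes/no event,
# and someone who drifts donates nothing afterwards.

def expected_donation_share(pledge_rate: float, p_drift: float) -> float:
    """Expected fraction of income donated, given a one-shot drift risk."""
    return pledge_rate * (1.0 - p_drift)

low = expected_donation_share(0.10, 0.6364)   # 10% pledge, ~63.64% drift
high = expected_donation_share(0.50, 0.4375)  # 50% pledge, ~43.75% drift

print(f"10% pledge: {low:.4f}, 50% pledge: {high:.4f}")
```

Under this (crude) model, the 50% pledge dominates even before accounting for its lower drift rate, so the drift data only strengthens the case; the interesting question is what happens with more realistic models of partial drift and drift timing.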
I’m quite excited to see an impassioned case for more of a focus on systemic change in EA.
I used to be quite excited about interventions targeting growth or innovation, but I've recently become more worried about accelerating technological risks. Specific areas where I expect accelerated growth to have negative effects include:
Climate Change
AGI Risk
Nuclear and Biological Weapons Research
Cheaper weapons in general
I'm curious about your thoughts on the potential harms that could follow if the growth interventions are indeed successful.