On “don’t have much time left”: this depends on a very specific and precise question. If you think that AGI will happen in 5 years, I’d agree that advancing wisdom and intelligence probably isn’t particularly useful. However, if AGI happens to be 30-100+ years away, then it really starts to be useful. Even if there’s a <30% chance that AGI is 30+ years away, that’s considerable.
On very short time frames, “education about AI safety” seems urgent, though it fits more tenuously under “wisdom and intelligence”.
+1 for Stefan’s point.