[Question] Should I force myself to work on AGI alignment?

Hi, I’m an 18-year-old starting college in a week, studying computer engineering and mathematics. Since I have a technical bent, and AGI has a much higher probability of ending humanity this century (1/10, I think) than the causes I would rather work on (biorisk, at maybe 1/10,000), would the utility-positive thing to do be to force myself through an ML-alignment-focused PhD and become a researcher?

I am at a mid-tier university. I think I could force myself to do AI alignment, since I have some interest in it, just not as much as the average EA, and I wouldn’t find it as engaging. I also have an interest in starting a for-profit company, which most likely couldn’t happen in AGI alignment. I would rather work on a hardware/software combo for virus detection (biorisk), climate change, products for the developing world, other current problems, or problems that will emerge in the future.

Is it certain enough that AI alignment is so much more important that I should forgo what I think I will be good at and enjoy in order to pursue it?

Edit: a comment here gave some people the impression that I was posing a false dichotomy between “pursuing my passion” and doing AI alignment. I’ve removed that comment.