An undergrad at the University of Maryland, College Park, majoring in math.
After finishing The Sequences at the end of 9th grade, I started following the EA community and changed my career plans to AI alignment. If anyone would like to work with me on this, PM me!
I’m currently starting the EA group at the University of Maryland, College Park.
When you start talking about Silicon Valley in particular, you run into confounders like AI, which has a high chance of killing everyone. But if we condition on that going well, or assume the relevant people won’t be working on it, then yes, that does seem like a useful activity. Note, though, that Silicon Valley activities are not very neglected, and you can likely do better than them by pushing EA money (not necessarily people[1]) into research areas that are more prone to market failures or are otherwise too “weird” for others to believe in.
On the former, vaccine development & distribution and gene drives are obvious ones which come to mind; both have a commons problem. For the latter, intelligence enhancement.
Why not people? I think EA has a very bad track record of extreme groupthink, caused by a severe lack of intellectual diversity & humility. This is obviously not good when you’re trying to increase the productivity of a field or research endeavor.