Executive summary: A grantmaker on Open Philanthropy’s AI governance team gives a candid personal overview of what it’s like to work on Open Phil’s AI teams—arguing that the roles offer unusually high impact, autonomy, and talented colleagues, but also involve ambiguity, indirect impact, and challenges with feedback loops, work-life boundaries, and career progression.
Key points:
High-impact opportunity: Open Philanthropy (OP) is the largest philanthropic funder in AI safety, offering staff exceptional leverage over how hundreds of millions in funding are allocated across the field.
Strong culture and autonomy: The AI teams foster a culture of warmth, intellectual independence, and personal responsibility—staff are encouraged to form and defend their own views and can quickly take ownership of significant grants or strategic areas.
Professional growth and collaboration: OP actively supports professional development through coaching, conferences, and gradually scaling responsibility, and the author highlights unusually competent and kind colleagues.
Tradeoffs of grantmaking: Compared to direct work, grantmaking shapes the field more broadly but sacrifices hands-on control and clear feedback; the author urges applicants to assess whether they prefer breadth and coordination over deep, individual contribution.
Challenges and risks: The author notes long and uncertain feedback loops, social complications from funding relationships, risk of “take atrophy,” and potential imposter syndrome when surrounded by highly impressive peers.
Fit considerations: Applicants well-suited to OP are comfortable with ambiguity, responsibility, and slow-moving institutional processes; those who prefer fast, concrete, research-driven environments may find the roles frustrating.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.