Thanks for the question, JP, it’s always good to define those probabilities. I personally estimate his chance of winning at at least 30% (and possibly up to 50%) due to Carrick’s fit, the unusually good fundraising, and the excellent campaign team. This is probably not one for people who want a definite win; there is so much at play. However, I am very enthusiastic about Carrick’s potential for a very large impact and think it’s worth the shot.
Caro
I think Carrick has a decent shot, since he is running for a new seat (no incumbent), grew up in the district, has a compelling personal narrative (escaping poverty and then choosing a life of service), and doesn’t seem to be facing any extremely strong competitors. But, because he’s new to Oregon politics, he does need to raise a lot of money to attract the attention and support of local stakeholders and supporters.
This is such a great summary, thanks, Joy!
Building on what Sara said, I would add that there is also an “invisible part” which I would name “building a thriving culture”, notably by organizing social events, or listening to people when they have issues in their work (and sometimes their personal lives) and finding ways to solve those. The idea is to continue building a high level of trust and engagement… and maybe make work fun too? :)
I appreciate the comments of @abrahamrowe on this as well as the discussion.
Just mentioning here that the Personal Assistant / Chief of Staff role really is a continuum, ranging from junior-PM-level to COO-level people, as highlighted in this article.
We could make a collection of pictures that capture the concept of “caring for future generations”. I have been moved and inspired by this picture of John Kerry signing the Paris Agreement with his granddaughter on his lap. He said that she represented future generations.
Congrats on starting this work! Those are great.
A particular ask: may we have one or several on “Future Generations”? I’ve been wanting to see (and maybe own) inspirational art around this for a while, and I haven’t encountered it in this format.
I would be interested in evidence-based ways to fight procrastination and akrasia. Ideally, these techniques would have long-lasting positive effects so that once you learn them and practice them, they become natural.
It seems like this service could work very well with the “Open threads” comments where a lot of people introduce themselves and talk about their uncertainties.
Welcome! It seems like your skills in NGO management are very much needed in EA projects! You could consider reading more about how to apply your expertise to high-impact causes and see if you come across exciting opportunities to work directly at an NGO or to consult for different organizations.
Thanks so much for writing this! I’m commenting so that more recent readers can see it. Wellbutrin/Bupropion can really be life-enhancing, with very limited (or no) side effects.
As a follow-up to my comment, I would strongly suggest having higher salaries for Personal Assistant/Chief of Staff people. They bring immense value and their salaries need to reflect that. I think this would increase the number of qualified and sustainably-motivated candidates.
A $50k hiring bonus seems really, really high to me, maybe too high; $10-20k would probably make more sense. I’ve edited my comment to say $20k instead of $50k, and for clarity. (Curious to know whether you think this is about right or not.)
I would be very excited about people working in cybersecurity to protect important information related to EA cause areas, such as in AI and in bio. This could definitely involve working for the US government, but it could also involve working for AI companies such as DeepMind, OpenAI, or Anthropic. There may be some risk of automation; in that case, learning how to use AI and ML within cybersecurity sounds very promising.
My impression, having worked in Operations at CHAI for two years and talked to several other operations people, is that we still have a very, very high need for operations people.
My intuition is that we have the following levels of need, on a scale from 0 to 10, where 0 means “we actually have too many operations people” and 10 means “we have such a dire need for operations people that an EA-affiliated or EA-adjacent org would pay a $20K hiring bonus just to hire that person (which I think is really high)”.
COO-type: 7. It’s hard to find those people, but the current people in operations can level up really fast and become those. (We have both capable and ambitious people!)
Manager-type: 8. These people are able to manage others very well, motivate them, keep them on track, and create and nurture a great, excellence-oriented culture. In small EA organizations, it’s quite likely that one can go from project manager to COO, but as those organizations grow, there will be an additional “manager” step. (EDIT from 01/18/2022: I added this category.)
Project manager-type: 7. A proactive person, very agent-y, team-oriented, and deeply aligned with the purpose of the org. A junior person out of uni can become this after 2 years of junior work in an EA org or a consulting-style company. We lack this kind of person at the moment, but it’s not dire yet.
A very specific type of project manager, the Personal Assistant/Chief of Staff: 9. This is an incredibly rare skill set that requires being in the “brain” of the person they help and multiplying their impact. It requires all the properties of the PM, plus a deep alignment with the person they’re helping and the “humility” to stay behind the scenes and do both high-level work and admin work.
Junior PM: 5. This is the “helpful intern” that an EA org can recruit to help with a project or an event. We have plenty of brilliant students EAger ( ;) ) to help, and their help is really welcome. They’ll become 3 (and hopefully 4!) soon!
(Might those be good projects for some of the groups in the AGI safety program and the AI long-term governance program?)
Brilliant idea!
For AI existential risk, there are a few improvements that would make a big difference:
Merge relevant pages, because we now have:
AI take-over (though this one also tackles the risks of job automation)
Improve the quality of the content. Some ideas: include the most recent developments in AI safety research (e.g., cite ARCHES, Rohin Shah’s work, and many, many others), and include updates on timelines from the recent OpenPhil reports. It would also be great to make a few of the concepts clearer, for example the “sources of risk”.
Some ideas:
The publication of “Superintelligence” by Nick Bostrom in July 2014 and its successful communication have been hugely impactful in establishing the field of AI safety, notably by getting recommendations from Bill Gates, Stephen Hawking, and Elon Musk.
The Future of Life Institute’s organization of the “Beneficial AI conferences”, including facilitating the signing of the Open Letter on Artificial Intelligence and the Asilomar Conference, which established foundational AI principles.
Probably the launch of several organizations with a focus on AI safety. See more here (but these need prioritization and attribution to the EA movement).
I felt the same thing when I discovered (and met) EAs :-). Welcome!
I’ve found the EA Forum really lively and thriving these last few months. It’s really a pleasure hanging out here! I also feel more at ease commenting/posting thanks to the aliveness and welcoming community. Congrats to the CEA team for doing an awesome job of developing a great space for EA discussions!
Strongly agree with this! Having only the first author get all the karma seems unfair to the co-author(s) and doesn’t provide the appropriate incentives. Maybe the first author gets 50% of the karma and the following ones share the rest, as in the sketch below.
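A minimal sketch of that split rule in Python, just to make the arithmetic concrete. The function name and the equal split among co-authors are my own assumptions for illustration, not an existing Forum feature:

```python
def split_karma(total_karma: int, num_authors: int) -> list[float]:
    """Hypothetical split: first author gets 50%, co-authors share the rest equally."""
    if num_authors == 1:
        return [float(total_karma)]
    first_author_share = total_karma * 0.5
    coauthor_share = (total_karma - first_author_share) / (num_authors - 1)
    return [first_author_share] + [coauthor_share] * (num_authors - 1)

# Example: a post earning 30 karma with 3 authors -> [15.0, 7.5, 7.5]
print(split_karma(30, 3))
```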