Another relevant recent post is Important Between-Cause Considerations: things every EA should know about. For example, the author writes:

> Furthermore, I think that the EA community needs to do more to ensure that EAs can easily become acquainted with [important between-cause considerations (IBCs)], by producing a greater quantity of educational content that could appeal to a wider range of people. This could include short(ish) videos, online courses, or simplified write-ups. An EA movement where most EAs have at least a high-level understanding of all known IBCs should be a movement where people are more aligned to the highest value cause areas (whatever these might be), and ultimately a movement that does more good.
>
> [...] In light of this these are my proposed next steps:
>
> [...]
>
> - Do a stock take of all the resources that are currently available to learn about the IBCs
> - Identify where further content might be useful to inform a wider range of people of the IBCs, and determine what type of content this should be
> - Potentially collaborate with others to produce this content and disseminate to the EA community (I am very aware of the danger of doing harm at this stage and would mitigate this risk or may not engage in this stage myself if necessary)
If you are significantly worried that your top options might not work out, or are just significantly interested in exploring options that make more use of your teaching-ish skillset, it may well be worth reaching out to the post’s author, Jack Malde.
---
Another person it could be worth talking to is alexrjl. Alex did the Teach First program (source) but now does a range of things related to forecasting and EA-aligned research, including making a series of videos to help people understand and get better at forecasting. (I don’t know Alex personally and haven’t watched those videos myself.)
---
Disclaimer: I expect more could be said on this topic, perhaps especially by people who know more about your full set of interests, skills, and plans. These comments are just a relatively quick response. I also haven’t tailored my comment to the fact you have skills and interest in AI and policy.