Further possible projects on EA reform
As part of this project on reforms, we collected a rough list of potential projects for EA organizational reform. Each idea was pretty interesting to at least one of us (Julia Wise, Ozzie Gooen, Sam Donald), but we don’t necessarily agree about them.
This list represents a lightly-edited snapshot of projects we were considering around July 2023, which we listed in order to get feedback from others on how to prioritize. Some of these were completed as part of the reform project, but most are still available if someone wants to make them happen.
There’s an appendix with a rough grid of projects by our guess at importance and tractability.
Background
Key Factors
Factors that might influence which projects, if any, people would favor:
Centralization vs. Decentralization
How much should EA aim to be more centralized / closely integrated vs. more like a loose network?
Big vs. Small Changes
Are existing orgs basically doing good stuff and just need some adjustments, or are some things in seriously bad shape?
Short-term vs. Long-term
How much do you want to focus on things that could get done in the next few months vs. changes that would take more time and resources?
Risk appetite
Is EA already solid enough that we should mostly aim to preserve its value and be risk-averse, or is most of the impact still in the future, making it more important to stay nimble and wary of losing the ability to jump at opportunities?
Issue Tractability
How much are dynamics like sexual misconduct within organizations’ power to influence, vs. mostly a broad social problem that orgs aren’t going to be able to change much?
Long-term Overhead
How much operations/infrastructure/management overhead do we want to aim for? Do we think that EA gets this balance about right, or could use significantly more or less?
If you really don’t want much extra spending per project, then most proposals to “spend resources improving things” won’t work.
There are trade-offs between moving quickly and cheaply vs. making long-term investments and minimizing risk.
Boards
Advice on board composition
Scope: small
Action:
Make recommendations to EA organizations about their board composition.
We have compiled advice from people with knowledge of boards generally
Much of it is to the effect that small, narrow boards should be larger and have a broader range of skills
We can pass this summary on to orgs with such boards
What can we offer here that orgs aren’t already going to do on their own?
Collect advice that many orgs could benefit from
Area where we don’t have much to offer:
Custom advice for orgs in unusual situations
Julia’s take: someone considering board changes at orgs in unusual situations should read through the advice we compile, but not expect it to be that different from what they’ve probably already heard
Steps with some organizations
[some excerpted parts about communications with specific organizations]
If you could pick a couple of people who’d give the best advice on possible changes these orgs should make to the board, who would they be?
Small organizations with no board or very basic board: work out if we have useful advice to give here
Existing work / resources:
Investigations
FTX investigation
Project scope: medium
Action:
Find someone to run an investigation into how EA individuals and organizations could have better handled the FTX situation.
Barriers:
People who did things that were bad, or that would make them look bad, won’t want to tell you about them. Everyone’s lawyers will have told them not to talk about anything.
Existing work / resources:
EV’s investigation has a defined scope that won’t be relevant to all the things EAs want to know, and it won’t necessarily publish any of its results.
Oliver Habryka and friends have done a discussion series informally looking at what happened and how trust within EA related to the problem.
New general-purpose investigation team
Project scope: large
Action:
Set up a new team or organization for doing investigations into major problem behavior in EA.
Potential Responsibilities:
Donor vetting
Donor is making their money in some obviously shady way
Donor is making their money in a way that’s not obviously shady but could be
Organization vetting
Fraud / major misalignment at an EA org
More standard HR problems at an EA org
Suboptimal performance at an EA org
Sources of inspiration:
Auditing firms / internal audits
Ombuds in governments where the ombuds supposedly has some teeth, e.g. Finland
US courts system investigating its own staff / police anticorruption units policing other police
Project Preparation:
Sketch out more concretely what types of people might work there and how much that would cost
Which people?
Someone with law experience? Not that any given lawyer can advise about everything, but someone who knows what kind of specific advice to seek
Someone with investigation experience in other contexts (law enforcement, private investigation?)
Forensic accounting? What Kroll does?
Someone who knows the EA landscape well
How full-time?
Do they mostly work somewhere else and do some contracting as needed?
How many investigations a year would there be demand for?
Ask different orgs/teams in EA how often they’d want to hire such a service if it existed
Try to work out how likely they would have been to detect known past problems
Try to work out what the carrots/sticks would be for orgs being investigated to share info: other orgs won’t work with you if you don’t agree to allow such investigations?
Medium-term goal:
Work out if there’s something worth developing further
Alternative to the CEA community health team
(note that one of the authors, Julia, works on this team)
Project scope: medium
Action:
Set up a new team or set of consultants that complements the CEA Community Health Team.
Value:
Especially for cases involving people in EV or OP leadership, because people shouldn’t expect a team to correctly police its own bosses / main funders. Or for concerns about members of the community health team itself.
This would either be an alternative system people could use from the start in such cases, or a second opinion.
Possible forms:
Hire a specific person, or contract them part-time as needed
Develop a network of professionals (HR people, legal investigators, ombudspeople)
Develop a network of community people from other communities (religious communities, housing groups, other social groups)
Develop a network of EAs with demonstrated community judgment who aren’t beholden to EV or OP and unlikely to become so
The community health team has been looking for such people, and it’s very hard to find this combination.
Barriers
Several existing projects along these lines have gone poorly, either by their people burning out or by making decisions that stakeholders viewed as badly done. If you’re considering something in this area, Julia suggests contacting her (julia.wise@centreforeffectivealtruism.org) or any of the other people on the community health team for lessons learned about our own work and other efforts we’ve seen.
Alternatives:
The community health team could eventually spin out of CEA and/or EV and try to get independent funding.
Existing work / resources:
HR staff at relevant organizations
Local groups sometimes have designated contact people
Three community members serving as volunteer contact people after the Owen situation, listed here
Proposal on the EA Forum for an EA Ombudsperson, which overlaps with this idea.
Whistleblowing
(See also the investigation team above, which would likely receive whistleblowing reports)
Improve awareness of whistleblowing resources
[November 2023: here is the post we produced about whistleblowing.]
Project scope: small
Actions:
Improve access to and knowledge of existing institutions
Methods:
Produce a public explainer about different bodies you can go to and what they do (e.g. a government regulatory agency, HR at a company, funders, community health team).
Could post to the EA Forum at regular intervals (maybe once a year), or have it as a special post suggested for new users (so that everyone sees it at least once).
Material for org staff, parallel to workers’ rights notices: “Here’s a list of internal and external places you can go to if you think there’s a problem in the org”
Post information about this explainer at EA events. For example, materials for new college events or EAGs could link to it.
List of different people who community members can consider going to, since different community members will trust different ones.
Identify barriers to people reporting things
Lack of knowledge?
Worry about repercussions from their employer?
A lawyer thought it was viable to have a specific whistleblowing platform carved out in an org’s NDA (though the org wouldn’t necessarily want to do this). It would make clearer to org staff that they’re not breaking their NDA by reporting that way.
Could be good to have multiple organizations (especially funders) make clear that they’ll strongly disapprove of orgs that retaliate against staff for whistleblowing
But it’s hard to draw the distinction between “valid whistleblowing that shouldn’t be discouraged” and “breaking confidentiality without sufficient public benefit to consider it whistleblowing.”
Risks:
Orgs or people that sign up to receive reports may get a lot of problems they don’t know how to handle or want to handle.
If they accept cases that aren’t a good fit for them, that can lead to bad outcomes.
If they decline cases that aren’t a good fit for them, people will feel disappointed.
Hard to prevent some forms of retaliation (e.g. giving bad references) by employers if they know staff have reported them
Existing alternatives:
HR at organizations
Reporting problems to funders. OP has an EthicsPoint contact option on their contact page now
Community health team invites reports about interpersonal problems
Coordination
Improve coordination between orgs
Project scope: medium (ongoing)
Action:
Various small improvements to coordination between EA orgs about program work (separate from operations).
Goals:
Create common knowledge of gaps where no one owns something. Maybe it’s ok to have a gap because it’s not that important, or maybe it’s a serious problem. Still better to spell it out than have people assume someone else owns it.
Better sync-up between teams/staff working on similar areas
Example interventions:
More frequent calls between staff working on similar areas.
Check if less-closely-tied funders want more coordination
Existing work / resources:
Meta Coordination Forum (formerly Coordination Forum or Leaders Forum)
Internal communications infrastructure
Project scope: medium
Action:
Set up a new project to own/improve internal communications in EA.
Along the lines of Google’s internal comms system.
Potential Activities:
What sorts of things would be useful to track / share?
Databases of internal activities
Meeting videos / notes
Slack-type discussions
Blog-type discussions, including pre-posts.
Questions/Considerations:
Are there clear clusters of EA orgs that should be in communication?
Maybe “EA meta + community organizations”
AI safety organizations might be a separate cluster, though there is overlap.
Is there anyone who would be a good fit to do something like this?
There’s a subtle line between “Internal communications infrastructure” and “Better coordination”. “Internal communications infrastructure” could include internal podcasts/recordings and internal events.
Information about what funders/leaders want seems particularly important; that alone might deliver 80% of the benefit.
Existing work / resources:
The ops version of this already exists and seems well run.
There is an EA Managers Slack, but it’s not very active or populated.
Operations & org policies
Ops support for newer/smaller (<~10 person) orgs
Project scope: medium
Action: Spend time promoting operational support for EA organizations.
Goals:
Reduce avoidable errors in legal and HR practices that are costly later.
Reduce how bad it is when conflicts of interest, bullying, or harassment come up in workplaces.
Methods:
Find out if there’s a disconnect between how much funders would be happy to see spent on ops and what grant applicants perceive. If there’s a disconnect, try to correct the perception or encourage funders to correct the perception.
One ops staff member who’s advised other ops people writes: “It’d be nice if there were some mechanism to provide an incentive here, e.g. funders evaluating orgs/applicants on this axis more, or providing checklists/recommended resources etc. Wouldn’t necessarily want it to go in a ‘you can’t have a grant, your ops is too bad’ direction, but it could be useful to have something like ‘sure, your idea is promising, have some money. However we think you should pay a lot of attention to management & ops, we’ll ask how this is going if you apply for a renewal, here are some helpful resources/advisors’.”
See if funders would offer easier-to-get funding packages for specific risk reduction work like getting an HR professional or lawyer to look over org policies if they’ve never done that.
Learn what is blocking small / new organizations from putting staff handbook policies (like COI and anti-harassment policies) in place
Learn more about what’s blocking small orgs from getting advising — several of them said it’s helpful to have a more experienced person talk you through
The org doesn’t prioritize funding ops advising, so they don’t have budget to hire outside advisors?
They don’t know how to find advisors they’d be happy to pay for?
AntiEntropy aims to provide this
Advocate for funding for completing AntiEntropy’s resource library, if small organizations indicate this would be useful to them. (Are there other existing libraries they could use instead?)
Address the lack of experienced, EA-aligned ops people (most potential ops people have alignment or ops experience, but not both). One ops person writes:
“E.g. a respected org with a large ops team & good training processes in place doing some kind of ‘tour of duty’ hiring where they hire people for a year and expect them to move on after that.
“Maybe some ‘things you need to succeed working in the EA community’ resources/training to help external experienced people know what they’ll need to succeed”
More sharing of existing knowledge within EA ops / legal space:
Some paid hours on running the EA ops slack, to better curate recommended resources / contractors, or compile written resources
In-house lawyers at EA orgs could refer smaller orgs to other legal advisors. (E.g. how do you find a good lawyer for X area? It’s hard to know where to start if you’re not already connected to a network.)
Existing work / resources:
AntiEntropy—advising and written resources
EA services directory (EASE)
Julia’s uncertainty: why would you go with an EA provider for many of these?
Support/advice from Charity Entrepreneurship to its incubees
Could their written resources be shared more widely?
COI policy sharing
[November 2023: Here is the resource we produced.]
Project scope: small
Action:
EA organizations share COI policies as examples
Staff hiring / promotion / firing decisions — templates for this are already widely available
Grantmaking — this is a less-standard thing to have a policy for
Admissions (for offices, and for events like conferences and retreats) — this is a less-standard thing to have a policy for
Reasoning:
It’s common to have a COI policy for hiring and management. But it’s less common to have one for other areas of decision-making in EA, and it’s genuinely hard to write a good policy.
This is a subset of “Ops support” that we want to highlight.
Implementation options:
On the low-effort side, we could have a simple one-time project to encourage adopting such policies. If more time is spent, there could be more encouragement, vetting, and continued promotion.
Challenges:
Employees might not take these seriously, and organizations may not remember to ask people about COIs.
Weak policies may encourage disclosure of COIs but not create real change in decision-making.
Organizations may not want to implement these policies.
Existing work / resources:
Guide on COI policies from AntiEntropy
Advice for new grantmakers
Project scope: small
Action: Help new EA grantmaking programs avoid community and operational problems
Recent examples:
Methods:
Content about power dynamics — from OP, EA Funds, and community health?
Advice from more established grantmakers on “surprising things that go wrong” / “stuff we wish we had known earlier in our grantmaking” — from OP and EA Funds grantmakers?
Limitations:
These new grantmaking programs are comparatively small in the funding they provide.
These organizations might not want the advice.
Designate an “EA COO”
Project scope: large
Action: Assign a semi-formal entity to pursue efforts that improve the state of operations and management in a large set of EA organizations.
Alternative:
Instead of there being a “COO”, there could be something like an “organizational health” team, which has similar duties.
Responsibilities:
Encourage/oversee potential high-value mergers & acquisitions
Identify key correlated risks and opportunities (see also Risk management)
Help identify good talent for CEO or board member roles at other organizations.
Help ensure that orgs have good board members and leadership.
Limitations:
If there’s not much interest by funders in changing the state of EA, there might not be much for this person/group to do.
It will be difficult to balance giving them official authority with maintaining sufficient oversight of them. It’s easy for such a position to be either powerless or tyrannical.
Probably would require a very strong candidate to make this worthwhile.
Practical and legal barriers to getting enough information about what’s going on at different organizations.
Considerations:
A “COO” would have a broader mandate than many of the other proposals here. That could be good (allowing for flexibility and optimization) or bad (they might spend their time avoiding the important things).
Inspiration / parallels:
Discussion of COOs in the public sector, specifically around the US government. Arguably, some public institutions are made up of many small turfs, so it would be useful to have some sort of overarching leadership and responsibility.
Some hiring CEA has done has aimed to provide a resource across the entire EA ecosystem.
EA communications staff aim to work on comms/branding for the space generally, rather than for CEA specifically.
Community health and special projects team aims to prevent community problems across the EA ecosystem.
CEA listed a position for EA strategy coordinator in spring 2021, but did not fill it.
Code of conduct for EA leaders
Project scope: small to medium
Reasoning:
Holding leaders to higher standards than most community members is worthwhile
Misbehavior by powerful people often has worse direct effects on others
Misbehavior by powerful people has a worse reputational effect on the EA space
Any given institution’s HR rules don’t cover the full range of situations where misbehavior by a powerful person can do harm
Target audience
Leadership staff and board members of organizations
Key funders or grantmakers (or all grantmakers?)
Others would be free to opt in to hold themselves to these standards
Possible versions
Norm-setting: maybe some leaders personally endorse the code and encourage their staff to do likewise, but there’s no roster of who endorses it beyond a few examples
More formal: Individuals actually sign onto it, publicly or privately
Possible components:
Honesty and integrity
Around professional work
Around research
etc
Care around power dynamics
Not having sexual / romantic interactions with people who report to you or are mentored by you
Extra care around consent
Avoiding situations where others feel a transactional relationship is going on related to the power you hold in EA, like feeling pressured to do favors for you
Challenges:
Hard to write specifics that people will agree are a good idea
No clear enforcement mechanism
Unclear where the boundary of EA-relevant behavior is
Maybe makes staff feel stifled / overly policed
It’s probably not legal for an employer to punish an employee for not following these rules outside a work context, at least in California
Inspiration / parallels:
Codes of conduct for clergy (e.g. Unitarian Universalist, Reconstructionist Rabbinical Association)
Codes of conduct for judges
CEA’s Guiding Principles
EA organization landscape assessment
Project scope: large
Action:
Have some committee/organization do an analysis of the EA organization landscape, with respect to operational/bureaucratic/strategic issues.
There have been some analyses of specific EA orgs, but not many of the broader organization ecosystem.
Details:
This would seek to judge the ecosystem on competence and suggest ways to improve.
Identify key bottlenecks and gaps between organizations.
Management consultants offer a lot of services to do things like this.
Particular attention would likely be paid to management/leadership, especially the funders. (Similar to how, in organizational consulting, understanding organizational leadership is often critical.)
Alternative:
Instead of having experienced consultants / managers do said assessment, perhaps there could be an internal journalist/consultant or similar who spends a long time in EA and tries writing about gaps and opportunities.
Uncertainties:
It’s not clear if we should evaluate the “EA landscape” as an organization, or as a movement. Perhaps we would want analyses of both.
Challenges:
Many consultants are mediocre, even ones at large institutions. It might be very hard to find great ones.
Consultants can be very expensive.
There would have to be a lot of buy-in from EA leaders to ensure that any results would be acted upon.
It might be difficult to carve out a clear outline of what the “EA Ecosystem” is.
If it’s the case that most useful analysis is organization-specific, then this project might not make sense.
Existing work / resources:
A lot of management consulting involves applying a set of known strategy frameworks to institutions. For example, the Balanced Scorecard method, or the BCG Growth-Share matrix.
The Bill and Melinda Gates Foundation has been known to work with McKinsey and BCG.
Many management consultants work with governments, in situations where there are also many diverse organizations with different interests. This might not be that different from the EA ecosystem.
Risk management
EA risk management
Project scope: medium to large
Action:
Set up a new formal or informal project to monitor and list multi-organization risks.
Considerations:
This might be a part-time project for staff at existing orgs/projects.
Challenges:
If all this project does is risk assessment, it’s not obvious what the results of said risk assessment would be. This would likely need buy-in from other leadership to take corresponding actions.
Many risks at organizations involve stuff lawyers would advise the org not to talk about.
Existing resources:
Some organizations have done their own risk management plans. It might be possible to have some wins from collecting/filtering/evaluating/promoting existing thinking here.
Fundraising assistance
EA-internal fundraising support
Project scope: medium to large
Action:
New project to organize and maintain funding advocates/liaisons within EA, who help EA organizations work with EA funders to manage expectations and shared goals.
Purpose:
Many organizations seem to find fundraising very stressful. In my (Ozzie’s) experience, some orgs make (what seem to me like) poor decisions around this.
I think ideally, the funders would do more relationship management themselves. Having other groups do this is more awkward, but it could be better than nothing if that’s the only option.
Act, in part, as multi-project fundraisers
Parallel:
Consultancies that help academics manage the grant system.
Fundraising training for GH&D charities
Project scope: small to medium
Action:
Provide training for global health and development charity fundraising.
If there’s sufficient interest, try to provide a single training as a test. Possibly continue / scale up later.
Reasoning:
If there are global health and development orgs that currently depend on EA funding but could get more mainstream funding (e.g. from Gates or Hewlett foundations), training could help them access that funding and make them less dependent on the EA funding monoculture.
This would also leave more EA funding available for projects that appeal less to mainstream donors.
Idea initially suggested by someone from Center for Global Development who thought there were EA-funded orgs that could benefit from this type of training.
Current status:
Julia has asked around GiveWell and OP, and it doesn’t seem that they fund orgs that are in this position.
Could ask Charity Entrepreneurship’s incubated charities.
EA fundraising expansion
Project scope: large
Action:
Expansion of existing fundraising organizations, or starting new ones. Try to find donors who haven’t yet given much to EA organizations but could in the future.
Goal:
Expand the general EA donor base. The EA funding scene could be much healthier with more donors.
Reasoning:
Many “EA donors” give to many “EA organizations”, so finding a new EA donor is a public good for EA organizations. Organizations will do some exploration here themselves, but they’re under-incentivized to do it, since much of the benefit would go to other organizations.
Potential targets:
Government grant organizations (try to figure out which would be a good fit for which EA organizations)
Tech billionaires
Bay Area tech scene
Other philanthropic foundations
Possible steps:
Share information about EA organizations
Get more staff / former staff of EA orgs to use Glassdoor
Project scope: small
Action: Promote Glassdoor use for staff and former staff of EA organizations / projects.
Methods:
Find an org that’s willing to try encouraging its current and former staff to write reviews.
Then decide whether to encourage other organizations to do the same, and encourage it more broadly cross-EA.
Pros:
Transparency. Getting more data could help prospective staff make better decisions about where to work.
In particular for orgs that are burning people at an unusual rate: make it easier for warnings to surface where job applicants are likely to see them.
Cons:
Many EA orgs are tiny, so employees won’t feel like they have much privacy when posting reviews, and their employers might retaliate or withhold favors like good references.
Glassdoor has a reputation for allowing the removal of negative reviews, and some companies dilute negative reviews by encouraging current staff to post positive ones.
In one New Zealand defamation case, Glassdoor revealed employees’ identities. UK and Australian laws also seem particularly strict on defamation, so residents of some countries might be more at risk than others.
For orgs that have changed a lot over time, maybe a bunch of the info reflects things that are no longer relevant.
In some cases you care about the last 3 projects an org founder has worked at and not just the current one. Maybe it doesn’t help if the problems were mostly at another org.
Existing work / resources:
Post encouraging this
EA organization information center
Project scope: medium to large
Action: Set up a new project to organize and track reviews and/or data about EA organizations.
Potential activities:
Collect reviews/reports of:
What it’s like to work in EA organizations
Like “Glassdoor, for EA Orgs”
What it’s like to work with EA organizations
Like “Yelp, for EA Orgs”
How good is the work of EA organizations?
Like “Guidestar, for EA donors”
Share data as needed.
Ideally, a lot of this would be public, but much would remain private.
Where sensitive, on a need-to-know basis.
Grantmakers
Individuals considering working with these orgs/groups.
Collect and organize “objective” data (see the sketch after this list)
Org deliverables, like papers published
Lists of employees / leadership
990 reports (or equivalent reports) and corresponding data
Employee retention rates
Salary levels
Average EA Forum karma
Lists of yearly reports
See Faunalytics, Crunchbase, and IMDb Pro as somewhat similar examples.
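To make the data side concrete, here is a minimal sketch of what one record in such a center might look like. This is a hypothetical Python schema, not part of any existing tool; every field name is our invention:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OrgRecord:
    """One EA organization's entry in a hypothetical info center (all fields illustrative)."""
    name: str
    leadership: list[str] = field(default_factory=list)      # lists of employees / leadership
    deliverables: list[str] = field(default_factory=list)    # e.g. papers published
    filings: list[str] = field(default_factory=list)         # 990 reports (or equivalents)
    retention_rate: Optional[float] = None                   # employee retention rate
    salary_band_usd: Optional[tuple[int, int]] = None        # salary levels (min, max)
    avg_forum_karma: Optional[float] = None                  # average EA Forum karma
    yearly_reports: list[str] = field(default_factory=list)  # links to annual reports
    public: bool = True                                      # sensitive records shared need-to-know
```

A real version would also need access controls for the need-to-know cases described above.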
Further work:
Could cover grantmaking processes/orgs
Could cover key individuals, even if they’ve moved across orgs
Challenges:
The software engineering portion of this could be costly.
The evaluative parts could face pushback or capture. For example, if this group published some damning info on Org X and Org X pushed back hard, that could become a pain to deal with.
Sensitive information might be difficult to manage/handle.
Public information could make EA look bad.
Even if we think it’s worth it, EAs might be hesitant to share.
Meta
Further organizational improvement research
Project scope: large
Action
Invest further resources in the kind of thinking in this document, but done better. This could mean research in the next few months, or longer-term efforts.
Background
Our team has had limited time to work on this: less than one FTE over a few months. This document covers large topics that could be investigated and vetted much more thoroughly.
Potential activities:
Research:
Many of the specific projects listed above.
Institutions comparable to EA. Find best practices from other groups we could learn from.
“Methods of effectively improving large institutions”, to see if we can apply those principles to EA.
Interviewing:
Understand EAs with power, who might be able to make changes.
Work with them to craft proposals.
This includes funders, senior EAs, and potential candidates to start projects.
More concrete proposals:
Pay EA research contracting agencies to do specific investigations.
Encourage EA researchers to write about these issues on the EA Forum, maybe with a prize.
Allocate more time from OP/CEA to consider options.
Appendix 1: Importance and tractability grid
Sorting ideas by importance and tractability (this estimate is by Julia, others would do it differently!)
| | Easier to implement | Harder to implement |
| --- | --- | --- |
| Bigger benefit (if it goes well) | Ops support for newer/smaller orgs | Investigation team; EA COO; Risk management person/team; More fundraising, e.g. in Bay; More capacity on reform work |
| Medium benefit | Awareness of whistleblowing resources; Alternative to community health team; Coordination between orgs; Advice for new grantmakers; Fundraising training for GH&D charities; Get staff to use Glassdoor | FTX investigation; Ops / management evaluation; EA org info center |
| Smaller benefit | Advice on board composition; Mapping of board members; Code of conduct for leaders; Internal communications infrastructure; EA-internal fundraising support | |
Appendix 2: Areas of possible improvements
Instead of imagining a list of specific projects, we could attempt to break down the areas of potential improvement into a set of domains. It could later be interesting to identify which domains are the most important, neglected, and tractable.
As with much of this document, this is a fairly rough draft. We could improve it if there were interest.
Community Health
Do community members feel like they belong to a community or subcommunity that is healthy?
Do community members feel like they can trust leadership?
Do community members feel empowered and safe speaking out about important crises? Do they know who to speak to?
Are there unhealthy factions/disagreements?
Are community-funded resources adequate?
Are community members having health/mental/social problems that can be effectively addressed?
Sexual Harassment/Abuse
Do we have institutions with strong justified trust that people can report incidents to?
Are the above institutions capable of taking reasonable actions and fighting back against adversarial actors?
Do potentially vulnerable people feel safe around EA community members?
Are potential abusers properly incentivized to not commit abuse?
Is there widespread knowledge of good practices and reporting procedures?
Are community members in key positions properly trained to ensure that problems don’t happen?
Is there proper legal counsel around these situations?
“EA Program”
How much should we focus on improving each of the following? Where are we weak/strong?
Epistemics
Philosophical/Academic Foundations
Cause Prioritization
Execution
Adversarial Actors
Large actors (FTX, major donors)
Do we have measures in place that would allow adequate action in cases like FTX? This might require significant time and legal support. (In the case of FTX, this might have involved a mixture of investigation and resisting pressure from FTX to be silenced.)
Small actors (Small organizations, individuals)
There are now many small EA organizations and funded individuals. Do we have adequate measures to make sure that we can detect severe problems by these groups?
Organizational Health
Do we have a strong management class?
How clear are management’s intentions?
Organization culture
Can employees speak up to management, or adequately raise issues to higher power figures?
Do organizations have cultures of competence and meritocracy?
Are poor performers kept too long?
Do organizations have good levels of candidness?
For example, if leadership were to rank all organization projects in terms of value, would employees be comfortable with that? What if employees are ranked?
Different firms have very different expectations.
Do we have adequate/strong departments/skills in:
HR
PR
IT
Marketing
Legal
Office Management
Operations
Executive / Top Leadership
Internal Communications
Professional Development / Training
Program Evaluation