This post summarises my work last year to evaluate the merits of establishing a legal service to support the EA community. Some important notes up front:
I am not the only person who has been considering this idea. Most of the thoughts below aren’t original, and some are explicitly from other people considering this idea. That said, this post doesn’t set out a consensus view of lawyers involved in EA. Other lawyers and other orgs likely have a different (and potentially more concrete) view on the merits of the proposal. My comments below do not speak for anyone else who has been considering a similar proposal.
Most of this post (other than the background and current state section) was written prior to the collapse of FTX and subsequent fraud allegations. Ideally I would have revisited the analysis in light of those events, but I have not had the capacity to do so. I’m posting this work in its current form as it may be a useful resource for other lawyers or EA orgs thinking about unmet legal needs in the community.
The final section (Summary of Community Feedback) may not be particularly useful to most readers; you can get the gist from the earlier sections.
I would welcome feedback/suggestions (either in the comments, by DM, or by email).
My high level views/credence
At the time I wrote the initial draft of this post:
I was pretty confident there is an unmet need, but I had not done much work to estimate the lower and upper bounds for the benefit I’d expect different interventions to yield—so I wasn’t sure how cost-competitive this would be vis-a-vis the counterfactual marginal funding opportunity.
I generally heard positive feedback from the community, but it was not universal and I’d be wary of setting this infrastructure up unilaterally.
I did feel reasonably confident that a triage and referral service would probably be worth trialling, especially if referrals could be made to skilled specialist lawyers offering pro-bono services. Lawyers are expensive, and it didn’t seem unreasonable to think that funding a single lawyer (probably not too expensive) to work as a clearinghouse could have a decent multiplier in terms of the dollar value of legal services provided to EA orgs. (Note I had not considered the impact of those legal services, merely their market cost.)
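To make the ‘multiplier’ intuition concrete, here is a minimal back-of-the-envelope sketch. Every figure in it (the clearinghouse cost, the number of matters referred, and the value per matter) is a hypothetical placeholder chosen purely for illustration, not an estimate from my scoping work:

```python
# Illustrative only: all numbers are hypothetical placeholders, not estimates from this post.
clearinghouse_cost = 120_000         # annual cost of funding one lawyer as a clearinghouse (USD, assumed)
matters_referred_per_year = 40       # pro-bono matters successfully referred out (assumed)
avg_market_value_per_matter = 8_000  # market cost of the legal work on each matter (USD, assumed)

value_of_services = matters_referred_per_year * avg_market_value_per_matter
multiplier = value_of_services / clearinghouse_cost
print(f"Market value of referred services: ${value_of_services:,.0f}")
print(f"Multiplier on clearinghouse funding: {multiplier:.1f}x")
# Note: as in the post, this values services at market cost and says nothing about their impact.
```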
How do I feel now?
I’m really unsure how to update my views following the collapse of FTX and the other recent controversies in the community. Anyone taking this work forward should consider this carefully.
I remain reasonably confident that EA orgs (or at least small and newly established ones) allocate insufficient resources to legal risk and that there is an unmet need.
I’m more concerned about our vulnerability as a community to correlated risk. I guess I’d be a little less enthusiastic about ‘EA lawyers’ or an ‘EA legal consultancy’ (though I still think the idea might be positive), but probably still just as positive about a triage and referral service to make it easier to access pro-bono offerings from law firms.
While I remain very enthusiastic about EA as an ideology or EA as a question, I’m not totally sure I feel as enthusiastic about EA as a community whose ideas I generally reflexively support (influenced a little bit by this post and by the ‘EA as a Method and not a Result’ section of this post), and so I feel a little more uncertain about the value of community infrastructure generally.
History and current state of my work on this
In March 2022 I posted “Legal support for EA orgs—useful?”. The key claim was that large EA orgs have pretty clear legal needs (which they fill with in-house counsel and external legal resource), that smaller/newly established orgs probably have those needs but under-resource them, and that a virtual in-house counsel service would be valuable.
Several other lawyers (and former lawyers) within EA had already been looking into the merits of an EA legal service, although with different priors about the sorts of legal needs that exist. My impression—not necessarily an accurate restatement of their view—was that they placed less weight on a need for ‘virtual in-house counsel’ services and more weight on the possibility that legal services available on the market wouldn’t meet the needs of EA orgs aiming to do fast-moving, ambitious work.
I was subsequently funded by a regrant from the FTX Future Fund to scope out this need. (I was also supported with desk space by UCL EA in London and as a Prague Fall Season visitor.)
Together with some of the other interested lawyers, I sought feedback from EA orgs and from EA-adjacent lawyers and legal professionals.
Orgs:
Pretty wide range of functions (well-established ‘meta’ and ‘doing’ orgs, incubators, pretty wide range of cause areas, some newly established orgs, some prospective founders and grant recipients).
Did not speak to many commercially-focussed EA-adjacent organisations, so probably have a weaker sense of those legal needs.
Focussed mostly on orgs in the UK and USA. I think I was mistaken not to seek information about unmet legal needs in a wider set of jurisdictions even though most EA activity currently happens in these two jurisdictions.
EA-adjacent lawyers:
Largely on a referral basis, so out of the full set of EA-adjacent lawyers and legal professionals (including immigration advisors, tax advisors, etc.), we mostly spoke with a highly-engaged, well-connected subset.
Of those lawyers, a mix of lawyers already doing operational legal work for EA orgs, academic research or policy work, and lawyers looking to transition into direct EA work.
Also spoke with several highly-motivated law students.
Spoke with a pretty wide set of attendees at EAGx Australia 2022, EAG San Francisco 2022 and EAG Washington DC 2022.
I ran an open survey promoted on the EA Forum and in some EA Slack channels. The survey was largely qualitative, suffers from voluntary response bias, and had a small number of responses (n=29).
I met with lawyers with pro-bono expertise outside of effective altruism, including partners responsible for pro bono at several large law firms in the US, UK, and New Zealand. Also met with staff working at several pro-bono clearinghouses. Some of these lawyers expressed interest in working with small/newly established EA clients pro-bono with an EA lawyer or org coordinating these requests, although nothing has been formalised so far.
Another lawyer was giving serious consideration to starting up a California-based EA legal practice. At this point my focus turned to establishing a service that would employ one or more English lawyers to provide advice to orgs with UK legal needs.
My work on this project largely stopped in December 2022. This was primarily due to burnout and some minor personal matters, although I also wanted to reflect on whether this work should continue and whether it should be taken forward by another org or a lawyer with a different set of skills. (I am continuing to provide legal support to a few EA orgs directly.)
For the reasons set out in this post I remain of the view that there are unmet legal needs in the EA ecosystem, and that it would be useful for them to be met. I do not have a view as to whether this is cost-competitive vis-a-vis the counterfactual marginal funding opportunity.
As noted above, almost all of this post was written shortly prior to the FTX crisis (and other recent controversial events). Before any concrete action is taken to start up an EA legal services org, it would be prudent to revisit this assessment in light of discussions within the community since then.
If another EA lawyer or EA org suspects they are well-placed to take this work forward, I would encourage them to reach out[1]; I would be happy to share the work done to date, make introductions to folks I’ve spoken with so far, etc. If that does not happen, then I would potentially consider pitching some version of this service (probably just the triage and referral service) to funders.
There is unmet legal need in Effective Altruism
It sometimes costs too much to engage a lawyer:
Established, well-funded EA organisations use lawyers. They hire external and in-house counsel. At scale, EA orgs do not rely solely on generalist/operations staff to consider legal risk.
New, small, or poorly-funded EA organisations’ use of lawyers is mixed. A significant portion of these orgs are reluctant to engage lawyers. Reasons for this appear to include:
Financial costs
Genuinely constrained funding (this may be a signal that the work the org is doing is not valuable, but may also reflect capacity constraints within large funders).
Pressure from donors (whether actual or perceived) to keep overheads low.
Not planning for legal expenses when budgeting and seeking funding.
Psychological barriers to returning to funders with a request for more funding.
Overestimating the complexity of a matter / the cost of getting legal advice.
Non-financial costs
Anticipated difficulty in finding a suitable lawyer / ‘ugh field’ around the task.
Time cost to explain the legal issue and, in some cases, the unique ways this org/EA orgs generally trade off risk and reward.
Conflict with desire to appear talented/knowledgeable.
Movement-wide legal needs might not be met by individual orgs
We received feedback that it would be useful to have a movement-wide view of legal risks and opportunities. For example:
I expect EA-orgs are going to have atypical legal needs, ranging from non-traditional governance structures, to wanting to do beneficial collaboration without falling foul of anti-trust. I think having a single org that can serve as a ‘one stop shop’ for these kinds of questions would be a net asset to the community—expertise and intuition will centralize in this org over time, making its value greater than the sum of its parts. If an org like this doesn’t exist, then status quo means a lot of legal queries will get distributed around a load of different lawyers and firms, without the opportunity for having a central view of the evolving landscape of legal issues that EA orgs face.
Some EA orgs want advice that quantifies legal risk, and possibly which frames it in an EA/rationalist way:
More ‘precise’ risk estimation. (This is a claim about precision, not accuracy—a lawyer may have an accurate mental model of the distribution of outcomes, but default to vague, standard legal terms like “real risk” or “more than minor”.)
Differentiation between legal risk and other kinds of risk (e.g. reputational risk).
Risk neutrality.
Accounting for legal risks/harms that accrue to other parts of the EA movement.
Bayesian reasoning.
Stating epistemic statuses.
Explicitly considering whether they have been sufficiently curious.
Use of the ‘outside view’.
Lawyers and clients have diverging incentives
We had some feedback that external counsel:
were professionally incentivised (by regulators, internal risk departments, or insurers) to give advice that was imprecise but accurate;
would steer clear of work (especially pro-bono work) that created a risk of bad optics with major corporate clients;
responded slowly to smaller/pro bono clients, or assigned their work to junior staff;
felt unapproachable.
Evaluation of potential interventions
Funding orgs to get legal advice
Paying for or subsidising legal advice would probably increase the use of lawyers by EA orgs.
Where orgs receiving legal funding would not have counterfactually engaged a lawyer, this funding is distortionary. Ideally, it would only be provided to orgs that are expected to under-purchase legal services. This might look like identifying orgs that are comparatively risk-loving, have inexperienced leadership or operations staff, or have donors that are very sensitive to overhead. But this rewards poor decision-making, would likely be awkward, would be unreliable, and would consume a lot of grant-making resources.
In some proportion of cases, the recipient organisation would have engaged a lawyer under the counterfactual. Here, the funding would either have the impact of an unrestricted marginal donation to that org—or where an org accurately reports funding gaps, would potentially ‘funge’ with other donations that would counterfactually have gone to that org but are no longer required.
Funding alone would not directly address non-financial costs to engage lawyers, divergent incentives, or the differences in reasoning style between EAs and most lawyers.
Starting one or more law firms
Insofar as there are lawyers who are familiar with EA, value-aligned, and willing to adopt rationalist/EA frameworks in their assessments of legal risk, starting a firm could plausibly improve the quality of legal services available to EA orgs.
However, the law diverges significantly across practice areas and jurisdictions. Lawyers in private practice tend to specialise in a few (often related) practice areas, and cross-jurisdictional practice is fairly rare. I expect that a specialist in a practice area (compared to an equivalently smart and motivated generalist) has:
a stronger recall of specific legal rules in that area;
a better ability to predict the behaviour of courts, regulators, and counterparties in that area;
best-practice approaches to solve lots of standard problems;
higher productivity, by not needing to research as much and—I presume—by being able to make confident decisions faster.
There is not a small set of practice areas that neatly maps onto EA legal needs such that a lawyer could naturally be an EA legal specialist. If an EA firm wanted to provide the EA movement with the benefits of specialisation, it would need to be reasonably large.
A specialised boutique firm
One response to this tradeoff might be for a lawyer and funder to focus on a single jurisdiction and on practice areas that more clearly map onto the principal legal needs of a smaller, more homogeneous set of high-impact clients. That could potentially deliver a significant benefit to those clients, and would be worthwhile insofar as there is significant legal need in a narrower subset of EA.
[Pre FTX collapse] Another lawyer was giving serious consideration to starting up a California-based EA legal practice. He described that as follows:
A legal services organization to represent EA startups and organizations and publish related content. The organization will help with formation and provide “in-house type” legal services to EA startups and organizations too new or too small to have in-house attorneys. In addition, the legal services organization will seek to address and publish guidance on legal issues common across many EA orgs, particularly those unique to EA orgs. In addition to providing direct representation, the legal services organization will develop referral networks to specialists (e.g., immigration lawyers) and work with outside counsel as needed.
An EA-focussed general practice firm
The foregone benefits of specialisation might arguably be offset in most lawyer-client engagements by the benefits of EA familiarity, value alignment, and willingness to use EA/rationalist frameworks. If this was almost always true, then the EA org should primarily select lawyers for those EA features, not for expertise in a given area of law—and an EA law firm would do the most good by applying those EA features (which other firms might not optimise for) to almost any area of the law that EA orgs need help with.
However, in most cases the foregone benefits of specialisation won’t be offset by EA familiarity, value alignment, and willingness to use EA/rationalist frameworks. This was supported by feedback from in-house lawyers and leaders in EA orgs who were clear that, all things being equal, their orgs want domain expertise over these other EA features.
A hybrid model
The tradeoff between specialisation and EA familiarity, value alignment, and willingness to use EA/rationalist frameworks will not be uniform across all types of legal questions and practice areas. An EA firm could add value by providing EA-flavoured advice on the kinds of legal questions that do not need the expertise of a specialist lawyer, and by helping the EA organisations to instruct external counsel where specialist advice is better.
Larger organisations tend to adopt a variation of this model by employing in-house counsel. This model can be thought of as offering a ‘virtual’ in-house counsel service to EA organisations.
Starting a triage and referral service
The effort involved in working out whether to get legal advice, and where to get it from, probably discourages EA organisations from getting legal advice. Similarly, EA organisations that need legal advice sometimes spend a lot of time finding and briefing a lawyer, and explaining how what the organisation wants from the advice differs from what other nonprofit or start-up clients typically want.
Many lawyers (including some within or adjacent to the EA community) are willing to provide pro-bono advice. There is currently no way to easily identify and connect with those lawyers.
A minimal version of this could involve:
compiling a database of lawyers able to provide pro-bono support to EA organisations—and periodically checking in with those lawyers to confirm they are still able to assist;
compiling a list of lawyers who have specialised expertise and are recommended by other EAs, and seeking feedback from EA orgs to keep this updated over time.
A higher-effort option could involve:
intake conversations with an organisation to help them identify the legal issues, and to weigh in on the need for/value of external legal advice;
connecting an organisation with one or more recommended lawyers, including helping the organisation to provide clear instructions to the lawyer;
developing relationships with regularly-used external lawyers—potentially negotiating a discounted rate, helping familiarise these lawyers with the style of reasoning that EA organisations expect, and providing aggregated feedback from clients;
where appropriate, reviewing advice to offer a second opinion, provide an EA-framed view on the advice, and form a community-wide view of legal risk.
Producing/commissioning community resources
Several EA organisations have expressed interest in resources that provide general legal information from an EA perspective on topics that founders and newly-established or small EA orgs would typically be interested in. I had feedback that this would be useful, but not especially useful as a standalone intervention.
The AntiEntropy Resource Portal may be a good place to host these resources. The portal is open-access, so care would need to be taken with content that could increase legal risk to EA orgs if made public.
External lawyers could potentially produce some of these resources as part of their pro-bono work, but would need an EA organisation to act as a client for this purpose.
Intervention risks
Several respondents were concerned that an EA legal service could harm EA organisations by:
providing bad advice, especially in a practice area where a lawyer doesn’t have significant expertise;
causing legal risk to be correlated across EA organisations;
crowding out other sources of legal advice for the EA community, including other potential EA-focussed legal services or pro-bono relationships with law firms.
Bad advice
This outcome happens where the advice that an org receives leads them to make a worse decision than they counterfactually would have made, either because they would have:
made a better decision without the advice; or
sought better advice elsewhere.
The first case can probably be reduced to an acceptable level by checking that whoever gives advice via the EA legal service is a competent lawyer and follows reasonable epistemic practices, including noting uncertainties in advice or gaps in experience. A bad lawyer might routinely give worse advice than a smart layperson with good epistemics, but there isn’t any reason to expect that a smart and epistemically honest lawyer would underperform that standard.
Reducing the risk of an org foregoing better advice elsewhere is trickier because it is not enough to check that the lawyer giving advice is competent and has good epistemics. Evaluating the risk in each case would involve estimating the likelihood the org would seek advice elsewhere and the likelihood that this advice would be better than the advice provided by an EA legal service. Instead, an acceptable way of managing the risk could involve lawyers explicitly considering how well-placed they are to provide advice, how critical the issue is for the org or the community, and how realistic it would be for the org to seek specialised advice.
Where an org does not have the budget or inclination to seek specialised advice, a competent generalist lawyer can still add value by providing advice with a clearly-identified credence and encouraging the organisation to consider whether seeking a firmer opinion from a specialist would be appropriate.
Correlated risk
This risk was best described by a respondent to my survey:
One possibility is that an EA-branded legal service might quickly become the default for many organizations in the community, which might lead people to make correlated and self-reinforcing mistakes compared to the counterfactual setting in which everyone finds separate counsel (e.g., as a result of founders hearing from many other EA founders that entity design choice A is superior to choice B, every org in the EA community becomes confident in choosing A over B even if B is optimal).
There seem to be three kinds of correlated risk:
bad advice being widely adopted by EA orgs;
advice specific to one org’s circumstances being adopted more widely, particularly in circumstances that make the advice inappropriate; and
approaches that are good in expectation for any individual org being bad when adopted across many EA orgs, simply because they increase the movement’s exposure to those approaches failing or ceasing to be available.
All three cases likely exist to some extent already—both by virtue of organisations leaning towards ‘default’ options, and also by organisations sharing informal advice and know-how with each other.
The first case is simply a multiplier of the ‘bad advice’ risk discussed above, which adds weight to that concern.
An EA legal services org could plausibly reduce the risk of the second and third cases by:
collecting and publishing best practices, and where appropriate making them available for community comment with explanations provided;
reducing the barriers to checking whether common-knowledge advice that has worked well in other orgs is suitable for a new org’s circumstances;
maintaining a community-wide view of legal risks.
Crowding out the EA legal market
The benefit of EA organisations seeking legal advice is not wholly contingent on an EA legal service, as some organisations would engage lawyers (some for a fee, some on a pro bono basis) under the counterfactual. This may reduce the benefit of an EA legal service, but does not necessarily make it harmful.
However, if other lawyers see that there is an EA legal service and think that there is little benefit (either commercially or altruistically) in targeting the EA market, then there will be fewer lawyers for EA organisations to select from than under the counterfactual. This might lead to less specialised advice being available to orgs, to bottlenecks arising from limited access to lawyers, or to higher prices for EA orgs.
For this reason, an EA legal services org should focus on building a network of lawyers who are available to provide services to EA orgs, rather than on trying to directly meet every conceivable legal need. An aspirational goal would be to influence legal advice more generally and to develop and model best practices for other lawyers and new lawyers, providing more lawyers for EA organisations to select from.
Summary of community feedback
Note: all feedback was provided prior to the collapse of FTX, and does not factor in lessons learned by the community since then. Some limited post-FTX comments are included in brackets.
Degree of legal need
Feedback on this point was not uniform. While all respondents I spoke to believed that effective altruism had some legal needs, respondents were divided on how significant and neglected those legal needs were. Some respondents reported difficulty meeting their legal needs, while others felt that their needs were sufficiently served by pro-bono and paid-for counsel.
I asked some survey questions to very roughly consider the degree of unmet legal need:[2]
Is your sense that there are currently—or will soon be—unmet legal needs within EA?
“Significant unmet legal needs” – 17% of survey respondents, 19% of respondents to this question
“Some unmet legal needs” – 72% of survey respondents, 81% of respondents to this question
“No real unmet legal needs” – 0%
“I have no idea at all” – 0%
No response – 10% of survey respondents
Have you observed any promising projects not go ahead or suffer significant delays because of legal issues?
Respondent identified a project that failed or was delayed – 17% of survey respondents, 50% of respondents to this question
Respondent suspected this was likely, but had not observed this – 3.5% of survey respondents, 10% of respondents to this question
Respondent had not observed this – 13.7% of survey respondents, 40% of respondents to this question
No response – 65% of survey respondents
Have you observed any EA orgs make poor decisions about legal risk? If so, were those decisions solely bad in expectation—or did they result in actual adverse consequences?
Respondent identified an adverse legal consequence − 3.5% of survey respondents, 14% of respondents to this question
Respondent identified a poor decision on expectation – 20.5% of survey respondents, 86% of respondents to this question
No response – 76% of survey respondents
If you needed an external lawyer, how easy do you think it would be to find and engage one?
Respondent thought it would be easy – 14% of survey respondents, 28.5% of respondents to this question
Respondent did not think it would be easy – 27.5% of survey respondents, 57% of respondents to this question
Respondent thought it would be easy to find a lawyer, but cost-prohibitive to engage – 7% of survey respondents, 14% of respondents to this question
No response – 51% of survey respondents
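A quick note on reading the figures above: each answer is reported both as a share of all 29 survey respondents and as a share of those who answered that particular question. A minimal sketch of the arithmetic, using the first question as an example (the respondent counts are back-calculated from the reported percentages, so treat them as approximate):

```python
# The two percentages in each row: share of all survey respondents (n=29)
# versus share of those who answered that particular question.
# Counts are back-calculated from the reported percentages, so approximate.
n_survey = 29      # total survey respondents (reported above)
n_question = 26    # approx. number who answered the first question (inferred)
n_significant = 5  # approx. "significant unmet legal needs" answers (inferred)

print(f"Share of all respondents: {n_significant / n_survey:.0%}")              # ~17%
print(f"Share of those answering the question: {n_significant / n_question:.0%}")  # ~19%
```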
I also asked orgs to report their approximate legal spend:
· USD 5,000
· USD 5,000
· USD 20,000
· USD 30,000 to 50,000
· GBP 40,000
· USD 50,000
· USD 3,000,000.
Areas of law mentioned by respondents
Below, I have listed areas of law in approximate[3] order of how often they were mentioned by the EA orgs I spoke with and by respondents to my survey. I think this data provides some limited indication of the relative frequency of legal need in different areas. However, it is of limited use because the frequency with which different needs were mentioned does not clearly say anything about the degree of legal risk to EA orgs, the ability of lawyers to mitigate that risk, or the need for an EA-specific org to assist.
· Tax
· Employment
· Immigration
· Charity law (including charity registration) - n.b. could be considered to overlap with ‘tax’ above
· Choosing a structure for a new org or project
· Governance / internal policies
· Data protection / privacy
· Antitrust / competition law
· Fiscal sponsorship—n.b., again could be seen as part of ‘charity’ or ‘tax’ above
· Real estate (purchasing and leasing)
· General compliance issues
· Regulatory advice
· Corporate law
· Integrity and conduct / probity
· Intellectual Property
· Commercial contracts / procurement
· Securities law and crypto
· Defamation/disparagement
· Wills and estates for donors
· Impact litigation
· Help with legislative drafting
· User terms / consumer protection
· Modern slavery
· Child protection
· Financial services / investment advisory
· Stockholder activism
Jurisdictions
I expected that almost all of the legal need would be for United States and United Kingdom advice. While most of the current legal need is in these jurisdictions, there was a greater interest in legal support elsewhere than I expected.
Respondents reported having questions about the law in:
· India
· China
· Czech Republic
· Singapore
· Germany
· Nigeria
· Australia
· New Zealand
· Switzerland
· Mexico
· Cayman Islands
· Indonesia
· Vietnam
EA-specific legal issues
Respondents also identified some legal issues that are common to many EA organisations, some of which are relatively unique to EA organisations. In these domains, it may be useful to have specialised legal advice or an organisation to review and publish best practices in these areas.
Boundaries of charitable / tax exempt status
Charity laws are not cause-neutral, and it can be ambiguous whether some EA interventions fall within existing legal concepts of charity.
This can be especially relevant where an org wants to fund a for-profit business, engage in political activity, or provide private benefit (beyond a reasonable salary) to people involved in a project.
This concern included both regulatory and reputational risk – as one respondent put it:
Non-traditional uses of charitable money could bring negative attention from the media or charities regulators, possibly cascading from one organisation to another and harming the whole movement.
[While not directly describing the last few months in the EA movement, this comment feels reasonably prescient.]
Non-standard governance arrangements in EA organisations
While non-standard structures that do not require leaders in a commercial organisation to maximise profit are increasingly available:
some respondents anticipate a need for governance structures that require leaders to act responsibly when making decisions with the potential for significant impact outside of the company (e.g. in companies developing AI systems); and
it may not always be straightforward to structure an organisation in a way that allows it to act cause neutrally (i.e. being able to stop doing work that is ineffective, and pivot to a conceptually different use of the organisation’s resources).
Antitrust / competition law in safety-critical areas
I won’t describe this in depth, but see:
In the US context Cullen O’Keefe’s Antitrust-Compliant AI Industry Self-Regulation. In summary, anticompetitive mutual restraints on trade are prima facie unlawful, making coordination not to develop unsafe AI risky. Similar legal issues would apply in a biorisk context.
Regranting and fiscal sponsorship
It is important for orgs regranting funds or engaging in a fiscal sponsorship arrangement to know what is clearly permitted and also to be able to make risk-proportionate decisions about regrants and activities where the position the regulator (or a court) would take is not clear.
Where grants are being made to individuals or to orgs that do not have tax-exempt status, the tax status of the grant in the hands of the recipient can be unclear (and can hinge on the nature of the specific grant).
Engaging contractors abroad
Multiple respondents were concerned about how often EA orgs defaulted to engaging offshore staff as contractors. This had the potential to reduce the effectiveness of the staff member (who might face adverse financial or administrative consequences as a result) or to breach local law, accruing reputational risk.
Preventing, and managing fallout from, bad conduct within or adjacent to EA
Several respondents considered that managing risks in this area well was important to the continued growth and impact of the community.
One respondent gave the example of Ben Delo, an EA-adjacent public figure who faced criminal liability for failing to comply with anti-money laundering law, suggesting that EA could be vulnerable to reputational risk if someone more closely linked to EA were to commit a more serious offence.
[The original draft of this post was prepared before the FTX fraud/insolvency became known. I’ve since talked to folks who wondered whether additional legal support might have helped orgs think better about risks arising from FTX funding/association. Maybe so, but it feels pretty hubristic to claim this in hindsight; I sure didn’t consider the possibility that SBF mightn’t be a trustworthy actor, although perhaps a typical lawyer would have been more objective.]
Managing conflicts of interest (actual or apparent) within EA
EA is a small community, and some respondents felt that EA orgs needed help to build their capacity to manage actual or apparent conflicts of interest.
[Plenty of discussion of this on the forum post-FTX crisis—although not detailed consideration of whether legal resource would specifically be helpful.]
Drafting note: the following five issues were identified by another lawyer interested in this project, not by me.
Impact Certificates
Addressing securities law and tax law issues to enable a strong market for impact certificates.
Securities on Large Prizes
Addressing securities law and tax exemption issues to enable securities trading on shares of large prizes.
Prediction Markets
Addressing gambling and CFTC issues to enable prediction markets at scale.
Binding AI Developers
Finding ways to bind AI developers to take or not take certain actions in response to other developers’ actions—how to structure these to be legally binding and how to avoid antitrust issues.
Compensation-related Governance
Some AI orgs in particular have difficulty determining how to set attractive compensation for talented technical people without running into nonprofit problems. A solution to this may involve appropriate structuring.
Desired and undesired attributes of EA lawyers/legal services
EA modes of thinking
Respondents commonly mentioned that lawyers (and other professional service providers, like accountants) are reluctant to give precise estimates of risk. Some respondents also suggested that lawyers might lack the skills needed to give precise estimates.
very few lawyers are willing to put probabilities on risks, so they’ll just say “I advise against X,” but what we need is “If you do X then the risk of A is probably 1%-10% and the risk of B is <1% and the risk of C is maybe 1%-5%.” So would be nice you could do some calibration training etc. if you haven’t already.
I also heard that it would be valuable for lawyers to:
apply Bayesian reasoning
clearly identify epistemic statuses or credences
examine whether they are being sufficiently curious
explicitly consider the outside view
An operations person at an established EA org identified the following features that it looked for when engaging lawyers:
While accurate legal advice is not too hard to find, it can be hard to find experts who engage well with what a usual EA approach to risk assessment would look like. What [organisation] tends to look for is:
- lawyers who differentiate between legal risk and other types of risk (e.g. reputational risk)
- lawyers who are willing to walk through the mechanics of bad-case-scenarios (e.g. the way you could get fined for X is if you fired someone then they filed for unemployment and then that triggered an investigation by Z agency etc.)
- willingness to quantify risk (even if it’s very large bands like 1-40%)
Principal-agent problems and value alignment
A lawyer with a background advising EA organisations as external counsel described an internal culture within law firms that promotes giving risk-averse advice. I heard that while there is a competitive incentive to deliver good advice to avoid losing business, private practice lawyers may still have an incentive to give as little firm advice as possible while still being able to charge for the work.
I heard from some respondents that they expected an EA lawyer to have higher intrinsic motivation to give correct, useful advice, although this was contingent on finding genuinely value aligned lawyers.
On the other hand, I also heard a concern that an EA lawyer might lack professional distance critical to giving good legal advice:
I do think it’s useful for lawyers, especially outside lawyers, to have some emotional and philosophical distance from their clients. I can see issues if an EA legal service provider is too embedded in the community, such that they have trouble providing support and assessment dispassionately.
I heard from some respondents that they would potentially be more forthcoming and honest with a value-aligned lawyer. One respondent described needing to decide whether to take an opportunity for impact that had some low-but-not-precisely-known probability of being unlawful and some unknown distribution of potential consequences, and noted:
EA orgs are unusually risk neutral and therefore willing to incur risk if the altruistic impact is sufficiently high, but law firms often fail to account for this… [getting this kind of advice] requires unusually high trust and a shared understanding of the org’s risk neutrality mindset.
[Note that post-FTX, there is probably a legitimate debate around the risk-neutrality mindset—and I’m not confident I endorse taking a risk-neutral/expected-value mindset to all legal issues.]
Some respondents felt that value alignment was largely unnecessary:
EA alignment is rarely relevant to legal issues in an organization, with the possible exception of corporate strategy for companies like [identifying information removed], and funding decisions for nonprofits (though I think those are not usually the province of lawyers, except to structure and document grants appropriately, which doesn’t strike me as needing EA alignment).
Degree of specialisation, and familiarity with EA/longtermism
Some respondents thought it would be useful for their lawyers to have EA or longtermist context. A particular benefit was the convenience of having an easily identifiable person to ask legal questions, and to feel comfortable doing so. Other respondents suggested this was not necessary.
I also heard that an EA legal services organisation would be of limited use, or might be harmful, because it would lack deep expertise in all of the practice areas in which EA orgs would need support.
I think that it’s not easy to provide excellent legal advice, and I’m also worried about legal services being substandard and making EA orgs worse than the counterfactual.
The whole point of paying a lawyer is that you are paying an expert. “Generalist” help can be useful for issue spotting and risk mitigation in house, as well as simple contract review, etc.
Targeted legal advice is critical. A lawyer expert in a niche field can answer a narrow question quickly and reliably. A generalist, even if EA aligned, may struggle with narrow issues in tax law etc. I hate to say it, but [org name] would almost always want the better product.[4]
It strikes me as unlikely that such a service would be helpful at this point, mostly because it would be hard to source expertise in a sufficiently broad set of legal disciplines. I think this would end up being lawyers practicing as generalists, which wouldn’t be terribly useful in my view and could result in quite bad advice going around.
In contrast, some respondents thought that a ‘right-sized’ legal services organisation would fulfil a different need to specialised external counsel.
I see a pretty big unmet and growing need for right-sized consult support for incubating or solo-operating grantees and contractors. [identifying information removed] Some fraction of these folks seem like they could use legal support and in many cases we do refer them to our external counsel, but that’s probably more high-powered/expensive than they really need.
The org should choose the non-EA specialist for vital subject areas that are complex or for more ‘active’ legal issues such as when something has already gone wrong or is likely to, whereas the org should choose the EA generalist for run-of-the-mill issues or for the prevention of routine issues.
I also heard that an EA legal service could help EA orgs navigate the process of engaging and instructing external counsel:
I would like to engage the EA lawyer first as a source to determine what specialised lawyer I might need. I wouldn’t expect the EA lawyer to do everything.
Ideally you should have both, with the generalist being a sort of translator or medium between the non-EA specialist and the EA organization/client. Again ideally, the client learns from going through the process a few times how they should speak with and interpret feedback from the non-EA specialist; then, with those enhanced skills, the client should be able to engage with non-EA specialists directly more often. In any case, a good generalist lawyer should be able to tell you when they don’t know something and need to refer it out to a specialist, so if you had to choose one or the other (and especially if the client is unsophisticated in terms of consuming legal services), I would start with the EA generalist most of the time. I would go directly to a specialist when something is generic, meaning it does not require EA knowledge to do a good job on the matter. (For example: intellectual property such as patent filings; basic corporate set-up/LLC filings; tax-exempt filings, employment litigation, immigration.)
Shared view of legal risk / correlated legal risks
I received feedback that it would be useful to have a movement-wide view of legal risks and opportunities, although I expect that in many cases respondents were primed to give this feedback by my description of what an EA legal services organisation might look like.
The following feedback was provided to another lawyer considering a similar initiative:
It’d be really helpful to have an organization that can provide legal support on questions with an EA-flavor. I expect EA-orgs are going to have atypical legal needs, ranging from non-traditional governance structures, to wanting to do beneficial collaboration without falling foul of anti-trust. I think having a single org that can serve as a ‘one stop shop’ for these kinds of questions would be a net asset to the community—expertise and intuition will centralize in this org over time, making its value greater than the sum of its parts. If an org like this doesn’t exist, then status quo means a lot of legal queries will get distributed around a load of different lawyers and firms, without the opportunity for having a central view of the evolving landscape of legal issues that EA orgs face.
Respondents also identified the potential for this shared view to harm EA organisations or to cause harm to others:
I’m worried about there being correlated legal risk across EA organizations. I think that it’s not easy to provide excellent legal advice, and I’m also worried about legal services being substandard and making EA orgs worse than the counterfactual.
In trying to be a value-first rather than product-first organization, you risk providing worse advice. If you provide bad advice across many EA orgs, you could do real damage.
If everyone is using the same service and that service makes the same mistake then we all fail in the same way.
if the EA legal service gives bad advice, and it’s giving it to all EA orgs, this makes the movement less resilient.
One possibility is that an EA-branded legal service might quickly become the default for many organizations in the community, which might lead people to make correlated and self-reinforcing mistakes compared to the counterfactual setting in which everyone finds separate counsel (e.g., as a result of founders hearing from many other EA founders that entity design choice A is superior to choice B, every org in the EA community becomes confident in choosing A over B even if B is optimal).
Responsiveness
Some respondents who had engaged external counsel told us that those firms were often slow to respond to their questions.
[Personal note: it feels hypocritical to report this feedback without also noting that I have also been slow in responding to questions by EA orgs I’ve assisted over the last few months.]
Availability and cost
Many organisations we spoke to reported that cost was a barrier to getting legal advice.
Can’t justify the cost. We’ve slowed down on some issues because of it.
One view we heard is that EA orgs can be expected to make good decisions about where to spend their funding, and that funding the provision of legal advice would never be better than simply making an unrestricted grant to orgs that would have used the service. However, our impression was that willingness to pay for advice did not necessarily reflect the value that an organisation – or EA more generally – would expect to get out of legal advice. One org we spoke to that had received a grant seemed to have a psychological barrier to asking for additional funding to pay for advice. A leader of another EA org, which receives donations from the public, explained a reluctance to pay for legal advice directly, noting that they felt practical pressure from donors to keep overheads low.
We also suspect that some EA orgs overestimate the amount of time required for a lawyer to provide advice, and therefore overestimate the cost when deciding whether to engage a lawyer. We spoke to generalists who seemed to arrive at correct answers to legal questions, but more slowly than a lawyer would have.
Having non-lawyers/people without expertise trying to figure out the answers to legal questions is often a time consuming process that slows down work. Having an expert available would speed things up considerably, but cost is often a barrier
It [researching legal questions as a generalist] feels very inefficient and often frustrating but it goes well surprisingly often. Sometimes I do that and then ask a lawyer to verify my research. (I was more inclined to do that when we couldn’t afford to pay much for legal advice.)
Cost barriers were not universal – we spoke to many organisations that regularly engaged external counsel, and spoke to external counsel who had provided legal support to EA orgs. Well-funded and established orgs seemed to be more willing to pay for advice, and small or new orgs seemed to be less willing.
A general non-understanding of the law and a reluctance to engage lawyers (because they have a reputation for being expensive, and only for ‘big firms/projects’) means that orgs make easily avoidable mistakes. This is especially true, I’ve noticed, when orgs pick up their first staff, and make easily avoided employment law errors. I haven’t seen any of this hit a worst case scenario yet but it may happen one day.
[2] Having read Philip Tetlock’s Superforecasting since launching the survey, I acknowledge that “significant” and “some” were almost certainly interpreted differently by each participant. Mea maxima culpa.
[3] Based on me going through feedback and identifying areas of law (and grouping some together), so some mentions might have been missed at the time (especially with meeting notes) or been misunderstood by me.
[4] There is some risk that I’ve over-updated on the critical aspects of the feedback I received from the community… When checking with this respondent that they were happy for their comments to be included, they noted “Those quotes sound super negative, I hope you are also sourcing positive quotes as required :) I don’t think my overall response was negative, just caveated”.
Some initial work to evaluate the merits of an EA legal service
This post summarises my work last year to evaluate the merits of establishing a legal service to support the EA community. Some important notes up front:
I am not the only person who has been considering this idea. Most of the thoughts below aren’t original, and some are explicitly from other people considering this idea. That said, this post doesn’t set out a consensus view of lawyers involved in EA. Other lawyers and other orgs likely have a different (and potentially more concrete) view on the merits of the proposal. My comments below do not speak for anyone else who has been considering a similar proposal.
Most of this post (other than the background and current state section) was written prior to the collapse of FTX and subsequent fraud allegations. Ideally I would have revisited the analysis in light of those events, but I have not had the capacity to do so. I’m posting this work in its current form as it may be a useful resource for other lawyers or EA orgs thinking about unmet legal needs in the community.
The final section (Summary of Community Feedback) may not be particularly useful to most readers; you can get the gist from the earlier sections).
I would welcome feedback/suggestions (either in the comments, by DM, or by email).
My high level views/credence
At the time I wrote the initial draft of this post:
I was pretty confident there is an unmet need, but I had not done much work to estimate the lower and upper bounds for the benefit I’d expect different interventions to yield—so I wasn’t sure how cost-competitive this would be vis-a-vis the counterfactual marginal funding opportunity.
I generally heard positive feedback from the community, but it was not universal and I’d be wary of setting this infrastructure up unilaterally.
I did feel reasonably confident that a triage and referral service would probably be worth trialling, especially if referrals could be made to skilled specialist lawyers offering pro-bono services. Lawyers are expensive, and it didn’t seem unreasonable to think that funding a single lawyer (probably not too expensive) to work as a clearinghouse could have a decent multiplier in terms of the dollar value of legal services provided to EA orgs (note I had not considered the impact of those legal services, merely their market cost)
How do I feel now?
I’m really unsure how to update my views following the collapse of FTX and the other recent controversies in the community. Anyone taking this work forward should consider this carefully.
I remain reasonably confident that EA orgs (or at least small and newly established ones) allocate insufficient resource to legal risk and that there is an unmet need.
I’m more concerned about our vulnerability as a community to correlated risk. I guess I’d be a little less enthusiastic about ‘EA lawyers’ or an ‘EA legal consultancy’ (but still think the idea might be positive) but probably still just as positive about a triage and referral service to make it easier to access pro-bono offerings from law firms.
While I remain very enthusiastic about EA as an ideology or EA as a question, I’m not totally sure I feel as enthusiastic about EA as community whose ideas I generally reflexively support (influenced a little bit by this post and by the ‘EA as a Method and not a Result’ section of this post), and so I feel a little bit more uncertain about the value of community infrastructure generally.
History and current state of my work on this
In March 2022 I posted “Legal support for EA orgs—useful?”. The key claim was that large EA orgs have pretty clear legal needs (which they fill with in-house counsel and external legal resource), that smaller/newly established orgs probably have those needs but under-resource them, and that a virtual in-house counsel service would be valuable.
Several other lawyers (and former lawyers) within EA had already been looking into the merits of an EA legal service, although with different priors about the sorts of legal needs that exist (my impression—not necessarily a 100% correct restatement of their view—was something like less belief that there was a need for ‘virtual in-house counsel’ services but more belief that some of the legal services available on the market wouldn’t meet the needs of EA orgs aiming to do fast-moving ambitious work).
I was subsequently funded by a regrant from the FTX Future Fund to scope out this need. (I was also supported with desk space by UCL EA in London and as a Prague Fall Season visitor)
Together with some of the other interested lawyers, we sought feedback from EA orgs and with EA-adjacent lawyers and legal professionals.
Orgs:
Pretty wide range of functions (well-established ‘meta’ and ‘doing’ orgs, incubators, pretty wide range of cause areas, some newly established orgs, some prospective founders and grant recipients).
Did not speak to many commercially-focussed EA-adjacent organisations, so probably have a weaker sense of those legal needs.
Focussed mostly on orgs in the UK and USA. I think I was mistaken not to seek information about unmet legal needs in a wider set of jurisdictions even though most EA activity currently happens in these two jurisdictions.
EA-adjacent lawyers:
Largely on a referral basis, so out of the full set of EA-adjacent lawyers and legal professionals (incl immigration advisors, tax advisors etc), we mostly spoke with a highly-engaged, well-connected subset.
Of those lawyers, a mix of lawyers already doing operational legal work for EA orgs, academic research or policy work, and lawyers looking to transition into direct EA work.
Also spoke with several highly-motivated law students
Spoke with a pretty wide set of attendees at EAGx Australia 2022, EAG San Francisco 2022 and EAG Washington DC 2022.
I ran an open survey promoted on the EA Forum and in some EA Slack channels. The survey was largely qualitative, suffers from voluntary response bias, and had a small number of responses (n=29).
I met with lawyers with pro-bono expertise outside of effective altruism, including partners responsible for pro bono at several large law firms in the US, UK, and New Zealand. Also met with staff working at several pro-bono clearinghouses. Some of these lawyers expressed interest in working with small/newly established EA clients pro-bono with an EA lawyer or org coordinating these requests, although nothing has been formalised so far.
Another lawyer was giving serious consideration to starting up a California-based EA legal practice. At this point my focus turned to establishing a service that would employ one or more English lawyers to provide advice to orgs with UK legal needs.
My work on this project largely stopped in December 2022. This was primarily due to burnout and some minor personal matters, although I also wanted to reflect on whether this work should continue and whether it should be taken forward by another org or a lawyer with a different set of skills. (I am continuing to provide legal support to a few EA orgs directly.)
For the reasons set out in this post I remain of the view that there are unmet legal needs in the EA ecosystem, and that it would be useful for them to be met. I do not have a view as to whether this is cost-competitive vis-a-vis the counterfactual marginal funding opportunity.
As noted above, almost all of this post was written shortly prior to the FTX crisis (and other recent controversial events). Before any concrete action is taken to start up an EA legal services org, it would be prudent to revisit this assessment in light of discussions within the community since then.
If another EA lawyer or EA org suspects they are well-placed to take this work forward, I would encourage them to reach out [1]- I would be happy to share the work done to date/make introductions to folks I’ve spoken with so far, etc. If that does not happen, then I would potentially consider pitching some version of this service (probably just the triage and referral service) to funders.
There is unmet legal need in Effective Altruism
It sometimes costs too much to engage a lawyer:
Established, well-funded EA organisations use lawyers. They hire external and in-house counsel. At scale, EA orgs do not rely solely on generalist/operations staff to consider legal risk.
New, small, or poorly-funded EA organisations use of lawyers is mixed. Some significant portion of these orgs are reluctant to engage lawyers. Some reasons for this appear to include:
Financial costs
Genuinely constrained funding (this may be a signal that the work the org is doing is not valuable, but may also reflect capacity constraints within large funders).
Pressure from donors (whether actual or perceived) to keep overheads low.
Not planning for legal expenses when budgeting and seeking funding.
Psychological barriers to returning to funders with a request for more funding.
Overestimating the complexity of a matter / the cost of getting legal advice.
Non-financial costs
Anticipated difficulty in finding a suitable lawyer / ‘ugh field’ around the task.
Time cost to explain the legal issue and, in some cases, the unique ways this org/EA orgs generally trade off risk and reward.
Conflict with desire to appear talented/knowledgeable.
Movement-wide legal needs might not be met by individual orgs
We received feedback that it would be useful to have a movement-wide view of legal risks and opportunities. For example:
Some EA orgs want advice that quantifies legal risk, and possibly frames it in an EA/rationalist way:
More ‘precise’ risk estimation. (This is a claim about precision, not accuracy—a lawyer may have an accurate mental model of the distribution of outcomes, but default to vague, standard legal terms like “real risk” or “more than minor”.)
Differentiation between legal risk and other kinds of risk (e.g. reputational risk).
Risk neutrality.
Accounting for legal risks/harms that accrue to other parts of the EA movement.
Bayesian reasoning.
Stating epistemic statuses.
Explicitly considering whether they have been sufficiently curious.
Use of the ‘outside view’.
Lawyers and clients have diverging incentives
We had some feedback that external counsel:
were professionally incentivised (by regulators, internal risk departments, or insurers) to give advice that was imprecise but accurate;
would steer clear of work (especially pro-bono work) that created a risk of bad optics with major corporate clients;
responded slowly to smaller/pro-bono clients, or assigned their work to junior staff;
felt unapproachable.
Evaluation of potential interventions
Funding orgs to get legal advice
Paying for or subsidising legal advice would probably increase the use of lawyers by EA orgs.
Where orgs receiving legal funding would not have counterfactually engaged a lawyer, this funding is distortionary. Ideally, the funding would only be provided to orgs that are expected to under-purchase legal services. This might look like identifying orgs that are comparatively risk-loving, have inexperienced leadership or operations staff, or have donors that are very sensitive to overhead. But this rewards poor decision-making, would likely be awkward and unreliable, and would consume a lot of grant-making resources.
In some proportion of cases, the recipient organisation would have engaged a lawyer under the counterfactual. Here, the funding would either have the impact of an unrestricted marginal donation to that org or, where an org accurately reports funding gaps, would potentially 'funge' with other donations that would counterfactually have gone to that org but are no longer required.
Funding alone would not directly address the non-financial costs of engaging lawyers, divergent incentives, or the differences in reasoning style between EAs and most lawyers.
Starting one or more law firms
Insofar as there are lawyers who are familiar with EA, value-aligned, and willing to adopt rationalist/EA frameworks in their assessments of legal risk, starting a firm could plausibly improve the quality of legal services available to EA orgs.
However, the law diverges significantly across practice areas and jurisdictions. Lawyers in private practice tend to specialise in a few (often related) practice areas, and cross-jurisdictional practice is fairly rare. I expect that a specialist in a practice area (compared to an equivalently smart and motivated generalist) has:
a stronger recall of specific legal rules in that area;
a better ability to predict the behaviour of courts, regulators, and counterparties in that area;
best-practice approaches to solve lots of standard problems;
higher productivity, by not needing to research as much and—I presume—by being able to make confident decisions faster.
There is not a small set of practice areas that neatly maps onto EA legal needs such that a lawyer could naturally be an EA legal specialist. If an EA firm wanted to provide the EA movement with the benefits of specialisation, it would need to be reasonably large.
A specialised boutique firm
One response to this tradeoff might be for a lawyer and funder to focus on a single jurisdiction and on practice areas that more clearly map onto the principal legal needs of a smaller, more homogenous set of high-impact clients. That could potentially deliver a significant benefit to those clients, and would be worthwhile insofar as there is significant legal need in a more narrow subset of EA.
[Pre FTX collapse] Another lawyer was giving serious consideration to starting up a California-based EA legal practice. He described that as follows:
An EA-focussed general practice firm
The foregone benefits of specialisation might arguably be offset in most lawyer-client engagements by the benefits of EA familiarity, value alignment, and willingness to use EA/rationalist frameworks. If this were almost always true, then EA orgs should primarily select lawyers for those EA features rather than for expertise in a given area of law, and an EA law firm would do the most good by applying those EA features (which other firms might not optimise for) to almost any area of law that EA orgs need help with.
However, in most cases the foregone benefits of specialisation won’t be offset by EA familiarity, value alignment, and willingness to use EA/rationalist frameworks. This was supported by feedback from in-house lawyers and leaders in EA orgs who were clear that, all things being equal, their orgs want domain expertise over these other EA features.
A hybrid model
The tradeoff between specialisation and EA familiarity, value alignment, and willingness to use EA/rationalist frameworks will not be uniform across all types of legal questions and practice areas. An EA firm could add value by providing EA-flavoured advice on the kinds of legal questions that do not need the expertise of a specialist lawyer, and by helping EA organisations to instruct external counsel where specialist advice is better.
Larger organisations tend to adopt a variation of this model by employing in-house counsel. This model can be thought of as offering a ‘virtual’ in-house counsel service to EA organisations.
Starting a triage and referral service
The effort involved in working out whether to get legal advice, and where to get it from, probably discourages EA organisations from getting legal advice. Similarly, EA organisations that need legal advice sometimes spend a lot of time finding and briefing a lawyer, and explaining to the lawyer how what the organisation wants from the advice differs from what other nonprofit or start-up clients typically want.
Many lawyers (including some within or adjacent to the EA community) are willing to provide pro-bono advice. There is currently no way to easily identify and connect with those lawyers.
A minimal version of this could involve:
compiling a database of lawyers able to provide pro-bono support to EA organisations, and periodically checking in with those lawyers to confirm they are still able to assist;
compiling recommendations of lawyers who have specialised expertise and are recommended by other EAs, and seeking feedback from EA orgs to keep this updated over time.
A higher-effort option could involve:
intake conversations with an organisation to help them identify the legal issues, and to weigh in on the need for/value of external legal advice;
connecting an organisation with one or more recommended lawyers, including helping the organisation to provide clear instructions to the lawyer;
developing relationships with regularly-used external lawyers—potentially negotiating a discounted rate, helping familiarise these lawyers with the style of reasoning that EA organisations expect, and providing aggregated feedback from clients;
where appropriate, reviewing advice to offer a second opinion, provide an EA-framed view on the advice, and form a community-wide view of legal risk.
Producing/commissioning community resources
Several EA organisations have expressed interest in resources that provide general legal information from an EA perspective on topics that founders and newly-established or small EA orgs would typically be interested in. I had feedback that this would be useful, but not especially useful as a standalone intervention.
The AntiEntropy Resource Portal may be a good place to host these resources. The portal is open-access, so care would need to be taken with content that could increase legal risk to EA orgs if published openly.
External lawyers could potentially produce some of these resources as part of their pro-bono work, but would need an EA organisation to act as a client for this purpose.
Intervention risks
Several respondents were concerned that an EA legal service could harm EA organisations by:
providing bad advice, especially in a practice area where a lawyer doesn’t have significant expertise;
causing legal risk to be correlated across EA organisations;
crowding out other sources of legal advice for the EA community, including other potential EA-focussed legal services or pro-bono relationships with law firms.
Bad advice
This outcome happens where the advice that an org receives leads them to make a worse decision than they counterfactually would have made, either because they would have:
made a better decision without the advice; or
sought better advice elsewhere.
The first case can probably be reduced to an acceptable level by checking that whoever is giving advice via the EA legal service is a competent lawyer and follows reasonable epistemic practices, including noting uncertainties in advice or gaps in experience. A bad lawyer might routinely give worse advice than a smart layperson with good epistemics, but there isn't any reason to expect that a smart and epistemically honest lawyer would underperform that standard.
Reducing the risk of an org foregoing better advice elsewhere is trickier because it is not enough to check that the lawyer giving advice is competent and has good epistemics. Evaluating the risk in each case would involve estimating the likelihood the org would seek advice elsewhere and the likelihood that this advice would be better than the advice provided by an EA legal service. Instead, an acceptable way of managing the risk could involve lawyers explicitly considering how well-placed they are to provide advice, how critical the issue is for the org or the community, and how realistic it would be for the org to seek specialised advice.
Where an org does not have the budget or inclination to seek specialised advice, a competent generalist lawyer can still add value by providing advice with a clearly-identified credence and encouraging the organisation to consider whether seeking a firmer opinion from a specialist would be appropriate.
Correlated risk
This risk was best described by a respondent to my survey:
There seem to be three kinds of correlated risk:
bad advice being widely adopted by EA orgs;
advice specific to one org's circumstances being adopted more widely, particularly in circumstances that make the advice inappropriate; and
approaches that are good in expectation for any individual org being bad when adopted across many EA orgs, simply by virtue of increasing the movement's susceptibility to those approaches failing or ceasing to be available (a stylised illustration follows this list).
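To make the third kind of correlated risk concrete, here is a stylised illustration; the 10% failure probability and the twenty orgs are invented numbers chosen for the arithmetic, not estimates of anything real. Suppose a particular legal structure has a 10% chance of eventually failing (for example, being successfully challenged by a regulator), and twenty EA orgs each need some structure:

\[
P(\text{all twenty structures fail} \mid \text{orgs choose different structures independently}) \approx 0.1^{20} = 10^{-20}
\]
\[
P(\text{all twenty structures fail} \mid \text{all orgs adopt the same structure on the same advice}) = 0.1
\]

The expected number of failures is the same in both cases (two orgs), but in the correlated case there is a one-in-ten chance that the whole movement is hit at once.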
All three cases likely exist to some extent already, both because organisations lean towards 'default' options and because organisations share informal advice and know-how with each other.
The first case is simply a multiplier of the ‘bad advice’ risk discussed above, which adds weight to that concern.
An EA legal services org could plausibly reduce the risk of the second and third cases by:
collecting and publishing best practices, and where appropriate making them available for community comment with explanations provided;
reducing the barriers to checking whether common-knowledge advice that has worked well in other orgs is suitable for a new org’s circumstances;
maintaining a community-wide view of legal risks.
Crowding out the EA legal market
The benefit of EA organisations seeking legal advice is not wholly contingent on an EA legal service, as some organisations would engage lawyers (some for a fee, some on a pro bono basis) under the counterfactual. This may reduce the benefit of an EA legal service, but does not necessarily make it harmful.
However, if other lawyers see that there is an EA legal service and think that there is little benefit (either commercially or altruistically) in targeting the EA market, then there will be fewer lawyers for EA organisations to select from than under the counterfactual. This might lead to less specialised advice being available to orgs, to bottlenecks arising from limited access to lawyers, or to higher prices for EA orgs.
For this reason, an EA legal services org should be focussed on building a network of lawyers who are available to provide services to EA orgs, rather than on trying to directly meet every conceivable legal need. An aspirational goal for an EA legal services organisation would be to influence legal advice more generally, developing and modelling best practices that other (and new) lawyers can adopt, and thereby giving EA organisations more lawyers to select from.
Summary of community feedback
Note: all feedback was provided prior to the collapse of FTX, and does not factor in lessons learned by the community since then. Some limited post-FTX comments are included in brackets.
Degree of legal need
Feedback on this point was not uniform. While all respondents I spoke to believed that effective altruism had some legal needs, respondents were divided on how significant and neglected those legal needs were. Some respondents reported difficulty meeting their legal needs, while others felt that their needs were sufficiently served by pro-bono and paid-for counsel.
I asked some survey questions to very roughly consider the degree of unmet legal need:[2]
I also asked orgs to report their approximate legal spend:
· USD 5,000
· USD 5,000
· USD 20,000
· USD 30,000 to 50,000
· GBP 40,000
· USD 50,000
· USD 3,000,000
Areas of law mentioned by respondents
Below, I have listed areas of law in approximate order[3] of how often they were mentioned by the EA orgs I spoke with and by respondents to my survey. I think this data provides some limited indication of the relative frequency of legal need in different areas. However, it is of limited use because the frequency with which different needs were mentioned does not clearly say anything about the degree of legal risk to EA orgs, the ability of lawyers to mitigate that risk, or the need for an EA-specific org to assist.
· Tax
· Employment
· Immigration
· Charity law (including charity registration) - n.b. could be considered to overlap with ‘tax’ above
· Choosing a structure for a new org or project
· Governance / internal policies
· Data protection / privacy
· Antitrust / competition law
· Fiscal sponsorship—n.b., again could be seen as part of ‘charity’ or ‘tax’ above
· Real estate (purchasing and leasing)
· General compliance issues
· Regulatory advice
· Corporate law
· Integrity and conduct / probity
· Intellectual Property
· Commercial contracts / procurement
· Securities law and crypto
· Defamation/disparagement
· Wills and estates for donors
· Impact litigation
· Help with legislative drafting
· User terms / consumer protection
· Modern slavery
· Child protection
· Financial services / investment advisory
· Stockholder activism
Jurisdictions
I expected that almost all of the legal need would be for United States and United Kingdom advice. While most of the current legal need is in these jurisdictions, there was a greater interest in legal support elsewhere than I expected.
Respondents reported having questions about the law in:
· India
· China
· Czech Republic
· Singapore
· Germany
· Nigeria
· Australia
· New Zealand
· Switzerland
· Mexico
· Cayman Islands
· Indonesia
· Vietnam
EA-specific legal issues
Respondents also identified some legal issues that are common to many EA organisations, some of which are fairly specific to EA organisations. In these domains, it may be useful to have specialised legal advice, or an organisation to review and publish best practices.
Boundaries of charitable / tax exempt status
Charity laws are not cause-neutral, and it can be ambiguous whether some EA interventions fall within existing legal concepts of charity.
This can be especially relevant where an org wants to fund a for-profit business, engage in political activity, or provide private benefit (beyond a reasonable salary) to people involved in a project.
This concern included both regulatory and reputational risk – as one respondent put it:
[While not directly describing the last few months in the EA movement, this comment feels reasonably prescient.]
Non-standard governance arrangements in EA organisations
While non-standard structures that do not require leaders of a commercial organisation to maximise profits are increasingly available:
some respondents anticipate a need for governance structures that require leaders to act responsibly when making decisions with the potential for significant impact outside of the company (e.g. in companies developing AI systems); and
it may not always be straightforward to structure an organisation in a way that allows it to act cause neutrally (i.e. being able to stop doing work that is ineffective, and pivot to a conceptually different use of the organisation’s resources).
Antitrust / competition law in safety-critical areas
I won't describe these in depth, but see:
In the US context, Cullen O'Keefe's Antitrust-Compliant AI Industry Self-Regulation. In summary, anticompetitive mutual restraints on trade are prima facie unlawful, which makes coordination not to develop unsafe AI risky. Similar legal issues would apply in a biorisk context.
For an EU perspective, Shin-Shin Hua and Haydn Belfield on AI & Antitrust: Reconciling Tensions Between Competition Law and Cooperative AI Development
Ashurst LLP's Competition Law Newsletter on The car emission cleaning cartel, or how legitimate technical cooperation can go wrong? The newsletter discusses a relatively recent settlement between the European Commission and automakers who were alleged to have limited technological development under the guise of cooperation on sustainability.
Tax treatment of grants
Where grants are being made to individuals or to orgs that do not have tax-exempt status, the tax status of the grant in the hands of the recipient can be unclear (and can hinge on the nature of the specific grant).
Regranting and fiscal sponsorship
It is important for orgs regranting funds or engaging in a fiscal sponsorship arrangement to know what is clearly permitted and also to be able to make risk-proportionate decisions about regrants and activities where the position the regulator (or a court) would take is not clear.
Engaging contractors abroad
Multiple respondents were concerned about how often EA orgs default to engaging offshore staff as contractors. This can reduce the effectiveness of the staff member (who might bear adverse financial or administrative consequences as a result) or fail to comply with local law, creating reputational risk.
Preventing, and managing fallout from, bad conduct within or adjacent to EA
Several respondents considered that managing risks in this area well was important to the continued growth and impact of the community.
One respondent gave the example of Ben Delo, an EA-adjacent public figure who faced criminal liability for failing to comply with anti-money laundering law, suggesting that EA could be vulnerable to reputational risk if someone more closely linked to EA were to commit a more serious offence.
For further colour about other kinds of risks in this area, it may also be useful to read The community health team’s work on interpersonal harm in the community and Democratising Risk—or how EA deals with critics.
[The original draft of this post was prepared before the FTX fraud/insolvency became known. I've since talked to folks who wondered whether additional legal support might have helped orgs think better about risks arising from FTX funding/association. Maybe so, but it feels pretty hubristic to claim this in hindsight; I certainly didn't consider the possibility that SBF might not be a trustworthy actor, although perhaps a typical lawyer would have been more objective.]
Managing conflicts of interest (actual or apparent) within EA
EA is a small community, and some respondents felt that EA orgs needed help to build their capacity to manage actual or apparent conflicts of interest.
[Plenty of discussion of this on the forum post-FTX crisis—although not detailed consideration of whether legal resource would specifically be helpful.]
Drafting note: the following five issues were identified by another lawyer interested in this project, not by me.
Impact Certificates
Addressing securities law and tax law issues to enable a strong market for impact certificates.
Securities on Large Prizes
Addressing securities law and tax exemption issues to enable securities trading on shares of large prizes.
Prediction Markets
Addressing gambling and CFTC issues to enable prediction markets at scale.
Binding AI Developers
Finding ways to bind AI developers to take or not take certain actions in response to other developers’ actions—how to structure these to be legally binding and how to avoid antitrust issues.
Compensation-related Governance
Some AI orgs in particular have difficulty determining how to set attractive compensation for talented technical people without running into problems related to their nonprofit status. A solution may involve appropriate structuring.
Desired and undesired attributes of EA lawyers/legal services
EA modes of thinking
Respondents commonly mentioned that lawyers (and other professional service providers, like accountants) are reluctant to give precise estimates of risk. Some respondents also suggested that lawyers might lack the skills needed to give precise estimates.
I also heard that it would be valuable for lawyers to:
apply Bayesian reasoning
clearly identify epistemic statuses or credences
examine whether they are being sufficiently curious
explicitly consider the outside view
An operations person at an established EA org identified the following features that the org looked for when engaging lawyers:
Principal-agent problems and value alignment
A lawyer with a background advising EA organisations as external counsel described an internal culture within law firms that promotes giving risk-averse advice. I heard that, while there is a competitive incentive to deliver good advice to avoid losing business, private-practice lawyers may still have an incentive to give as little firm advice as possible while still being able to charge for the work.
I heard from some respondents that they expected an EA lawyer to have higher intrinsic motivation to give correct, useful advice, although this was contingent on finding genuinely value aligned lawyers.
On the other hand, I also heard a concern that an EA lawyer might lack the professional distance critical to giving good legal advice:
I heard from some respondents that they would potentially be more forthcoming and honest with a value-aligned lawyer. One respondent described needing to decide whether to take an opportunity for impact that had some low-but-not-precisely-known probability of being unlawful and some unknown distribution of potential consequences, and noted:
[Note that post-FTX, there is probably a legitimate debate around the risk-neutrality mindset, and I'm not confident I endorse taking a risk-neutral/expected value mindset to all legal issues.]
Some respondents felt that value alignment was largely unnecessary:
Degree of specialisation, and familiarity with EA/longtermism
Some respondents thought it would be useful for their lawyers to have EA or longtermist context. A particular benefit was the convenience of having an easily identifiable person to ask legal questions, and to feel comfortable doing so. Other respondents suggested this was not necessary.
I also heard that an EA legal services organisation would be of limited use, or might be harmful, because it would lack deep expertise in all of the practice areas in which EA orgs would need support.
In contrast, some respondents thought that a ‘right-sized’ legal services organisation would fulfil a different need to specialised external counsel.
I also heard that an EA legal service could help EA orgs navigate the process of engaging and instructing external counsel:
Shared view of legal risk / correlated legal risks
I received feedback that it would be useful to have a movement-wide view of legal risks and opportunities, although I expect that in many cases respondents were primed to give this feedback by my description of what an EA legal services organisation might look like.
The following feedback was provided to another lawyer considering a similar initiative:
Respondents also identified the potential for this shared view to harm EA organisations or to cause harm to others:
Responsiveness
Some respondents who had engaged external counsel told us that those firms were often slow to respond to their questions.
[Personal note: it feels hypocritical to report this feedback without also noting that I have also been slow in responding to questions by EA orgs I’ve assisted over the last few months.]
Availability and cost
Many organisations we spoke to reported that cost was a barrier to getting legal advice.
One view we heard is that EA orgs can be expected to make good decisions when deciding where to spend their funding, and that funding the provision of legal advice would never be better than simply making an unrestricted grant to orgs that would have used the service. However, our impression was that willingness to pay for advice did not necessarily reflect the value that an organisation – or EA more generally – would expect to get out of legal advice. An org we spoke to which had received a grant seemed to have a psychological barrier to asking for additional funding to pay for advice. A leader of another EA org which receives donations from the public explained a reluctance to pay for legal advice directly, noting that they felt practical pressure from donors to keep overheads low.
We also suspect that some EA orgs overestimate the amount of time required for a lawyer to provide advice, and therefore overestimate the cost when deciding whether to engage a lawyer. We spoke to generalists who seemed to arrive at correct answers to legal questions, but more slowly than a lawyer would have.
Cost barriers were not universal – we spoke to many organisations that regularly engaged external counsel, and spoke to external counsel who had provided legal support to EA orgs. Well-funded and established orgs seemed to be more willing to pay for advice, and small or new orgs seemed to be less willing.
By DM on the forum would be best—otherwise by email to tyrone [at] spiltmilk [dot] nz
Having read Philip Tetlock’s Superforecasting since launching the survey, I acknowledge that “significant” and “some” were almost certainly interpreted differently by each participant. Mea maxima culpa.
This is based on my going through the feedback and identifying areas of law (and grouping some together), so some mentions might have been missed at the time (especially with meeting notes) or misunderstood by me.
There is some risk that I’ve over-updated on the critical aspects of the feedback I received from the community… When checking with this respondent that they were happy for their comments to be included, they noted “Those quotes sound super negative, I hope you are also sourcing positive quotes as required :) I don’t think my overall response was negative, just caveated”.