Two potential cases of Effective Ventures breaking the law

Disclaimer: we have no legal background, so while we suspect both of these activities are illegal, we would welcome clarification from an EVF employee explaining why one or both are not.

1. Ignoring conflict of interest/giving a trustee direct and indirect benefits

UK charity law states that ‘trustees cannot receive a benefit from their charity, whether directly or indirectly, unless they have an adequate legal authority to do so’*. At virtually every EAG and EAGx, Effective Ventures distributes large numbers of copies of What We Owe the Future and/or Doing Good Better, presumably bought with Effective Ventures’ money. Both books are authored by William MacAskill, a trustee of Effective Ventures, who would have benefited directly from royalties received on each sale**, and whose academic career almost certainly benefited indirectly as a result (less concretely, the tendency of EAG events to regularly invite him as the keynote speaker has probably conferred a substantial indirect benefit).

* We are unclear what ‘legal authority’ implies (it is not defined elsewhere on that page), but from context it seems likely to refer to a technical benefit, where, for example, the trustee needed to take custody of money in order to pass it on to someone else. We doubt it applies here.

** As we understand it, MacAskill donates all the proceeds from sales of his books to effective charities. Our claim is not that MacAskill is an immoral human, but that he and EVF have repeatedly acted illegally in this respect.

2. Retaining data in breach of GDPR requirements

In her recent Vox article, Carla Cremer claims that

in 2019, I was leaked a document circulating at the Centre for Effective Altruism, the central coordinating body of the EA movement. Some people in leadership positions were testing a new measure of value to apply to people: a metric called PELTIV, which stood for “Potential Expected Long-Term Instrumental Value.” It was to be used by CEA staff to score attendees of EA conferences, to generate a “database for tracking leads” and identify individuals who were likely to develop high “dedication” to EA — a list that was to be shared across CEA and the career consultancy 80,000 Hours. There were two separate tables, one to assess people who might donate money and one for people who might directly work for EA.

Individuals were to be assessed along dimensions such as “integrity” or “strategic judgment” and “acting on own direction,” but also on “being value-aligned,” “IQ,” and “conscientiousness.” Real names, people I knew, were listed as test cases, and attached to them was a dollar sign (with an exchange rate of 13 PELTIV points = 1,000 “pledge equivalents” = 3 million “aligned dollars”)

… When I confronted the instigator of PELTIV, I was told the measure was ultimately discarded. Upon my request for transparency and a public apology, he agreed the EA community should be informed about the experiment. They never were.

From what we understand of GDPR, if Cremer’s account is true, this practice would probably have violated all of its ‘data subject rights’, including ‘the right to be informed’ and the right of subjects to access their data. It is unclear to us whether consent would have been required in this situation, but if so, it was evidently not obtained. The scheme would probably also have violated the requirement that data be collected ‘for a specific purpose’. It might additionally have qualified as high-risk processing on the grounds that it involved ‘systematic and extensive profiling’. We do not know the implications of this, but presumably it would have required a higher level of justification and/or implied a more serious offence where the other requirements were breached.

Given that the measure was discarded, and that we do not know how long it was actually in use, perhaps the law was never broken in practice; nonetheless, if Cremer’s account is accurate, there seems to have been clear intent to break it.