Haydn has been a Research Associate and Academic Project Manager at the University of Cambridge’s Centre for the Study of Existential Risk since Jan 2017.
How many people work full-time and part-time on GPP? What are sustainable growth predictions?
Do you model yourself as a think-tank?
What think-tanks have you looked at, spoken to, or modelled yourself upon?
Have you reached out to e.g. RUSI, BASIC, etc? Do you plan to?
What are your plans for the next a) 6 months b) year c) 5 years?
In what ways are you experimenting and iterating?
How many people have read your most popular content?
What are your next few marginal hires?
If a reader wants to work for GPP, what should they do/study/write/etc?
If a reader wants to help GPP, what should they do?
What would you do with a) £2,000 b) £10,000 c) £20,000?
What do you think your room-for-more-funding is?
You’re based in the UK—there’s about to be an election, then five years of a new government. How does that affect your plans?
When do you aim to influence debate, and policy—i.e. over what timescale? Are you trying to influence policy in 10 years, 20?
Who are the key decision-makers/stakeholders in your area? Have you mapped them out—how they relate, what their responsibilities are?
What Government Departments are you mainly interested in? Which are you monitoring? Are there any consultations open at the moment that you are submitting to? Same question for Parliamentary Committees.
A link to the (very good!) 2015 Strategy might be helpful: http://globalprioritiesproject.org/wp-content/uploads/2015/03/GPP-Strategy-Overview-February-2015.pdf
President Trump as a Global Catastrophic Risk
Thanks for commenting! I’ll try to answer your points in turn.
Nuclear weapons: I was using the Cuban Missile Crisis as an example of a nuclear stand-off. I'm not saying a very similar crisis will occur, but that other stand-offs are possible in the future. Other examples of stand-offs include Nixon and the Yom Kippur War, or Reagan and Able Archer. There have been many 'close calls' and stand-offs over the years, and there could be another in the future, e.g. over the Baltics. Trump's character seems particularly ill-suited to nuclear stand-offs, and so increases risk.
Pandemics: Many countries have had biological weapons programs: for example the US, UK, USSR, Japan, Germany, Iraq and South Africa. I agree that they're difficult to control and would likely hurt the country that used them as well as the target, but that hasn't stopped those countries. The development and use of biological weapons has been constrained by the Biological Weapons Convention and surrounding norms. I think Trump threatens those norms, and so increases risk.
Liberal global order: Very interesting fact about trade and war there, although she is looking at the period 1870-1938 and I'm talking about post-1945. And yes, I agree with you about democratic peace theory. My point is more general: that the liberal global order has kept us safe. To take one example, we haven't had a serious great power war. Trump threatens that order, and so increases risk.
Were I working for an EA org, this would be the decisive factor that would swing me, so it would be really good if we could work this out. Giving to another org adds Gift Aid to your donation: +20%. Forgoing salary saves you and your employer National Insurance: +29%.
So if you're a basic-rate taxpayer, is giving to your employer better value?
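To make the comparison concrete, here is a minimal sketch using the figures stated above as assumptions (a +20% Gift Aid uplift on a donation, and a +29% combined employee-plus-employer National Insurance saving on forgone salary); the exact percentages depend on individual tax circumstances, so this is illustrative rather than tax advice:

```python
# Assumed uplifts, taken from the figures in the comment above.
GIFT_AID_UPLIFT = 0.20  # extra the charity receives via Gift Aid
NI_SAVING = 0.29        # combined NI saved when salary is forgone


def value_via_gift_aid(donation: float) -> float:
    """Amount received if you donate to another org and it claims Gift Aid."""
    return donation * (1 + GIFT_AID_UPLIFT)


def value_via_forgone_salary(salary: float) -> float:
    """Amount retained by your employer if you forgo that much gross salary."""
    return salary * (1 + NI_SAVING)


if __name__ == "__main__":
    # Per £100 of gross cost to you, under these assumptions,
    # forgoing salary delivers more than donating with Gift Aid.
    print(round(value_via_gift_aid(100), 2))
    print(round(value_via_forgone_salary(100), 2))
```

Under these assumed figures, forgoing salary (£129 per £100) beats donating with Gift Aid (£120 per £100), which is why the question above matters.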
Very interesting idea, and potentially really useful for the community (and me personally!).
What’s the timeline for this?
I'm presuming that the Funds would be transparent about how much money is in them, how much has been given and why—is that the case? Also, as a starter, has Nick written about how much is/was in his Fund and how it's been spent?
New Vacancy: Policy & AI at Cambridge University
Thanks for this! It's mentioned in the post, and James and Fluttershy have made the point, but I just wanted to emphasise the benefits to others of Open Philanthropy continuing to engage in public discourse. Especially as this article seems to focus mostly on the costs/benefits to Open Philanthropy itself (rather than to others) of Open Philanthropy engaging in public discourse.
The analogy of academia was used. One of the reasons academics publish is to get feedback, improve their reputation and clarify their thinking. But another, perhaps more important, reason academics publish academic papers and popular articles is to spread knowledge.
As an organisation/individual becomes more expert and established, I agree that the benefits to itself decrease and the costs increase. But the benefit to others of their work increases. It might be argued that when one is starting out the benefits of public discourse go mostly to oneself, and when one is established the benefits go mostly to others.
So in Open Philanthropy’s case it seems clear that the benefits to itself (feedback, reputation, clarifying ideas) have decreased and the costs (time and risk) have increased. But the benefits to others of sharing knowledge have increased, as it has become more expert and better at communicating.
For example, speaking personally, I have found Open Philanthropy’s shallow investigations on Global Catastrophic Risks a very valuable resource in getting people up to speed – posts like Potential Risks from Advanced Artificial Intelligence: The Philanthropic Opportunity have also been very informative and useful. I’m sure people working on global poverty would agree.
Again, just wanted to emphasise that others get a lot of benefit from Open Philanthropy continuing to engage in public discourse (in the quantity and quality at which it does so now).
Excellent!
Peter’s question was one I asked in the previous post as well. I’m pleased with this answer, thanks Tara.
This is a great idea and you’ve presented it fairly, clearly and persuasively. I’ve donated.
Whatever happened to EA Ventures?
Thanks for this!
I personally agree that Democratic control of Congress, or even Congress and the Presidency, would be great. But I'm not sure how likely that is, or how certain I should be about that likelihood.
Even if there were high certainty and high likelihood, I probably still wouldn't take that option: the increased risk for four years is just too high. As Michael_S says, you get higher nuclear risk and higher pandemic risk. As I said in my post, I think Trump also raises the risks of increased global instability, increased international authoritarianism, climate change, and emerging technologies. Take climate change: we really don't have long to fix it! We need to make significant progress by 2030; we can't afford to go backwards for four years.
[Writing in a personal capacity, my views are not my employer’s]
This is incredibly valuable (and even groundbreaking) work. Well done for doing it, and for writing it up so clearly and informatively!
Really glad to see you taking conflicts of interest so seriously!
The recent quality of posts has been absolutely stellar*. Keep it up everyone!
*interesting, varied, informative, written to be helpful/useful, rigorous, etc
“Cause areas shouldn’t be tribes”
“We shouldn’t entrench existing cause areas”
“Some methods of increasing representativeness have the effect of entrenching current cause areas and making intellectual shifts harder.”
Does this mean you wouldn't be keen on e.g. "cause-specific community liaisons" who mainly talk to people with specific cause-prioritisations, maybe have some money to back projects in 'their' cause, etc? (I'm thinking of something analogous to an Open Philanthropy Project Program Officer.)
CSER Special Issue: ‘Futures of Research in Catastrophic and Existential Risk’
If this research seems interesting to you, CSER is currently hiring! https://www.cser.ac.uk/news/hiring-APM/
Hi everyone, I’m Haydn. I used to work at the Centre for Effective Altruism, now I work for a Member of the UK Parliament. Message me if you’re interested in politics and EA.