Field-specific LE (Longtermist Entrepreneurship)
Background
I’ve been interested in EA entrepreneurship for a while, as I find myself being a bit bonkers and seriously considering a fairly risk-neutral approach to my life. I do, however, feel that if I’m going to gamble a stable life on a much more volatile path, I want to do it where I believe my expected value (EV) is highest (as I’m an expected-utility machine). I consider myself a longtermist, and I would want to work mainly within AI safety or other emerging-technology fields. The primary actor in this area, Charity Entrepreneurship (CE), is unfortunately, for me, focused on more near-term risks. That is understandable, as near-term projects are far more tractable to create than longer-term ones. Yet the EV is a lot higher for longtermist projects, so I want to see a longtermist entrepreneurship incubator.
Something similar has existed before: a longtermist entrepreneurship incubator that ran in 2020, and if one looks at the post, it was a rigorous idea. The founders of LE quit the project because of more impactful opportunities, the difficulty of making it tractable, and the depth of knowledge required to run an incubator. An underlying current through it all, though, is that they tried to create an incubator for longtermism as a whole, a vast umbrella covering many different kinds of projects. Longtermist incubation at that breadth is very difficult because it requires so many resources to run successfully. Yet if one were to apply the structure and ideas of Charity Entrepreneurship to specific subfields, for example “AI Safety Entrepreneurship”, the project could become a lot more tractable, as the creators of the previous LE incubator also thought. Below are some arguments for why I think so, together with some pathways to creation.
Specificity
Currently, CE is an organisation with a lot of weight on its shoulders: a large portion of potential EA entrepreneurs have thought about applying there. Yet there is considerable variation across EA cause areas, and longtermist areas in particular diverge widely in the aptitudes they require. Creating specific sub-areas such as “AI Safety Entrepreneurship” or “Biorisk Entrepreneurship” would enable faster co-founder compatibility checking, more relevant idea planning, and better-targeted aptitude screening. It would, in short, increase the probability of projects succeeding.
Enabling more projects
An LE incubator would most likely enable the founding of more longtermist projects. Take AI safety as an example: there is a massive need for this, since only a few actors currently have the capacity to take on AI safety researchers, and there are many more potential researchers than there are places for them. An incubator for AI safety could create perhaps 5–100 more jobs a year through new projects (don’t quote me on the numbers). Incubation would almost certainly increase the productivity of the AI safety field. There is, of course, the problem of making sure each project has the experience it needs and that things work out, but if one could replicate CE’s model, this would increase the field’s output.
There is also an underlying belief in AI safety that shallow knowledge does not generate value and that only deep knowledge (understanding the causal reasons behind processes) does. However, a drift toward shallow work can be counteracted at the idea-searching stage, as it is the ideas that determine a shallow or deep focus, not the individual’s knowledge level. Searching for deep ideas might decrease the probability of finding someone with the knowledge to execute them. Yet a project like an AI safety incubator would enable more people to take divergent paths and thereby increase the probability of generating more deep knowledge.
Steps to creation
If one were to try to create this, it might be difficult to see any pathways forward. It is very hard to build such a thing as an island without support, so the following are some pathways that someone interested in starting an LE incubator could check out.
One pathway is to contact CE and branch out from the current organisation: have CE create more teams and apply the same structure they use now, but in a more specialised manner. The expansion would to some extent fall on CE to organise, and intuitively it feels like they have a lot on their plate and that it wouldn’t happen within a couple of years. But I haven’t talked to them about it, so go ahead!
If one were to start an AI safety variant as a test run, I believe asking Non-Linear for support is a good idea. Non-Linear could be an excellent starting point, as Kat Woods, president and co-founder of Non-Linear, also co-founded CE, and they’re actively looking to improve the efficiency of the AI safety meta-space.
As Alex pointed out in the comments, there are a lot of student-focused incubators, and if EAs were to create one at a university, it could be a good experiment to see whether the model works. It would at least be interesting to find out whether it’s cost-effective, and it could potentially lead to more longtermism-focused incubators.
Summary
It is a good idea to replicate Charity Entrepreneurship’s model for specific areas of longtermism such as AI safety or biorisk. It would build on the ideas of a previously attempted project while increasing the specificity of current incubators and the number of EA projects, thereby reducing the oversaturation of CE applicants. Depending on the implementation, this could generate a lot of value.
Final remarks
I don’t think I’m a good fit for starting this, as it requires experience, like many jobs do; but if you want help starting it, I’ll do what I can. Beyond that, I will update the “Steps to creation” section of the post if you have any other ideas for how this could start. I’d love to be wrong about this variation of LE being a better approach, but I would love to be right even more. That shouldn’t stop you from providing feedback, though: I feel a bit like a lost duckling among emus, and it is common knowledge that emus are scary, so I need guidance.
I’ve also kept this article a bit shorter than I could have made it. As they say, there is beauty in the absence of embellishment.
We are actually doing this! For all the reasons you mention and more.
We’ll be announcing the specifics relatively soon, but in the meantime, we’re incubating an EA hiring agency for longtermists, with an initial focus on PAs, and are working on finding a founder for it. We are also incubating a promising woman to found an as-yet-unspecified charity.
Our model will be similar to CE but adjusted based on the different needs of longtermism and the lessons I learned from CE’s limiting factors.
We will soon be fundraising to increase our capacity to take on more incubatees. Details to be announced soon.
Cool idea! I think there are others who are also thinking about this, and they would probably love a helping hand :) More info in DM.
Thank you, Jonas, for linking to this article in the EA Entrepreneurs Slack group, and thank you, Paal, for tagging me in Slack to draw my attention to it.
Also, thank you for linking to What we learned from a year incubating longtermist entrepreneurship. I wasn’t previously aware of that article. I would have been if I’d searched the EA Forum using the Entrepreneurship tag, which:
Simon Haberfellner and I talked yesterday about a post-Covid plan to foster more meaningful connections between EA entrepreneurs, EA communities, and non-EA customers, angels, and VCs. One super-nerdy metaphor we’ve adopted is a Fourier transform of EAG/EAGx/EA retreats into an extended EA vacation that a small group of EA participants spends in an EA host city. We are still working the creative alchemy on the idea!
Another idea that might lead to longtermist entrepreneurship would be to scout the landscape for EAs interested in investing in the startup scene. For example, for students:
Front Row Ventures (FRV) is a Canadian student-run, university-focused venture capital fund
Black Gen Capital (BGC) is a 100% minority-owned student investment fund in the US
Student investment funds exist at the University of Waterloo, the University of New Brunswick, Mount Royal University, and many more Canadian universities (honestly, I didn’t even know those last two places had universities; aside: maybe this is why anti-Toronto sentiment runs high in small-town Canada!)
Someone wrote How to Start and Run a Student-Managed SRI Fund (I only skimmed it, but it has 37 footnotes, so they probably did far more research than I’m capable of)
Questions you could explore:
Do such student-run funds also exist in Europe, Africa, Asia-Pacific and Latin America?
How would an EA who is a student join such a fund? Are they only for finance students?
Are there any EAs already involved in these funds? If not, would outreach in this direction be cost-effective?
What would be required to start a longtermist student-managed fund at University X? Which universities would be good candidates? A ranking table might reveal UNB and MRU to be contenders after all…
I’d be keen to hear about this as well.