Field-specific LE (Longterm Entrepreneurship)
Background
I’ve been interested in EA entrepreneurship for a while, as I find myself being a bit bonkers and seriously considering taking a fairly risk-neutral approach to my life. I do, however, feel that if I’m going to gamble a stable life for a much more volatile one, I want to do it where I believe my expected value (EV) is highest (as I’m an expected utility machine). I consider myself a longtermist, and I want to work mainly within AI safety or other emerging technology fields. The primary actor in the area, Charity Entrepreneurship (CE), is unfortunately for my purposes focused on more near-term risks, which is understandable: short-term projects are far more tractable to create than longer-term ones. Yet the EV of longtermist projects is a lot higher, so I want to see a longtermist entrepreneurship incubator.
Something similar has existed before: a longtermist entrepreneurship incubator ran in 2020, and if one looks at the post about it, it was a rigorous idea. The founders of LE quit the project because more impactful opportunities arose, because of the tractability problem, and because of the depth of knowledge required to run an incubator. Yet an undercurrent in all of this is that they tried to create a general longtermist incubator, longtermism being a vast umbrella term covering many different kinds of projects. Incubation at that level of generality is very difficult, as it requires a great many resources to run successfully. But if one applied the structure and ideas of Charity Entrepreneurship to a specific subfield, such as AI safety via an “AI Safety Entrepreneurship” incubator, the project could become a lot more tractable, as the creators of the previous LE incubator themselves thought. Below are some arguments for why I think so, together with some pathways to creation.
Specificity
Currently, CE is an organisation with a lot of weight on its shoulders: a large portion of potential EA entrepreneurs have considered applying there. Yet EA cause areas vary considerably, and longtermist areas in particular diverge in the aptitudes they require. Creating specific sub-incubators such as “AI Safety Entrepreneurship” or “Biorisk Entrepreneurship” would enable faster co-founder compatibility checking, more relevant idea planning, and better-targeted aptitude screening. In short, it would increase the probability of projects succeeding.
Enabling more projects
An LE incubator would most likely enable the founding of more longtermist projects, and there is a massive need for this within the AI safety field, for example. Only a few organisations currently have the capacity to take on AI safety researchers, and there are many more potential AI safety researchers than there are positions for them. An incubator for AI safety could create perhaps 5-100 additional jobs a year through new projects (don’t quote me on the numbers). Incubation would almost certainly increase the productivity of the AI safety field. There is, of course, the problem of ensuring that each project has the experience it needs and that things work out, but if one could replicate CE’s model, it would increase the field’s output.
There is also an underlying view in AI safety that shallow knowledge does not generate value and that only deep knowledge (understanding the causal reasons behind processes) does. However, the risk of producing shallow work can be counteracted at the idea-searching stage, since it is the ideas themselves that demand a shallow or deep focus, not the individual’s knowledge level. Searching only for deep ideas might decrease the probability of finding someone with the knowledge level to match. Yet a project like an AI safety incubator would enable more people to go down divergent paths, increasing the probability of generating more deep knowledge.
Steps to creation
If one were to try to create this, it might be difficult to see a pathway forward. It is very hard to build something like this as an island, without any support, so the following are some pathways that someone interested in starting an LE incubator could explore.
One pathway is to contact CE and branch out from it: have CE create more teams and apply the same organisational model they already use, but in a more specialised manner. The expansion would to some extent fall on CE to organise, and intuitively it feels like they have a lot on their plate and that it wouldn’t happen within a couple of years, but I haven’t talked to them about it, so go ahead!
If one were to start an AI safety variant as a test run, I believe asking Non-Linear for support is a good idea. Non-Linear could be an excellent starting point, as Kat Woods, president and co-founder of Non-Linear, also co-founded CE, and they are actively looking to improve the efficiency of the AI safety metaspace.
As Alex pointed out in the comments, there are a lot of student-focused incubators, and if EAs were to create one at a university, it could be a good experiment to see whether the model works. At the very least, it would be interesting to find out whether it is cost-effective, and it could potentially lead to more longtermism-focused incubators.
Summary
It is a good idea to replicate Charity Entrepreneurship’s model for specific areas of longtermism, such as AI safety or biorisk. This would build on the ideas of a previously attempted project while increasing the specificity of current incubators and the number of EA projects, thereby reducing the oversaturation of CE applicants. Depending on the implementation, it could generate a lot of value.
Final remarks
I don’t think I’m a good fit for starting this, as it requires experience, like many jobs do, but if you want help starting it, I’ll do what I can. Beyond that, I will update the “Steps to creation” section if you have other ideas for how this could get off the ground. I’d love to be wrong about this variation of LE being a better approach, but I would love to be right even more. That shouldn’t stop you from providing feedback, though; I feel a bit like a lost duckling among emus, and it is common knowledge that emus are scary, so I need guidance.
I’ve also kept this article a bit shorter than I could have made it; as they say, there is beauty in the absence of embellishment.