Hi again Ian!
Yes, you are correct; I could have clarified in my reply to you that there is no RCT yet.
A CEA doesn't have to be based on RCT data. As long as the assumptions being made are clear, it is up to the reader to accept or reject them and to evaluate the merit of the CEA on that basis. I think you may be confusing a (theory-based) estimate with an actual evaluation, which is our fault, as I also see that this distinction is not entirely clear.
In other words, even if conducting an RCT to validate the assumptions used in the CEA were not on our roadmap, the CEA itself can still use the term "control" to make it analytically clear that the "control" refers to a group that differs from the "treatment" group only in that it did not attend the programme.
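To make that concrete, here is a minimal sketch of how such a theory-based CEA can be wired together. Every number below is a placeholder assumption for the reader to accept, reject, or adjust, not one of our actual figures:

```python
# Minimal theory-based CEA sketch. All numbers are placeholder assumptions,
# not validated by an RCT and not our actual estimates.

cost_per_participant = 2_000      # assumed total programme cost (USD)
control_earnings = 450 * 12       # assumed counterfactual earnings (USD/year)
treatment_earnings = 1_500 * 12   # assumed post-programme earnings (USD/year)
placement_rate = 0.40             # assumed share who land a remote job
years_of_benefit = 5              # assumed horizon for the earnings uplift
discount_rate = 0.04              # annual discount rate

# Present value of the earnings uplift for one placed participant.
uplift = treatment_earnings - control_earnings
pv_uplift = sum(uplift / (1 + discount_rate) ** t
                for t in range(1, years_of_benefit + 1))

expected_benefit = placement_rate * pv_uplift
print(f"Benefit/cost ratio: {expected_benefit / cost_per_participant:.1f}")
```

The point is simply that "control" here names an assumed counterfactual earnings path, which a reader can swap out for their own number.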
Regarding existing studies: I have not really found any. I am in talks with some of my professors here at Oxford who were looking into doing something somewhat similar, but still quite different (simple IT gig work of the Amazon Mechanical Turk type).
More generally, the devil is in the details for programmes like ours. Just because someone, somewhere, has evaluated an X-week-long IT training programme in country Y doesn't mean that this generalises in any meaningful way to what we are doing. Analogously, say you have a social media website in 2005 where users can create profiles and connect with each other. Will this company be valued at $1B or $10B, or be bankrupt, in 10 years? It all comes down to the people, the execution, and many small details.
We target students right after high school in Kenya, have partner high schools (high-performing ones), use two-week milestone assessments with conditional cash transfers, have direct contact with Western tech companies through an end-to-end pipeline from zero to internship to job, and focus on state-of-the-art MERN-stack JavaScript development. All of these matter for the end impact.
For example, Educate! runs vocational training programmes and says its programmes have measurably impacted 250,000 youth (experienceeducate.org):
“Educate! tackles youth unemployment by partnering with youth, schools, and governments to design and deliver education solutions that equip young people in Africa with the skills to attain further education, overcome gender inequities, start businesses, get jobs, and drive development in their communities.”
Now, their focus is very different from ours. They target a population that is unemployed; we target a population that is not (our students are between high school and university).
Here is an example of an intervention implemented by the same NGO, in the same country, that had zero measured impact, despite strong prior RCT evidence suggesting otherwise: "The Comparative Impact of Cash Transfers and a Psychotherapy Program on Psychological and Economic Well-being" (nber.org). Here is a podcast episode where I interviewed the author: https://open.spotify.com/episode/6PNL8nJ5acgAWhIuVThym0?si=d780a64f12644f9a
Regarding why I focus on this: I could write at length about the personal journey and the full long-form argument for it, but in short, I think it has the potential to be the most cost-effective (development) intervention there is. Why? Because the greatest alleviation of large-scale suffering has historically always been grounded in a strong economy, and specifically in a flourishing export industry. Remote work is an export industry, and we are now in a position to help upskill this industry through knowledge transfer at very low cost (since all the information and content is already out there; we just need to structure it and match the talent with the opportunity).
Happy to elaborate if you are at EAG London.
Hi Simon,
I agree that an RCT is not needed to create a cost-effectiveness estimate. To me, when you wrote "This is an RCT", that implied it was indeed based on an RCT. That is why I suggested language such as "proposed RCT" to make it clear that the RCT is hypothetical at this point.
I agree that when it comes to programs such as educational interventions, context can be very important. But isn't it worth asking whether there is good evidence that these sorts of programs are generally effective across contexts? Or, to put it differently: without evidence about a specific context, what is the base rate of effectiveness?
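One crude way to get at that base rate would be to pool the effect estimates from prior evaluations. A minimal sketch, where all the numbers are invented purely to show the arithmetic:

```python
# Hypothetical effect estimates (monthly earnings gain, USD) and standard
# errors from prior vocational-training RCTs -- invented numbers, purely
# to illustrate inverse-variance pooling.
effects = [12.0, 4.0, -2.0, 8.0]
std_errors = [5.0, 3.0, 4.0, 6.0]

# Fixed-effect (inverse-variance weighted) pooled estimate.
weights = [1 / se**2 for se in std_errors]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5
print(f"Base-rate effect: {pooled:.1f} +/- {1.96 * pooled_se:.1f} USD/month")
```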
A quick Google search reveals "Subsidizing Vocational Training for Disadvantaged Youth in Colombia: Evidence from a Randomized Trial" by Attanasio et al. and "The Labor Market Impact of Youth Training in the Dominican Republic: Evidence from a Randomized Evaluation" by Card et al. The former study found a modest effect comparable to GiveDirectly, and the latter found a small to null effect. Have you looked at these studies?
I agree that the growth of export industries has frequently been a strong source of economic development, but that doesn't mean we should assume it would be easy to speed up through philanthropic activity. Effective altruism means being skeptical and humble about what is or is not realistic to achieve.
Thank you for pointing that out. I was sloppy in my wording! :)
As I mentioned earlier, I was unfortunately not able to find any relevant studies with transferable insights. There is ample literature on primary and secondary school interventions, and on general vocational training programmes. But there is none that:

- targets digital remote employment in low-income countries (simply because that wasn't feasible from an infrastructure point of view until recently), and
- does NOT target those who are already unemployed.
To be more specific: the studies you cite are simply not relevant. It's as if I suggested medical intervention X to combat a disease, and you found studies A, B, and C from 10 years back that used intervention Y to combat other diseases. On a very superficial level they may seem similar, in the same way that studying at Harvard University is similar to studying at Södertörn University. But they are hardly useful as proxies for the cost-effectiveness of our particular intervention.
First of all, they are NOT digital skills programs. And as an aside, this type of intervention wasn't even possible just 4-5 years ago, because the necessary internet infrastructure simply didn't exist in Kenya or Ethiopia back then.
Second, they target currently unemployed youth or less-educated youth. We do not target this group. Our intervention targets the upper segment of highly talented individuals, most of whom won't otherwise have the resources or access to the top-quality training needed to succeed.
Third, the amount those programs invest per individual is low relative to ours. Our intervention is definitely not as scalable as some of these programs (which are designed to scale); we instead focus on targeting efficiency. That means we invest a lot more per individual, and much more in selecting whom we support. This is inspired by existing research on the heterogeneity of the effectiveness of microcredit by Banerjee et al. (2018), the "gung-ho entrepreneurs" paper.
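To illustrate why selection can matter so much when effects are heterogeneous, here is a toy simulation; the effect distribution is invented purely for illustration, and the selection rule is idealised:

```python
import random

random.seed(0)

# Assume individual treatment effects are heterogeneous: most people gain
# little, a small tail gains a lot (parameters invented for illustration).
population = [random.lognormvariate(0.0, 1.5) for _ in range(100_000)]

# A broad, scale-oriented programme treats everyone; a targeted programme
# screens for the top 5%. Here selection is proxied by the effect itself
# (in reality it works via assessments, so this is an upper bound).
population.sort(reverse=True)
top_5pct = population[: len(population) // 20]

avg_all = sum(population) / len(population)
avg_top = sum(top_5pct) / len(top_5pct)
print(f"Mean effect, everyone: {avg_all:.2f}")
print(f"Mean effect, top 5%:   {avg_top:.2f} ({avg_top / avg_all:.1f}x)")
```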
Instead, I propose that in our case it is much more informative to look at current market data on (a) how much remotely employed software engineers earn and (b) what they need to learn in order to get these jobs.
To give you some numbers: a comprehensive Stack Overflow survey from 2018, with around 100,000 respondents, reveals that among the 55% who enrolled in a coding bootcamp without already having jobs as developers, 71.5% found jobs as professional developers within six months (n=6,652).
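As a rough sanity check, here is what naively transferring that rate to a hypothetical cohort would imply; the 50% transfer discount is a pure placeholder, since the survey population is not Kenyan school leavers:

```python
cohort = 100               # hypothetical cohort size, none already developers
placed_within_6mo = 0.715  # survey: share of non-developer bootcamp enrollees
                           # employed as professional developers within 6 months
transfer_discount = 0.5    # assumed haircut for context transfer (pure guess)

naive = cohort * placed_within_6mo
discounted = naive * transfer_discount
print(f"Naive transfer:    {naive:.0f} placements per {cohort}")
print(f"With 50% discount: {discounted:.0f} placements per {cohort}")
```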
With that said, we make several assumptions in the CEA, and I'd love to get informed critiques of those assumptions so we can adjust them to be more realistic. We've tried our best to find good data, but that itself takes time and a lot of effort. We are in the process of rolling out surveys of former students and of working professionals, to figure out both the counterfactual earnings of comparable students from earlier years at the same schools and the earnings of people who work as remote engineers. Our current estimates are in the range of $300-$600/month and are based on informal surveying in both countries. Anecdotally, those who do get jobs have often learned the frameworks and languages themselves using pirated versions of Udemy courses or similar (even if they have CS degrees).
However, given that US entry-level software engineering salaries are around $12,500/month, there is clearly ample room for potential.
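Even heavily discounting that gap, the implied uplift over the counterfactual is large. A quick illustration, where the pay fractions are placeholder assumptions rather than data:

```python
counterfactual = (300 + 600) / 2  # midpoint of our informal survey range, USD/month
us_entry = 12_500                 # quoted US entry-level salary, USD/month

# Assume a remote hire from Kenya earns only a fraction of the US figure
# (placeholder assumption -- pinning down actual remote pay is exactly
# what our surveys are meant to do).
for fraction in (0.1, 0.2, 0.3):
    remote = us_entry * fraction
    print(f"At {fraction:.0%} of US pay: ${remote:,.0f}/mo, "
          f"uplift {remote / counterfactual:.1f}x counterfactual")
```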
It's not a matter of whether some of the brightest talents in Africa can compete for these six-figure jobs. It's a matter of asking what it takes for them to get there.
Thank you for keeping the conversation going! It's very helpful, as I'm forced to flesh out my arguments. This will help me prepare a long-form post at some point later on! :)
Best,
Simon