Founder of Overcome, an EA-aligned mental health charity
John Salter
This seems like something western militaries might be interested in buying to defend against bioweapons. You might be able to use Anduril as a middle-man and maybe bag yourself a couple million dollars to fund your work? I think I know someone who could introduce you to their CEO. DM me if you’re interested!
Consider reaching out to AIM.
https://www.charityentrepreneurship.com/effective-giving-incubation
They’re doing something similar, but seem to have broken it down by country rather than by faith. They might even have some people who narrowly missed out on their incubation program who’d make a good co-founder.
https://www.eaforchristians.org/
EA for Christians might also have learnt some useful stuff they could share
This seems like a really promising idea—good luck with it!!!
The vast majority of psychotherapy drop-out happens between session 1 and 2. You’d expect people to give it at least two sessions before concluding their symptoms aren’t reducing fast enough. I think you’re attributing far too large a proportion of drop-out to ineffectiveness.
I run another EA mental health charity. Here are my hastily scribbled thoughts:
Firstly, why did you opt to not have a control group?
When psychotherapy interventions fail, it’s usually not because they don’t reduce symptoms. They fail because they can’t generate supply / demand cost-effectively enough, can’t find pilot- and middle-stage funding, can’t find a scalable marketing channel, or hit some other logistical issue.
Given that failing to reduce symptoms is not that big a risk, we and every other EA mental health startup I can name did not use a control group for our pilots. Doing so would increase the cost of recruitment by ~10x and the cost of the pilot by ~30% or so.
The #1 reason is that so long as you’re using an evidence-based intervention, cost explains most of the variance in cost-effectiveness.
Secondly, isn’t it a massive problem that you only look at the 27% that completed the program when presenting results? You write that you got some feedback on why people were not completing the program unrelated to depression, but I think it’s more than plausible that many of the dropouts dropped out because they were depressed and saw no improvement.
It’s also possible that they started feeling better and they didn’t need it any more. IMO, this is a little tangential because most dropout isn’t much to do with symptom reduction, it’s more to do with:
1 - (A lack of) Trust in and rapport with the therapist
2 - Not enjoying the process
3 - Not having faith it will work for them
4 - Missing a few sessions out of inconvenience and losing the desire to continue
It’s somewhat analogous to an online educational course. You probably aren’t dropping out because you aren’t learning fast enough; it’s probably that you don’t enjoy it or life got in the way so you put it on the back-burner
...[likely] many of the dropouts dropped out because they were depressed and saw no improvement. This choice makes stating things like “96% of program completers said they were likely or very likely to recommend the program” at best uninformative.
This is a good point. These statistics are indeed uninformative, but it’s also not clear what a better one would be. We use “mean session rating” and get >9/10, which I perceive as unrealistically high. Presumably, this should have gotten around the completer problem (as we sample after every session and include dropouts in our analysis), but it doesn’t seem to have. I think it might be because both our services are free, and people don’t like to disparage free services unless they REALLY suck.
If there was a marketplace where you could make bets like this in a low friction way, without much risk besides simply being wrong, would you use it? Please agree-vote for yes, disagree-vote for no
I’d personally strongly consider betting $1000-$10,000 USD so long as it’s secured against the value of some illiquid asset (e.g. a building). Please DM me if you’re interested in betting that the world ends.
What an amazing accomplishment!
Tiny core team costs
Being cost-effective AF year one
Credible plans for becoming financially independent of EA when operating at scale.
Reading this has made my morning.
Strongly upvoted. Replications are really underrated, and so is sharing your code so people can check your work!
If you’re using it enough, you’re probably costing them money. Using the everliving shit out of it would be better for you and for screwing OpenAI, if you’re that way inclined.
Can you tell us a little more about your most promising and median fellows? How old are they? What are your selection criteria and how did they fare against them?
EA forum posts used to be much worse than they are now in terms of conciseness, clarity, and ease of understanding. I believe a series of posts came out mocking the old style / arguing against it, and shortly afterwards writing habits changed.
I don’t know much about being hired by organisations, but I know a lot about starting them.
This answer assumes that you’re very ambitious, and want to do something big and world-changing. Your odds are much better if your aims are more modest. It’s also just my opinion as a guy who’s started a few organisations. I’ve been wrong before, and I will be wrong again.
----
I’m going to tell you some negative stuff, and then suggest a path to getting what you want anyway.
On paper, you aren’t suitable to start a charity. Most charities fail (>80%). On the balance of the evidence presented, it’s more likely you’d be among the 80%:
Being a charity founder usually takes a heroic amount of work over a decade, maybe 60 hours per week on average for ~48 weeks per year for ~5-10 years. This will be hard to do with ME.
Social anxiety is debilitating to a startup founder because the job involves being rejected 19 times for every one time someone says yes. You’ll have to manage teams, fire under-performers, and explain to funders why your plan failed and why they should fund you again anyway. A great founder will nonetheless make mistakes that cost people their jobs, waste months of their time, and hurt their collaborators and beneficiaries.
Given the lack of relevant experience, it’s going to take you longer to get started. You lack the resumé you’d need to get hired or get funding to do the work you’re interested in in EA right now.
The good news is that most successful founders seem to have had some crippling flaw that should have made them a bad fit. They found a way around it. It’s possible that you could too:
You could find people to supplement your lack of energy if you can successfully outsource or delegate the more draining parts of the work
Social anxiety is treatable, even curable, given sufficient effort (~2 hours weekly for ~12 weeks).
Free treatments exist for EAs. Rethink Wellbeing offers free counselling for Effective Altruists, in a group setting. I would be surprised if they didn’t offer bespoke support for social anxiety. Overcome, my charity, also offers it but one-to-one (caveat: it’s not as bespoke to EAs).
If you can get a project off the ground and make good progress, people will stop giving a shit about your lack of qualifications. I’d suggest working on the project linked in your Medium article, creating a Minimum Viable Product, and writing up the results on the forum or wherever your most likely collaborators / funders hang out.
The above is likely quite helpful for getting a relevant job too. Field-relevant accomplishments will help you get an interview. Being less socially anxious will help you come across better. Having strong systems of outsourcing / delegation would make it more credible to employers that your ME isn’t going to undermine your performance.
I hope this is helpful.
More information needed:
What is your knowledge level?
Are there specific areas within the non-profit sector you are particularly interested in?
How many hours per week are you able to commit to volunteering or part-time work?
Do you have any previous work or volunteer experience, even if not directly related to non-profits?
What are your strengths?
What are your salary expectations?
NSFW: How Elon responded: https://twitter.com/elonmusk/status/1783989456414085339/photo/1
Potentially self-funding organisations strike me as neglected within EA
I think we have the exact opposite problem. When I see the budget of many big EA orgs I throw up in my mouth, thinking about how many smaller charities the money could have funded.
Maybe I’m wrong. Got any evidence that larger EA orgs are more cost-effective than smaller ones?
I view our hiring process as a constant work in progress. We look back at the application process of everyone after their time with us, the best and worst performers alike, and try to figure out how we could have told ahead of time. Part of that is writing up notes. We use ChatGPT to make the notes more sensitive and send them to the applicant.
Caveat: We only do this for people who show some promise of future admission.
Have also tried this, although most of our applicants aren’t EAs. People who reapply given detailed feedback usually don’t hit the bar.
We still do it, in part because we think it’s good for the applicants, and in part because people who make a huge improvement on attempt 2 usually make strong long-term hires.
Are you open to blunt constructive feedback?