How the CBT Lab Amplifies Effective Altruists’ Impact

Our 2024 CBT Lab Peer Support program helped participants feel and perform better in just 12 weeks. Our effectiveness evaluation suggests that participants complete the program with notably improved productivity and mental health. As the program targets individuals dedicated to making the world better, we have good reason to believe that the personal benefits they receive amplify their altruistic work.[1] For example, by enabling those in high-impact roles to boost their impact and helping those who earn to give to increase their income potential, the CBT Lab Peer Support program has positive ripple effects on the cause areas our participants serve.

First, a quick note on the value of marginal donations to us

Rethink Wellbeing has previously received funding from the Mental Health Funding Circle and Open Philanthropy, among others, to help us develop and deliver our services. In 2025, the running costs for the CBT Lab Peer Support program can be covered with 150 paying participants per year. However, we believe that cost should not be a barrier to accessing mental health support. While some EA organizations provide personal wellbeing budgets to support their employees, many smaller promising organizations can’t afford to do so. Given the impact-amplifying effect of our program, we want it to be accessible to as many ambitious altruists as possible.

In 2024, we supported over 75 changemakers to join our programs at low or no cost. With $30,000 in donor funding, we can achieve this again. We are a multiplier: by supporting us, you empower ambitious altruists to improve their wellbeing and productivity, increasing their impact on the world’s most pressing problems.

  • $30 funds one hour of guided support for an ambitious altruist.

  • $350 provides a full program place, free of charge, for an ambitious altruist who could not otherwise afford to participate.

Please consider donating to help us maintain our no-cost offer. If you would like to fund program places for a specific cause area, please contact us to discuss your idea.

Read on to find out more about how our programs amplify participants’ impact.

Introduction

The CBT Lab Peer Support program is an online mental health support program by Rethink Wellbeing, designed to help ambitious altruists tackle psychological challenges, such as perfectionism, procrastination, and existential dread, that often arise when working on the world’s most pressing issues. In this year’s iteration, participants engaged in 8 weeks of interactive sessions, with follow-ups in weeks 12 and 16. Through weekly group sessions and personalized exercises, participants built practical skills to navigate personal and professional challenges with more confidence and resilience.

Last year, 42 ambitious altruists joined our first program.[2] This year, we’ve iterated on our flagship program, replicated its positive outcomes, and added two more programs to broaden our reach. We’ve welcomed ~150 ambitious altruists to our programs in 2024, including employees at Giving What We Can, the Charity Entrepreneurship Incubator, and the AI Safety Institute. 59 altruists participated in our renewed CBT Lab Peer Support program, while the rest joined our CBT Lab Simple Support format or our brand-new Internal Family Systems (IFS) Lab.

Participants recommend the CBT Lab Peer Support program

85% of CBT Lab Peer Support participants are mostly or very satisfied with the program (N=104) and 84% would recommend the program to a colleague or close friend (N=56).[3] Notably, 1 in 5 participants felt even more committed to the ideas of Effective Altruism due to the program, and none reported a drop in commitment. These results mirror the success of the 2023 CBT Lab, as demonstrated by the charts below.

Figure: “Program Satisfaction” shows responses to the question “How satisfied are you with the program overall at this point?” (N=104): 28.8% “Very satisfied”, 55.8% “Mostly satisfied”, 12.5% “Indifferent or mildly dissatisfied”, 2.9% “Quite dissatisfied”, and 0.0% “Very dissatisfied”.
Figure: “Program Recommendation” shows responses to the question “Would you recommend taking part in our program?” (N=56): 32.1% “Yes, definitely”, 51.8% “Yes, I think so”, 16.1% “No, I don’t think so”, and 0.0% “No, definitely not”.

Here’s how a participant summed up the program:

“The content was excellent: evidence-based, digestible, and very action-oriented. The support from my peers, especially my facilitator, helped me feel seen and heard. Having scheduled time to connect and share progress was very motivating and brought a sense of belonging.”

You can find more testimonials on our website.

How effective is the CBT Lab Peer Support program?

To measure the program’s effectiveness, we used a standardized self-assessment that employs common psychometric scales.[4] Participants filled out a 30-minute Wellbeing Evaluation Survey once before the program and three more times over the following 16 weeks. We also used a shorter Weekly Program Evaluation to enable real-time improvement.

Our findings show that participants who completed the CBT Lab feel and perform significantly better.[5] Key outcomes include:

  • Productivity gains equivalent to 8 additional weekly working hours, effectively unlocking an extra workday.[6]

  • A greater wellbeing increase (~1 point on a 0-10 WELLBY scale) than is associated with becoming partnered (+0.59) or finding employment (+0.70).[7]

  • Better mental health: symptoms of depression, anxiety, and executive dysfunction decreased by 12-23%, and positive functioning, which includes feelings of engagement, perseverance, optimism, connectedness, and happiness, increased by 29%.

Figure: The average change reported by 48 participants in all measured variables, expressed as standardized scores (z-scores) comparing week 0 with weeks 8 and 12; higher values represent larger pre-post improvements. Between week 0 and week 12, Positive Mental Health increased by 0.5 standard deviations, Mental Health Burden decreased by 0.36 standard deviations, and Productivity increased by 0.39 standard deviations. A z-score of 0 indicates a score identical to the mean, and a z-score of 1.0 indicates a value one standard deviation from the mean; positive values lie above the mean and negative values below it.
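For readers who prefer a formula: a standardized score (z-score) is simply the difference between a raw score and the sample mean, divided by the sample standard deviation. This is the standard statistical definition rather than anything specific to our data:

$$z = \frac{x - \bar{x}}{s}$$

where $x$ is a participant’s score, $\bar{x}$ the sample mean, and $s$ the sample standard deviation.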

The sample size for these results is N=48.[8] Participants who dropped out of the program did not complete further Wellbeing Evaluations, so we cannot include them in our effectiveness evaluation. This may bias the results towards higher effectiveness. Another limitation of our study is that participants were allocated based on their preferences rather than randomly. Despite these limitations, we believe the results remain both suggestive and realistic.[9] We hope that independent organizations will attempt to replicate our findings by running our programs themselves. Stay tuned for more details on this.

One note on quantification: we do not emphasize productivity as a goal during the program itself, because obsessing over impact is often a contributing factor to poor mental health.

Could our $550 program be worth thousands?

The case for helping ambitious altruists unlock additional resources is compelling, as it likely translates into greater good achieved in the world.

As promised in our 2023 Impact Report, we used this year’s data to re-test our cost-effectiveness. An 8-hour weekly productivity gain sustained for a full year represents $8,900-$12,100 of additional value for the average US college graduate.[10] This is around 20x the full cost of the program. We do not control how participants apply their productivity gains, but our participant base of altruists dedicated to improving the world gives us good reason to believe that we are an impact multiplier. For example, if even one in ten participants donates their extra earnings to a GiveWell Top Charity, they would save 2-3 lives for every $5,500 spent on program fees (ten places at $550 each).[11]
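For transparency, here is a minimal Python sketch of the back-of-the-envelope reasoning above. The constants are the assumptions spelled out in footnotes 10 and 11, not new data, and the output simply reproduces the ranges quoted in this section.

```python
# Rough back-of-the-envelope check of the figures above, using only the
# assumptions stated in footnotes 10 and 11 (illustrative, not new data).

WEEKS_WORKED_PER_YEAR = 48
EXTRA_HOURS_PER_WEEK = 8
HOURLY_WAGE_RANGE = (23.11, 31.48)    # BLS median and mean US hourly wage, May 2023
PROGRAM_FEE = 550                     # full program fee per participant (USD)
COST_PER_LIFE_SAVED = (3_000, 5_500)  # GiveWell top-charity estimate (USD)

# Annual value of an 8-hour weekly productivity gain
extra_hours = WEEKS_WORKED_PER_YEAR * EXTRA_HOURS_PER_WEEK               # 384 hours
value_range = [round(extra_hours * wage, -2) for wage in HOURLY_WAGE_RANGE]
print(value_range)                                 # [8900.0, 12100.0] USD per year
print([round(v / PROGRAM_FEE, 1) for v in value_range])      # [16.2, 22.0]x the fee

# If one in ten participants donates the extra earnings to a GiveWell top charity,
# each donating participant corresponds to ten program fees (10 x $550 = $5,500).
lives = [round(v / c, 1) for v, c in zip(value_range, reversed(COST_PER_LIFE_SAVED))]
print(lives)                                       # [1.6, 4.0] lives, i.e. roughly 2-3
```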

A potential positive effect that is harder to quantify is EA member retention. As estimated in 2023, one in three people in the EA community experience poor mental health, but we don’t know how many of them become less engaged or leave as a result.[12] 20% of participants report feeling more committed to EA after completing the program. In 2025, we’d like to understand this effect further and consider how to prioritize it relative to other program effects, such as the wellbeing and productivity benefits for highly engaged EAs.

We would like to invite individuals with a background in data analysis to independently examine the data, as well as those interested in using it for a thesis or scientific publication.

Room for improvement

In addition to evaluating each program, we leverage our research to identify opportunities to improve our effectiveness through a combination of quantitative and qualitative feedback. Our Weekly Program Evaluation, conducted during program weeks 1-8, measures effectiveness factors chosen for their potential to predict positive outcomes of mental health programs.

In weeks 2-6, participants rated all effectiveness factors except Home Practice above 7 (on a 0-10 scale, with 10 representing 100% effectiveness). There is also a slight upward trend across the effectiveness factors overall between weeks 1 and 6. This suggests that the sessions and the facilitation improve over time.

Figure: Participants’ average 0-10 ratings of eight effectiveness factors over weeks 1-8 of the program.

Home Practice is the only effectiveness factor trending downward, which may reflect the waning initial excitement of joining the program. Participants’ qualitative feedback indicates that we can support future participants to establish stronger home practice habits by improving the CBT Lab Playbook provided at the program’s start. Specific improvements include:

  • Offering practical examples of how and when different techniques can be applied.

  • Providing recaps to simplify content review.

  • Suggesting priority tiers for home practice and reading so that participants who are short on time can focus on the most valuable tasks.

As we prepare for 2025, we’ll use feedback like this to iterate our program content and materials, making participants’ journeys even better and more impactful next year.

You can multiply your impact too

Thank You

… to our participants, facilitators, advisors, and funders for making our mission possible, and to those who provided valuable suggestions to improve this post, including Báo, Charlie O’Donohue and Justis Millis.

Last but not least, take good care of yourself and the people around you!

  1. ^

    Rethink Wellbeing’s programs are marketed in EA-associated Slack groups, Facebook groups, and on the EA Forum. In addition to this, involvement in altruistic work is a program acceptance criterion.

  2. ^

    Read our 2023 Impact Report on the EA Forum

  3. ^

    N is higher than the number of participants because participants are invited to provide feedback at multiple points.

  4. ^

    You can read more about our Program Effectiveness Evaluation Measures

  5. ^

    The changes in our primary outcome measures were all statistically significant (p < 0.05), and we are reasonably confident they are not due to random chance. We used repeated-measures ANOVA to test for significance.

  6. ^

    This figure is the sum of an average of 2.7 additional hours worked per participant per week and a ~19% self-assessed subjective productivity increase during working hours (equivalent to +5.2h), measured with the WPAI:GH.

  7. ^

    According to previous cross-sectional studies, which also note that “[m]ental health is the most important single factor explaining the variation in the happiness of the population” (ibid.).

  8. ^

    N includes only participants who completed the Wellbeing Evaluations in both week 0 and week 12. Of the 59 participants who started the CBT Lab Peer Support program, 8 dropped out and 3 did not complete the week-12 Wellbeing Evaluation. This dropout rate (8/59, ~14%) is similar to that of 1:1 psychotherapy (~16%; Fernandez et al., 2015; see also Ong et al., 2018).

  9. ^

    The effect sizes are comparable to professional 1:1 therapy in multiple meta-analyses, e.g., Baumeister et al., 2014.

  10. ^

    We work with the underlying assumption that productivity is related to outcomes such as salary and quality of work. Based on 48 working weeks per year, an 8-hour-per-week increase results in 384 additional hours worked. According to the U.S. Bureau of Labor Statistics, the median and mean hourly wages for U.S. workers in May 2023 were $23.11 and $31.48, respectively, giving 384 × $23.11-$31.48 ≈ $8,900-$12,100. We apply a range because a large majority of participants hold a degree or higher, which correlates with earning above the median wage; however, they also skew early- to mid-career. This calculation is most applicable to freelancers, students, and employees of EA organizations.

  11. ^

    GiveWell estimates an average cost-effectiveness of $3,000 to $5,500 per life saved for its top charities.

  12. ^

    EA mental health & productivity survey; see also recent elaborations like this or this post.