Applications are open for CFAR workshops in Prague this fall
tldr: CFAR’s running updated versions of our mainline rationality workshop this fall in the Prague area. Apply here.
===
CFAR workshops are back! After a nearly three-year hiatus during COVID, we’re going to be running a series of four workshops this fall in the Czech Republic.
So what is a CFAR workshop again?
These workshops are four-and-a-half-day intensive retreats designed around the rationality techniques CFAR has developed over its 10+ year history. These techniques vary: some we pulled straight from the academic literature (looking at you, TAPs), others we adapted from outside practices (Gendlin’s Focusing), and some we developed wholesale (hello Double Crux, Goal Factoring, and all that which is still unnamed). Our goal with this workshop is to create an environment in which participants are able to gain new insight into their minds and their decision-making processes.
Why would you come to a CFAR workshop?
Because you want to! That’s the most important part.
But maybe also you’ve got a sense of stuckness, and a new frame or tool might help you shift. Maybe it’s exactly the opposite of stuck: you’re moving and changing, and you want to be deliberate about how you move forward. Maybe you’re just intrigued and excited about this thing you’ve heard about from your friends. No guarantees from us on the result, but folks often walk away with insight, fascinating new friends, and some outstanding ideas. Most participants report that they were glad they came (the only trick is figuring out how to select yourself into the correct reference class).
We think that the workshop would be particularly valuable if you:
Are enthusiastic about rationality and you want to nerd out
Have an exploratory/playful/experimental mindset—you like to try things!
Are looking for an environment where you can have open and honest conversations about what you care about
A caveat, about this workshop...
It’s experimental. Which, if you’re familiar with CFAR’s history, is true of many of our workshops. If the material isn’t alive and real and practiced by our instructors, we don’t believe it will be for participants. And since it’s been nearly three years since our last workshop, we and our beliefs have shifted a bit. It’ll still be fairly recognizable to those familiar with our applied rationality, but we want to set your expectations. The most notable shift is we’re saying goodbye to our problem-oriented approach (colloquially known as “bugs”) and focusing on the perspective of asking: what do you actually want and how can we move toward that? We’ll also be incorporating a holistic approach to your goals and needs, an understanding of the limits of improved productivity, an orientation to uncertainty, and more.
Where and when?
There will be four workshops this fall in Prague, Czech Republic (more precisely: in a village about 90 minutes from Prague). All workshops start at 6pm on arrival day and finish with breakfast at 9am on departure day.
September 29 - October 4
October 20 - October 25
November 3 - November 8
November 16 - November 21 (to be confirmed)
There’s also going to be the Prague Fall Season (more info soon!), full of fun and interesting events that will be happening concurrently. Consider coming early or staying late if you want to spend some time taking part in that or just exploring.
If going to Prague is too inconvenient or expensive for you, the odds of spring 2023 workshops in California are higher than ever (but no promises at this point).
How do I get in on this?
Apply here. We’ll be processing applications on a rolling basis and aim to get back to everyone within two weeks of their application. Depending on demand, we might ask you to provide more information later in the process. You’ll want to apply through the Google form even if you’ve previously expressed interest via our rationality.org website.
How much does it cost?
It’s free! This is partly due to the new and experimental nature of the workshops, but more importantly, we want to shift from a stance of “Hey, we’re offering you a product” to a collaborative spirit where everyone at the workshop is in it together, building something beautiful.
The team is full of optimism, excitement, and new ideas. Come join us.
CFAR’s mistakes regarding Brent
Although CFAR noted it needed to greatly improve re. “Lack of focus on safety” and “Insufficient Institutional safeguards”, evidence that these have improved, or that they are now adequate, remains scant. Noting “we have reformed various things” in an old update is not good enough.
Whether anything would be ‘good enough’ is a fair question. If I, with (mostly) admirable candour, describe a series of grossly incompetent mistakes during my work as a doctor, the appropriate response may still be to disqualify me from future medical practice (there are sidelines re. incentives, but they don’t help). The enormity of fucking up as badly as in the examples documented there (e.g.[!!!]) should be sufficient to disqualify CFAR from running ‘intensive’ residential retreats, especially given the ‘inner work’ and ‘mutual vulnerability’ they (at least used to) have.
I would also hope a healthy EA community would warn its members away from things like this. Regardless, I can do my part: for heaven’s sake, just don’t go.
I strongly disagree with Greg. I think CFAR messed up very badly, but I think the way they messed up is totally consistent with also being able to add value in some situations.
We have data I find convincing suggesting a substantial fraction of top EAs got value from CFAR. ~ 5 years have passed since I went to a CFAR workshop, and I still value what I learned and think it’s been useful for my work. I would encourage other people who are curious to go (again, with the caveat that I don’t know much about the new program), if they feel like they’re in a place of relative strength and can take a discerning eye to what they’re taught.
I think “doctor” is a really disanalogous example to use; doctors are in one of the relatively few professions where screwups regularly lead to death. We want to be somewhat risk-averse with respect to doctors (and e.g. pilots or school bus drivers), at least if the screwups are the very dangerous kind (as opposed to, say, being terrible at filing one’s paperwork) and aren’t based on a reasonable CBA (e.g. enrolling patients in a clinical trial with a drug that looked promising but turned out to be dangerous). For lots of other professions, this example looks way less compelling; e.g. I doubt people would think that a startup founder or movie director or author who had a bunch of failures but also some big wins should be banned from their profession or ostracized in their community. I think in-person overnight events about psychology are in a pretty in-between risk category.
I don’t find said data convincing re. CFAR, for reasons I fear you’ve heard me rehearse ad nauseam. But this is less relevant: if it were just ‘CFAR, as an intervention, sucks’, I’d figure (and have figured over the last decade) that folks can make up their own minds without my help. The worst case, if that were true, is wasting some money and a few days of their time.
The doctor case was meant to illustrate that sufficiently consequential screw-ups in an activity can warrant disqualification from doing it again—even if one is candid and contrite about them. I agree activities vary in the prevalence of their “failure intolerable” tasks (medicine and aviation have a lot, creating a movie or a company very few). But most jobs which involve working with others have some things for which failure tolerance is ~zero, and these typically involve safety and safeguarding. For example, a teacher who messes up their lesson plans obviously shouldn’t be banned from their profession as a first resort; yet disqualification looks facially appropriate for one who allows their TA to try and abscond with one of their students on a field trip.
CFAR’s track record includes a litany of awful mistakes re. welfare and safeguarding. Each taken alone would typically warrant suspension or disqualification, and in concert they should guarantee the latter, as they demonstrate not (e.g.) a “grave mistake which is an aberration from their usually excellent standards” but a pattern of gross negligence and utter corporate incompetence. Whatever degree of intermediate risk attending these workshops constitutes is unwise to accept (or to encourage others to accept), given that CFAR realising these risks is already well-established.
To build on Greg’s example: I think in normal circumstances, if e.g. a school was linked with a summer camp for high schoolers, and the summer camp made the errors outlined in the linked post, then the school would correctly sever ties with the summer camp.
The mistakes made seem to me to be outrageously bad—they put teenagers in the custody of someone they had lots of evidence was an unethical sociopath, and they even let him ask a minor to go to Burning Man with him, and after that still didn’t ban him from their events (!). Although apparently little harm was done, this seems to me to have been very lucky, and if the minor had agreed (which CFAR apparently would not have prevented) this most likely would have ended extremely badly. If the minor had agreed and it had ended extremely badly, would you think that should disqualify them from running future events? If yes, why should the good fortune of the minor turning down the invitation make any difference to how we treat CFAR?
From accounts I heard later (I was not at the camp, but did hear a lot about it from folks who were), I’m basically certain CFAR would have interfered with the minor going even if the minor had agreed. Multiple CFAR staff members stepped in to attempt to prevent the minor from going (as mentioned in e.g. https://www.rationality.org/resources/updates/2019/cfars-mistakes-regarding-brent, and as I also remember from closer to the time), much fuss was correctly made at the time, etc. I agree that many bad mistakes were made, then and previously and afterwards, however.
Also, after we eventually understood what the deal had been with Brent, we gave up running programs for minors. We continue to run programs for adults. My feeling is that adults should indeed not expect that we are vetting a particularly careful or safe environment particularly reliably, but that this is often not the crux for whether an adult wishes to attend a CFAR workshop.
Hi Gregory, I will be running these workshops together with John, so I’d like to respond to your comments.
I think that it is fair for you to post your warning/recommendation, but as far as I can tell, today’s CFAR is quite different from the organization that you say demonstrated “gross negligence and utter corporate incompetence” in the past. You say that the evidence that anything has changed is sparse; I’m not sure about that, but I’m also not the person to make that case, because I’m not CFAR. I’m a CFAR developer running a project with other CFAR developers and a couple of CFAR core staff.
I can only speak for myself as one of the co-leads of this project and what I can say is that we see the skulls. They’re a bit hard to ignore since they’re everywhere! But that is exactly why we think we have enough of an understanding of what happened and how to learn from it. We are very much aware of the previous mistakes and believe that we can do better. And we want to try because we think these workshops are good and important and that we can do them well.
I generally think that people and organizations deserve second chances but ultimately it is for the people to decide. We will be fulfilling our role as event organizers by mitigating risks to our participants, to the extent that we reasonably can (based on CFAR’s past mistakes and also based on our own experience and judgment). And our participants will decide if they trust us enough to come to an immersive workshop with a bunch of other humans who they will interact with for 4.5 days.
“We will be fulfilling our role as event organizers by mitigating risks to our participants, to the extent that we reasonably can (based on CFAR’s past mistakes and also based on our own experience and judgment). And our participants will decide if they trust us enough to come to an immersive workshop with a bunch of other humans who they will interact with for 4.5 days.”
I was recently asked for recommendations of people who might like to attend. I would find it more useful to know what safeguards, if any, are now in place to avoid similar situations in the future. There seems to be a big difference between consenting to attend an event with no safeguards and consenting to attend one with them (an EAG-style Code of Conduct, appointed persons to deal with concerns, etc.). If you have time, can you elaborate? Thanks very much.
It’s great that CFAR is running workshops again. I’ve heard great things about them and would like to join one in the future (though I probably will not join these ones in Prague).
I just want to ask: why isn’t the info above reflected on the CFAR website yet? When will it be reflected there, if ever? I think it would help increase the legitimacy and appeal of these workshops, rather than having people go to a Forum post with no photos.
Hi Brian, I hope that you’ll eventually be able to make it to the workshops, we certainly hope that there will be more next year.
Eventually, we would like to update the website, but we have limited capacity, and we were focused on getting the dates and applications out as early as we knew they were happening so that people could pencil in the dates. Now we are also running admissions, coordinating staff, working on content and logistics, etc. Given all of this, it will likely take us a couple more weeks to update the website.
What does this mean, please? “the only trick is figuring out how to select yourself into the correct reference class”
Also, I love this: “The most notable shift is we’re saying goodbye to our problem-oriented approach (colloquially known as ‘bugs’) and focusing on the perspective of asking: what do you actually want and how can we move toward that? We’ll also be incorporating a holistic approach to your goals and needs, an understanding of the limits of improved productivity, an orientation to uncertainty, and more.”