Maybe part of the problem here is that social media demands fast apologies. There could be a speed/quality tradeoff for apologies.
titotal writes:
The FLI has managed to fail at point 1 of an apology: understanding that they did something wrong.
Understanding that you did something wrong is not always an easy thing. Suppose, for the sake of discussion, that Max Tegmark is, in fact, partway down some sort of far-right pipeline. [EDIT: With the recent FAQ release, such speculation now seems unwarranted. I’m leaving the rest of this comment as it was, but please don’t assume I endorse it.] It seems unlikely that a few angry EA Forum comments are going to pull him out of whatever is going on. If you’ve ever argued politics with people online, you know they don’t budge easily.
Demanding fast apologies could cause people to optimize for the appearance of remorse, instead of taking the time to look for genuine remorse within themselves, then express what they find.
If I were an FLI board member, I might do something like:
Make an announcement that Tegmark and others involved in the Nya Dagbladet grant are on leave pending an investigation. This sends a signal on social media that the org is responsive to the concerns people have, and hopefully reduces the pressure to produce another statement quickly. It might be best to have the investigation conducted by a 3rd party, so outsiders get a credible view on what happened.
Get Tegmark & others to do an “apology steelman” exercise, spending time talking to people who are predicted to have the highest chance of persuading him that the existing statement is inadequate—including a mix of relevant domain experts and EAs. Pay those people for their time. Make conditions as favorable as possible for a genuine change of heart: communicate face-to-face (or at least via video call), spend the first 15 minutes of every conversation finding common ground through casual chitchat, maybe pull in a 3rd party mediator, probably get people involved to sign NDAs so he doesn’t feel the need to be in “PR mode”, etc. (There’s likely literature on political persuasion to consult here.)
If a change of heart occurs, Tegmark can release a second statement. He could get feedback on the draft from people involved in the steelman exercise (and people dissatisfied with the existing statement more generally). It could be that he’s able to satisfy most everyone without compromising his epistemic integrity.
In a previous comment, I mentioned that far-right ideas have created enormous suffering over the past few centuries. As an exercise, let’s consider “preventing harms from far-right ideology” as an EA cause. Some facts:
Betting markets say Donald Trump has a 14.2% chance of being US President in 2024, despite public knowledge of him hanging out with far-right thought leaders.
Newsweek says 40% of Americans still believe the 2020 election was stolen, despite heavy censorship of this claim on social media.
There’s been a lot of noise about Donald Trump over the past few years, and a lot of people have “Trump fatigue”. “Trump fatigue” might cause people to neglect that the US is actually in a very bad situation. It seems very reasonable to estimate a >10% chance the US is a far-right dictatorship within 5-10 years. If this happens, it will be a far-right dictatorship with an incredibly powerful military and intelligence apparatus, likely including all the tech company data that privacy advocates have been nagging us about.
Given the numbers above, “preventing people from adopting far-right ideology” (as I discussed in this comment) seems like a weak intervention for the “preventing harms from far-right ideology” cause area. The facts suggest there already are millions of Americans who’ve adopted far-right ideology, and existing measures haven’t exactly caused them to go away. Deconversion seems like a better intervention than prevention at this point.
Point being, if Tegmark really is partway down a far-right pipeline, it may be worth investing effort in deconverting him as a case study. [EDIT: This speculation now seems unwarranted given recent statements from FLI]