Why I will not be participating in the draft amnesty.
When you make a post on the Effective Altruism (EA) Forum, you can edit it or move it back into draft, but you cannot delete it.
When you make a comment, after a period of time, you cannot delete your comment at all. You can however edit it, or retract it.
When you send a message, you can either edit or delete it.
There is room to strengthen the pseudonymity offered to potential contributors, which would encourage greater participation. In this community in particular, there will be a number of people whose advocacy for human rights, or criticism of regimes, would make them a target in places where they work, live, or visit.
For example, if I am an expert on malaria who wants to travel to certain parts of Africa for work, but I have promoted LGBT inclusion in the EA community and my identity were leaked, I could be targeted by criminal groups.
If I have written critically about authoritarianism and visit certain Gulf countries to raise funds for my startup, I could be arrested at the border.
We live in a world of increasing data harvesting, stylometric correlation that unmasks users, and powerful artificial intelligence systems used by regimes around the world. The Overton window of what is acceptable is shifting rapidly, and it is unclear in which direction it will go.
It is irresponsible for the EA Forum not to provide the easiest and best opportunities for users to preserve their anonymity while participating in the community. This is not available through Big Tech social media sites: the option to unsend after an indefinite timeframe, or to delete data, is rarely offered, and even when taken up it does not stop those companies from forming back-end snapshots of people who do not even have a profile. The EA Forum does not share that limitation, because it is not aiming to maximise profit by selling data on to advertisers or government agencies.
There is a tail risk that data about AI safety advocates may be particularly valuable.
Draft amnesty could achieve its objectives better if it were bundled with ephemerality.
Hi 98373, I run the EA Forum team so I thought it would be helpful to respond.
First off, I’m always happy to hear feedback about the Forum, especially critical feedback, so thank you!
I’ll also say that I think not participating in any specific Forum event or even not using the Forum at all is very reasonable, so I’m not here to try to convince you otherwise. Things are not always a good fit for all people.
But I would like to correct the places where I think your post is misleading:
“you cannot delete your post”
You can archive any draft post, which effectively deletes it from the site. That doesn’t delete the text from our database, but you can always just contact our team and ask us to do so.
“When you make a comment, after a period of time, you cannot delete your comment at all.”
I think what you’re referring to is that users can’t delete their comment when it has a reply. There is no time component.
This was a decision made before my time, but my guess is that this is just a complicated case due to needing to weigh the interests of many users (including the respondents) — as above, you’re welcome to contact our team if you want to delete such a comment and we can discuss on a case-by-case basis.
It’s also very possible that you’ve run into a bug. We have a small team and there are plenty of issues to fix. Our codebase is open source so you’re welcome to take a look at how this is implemented.
TL;DR if you’d like to do something with your Forum data that you can’t do via the site, you can just contact us and ask us to help.
I think we do pretty well at this given our very limited resources. To make a Forum account, you only need to give us an email address and come up with a username, neither of which needs to be closely associated with your real identity. So I’m not quite clear on what threats you are thinking of that can’t be addressed by current systems — in all your examples there are pretty straightforward ways to dissociate your Forum account from your real identity. (As above, if you want help changing your account username or email address you can just contact us.)
We have orders of magnitude fewer resources than most comparable apps (for example, last year we spent less than 2 FTE on the Forum). We are a nonprofit, and have a responsibility to use our charitable dollars thoughtfully. As a very small team, we certainly won’t have state-of-the-art secure software. I hope anyone who is concerned that their software usage could put them in physical danger will default to not providing identifiable personal information to websites, especially in a case like the Forum, where it’s clear that your username will be publicly visible and discoverable by search engines. For example, don’t use an identifiable username or email address when creating a Forum account.
For anyone who has uncertainties, I think it’s very reasonable to not even create a Forum account. There are also many ways to engage with the EA community outside of the EA Forum, for example via in-person groups, conferences, or other programs.
Thanks for this post, I had no idea posts or comments couldn’t be deleted after a certain time. This seems like a pretty major privacy flaw, and as someone who cares strongly about digital privacy it certainly makes me more hesitant to post or comment in the future.
As far as I know though, it’s still possible to change the username and email attached to your activity? This could be a way of anonymizing yourself, although of course this isn’t perfect and doesn’t let you take action on specific posts or comments.
It also seems a little ironic that a community so concerned with future AI capabilities could seemingly ignore one of its dangers and make it impossible to remove personally identifiable data, given that in a world with powerful AI systems personal data could be harvested far more efficiently than it is now.
I love the EA Forum but I’m personally quite glad I chose not to use my real name on here...