Since Alice met Emerson at an EAG, I'd like to hear what CEA's response to this is. I am curious how this sort of thing could be prevented in the future. Perhaps if someone who works for or owns a company meets someone they want to recruit at an EAG, there should be some protections for the young person attending the EAG (for example, the company supplies CEA with data about who they recruited, how much they plan to pay them, etc.). I think young people attending an EAG would assume that the more senior people attending, who may be potential employers, have been vetted and are acting in good faith. If that isn't the case (as it clearly wasn't here), then there is a serious problem. This is really concerning to me as someone who is currently in university, who knows young people who are eager to attend or have attended EAGs and could fall prey to people like this.
I don’t believe CEA actually has that many resources to deeply vet organizations.
If someone were interested in donating enough money to them to do more vetting, I wouldn't be surprised if they did.
I’d expect that the funders would have done more vetting. That said, some of the EA funders now are pretty time-constrained and don’t do very deep vetting.
My guess is that this sort of thing could be prevented with increased vetting and mentorship/oversight. In some worlds, strong managers could find ways for Nonlinear to have done good work, while discouraging/preventing the bad parts of it.
But, this is pretty expensive, and I don’t think there’s a lot of enthusiasm/interest for expanding this area soon. In fairness, note that it is difficult to set up this infrastructure, and the results are often fairly amorphous (it’s hard to detect problems that aren’t happening) and long-term.
Part of the reason I think it was worth Ben/Lightcone prioritizing this investigation is as a retroactive version of "evaluations."
Like, it is pretty expensive to “vet” things.
But if orgs know that practices that lead to people getting hurt (whether intentionally or not) are reasonably likely to eventually come to light, they're more likely to proactively put effort into avoiding that sort of outcome.
That sounds a lot like what I picture as an “evaluation”?
I agree that spending time on evaluations/investigations like this is valuable.
Generally, I agree: the more (competent) evaluations/investigations are done, the less incentive orgs will have to do things that would look bad if revealed.
(I think we mainly agree, it’s just terminology here)
Thanks for the reply! I guess I thought that since CEA already vets people before they can attend EAG, maybe this wouldn't be that hard to do in practice. But I see that most people disagree with me, and I appreciate your reply!
Yeah. I think CEA does much less vetting than something like this would require. Ben put in hundreds of hours in this case. Maybe CEA has 10-60 minutes to decide on each organization that joins? Most of the time, simple questions like "were they funded by EA donors, and can a few EAs vouch for them?" suffice.
I think Nonlinear would have seemed pretty competitive under a quick review (you'd notice they were funded by EAs, the team is EA-centric, and they do projects that are used by EAs).
Yeah, that makes sense. Thanks for explaining.
Are there any conferences (regardless of field) that do this?
I have no idea, because I have never gone to a conference. I would expect that at most professional conferences, the senior attendees who are offering careers (maybe universities or hospitals offering research positions) would have a minimum level of professionalism in the employment opportunities they offer the junior attendees, but I genuinely have no idea how these things work! My concern really stems from meeting a lot of highly capable, excited, intelligent young people at my university group, and wanting to make sure that they are protected! I hope that comes across in my question. I appreciate Catherine's response, though, and I do think this is harder to do in practice than I considered.
Another thought: talking to young attendees about what to look for in an employer might be a good idea, but maybe this is already done, or it has been considered and vetoed; I don't know!
I'm on CEA's Community Health and Special Projects team, and I sometimes contribute to EAG and EAGx event admissions and speaker decisions. I can understand your concern, Lauren Maria. I'd really like for EA events to be places where attendees can have a high level of confidence in the other attendees (especially the attendees in positions of power). CEA does a small amount of vetting of speakers and organisations attending the career fairs. We also have our regular admissions process, where we sometimes choose to reject people from attending the conference if we have reason to think their attendance would be bad for others (the most common reason is receiving complaints of poor behaviour from members of the EA community). This hopefully reduces the risk, but some people who could cause harm will still attend.
My main advice is to encourage community members to not implicitly trust others at EA events. Do your own due diligence, and talk it over with trusted friends, family, or mentors before making large decisions.
Do you have plans to exclude Nonlinear from the events in the near future?
Hey Morpheus. This comment provides a partial answer to your question.