Arkose is an early-stage, field-building nonprofit with the mission of improving the safety of advanced AI systems to reduce potential large-scale risks. To this end, Arkose focuses on supporting researchers, engineers, and other professionals interested in contributing to AI safety.
Arkose
Good question! We’re choosing to be cautious around data privacy, which unfortunately makes it hard to share specific wins publicly. However, we can share this graph from the survey people fill out six months after their call:
Arkose is Closing
Thanks for the suggestion! I’ve changed the title.
Thanks for the suggestion! I’ve now crossposted it to LessWrong.
Thanks for the suggestion, Neel! I’ve now posted a project on Manifund.
Unfortunately, we got very little feedback from funders, and often none at all. It could be due to the risks of reaching out to this more senior audience (i.e. the risk of making them less interested in AI safety if the outreach is done poorly), but this is just a guess. I expect there are a number of factors at play.