Was this funded? These sorts of findings are clearly more worthy of funding than 90% of the other stuff I’ve seen, and most of the remaining 10% is ambiguous.
Forget the “Alignment Textbook from 100 years in the future”; if we’d had this 6 years ago, things would have gone very differently.
I don’t think these conversations had as much impact as you suggest, and I think most of the stuff funded by EA funders has decent EV, i.e. I have more trust in the funding process than you seem to.
I think one nice side effect of this is that I’m now widely known as “the AI safety guy” in parts of the European AIS community, and some people have messaged me out of the blue or struck up a conversation about it because they were curious.
I have worked on other grants in the past, but this particular work was not funded.
Agreed, there’s a bunch of value in your presence alone bringing AI safety into the space/conversation and making it more salient to people.