Austin, I’m gathering there might be a significant and cruxy divergence in how you conceptualize Manifold’s position in and influence on the EA community and how others in the community conceptualize this. Some of the core disagreements discussed here are relevant regardless, but it might help clarify the conversation if you describe your perspective on this.
Yeah, I think that’s a great observation. I think of Manifold as a small player in the EA space—it’s a team of ~8 people, and not an explicitly EA org (I think I’m the only person on the team who explicitly identifies as EA, though a few of my coworkers are sympathetic). We’ve received some EA funding, but most of our funding comes from venture capital sources. Meanwhile, Manifund is approximately an EA org, but we’re also small players so far (I’d guess half or more of the attendees of EAG London haven’t even heard of Manifund).
I also think of Manifest as something like, a scaled up house party, rather than an arbiter of who is good or notable in forecasting/EA. It’s an event we created from scratch, and it primarily serves the audience of “fans of Manifold”, which makes me feel a bit defensive about my ability to invite who I please. If I have more influence in the EA community than this, that’s a bit of a surprise—I’d like to use that influence well because I care about EA, but I don’t feel especially beholden to the community at large (a very formless and difficult-to-parse entity in the first place).
I also think of Manifest as something like, a scaled up house party, rather than an arbiter of who is good or notable in forecasting/EA
Forecasting is young, and in putting on Manifest you have quite a lot of influence on how the field grows and matures.
To the extent that you have (and I think that you should have) goals around the kind of field you want to build, who you invite to your conference seems much more consequential to me than who you invite to a house party.
(I am not saying anything either way on the choices you’ve made with invites etc, only on how I’d encourage you to think about them)
Yeah—in practice, I know that conference invites are consequential (and we discussed this as a team, e.g. in our very abbreviated Apr 22 meeting notes). I use the words “scaled up house party” to try to convey a bit of how bizarre it feels to me that something that was just an idea in my head 1.5 years ago has now attracted many of my favorite writers in the world, received multiple major media mentions, and is viewed as consequential. I also think there’s something special about our invite process that shapes how it feels to attend Manifest, and I and many attendees are overall quite happy with the outcome—TracingWoodgrains talks more about this here. While I want to keep improving how we shape who comes to Manifest, I also don’t want to kill the golden goose.
While Manifest is a forecasting festival, I’m not sure I’m really trying to build up the field of forecasting in general, rather than something more specific and tautological like “the Manifest community”. Even more than EA, forecasting is a formless, vague entity, without a clear leader or in/out distinction.
I appreciate Manifold’s quantitative transparency, because the qualitative framing is misleading. The public data may be slightly outdated, but it shows ~$2.4M in EA funding to Manifold for Charity, and ~$2.4M in EA funding plus <$2M in non-EA VC funding to Manifold:
https://www.notion.so/manifoldmarkets/Manifold-Finances-0f9a14a16afe4375b67e21471ce456b0#d13406b6b26a43178d09609135aa38c6
Grants are funding, and Manifold seems mostly EA-funded, even excluding Manifold for Charity.
Note that SFF explicitly does not identify as an EA funder. I think there are of course still social ties here, and I don’t want to police people’s internal categories, but it seems like a relevant thing to bring up. I remember Critch clarifying this at some point in a comment somewhere, but I can’t find it. I am reasonably confident that Critch and Jaan would both say something like “SFF is not an ‘EA funder’, please do not try to hold us to EA standards” if explicitly asked.
I think what to do here is a bit messy, since of course SFF is similar to EA funders in important ways. But the specific way it is being invoked here—where SFF funding is used to claim that recipients are members of a community and should be policed as such—is something SFF has a reasonable right to object to, and as far as I know, does indeed object to.
Interpret the data’s semantics as you like. At least I linked the data my claims were based on, and the original claim still seems untrue regardless of whether SFF is considered EA:
most of our funding comes from venture capital sources
Yep, I wasn’t intending to disagree with all the stuff you said. Overall your complaint seems quite reasonable to me, I just had one local comment (which did seem relevant to the overall conclusion).
To preface, I don’t think this point is load-bearing/cruxy to the question of “is Manifold EA?” or “is Manifold a large player in the EA space?”, which itself is also something of a side point.
I was referring specifically to Manifold Markets as the “we” in “We’ve received some EA funding, but most of our funding comes from venture capital sources.” Right afterwards, I agree that Manifund (aka Manifold for Charity) is an EA org.
Manifold Markets has received ~$2.9M in investment and ~$1.5M in grants, which were the figures I had in mind when I said “most of our funding comes from VC”. One complicating factor is that, of the investment, $1M came from an FTX Future Fund regrant, structured as an equity investment through Alameda Research. Does that count as EA funding or VC? Idk, I think it counts in both categories, but if you characterize it as exclusively EA funding, I agree it would be fair to say “Manifold has received more in EA funding than in venture capital”.
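To make the crux concrete, here’s a minimal sketch of the arithmetic under both readings of the $1M regrant, using the approximate figures above (illustrative only, not an official accounting):

```python
# Approximate Manifold Markets funding figures quoted above.
investment = 2.9e6   # total equity investment (includes the $1M FTX regrant)
grants = 1.5e6       # total grant funding
ftx_regrant = 1.0e6  # FTX Future Fund regrant, structured as equity via Alameda

# Reading 1: count the regrant as VC, since it was structured as an investment.
vc, ea = investment, grants
print(f"regrant as VC: VC ${vc/1e6:.1f}M vs EA ${ea/1e6:.1f}M")  # VC $2.9M vs EA $1.5M

# Reading 2: count the regrant as EA, since it came from an EA funder.
vc, ea = investment - ftx_regrant, grants + ftx_regrant
print(f"regrant as EA: VC ${vc/1e6:.1f}M vs EA ${ea/1e6:.1f}M")  # VC $1.9M vs EA $2.5M
```

Whichever reading you pick flips which category is larger, which is why the “mostly VC-funded” claim is contested.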
To my knowledge, Manifold for Charity grants did not fund only Manifund; they seemingly also funded Manifold’s currency donation platform.
Once we’re debating whether:
* Manifold for Charity should be excluded
* Survival and Flourishing Fund (SFF) funding is EA
* FTX Future Fund funding is EA
the original comment feels reductive, and I’d rather the data be linked upfront than feel dragged into surfacing it through motte-and-bailey-esque threads.
I agree the comment is reductive; many sentences are, due to the fractal nature of information. I generally wrote trying to balance correctness, informativeness, and “actually publishing the damn post, rather than being maximally defensive”.
In any case, I appreciate that you linked to our finances, and that you like how we publish our numbers openly to the world!
This feels important to me, and I hope we get a reply.
Thanks for spelling this out.
To give some color on how this affects my work in particular (speaking strictly for myself, as I haven’t discussed this with others on my team):
One of our organizational priorities is ensuring we are creating a welcoming and hopefully safe community for people to do good better, regardless of their identities. A large part of our work is familiarizing people with, and connecting them to, organizations and resources, including ones that aren’t explicitly EA-branded. We are often one of their first touch points within EA and its niches, including forecasting. Many factors shape whether people decide to continue and deepen their involvement, including how positive they find these early touch points. When we’re routing people toward organizations and individuals, we know that their perception of our recommendations in turn affects their perception of us and of EA as a whole.
Good, smart, ambitious people usually have several options for professional communities to spend their time in. EA and its subcommunities are just one option, and an off-putting experience can mean losing people for good.
With this in mind, I will feel much more reluctant to direct community members to Manifold in particular, and to (EA-adjacent) forecasting spaces more broadly, especially if the community member is from a group underrepresented in EA. I think Manifold brings a lot of value, but I can’t in good conscience recommend that people plug into communities I believe most of those I’m advising would find notably morally off-putting.
This is of course a subjective judgement call; I understand there are strong counterarguments here, and what repels one person attracts another. But I hope this gives a greater sense of the considerations and trade-offs that I (and probably many others) will have to weigh and reach decisions on as a result of Manifest.
Thanks; I also appreciate you sharing your rationale here. I think this makes sense from your perspective, and while I think Manifest and Manifold in fact would be great experiences for people of all kinds, including underrepresented folks, I totally understand if we haven’t proven this to you at this point. Next time I’m in NYC, I’d enjoy speaking with you or other members of EA NYC, if you’d like that!
(I also want to note that my views shouldn’t reflect on “forecasting” any more than they reflect on “EA”; see Ozzie and Peter Wildeford’s comments for what forecasting more broadly is like. I’m in the awkward position of having run a platform for many forecasters, but not participating much as a forecaster myself.)
Just for your information, as a non-binary person who was assigned female at birth (so definitely underrepresented in EA), I would find it very upsetting if I knew you were trying to control which ideas and people I was exposed to.
I find speciesist attitudes morally off-putting, but if you kept events from me because some attendees were speciesist, I’d consider you a bad friend.
People are different. Some people consider what you’re suggesting to be helpful. I do not. I just want you to be aware of the differences in preferences here, and not to assume that everyone from “underrepresented groups” would feel uncomfortable going to events with speakers they deeply disagree with.
I do also want to clarify that I have no desire to “control which ideas and people [anyone] is exposed to.” It’s more: “If I am recommending 3 organizations I think someone should connect with, are there benefits or risks tied to those recommendations?”
Do you not see how that’s controlling what ideas and people they are exposed to?
They can’t make the choice on their own. You’re keeping information from them because you’ve decided what’s good for them.
I think the more robustly good thing to do is to find out what your friends’ preferences are and follow them.
Oh, it sounds like you might be confused about the context I’m talking about this occurring in, and I’m not sure that explaining it more fully is on-topic enough for this post. I’m going to leave this thread here for now to not detract from the main conversation. But I’ll consider making a separate post about this and welcome feedback there.
I really appreciate you sharing your perspective on this. I think these are extremely hard calls, as evidenced by the polarity of the discussion on this post, and to some extent it feels like a lose-lose situation. I don’t think these decisions should be made in a vacuum and want other people’s input, which is one reason I’m flagging how this affects my work and the larger involvement funnels in EA.
I also think of Manifest as something like, a scaled up house party, rather than an arbiter of who is good or notable in forecasting/EA
I’ve made a similar point in other comments, but this framing makes things worse. Then it’s not that Richard Hanania has relevant expertise in spite of his controversial statements; it’s that you think he’s a fun guy you’d like to hang out with. While people may be willing to stomach unsavory co-attendees at a conference in their field, they’re more than happy to skip a scaled up house party with them.