I suspect that funders in general, but especially altruists, and double-especially effective altruists, should be investing in prizes more on the margin. Briefly, some arguments in favor:
It is much easier to evaluate interventions after the fact. Indeed, retrospective evaluation is what most EA donors are already doing, but in a way that forces an intervention to closely resemble something that has already happened if it is to be funded.
The incentives created by prize funding tend to be much better.
Standard mechanisms can overcome many of the costs of prizes (esp. the inability of prospective grantees / prize-recipients to fund their projects if they don’t win a prize), and the development of such institutions would be a huge public good.
The evaluation process for prizes tends to substantially reinforce EA values, as compared to the more typical grant-making process.
There are a number of ways in which an economically sound EA prize would differ from usual philanthropic prizes (which are typically engineered with the PR impact in mind, rather than as a sustainable funding source). I hope to write a post about this eventually (both w.r.t. the economic arguments, and particularly promising opportunities at the moment). But I’m always interested in thoughts.
As Carl points out here, if big donors have better opportunities than small donors, the small donors can just purchase lotteries. In fact I think that you can get more than a 50% chance of doubling your money using typical risky investments, though this gets into a more complicated discussion about risk. $200,000 is small enough that you probably have access to the typical EA opportunities, so I suspect you shouldn’t do anything differently than smaller EA donors.
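To make the lottery point concrete, here is a minimal sketch of the expected-value arithmetic behind a donor lottery. The pool size and contribution figures are illustrative assumptions, not numbers from the discussion above; the point is just that the lottery is EV-neutral for each contributor while concentrating the decision-making effort on a single winner.

```python
def donor_lottery_ev(contribution: float, pool_size: float) -> float:
    """Expected amount a contributor directs via a donor lottery.

    With probability contribution / pool_size the contributor wins
    and directs the entire pool; otherwise they direct nothing.
    The expected amount directed therefore equals the contribution,
    so joining the lottery costs nothing in expectation.
    """
    win_prob = contribution / pool_size
    return win_prob * pool_size  # algebraically equal to `contribution`

# Illustrative numbers (assumptions): a $5,000 donor joins a $100,000 pool.
ev = donor_lottery_ev(5_000, 100_000)
print(ev)  # 5000.0
```

The benefit is that only the winner needs to invest the (roughly fixed) cost of researching where to give, which is why larger effective pots can justify evaluation effort that a small individual donation cannot.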
That said, I suspect that smaller EA donors should behave quite differently than they currently do, and thinking about what you might do with more money can help make that clearer.
edited to add: I also think it's plausible that at the $100k level a donor should still be saving / trying to pool their money / being risk-loving / just giving it to existing EA orgs, because of the amount of time required to make a good decision even at the meta level, unless you've already found someone you trust who you can just hand the money to.