I also didn’t vote, but I would be very surprised if that particular paper (a policy proposal for a biosecurity institute in the context of a pandemic) was an example of the sort of thing Oxford would be concerned about affiliating with (I can imagine some academics being more sceptical of some of the FHI’s other research topics). Social science academics routinely write papers making public policy recommendations, many of them far more controversial.
The postmortem doc says “several times we made serious missteps in our communications with other parts of the university because we misunderstood how the message would be received”, which suggests it might have been internal messaging that lost them friends and alienated people. It would be interesting to know whether there are any specific lessons to be learned, but it may well boil down to academics being rude to each other, and the FHI seems keen to emphasize that it was more about academic politics than anything else.
Transparency has costs, but so, potentially, does opacity (in terms of both loss of trust and less careful consideration of decisions that don’t need to be justified externally). Arguably both apply here: the decision was obviously not uncontroversial, and the decision-making process sounds quite limited for such a significant commitment in a novel area. It’s also possible that some of the information that wasn’t shared, or subsequent metrics (I don’t think anyone is asking for original research here), would actually cast the original decision in a more favourable light.
I also think it’s notable that EA organizations and figures have made a lot of noise in the past criticising non-EA philanthropy for lack of transparency and rigour, and a norm of exempting EA organizations from the same scrutiny is probably worse than oversharing. In this particular case, the “how effective a use of funds might it have been?” question isn’t arbitrary; it’s core to the EA mission, and the answer might actually be useful to the people who think there is future value in the EA events space as well as to the doubters.