I will only write a comment and not an answer, because I think other people will probably give better answers. The thinking probably includes: 1) the world was unprepared, so even amid a massive response effort, cheap opportunities to do good might arise; 2) this situation might somewhat shift the equilibria between cause areas and within EA, and also change how the world responds to risk, which may in turn influence what is neglected and what is not. Here is a good post by Peter Hurford on this.
About the lockdown: I find it difficult to evaluate the short-term effects, but thinking about the very long-term effects is probably also interesting. On the one hand, under the longtermist view, slowing down technological progress has enormous negative consequences for the far future, assuming the slope of progress remains positive. On the other hand, a lockdown means the world will take pandemic preparedness more seriously, which diminishes the probability of existential risk and should therefore have a large positive impact… so maybe the answer is "enough lockdown for this situation to improve our chances of facing greater threats"? I recognize this is not exactly what you asked, though.