That’s interesting – I guess I’m expecting so much diversity in responses that one fixed-response question would probably raise more questions than it answered (e.g. “which second-order consideration?”). An alternative would be to send out a short survey afterwards to a randomised group of voters from across the spectrum. Depending on the content of people’s comments, maybe we could also categorise them and do some kind of basic analysis (i.e. without sending a survey out).
Makes sense—one use case for me is that I’d be more inclined to defer to community judgment based on certain grounds than on others in allocating my own (much more limited!) funds.
E.g., if perspective X already gets a lot of weight from major funders, or if I think I’m in a fairly good position to weigh X relative to others, then I’d probably defer less. On the other hand, there are some potential cruxes on which various factors point toward more deference.
The specific statement I was reacting to was that people might vote based on their views about what happens after a singularity. For various reasons, I would not be inclined to defer to GH/animal welfare funding splits that were premised on that kind of reasoning. (Not that the reasoning is somehow invalid; it’s just not the kind of data that would materially update how I donate.)