Grayden comments:
I can confirm that copying and pasting doesn’t move the needle, at least in consultations I’ve been involved with; they will put weight on people actually engaging with the ideas. (Similarly, feel free to skip or give very short answers to questions you don’t care much about, and focus on the ones you care most about.)
For context, Kirsten has long worked for UK government departments.
That’s interesting! I was thinking there was a chance it did, because in a write-up about a similar public consultation on live animal transport, Defra used a lot of “X% of people thought Y” framings in their analysis (more details). It depends whether they count duplicated responses when they do this.
Yes, that’s true; if they use statistics like this, similar or duplicated responses might count.
That’s interesting!
As a follow-up: in consultations you’ve been involved with, did they put weight on the thoughts of random members of the public, assuming the thoughts were sensible, of course?
There weren’t many, so I don’t know, unfortunately. In this consultation you’d have a better chance, because it’s about a public-facing issue.
Yeah, that’s what I hoped. I couldn’t honestly say that I would care about these labels (because I don’t eat animal products anyway), but I said things like ‘consumers would like to know this’, which I think is true.
Many of the questions ask you to pick an option from ‘Strongly disagree’ through ‘Strongly agree’, and most questions are optional. For those Likert/select-an-option questions, I’d guess the survey analysers would do more aggregation across survey-takers, so quantity would matter there.
That would make sense! I think the civil servant in charge might also have a certain level of discretion over how they represent the results; I did in my case.
I’m sorry, I didn’t mean to imply that more responses mean nothing, just that raising a sensible consideration is more likely to affect outcomes than copying and pasting a response (which may carry anywhere from no weight to a little weight with the policymakers).