Two significant limitations are the high rate of respondent attrition and the likely influence of social desirability bias and/or demand effects, since it was probably clear to students (post-workshop) which responses were desired.
It seems to me that one indication of social desirability bias and/or selective attrition is the nearly half-point shift in the average response to “I currently eat less meat than I used to for ethical reasons.” On the other hand, it’s possible students interpreted it as “I currently plan on eating less meat than I used to for ethical reasons.”
I wonder if a check for this could be added to a future survey. For example, you could ask students whether they intentionally conserve water to help the environment: there should be no reason for that answer to change from pre- to post-survey without a change in social desirability or attrition.
My guess is that the main problem occurs when it is very clear to students what the instructors want them to say. Since the program doesn’t cover water usage, students may leave the water question unchanged while still shifting their answers on the content-related questions (whether or not the program actually affected them). So a stable control question wouldn’t rule out a bias-driven shift on the other questions.
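The control-question idea above can be sketched in a few lines: compare the pre-to-post shift on a content item against the shift on an untaught control item. The numbers below are purely illustrative Likert-style responses I've made up for the sketch, not data from the actual surveys.

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree).
# Illustrative values only -- not data from the SHIC surveys.
pre_content  = [2, 3, 2, 4, 3, 2, 3]   # e.g. "I eat less meat for ethical reasons"
post_content = [3, 4, 3, 4, 4, 3, 3]
pre_control  = [3, 2, 3, 3, 2, 3, 3]   # e.g. "I conserve water to help the environment"
post_control = [3, 2, 3, 3, 3, 3, 2]

def shift(pre, post):
    """Mean post-survey response minus mean pre-survey response."""
    return mean(post) - mean(pre)

content_shift = shift(pre_content, post_content)
control_shift = shift(pre_control, post_control)

# If the workshop content (rather than response bias or attrition) drives the
# change, the content item should shift while the control item stays flat.
print(f"content shift: {content_shift:+.2f}")
print(f"control shift: {control_shift:+.2f}")
```

As the paragraph above notes, a flat control shift is only weak evidence: it rules out a uniform response bias, but not a bias targeted at the questions students know the instructors care about.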
We did test some different social desirability scales in the surveys, which is a common method. Preliminary analysis suggested that social desirability is a factor, but we haven’t yet finished that analysis on the full dataset.
Definitely true on both counts. I suspect that many answers are signalling intentions, but social desirability certainly plays a role, as we mentioned above. This is one of the reasons we are now placing less emphasis on the future collection of quantitative survey data.
What do you see as a better way of gathering data going forward?
In the future, SHIC will place more weight on our impact on the understanding and trajectory changes of the smaller number of students who progress from the primary workshops (of the kind described in this report) to our advanced workshops and individual coaching. Because we’ll be working more closely with these students and discussing concrete actions (e.g. education and career decisions, volunteering with effective charities, and attending EA meetups and conferences), we hope to gain much more reliable insight into whether we’re actually producing valuable changes in their understanding and plans.