This post is far too assertive given the weak evidence it presents.
The blog points out that it may be hard to measure the success of consumer-facing campaigns. Unfortunately, it then makes a large, unjustified logical leap in assuming this means those campaigns are unsuccessful. In the sense of the [Hits-based giving] post, it "arrogantly" places one solution over another shortly after the very experts it consulted say "there's no real answer" and "we need more research". In fact, no hard evidence is given to support the post's headline at all.
Some of the links seem tenuous too; here's an example. The point is made that "60 percent of Americans who say they're vegetarian on surveys also say that they've eaten meat in the past 24 hours". This links to a BusinessInsider post, which links to a PsychologyToday post, which quotes a CNN poll that I can't find anywhere (feel free to link below if you find it). The PsychologyToday post also cites a 20-year-old study in which a small number of respondents claimed to eat <10g of meat per day in an initial poll, but not in a follow-up 3-10 days later. While the study does not claim that this is due to dishonesty or bias, the linking posts claim both lies and social desirability bias.
This example shows the author writing their own narrative onto a tertiarily linked study, and it greatly lowers the confidence I can have in the links I didn't check (Brandolini's law).
I'd really like to see some justification for linking this low-quality post among the otherwise well-written blogs on this forum.
Also, I don't see where it justifies that targeting corporations works. Is it more effective to convert high-profile individuals within animal-slaughter-related corporations, to pass regulations on these corporations, to become an expert working to create welfare standards from within, or something else? The post doesn't tell me much about where to orient my animal welfare charity.