So my first reaction to the Youth Ministry Adherence data was basically the opposite of yours, in that I looked at it and thought "seems like they are doing a (slightly) better job of retention". Reviewing where we disagree, I think there's a tricky thing here about distinguishing between "dropout" rates and "decreased engagement" rates. Ben Todd's estimates, which you quote, are explicitly trying to estimate the former, but when you compare to:
those listed as "engaged disciples" who continue to self-report as "high involvement"
...I think you might end up estimating the latter. "High involvement" was the highest of four possible levels, and "Engaged disciple" was also the highest of four possible levels. By default I'd look at the number who are neither moderately nor highly involved, i.e. 8% rather than 42%.
More generally, my understanding is that Ben was counting someone as not having dropped out if they were still doing something as engaged as fulfilling a GWWC pledge, based on the quote below. So if you start at a much higher level than that (like...attending the Weekend Away), there's a lot of room to decrease engagement substantially while still being above the bar and not having "dropped out". Which in turn means I'd generally aim to allow similar leeway for "regression to the mean" in the comparisons. Or you can compare everything to Ben's GWWC dropout number of 40%, which has no such leeway, as you do with the similarly-no-leeway case of vegetarianism.
I appreciate this is a highly subjective call though; this is very much just my two cents. I could easily imagine changing my mind if I looked at the Youth Ministry information more closely and decided that "Engaged Disciple" actually constituted some kind of "super-high-involvement" category.
The list contains 69 names. Several of the team went over the names checking who is still involved. We started with our own knowledge, and then checked our own engagement data for ambiguous cases.
We counted someone as "still involved" if they were doing something as engaged as fulfilling a GWWC pledge.
On this basis, we counted about 10 people we think have "dropped out", which is 14% of the total in 6 years.
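To lay the numbers side by side, here's a rough back-of-the-envelope sketch in Python using only the figures quoted in this thread. I'm assuming the 42% figure means "no longer self-reporting high involvement", which may not be exactly how the study frames it, so treat this as illustrative arithmetic rather than anyone's official analysis.

```python
# Rough comparison of the two framings discussed above, using only numbers
# quoted in this thread. Illustrative arithmetic, not an official analysis.

ea_dropped, ea_total = 10, 69            # Ben's count: ~10 of 69 "dropped out" in 6 years
ea_dropout_rate = ea_dropped / ea_total  # ~14%

gwwc_dropout_rate = 0.40                 # Ben's GWWC dropout figure (no leeway below the bar)

ym_not_high = 0.42                       # assumed: no longer report "high involvement"
ym_below_moderate = 0.08                 # neither moderately nor highly involved

print(f"EA 'dropout' (GWWC-pledge-level bar): {ea_dropout_rate:.0%}")
print(f"GWWC pledge dropout:                  {gwwc_dropout_rate:.0%}")
print(f"Youth ministry, fell below 'high':    {ym_not_high:.0%}  (decreased engagement)")
print(f"Youth ministry, below 'moderate':     {ym_below_moderate:.0%}  (closer to 'dropout')")
```

On that framing, the like-for-like comparison is the 8% against the ~14% (or the 40%), which is why I read the data as slightly favourable to the youth ministry.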
Interesting, thanks! Something which probably isn't obvious without reading the methods (pages 125-127) is that study participants were recruited through church mailing lists and Facebook groups. So the interpretation of that statistic is "of the people who answer surveys from their church, 92% report at least moderate engagement".
"Moderate engagement" is defined as an average across several survey questions, but roughly it means someone who attends church at least once per month.
I think that definition of "moderate engagement" is a bit higher than "willing to answer surveys from my church" (as evidenced by the people who answered the survey but did not report moderate engagement), but it's not a ton higher, so I'm hesitant to read too much into the percentage who report moderate engagement.
I felt like "high engagement" was enough above "willing to answer a survey" that some value could be gotten from the statistic, but even there I'm hesitant to conclude too much, and wouldn't blame someone who discounted the entire result because of the research method (or interpreted the result in a pretty different way from me).
If we want to compare it to Ben's EA estimates: I guess one analog would be to look at people who attended that weekend away but also answered the EA survey five years later. I'm not sure if such a data set exists.