Eddie, thank you (I’m a long time fan!)
Short-term impacts: Mmm, this has made me realise I wasn’t explicit about the assumptions I made there—I should either make that effect size bound a bit wider or model it as an exponential (or possibly a beta). I think this CEA is best interpreted as ‘if you built an evidence-based product, what would its cost-effectiveness be?’ but even that should probably have a wider bound. And there’s the new update in Linardon et al. (2024) that will be worth incorporating.
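To make that concrete, here's roughly the shape of what I mean (the Beta/exponential parameters below are placeholders for illustration, not the values I'd actually put in the CEA):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Placeholder parameters only: effect size in SDs of symptom improvement.
# A rescaled Beta keeps the bound explicit while skewing mass toward small effects.
beta_draws = rng.beta(a=2, b=5, size=n)
effect_beta = 0.05 + beta_draws * (0.60 - 0.05)   # support [0.05, 0.60] SDs

# An exponential alternative: unbounded above, concentrated near zero.
effect_exp = rng.exponential(scale=0.15, size=n)

for name, draws in [("beta (rescaled)", effect_beta), ("exponential", effect_exp)]:
    p5, p50, p95 = np.percentile(draws, [5, 50, 95])
    print(f"{name:16s} 5th={p5:.2f}  median={p50:.2f}  95th={p95:.2f}")
```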
Adherence: Thank you! That roughly tracks with the decay curves from SimilarWeb, which is good validation. Although you raise a good point—decay probably depends a lot on whether you’re feature-gating after a trial period or not. Do you have a ballpark for the ratio of installs to DAU?
CPI: Those are lower CPIs than the estimates I had—good to know! Are those on Facebook, TikTok, or elsewhere? I was also assuming organic traffic is negligible after the first hundred thousand or so, but do you still see an effect there?
Dev costs: Lovely! Having worked in industry, I definitely have the sense that there are good incentive reasons why headcounts might be unnecessarily bloated 🙃
Opportunities: I won’t ask what your roadmap looks like, but it’s very promising that you have this hunch. In my own experience as a user, I can definitely concur.
I’ll mull for a bit and update the OP with some adjustments. I might also shoot you a DM with some curiosity questions later. Thank you again! 😍
Ratio of Installs to DAU: Hmm, that's an interesting metric... The way I think about retention is like a layered cake, kind of like the baumkuchen I just ate for breakfast, but linear instead of round. Anyway, there's time on the x-axis and users on the y-axis. For any given day, there's a sizeable layer of cream at the top, which is the Day 0 users. And then right below that, a smaller layer of Day 1 users, and so on. Ultimately there are hundreds of layers of users from older daily cohorts. You can track each daily cohort through time and it'll start big and then shrink rapidly, following the retention curves, until eventually it flatlines at some point (ideally above 0).
So you could look at overall installs to DAU, but that gives an advantage to new apps because they don’t have a lot of old installs from years-old cohorts that have left. Or you could compare daily installs to DAU, but that gives an advantage to old apps because they’ll have a lot of users from old cohorts.
A better metric could be the DAU/MAU ratio, which measures, out of all of your active users, how many use the app every day. Here ~25% would be exceptional, with an average of probably around 10%. But that's also biased by how many new users you're bringing in each day. (See the toy calculation below for how these metrics fall out of the cohort stack.)
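Here's the layered-cake picture as a toy calculation (the retention curve, install rate, and app age are all made up, just to show how installs-to-DAU and DAU/MAU behave):

```python
import numpy as np

# Illustrative only: a made-up retention curve r(d) = probability that an
# installer is still active d days after installing, decaying toward a floor.
def retention(day, floor=0.03, scale=0.30, half_life=7.0):
    return floor + scale * 0.5 ** (day / half_life)

daily_installs = 1_000            # assume a constant install rate for simplicity
app_age_days = 365 * 3            # a three-year-old app

ages = np.arange(app_age_days)                 # age of each daily cohort today
dau = daily_installs * retention(ages).sum()   # stack the cohort "layers"

# Rough MAU: a user counts if active on at least one of the last 30 days.
# Crude approximation treating each user's daily activity as independent draws.
past = ages[:, None] - np.arange(30)           # cohort age on each of the last 30 days
daily_p = retention(past) * (past >= 0)        # no activity before the cohort installed
p_active_30d = 1 - np.prod(1 - daily_p, axis=1)
mau = daily_installs * p_active_30d.sum()

total_installs = daily_installs * app_age_days
print(f"installs-to-DAU: {total_installs / dau:,.0f} : 1")   # grows as the app ages
print(f"DAU/MAU:         {dau / mau:.0%}")
```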
By the way, the only peer group benchmarks that Apple provides are Conversion Rate, Proceeds per Paying User, Crash Rate, and Day 1/7/28 Retention (https://developer.apple.com/app-store/peer-group-benchmarks/). But they might be announcing more in March thanks to the EU's DMA (https://developer.apple.com/support/dma-and-apps-in-the-eu/#app-analytics).
CPI: Yes, those numbers are from Facebook / Instagram / TikTok ads.
In terms of organic traffic, it's also a function of time. Say, for example, you're bringing in 1,000 organic users a day. After a year, that's 365k users. After 5 years, that's 1.8M users. Of course, the app still has to remain good to continue getting organic downloads, and since the bar for "good" is always rising, the app would need to be consistently updated.
I'd estimate around 2/3 of our lifetime installs are organic, but it really depends on the app. I speculate that Daylio might be closer to 100% organic, while Breeze is probably closer to 0%.
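To put rough numbers on that, combining the illustrative 1,000/day organic rate above with a ~2/3 lifetime organic share (toy arithmetic, not real figures):

```python
organic_per_day = 1_000      # illustrative rate from the example above
organic_share = 2 / 3        # rough lifetime organic share

for years in (1, 5):
    organic = organic_per_day * 365 * years
    total = organic / organic_share          # implied total installs if ~2/3 are organic
    print(f"{years}y: {organic:>9,} organic installs, ~{total:>9,.0f} total installs")
```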
Hmm—good points. Getting Installs/DAU wrong could meaningfully affect the numbers; I guess longer-term retention per install is probably a better way of accounting for it. It was unclear to me whether to model retention as having a zero or nonzero limiting value, which would change some of the calculations.
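For concreteness, here's the distinction I have in mind (parameters are illustrative, not fitted to anything):

```python
import numpy as np

days = np.arange(365)

# (a) retention decays to zero: every cohort eventually disappears entirely
r_zero_limit = 0.35 * 0.5 ** (days / 7)

# (b) retention decays to a small nonzero floor: a core of users stays indefinitely
r_with_floor = 0.03 + 0.32 * 0.5 ** (days / 7)

# Expected active-days per install over the first year; the floor dominates the
# long-run total (and keeps growing with the horizon, unlike case (a)).
print("active-days/install, zero limit:", r_zero_limit.sum().round(1))
print("active-days/install, 3% limit:  ", r_with_floor.sum().round(1))
```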
Improving organic install rate would be promising if you could get it above 50%, I think (your apps sound very effective!). I suspect a lot of that is, as you say, about consistently building a good user experience and continuing to add value. (I see a lot of Daylio users complaining about the lack of updates & the increased ad load.)