Meditation app adherence, by the numbers.
This is a synthesis of the published research on who actually keeps using meditation apps — and who drops off. We did not run the studies; we read them, compared the numbers, and wrote down what seems to hold up across the literature.
Updated April 2026 · 9 min read
The meditation-app category has a visible retention problem. App store ratings are high; long-term use is low. That gap is well-documented in the research, but the specific numbers are scattered across dozens of small trials and a few larger meta-analyses. This page compiles what we think is the most reliable public picture as of 2026 — including the parts that are inconvenient for us as an app-maker.
The short version
- Across peer-reviewed trials of typical eight-week meditation-app interventions, participants complete well under the full assigned program — roughly half of assigned content is a common finding, with wide variance by study design and population.
- Dropout is heaviest in the first two weeks and flattens after that. People who reach a regular rhythm in weeks 3–4 tend to keep using the app.
- Studies that paid participants or embedded apps in clinical contexts report higher adherence than naturalistic in-the-wild use.
- Meaningful mental-health effects in the published meta-analyses are small-to-moderate and tend to correlate with amount of practice completed, not with assignment to an app.
A note on our claims
Loam's content is grounded in peer-reviewed research where it exists, and clearly flagged when it doesn't. We're a wellness app, not a clinic — for diagnosed conditions, please work with a qualified clinician. Read our editorial methodology for how we decide what to publish.
Where these numbers come from
The two most-cited pieces of evidence on digital mindfulness interventions are Goldberg et al.'s 2022 meta-analysis of mindfulness-based app interventions, and the broader meta-analytic work on mHealth mental-health interventions by Firth et al. (2017) and successors. These papers collate effect sizes and, where reported, adherence. Individual randomized trials — e.g. Flett et al. on university students and Huberty et al. on Calm — are where the more granular usage data lives.
The picture that emerges from these sources is less flattering than the app-marketing picture, but more useful. The effects are real; they're also conditional on actually using the app.
What the adherence numbers look like
Across published RCTs of meditation apps, the proportion of assigned sessions that participants actually complete varies substantially. A reasonable summary of the pattern we observed reading the literature:
- Clinical samples, supervised delivery (e.g. cancer survivors with staff check-ins): adherence often reaches 70–80% of assigned content.
- Unsupervised adult users in an 8-week RCT: completion of around half of assigned sessions is typical.
- University-student samples (often the most-studied population): highly variable, with some studies reporting median completion in the low single digits of sessions.
- Real-world app usage outside a trial: published analyses of anonymized app telemetry — when companies share it — suggest steeper fall-off than any trial population, with a large share of downloads never translating into a second session.
These are ranges rather than exact percentages because each study's operational definition differs. “Completion” means different things in different trials — starting a session vs. finishing a session vs. hitting a 90% playback threshold. We are skeptical of any single headline retention number that doesn't specify which of those is being measured.
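To make the definitional point concrete, here is a small illustration with made-up numbers (not drawn from any cited trial): the same usage log produces very different "adherence" figures depending on which operational definition you apply.

```python
# Hypothetical usage log for one participant over five assigned sessions:
# (started_session, fraction_of_audio_played). Values are illustrative only.
sessions = [
    (True, 1.00),   # finished
    (True, 0.95),   # nearly finished
    (True, 0.40),   # abandoned partway
    (False, 0.00),  # never opened
    (True, 0.10),   # tapped in, quit almost immediately
]

assigned = len(sessions)
started = sum(1 for s, _ in sessions if s)
finished = sum(1 for _, f in sessions if f >= 1.00)
threshold_90 = sum(1 for _, f in sessions if f >= 0.90)

# Three "adherence" numbers from one log, depending on the definition:
print(f"started a session:  {started / assigned:.0%}")       # 80%
print(f"finished fully:     {finished / assigned:.0%}")      # 20%
print(f"hit 90% playback:   {threshold_90 / assigned:.0%}")  # 40%
```

One participant, one week, and the headline number ranges from 20% to 80% — which is why a retention figure that doesn't name its definition tells you very little.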
The first-two-weeks pattern
The most consistent adherence signal across trials is the shape of the curve. Dropout is concentrated in the first 14 days. People who reach a regular rhythm in weeks 3–4 are substantially more likely to still be using the app at week 8.
Mechanistically this is what you'd expect from habit-formation research more broadly. Early practice competes with existing routines; once the practice is connected to an existing anchor — getting into bed, commute start, post-coffee pause — it stops competing and starts riding on existing cues. We designed The Moment and the guided programs around that model rather than around a 30-day streak ladder.
Effects scale with practice, not assignment
The published effect sizes on anxiety and stress from meditation-app RCTs cluster in the small-to-moderate range — typically somewhere around Cohen's d = 0.3 for stress, smaller for anxiety. But within-study analyses consistently find that the effect is concentrated among participants who actually practiced. Low-adherence participants often show effect sizes indistinguishable from control.
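For readers unfamiliar with the metric: Cohen's d is the difference between two group means divided by their pooled standard deviation. A sketch with fabricated, purely illustrative scores shows what a d in the ~0.3 range looks like in practice — the two groups overlap heavily, and the average shift is small relative to person-to-person variation.

```python
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference between two groups, using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Illustrative stress scores (lower = less stressed); invented for this sketch,
# not from any cited trial. The app group averages one point lower.
control = [20, 25, 15, 22, 18, 24, 16, 20]
app_use = [19, 24, 14, 21, 17, 23, 15, 19]

print(round(cohens_d(control, app_use), 2))  # → 0.28
```

A one-point average improvement against a spread of several points yields d ≈ 0.28: real, but easy to miss at the individual level — consistent with why per-person outcomes track practice completed rather than app assignment.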
That's the honest interpretation of the research for users choosing an app: the category works, but only if you use it. Most of the variation in outcome across people comes from variation in practice completion, not from which app they downloaded. This is why our comparisons pages (/compare) push so hard on “pick the one you'll actually open on a Wednesday night.”
Why marketed retention numbers are hard to trust
App companies occasionally publish retention or “active user” numbers in press releases. These numbers are almost always drawn from the tail of highly engaged users and don't correspond to any standard published metric. Where a company reports peer-reviewed results, those are closer to the truth; marketing collateral is not evidence.
We deliberately don't publish headline retention numbers for Loam. Our honest position: the research on our category says long-term use is hard, and there's no reason to believe any specific app has solved it. What we can commit to is that every technique in Loam ties back to a primary research source and is chosen because it has a fair evidence base — not because it has the best-looking chart.
Practical takeaways
- If you're starting a meditation app, give yourself two weeks before judging whether it's working. The dropout cliff is real; plan for it.
- Anchor practice to something you already do — waking up, lying down in bed, the end of lunch. Don't rely on motivation.
- Pick an app whose default session length fits your actual schedule, not your ideal schedule. Short sessions completed beat long sessions skipped.
- If you stop for a week, you haven't failed. Start again. The published effects accrue from cumulative practice, not from unbroken streaks.
Methodology & corrections
This page is a reading of the public literature as of April 2026. We did not run statistical analyses on raw trial data; we read the peer-reviewed papers and the most-cited meta-analyses and wrote down the patterns that held up across them. We'll revise this page when meaningful new meta-analyses or app-company peer-reviewed studies land.
If you spot an error or a study we've missed — particularly newer work we should incorporate — our methodology page describes how to flag it.
Related reading
Research library · How long should you meditate · Best meditation app for anxiety · Full comparison hub.