You’ve shipped your app. You’ve got users. Now someone’s telling you to “scale” or “do UA” or “raise a round.” But you’re staring at your analytics dashboard wondering: is this thing actually working well enough to deserve paid growth?
Pouring money into user acquisition before you have real product–market fit is like filling a leaky bucket. You’ll burn cash, blame the channels, and never know if the problem was your targeting or your product.
This post walks through the 5 signals that matter, the thresholds that separate “keep iterating” from “time to scale,” and a simple checklist to decide your next move.
The Pain: When “Growth” Is Premature
Most apps die from premature scaling. You spend $5K on ads, get 1,000 installs, watch 90% disappear in three days, and conclude “paid UA doesn’t work.” But the real issue? Your onboarding was confusing, your core loop wasn’t sticky, or your value prop didn’t land.
The symptoms look like this:
- High install costs but brutal churn
- Organic users stick around; paid users vanish
- You can’t explain in one sentence why someone would use your app daily
- Retention curves drop off a cliff instead of flattening into a plateau
The fix isn’t more budget. It’s knowing whether you’ve earned the right to scale.
The 5 Signals That Actually Matter
Forget vanity metrics. These five numbers tell you if your product has real pull:
1. D1 and D7 Retention
What it is: Percentage of new users who return 1 day and 7 days after install.
Why it matters: If people don’t come back, nothing else matters. Retention is the foundation.
How to measure: Group new users into cohorts by install date. Track how many open the app the next day (D1) and seven days later (D7).
Benchmark ranges:
| Category | D1 Target | D7 Target |
|---|---|---|
| Utility / productivity | 30–50% | 15–25% |
| Social / entertainment | 40–60% | 20–35% |
| Games (casual) | 35–50% | 12–20% |
| Health / habit-tracking | 25–40% | 15–25% |
Green flag: You’re in the top half of your category range.
Yellow flag: Bottom half but trending up after recent changes.
Red flag: Below 25% D1 or below 10% D7 for more than two cohorts in a row.
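If you are computing these by hand from raw event logs rather than a dashboard, the cohort math is simple. A minimal sketch in Python, assuming you have install dates and (user, date) app-open events (the names and sample data here are invented for illustration):

```python
from datetime import date, timedelta

def retention(installs, opens, day):
    """Share of a cohort that opened the app exactly `day` days after install.

    installs: dict of user_id -> install date
    opens: set of (user_id, date) app-open events
    """
    if not installs:
        return 0.0
    returned = sum(
        1 for user, installed in installs.items()
        if (user, installed + timedelta(days=day)) in opens
    )
    return returned / len(installs)

# Hypothetical cohort: four installs on Oct 1
installs = {u: date(2025, 10, 1) for u in ("a", "b", "c", "d")}
opens = {
    ("a", date(2025, 10, 2)),  # a returns on D1
    ("b", date(2025, 10, 2)),  # b returns on D1
    ("a", date(2025, 10, 8)),  # only a returns on D7
}

print(retention(installs, opens, 1))  # 0.5  -> 50% D1
print(retention(installs, opens, 7))  # 0.25 -> 25% D7
```

This uses the strict "returned on exactly day N" definition; some analytics tools use rolling windows instead, so check which one your dashboard reports before comparing against benchmarks.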

2. Paying Conversion Rate
What it is: Percentage of users who generate any revenue, whether a subscription trial start, an in-app purchase, or (if you're ad-supported) meaningful ad engagement.
Why it matters: Retention without monetization is a hobby. You need proof that people value this enough to pay.
How to measure: (Users who paid) / (Total users) in a 30-day cohort.
Benchmark ranges:
- Subscription apps: 2–8% trial start rate in first 7 days
- IAP apps: 1–5% paying users in first 30 days
- Ad-supported: Not directly applicable; watch session depth and ad-load balance instead
Green flag: Above 3% for subs, above 2% for IAP.
Yellow flag: 1–2%; paywall is visible but not compelling.
Red flag: Under 1%; users don’t see enough value to consider paying.
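The conversion formula above is a one-liner once you have the cohort and payer lists. A sketch with hypothetical user IDs:

```python
def paying_conversion(cohort_users, payers):
    """Share of a cohort that made any payment (trial start, IAP, etc.)."""
    cohort = set(cohort_users)
    if not cohort:
        return 0.0
    # Intersect so payers outside the cohort don't inflate the rate
    return len(cohort & set(payers)) / len(cohort)

users = [f"u{i}" for i in range(200)]                      # 30-day cohort
payers = ["u3", "u17", "u42", "u99", "u150", "outsider"]   # "outsider" is ignored
rate = paying_conversion(users, payers)
print(f"{rate:.1%}")  # 2.5%: green-flag territory for subscriptions
```

Restricting payers to the same cohort matters: counting a payment from an older cohort against this month's installs is a common way to overstate conversion.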
3. Session Depth
What it is: Average number of screens or actions per session.
Why it matters: Deep sessions mean users are engaging with your core loop, not bouncing after the splash screen.
How to measure: Events per session or time in-app (exclude background time).
Rough targets:
- Utility: 3–6 actions (e.g., scan receipt, save item, export file)
- Social: 8–15 actions (scroll, like, comment, share)
- Content/entertainment: 5+ minutes per session
Green flag: Sessions are meaty; users complete your intended flow.
Red flag: Most sessions are under 30 seconds or 2 actions.
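A rough way to compute depth from per-session event lists, assuming background events carry a `bg_` prefix (an illustrative convention, not a standard; adapt to however your SDK tags them):

```python
def session_depth(sessions):
    """Average foreground actions per session.

    sessions: list of per-session event-name lists. Events prefixed
    with 'bg_' (background syncs, etc.) are excluded from the count.
    """
    if not sessions:
        return 0.0
    counts = [sum(1 for e in s if not e.startswith("bg_")) for s in sessions]
    return sum(counts) / len(sessions)

sessions = [
    ["open", "scan_receipt", "save_item", "export", "bg_sync"],  # 4 actions
    ["open", "bg_sync"],                                         # 1 action
    ["open", "scan_receipt", "save_item"],                       # 3 actions
]
print(round(session_depth(sessions), 2))  # 2.67: just below the 3-6 utility target
```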
4. Organic Share of New Users
What it is: Percentage of new users who found you without paid ads.
Why it matters: Organic growth (search, word-of-mouth, social sharing) is proof that your app spreads naturally. If 100% of installs are paid, you have zero escape velocity.
How to measure: Check your attribution dashboard (Adjust, AppsFlyer, etc.) or store analytics. Organic = non-attributed installs.
Green flag: 30%+ organic, especially if growing week-over-week.
Yellow flag: 10–30% organic; you’re not invisible, but you’re not viral either.
Red flag: Under 10%; you’re entirely dependent on paid spend.
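Computing organic share from an attribution export might look like this; the network names are hypothetical placeholders for whatever your dashboard reports:

```python
def organic_share(installs_by_source, paid_sources):
    """Share of installs with no paid attribution."""
    total = sum(installs_by_source.values())
    if total == 0:
        return 0.0
    organic = sum(
        n for src, n in installs_by_source.items() if src not in paid_sources
    )
    return organic / total

# One week of installs, keyed by attribution source (invented names)
week = {"organic": 120, "meta_ads": 200, "google_uac": 80}
paid = {"meta_ads", "google_uac", "tiktok_ads"}

print(f"{organic_share(week, paid):.0%}")  # 30%: right at the green-flag line
```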
5. Review Velocity and Sentiment
What it is: Number of reviews per week and average star rating trend.
Why it matters: Reviews are unsolicited user feedback. If people love it, they’ll say so. If they’re confused or frustrated, they’ll say that too.
How to measure: App Store Connect or Google Play Console. Track weekly review count and rating.
Green flag: 4.0+ rating, reviews mention specific features they love, volume is steady or growing.
Yellow flag: 3.5–4.0, mixed feedback, or declining review velocity.
Red flag: Under 3.5, frequent complaints about the same issue (bugs, confusion, value).
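If you want this tracked outside the console UI, weekly velocity and rating are easy to aggregate from an exported review list (the ISO-week keys and sample ratings below are invented):

```python
from collections import defaultdict

def review_trend(reviews):
    """Per-week review count and average star rating.

    reviews: list of (iso_week, stars) tuples exported from the store console.
    Returns {week: (count, avg_rating)} in chronological order.
    """
    counts, totals = defaultdict(int), defaultdict(int)
    for week, stars in reviews:
        counts[week] += 1
        totals[week] += stars
    return {w: (counts[w], totals[w] / counts[w]) for w in sorted(counts)}

reviews = [("2025-W40", 5), ("2025-W40", 4),
           ("2025-W41", 5), ("2025-W41", 3), ("2025-W41", 4)]
print(review_trend(reviews))
# {'2025-W40': (2, 4.5), '2025-W41': (3, 4.0)}
```

Velocity up and rating flat-or-up is the pattern you want; rising volume with a falling average usually means a recent release introduced a problem.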

What “Almost There” Looks Like (and How to Close the Gap)
You’re close to PMF if:
- D1 is solid (35%+) but D7 drops off hard → Fix: Add a reminder/notification on day 2–3, introduce a streak or progress mechanic
- Retention is great but paying conversion is low → Fix: Surface your paywall earlier, clarify the value of premium features, test pricing
- Everything looks good except reviews mention one recurring bug → Fix: Prioritize that bug over new features
- Organic share is zero but retention/monetization are strong → Fix: Add share hooks, referral incentives, or improve your ASO
The last 20% is usually one or two obvious fixes. Don’t skip them to “scale faster.”
Worked Example: Should You Scale?
Let’s say you launched a habit-tracking app. You’ve got 100 new users this week (all organic). Here’s what you see:
- D1 retention: 40%
- D7 retention: 12%
- Paying conversion (trial starts in first 7 days): 1.8%
- Organic share: 100% (you haven’t run ads yet)
- App rating: 4.1 stars, 8 reviews, mostly positive but two mention “confusing onboarding”
Analysis:
- D1 is solid for the category (target: 25–40%) ✅
- D7 is below the target range (15–25%) ⚠️
- Paying conversion is just under the subscription benchmark (2–8% trial starts) ⚠️
- Organic is great (no paid spend yet) ✅
- Reviews are positive but onboarding friction is clear ⚠️
Recommendation: You’re almost there. Fix onboarding (address the confusion), add a day-3 nudge to improve D7, and surface the paywall one screen earlier. Retest with the next 100 organic users. If D7 climbs to 16%+ and paying conversion hits 2.5%+, you’re green-lit for paid UA.
Math check: At $2 CPI with a 2.5% trial-start rate and 50% trial-to-paid, only 1.25% of installs become $5/month subscribers. That's $2 / 0.0125 = $160 to acquire each subscriber against $5/month in revenue: a 32-month payback, nowhere near scalable. But if the onboarding and paywall fixes lift trial starts toward 5% and trial-to-paid toward 60%, cost per subscriber drops to about $67, and an annual plan or solid multi-month retention brings payback inside the first year. The takeaway: run small tests to validate these numbers before committing real budget.
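The unit economics can be sanity-checked in a few lines, using the example's assumed inputs ($2 CPI, 2.5% trial starts, 50% trial-to-paid, $5/month):

```python
cpi = 2.00            # cost per install (assumed)
trial_start = 0.025   # share of installs starting a trial
trial_to_paid = 0.50  # share of trials converting to paid
price = 5.00          # monthly subscription price

subscriber_rate = trial_start * trial_to_paid  # 0.0125 -> 1.25% of installs
cost_per_sub = cpi / subscriber_rate           # $160 to acquire a subscriber
payback_months = cost_per_sub / price          # 32 months to recoup spend

print(subscriber_rate, cost_per_sub, payback_months)  # 0.0125 160.0 32.0
```

Swapping in your own CPI, conversion, and price makes it obvious which lever moves payback fastest; for most subscription apps it's the trial-start rate.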
DIY Checklist: Your PMF Pre-Flight
Before you spend a dollar on UA, run through this:
- Pull retention data for your last 3 cohorts (by install week). Calculate D1 and D7.
- Calculate paying conversion: (users who paid) / (total users) in the same cohorts.
- Check session depth: average actions or time per session for retained users.
- Review organic share: what percentage of installs are non-paid?
- Read your reviews: any recurring complaints? What’s your rating trend?
- Compare to benchmarks: are you in the top half of your category for D1/D7 and paying conversion?
- Identify the weakest link: which metric is dragging you down?
- Make one targeted fix: onboarding, paywall placement, feature clarity, bug fix.
- Retest with the next cohort. Did the fix move the number?
- Repeat until 3 of 5 signals are green. Then consider UA.
Quick Retention Calculator (Mini Spreadsheet)
Use this to quickly compute D1 and D7 from raw data:
| Cohort Date | New Users | Returned D1 | Returned D7 | D1 % | D7 % |
|---|---|---|---|---|---|
| 2025-10-01 | 100 | 42 | 14 | =C2/B2 | =D2/B2 |
| 2025-10-08 | 150 | 60 | 18 | =C3/B3 | =D3/B3 |
| 2025-10-15 | 120 | 50 | 15 | =C4/B4 | =D4/B4 |
Alternatives to DIY: When to Get Help
If your signals are green but you lack the time, budget, or expertise to scale, here are your options:
Option 1: Contractors or Agency
What you get: ASO services, UA management, creative production.
What it costs: $2K–10K/month retainer + ad spend.
Trade-offs: You control the budget and strategy, but you’re paying upfront with no guarantee of ROI. Good if you have capital and want full control.
Option 2: Publisher or Growth Partner
What you get: They fund UA, handle ASO, analytics, and iteration. You keep building.
What it costs: Revenue share (typically 40–60% depending on who owns what).
Trade-offs: No upfront spend, but you’re splitting revenue. Good if you’re capital-constrained and want to focus on product.
Option 3: Raise Funding
What you get: Capital to hire or contract growth ops in-house.
What it costs: Equity (10–25% for early rounds).
Trade-offs: You keep control but dilute ownership. Good if you’re building a venture-scale business and need to move fast.

Decision Tree: What’s Your Next Move?
If D1 < 25% or D7 < 10%: Don’t scale yet. Fix core retention first.
If D1 > 35% and D7 > 15%, but paying conversion < 2%: Improve monetization (paywall placement, pricing, feature clarity). Don’t scale until conversion is above 2%.
If D1 > 35%, D7 > 15%, paying conversion > 2%, and you have budget: Start small UA tests ($500–1K) to validate CPI and ROAS. Scale what works.
If all signals are green but you lack budget or ops capacity: Consider a partner or publisher who can fund and run growth while you keep shipping product.
If you’re pre-launch or under 100 users: Get to 100 organic users first. You can’t measure PMF without data.
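The branches above can be sketched as a single function; the thresholds are this post's rules of thumb rather than universal constants, and the signature is invented for illustration:

```python
def next_move(d1, d7, pay_conv, has_budget, n_users):
    """Map the decision tree's thresholds to a recommendation.

    d1, d7, pay_conv are fractions (0.40 = 40%); thresholds follow
    the post's rules of thumb, not universal benchmarks.
    """
    if n_users < 100:
        return "Get to 100 organic users first"
    if d1 < 0.25 or d7 < 0.10:
        return "Fix core retention first"
    if d1 > 0.35 and d7 > 0.15 and pay_conv < 0.02:
        return "Improve monetization before scaling"
    if d1 > 0.35 and d7 > 0.15 and pay_conv >= 0.02:
        return "Run small UA tests" if has_budget else "Consider a growth partner"
    return "Keep iterating and re-measure next cohort"

# The worked example's habit-tracking app: D7 and conversion not green yet
print(next_move(d1=0.40, d7=0.12, pay_conv=0.018, has_budget=True, n_users=100))
# Keep iterating and re-measure next cohort
```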
Final Thought: PMF Is Not Binary
Product–market fit isn’t a switch you flip. It’s a gradient. You’re not looking for perfection; you’re looking for enough signal to justify investment.
The goal of this checklist isn’t to delay your growth. It’s to make sure that when you do scale, the money you spend actually compounds instead of evaporating.
If your signals are green and you’re ready to move, but you don’t want to manage UA/ASO/analytics yourself, that’s where a publishing partner can help.



