TestFlight week one data
I keep a spreadsheet of the first week of every product I launch on TestFlight. Here's BoardSnap's — the numbers, what they meant, and the three things we changed before public launch.
Week one of TestFlight is when you find out if the product you built is the product users need. Theory meets practice. Most of the surprises are uncomfortable. This one was no exception.
### The raw numbers
- Installs: 47 (via TestFlight invite links sent to personal network and two online communities)
- Opened the app at least once: 44 (94% install-to-open rate — this is normal; TestFlight skews toward curious early adopters)
- Snapped at least one board: 29 (66% activation rate — lower than I wanted)
- Completed a full summary (snap → viewed full output): 21 (72% of activated users, 45% of all installs)
- Returned on day 3 or later: 14 (30% D3 retention)
- Used the chat feature: 8 (28% of users who snapped at least one board)
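The percentages in the list are just rounded ratios of the raw counts. A minimal sketch of the arithmetic (the helper and variable names are mine, not from any BoardSnap code):

```swift
// Rounded percentage helper; the counts are the week-one numbers above.
func pct(_ numerator: Int, of denominator: Int) -> Int {
    Int((Double(numerator) / Double(denominator) * 100).rounded())
}

let installs = 47, opened = 44, activated = 29
let summarized = 21, returnedD3 = 14, chatted = 8

print(pct(opened, of: installs))      // install-to-open: 94
print(pct(activated, of: opened))     // activation: 66
print(pct(summarized, of: activated)) // summary completion: 72
print(pct(returnedD3, of: installs))  // D3 retention: 30
print(pct(chatted, of: activated))    // chat engagement: 28
```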
### What the activation gap told me
66% activation on a product where the core loop is "snap a board" means 34% of the users who opened the app never used the primary feature.
I interviewed five of the non-activating users. Three patterns:
- No whiteboard available. Installed the app at home or at a desk, didn't have a whiteboard to snap, forgot about the app by the time they were next in a meeting. This is a context problem: the app is useful in a meeting room, not at your desk. Getting installed before the meeting is the challenge.
- Didn't understand what to do first. Opened the app, saw an empty state (first version had no demo content), couldn't figure out the right first action. This was the onboarding failure I described in a separate post.
- Snapped the wrong thing. Tried to snap a printed document, a photo, or a laptop screen. VisionKit doesn't detect these as whiteboards, so the scanner view didn't react the way the user expected. I added better empty-state messaging for this case.
### The chat feature underuse
28% chat engagement was lower than I expected and higher than I feared. The pessimistic read: the feature is buried. The optimistic read: it's there for users who want depth, and most first-week users were still establishing the core snap habit.
My current read: chat is a power-user feature that reveals its value after 3–5 board snaps, when the Project has enough history to make contextual questions interesting. Measuring chat at week one is too early. D30 chat engagement is the right metric.
### The three immediate fixes
Fix 1: Empty state with demo content. Added a demo project with one pre-populated board on first launch. New users see what a complete board summary looks like before they take their first snap. This was the most impactful change.
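The seeding check is the kind of thing a first-launch flag handles. A minimal sketch, assuming a simple UserDefaults guard; `DemoSeeder` and the key name are hypothetical, not BoardSnap's actual implementation:

```swift
import Foundation

// Hypothetical first-launch guard: seed the demo project exactly once.
struct DemoSeeder {
    let defaults: UserDefaults
    private let key = "hasSeededDemoProject"  // illustrative key name

    // Returns true only on the first call for a given defaults store,
    // so the demo board is created once and never re-inserted.
    func shouldSeedDemoProject() -> Bool {
        guard !defaults.bool(forKey: key) else { return false }
        defaults.set(true, forKey: key)
        return true
    }
}
```

On first launch the check returns true and the app inserts the pre-populated board; every later launch it returns false, so returning users never see the demo project duplicated.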
Fix 2: Onboarding prompt timing. Moved the "snap your first board" prompt to after the user has interacted with the demo board, not before. Interaction with demo → specific CTA → snap. This sequence improved activation meaningfully.
Fix 3: Wrong-thing scanner feedback. If VisionKit doesn't detect a whiteboard (because the user is pointing at something that isn't one), the UI now shows "Looking for a whiteboard — try pointing at a flat board with marker writing." Previously it just showed the camera view with no feedback, which confused users trying to understand why nothing was happening.
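The feedback logic itself is tiny once detection is factored out. A sketch of the overlay-state idea, assuming whether a board is in frame is decided upstream by VisionKit; the enum and function names are illustrative, not the shipped code:

```swift
// Illustrative scanner overlay states; whether a board is present is
// decided elsewhere (VisionKit in the real app, not shown here).
enum ScanState {
    case searching  // nothing that looks like a whiteboard in frame
    case detected   // a board is in frame and ready to snap
}

// Hint text for the camera overlay; nil means no hint is shown.
func overlayHint(for state: ScanState) -> String? {
    switch state {
    case .searching:
        return "Looking for a whiteboard — try pointing at a flat board with marker writing."
    case .detected:
        return nil
    }
}
```

The important design point is that `.searching` is a real state with a message, not the absence of a state; silence was exactly what confused users.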
### D30 outlook
With these three fixes shipped by day 7, D30 retention is the number I'm watching. Based on the behavior of the 14 users who returned at D3+, and the correlation between board count and return rate, I'm targeting 35–40% D30 for the cohort that went through the fixed onboarding.
The honest answer is I don't know yet. Ask me again in three weeks.
Want to try the workflow this post describes? BoardSnap is free on the App Store. Snap your first board today.