Haptics as a product feature
We use three distinct haptic patterns in BoardSnap, each with a specific functional meaning. Here's why haptics aren't decoration in a camera app — they're the interface.
Most apps treat haptics the way they treat sound: as a nicety. A subtle confirmation tap when you complete an action, nice to have and easy to ignore.
In BoardSnap, haptics are functional. Here's why, and how we designed the haptic language.
### The camera interface problem
When you're photographing a whiteboard, your primary attention is on the board, not on the phone screen. You need to know what the app is doing — whether the scanner has locked on the board, whether you need to adjust your angle — without looking at the screen.
Text and visual cues require looking at the screen. Haptics don't.
This is the case for functional haptics: in contexts where the user's eyes aren't on the interface, haptic feedback is the only reliable communication channel.
### The three haptic patterns
#### Pattern 1: Detection lock — double tap
When VisionKit achieves high-confidence detection of a whiteboard (the yellow quad is stable and the confidence score exceeds our threshold), the phone delivers a double haptic tap — similar to a notification. This tells the user: the app has a clean lock on the board, you can capture now.
Users learn this pattern within their first two snaps, usually without being told what it means. The connection is intuitive: double tap = ready.
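The double tap isn't a built-in system pattern, so it has to be composed from individual impacts. A minimal sketch of one way to do it, assuming two .light impacts with a 100 ms gap (the class name and the exact timing are illustrative, not BoardSnap's actual values):

```swift
import UIKit

/// Plays two quick light impacts to signal a high-confidence detection lock.
/// The 0.1 s gap between taps is an illustrative value, not BoardSnap's exact timing.
final class DetectionLockHaptic {
    private let generator = UIImpactFeedbackGenerator(style: .light)

    /// Call when a lock looks likely, so the Taptic Engine is warmed up
    /// and the first tap fires without latency.
    func prepare() {
        generator.prepare()
    }

    /// Fires the double-tap pattern: tap, short pause, tap.
    func play() {
        generator.impactOccurred()
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [generator] in
            generator.impactOccurred()
        }
    }
}
```

The wrapper would be prepared when the detection session starts and played once when confidence crosses the threshold, so the pattern fires at most once per lock.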
#### Pattern 2: Capture confirmation — medium impact
When the shutter is triggered, a single medium-weight impact. This confirms the capture happened — different from the soft taps the OS uses for navigation and distinct enough to register as "something happened."
This pattern exists because in bright environments the visual shutter flash can be invisible; users sometimes weren't sure whether the snap had happened. The haptic is the reliable confirmation.
#### Pattern 3: Analysis complete — success notification
When the BoardSnap analysis completes and the summary card appears, we deliver Apple's standard "success" haptic pattern (a rising short-long sequence using UINotificationFeedbackGenerator). This is the same haptic that appears when you successfully authenticate with Face ID or complete an Apple Pay transaction — users already associate it with completion.
Analysis happens in the background, potentially while the user has navigated elsewhere. The success haptic brings their attention back to the app at exactly the right moment.
### Implementation details
We use UIImpactFeedbackGenerator for the capture confirmation, UINotificationFeedbackGenerator.notificationOccurred(.success) for analysis completion, and a custom sequence of UIImpactFeedbackGenerator calls with the .light style for the detection lock pattern.
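For the two system-provided patterns, the calls are short. A hedged sketch of how the capture and completion haptics can be wired up (the variable names and call sites are assumptions for illustration):

```swift
import UIKit

// Capture confirmation: a single medium-weight impact when the shutter fires.
let captureGenerator = UIImpactFeedbackGenerator(style: .medium)
captureGenerator.prepare()          // warm up before the user taps the shutter
captureGenerator.impactOccurred()   // fire at the moment of capture

// Analysis completion: Apple's standard "success" notification haptic,
// fired when the summary card appears.
let completionGenerator = UINotificationFeedbackGenerator()
completionGenerator.prepare()
completionGenerator.notificationOccurred(.success)
```

In practice the generators would live as long-lived properties on the capture controller rather than locals, so prepare() can be called well ahead of the event that triggers the haptic.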
One thing that bit us: haptic feedback generators need to be prepared before use. Calling prepare() before the expected haptic reduces latency. If you call impactOccurred() without preparing, there's a perceptible delay on the first trigger. For a detection-lock haptic that should feel immediate, that delay matters.
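The latency pitfall and its fix can be shown side by side (a sketch; the comment placement of prepare() relative to the detection session is an assumption about a reasonable call site, not BoardSnap's exact code):

```swift
import UIKit

let generator = UIImpactFeedbackGenerator(style: .light)

// Anti-pattern: firing cold. The first impactOccurred() after creating a
// generator can lag perceptibly while the Taptic Engine spins up.
// generator.impactOccurred()

// Better: prepare as soon as a haptic becomes likely (e.g. when the
// detection session starts), then fire the instant the lock is confirmed.
generator.prepare()
// ... detection runs, confidence crosses the threshold ...
generator.impactOccurred()
```

Note that prepare() keeps the engine ready only briefly, so it belongs shortly before the expected haptic, not at app launch.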
### The respect-for-context rule
Haptics are off in Silent Mode by default. We don't override this. Users who have their phone silenced have a reason — they're in a context where they don't want feedback. We respect that.
The visual fallbacks (the quad animation, the shutter flash, the animated processing card) work without haptics. Haptics enhance the experience; they're not required for it.
### The user feedback
In beta user interviews, haptics came up unprompted in positive feedback: "I like how you can feel when it's locked on," "the vibration when it's done is satisfying." No user mentioned haptics negatively.
When I asked specifically about haptics, every user who had them on reported that the detection-lock double tap was the most useful — it freed them from watching the screen and let them focus on holding the phone steady. Exactly the behavior we designed for.
Snap your first board today.
See the workflow this post talks about — free on the App Store.