Field Notes · 2026-04-14 · 7 min read

How we make BoardSnap feel instant

The real pipeline from shutter tap to action plan takes about 8–12 seconds. Here's every trick we use to make it feel like 3.

Speed is a feature. But perceived speed is a different feature — one you can control independently of actual latency.

BoardSnap targets a sub-10-second round trip from shutter tap to action plan. On a fast connection with a clean board, we hit around 8 seconds. On a congested network, it can stretch to 14. But here's what I care about more than the raw number: users consistently describe the app as "fast" even when the actual latency is toward the high end. That gap between actual and perceived speed is engineered, not accidental.

Here's how it works.

### Step 1: VisionKit runs on-device immediately

The moment you tap the shutter, Apple's VisionKit starts working — and it runs entirely on-device. No network request. No round trip.

VisionKit detects the whiteboard's bounding quad in real time while you're still pointing the camera. By the time you tap, it already knows where the board is. The perspective correction happens in milliseconds after the shot is taken. The image is straightened, cropped, and normalized before we make any network call.

This matters for two reasons. First, it reduces what we have to send — a properly cropped board image is smaller than a raw wide-angle photo. Second, the user sees an immediate response: the straightened image appears in under a second, which signals that something is happening. You don't stare at a spinner. You see your board, clean and cropped.
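The crop step is simple enough to sketch. Assuming the detector reports the board's four corners in normalized image coordinates, a minimal bounding crop looks like this (TypeScript for illustration; every name here is an assumption, and the real pipeline applies a full perspective warp on-device, not just a crop):

```typescript
// A detected corner in normalized image coordinates (0..1), as
// Vision-style observations report them. Names are illustrative.
interface Point { x: number; y: number; }
interface CropRect { x: number; y: number; width: number; height: number; }

// Sketch only: the axis-aligned bounding crop that shrinks the payload
// before any network call. The shipped pipeline also straightens the
// image with a perspective correction, which this omits.
function boundingCrop(quad: Point[], padding = 0.02): CropRect {
  const xs = quad.map(p => p.x);
  const ys = quad.map(p => p.y);
  // Clamp to the image bounds after adding a small safety margin.
  const minX = Math.max(0, Math.min(...xs) - padding);
  const minY = Math.max(0, Math.min(...ys) - padding);
  const maxX = Math.min(1, Math.max(...xs) + padding);
  const maxY = Math.min(1, Math.max(...ys) + padding);
  return { x: minX, y: minY, width: maxX - minX, height: maxY - minY };
}
```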

### Step 2: Optimistic UI fills the gap

While the image uploads and BoardSnap AI processes it, we show the user a card with the board thumbnail, a pulsing scan line, and the word "Analyzing…"

The card is real. It exists in the project list immediately. You can navigate away, do something else, come back. The result will be there when the processing finishes.

This is the most important piece of perceived speed: we don't make you wait at a spinner screen. The processing happens in the background while you live your life. When it's done, you get a notification — locally delivered, no push required — and the card updates in place.

The alternative — making users stare at a loading bar — would feel slower even if the actual latency were identical. The optimistic card approach is probably responsible for half the "it's so fast" comments we get.
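The card lifecycle can be modeled as a small state machine. This is an illustrative sketch, not BoardSnap's actual types:

```typescript
// Sketch of the optimistic-card lifecycle; all names are assumptions.
type CardState =
  | { kind: "analyzing" }                // pulsing scan line, "Analyzing…"
  | { kind: "ready"; summary: string };

interface BoardCard { id: number; state: CardState; }

class ProjectList {
  private nextId = 1;
  readonly cards = new Map<number, BoardCard>();

  // Called the instant the shutter is tapped: the card exists
  // immediately, before any network work starts.
  addOptimisticCard(): number {
    const id = this.nextId++;
    this.cards.set(id, { id, state: { kind: "analyzing" } });
    return id;
  }

  // Called when background processing finishes: the card updates
  // in place, even if the user navigated away in the meantime.
  complete(id: number, summary: string): void {
    const card = this.cards.get(id);
    if (card) card.state = { kind: "ready", summary };
  }
}
```

The key property is that `addOptimisticCard` never waits on the network, and `complete` mutates an existing card rather than creating a new screen the user has to sit on.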

### Step 3: Streaming the summary

Once BoardSnap AI starts generating, we stream the response rather than waiting for the full output. Characters appear as they're generated. This is the same technique every chat interface uses, and it works for the same reason: a progressively appearing result feels active rather than stalled.

For a typical whiteboard summary — two or three paragraphs plus a list of action items — the full generation takes 4–6 seconds. Streaming means users see the first words in under a second after processing begins. By the time the last action item appears, the beginning of the summary has been readable for several seconds.
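In miniature, progressive rendering is just appending each chunk to the visible text as it arrives. A minimal sketch, with illustrative chunk boundaries:

```typescript
// Each generated chunk is appended to the visible text as soon as it
// arrives, instead of waiting for the full summary. The render callback
// stands in for whatever updates the UI.
function streamSummary(chunks: string[], render: (visible: string) => void): void {
  let visible = "";
  for (const chunk of chunks) {
    visible += chunk;
    render(visible); // the UI sees partial text immediately
  }
}
```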

### Step 4: Aggressive caching of project context

BoardSnap's brand-aware AI reads your website URL once and caches the brand context per project. That context doesn't get re-fetched on every snap — it's stored and included in the prompt from cache.

This shaves roughly 1–2 seconds off every subsequent snap in the same project. First snap in a new project pays the full cost. Everything after that gets the cached brand context essentially free.

### Step 5: The offline queue as a speed feature

If you snap a board while offline, BoardSnap doesn't fail — it queues the capture on-device and processes it the moment you're back online. From the user's perspective, this means the app never says no. You can snap a board in a basement, on a plane, in a tunnel.

But here's the non-obvious thing: the queue is also a speed feature for online captures. Because we've built the queueing infrastructure, every snap goes through the same queue — even on a fast connection. This means we can batch, retry, and prioritize intelligently. The queue lets us smooth over network blips without the user ever seeing a failed request.
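A minimal model of that single-queue design, with all names assumed:

```typescript
// Every snap is enqueued, online or not; the queue drains whenever
// the network is available. Types and names are illustrative.
interface Capture { id: number; }

class CaptureQueue {
  private pending: Capture[] = [];
  readonly processed: number[] = [];
  private online = false;

  // Every capture goes through the queue (even on a fast connection)
  // so batching, retries, and prioritization live in one place.
  enqueue(capture: Capture): void {
    this.pending.push(capture);
    if (this.online) this.flush();
  }

  setOnline(online: boolean): void {
    this.online = online;
    if (online) this.flush(); // back online: drain everything queued
  }

  private flush(): void {
    while (this.pending.length > 0) {
      const next = this.pending.shift()!;
      this.processed.push(next.id); // stand-in for upload + analysis
    }
  }
}
```

Because online and offline captures share this one path, the retry logic is exercised on every snap rather than only in the rare offline case.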

### What we haven't optimized yet

In the interest of transparency: the slowest part of our pipeline right now is image upload on slow connections. A well-lit whiteboard photo after VisionKit processing is typically 300–600KB, which uploads in 1–2 seconds on LTE. On 3G or a congested hotel Wi-Fi, that can stretch to 4–5 seconds.

We're working on a progressive upload approach — send a downsampled preview first, start processing, then send the full-resolution image to replace it. The summary quality on the first pass might be slightly lower, but the perceived start of processing would move up by 2–3 seconds. Shipping that before v1.1.
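The planned flow can be sketched as two sends through the same uploader. The sender, the sizes, and the preview fraction here are assumptions, not the shipped design:

```typescript
// Progressive upload sketch: a downsampled preview goes out first so
// processing can start sooner, then the full-resolution image follows
// and replaces the first-pass result.
interface Upload { bytes: number; isPreview: boolean; }

function progressiveUpload(
  fullResBytes: number,
  send: (upload: Upload) => void,
  previewFraction = 0.1, // assumed downsampling ratio
): void {
  // Preview first: the perceived start of processing moves up.
  send({ bytes: Math.round(fullResBytes * previewFraction), isPreview: true });
  // Full image second: the higher-quality summary replaces the first pass.
  send({ bytes: fullResBytes, isPreview: false });
}
```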

### The lesson

Actual speed matters up to a threshold. Past that threshold, perceived speed is what users are actually rating. The gap between the two is entirely in your control — through optimistic UI, streaming, caching, and a queue that never says no.

BoardSnap feels fast because we've engineered the perception of speed as deliberately as the actual performance.

### Frequently asked

Does BoardSnap work offline?

Yes. BoardSnap queues captures on-device when you have no connection. The moment you're back online, the queue flushes and your summaries appear. You never lose a snap.

How long does a typical BoardSnap analysis take?

On LTE or WiFi, about 8–12 seconds from shutter tap to full action plan. The first visible result (the optimistic card) appears in under a second. The summary streams in progressively from there.

Snap your first board today.

See the workflow this post talks about — free on the App Store.
