Field Notes · 2026-04-11 · 5 min read

Whiteboard photography tips from 100 snaps

I went through the first 100 boards that beta users snapped and categorized every failure. Most bad scans come from the same three mistakes. Here's exactly what to fix.

When beta launched, I asked early users to send me their failed scans — boards where BoardSnap didn't read the content correctly. I got about 30 failures in the first two weeks, which I analyzed alongside the successful scans.

Here's what I found.

### Failure #1: Glare from overhead lighting (42% of failures)

This is the most common failure mode by a wide margin. Overhead fluorescent lighting — standard in conference rooms — creates a hotspot on the whiteboard surface where light reflects directly into the camera. VisionKit can't read through a glare hotspot. Nobody can.

The fix: Move. Don't try to take the photo from straight on if there's a hotspot. Step slightly to the left or right — even 20–30 degrees is usually enough to move the glare off the writing. VisionKit handles the perspective correction, so you don't need to be perfectly perpendicular. Being off-axis is fine as long as the glare is gone.

If you can't move (you're in the back corner, the board is against a wall), point the phone slightly downward. The glare from overhead lights disappears when you change the vertical angle even slightly.

Quick test: before you snap, check the phone's live view. If you can see the marker writing clearly on the screen, the shot will work. If there's a bright white blob washing out part of the board, move until it's gone.
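To make the "bright white blob" concrete: a glare hotspot is just a patch of pixels blown out to near pure white, so the writing underneath is unrecoverable. Here is a toy sketch of how you might flag one in a grayscale image; the threshold and fraction are illustrative guesses, not BoardSnap's actual logic:

```python
def has_glare_hotspot(gray, threshold=250, max_fraction=0.05):
    """Flag an image as glare-prone if too many pixels are near pure white.

    gray: 2D list of 0-255 grayscale values.
    threshold and max_fraction are illustrative, not BoardSnap's real values.
    """
    pixels = [p for row in gray for p in row]
    blown = sum(1 for p in pixels if p >= threshold)
    return blown / len(pixels) > max_fraction

# A board with a washed-out blob covering part of the frame:
glary = [[255] * 4 + [180] * 6 for _ in range(5)]
print(has_glare_hotspot(glary))  # prints True (40% of pixels are blown out)
```

The point of the sketch: once a pixel hits the sensor's ceiling, no amount of post-processing brings the marker stroke back, which is why the fix is to move, not to edit.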

### Failure #2: Low contrast markers (31% of failures)

This is the one that surprises people. Whiteboard markers vary widely in quality and age. A fresh, high-quality blue or black marker on a clean white board is easy to read. A faded red marker on a slightly yellowed board in dim lighting is genuinely hard — not just for AI, but for human eyes too.

Specific culprits from the beta:

  • Yellow markers on a white board. Yellow on white is low contrast in any lighting.
  • Light gray dry-erase markers (usually the "low-odor" variety). These have lower pigment density and look faded even when new.
  • Old markers on well-used boards. The board surface gets a thin film of marker residue over time that reduces contrast.

The fix: If you're in control of the markers, use black or dark blue. These are the highest-contrast options and work reliably across all lighting conditions. In a room where you're not controlling the markers, move closer so the writing fills more of the frame, or tap and hold on the writing in the camera view to lock exposure for the marker rather than the overall scene.

In Photos.app, you can also increase contrast and brightness after the fact and then import the edited photo into BoardSnap via the share extension — if the original snap failed, this often recovers it.
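What that contrast boost does, roughly, is take the narrow band of gray levels a faded marker occupies and stretch it across the full range. A minimal sketch of the idea — a plain linear stretch, whereas Photos.app's adjustment is more sophisticated:

```python
def stretch_contrast(gray):
    """Linearly rescale grayscale values so the darkest pixel maps to 0
    and the brightest to 255. Illustrative only; this is the idea behind
    a contrast boost, not Photos.app's actual algorithm."""
    lo = min(min(row) for row in gray)
    hi = max(max(row) for row in gray)
    if hi == lo:
        return gray  # flat image: nothing to stretch
    return [[round((p - lo) * 255 / (hi - lo)) for p in row] for row in gray]

# Faded marker (~150) on a yellowed board (~200): only 50 levels apart.
faded = [[200, 200, 150, 200]]
print(stretch_contrast(faded))  # prints [[255, 255, 0, 255]]
```

After the stretch, the marker stroke sits at full black against full white, which is why an edited re-import often succeeds where the original snap failed.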

### Failure #3: Partial board captures (19% of failures)

VisionKit is great at edge detection when the full board is in frame. It struggles when parts of the board are cut off by the photo edge, obscured by someone standing in front of it, or when the board extends beyond what the camera can capture.

This happens most often with large floor-to-ceiling boards in small rooms — the room simply doesn't give you enough distance to get the whole board in frame at the angle you want.

The fix for small rooms: Take two snaps — left half and right half — and BoardSnap will create two separate boards in your project, whose summaries you can read together. Not perfect, but better than a failed single snap.

The fix for foreground occlusion: Wait until the room clears. I know this sounds obvious, but the reflex after a meeting is to immediately snap before people scatter — which is also when the most people are standing in front of the board. Give it 30 seconds.

### What the 8% of other failures were

The remaining failures were a mix:

  • Boards in very low light (a meeting room with the lights dimmed for a presentation).
  • Boards with heavy mind-map-style arrows, where the AI couldn't follow the connection logic.
  • One case where a user tried to snap a projected image of a board, which doesn't trigger VisionKit's document detection because a projection lacks the physical edge profile of a real board.

### The one-sentence rule

If I had to summarize the lessons from 100 snaps in one rule: the camera needs a clear path to the marker. Glare, distance, low contrast, and occlusion are all ways that path gets blocked. Remove the obstacle and the snap will work.

BoardSnap's VisionKit integration handles the rest — perspective, rotation, cropping. Your job is just to get a clear view of the marker.

### Frequently asked

Does BoardSnap work with photos taken by a regular camera, not the in-app scanner?

Yes — you can import photos from your camera roll via the share extension. The VisionKit perspective correction won't apply (that runs in real-time as you scan), but BoardSnap AI will still read and summarize the content. For best results on imported photos, try to shoot the board as straight-on as possible.

What's the best lighting for whiteboard photography?

Indirect natural light is best. Overhead fluorescent lighting is the most common problem. If you're in a fluorescent-lit room, change the angle of your shot — left, right, or slightly downward — until the glare hotspot moves off the marker area.

Snap your first board today.

See the workflow this post talks about — free on the App Store.
