
# How to scan a whiteboard with your iPhone

## Short answer

To scan a whiteboard with an iPhone, open BoardSnap, point the camera at the board, and tap the shutter — Apple VisionKit detects the board edges and auto-corrects perspective. BoardSnap AI then reads the content and delivers a summary plus action items in about ten seconds.

## The fastest way: BoardSnap

BoardSnap is built specifically for this workflow. Here's the exact flow:

  1. Open BoardSnap on your iPhone. Select or create a Project (e.g., "Q3 Planning" or "Client: Acme").
  2. Tap the camera icon. The viewfinder launches with VisionKit's quad overlay — a four-corner bounding box that tracks the whiteboard in real time as you move the phone.
  3. Center the board in frame so all four corners are visible. The quad turns yellow/gold when it locks on.
  4. Tap the shutter. VisionKit captures and perspective-corrects the image on-device. No cropping, no manual corner adjustment.
  5. Wait ~10 seconds. BoardSnap AI reads the corrected image and returns a plain-English summary plus a tri-state action item list (open / in-progress / done).
  6. Review and edit. Tap any action item to change its state, edit the text inline, or ask BoardSnap AI a follow-up question via chat.
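For developers curious how a capture flow like this is wired up: VisionKit exposes the same quad-detecting, perspective-correcting camera through `VNDocumentCameraViewController`. The sketch below shows the general pattern — it is not BoardSnap's actual code, and `handleCorrectedImage(_:)` is a hypothetical hook standing in for the app's AI step.

```swift
import UIKit
import VisionKit

// Minimal sketch of a VisionKit capture flow, assuming an app that just
// needs the perspective-corrected image back. `handleCorrectedImage(_:)`
// is a hypothetical stand-in for whatever happens next (OCR, upload, AI).
final class BoardCaptureCoordinator: NSObject, VNDocumentCameraViewControllerDelegate {

    func presentScanner(from presenter: UIViewController) {
        // Document scanning is unavailable on some devices (and simulators).
        guard VNDocumentCameraViewController.isSupported else { return }
        let camera = VNDocumentCameraViewController()
        camera.delegate = self
        presenter.present(camera, animated: true)
    }

    // Called after the user taps the shutter; by this point VisionKit has
    // already detected the quad and applied the perspective correction.
    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        guard scan.pageCount > 0 else { return }
        // A whiteboard scan is normally a single page.
        let corrected: UIImage = scan.imageOfPage(at: 0)
        controller.dismiss(animated: true)
        handleCorrectedImage(corrected)
    }

    func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
        controller.dismiss(animated: true)
    }

    private func handleCorrectedImage(_ image: UIImage) {
        // Hand the flattened image to the next stage.
    }
}
```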

## Without a dedicated app

The document scanners built into the iPhone's Files and Notes apps use the same VisionKit framework and will produce a similarly clean, perspective-corrected image. The difference: they stop there. You get a flat image or a PDF. No OCR output to your clipboard, no summary, no action items.

The workaround — exporting the image to ChatGPT or Claude — adds steps and loses context. You'd have to reprompt every session, and nothing carries forward to your next meeting.

## Tips for a clean scan

  • Lighting matters more than distance. Overhead fluorescent lights create glare on shiny boards. Angle the phone slightly to the side to avoid direct reflection.
  • Dark marker on white board scans cleanest. Light gray or yellow marker on white boards can lose contrast.
  • Clean the board first. Ghost marks from old sessions confuse the AI about what's current content.
  • Capture before erasing — obvious, but worth saying. BoardSnap's offline queue means you can snap even without signal; it syncs later.

## What VisionKit actually does

Apple's VisionKit framework runs entirely on-device using the Neural Engine. It detects the document/board boundary via a rectangle detection pass, then applies a perspective transform (homography) to produce a front-facing flat image. This is the same pipeline used in Notes, Files, and the Continuity Camera document scanner. BoardSnap wraps this in a purpose-built UI and feeds the corrected image directly to the AI layer.

The result is better than a manual crop: VisionKit's transform accounts for lens distortion as well as angle, so text near the edges of a wide board stays readable.
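The two-stage pipeline described above (rectangle detection, then a homography) can also be run manually with the Vision and Core Image frameworks. This sketch illustrates the technique under that assumption — it is not Apple's internal implementation or BoardSnap's code:

```swift
import CoreImage
import Vision

// Rough sketch: detect the most likely board quad, then apply the
// perspective transform that maps it to a front-facing rectangle.
func flattenBoard(in image: CIImage) -> CIImage? {
    // Stage 1: rectangle detection. Results come back in normalized
    // coordinates with the origin at the bottom-left.
    let request = VNDetectRectanglesRequest()
    request.minimumConfidence = 0.8
    request.maximumObservations = 1

    let handler = VNImageRequestHandler(ciImage: image, options: [:])
    try? handler.perform([request])
    guard let quad = request.results?.first else { return nil }

    // Convert the normalized corners to pixel coordinates.
    let size = image.extent.size
    func pixel(_ p: CGPoint) -> CGPoint {
        CGPoint(x: p.x * size.width, y: p.y * size.height)
    }

    // Stage 2: the homography, via Core Image's perspective correction.
    return image.applyingFilter("CIPerspectiveCorrection", parameters: [
        "inputTopLeft": CIVector(cgPoint: pixel(quad.topLeft)),
        "inputTopRight": CIVector(cgPoint: pixel(quad.topRight)),
        "inputBottomLeft": CIVector(cgPoint: pixel(quad.bottomLeft)),
        "inputBottomRight": CIVector(cgPoint: pixel(quad.bottomRight)),
    ])
}
```

In practice the document camera shown earlier handles both stages for you; running them manually is mainly useful when you already have a still image rather than a live capture.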

## Frequently asked

**Can I scan a whiteboard with the iPhone's built-in camera and still get a summary?**

Yes, but with extra steps. Take the photo, then upload it to ChatGPT or Claude and prompt for a summary. BoardSnap shortens that to one tap — and keeps the output organized by project.

**Does the scan work on glass whiteboards or dark boards?**

VisionKit works on any high-contrast surface. Glass whiteboards with dark markers scan well. Low-contrast combos (light marker on cream board) may need better lighting to get a clean result.

## See it work in ten seconds

BoardSnap is free on the App Store. Snap a board — get a summary and action plan.

  • Free: 1 project, 30 boards
  • Pro: $9.99/mo, everything unlimited
  • Pro: $69.99/yr (save 42%)