# Whiteboard OCR on iPhone: four options compared
## Short answer
iPhone has four built-in or app-based paths for whiteboard OCR: Apple Live Text (built into Photos, instant), Apple Notes scanner (perspective-corrected, saves PDF), Microsoft Lens (whiteboard mode with color enhancement), and BoardSnap (VisionKit OCR feeding AI summarization and action items). For raw text extraction, Live Text is fastest. For structured output, BoardSnap.
## What whiteboard OCR means on iPhone
OCR (optical character recognition) on iPhone is handled by Apple's Vision framework, accelerated by the on-device Neural Engine. Since iOS 15, OCR has been built into the OS and available to all apps via the Vision and VisionKit APIs. Quality has improved markedly through iOS 17 and 18, with particular gains in handwriting recognition, including cursive.
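For a sense of what's under the hood: a minimal sketch of on-device text recognition with Vision's `VNRecognizeTextRequest` (iOS 13+). The helper function and dispatch choices are illustrative, not code from any of the apps below.

```swift
import Vision
import UIKit

// Recognize text in a whiteboard photo entirely on-device.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // slower, better for handwriting
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```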
## Option 1: Apple Live Text
How it works: Open a whiteboard photo in Photos.app. The Live Text icon (scan lines) appears automatically if text is detected. Tap it to select all visible text, then copy.
Accuracy: High for printed handwriting in dark marker. Degrades for cursive, light colors, or angled shots.
Output: Raw text, no structure. Includes any text in the image in reading order (left-to-right, top-to-bottom).
Cost: Free, built into iOS.
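The same Live Text engine is exposed to third-party apps through VisionKit's `ImageAnalyzer` (iOS 16+). A minimal sketch, with error handling left to the caller:

```swift
import VisionKit
import UIKit

// Extract the full Live Text transcript from a photo (iOS 16+).
func liveTextTranscript(from image: UIImage) async throws -> String {
    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text])
    let analysis = try await analyzer.analyze(image, configuration: configuration)
    return analysis.transcript   // raw text in reading order, no structure
}
```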
## Option 2: Apple Notes Document Scanner
How it works: In Notes.app, tap the camera icon → Scan Documents. Point at the board; the VisionKit quad locks on. Tap to capture. The scan is saved as a PDF on the note.
Accuracy: Good — VisionKit corrects perspective before OCR. Text in the PDF is selectable/searchable via Spotlight.
Output: PDF with selectable text. No AI analysis.
Cost: Free.
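The Notes scanner's capture flow is built on a VisionKit component any app can present. A minimal sketch of `VNDocumentCameraViewController` with a delegate that collects the perspective-corrected pages:

```swift
import VisionKit
import UIKit

// Present the system document scanner and collect corrected page images.
final class ScannerDelegate: NSObject, VNDocumentCameraViewControllerDelegate {
    var onScan: (([UIImage]) -> Void)?

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // Each page is already perspective-corrected by VisionKit.
        let pages = (0..<scan.pageCount).map { scan.imageOfPage(at: $0) }
        controller.dismiss(animated: true)
        onScan?(pages)
    }

    func documentCameraViewControllerDidCancel(_ controller: VNDocumentCameraViewController) {
        controller.dismiss(animated: true)
    }
}

// Usage, from a view controller:
// let delegate = ScannerDelegate()   // retain this somewhere
// let scanner = VNDocumentCameraViewController()
// scanner.delegate = delegate
// present(scanner, animated: true)
```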
## Option 3: Microsoft Lens
How it works: Lens applies perspective correction and whiteboard-specific color enhancement (increases marker contrast, whitens background) before OCR. Saves to OneNote, OneDrive, camera roll, or Word.
Accuracy: The color enhancement step makes Lens measurably better than plain camera OCR for light-colored markers. OneNote OCR on the saved image makes text searchable.
Output: Image + selectable text in OneNote. No AI summary.
Cost: Free (Microsoft account required).
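Microsoft hasn't published its enhancement algorithm, but the effect can be roughly approximated with Core Image: raise contrast so marker strokes darken while the board background washes toward white. A sketch under that assumption; the tuning values are guesses, not Lens's:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Approximate a whiteboard enhancement: boost contrast so marker strokes
// darken and the board background washes out toward white.
func enhanceWhiteboard(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    let filter = CIFilter.colorControls()
    filter.inputImage = input
    filter.contrast = 1.4      // tuning values are guesses, not Lens's
    filter.brightness = 0.1
    filter.saturation = 1.2    // keep marker colors distinguishable

    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```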
## Option 4: BoardSnap
How it works: VisionKit perspective-corrects the capture, BoardSnap's AI reads the corrected image, and the output is a structured summary plus a tri-state action list.
Accuracy: Same VisionKit foundation as Apple Notes, plus AI post-processing that can infer meaning from partial words and context.
Output: Structured markdown: summary paragraph, then action items with status and subtasks.
Cost: Free tier (30 boards). Pro at $9.99/mo or $69.99/yr.
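BoardSnap's API isn't public, so as illustration only, here is the generic shape of a capture-then-summarize client: on-device OCR output posted to a summarization endpoint. The URL and JSON fields below are hypothetical:

```swift
import Foundation

// Illustrative only: a generic capture-then-summarize client pattern.
// The endpoint URL and JSON shape are hypothetical, not BoardSnap's API.
struct BoardSummaryRequest: Encodable {
    let transcript: String    // on-device OCR output
}

func requestSummary(transcript: String) async throws -> String {
    let url = URL(string: "https://api.example.com/v1/summarize")!  // hypothetical
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(BoardSummaryRequest(transcript: transcript))

    let (data, _) = try await URLSession.shared.data(for: request)
    return String(decoding: data, as: UTF8.self)  // structured markdown summary
}
```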
## Which to use
For pure OCR output where you'll do your own formatting: Live Text or Apple Notes. For filing in Microsoft 365: Lens. For a structured, actionable summary of the board's content: BoardSnap.
## Frequently asked
**Does iPhone OCR work offline?**
Yes — Apple's Vision framework and VisionKit run entirely on-device using the Neural Engine. Live Text, Notes scanner, and BoardSnap's capture all work without a network connection. BoardSnap's AI summarization requires internet, but the OCR capture queues on-device and syncs when you're back online.
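As an illustration of the queue-and-sync pattern described above (not BoardSnap's actual code), a minimal sketch that persists pending captures to disk and flushes them when connectivity returns:

```swift
import Foundation

// Illustrative offline queue: persist OCR transcripts locally,
// then flush them when the network comes back.
struct PendingCapture: Codable {
    let transcript: String
    let capturedAt: Date
}

final class CaptureQueue {
    private let fileURL = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("pending-captures.json")

    private(set) var pending: [PendingCapture] = []

    init() {
        if let data = try? Data(contentsOf: fileURL) {
            pending = (try? JSONDecoder().decode([PendingCapture].self, from: data)) ?? []
        }
    }

    func enqueue(_ capture: PendingCapture) {
        pending.append(capture)
        save()
    }

    // Call when connectivity is restored; `upload` is the caller's sync step.
    func flush(upload: (PendingCapture) async throws -> Void) async {
        var remaining: [PendingCapture] = []
        for capture in pending {
            do { try await upload(capture) }
            catch { remaining.append(capture) }   // keep failures queued
        }
        pending = remaining
        save()
    }

    private func save() {
        try? JSONEncoder().encode(pending).write(to: fileURL)
    }
}
```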
## See it work in ten seconds
BoardSnap is free on the App Store. Snap a board — get a summary and action plan.