Last Roll is built for speed. Here's how GPU compositing, hardware-accelerated computer vision, and zero-dependency binary parsing come together to deliver instant response — even on 45-megapixel RAW files.
The photo viewer isn't built on SwiftUI or AppKit in the traditional sense. It's built on CALayer, Apple's compositing engine, which sits directly on top of Metal: the same low-level GPU API used by games and professional 3D software on the Mac.
Each image is uploaded to the GPU as a texture, once. Zoom and pan are not software operations: they are 3D matrix transforms (CATransform3D) applied directly to the hardware layer. Zooming from 100% to 200% on a 45-megapixel file is instantaneous because the CPU never touches a single pixel; the GPU composites the layers autonomously at 60 or 120 fps.
Zoom and pan are expressed as a CATransform3D — a 4×4 homogeneous matrix applied entirely by the GPU compositor:
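A minimal sketch of that transform, assuming hypothetical `zoom`/`pan` state and an `imageLayer` backing the photo (the app's actual property names aren't public):

```swift
import QuartzCore

// Hypothetical view state; names are illustrative.
let zoom: CGFloat = 2.0                 // 100% -> 200%
let pan = CGPoint(x: -320, y: -180)
let imageLayer = CALayer()

// Compose pan + zoom into one 4×4 homogeneous matrix.
var t = CATransform3DIdentity
t = CATransform3DTranslate(t, pan.x, pan.y, 0)
t = CATransform3DScale(t, zoom, zoom, 1)

CATransaction.begin()
CATransaction.setDisableActions(true)   // apply without implicit animation
imageLayer.transform = t                // evaluated by the GPU compositor
CATransaction.commit()
```

Setting `transform` only updates the layer's matrix; the texture uploaded earlier is reused as-is on every frame.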
Focus peaking is probably the most technically sophisticated feature in the app. Press D and the following sequence runs in milliseconds — start to finish, on a full-resolution file.
The color image is rendered into a byte buffer via CGContext. Each pixel then becomes a single float value normalized between 0.0 and 1.0, converted with vDSP_vfltu8 and a 1/255 scale: vectorized routines that process many pixels per SIMD instruction rather than looping pixel by pixel.
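The conversion step, sketched scalar-style so the math is visible (the shipping code does the same thing across the whole buffer with the vectorized vDSP calls):

```swift
// Convert 8-bit pixels to normalized Float luminance, 0.0...1.0.
// Scalar equivalent of the vDSP conversion; names are illustrative.
func normalizedLuminance(_ pixels: [UInt8]) -> [Float] {
    pixels.map { Float($0) / 255.0 }
}

let luma = normalizedLuminance([0, 51, 255])
// luma == [0.0, 0.2, 1.0]
```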
The core of the algorithm. A 3×3 kernel slides over every pixel and approximates the Laplacian, the sum of second derivatives of the luminance signal. In plain terms: it detects where brightness changes sharply, and sharp transitions correspond to in-focus edges.
This operation on a 24-megapixel image would take tens of seconds in interpreted Python. Here it runs in single-digit milliseconds.
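One way to picture the pass, as a plain scalar sketch (the app runs the same kernel through Accelerate's vectorized convolution instead of nested loops):

```swift
// 3×3 Laplacian over a grayscale image stored row-major.
// Scalar sketch; border pixels are left at zero for simplicity.
let kernel: [Float] = [0,  1, 0,
                       1, -4, 1,
                       0,  1, 0]

func laplacian(_ img: [Float], width: Int, height: Int) -> [Float] {
    var out = [Float](repeating: 0, count: img.count)
    for y in 1..<height - 1 {
        for x in 1..<width - 1 {
            var acc: Float = 0
            for ky in -1...1 {
                for kx in -1...1 {
                    let k = kernel[(ky + 1) * 3 + (kx + 1)]
                    acc += k * img[(y + ky) * width + (x + kx)]
                }
            }
            out[y * width + x] = acc
        }
    }
    return out
}
```

On a flat region the kernel sums to zero, so only brightness transitions produce a response.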
Instead of a fixed cutoff, which would produce different results on bright vs. dark images, the threshold is derived from the statistical properties of the Laplacian map itself:

T = μ + k · σ
Where μ is the mean and σ the standard deviation (computed with vDSP_meanv and vDSP_vsq), and k is the user-controlled sensitivity slider (0.3 → 3.0). This is the same statistical approach used in industrial machine vision systems.
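The formula in code, as a plain sketch (the app computes the same moments with the vDSP calls above; `sensitivity` stands in for the slider value):

```swift
// Adaptive threshold T = mean + k * stddev over the Laplacian map.
// Scalar sketch; the variance comes from the mean of squares.
func peakingThreshold(_ map: [Float], sensitivity k: Float) -> Float {
    let n = Float(map.count)
    let mean = map.reduce(0, +) / n
    let meanOfSquares = map.map { $0 * $0 }.reduce(0, +) / n
    let sigma = max(0, meanOfSquares - mean * mean).squareRoot()
    return mean + k * sigma
}
```

Because T floats with μ and σ, a uniformly bright image doesn't light up more than a dark one at the same sensitivity.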
Raw detected edges are one pixel thin, effectively invisible at screen resolution. A second convolution pass with a diamond-shaped kernel expands them outward, producing the visible colored outline around sharp zones. Without this step, the overlay would read as scattered single pixels rather than clean borders.
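The expansion pass can be pictured like this, sketched over a boolean edge mask (the app expresses it as a second convolution on the float map; the function name is illustrative):

```swift
// Grow 1-px edges by a diamond (cross-shaped) neighborhood so they
// survive display scaling. Radius-1 sketch of the dilation pass.
func dilateDiamond(_ mask: [Bool], width: Int, height: Int) -> [Bool] {
    var out = mask
    let offsets = [(0, -1), (-1, 0), (1, 0), (0, 1)]  // the diamond
    for y in 0..<height {
        for x in 0..<width where mask[y * width + x] {
            for (dx, dy) in offsets {
                let nx = x + dx, ny = y + dy
                if nx >= 0, nx < width, ny >= 0, ny < height {
                    out[ny * width + nx] = true
                }
            }
        }
    }
    return out
}
```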
The result is an RGBA image with pre-multiplied alpha (for correct blending during compositing), rendered as a separate CALayer on top of the photo. Changing the overlay color or sensitivity only recomputes the necessary buffer; no other part of the interface is redrawn.
Nikon's RAW format (.NEF) is a TIFF container — the same format used for professional images since 1986. Autofocus data is hidden inside the MakerNote, a proprietary block that Nikon embeds inside EXIF tag 0x927C.
Last Roll uses no external libraries for this. The parser is written entirely in Swift and works directly on raw file bytes, navigating the nested binary structure layer by layer:
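The outermost of those layers is the TIFF header itself; a minimal sketch of how such a parser bootstraps, assuming illustrative helper names:

```swift
import Foundation

// Read multi-byte integers honoring the file's declared byte order.
func readU16(_ d: Data, _ o: Int, littleEndian: Bool) -> UInt16 {
    let b0 = UInt16(d[o]), b1 = UInt16(d[o + 1])
    return littleEndian ? b1 << 8 | b0 : b0 << 8 | b1
}

func readU32(_ d: Data, _ o: Int, littleEndian: Bool) -> UInt32 {
    let b = (0..<4).map { UInt32(d[o + $0]) }
    return littleEndian
        ? b[3] << 24 | b[2] << 16 | b[1] << 8 | b[0]
        : b[0] << 24 | b[1] << 16 | b[2] << 8 | b[3]
}

// TIFF header: 2-byte order mark, magic 42, offset of the first IFD,
// the directory that eventually leads to EXIF and the MakerNote.
func firstIFDOffset(of data: Data) -> UInt32? {
    guard data.count >= 8 else { return nil }
    let little: Bool
    switch (data[0], data[1]) {
    case (0x49, 0x49): little = true    // "II", Intel byte order
    case (0x4D, 0x4D): little = false   // "MM", Motorola byte order
    default: return nil
    }
    guard readU16(data, 2, littleEndian: little) == 42 else { return nil }
    return readU32(data, 4, littleEndian: little)
}
```

From that offset the parser walks IFD entries, follows the EXIF pointer, and finally lands on tag 0x927C.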
The result is a rectangle in the original image's pixel coordinate space. It's then scaled to screen coordinates and drawn as a vector overlay — a CGPath with rounded corners and a center crosshair — directly on top of the photo.
RAW files don't contain only raw sensor data. Every modern RAW file embeds at least one full-resolution JPEG preview — the same image the camera displays on its rear LCD. Last Roll exploits this to load photos at maximum speed.
The loader scans the file's byte stream looking for the standard JPEG boundary markers:
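A naive version of that scan, with an illustrative function name (a production scanner would also walk JPEG segment lengths rather than searching byte by byte):

```swift
import Foundation

// Find embedded JPEGs by locating SOI (FF D8 FF) and EOI (FF D9)
// markers, keeping the byte range of the largest candidate.
func largestEmbeddedJPEG(in data: Data) -> Range<Int>? {
    let bytes = [UInt8](data)
    var best: Range<Int>?
    var i = 0
    while i + 2 < bytes.count {
        if bytes[i] == 0xFF, bytes[i + 1] == 0xD8, bytes[i + 2] == 0xFF {
            var j = i + 2
            while j + 1 < bytes.count {
                if bytes[j] == 0xFF, bytes[j + 1] == 0xD9 {
                    let candidate = i..<(j + 2)
                    if candidate.count > (best?.count ?? 0) {
                        best = candidate
                    }
                    break
                }
                j += 1
            }
        }
        i += 1
    }
    return best
}
```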
All embedded JPEG blocks are located and the largest one is returned — which corresponds to the full-resolution preview. This approach is significantly faster than full RAW decoding and requires no external libraries.
Sessions are serialized as JSON with dates in ISO 8601 format (2026-02-14T09:00:00Z) — the international standard for date interchange, ensuring sessions are read correctly across time zones and system locales.
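In Codable terms this is one line of encoder configuration; the `Session` model below is a hypothetical minimal stand-in:

```swift
import Foundation

// Hypothetical minimal session model; field names are illustrative.
struct Session: Codable {
    let name: String
    let createdAt: Date
}

let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .iso8601   // dates become ISO 8601 strings

let session = Session(name: "Morning shoot",
                      createdAt: Date(timeIntervalSince1970: 0))
let json = String(data: try! encoder.encode(session), encoding: .utf8)!
// createdAt encodes as "1970-01-01T00:00:00Z"
```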
The file is first written to a temporary location, then renamed to the target path in a single OS-level operation that the file system performs atomically. If the app crashes mid-save, the previous file is untouched. Culling data is never lost to a partial write.
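Foundation packages this exact write-to-temp-then-rename pattern as the `.atomic` option; a sketch:

```swift
import Foundation

// The data is staged in a temporary file and renamed over the target
// in one step, so a crash mid-save never leaves a partial file.
func saveSession(_ json: Data, to url: URL) throws {
    try json.write(to: url, options: .atomic)
}
```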
Sessions created by older app versions — before the notes field existed — decode cleanly. The custom decoder uses decodeIfPresent so missing JSON keys produce a default empty string rather than a parse error. No migration scripts, no versioning flags.
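The pattern, with a hypothetical minimal `Session` model standing in for the real one:

```swift
import Foundation

// `notes` was added in a later version; older files lack the key.
struct Session: Codable {
    let name: String
    let notes: String

    enum CodingKeys: String, CodingKey { case name, notes }

    init(from decoder: Decoder) throws {
        let c = try decoder.container(keyedBy: CodingKeys.self)
        name = try c.decode(String.self, forKey: .name)
        // Missing key -> default value, not DecodingError.keyNotFound.
        notes = try c.decodeIfPresent(String.self, forKey: .notes) ?? ""
    }
}

let legacy = #"{"name": "Rome trip"}"#.data(using: .utf8)!
let session = try! JSONDecoder().decode(Session.self, from: legacy)
// session.notes == ""
```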