
Field observations are fragmented. A photo here, a location ping there, a pattern noticed over weeks. The challenge isn't collection—it's synthesis. Turning noise into signal, signal into intelligence.

Spyglass is a camera-first logging system designed for rapid field capture. Snap image, tag with minimal metadata, move on. But behind the brutal simplicity of the interface lies a Bayesian inference engine that never stops building its model of the world.

Raw captures flow through entity extraction, confidence weighting, and multi-source correlation. When patterns crystallize above threshold, structured briefs emerge—actionable intelligence from fragmented signals.
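That flow can be sketched as a few value types and pure functions. Everything here is illustrative: the type and stage names are invented for this sketch, not Spyglass's real API, and entity extraction (CoreML in the app) is stubbed out.

```swift
import Foundation

// Hypothetical types sketching the capture-to-brief flow.
struct Capture { let image: Data; let timestamp: Date }
struct Entity { let id: String; var confidence: Double }
struct Brief { let entities: [Entity] }

// Entity extraction runs on CoreML in the real app; stubbed here.
func extractEntities(from capture: Capture) -> [Entity] { [] }

// Correlation: only entities whose accumulated confidence clears the
// threshold crystallize into a brief; everything else stays latent.
func correlate(_ entities: [Entity], threshold: Double) -> Brief? {
    let confirmed = entities.filter { $0.confidence >= threshold }
    return confirmed.isEmpty ? nil : Brief(entities: confirmed)
}
```

The point of the shape: a brief is an optional output. Most observation pools never cross threshold, and that is by design.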

00

Design Process

Problem Space: Existing field collection tools optimize for structured data entry—forms, dropdowns, required fields. But field conditions demand speed. The operator's attention is a scarce resource. Every second spent managing UI state is a second not observing the environment.

Research: Studied military intelligence collection workflows (SALUTE briefs, INTREP formats), surveillance tradecraft, and mobile journalism tools. Key insight: professionals use unstructured capture (photos, voice memos) in the field and structure later. Why fight that pattern?

Failed Approaches: Early iterations tried guided capture flows—prompt for entity type, relationship, confidence level. Field testing showed this created decision fatigue. Operators stopped capturing marginal observations. The "easy path" of skipping the app won.

Breakthrough: Inverting the model. Capture everything with zero friction. Let inference happen in the background. The user doesn't assign confidence—the system calculates it from corroboration. Structure emerges from volume, not from upfront categorization.

Technical Constraints: On-device processing only (no cloud dependency, for operational security). CoreML for entity extraction. Limited battery budget. Solution: aggressive background processing while the device charges, with minimal real-time inference.

The design bet: lower capture friction more than compensates for noisier individual signals. Volume + inference beats sparse + structured.
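The "system calculates confidence from corroboration" idea can be modeled as a noisy-OR update: each independent signal has a reliability, and no single capture is trusted on its own. The reliability values below are illustrative, not Spyglass's tuned priors.

```swift
// Noisy-OR corroboration: given independent signals with reliabilities
// r_i, overall confidence is 1 - Π(1 - r_i). Three mediocre signals
// outweigh one strong one, which is exactly the volume-over-sparsity bet.
func corroborated(_ reliabilities: [Double]) -> Double {
    1.0 - reliabilities.reduce(1.0) { $0 * (1.0 - $1) }
}
```

Under this model a single 0.5-reliability observation yields 0.5 confidence, but three of them yield 0.875, without the operator ever assigning a confidence level.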

Future Evolution: Multi-user fusion (shared belief networks across a team), temporal pattern detection (predicting where entities will appear), integration with external intelligence feeds, and real-time collaboration on active operations.

01

Belief Network

Each observation spawns or reinforces an entity node. Vehicles, people, locations, devices—all tracked as probability distributions, not discrete facts. Confidence accumulates through corroboration and decays without reinforcement.

Links form between entities that appear together—same time, same place, same pattern. When correlation strength exceeds threshold, the system generates hypotheses: potential relationships awaiting validation.

The network is alive. Nodes pulse with observation frequency. Links strengthen and fade. High-confidence entities pull related nodes closer. Weak entities drift to the periphery and eventually dissolve.

The system doesn't require explicit structure—patterns emerge from repeated observation and temporal proximity.
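A minimal sketch of such a node, assuming an exponential per-hour decay model and a reinforcement rule that moves confidence toward 1 by the signal's weight. The field names and both formulas are assumptions for illustration.

```swift
import Foundation

// An entity node: confidence accumulates through corroboration and
// decays without reinforcement, as described above.
struct EntityNode {
    var confidence: Double
    var lastSeen: Date

    // New observation: close part of the gap to certainty.
    mutating func reinforce(weight: Double, at time: Date) {
        confidence += (1.0 - confidence) * weight
        lastSeen = time
    }

    // Confidence as of `time`, decayed at `rate` per hour since
    // the last observation. Unreinforced nodes fade toward zero.
    func decayed(at time: Date, rate: Double) -> Double {
        let hours = time.timeIntervalSince(lastSeen) / 3600
        return confidence * exp(-rate * max(0, hours))
    }
}
```

Dissolution falls out for free: a node whose decayed confidence drops below some floor can simply be pruned from the graph.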

02

Signal Types

Different collection methods produce different signal types. Each carries its own reliability profile, decay rate, and correlation potential.


VISUAL captures are the core—photographs from the field with embedded location and timestamp. High reliability, low latency.

COMINT (communications intelligence) logs intercepted or observed communications patterns. Text messages, radio chatter, observed phone usage.

HUMINT (human intelligence) captures verbal reports, overheard conversations, behavioral observations. Higher noise, but often provides context that sensors miss.

GEOINT (geospatial intelligence) tracks movement patterns, location clusters, route analysis. Time-series data that reveals habits.

SIGINT (signals intelligence) monitors electronic emissions, WiFi probe requests, Bluetooth beacons. Passive collection that builds device fingerprints.

A single photograph is ambiguous. Correlate it with location history, signal emissions, and prior sightings—now you have intelligence.
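The per-type reliability and decay profiles could hang off a simple enum. The numeric values here are placeholders chosen to match the prose (visual high-reliability, human intelligence noisy, geospatial habits slow to change), not the app's tuned constants.

```swift
// The five collection disciplines, each with its own reliability prior
// and decay rate. Values are illustrative placeholders.
enum SignalType: String, CaseIterable {
    case visual, comint, humint, geoint, sigint

    var baseReliability: Double {
        switch self {
        case .visual: return 0.9  // direct photographic evidence
        case .humint: return 0.5  // high noise, but high context
        case .comint, .geoint, .sigint: return 0.7
        }
    }

    var decayPerHour: Double {
        switch self {
        case .geoint: return 0.001 // movement habits change slowly
        case .humint: return 0.01  // verbal reports go stale fast
        default:      return 0.003
        }
    }
}
```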

03

Inference Engine

Each observation updates the world model. Time, location, tags, and image content all factor into the probability graph. The inference engine maintains running posteriors on tracked entities.

Confidence builds through corroboration—independent signals pointing to the same conclusion. Confidence decays without reinforcement. The system forgets what isn't repeatedly confirmed.

Parameters are tuned for field conditions:

Confidence Threshold: 0.72 — Below this, patterns stay as hypotheses. Above, they generate briefs.

Temporal Decay: 0.003/hr — Stale information loses weight. Recent observations matter more.

Correlation Radius: 150m — Signals within this distance are candidates for entity fusion.

Pattern Window: 72hr — The temporal scope for detecting behavioral patterns.
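The four parameters above, and the two checks they drive, fit in one small struct. Only the numbers come from the text; the struct, the function names, and the exponential form of the decay are assumptions of this sketch.

```swift
import Foundation

// Field-tuned parameters from the section above.
struct InferenceParameters {
    var confidenceThreshold = 0.72   // below: hypothesis; above: brief
    var temporalDecayPerHour = 0.003
    var correlationRadiusMeters = 150.0
    var patternWindowHours = 72.0

    // Weight of an observation `ageHours` old under exponential decay:
    // fresh captures count fully, stale ones fade.
    func weight(ageHours: Double) -> Double {
        exp(-temporalDecayPerHour * ageHours)
    }

    // Whether two signals are close enough to be entity-fusion candidates.
    func fusable(distanceMeters: Double) -> Bool {
        distanceMeters <= correlationRadiusMeters
    }
}
```

At 0.003/hr, an observation at the edge of the 72-hour pattern window still retains roughly 80% of its weight, so the window and the decay rate are tuned to work together.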

The filing cabinet stores everything. The belief network decides what matters.

04

SALUTE Brief

When confidence crystallizes, the system generates a structured intelligence brief using the SALUTE format—a military standard for conveying tactical information quickly and completely.

S — Size: Who or what was observed. Entity count and classification.

A — Activity: What they were doing. Behavior pattern identification.

L — Location: Where. Precise coordinates, referenced to known landmarks.

U — Unit: Entity identification. Vehicle descriptions, individual profiles, group affiliations.

T — Time: When observed. Timestamp in local time with timezone.

E — Equipment: Notable items. Vehicles, devices, carried objects, collection method used.
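The six fields map directly onto a value type. The field names mirror the SALUTE format itself; the Swift types and the rendering are assumptions about how Spyglass might represent a brief, not its actual schema.

```swift
import Foundation

// A SALUTE brief as a value type, one property per letter.
struct SaluteBrief {
    var size: String        // S: entity count and classification
    var activity: String    // A: observed behavior pattern
    var location: String    // L: coordinates + landmark reference
    var unit: String        // U: entity identification
    var time: Date          // T: observation timestamp
    var equipment: [String] // E: vehicles, devices, carried objects

    var rendered: String {
        """
        S: \(size)
        A: \(activity)
        L: \(location)
        U: \(unit)
        T: \(time)
        E: \(equipment.joined(separator: ", "))
        """
    }
}
```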

Fragmentary signals in, actionable intelligence out. The brief is the product. Everything else is infrastructure.

Zero Friction Capture

The interface philosophy is brutal simplicity. Camera launches immediately on app open. Snap image, tag with minimal metadata, move on. Every tap should add value, not manage UI state.

Analysis happens in the background. Tags are freeform but the system learns. Related events cluster automatically. The user doesn't need to understand Bayesian inference—they just need to capture what they see.
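Seen as data, the operator's entire per-capture burden is one photo plus optional freeform tags; everything else attaches automatically. A sketch under that assumption; the real capture model is not public.

```swift
import Foundation

// The whole of what a field capture asks of the operator.
struct FieldCapture {
    let photo: Data
    let timestamp = Date()        // attached automatically at capture
    var latitude: Double? = nil   // filled from the location service
    var longitude: Double? = nil
    var tags: [String] = []       // freeform; clustering happens later
}
```

Nothing is required beyond the photo itself, which is what keeps marginal observations worth capturing.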

The design bet is that intelligence emerges from volume and consistency. Lower the barrier to capture, maintain high-quality inference, let patterns surface on their own. The operator focuses on collection. Spyglass handles synthesis.

Built on SwiftUI and CoreML. Runs entirely on-device. No cloud dependency. Your intelligence stays yours.