Researchers on 30-day expeditions had spotty satellite internet, frozen fingers, and no patience for chat UIs. The existing AI assistant required typed prompts and always-on connectivity — two things the field could not offer.
- Must work offline for 72 hours and reconcile when a link is found.
- Touch targets big enough for insulated gloves.
- No LLM call may cost more than $0.04 — expedition budgets are tight.
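The $0.04 cap could be enforced with a pre-flight estimate before any call leaves the device. A minimal sketch — the prices and the 4-characters-per-token heuristic are illustrative assumptions, not the project's actual figures:

```typescript
// Hypothetical pre-flight budget guard for the per-call cost cap.
const MAX_COST_USD = 0.04;

interface ModelPricing {
  inputPerMTok: number;  // USD per million input tokens (assumed)
  outputPerMTok: number; // USD per million output tokens (assumed)
}

function estimateCostUSD(
  prompt: string,
  maxOutputTokens: number,
  pricing: ModelPricing,
): number {
  // Rough heuristic: ~4 characters per token.
  const inputTokens = Math.ceil(prompt.length / 4);
  return (
    (inputTokens * pricing.inputPerMTok +
      maxOutputTokens * pricing.outputPerMTok) / 1_000_000
  );
}

function withinBudget(
  prompt: string,
  maxOutputTokens: number,
  pricing: ModelPricing,
): boolean {
  return estimateCostUSD(prompt, maxOutputTokens, pricing) <= MAX_COST_USD;
}
```

Calls that would blow the cap can then be truncated or queued rather than sent.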
Turn the copilot into a notebook. Researchers dictate or scribble; the AI fills in fields, suggests tags, and flags contradictions — but never takes the wheel. Streaming UI hides latency; optimistic UI hides offline gaps.
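The optimistic half of that idea can be sketched as a local write queue: edits apply to on-device state immediately, and a pending list drains whenever a link appears. Names and shapes here are illustrative, not the shipped data model:

```typescript
// Sketch of optimistic offline writes: the UI reads local state instantly,
// and pending edits reconcile later when connectivity returns.
interface FieldEdit {
  noteId: string;
  field: string;
  value: string;
  at: number; // epoch ms, kept for later conflict detection
}

class OfflineQueue {
  private pending: FieldEdit[] = [];
  private notes = new Map<string, Record<string, string>>();

  // Apply locally right away -- no waiting on the network.
  apply(edit: FieldEdit): void {
    const note = this.notes.get(edit.noteId) ?? {};
    note[edit.field] = edit.value;
    this.notes.set(edit.noteId, note);
    this.pending.push(edit);
  }

  // Called when a link is found; returns edits to reconcile upstream.
  drain(): FieldEdit[] {
    const out = this.pending;
    this.pending = [];
    return out;
  }

  read(noteId: string): Record<string, string> | undefined {
    return this.notes.get(noteId);
  }
}
```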
01. Field study
Ran a two-week pilot with the Svalbard team over Starlink. Logged every interruption. There were 312.
02. UX for unreliability
Designed a state machine for every sync scenario: Draft, Queued, Reconciled, Conflict. Every change keeps a reversible history.
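One of those machines could look like the sketch below. The four states come from the case study; the event names and transition table are assumptions for illustration:

```typescript
// Sketch of a sync state machine with a reversible history.
type SyncState = "Draft" | "Queued" | "Reconciled" | "Conflict";
type SyncEvent = "submit" | "ack" | "remoteMismatch" | "resolve" | "edit";

const transitions: Record<SyncState, Partial<Record<SyncEvent, SyncState>>> = {
  Draft:      { submit: "Queued" },
  Queued:     { ack: "Reconciled", remoteMismatch: "Conflict" },
  Reconciled: { edit: "Draft" },
  Conflict:   { resolve: "Queued" },
};

class SyncRecord {
  private history: SyncState[] = ["Draft"]; // every transition is kept

  get state(): SyncState {
    return this.history[this.history.length - 1];
  }

  dispatch(event: SyncEvent): boolean {
    const next = transitions[this.state][event];
    if (!next) return false; // illegal transition: ignore, never crash
    this.history.push(next);
    return true;
  }

  // Reversible history: step back one transition.
  revert(): void {
    if (this.history.length > 1) this.history.pop();
  }
}
```

Making Conflict an explicit state, rather than an error path, is what lets the UI render it as calmly as any other state.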
03. Prompt design
Wrote 40+ system prompts, evaluated against 200 historic logs. Built an eval harness in TypeScript that grades output against rubrics.
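A rubric-based grader of the kind described might look like this. The rubric shape, weights, and example criteria are assumptions, not the project's actual harness:

```typescript
// Hypothetical rubric grader: each criterion checks the model output
// and contributes its weight; the score is normalized to 0..1.
interface RubricCriterion {
  name: string;
  check: (output: string) => boolean;
  weight: number;
}

function grade(output: string, rubric: RubricCriterion[]): number {
  const total = rubric.reduce((s, c) => s + c.weight, 0);
  const earned = rubric.reduce(
    (s, c) => s + (c.check(output) ? c.weight : 0),
    0,
  );
  return total === 0 ? 0 : earned / total;
}

// Illustrative rubric for a structured field-log entry.
const fieldLogRubric: RubricCriterion[] = [
  { name: "has species tag", check: (o) => /species:/i.test(o), weight: 2 },
  { name: "has ISO date", check: (o) => /\d{4}-\d{2}-\d{2}/.test(o), weight: 1 },
];
```

Running every prompt variant over the 200 historic logs then reduces to comparing mean scores per variant.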
04. Motion as affordance
Subtle Framer Motion cues for sync status — a 400ms opacity dip, a gentle underline. Zero pop-ups.
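Those cues can be written as Framer Motion-style variants, shown here as plain data so the state-to-motion mapping is visible. The 400ms dip and the underline come from the text; every other value is illustrative:

```typescript
// Sync-status cues as variant data (durations in seconds).
interface Cue {
  opacity: number;
  textDecoration: "none" | "underline";
  transition: { duration: number };
}

const syncCues: Record<"Draft" | "Queued" | "Reconciled" | "Conflict", Cue> = {
  Draft:      { opacity: 1,   textDecoration: "none",      transition: { duration: 0 } },
  // The 400ms opacity dip while a change is in flight.
  Queued:     { opacity: 0.6, textDecoration: "none",      transition: { duration: 0.4 } },
  Reconciled: { opacity: 1,   textDecoration: "none",      transition: { duration: 0.4 } },
  // A gentle underline instead of a pop-up when reconciliation fails.
  Conflict:   { opacity: 1,   textDecoration: "underline", transition: { duration: 0.4 } },
};
```

In a React component these would plausibly be passed as `variants` to a `motion` element with `animate` set to the current sync state, so state changes animate without any modal interruption.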
“AI in hostile environments is not about model quality. It's about gracefully failing in 14 different weather conditions. Design for the worst day.”