2026-03-19
v0.7.0: Instant & Everywhere

Tasks extracted in seconds, pushed to your Apple Watch, 3x faster transcription, and a new way to add data sources through Claude Code. Plus: a clearer product vision for what Remember This is becoming.

The Big Picture

This release does two things. On the surface, it adds instant task extraction, Apple Reminders integration, and an interactive planning board. Under the surface, it marks a shift in what Remember This is becoming.

The product vision has been reframed around a simple idea: Remember This is a personal context engine. It runs in the background, turns your fragmented life data into structured context, and makes that context available to AI tools that can actually use it. The app itself is a setup, curation, and exploration interface — not a daily destination.

This release ships the last batch of features from the old "task management" framing, while laying groundwork for the new direction.

Instant Task Extraction

Previously, tasks were extracted only during scheduled Claude Code sessions (typically once a day, at 11 PM). That meant a voice memo recorded at 9 AM wouldn't become a task until 14 hours later.

Now, a small local LLM (Llama 3.2 3B via Ollama) triages items within seconds of transcription. It extracts tasks, assigns urgency (now/soon/someday), tags life domains, and flags items for deeper Claude processing later.

"Reminder to call the dentist tomorrow" becomes a task in your inbox within seconds of finishing the voice memo.

Apple Reminders Integration

Tasks marked as urgent are automatically pushed to Apple Reminders. That means they sync to your iPhone and Apple Watch via iCloud — no extra apps, no extra sync, no extra accounts.

Voice memo on your Mac → transcribed in seconds → task extracted → appears on your wrist. Everything up to the reminder runs locally; only the finished reminder syncs through iCloud.
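The push itself needs nothing beyond what macOS ships with: the Reminders app is scriptable via AppleScript, so a sketch of the final hop could look like this (the list name "Inbox" is an assumption, not necessarily what the app uses):

```python
import subprocess

def reminder_script(title: str, list_name: str = "Inbox") -> str:
    """Build an AppleScript snippet that creates a reminder.
    The target list name is illustrative."""
    safe = title.replace('"', '\\"')  # escape quotes for AppleScript
    return (
        f'tell application "Reminders" to tell list "{list_name}" '
        f'to make new reminder with properties {{name:"{safe}"}}'
    )

def push_to_reminders(title: str) -> None:
    """Run the script via osascript (macOS only)."""
    subprocess.run(["osascript", "-e", reminder_script(title)], check=True)
```

From there, iCloud handles the fan-out to iPhone and Watch with no code at all.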

Orientation Board

A new fullscreen view for visual weekly planning. Tasks are arranged in time-horizon lanes (Right Now, Today, This Week, This Month, This Quarter, This Year) and can be dragged between lanes to reschedule.

The board reads from and writes to a markdown file in your vault, so it stays in sync with Obsidian and is readable by Claude Code.
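Because the board's source of truth is plain markdown, "dragging a task to another lane" reduces to a text edit. A minimal sketch, assuming each lane is a `## heading` with `- [ ]` task lines (the vault file's real layout may differ):

```python
def move_task(md: str, task: str, to_lane: str) -> str:
    """Move a '- [ ] task' line under another lane's heading.
    Assumes one '## Lane' heading per lane; the format is illustrative."""
    # Drop the task from wherever it currently lives...
    lines = [ln for ln in md.splitlines() if ln.strip() != f"- [ ] {task}"]
    out = []
    for line in lines:
        out.append(line)
        # ...and re-insert it right under the target lane's heading.
        if line.strip() == f"## {to_lane}":
            out.append(f"- [ ] {task}")
    return "\n".join(out)
```

The upside of this design is exactly what the release notes claim: Obsidian, Claude Code, and the board all read and write the same file, so no sync layer is needed.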

Set Up Sources with Claude

Adding data sources (Telegram messages, WhatsApp history, Git commits, browser history) used to require knowing about CLI tools and file paths. Now there's a "Set up with Claude" button that launches a Claude Code session with source-specific prompts.

Claude walks you through installing prerequisites, authenticating, and testing — then you add the source through the app's UI. The technical complexity is handled conversationally.

3x Faster Transcription

The transcription engine switched from whisper-rs to a Sona sidecar, resulting in roughly 3x faster transcription on Apple Silicon. You also get real-time progress display, per-asset language override, and better name recognition through Whisper proper noun hints.
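Sona's exact interface isn't documented here, but Whisper-family models generally accept an initial prompt that biases decoding toward listed spellings, which is how "proper noun hints" typically work. A sketch of building such a hint from known names (the prompt wording and length cap are assumptions):

```python
def proper_noun_hint(names: list[str], max_chars: int = 200) -> str:
    """Build a short initial-prompt string so a Whisper-style decoder
    prefers these spellings. Deduplicated, sorted, and truncated to keep
    the prompt from crowding out actual audio context."""
    hint = "Names that may appear: " + ", ".join(sorted(set(names))) + "."
    return hint[:max_chars]
```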

More Data Sources

New breadcrumb importers for Telegram messages (via tg-cli) and bank transaction CSVs. Plus, the MCP server now auto-configures itself on startup, and you can search photos by OCR text.
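A breadcrumb importer is conceptually simple: normalize rows from an external source into the app's timeline records. As an illustration for the bank CSV case (the column names and breadcrumb fields below are made up; real bank exports vary widely):

```python
import csv
import io

def import_transactions(csv_text: str) -> list[dict]:
    """Turn bank CSV rows into breadcrumb dicts.
    Column names (Date, Description, Amount) are illustrative."""
    crumbs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        crumbs.append({
            "kind": "transaction",
            "timestamp": row["Date"],
            "text": row["Description"],
            "amount": float(row["Amount"]),
        })
    return crumbs
```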

People Knowledge Graph

The people directory (Notes/life/people/) now has a clear structure:

  • INDEX.md — auto-generated summary table with Obsidian wikilinks (app-managed)
  • PEOPLE-MAP.md — relationship map and context (Claude-managed)
  • Individual person files — one per person, editable by you

You can now create person files manually for people who don't appear in photos, and the app preserves any fields you add when it updates photo counts.
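The split between app-managed and user-owned data comes down to one rule: when the app regenerates its fields, it must merge rather than overwrite. A sketch of that update and of an INDEX.md row (field names and table layout are illustrative, not the app's actual format):

```python
def update_person_fields(existing: dict, photo_count: int) -> dict:
    """Refresh the app-managed field while preserving any user-added keys."""
    merged = dict(existing)              # keep the user's custom fields
    merged["photo_count"] = photo_count  # app-managed field (name illustrative)
    return merged

def index_row(name: str, fields: dict) -> str:
    """One Obsidian-wikilink table row for the auto-generated INDEX.md."""
    return f"| [[{name}]] | {fields.get('photo_count', 0)} |"
```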

Fixes

  • MCP search fixed — search for transcriptions, people, and locations was silently returning empty results due to a SQL NULL handling bug. Fixed across all MCP search tools.
  • Auto-updater fixed — the update check was failing silently due to a Tauri v2 URL template variable bug. Now works reliably.
  • Tauri upgraded to 2.10.3 with updater plugin 2.10.0.
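The MCP search bug is worth a closer look because the failure mode is so quiet. The app's actual queries aren't shown here, but the general trap is that in SQL, concatenating a NULL column yields NULL, and `NULL LIKE '%x%'` is neither true nor false, so rows with a NULL field simply vanish from results without any error:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE people (name TEXT, notes TEXT)")
con.execute("INSERT INTO people VALUES ('Jane', NULL)")

# Buggy pattern: NULL || anything is NULL, so NULL-notes rows never match.
buggy = con.execute(
    "SELECT name FROM people WHERE (name || ' ' || notes) LIKE ?",
    ("%Jane%",)).fetchall()

# Fix: COALESCE NULL columns to '' before concatenating.
fixed = con.execute(
    "SELECT name FROM people WHERE (name || ' ' || COALESCE(notes, '')) LIKE ?",
    ("%Jane%",)).fetchall()

# buggy == []   fixed == [('Jane',)]
```

Same query shape, same data, and the buggy version silently returns nothing — which is why the bug went unnoticed until now.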

What's Next: First Draft of Your Life

The next milestone is the product's first real proof point: connect Apple Photos, let the system process your library, and see a "first draft of your life" — key people, recurring places, life periods, and relationship clusters, all inferred from data you already have.

This requires building out the entity infrastructure (people, places, events, eras as first-class objects), redesigning the background processing pipeline, and having Claude Code "dreamer" sessions that synthesize meaning from structured data.

The detailed specs for all of this are already written. Implementation starts after this release.

Try It

Free. Local. Private. Your life, structured for AI.

macOS 15+ required. Apple Silicon recommended.

Questions? Found a bug? Reach out.