Changelog

What's new in each release

Version 0.7.0 — Graph Intelligence

April 2026

  • Graph-based community detection: Louvain algorithm discovers knowledge domains from the memory link graph — zero LLM cost, automatic during nightly consolidation.
  • Hub node detection: highly connected memories get a strength boost, surfacing foundational insights in lesson selection.
  • New MCP tool: hicortex_graph enables graph traversal — find connected memories, hub nodes, or shortest paths between memories.
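
The hub-boost idea is simple enough to sketch. The degree threshold, multiplier, and data shapes below are illustrative assumptions, not the client's actual values:

```python
from collections import defaultdict

def boost_hub_memories(links, strengths, min_degree=5, boost=1.2):
    """Boost the strength of highly connected ("hub") memories.

    `links` is an iterable of (memory_id, memory_id) pairs from the
    link graph; `strengths` maps memory ids to their current strength.
    The threshold and multiplier here are illustrative defaults.
    """
    degree = defaultdict(int)
    for a, b in links:
        degree[a] += 1
        degree[b] += 1
    return {m: s * boost if degree[m] >= min_degree else s
            for m, s in strengths.items()}

# m1 is linked to five other memories, so it crosses the hub threshold.
links = [("m1", f"m{i}") for i in range(2, 7)]
print(boost_hub_memories(links, {"m1": 1.0, "m2": 1.0}))
# → {'m1': 1.2, 'm2': 1.0}
```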

Version 0.6.0 — Knowledge Domain Routing

April 2026

  • Knowledge domain routing: memories are automatically grouped into knowledge domains during nightly consolidation. Agents see a structured domain index instead of a flat project list.
  • Domain-aware lesson selection (Pro): lessons from related projects in the same domain are now boosted during injection, so deployment lessons learned in one repo carry over to its sibling repos.
  • New MCP tool: hicortex_index lets agents query what knowledge domains exist before searching.
  • Configurable injection token budget via moduleIndexTokenBudget in config.
  • Schema migration: new domain column on memories (automatic on upgrade).
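
A minimal config fragment setting the new budget might look like the following; the value shown is an arbitrary example, and the surrounding file layout is assumed:

```json
{
  "moduleIndexTokenBudget": 800
}
```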

Version 0.5.3 — Security & Quality

April 2026

  • Pre-ingestion redaction: API keys, tokens, paths, and secrets are automatically scrubbed from session transcripts before they reach the LLM or storage. 12 default patterns, configurable extras.
  • Contradiction detection: new lessons are checked against existing ones during nightly reflection. Contradicting lessons are suppressed to prevent wrong lessons from reinforcing themselves.
  • Anonymous telemetry for usage tracking (opt-out via config).
  • Pre-flight health check for the reflect endpoint (skips reflection when MBP is offline).
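
Pre-ingestion redaction can be sketched as a pass of regex substitutions over the transcript. The patterns below are an illustrative subset, not the client's actual 12 defaults:

```python
import re

# Illustrative patterns only; the shipped client uses 12 defaults
# plus configurable extras, and these regexes are not its exact ones.
REDACTION_PATTERNS = [
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "[REDACTED_API_KEY]"),
    (re.compile(r"(?i)bearer\s+[A-Za-z0-9._\-]+"), "[REDACTED_TOKEN]"),
    (re.compile(r"/Users/[^\s/]+"), "/Users/[REDACTED]"),
]

def redact(transcript: str) -> str:
    """Scrub secrets from a transcript before it reaches the LLM or disk."""
    for pattern, replacement in REDACTION_PATTERNS:
        transcript = pattern.sub(replacement, transcript)
    return transcript

print(redact("curl -H 'Authorization: Bearer abc.def' https://api"))
# → curl -H 'Authorization: [REDACTED_TOKEN]' https://api
```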

Version 0.5.0 — Pi Agent Support

April 2026

  • Pi coding agent support — nightly distillation reads Pi session transcripts alongside Claude Code sessions
  • MCP tools work via pi-mcp-adapter connecting to the Hicortex SSE endpoint
  • Configurable lesson injection target: set lessonTarget in config to inject into any file (e.g., .pi/EXPERIENCE.md for Pi agents, CLAUDE.md for CC)
  • Pro extension infrastructure: the client can download and load commercial Pro packages at runtime
  • Package directory renamed from openclaw-plugin to hicortex
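
For example, pointing injection at a Pi agent's experience file might look like this (the key name comes from the release notes above; the surrounding config layout is assumed):

```json
{
  "lessonTarget": ".pi/EXPERIENCE.md"
}
```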

Version 0.4.6 — Nightly Data-Loss Fix

April 2026

  • Fixed critical bug: nightly pipeline silently lost sessions when the distillation LLM was unreachable or the required model was missing
  • Pre-flight health check on remote Ollama distill endpoints (reachability + model loaded)
  • Per-session deduplication in server mode — makes retries idempotent
  • Last-run watermark advances only when every session in the batch processed cleanly
  • 7 new regression tests (103 total)
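
The watermark rule is worth spelling out: a single failed session keeps the old watermark so the next nightly run retries the whole window, while per-session deduplication keeps those retries idempotent. A minimal sketch, with names and shapes that are illustrative rather than the client's actual internals:

```python
def advance_watermark(last_run, sessions, process):
    """Advance the last-run watermark only if every session processed
    cleanly. `process` returns True on success; names and shapes are
    illustrative, not the client's actual internals.
    """
    all_clean = True
    newest = last_run
    for session in sessions:
        if process(session):
            newest = max(newest, session["timestamp"])
        else:
            all_clean = False
    return newest if all_clean else last_run

sessions = [{"timestamp": 5}, {"timestamp": 9}]
print(advance_watermark(0, sessions, lambda s: True))                 # → 9
print(advance_watermark(0, sessions, lambda s: s["timestamp"] != 9))  # → 0
```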

Version 0.4.5 — Public Release

April 2026

  • Hicortex client is now open source at github.com/gamaze-labs/hicortex under the MIT license
  • Pre-public hardening: scrubbed internal references, updated to current LLM model versions
  • OpenAI default updated to gpt-5.4-nano, Anthropic to claude-sonnet-4-6, Google to gemini-2.5-flash

Version 0.4.4 — Open Source Foundation

April 2026

  • Hicortex client is now MIT-licensed open source at github.com/gamaze-labs/hicortex
  • Centralized feature gating with deterministic license validation (no more "free tier during validation" race)
  • Extension interfaces for future commercial Pro features (lesson selection, validation, smart context)
  • Versioned schema migrations with transactional application
  • Stricter IP boundary: commercial Pro code lives outside the published npm package
  • 81 vitest tests covering core paths
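
Deterministic gating means the tier is resolved once, before any feature check runs, so there is no window where checks made during validation fall back to the free tier. A sketch of the shape of that fix; tier names and the feature set here are illustrative, not the actual gating code:

```python
class FeatureGate:
    """Resolve the license tier once, up front, so every later feature
    check sees the same answer (no race during async validation).
    Tier names and the feature set are illustrative."""

    PRO_FEATURES = {"lesson_selection", "validation", "smart_context"}

    def __init__(self, license_valid: bool):
        self.tier = "pro" if license_valid else "free"  # decided before first use

    def enabled(self, feature: str) -> bool:
        return self.tier == "pro" or feature not in self.PRO_FEATURES

print(FeatureGate(license_valid=False).enabled("lesson_selection"))  # → False
```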

Version 0.4 — Multi-Client Support

March 2026

  • Connect multiple clients to one shared memory server
  • Client mode: npx @gamaze/hicortex init --server <url>
  • Clients distill sessions locally (privacy), POST memories to central server
  • POST /ingest REST endpoint for remote memory ingestion
  • Auto-detect Ollama models, Claude CLI, and API keys during setup
  • Split LLM configuration: separate models for scoring, distillation, and reflection
  • New MCP tools: hicortex_update and hicortex_delete
  • Balanced learning: reflection extracts lessons from both successes and failures
  • Named relationship types for memory links (updates, derives, extends)
  • updated_at timestamp on memories for audit trail
  • Default auth token for baseline security on all endpoints
  • Route distillation to remote Ollama instance (distillBaseUrl)
  • Smart transcript chunking based on model context window
  • Auto-detect Ollama for the nightly pipeline when the Claude CLI is rate-limited
  • Dynamic CLAUDE.md injection: lessons + memory index + project context
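
In client mode, sessions are distilled locally and only the resulting memories travel to the server. A client's POST to /ingest might look roughly like the sketch below; the port, payload shape, and header names are assumptions for illustration, not the documented wire format:

```python
import json
import urllib.request

def build_ingest_request(server_url, auth_token, memories):
    """Build a POST /ingest request carrying locally distilled memories.
    The payload shape and header names are assumptions for illustration,
    not the documented wire format."""
    body = json.dumps({"memories": memories}).encode("utf-8")
    return urllib.request.Request(
        f"{server_url}/ingest",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {auth_token}",
        },
        method="POST",
    )

req = build_ingest_request(
    "http://localhost:8731", "dev-token",
    [{"text": "Deploys fail without a DB migration", "project": "api"}],
)
# urllib.request.urlopen(req) would send it; omitted here.
print(req.get_method(), req.full_url)  # → POST http://localhost:8731/ingest
```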

Version 0.3 — Claude Code Support

March 2026

  • Same @gamaze/hicortex package works with both Claude Code and OpenClaw
  • One-command install: npx @gamaze/hicortex init
  • Persistent HTTP/SSE MCP server with 4 tools (search, context, ingest, lessons)
  • Automatic daemon install (launchd on macOS, systemd on Linux)
  • Nightly pipeline: auto-capture CC transcripts, distill, consolidate, inject lessons
  • Claude CLI as LLM backend (uses your Claude subscription, no API key needed)
  • Unified DB at ~/.hicortex/ with migration from legacy OpenClaw path
  • CLAUDE.md learnings block injection
  • Custom commands: /learn, /hicortex-activate

Version 0.2 — OpenClaw Plugin

February 2026

  • Initial release as OpenClaw lifecycle plugin
  • Automatic session capture via OpenClaw hooks
  • Nightly distillation and consolidation pipeline
  • Semantic search with BM25 + vector embeddings + RRF fusion
  • Knowledge graph with memory linking
  • Memory decay and strengthening model
  • Internal memory scoring algorithm
  • Multi-provider LLM support (20+ providers)
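
Reciprocal Rank Fusion itself is simple enough to sketch: each result list contributes 1 / (k + rank) per document, and the sums decide the merged order. The function below is an illustration of the technique, not Hicortex's implementation; k=60 is the conventional damping constant:

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal Rank Fusion: merge ranked result lists by summing
    1 / (k + rank) per list. k=60 is the conventional constant."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["m3", "m1", "m7"]     # keyword ranking
vector_hits = ["m3", "m9", "m1"]   # embedding ranking
print(rrf_fuse([bm25_hits, vector_hits]))  # → ['m3', 'm1', 'm9', 'm7']
```

A document ranked well by both retrievers (m3 here) beats one ranked well by only one, which is why the fusion step needs no score normalization between BM25 and cosine similarity.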