Changelog
What's new in each release
Version 0.4.5 — Public Release
April 2026
- Hicortex client is now open source at github.com/gamaze-labs/hicortex under the MIT license
- Pre-public hardening: scrubbed internal references, updated to current LLM model versions
- OpenAI default updated to gpt-5.4-nano, Anthropic to claude-sonnet-4-6, Google to gemini-2.5-flash
Version 0.4.4 — Open Source Foundation
April 2026
- Hicortex client is now MIT-licensed open source at github.com/gamaze-labs/hicortex
- Centralized feature gating with deterministic license validation (no more "free tier during validation" race)
- Extension interfaces for future commercial Pro features (lesson selection, validation, smart context)
- Versioned schema migrations with transactional application
- Stricter IP boundary: commercial Pro code lives outside the published npm package
- 81 vitest tests covering core paths
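The versioned, transactional migration scheme can be sketched as follows. The `Db` and `Migration` shapes here are illustrative stand-ins, not the published hicortex API; the idea is simply that each migration targets a schema version, pending migrations apply in order, and each one runs inside a transaction so a failure leaves the schema untouched:

```typescript
// Minimal interfaces standing in for the real database layer.
interface Db {
  version: number;                    // current schema version (PRAGMA user_version-style)
  exec(sql: string): void;            // apply one DDL statement
  transaction(fn: () => void): void;  // all-or-nothing application
}

interface Migration {
  version: number; // schema version this migration produces
  up: string[];    // DDL statements to apply
}

// Apply every migration newer than the DB's current version, in ascending
// order, bumping the version atomically with the DDL in each transaction.
function migrate(db: Db, migrations: Migration[]): number {
  const pending = migrations
    .filter((m) => m.version > db.version)
    .sort((a, b) => a.version - b.version);
  for (const m of pending) {
    db.transaction(() => {
      for (const stmt of m.up) db.exec(stmt);
      db.version = m.version;
    });
  }
  return db.version;
}
```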
Version 0.4 — Multi-Client Support
March 2026
- Connect multiple clients to one shared memory server
- Client mode: `npx @gamaze/hicortex init --server <url>`
- Clients distill sessions locally (privacy), POST memories to the central server
- `POST /ingest` REST endpoint for remote memory ingestion
- Auto-detect Ollama models, Claude CLI, and API keys during setup
- Split LLM configuration: separate models for scoring, distillation, and reflection
- New MCP tools: `hicortex_update` and `hicortex_delete`
- Balanced learning: reflection extracts lessons from both successes and failures
- Named relationship types for memory links (updates, derives, extends)
- `updated_at` timestamp on memories for audit trail
- Default auth token for baseline security on all endpoints
- Route distillation to a remote Ollama instance (`distillBaseUrl`)
- Smart transcript chunking based on model context window
- Auto-detect Ollama as a fallback for the nightly pipeline when the Claude CLI is rate-limited
- Dynamic CLAUDE.md injection: lessons + memory index + project context
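A client-mode ingest call might look like the sketch below. The `DistilledMemory` shape, the JSON payload, and the bearer-token header are assumptions for illustration, not the documented wire format; only the `POST /ingest` route and the default auth token come from this changelog:

```typescript
// Hypothetical shape of a locally distilled memory.
interface DistilledMemory {
  content: string;
  score: number;
  source: string;
}

// Build the request a client would send after local distillation; only the
// distilled summary leaves the machine, never the raw session transcript.
function buildIngestRequest(
  serverUrl: string,
  token: string,
  memories: DistilledMemory[],
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: new URL("/ingest", serverUrl).toString(),
    init: {
      method: "POST",
      headers: {
        "content-type": "application/json",
        authorization: `Bearer ${token}`, // default auth token guards all endpoints
      },
      body: JSON.stringify({ memories }),
    },
  };
}

// Usage (network call elided in this sketch):
// const { url, init } = buildIngestRequest("http://memory.example:8080", token, batch);
// await fetch(url, init);
```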
Version 0.3 — Claude Code Support
March 2026
- Same `@gamaze/hicortex` package works with both Claude Code and OpenClaw
- One-command install: `npx @gamaze/hicortex init`
- Persistent HTTP/SSE MCP server with 4 tools (search, context, ingest, lessons)
- Automatic daemon install (launchd on macOS, systemd on Linux)
- Nightly pipeline: auto-capture Claude Code transcripts, distill, consolidate, inject lessons
- Claude CLI as LLM backend (uses your Claude subscription, no API key needed)
- Unified DB at `~/.hicortex/` with migration from the legacy OpenClaw path
- CLAUDE.md learnings block injection
- Custom commands: `/learn`, `/hicortex-activate`
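MCP is JSON-RPC 2.0 under the hood, and a tool invocation is a `tools/call` request. A sketch of what calling one of the four tools could look like; the `hicortex_*` tool names and argument shapes are assumed for illustration (the changelog only names the tools as search, context, ingest, lessons):

```typescript
// Build a JSON-RPC 2.0 "tools/call" request for the MCP server.
function buildToolCall(
  id: number,
  tool: "hicortex_search" | "hicortex_context" | "hicortex_ingest" | "hicortex_lessons",
  args: Record<string, unknown>,
) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// e.g. searching memories from Claude Code or OpenClaw:
const req = buildToolCall(1, "hicortex_search", { query: "sqlite migration", limit: 5 });
```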
Version 0.2 — OpenClaw Plugin
February 2026
- Initial release as OpenClaw lifecycle plugin
- Automatic session capture via OpenClaw hooks
- Nightly distillation and consolidation pipeline
- Semantic search with BM25 + vector embeddings + RRF fusion
- Knowledge graph with memory linking
- Memory decay and strengthening model
- Internal memory scoring algorithm
- Multi-provider LLM support (20+ providers)
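Reciprocal Rank Fusion itself is simple to sketch: each document scores the sum of `1 / (k + rank)` across the BM25 and vector rankings. The constant `k = 60` is the value commonly used in the RRF literature; hicortex's actual parameters are not stated in this changelog:

```typescript
// Fuse several ranked result lists (e.g. BM25 and vector search) with
// Reciprocal Rank Fusion: score(d) = sum over lists of 1 / (k + rank(d)).
function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((doc, i) => {
      // rank is 1-based, so the i-th entry contributes 1 / (k + i + 1)
      scores.set(doc, (scores.get(doc) ?? 0) + 1 / (k + i + 1));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([doc]) => doc);
}
```

Documents appearing near the top of several rankings outrank a document that tops only one list, which is what makes the fusion robust to either retriever's blind spots.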