Installation
One command. Zero configuration.
Requirements
LLM and embeddings are handled automatically. No Python required.
Server Mode (Local Database)
Run Hicortex with a local SQLite database. All memories stay on your machine.
npx @gamaze/hicortex init
This command:
- Detects your Claude Code or OpenClaw installation
- Creates config at ~/.hicortex/config.json
- Registers the MCP server with your agent
- Installs the background daemon for nightly processing
- Downloads the embedding model (ONNX, ~30MB)
Your database lives at ~/.hicortex/hicortex.db and is auto-migrated from ~/.openclaw/data/ if you used a previous version.
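The generated config is plain JSON. A minimal sketch of what ~/.hicortex/config.json might contain, assuming a local server-mode setup (the key names below are illustrative, not the actual schema):

```json
{
  "mode": "server",
  "database": "~/.hicortex/hicortex.db",
  "embeddingModel": "onnx-local",
  "daemon": { "schedule": "nightly" }
}
```

Re-running init rewrites this file in place, so manual edits should be limited to fields the Configuration page documents.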
Client Mode (Remote Server)
Connect to a central Hicortex server. Clients distill transcripts locally and POST memories to the server.
npx @gamaze/hicortex init --server https://your-server:8787
Use this for multi-machine setups where you want a single shared memory across all your machines.
Distilled memories are uploaded via POST /ingest, and retrieval queries are sent to the server too.
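For illustration, a distilled memory sent to POST /ingest might look like the payload below. Only the endpoint path comes from this page; the field names are hypothetical:

```json
{
  "memory": "User prefers strict TypeScript settings in this repo",
  "source": "session-transcript",
  "timestamp": "2025-01-15T02:00:00Z"
}
```

The point of this split is that raw transcripts never leave the client machine; only the distilled memory text crosses the network.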
OpenClaw
If you use OpenClaw as your agent gateway:
openclaw plugins install @gamaze/hicortex
openclaw gateway restart
The plugin auto-configures via OpenClaw hooks. No separate init needed.
Verify Installation
npx @gamaze/hicortex status
You should see:
- Mode: server or client
- Database path or server URL
- Memory count
- Daemon status
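In server mode, the status output might look roughly like this (illustrative values, not the exact format):

```
Mode:      server
Database:  ~/.hicortex/hicortex.db
Memories:  142
Daemon:    running
```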
Or ask your agent: "What Hicortex tools do you have?"
It should list: hicortex_search, hicortex_context, hicortex_ingest, hicortex_lessons, hicortex_update, hicortex_delete.
Activate License (Optional)
Free tier includes 250 memories with full features. To upgrade:
- Get a key at hicortex.gamaze.com
- Tell your agent: "Activate my Hicortex key: hctx-your-key"
Your agent runs the activation tool and restarts. Done.
Updating
npx @gamaze/hicortex init
Re-running init updates to the latest version. Your memories, config, and license are preserved.
Next Steps
- Configuration — LLM providers, auth, license
- Usage — MCP tools, /learn command, nightly pipeline
- Upgrading — Pricing tiers and license activation