Installation

One command. Zero configuration.

Requirements

  • Node.js 18+
  • Claude Code or OpenClaw installed

LLM and embeddings are handled automatically. No Python required.
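To confirm the Node.js requirement before installing, you can parse the output of `node --version`. This is a minimal sketch; `check_node_major` is a helper written here for illustration, not a Hicortex command:

```shell
# Extract the major version from a `node --version` string, e.g. "v20.11.1" -> 20.
check_node_major() {
  v=${1#v}            # drop the leading "v"
  echo "${v%%.*}"     # keep everything before the first dot
}

check_node_major "v20.11.1"   # prints: 20
# In practice, compare against the requirement:
#   [ "$(check_node_major "$(node --version)")" -ge 18 ] || echo "Node.js 18+ required"
```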

Server Mode (Local Database)

Run Hicortex with a local SQLite database. All memories stay on your machine.

npx @gamaze/hicortex init

This command:

  • Detects your Claude Code or OpenClaw installation
  • Creates config at ~/.hicortex/config.json
  • Registers the MCP server with your agent
  • Installs the background daemon for nightly processing
  • Downloads the embedding model (ONNX, ~30MB)

Database location: ~/.hicortex/hicortex.db. If you used a previous version, your data is auto-migrated from ~/.openclaw/data/.
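After init completes, you can sanity-check that the files it creates are in place. The helper below is a hypothetical convenience for this doc, not part of the Hicortex CLI; the paths come from the steps above:

```shell
# Report whether each expected init artifact exists on disk.
check_artifacts() {
  for f in "$@"; do
    if [ -e "$f" ]; then
      echo "found:   $f"
    else
      echo "missing: $f"
    fi
  done
}

check_artifacts "$HOME/.hicortex/config.json" "$HOME/.hicortex/hicortex.db"
```

If anything is reported missing, re-run `npx @gamaze/hicortex init`.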

Client Mode (Remote Server)

Connect to a central Hicortex server. Clients distill transcripts locally and POST memories to the server.

npx @gamaze/hicortex init --server https://your-server:8787

Use this for multi-machine setups that should share a single memory store.

How it works: The client runs the nightly pipeline locally (scanning transcripts, distilling with your local LLM), then sends the resulting memories to the central server via POST /ingest. Retrieval queries are sent to the server too.
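As a rough sketch of the client-to-server hand-off: only the POST /ingest endpoint comes from the text above; the payload field names (`memory`, `source`) are assumptions for illustration:

```shell
# Build a JSON body for one distilled memory.
# Field names are hypothetical; no JSON escaping is done (illustration only).
make_ingest_body() {
  printf '{"memory":"%s","source":"%s"}' "$1" "$2"
}

body=$(make_ingest_body "User prefers strict TypeScript settings" "session-2024-05-01")
echo "$body"
# The client would then POST it to the central server, e.g.:
#   curl -sS -X POST https://your-server:8787/ingest \
#        -H 'Content-Type: application/json' -d "$body"
```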

OpenClaw

If you use OpenClaw as your agent gateway:

openclaw plugins install @gamaze/hicortex
openclaw gateway restart

The plugin auto-configures via OpenClaw hooks. No separate init needed.

Verify Installation

npx @gamaze/hicortex status

You should see:

  • Mode: server or client
  • Database path or server URL
  • Memory count
  • Daemon status

Or ask your agent: "What Hicortex tools do you have?"

It should list: hicortex_search, hicortex_context, hicortex_ingest, hicortex_lessons, hicortex_update, hicortex_delete.

Activate License (Optional)

The free tier includes up to 250 memories with full features. To upgrade:

  1. Get a key at hicortex.gamaze.com
  2. Tell your agent: "Activate my Hicortex key: hctx-your-key"

Your agent runs the activation tool and restarts. Done.

Updating

npx @gamaze/hicortex init

Re-running init updates to the latest version. Your memories, config, and license are preserved.

Next Steps

  • Configuration — LLM providers, auth, license
  • Usage — MCP tools, /learn command, nightly pipeline
  • Upgrading — Pricing tiers and license activation