feat: add setup, configuration, and usage guide to README

2026-02-20 15:28:28 -06:00
parent 0f81bb77d5
commit 54de10a8cc


@@ -1,6 +1,95 @@
# InternalAI Workspace
Agent-assisted workspace to work on your own data with InternalAI (ContactDB / DataIndex).
## Setup
### Prerequisites
- [Greywall](https://gitea.app.monadical.io/monadical/greywall) installed — verify with `greywall --version`
- [OpenCode](https://opencode.ai) installed as a native binary (not a wrapper via bun/npm/pnpm)
### Greywall sandbox template
Run OpenCode in learning mode so Greywall can observe which files it reads and writes:
```
greywall --learning -- opencode
```
Interact briefly, then exit OpenCode. Greywall generates a sandbox template based on the observed filesystem access. Edit the template if needed.
### MCP configuration
Add the ContactDB and DataIndex MCP servers:
```
greywall -- opencode mcp add
```
Run the command twice with these settings:
| Name | Type | URL | OAuth |
|------|------|-----|-------|
| `contactdb` | Remote MCP | `http://caddy/contactdb-api/mcp/` | No |
| `dataindex` | Remote MCP | `http://caddy/dataindex/mcp/` | No |
Verify the servers are registered:
```
greywall -- opencode mcp list
```
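If you prefer declaring the servers in config rather than via the interactive `mcp add` flow, the equivalent entries can live in `opencode.json`. This is a sketch assuming OpenCode's `mcp` config block with `type` and `url` fields; verify the exact schema against your OpenCode version:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "contactdb": {
      "type": "remote",
      "url": "http://caddy/contactdb-api/mcp/"
    },
    "dataindex": {
      "type": "remote",
      "url": "http://caddy/dataindex/mcp/"
    }
  }
}
```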
Then open your proxy at `http://localhost:42000/proxy` and allow access to Caddy.
### LiteLLM provider
Add a `litellm` provider in `opencode.json`:
```json
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"litellm": {
"npm": "@ai-sdk/openai-compatible",
"name": "Litellm",
"options": {
"baseURL": "https://litellm-notrack.app.monadical.io",
"apiKey": "sk-xxxxx"
},
"models": {
"Kimi-K2.5-sandbox": {
"name": "Kimi-K2.5-sandbox"
}
}
}
}
}
```
Replace `apiKey` with your own key (check 1Password for "litellm notrack").
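To check the key outside OpenCode, you can query the provider directly. This sketch assumes the endpoint exposes the standard OpenAI-compatible `/v1/models` route with Bearer authentication (replace `sk-xxxxx` with your key); a JSON list of model IDs means the key and URL are good:
```
curl -s https://litellm-notrack.app.monadical.io/v1/models \
  -H "Authorization: Bearer sk-xxxxx"
```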
## Usage
Start OpenCode inside the Greywall sandbox:
```
greywall -- opencode
```
### First-run checklist
1. Select the Kimi K2.5 model under litellm in `/models` — type "hello" to confirm it responds (if not, check the proxy)
2. Test ContactDB access — ask "who am I?" (should trigger `get_me`)
3. Test DataIndex access — ask "what was my last meeting about?"
### Things you can do
- **Onboard yourself** — `can you onboard me?` creates your `MYSELF.md`
- **Weekly checkout** — `create my checkout of last week` builds a summary from your activity
- **Data analysis** — `create a workflow that searches all meetings since 2024 where Max is listed as a participant (not a contactdb), and output as csv` creates a marimo notebook in `workflows/`
- **Init a project** — `create the creatrix project` creates `projects/creatrix/` with base information
- **Sync a project** — `sync the creatrix project` runs a full 1-year analysis on the first run, then incremental syncs afterward, producing a live `project.md` document
## Skills