# InternalAI Agent
A documentation and pattern library that gives LLM agents the context they need to build data analysis workflows against Monadical's internal systems — ContactDB (people directory) and DataIndex (unified data from email, calendar, Zulip, meetings, documents).
The goal is to use opencode (or any LLM-powered coding tool) to iteratively create marimo notebook workflows that query and analyze company data.
## Setup
- Install `opencode`
- Make sure InternalAI is running locally (ContactDB + DataIndex accessible via `http://localhost:42000`)
- Configure LiteLLM: add the provider to `~/.config/opencode/config.json`:

  ```json
  {
    "$schema": "https://opencode.ai/config.json",
    "provider": {
      "litellm": {
        "npm": "@ai-sdk/openai-compatible",
        "name": "Litellm",
        "options": {
          "baseURL": "https://litellm.app.monadical.io",
          "apiKey": "xxxxx"
        },
        "models": {
          "Kimi-K2.5-dev": {
            "name": "Kimi-K2.5-dev"
          }
        }
      }
    }
  }
  ```

  Replace `xxxxx` with your actual LiteLLM API key.
- (Optional) LLM filtering in workflows: if your workflows need to classify or score entities via an LLM, copy `.env.example` to `.env` and fill in your key:

  ```shell
  cp .env.example .env
  ```

  The `workflows/lib` module provides an `llm_call` helper (using mirascope) for structured LLM calls; see Pattern 5 in `docs/notebook-patterns.md`.
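The idea behind a structured LLM call is that the model is asked for a machine-parseable verdict rather than free text. The real helper lives in `workflows/lib/llm.py` and delegates validation to mirascope's response models; the stdlib-only sketch below only illustrates the shape of that contract, and the `Relevance` type, `parse_llm_reply` function, and canned reply are hypothetical names invented for this example.

```python
import json
from dataclasses import dataclass

@dataclass
class Relevance:
    """Structured verdict an LLM might return when filtering entities (Pattern 5)."""
    relevant: bool
    reason: str

def parse_llm_reply(raw: str) -> Relevance:
    """Validate a raw JSON reply from the model into a typed result.

    This is a stdlib stand-in for the validation that the real llm_call
    helper performs via mirascope; it is not the project's actual API.
    """
    data = json.loads(raw)
    return Relevance(relevant=bool(data["relevant"]), reason=str(data["reason"]))

# Canned reply instead of a live LLM call, just to show the round trip:
verdict = parse_llm_reply('{"relevant": true, "reason": "Mentions Greyhaven kickoff"}')
```

Typed results like this are what make LLM filtering composable inside a notebook: downstream cells can branch on `verdict.relevant` instead of re-parsing prose.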
## Quickstart
- Run `opencode` from the project root
- Ask it to create a workflow, e.g.: "Create a workflow that shows all meetings about Greyhaven in January"
- The agent reads `AGENTS.md`, proposes a plan, and generates a notebook like `workflows/001_greyhaven_meetings_january.py`
- Run it:

  ```shell
  uvx marimo edit workflows/001_greyhaven_meetings_january.py
  ```

- Iterate: review the output in marimo, then go back to opencode and ask for refinements
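A generated notebook ultimately queries the local InternalAI instance over HTTP. As a rough sketch of what a query-building cell might do, the snippet below assembles a DataIndex search URL; note that the `/search` route and its parameter names are illustrative placeholders, since the real routes are documented in `docs/dataindex-api.md`.

```python
from urllib.parse import urlencode

# Local InternalAI instance from the Setup section.
DATAINDEX_BASE = "http://localhost:42000"

def search_url(entity_type: str, query: str, **params: str) -> str:
    """Build a DataIndex search URL.

    The '/search' path and the 'type'/'q' parameter names are assumptions
    made for illustration; consult docs/dataindex-api.md for the real API.
    """
    qs = urlencode({"type": entity_type, "q": query, **params})
    return f"{DATAINDEX_BASE}/search?{qs}"

url = search_url("meeting", "Greyhaven")
```

Keeping URL construction in one helper like this makes it easy for the agent to swap in the correct route once it has read the API docs.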
## How AGENTS.md is Structured
AGENTS.md is the entry point that opencode reads automatically. It routes the agent to the right documentation:
| Topic | File |
|---|---|
| Company context, tools, connectors | `docs/company-context.md` |
| People, contacts, relationships | `docs/contactdb-api.md` |
| Querying emails, meetings, chats, docs | `docs/dataindex-api.md` |
| Connector-to-entity-type mappings | `docs/connectors-and-sources.md` |
| Notebook templates and patterns | `docs/notebook-patterns.md` |
It also includes API base URLs, a translation table mapping natural-language questions to API calls, and rules for when/how to create workflow notebooks.
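Conceptually, that translation table is a lookup from question templates to API call patterns. The toy version below conveys the idea; the two entries are invented for illustration and are not copied from `AGENTS.md`.

```python
# Hypothetical miniature of the natural-language -> API-call translation
# table that AGENTS.md carries; entries here are illustrative only.
TRANSLATION_TABLE = {
    "who works with <person>?": "ContactDB: look up <person>, then their relationships",
    "meetings about <topic> in <month>": "DataIndex: search meeting entities for <topic>, filter by <month>",
}

def route(question_template: str) -> str:
    """Return the API call pattern the agent should follow for a template."""
    return TRANSLATION_TABLE[question_template]
```

The agent uses this mapping as a first routing step before opening the detailed API reference for the system involved.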
## Project Structure
```
internalai-agent/
├── AGENTS.md                       # LLM agent routing guide (entry point)
├── .env.example                    # LLM credentials template
├── docs/
│   ├── company-context.md          # Monadical org, tools, key concepts
│   ├── contactdb-api.md            # ContactDB REST API reference
│   ├── dataindex-api.md            # DataIndex REST API reference
│   ├── connectors-and-sources.md   # Connector → entity type mappings
│   └── notebook-patterns.md        # Marimo notebook templates and patterns
└── workflows/
    └── lib/                        # Shared helpers for notebooks
        ├── __init__.py
        └── llm.py                  # llm_call() — structured LLM calls via mirascope
```