# Website
This website is built using Docusaurus, a modern static website generator.
### Installation

```
$ yarn
```
### Local Development

```
$ yarn start
```
This command starts a local development server and opens a browser window. Most changes are reflected live without restarting the server.
### Build

```
$ yarn build
```
This command generates static content into the `build` directory, which can be served by any static content hosting service.
### Deployment

Using SSH:

```
$ USE_SSH=true yarn deploy
```

Not using SSH:

```
$ GIT_USER=<Your GitHub username> yarn deploy
```
If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push to the `gh-pages` branch.
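The deploy command reads its target repository and branch from `docusaurus.config.js`. A minimal sketch of the deployment-related fields, assuming a standard GitHub Pages setup (the organization, repository, and URL values below are placeholders, not this project's actual settings):

```javascript
// docusaurus.config.js — deployment-related fields only.
// 'your-org' and 'your-repo' are placeholder values; substitute your
// own GitHub organization/user and repository names.
module.exports = {
  url: 'https://your-org.github.io', // site URL served by GitHub Pages
  baseUrl: '/your-repo/',            // path the site is served under
  organizationName: 'your-org',      // GitHub org/user that owns the repo
  projectName: 'your-repo',          // repository to push to
  deploymentBranch: 'gh-pages',      // branch `yarn deploy` pushes to
  trailingSlash: false,
};
```

With these fields set, `yarn deploy` builds the site and pushes the `build` output to the configured branch.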