Reflector
The Reflector server currently handles audio transcription and summarization. The project is moving fast, so the documentation may be unstable or outdated.
Server
We currently use oobabooga as an LLM backend.
Using Docker
Create a .env file with:
LLM_URL=http://IP:PORT/api/v1/generate
Then start with:
$ docker-compose up
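For reference, here is a minimal sketch of how a client could call the configured LLM_URL, assuming the oobabooga /api/v1/generate JSON API; the generate helper and the example prompt are illustrative, not the actual Reflector code:

    # Sketch only: querying the oobabooga backend configured via LLM_URL.
    import os

    import httpx

    # e.g. LLM_URL=http://IP:PORT/api/v1/generate from the .env file
    LLM_URL = os.environ["LLM_URL"]

    def generate(prompt: str, max_new_tokens: int = 200) -> str:
        # The oobabooga generate endpoint takes a JSON body with the prompt
        # and generation parameters, and returns {"results": [{"text": ...}]}.
        response = httpx.post(
            LLM_URL,
            json={"prompt": prompt, "max_new_tokens": max_new_tokens},
            timeout=60.0,
        )
        response.raise_for_status()
        return response.json()["results"][0]["text"]

    if __name__ == "__main__":
        print(generate("Summarize: the meeting covered the Q3 roadmap."))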