Reflector
For now, the Reflector server is responsible for audio transcription and summarization. The project is moving fast, and the documentation is currently unstable and outdated.
Server
We currently use oobabooga as an LLM backend.
Using Docker
Create a .env file with:
LLM_URL=http://IP:PORT/api/v1/generate
Then start with:
$ docker-compose up
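To illustrate how LLM_URL is consumed, here is a minimal sketch of calling the configured endpoint from Python. The payload fields and response shape assume oobabooga's legacy /api/v1/generate API; they are not taken from this repository, and the helper name is hypothetical.

```python
# Minimal sketch of calling the oobabooga generate endpoint via LLM_URL.
# Assumes the .env value above is exported into the environment; the
# request/response format follows oobabooga's legacy API (an assumption).
import os

import httpx

LLM_URL = os.environ["LLM_URL"]  # e.g. http://IP:PORT/api/v1/generate


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    payload = {"prompt": prompt, "max_new_tokens": max_new_tokens}
    response = httpx.post(LLM_URL, json=payload, timeout=60)
    response.raise_for_status()
    # Legacy oobabooga responses look like {"results": [{"text": "..."}]}
    return response.json()["results"][0]["text"]


if __name__ == "__main__":
    print(generate("Summarize: the meeting covered Q3 planning."))
```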