- Import transcripts_chat router
- Register /v1/transcripts/{id}/chat endpoint
- Completes LLM streaming integration (fn-1.3)
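Registering the route is a one-liner; a minimal sketch assuming FastAPI conventions (the module path and variable names here are illustrative, not the repository's actual layout):

```python
from fastapi import FastAPI

# Assumed import path for the new router module described in the commit.
from reflector.views import transcripts_chat

app = FastAPI()
app.include_router(transcripts_chat.router)  # exposes /v1/transcripts/{id}/chat
```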
fn-1.3 LLM streaming integration
Description
TBD
Acceptance
- TBD
Done summary
- Added LLM streaming integration to transcript chat WebSocket endpoint
- Configured LLM with temperature 0.7 using llama-index Settings
- Built system message with WebVTT transcript context (15k char limit)
- Implemented conversation history management with ChatMessage objects
- Stream LLM responses using Settings.llm.astream_chat()
- Send tokens incrementally via WebSocket 'token' messages
- Added 'done' message after streaming completes
- Error handling with 'error' message type
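A minimal sketch of that handler, assuming FastAPI and the documented llama-index chat API; get_transcript_webvtt, the OpenAI backend, and the exact payload shape are illustrative assumptions, not the repository's actual code:

```python
from fastapi import APIRouter, WebSocket, WebSocketDisconnect
from llama_index.core import Settings
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.llms.openai import OpenAI

# Task summary: LLM configured with temperature 0.7 via Settings
# (the OpenAI backend chosen here is an assumption).
Settings.llm = OpenAI(model="gpt-4o-mini", temperature=0.7)

router = APIRouter()

MAX_CONTEXT_CHARS = 15_000  # WebVTT context limit from the summary


async def get_transcript_webvtt(transcript_id: str) -> str:
    """Hypothetical stand-in for loading the transcript as WebVTT text."""
    return "WEBVTT\n\n00:00.000 --> 00:05.000\nHello from the meeting."


@router.websocket("/v1/transcripts/{transcript_id}/chat")
async def transcript_chat(websocket: WebSocket, transcript_id: str) -> None:
    await websocket.accept()

    # The system message carries the transcript context, truncated to the limit.
    webvtt = await get_transcript_webvtt(transcript_id)
    history: list[ChatMessage] = [
        ChatMessage(
            role=MessageRole.SYSTEM,
            content=f"Answer questions about this transcript:\n{webvtt[:MAX_CONTEXT_CHARS]}",
        )
    ]

    try:
        while True:
            # Inbound protocol: {"type": "message", "content": "..."}.
            data = await websocket.receive_json()
            history.append(ChatMessage(role=MessageRole.USER, content=data["content"]))

            # astream_chat returns an async generator of ChatResponse chunks;
            # each chunk.delta is the newly generated text.
            stream = await Settings.llm.astream_chat(history)
            reply = ""
            async for chunk in stream:
                if chunk.delta:
                    reply += chunk.delta
                    await websocket.send_json({"type": "token", "content": chunk.delta})
            await websocket.send_json({"type": "done"})

            # Keep the assistant turn so later questions have full history.
            history.append(ChatMessage(role=MessageRole.ASSISTANT, content=reply))
    except WebSocketDisconnect:
        pass
    except Exception as exc:
        # Surface failures to the client per the protocol's 'error' type.
        await websocket.send_json({"type": "error", "content": str(exc)})
```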
Verification
- Code matches task spec requirements
- WebSocket message protocol implemented (message/token/done/error)
- Route registered in app.py
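For reference, a hedged client-side walkthrough of the message protocol, using the third-party websockets package; the URL, transcript id, and payload shape are assumptions:

```python
import asyncio
import json

import websockets


async def main() -> None:
    uri = "ws://localhost:8000/v1/transcripts/abc123/chat"  # assumed host/id
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps({"type": "message", "content": "Summarize the meeting."}))
        while True:
            msg = json.loads(await ws.recv())
            if msg["type"] == "token":
                print(msg["content"], end="", flush=True)  # incremental tokens
            elif msg["type"] == "done":
                break  # one full assistant turn received
            elif msg["type"] == "error":
                raise RuntimeError(msg["content"])


asyncio.run(main())
```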
Evidence
- Commits: ae85f5d3
- Tests:
- PRs: