reflector/server/tests
commit 0b5112cabc by Igor Loskutov, 2026-01-12 18:28:43 -05:00
feat: add LLM streaming integration to transcript chat

Task 3: LLM Streaming Integration

- Import Settings, ChatMessage, MessageRole from llama-index
- Configure LLM with temperature 0.7 on connection
- Build system message with WebVTT transcript context (max 15k chars)
- Initialize conversation history with system message
- Handle 'message' type from client to trigger LLM streaming
- Stream LLM response using Settings.llm.astream_chat() (see the sketch after this message)
- Send tokens incrementally via 'token' messages
- Send 'done' message when streaming completes
- Maintain conversation history across multiple messages
- Add error handling with 'error' message type
- Add message protocol validation test

Implements Tasks 3 & 4 from TASKS.md
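
A minimal sketch of the handler the bullets above describe, assuming a FastAPI WebSocket endpoint and that Settings.llm is configured at startup. The function name, MAX_TRANSCRIPT_CHARS, and the exact wire-format field names are illustrative assumptions, not taken from the repository.

```python
# Hypothetical sketch only. The commit configures the LLM with
# temperature 0.7 on connection; how that is done depends on the
# backend (e.g. OpenAI(temperature=0.7) assigned to Settings.llm).
import json

from fastapi import WebSocket, WebSocketDisconnect
from llama_index.core import Settings
from llama_index.core.llms import ChatMessage, MessageRole

MAX_TRANSCRIPT_CHARS = 15_000  # illustrative name; commit caps context at 15k chars


async def transcript_chat(websocket: WebSocket, webvtt: str) -> None:
    await websocket.accept()

    # Initialize conversation history with the WebVTT transcript as system context.
    history: list[ChatMessage] = [
        ChatMessage(
            role=MessageRole.SYSTEM,
            content="Answer questions about this transcript:\n"
            + webvtt[:MAX_TRANSCRIPT_CHARS],
        )
    ]

    try:
        while True:
            data = json.loads(await websocket.receive_text())
            if data.get("type") != "message":
                continue  # only 'message' triggers LLM streaming

            history.append(
                ChatMessage(role=MessageRole.USER, content=data["content"])
            )

            # Stream the response, forwarding tokens incrementally.
            reply = ""
            stream = await Settings.llm.astream_chat(history)
            async for chunk in stream:
                token = chunk.delta or ""
                reply += token
                await websocket.send_json({"type": "token", "content": token})
            await websocket.send_json({"type": "done"})

            # Keep the assistant turn so follow-up questions have context.
            history.append(
                ChatMessage(role=MessageRole.ASSISTANT, content=reply)
            )
    except WebSocketDisconnect:
        pass
    except Exception as exc:  # surface failures to the client per the commit
        await websocket.send_json({"type": "error", "message": str(exc)})
```

Under these assumptions, one client turn on the wire is a single {"type": "message", "content": "..."} frame answered by a stream of "token" frames and a closing "done" (or an "error" frame on failure); the field names here are guesses at the protocol the validation test exercises.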