reflector/.flow/tasks/fn-1.3.md
Igor Loskutov b461ebb488 feat: register transcript chat WebSocket route
- Import transcripts_chat router
- Register /v1/transcripts/{id}/chat endpoint
- Completes LLM streaming integration (fn-1.3)
2026-01-12 18:41:11 -05:00

fn-1.3 LLM streaming integration

Description

Stream LLM chat responses over the transcript chat WebSocket endpoint (/v1/transcripts/{id}/chat), sending tokens incrementally as they are generated.

Acceptance

  • TBD

Done summary

  • Added LLM streaming integration to transcript chat WebSocket endpoint
  • Configured LLM with temperature 0.7 using llama-index Settings
  • Built system message with WebVTT transcript context (15k char limit)
  • Implemented conversation history management with ChatMessage objects
  • Streamed LLM responses using Settings.llm.astream_chat()
  • Sent tokens incrementally via WebSocket 'token' messages
  • Added a 'done' message after streaming completes
  • Added error handling via an 'error' message type
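The streaming loop described above can be sketched as follows. This is a minimal, dependency-free illustration, not the actual handler: `stream_answer`, `send`, and the fake token source are hypothetical names, and the async iterator stands in for the deltas yielded by `Settings.llm.astream_chat()`.

```python
import asyncio

# Hypothetical sketch of the token/done/error streaming loop.
# Only the frame types (token/done/error) come from the task record.

async def stream_answer(token_stream, send):
    """Forward LLM text deltas as 'token' frames, then a 'done' frame.

    token_stream: async iterator yielding text deltas (in the real
    endpoint, each streamed chat response carries a delta string).
    send: coroutine that ships one JSON-serializable frame to the client.
    """
    try:
        async for delta in token_stream:
            await send({"type": "token", "text": delta})
        await send({"type": "done"})
    except Exception as exc:
        # Any failure during generation maps to a single 'error' frame.
        await send({"type": "error", "detail": str(exc)})

async def demo():
    sent = []

    async def fake_tokens():
        # Stands in for the LLM's incremental deltas.
        for t in ["Hel", "lo"]:
            yield t

    async def send(frame):
        sent.append(frame)  # stands in for websocket.send_json(frame)

    await stream_answer(fake_tokens(), send)
    return sent

frames = asyncio.run(demo())
# frames holds two 'token' frames followed by one 'done' frame
```

Keeping the error path inside the same try/except means the client always receives a terminal frame ('done' or 'error') for each request.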

Verification

  • Code matches task spec requirements
  • WebSocket message protocol implemented (message/token/done/error)
  • Route registered in app.py
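On the wire, the four-frame protocol might look like the JSON below. Only the 'type' values (message/token/done/error) are documented above; the other field names ('text', 'detail') and the sample payloads are assumptions for illustration.

```python
import json

# Assumed frame shapes for the transcript chat WebSocket protocol.
frames = [
    {"type": "message", "text": "What was decided about the deadline?"},  # client -> server
    {"type": "token", "text": "The"},            # server -> client, one delta at a time
    {"type": "done"},                            # terminates one streamed answer
    {"type": "error", "detail": "LLM unavailable"},  # terminal error frame
]
wire = [json.dumps(f) for f in frames]
```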

Evidence

  • Commits: ae85f5d3
  • Tests:
  • PRs: