Commit Graph

14 Commits

Author SHA1 Message Date
86ce68651f build: move to uv (#488)
* build: move to uv

* build: add packages declaration

* build: move to python 3.12, as sentencepiece does not work on 3.13

* ci: remove pre-commit check, will be done in another branch.

* ci: fix name checkout

* ci: update lock and dockerfile

* test: remove event_loop, not needed in python 3.12

* test: updated test due to av returning AudioFrame with 4096 samples instead of 1024

* build: prevent using fastapi cli, because there is no way to set default port

I don't want to pass --port 1250 every time, so I went back to the previous
approach. I deactivated auto-reload for production.

* ci: remove main.py

* test: fix quirk with httpx
2025-07-16 18:10:11 -06:00
5267ab2d37 feat: retake summary using NousResearch/Hermes-3-Llama-3.1-8B model (#415)
This features a new modal endpoint and a completely new way to build the
summary.

## SummaryBuilder

The summary builder is based on a conversational model, where an exchange
between the model and the user is made. This allows more context to be
included and better adherence to the rules.

It requires an OpenAI-like chat completions endpoint
(/v1/chat/completions).
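
For illustration, a minimal request against such an endpoint might look like the sketch below; the base URL is a placeholder, not the actual deployment, and the prompt is only an example, not the real SummaryBuilder prompts.

```python
# Minimal sketch of a chat completions call; the base URL is a placeholder,
# not the real deployment, and the prompt is illustrative only.
import httpx

payload = {
    "model": "NousResearch/Hermes-3-Llama-3.1-8B",
    "messages": [
        {"role": "system", "content": "You summarize meeting transcripts."},
        {"role": "user", "content": "List the participants of this transcript: ..."},
    ],
}
resp = httpx.post(
    "https://example-endpoint/v1/chat/completions",
    json=payload,
    timeout=60.0,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```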

## vLLM Hermes3

Unlike the previous deployment, this one uses vLLM, which provides an
OpenAI-like completions endpoint out of the box. It can also handle guided
JSON generation, so jsonformer is not needed. That said, the model is quite
good at following a JSON schema when asked in the prompt.
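
As a hedged example, vLLM's OpenAI-compatible server accepts guided decoding options through extra_body when used with the openai client; the exact option set depends on the vLLM version, and the URL and schema below are only illustrative.

```python
# Sketch of vLLM guided JSON generation via the OpenAI client; guided_json is a
# vLLM-specific extension and depends on the server version. URL is a placeholder.
from openai import OpenAI

client = OpenAI(base_url="https://example-endpoint/v1", api_key="unused")

topic_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "summary": {"type": "string"},
    },
    "required": ["title", "summary"],
}

response = client.chat.completions.create(
    model="NousResearch/Hermes-3-Llama-3.1-8B",
    messages=[{"role": "user", "content": "Summarize this topic as JSON: ..."}],
    extra_body={"guided_json": topic_schema},  # vLLM guided decoding extension
)
print(response.choices[0].message.content)
```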

## Conversion of long/short into summary builder

The builder identifies participants, finds key subjects, gets a summary for
each, then gets a quick recap.

The quick recap is used as the short_summary, while the markdown combining
the quick recap, key subjects, and summaries is used for the long_summary.
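
A rough sketch of how these pieces could be combined is shown below; the class and function names are hypothetical, not the actual SummaryBuilder interface.

```python
# Illustrative assembly of short_summary and long_summary; names are hypothetical,
# not the real SummaryBuilder API.
from dataclasses import dataclass


@dataclass
class TopicSummary:
    subject: str
    summary: str


def build_summaries(quick_recap: str, topics: list[TopicSummary]) -> tuple[str, str]:
    """Return (short_summary, long_summary) from the builder's intermediate results."""
    short_summary = quick_recap
    sections = [f"# Quick recap\n\n{quick_recap}"]
    for topic in topics:
        sections.append(f"# {topic.subject}\n\n{topic.summary}")
    long_summary = "\n\n".join(sections)
    return short_summary, long_summary
```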

This is why the Next.js component has to be updated, to correctly style h1
headings and keep the newlines of the markdown.
2024-09-14 02:28:38 +02:00
projects-g
1d92d43fe0 New summary (#283)
* handover final summary to Zephyr deployment

* fix display error

* push new summary feature

* fix failing test case

* Added markdown support for final summary

* update UI render issue

* retain sentence tokenizer call

---------

Co-authored-by: Koper <andreas@monadical.com>
2023-10-13 22:53:29 +05:30
projects-g
6a43297309 Translation enhancements (#247) 2023-09-26 19:49:54 +05:30
projects-g
9fe261406c Feature additions (#210)
* initial

* add LLM features

* update LLM logic

* update llm functions: change control flow

* add generation config

* update return types

* update processors and tests

* update rtc_offer

* revert new title processor change

* fix unit tests

* add comments and fix HTTP 500

* adjust prompt

* test with reflector app

* revert new event for final title

* update

* move onus onto processors

* move onus onto processors

* stash

* add provision for gen config

* dynamically pack the LLM input using context length

* tune final summary params

* update consolidated class structures

* update consolidated class structures

* update precommit

* add broadcast processors

* working baseline

* Organize LLMParams

* minor fixes

* minor fixes

* minor fixes

* fix unit tests

* fix unit tests

* fix unit tests

* update tests

* update tests

* edit pipeline response events

* update summary return types

* configure tests

* alembic db migration

* change LLM response flow

* edit main llm functions

* edit main llm functions

* change llm name and gen cf

* Update transcript_topic_detector.py

* PR review comments

* checkpoint before db event migration

* update DB migration of past events

* update DB migration of past events

* edit LLM classes

* Delete unwanted file

* remove List typing

* remove List typing

* update oobabooga API call

* topic enhancements

* update UI event handling

* move ensure_casing to llm base

* update tests

* update tests
2023-09-13 11:26:08 +05:30
a7f0cdd50e server: fixes tests 2023-08-31 14:48:12 +02:00
Gokul Mohanarangan
b08724a191 correct schema typing from str to dict 2023-08-17 20:57:31 +05:30
Gokul Mohanarangan
a98a9853be PR review comments 2023-08-17 14:42:45 +05:30
Gokul Mohanarangan
eb13a7bd64 make schema optional argument 2023-08-17 09:23:14 +05:30
Mathieu Virbel
e55cfce930 server: fixes tests 2023-08-04 18:16:51 +02:00
d94e2911c3 Serverless GPU support on banana.dev (#106)
* serverless: implement banana backend for both audio and LLM

Related to monadical-sas/reflector-gpu-banana project

* serverless: got llm working on banana!

* tests: fixes

* serverless: fix dockerfile to use fastapi server + httpx
2023-08-04 10:24:11 +02:00
Mathieu Virbel
e4f2b785ca server: update process tools and tests 2023-08-01 20:16:54 +02:00
Mathieu Virbel
99c9ba3e6b tests: remove unused mock 2023-08-01 16:08:15 +02:00
Mathieu Virbel
1f8e4200fd tests: rework tests and fixes bugs along the way 2023-08-01 16:05:48 +02:00