f4cffc0e66
server: add tests on segmentation and fix issue with speaker
2023-11-02 17:39:21 +01:00
00eb9bbf3c
server: improve split algorithm
2023-11-02 17:39:21 +01:00
b323254376
server: move out profanity filter to transcript, and implement segmentation
2023-11-02 17:39:21 +01:00
Gokul Mohanarangan
c1a9005ec3
update buller condition
2023-10-14 18:55:40 +05:30
Gokul Mohanarangan
79fa537c35
update return format
2023-10-14 18:08:16 +05:30
Gokul Mohanarangan
894c989d60
update language codes
2023-10-14 17:35:30 +05:30
Sara
90c6824f52
replace two letter codes with three letter codes
2023-10-13 23:36:02 +02:00
projects-g
1d92d43fe0
New summary (#283)
...
* handover final summary to Zephyr deployment
* fix display error
* push new summary feature
* fix failing test case
* Added markdown support for final summary
* update UI render issue
* retain sentence tokenizer call
---------
Co-authored-by: Koper <andreas@monadical.com>
2023-10-13 22:53:29 +05:30
projects-g
628c69f81c
Separate out transcription and translation into own Modal deployments (#268)
...
* abstract transcript/translate into separate GPU apps
* update app names
* update transformers library version
* update env.example file
2023-10-13 22:01:21 +05:30
47f7e1836e
server: remove warmup methods everywhere
2023-10-06 13:59:17 -04:00
projects-g
e78bcc9190
Scaleai Translation (#258)
...
* hotfix
* remove assert from translation
* review comments
* reflector.media change targetLang to en
2023-09-28 18:16:39 +05:30
projects-g
24aa9a74bd
hotfix (#254)
2023-09-27 19:20:43 +05:30
projects-g
6a43297309
Translation enhancements (#247)
2023-09-26 19:49:54 +05:30
Gokul Mohanarangan
f56eaeb6cc
don't delete censored words
2023-09-25 21:25:18 +05:30
Gokul Mohanarangan
80fd5e6176
update llm params
2023-09-22 07:49:41 +05:30
Gokul Mohanarangan
009d52ea23
update casing and trimming
2023-09-22 07:29:01 +05:30
Gokul Mohanarangan
ab41ce90e8
add profanity filter, post-process topic/title
2023-09-21 11:12:00 +05:30
2b9eef6131
server: use mp3 as default for audio storage
...
Closes #223
2023-09-13 17:26:03 +02:00
projects-g
9fe261406c
Feature additions (#210)
...
* initial
* add LLM features
* update LLM logic
* update llm functions: change control flow
* add generation config
* update return types
* update processors and tests
* update rtc_offer
* revert new title processor change
* fix unit tests
* add comments and fix HTTP 500
* adjust prompt
* test with reflector app
* revert new event for final title
* update
* move onus onto processors
* move onus onto processors
* stash
* add provision for gen config
* dynamically pack the LLM input using context length
* tune final summary params
* update consolidated class structures
* update consolidated class structures
* update precommit
* add broadcast processors
* working baseline
* Organize LLMParams
* minor fixes
* minor fixes
* minor fixes
* fix unit tests
* fix unit tests
* fix unit tests
* update tests
* update tests
* edit pipeline response events
* update summary return types
* configure tests
* alembic db migration
* change LLM response flow
* edit main llm functions
* edit main llm functions
* change llm name and gen cf
* Update transcript_topic_detector.py
* PR review comments
* checkpoint before db event migration
* update DB migration of past events
* update DB migration of past events
* edit LLM classes
* Delete unwanted file
* remove List typing
* remove List typing
* update oobabooga API call
* topic enhancements
* update UI event handling
* move ensure_casing to llm base
* update tests
* update tests
2023-09-13 11:26:08 +05:30
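One bullet in the PR above, "dynamically pack the LLM input using context length", describes fitting as much recent transcript as possible into the model's context window. A minimal sketch of that idea follows; the function name, the greedy newest-first strategy, and the whitespace token counter are illustrative assumptions, not the PR's actual implementation.

```python
# Hypothetical sketch: pack transcript chunks into an LLM input up to a token budget.
def pack_input(chunks: list[str], context_length: int,
               count_tokens=lambda s: len(s.split())) -> list[str]:
    """Greedily keep the most recent chunks that fit within context_length tokens."""
    packed: list[str] = []
    used = 0
    for chunk in reversed(chunks):  # walk from newest to oldest
        cost = count_tokens(chunk)
        if used + cost > context_length:
            break  # this chunk would overflow the window; stop packing
        packed.append(chunk)
        used += cost
    packed.reverse()  # restore chronological order for the prompt
    return packed
```

A real implementation would use the model's own tokenizer for `count_tokens` and reserve budget for the prompt template and generation output.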
60edca6366
server: add prometheus instrumentation
2023-09-12 13:11:13 +02:00
600f2ca370
server: add BroadcastProcessor tests
2023-08-31 14:48:12 +02:00
9ed26030a5
server: add Broadcast processor
2023-08-31 14:48:12 +02:00
bdf7fe6ebc
server: update process tools to save all events into a jsonl file
2023-08-31 14:48:12 +02:00
68dce235ec
server: pass source and target language from api to pipeline
2023-08-29 11:16:23 +02:00
Gokul Mohanarangan
d92a0de56c
update HTTP POST
2023-08-28 15:19:36 +05:30
Gokul Mohanarangan
ebbe01f282
update fixes
2023-08-28 14:32:21 +05:30
Gokul Mohanarangan
49d6e2d1dc
return both en and fr in transcription
2023-08-28 14:25:44 +05:30
Gokul Mohanarangan
a0ea32db8a
review comments
2023-08-21 13:50:59 +05:30
Gokul Mohanarangan
78153c6cfb
update
2023-08-21 12:57:50 +05:30
Gokul Mohanarangan
acdd5f7dab
update
2023-08-21 12:53:49 +05:30
Gokul Mohanarangan
5b0883730f
translation update
2023-08-21 11:46:28 +05:30
Gokul Mohanarangan
9332870e83
Merge branch 'main' of github.com:Monadical-SAS/reflector into modal
2023-08-17 20:57:58 +05:30
b43bd00fc0
server: fix wav not being saved correctly, and invalid mp3 generation when starting from /tmp on another device
2023-08-17 16:49:22 +02:00
Gokul Mohanarangan
235ee73f46
update prompt
2023-08-17 09:59:16 +05:30
Gokul Mohanarangan
17b850951a
pull from main
2023-08-17 09:38:35 +05:30
Gokul Mohanarangan
a24c3afe5b
cleanup
2023-08-17 09:35:49 +05:30
Gokul Mohanarangan
eb13a7bd64
make schema optional argument
2023-08-17 09:23:14 +05:30
Gokul Mohanarangan
5f79e04642
make schema optional for all LLMs
2023-08-16 22:37:20 +05:30
33ab54a626
server: replace wave module with pyav directly
...
Closes #87
2023-08-16 11:10:33 +02:00
Gokul Mohanarangan
976c0ab9a8
update prompt
2023-08-16 14:07:29 +05:30
a809e5e734
server: implement wav/mp3 audio download
...
If set, the audio for a transcription will be saved to disk.
MP3 conversion happens on request, and the result is cached to disk only if the conversion succeeds.
Closes #148
2023-08-16 09:34:26 +02:00
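The commit body above describes on-request MP3 conversion that is cached only on success. A minimal sketch of that caching pattern follows; the helper name is hypothetical, and it shells out to the ffmpeg CLI for brevity where the project itself uses pyav. Writing to a temp file in the target directory and renaming avoids both caching a partial file and cross-device rename failures.

```python
# Hypothetical sketch: convert WAV to MP3 on request, caching only on success.
import subprocess
import tempfile
from pathlib import Path

def get_mp3(wav_path: str) -> Path:
    """Return a cached MP3 for the given WAV, converting on first request."""
    wav = Path(wav_path)
    mp3 = wav.with_suffix(".mp3")
    if mp3.exists():
        return mp3  # cache hit: already converted successfully before

    # Write to a temp file in the same directory, so a failed conversion
    # never becomes a cache entry and the final rename stays on one device.
    with tempfile.NamedTemporaryFile(dir=wav.parent, suffix=".mp3",
                                     delete=False) as tmp:
        tmp_path = Path(tmp.name)
    try:
        subprocess.run(["ffmpeg", "-y", "-i", str(wav), str(tmp_path)],
                       check=True, capture_output=True)
    except subprocess.CalledProcessError:
        tmp_path.unlink(missing_ok=True)  # drop the partial output
        raise
    tmp_path.replace(mp3)  # atomic publish: cache only on success
    return mp3
```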
a21a726eb1
server: prevent storing audio for transcription unless wanted
...
Closes #145
2023-08-15 14:11:57 +02:00
Mathieu Virbel
01806ce037
server: remove warmup, increase LLM timeout for now
2023-08-11 19:56:39 +02:00
Mathieu Virbel
82ce8202bd
server: improve llm warmup exception handling
...
If the LLM is stuck warming up, or an exception happens in the pipeline, the processor responsible for the exception fails and there is no fallback: audio continues to arrive, but no processing happens. While this should eventually be handled properly, especially after a disconnection, for now we should ignore LLM warmup issues and keep going.
Closes #140
2023-08-11 19:33:07 +02:00
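The failure mode described in this commit (a stuck warmup breaking the whole pipeline with no fallback) suggests catching warmup errors and continuing. A minimal sketch of that approach, with a hypothetical `warmup` wrapper and processor interface that are not the project's actual API:

```python
# Hypothetical sketch: never let a warmup failure stall the processing pipeline.
import logging

logger = logging.getLogger("pipeline")

async def safe_warmup(processor) -> None:
    """Attempt to warm up a processor, swallowing any error.

    Previously, a stuck or failing warmup broke the pipeline with no
    fallback: audio kept arriving, but nothing was processed. Logging
    and moving on lets the pipeline serve requests anyway (at the cost
    of a slower first inference).
    """
    try:
        await processor.warmup()
    except Exception:
        logger.exception("warmup failed for %s, continuing anyway",
                         type(processor).__name__)
```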
Mathieu Virbel
802f2c248e
server: remove print
2023-08-11 16:18:39 +02:00
Mathieu Virbel
38a5ee0da2
server: implement warmup event for llm and transcription
2023-08-11 15:32:41 +02:00
Mathieu Virbel
445d3c1221
server: implement modal backend for llm and transcription
2023-08-11 12:43:09 +02:00
Mathieu Virbel
7f807c8f5f
server: implement FINAL_SUMMARY for websocket + update tests and fix flush
2023-08-08 19:32:20 +02:00
Mathieu Virbel
96f52c631a
api: implement first server API + tests
2023-08-04 20:06:43 +02:00
Mathieu Virbel
dce92e0cf7
server: fix pipeline logger not being transmitted to processors
...
Closes #110
2023-08-04 12:02:18 +02:00