Mathieu Virbel 86ce68651f build: move to uv (#488)
* build: move to uv

* build: add packages declaration

* build: move to python 3.12, as sentencepiece does not work on 3.13

* ci: remove pre-commit check, will be done in another branch.

* ci: fix name checkout

* ci: update lock and dockerfile

* test: remove event_loop, not needed in python 3.12

* test: updated test due to av returning AudioFrame with 4096 samples instead of 1024

* build: prevent using fastapi cli, because there is no way to set default port

I don't want to pass --port 1250 every time, so I went back to the
previous approach. I deactivated auto-reload for production.

* ci: remove main.py

* test: fix quirk with httpx
2025-07-16 18:10:11 -06:00

AWS S3/SQS usage clarification

Whereby.com uploads recordings directly to our S3 bucket when meetings end.

SQS Queue (AWS_PROCESS_RECORDING_QUEUE_URL)

Filled by: AWS S3 Event Notifications

The S3 bucket is configured to send a notification to our SQS queue whenever a new object is created. This is standard AWS infrastructure, configured on the AWS side - not in our codebase.

AWS S3 → SQS Event Configuration:

  • Event Type: s3:ObjectCreated:*
  • Filter: *.mp4 files
  • Destination: Our SQS queue
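
For reference, the event configuration above maps onto an S3 notification configuration roughly like the following. This is a hypothetical sketch - the queue ARN is a placeholder, not the real one - showing how the event type, suffix filter, and destination fit together:

```python
# Sketch of the S3 -> SQS notification configuration (placeholder ARN).
# This lives in AWS bucket settings, not in our codebase.
notification_config = {
    "QueueConfigurations": [
        {
            # Destination: our SQS queue (placeholder ARN for illustration)
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:process-recording",
            # Event type: fire on any object creation
            "Events": ["s3:ObjectCreated:*"],
            # Filter: only .mp4 files
            "Filter": {
                "Key": {"FilterRules": [{"Name": "suffix", "Value": ".mp4"}]}
            },
        }
    ]
}
```

This dict is the shape accepted by S3's bucket-notification API (e.g. boto3's `put_bucket_notification_configuration`).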

Our System's Role

Polls SQS every 60 seconds via /server/reflector/worker/process.py:24-62:

  # Every 60 seconds, check for new recordings
  sqs = boto3.client("sqs", ...)
  response = sqs.receive_message(QueueUrl=queue_url, ...)
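
A single polling pass can be sketched as below. This is a simplified illustration of the worker's loop body, not the exact code in process.py: the names `poll_once` and `handle_key` are hypothetical, and the real worker wraps this in a loop that sleeps 60 seconds between passes.

```python
import json

def poll_once(sqs, queue_url, handle_key):
    """One polling pass: receive S3 ObjectCreated events from the queue,
    pass each new object key to handle_key, then delete the message.
    Illustrative sketch; not the actual process.py implementation."""
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling cuts down on empty responses
    )
    handled = 0
    for message in response.get("Messages", []):
        body = json.loads(message["Body"])
        # S3 event notifications wrap each created object in a "Records" list.
        for record in body.get("Records", []):
            handle_key(record["s3"]["object"]["key"])
            handled += 1
        # Delete only after handling, so a crash leaves the message
        # visible again for the next pass.
        sqs.delete_message(
            QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"]
        )
    return handled
```

Because the SQS client is passed in, the pass can be exercised against a stub that returns a canned S3 event, without touching AWS.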