
AWS S3/SQS usage clarification

Whereby.com uploads recordings directly to our S3 bucket when meetings end.

SQS Queue (AWS_PROCESS_RECORDING_QUEUE_URL)

Filled by: AWS S3 Event Notifications

The S3 bucket is configured to send notifications to our SQS queue when new objects are created. This is standard AWS infrastructure - not in our codebase.

AWS S3 → SQS Event Configuration:

  • Event Type: s3:ObjectCreated:*
  • Filter: key suffix .mp4 (MP4 recordings only)
  • Destination: Our SQS queue
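
For reference, here is a minimal boto3 sketch of that bucket-to-queue wiring. It is illustrative only; the bucket name and queue ARN are placeholders, and in practice this configuration lives in AWS infrastructure (console, Terraform, etc.), not in our codebase:

import boto3

s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="RECORDINGS_BUCKET",  # placeholder: the bucket Whereby uploads into
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                # placeholder ARN for the queue behind AWS_PROCESS_RECORDING_QUEUE_URL
                "QueueArn": "arn:aws:sqs:REGION:ACCOUNT_ID:process-recording-queue",
                "Events": ["s3:ObjectCreated:*"],
                # S3 notification filters match on key prefix/suffix
                "Filter": {"Key": {"FilterRules": [{"Name": "suffix", "Value": ".mp4"}]}},
            }
        ]
    },
)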

Our System's Role

The worker polls SQS every 60 seconds via /server/reflector/worker/process.py:24-62:

# Every 60 seconds, check for new recordings
sqs = boto3.client("sqs", ...)
response = sqs.receive_message(QueueUrl=queue_url, ...)
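
Expanded, the polling pattern looks roughly like the sketch below. This is illustrative, not the actual process.py code; here the queue URL is assumed to come from the AWS_PROCESS_RECORDING_QUEUE_URL environment variable, and the hand-off to the processing pipeline is elided:

import json
import os
import time

import boto3

queue_url = os.environ["AWS_PROCESS_RECORDING_QUEUE_URL"]  # assumption: read from the environment
sqs = boto3.client("sqs")

while True:
    # Long-poll for up to 10 messages at a time
    response = sqs.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,
    )
    for message in response.get("Messages", []):
        # S3 event notifications arrive as JSON in the message body
        body = json.loads(message["Body"])
        for record in body.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]  # the uploaded .mp4
            print(f"new recording: s3://{bucket}/{key}")
            # ...hand off to the processing pipeline here (elided)...
        # Delete the message so it is not redelivered
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
    time.sleep(60)  # the 60-second poll interval described above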

Requeue

uv run /app/requeue_uploaded_file.py TRANSCRIPT_ID

Pipeline Management

Continue a stuck pipeline from the final summaries (identify_participants) step:

uv run python -c "from reflector.pipelines.main_live_pipeline import task_pipeline_final_summaries; result = task_pipeline_final_summaries.delay(transcript_id='TRANSCRIPT_ID'); print(f'Task queued: {result.id}')"

Run the full post-processing pipeline (runs all remaining steps to completion):

uv run python -c "from reflector.pipelines.main_live_pipeline import pipeline_post; pipeline_post(transcript_id='TRANSCRIPT_ID')"
