Core API

Recordings

Set record: true when creating a hall and Relay will automatically record the session and upload it to your S3-compatible bucket. You own the file — Relay never stores your media.

Prerequisites

Before using recording, configure your S3 credentials in Dashboard → Settings → Storage. Relay encrypts your credentials at rest using Vault and uses them only at egress time.

Bucket name

The S3 bucket to write recordings to.

Region

The bucket's region, e.g. "us-east-1". Required for AWS S3.

Access key ID

An IAM user (or equivalent) with s3:PutObject permission on the bucket.

Secret access key

Encrypted with your organisation key in Vault.
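The recording user only needs write access to the bucket. A minimal IAM policy granting s3:PutObject on a single bucket looks like the sketch below (shown as a Python dict for readability; the bucket name is a placeholder):

```python
import json

BUCKET = "my-recordings-bucket"  # placeholder: substitute your bucket name

# Minimal write-only policy for the recording user.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

print(json.dumps(policy, indent=2))
```

Scoping the policy to a single action on a single bucket limits the blast radius if the credentials ever leak.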

Enable recording

Pass record: true when creating a hall:

curl -X POST https://api.relay.dev/v1/halls \
  -H "Authorization: Bearer relay_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "name": "recorded-session",
    "record": true
  }'
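The same request can be assembled from any HTTP client. The helper below is a hypothetical Python sketch (not part of an official SDK) that builds the URL, headers, and JSON body for the call above:

```python
import json

API_URL = "https://api.relay.dev/v1/halls"

def build_create_hall_request(api_key: str, name: str, record: bool = True):
    """Return the URL, headers, and JSON body for POST /v1/halls."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"name": name, "record": record})
    return API_URL, headers, body
```

Send the result with whatever HTTP client you already use; the only recording-specific part is the record flag in the body.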

Recording begins as soon as the hall is created. When the hall is closed (or the last participant leaves), Relay finalises the recording and uploads it to your bucket.

Recording lifecycle

recording

Hall is open and recording is in progress.

processing

Hall closed. Relay is packaging the final file.

complete

File uploaded to your S3 bucket at bucket_path.

failed

Recording failed. Check your S3 credentials in Settings.
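The states above form a small state machine. The transition table below is a sketch inferred from the lifecycle description; the exact transitions Relay permits are not documented here, so treat it as illustrative:

```python
# Assumed transitions, inferred from the lifecycle description above.
ALLOWED_TRANSITIONS = {
    "recording": {"processing", "failed"},
    "processing": {"complete", "failed"},
    "complete": set(),   # terminal
    "failed": set(),     # terminal
}

def is_valid_transition(current: str, nxt: str) -> bool:
    """Check whether a status change matches the assumed lifecycle."""
    return nxt in ALLOWED_TRANSITIONS.get(current, set())
```

A check like this is useful for catching out-of-order or duplicate webhook deliveries on your side.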

Subscribe to the recording.complete webhook event to know exactly when a file is ready in your bucket — no polling required.
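A minimal handler for that event might look like the sketch below. The envelope shape ({"type": ..., "data": {recording object}}) is an assumption; verify it against your actual webhook payloads:

```python
import json

def recording_key_from_event(raw_body: bytes):
    """Return the S3 key of a finished recording, or None for other events.

    Assumes an envelope of the form {"type": ..., "data": {recording object}};
    check this shape against your actual webhook payloads.
    """
    event = json.loads(raw_body)
    if event.get("type") != "recording.complete":
        return None
    return event.get("data", {}).get("bucket_path")
```

Returning None for unrecognised event types lets the same endpoint receive other webhook events without special-casing each one.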

The recording object

recording_id string

Unique identifier for this recording.

hall_id string

The hall this recording belongs to.

status string

Current lifecycle status (see above).

duration_seconds integer | null

Total recording duration once complete.

file_size_bytes integer | null

File size in bytes once complete.

bucket_path string | null

Full S3 key where the file was written, e.g. relay/recordings/h_01j9.../recording.mp4

created_at datetime

When recording started.

completed_at datetime | null

When the file was uploaded to S3.
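Typed client code might model the object as a dataclass. This is a sketch with field names taken from the list above; datetimes are kept as ISO 8601 strings for simplicity:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Recording:
    recording_id: str
    hall_id: str
    status: str
    duration_seconds: Optional[int]
    file_size_bytes: Optional[int]
    bucket_path: Optional[str]
    created_at: str                 # ISO 8601 datetime string
    completed_at: Optional[str]     # null until upload finishes

    @property
    def is_ready(self) -> bool:
        """True once the file has been written to your bucket."""
        return self.status == "complete" and self.bucket_path is not None
```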

Storage compatibility

Relay writes recordings using the S3 API. Any S3-compatible object store works:

Amazon S3
Cloudflare R2
Backblaze B2
MinIO
DigitalOcean Spaces
Wasabi
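When pointing an S3 client at a non-AWS store, you typically override the endpoint URL. The patterns below are common defaults for these providers; treat them as a starting point and confirm against each provider's own documentation:

```python
# Common S3 endpoint patterns for the providers above. Placeholders in
# braces must be filled in; confirm against your provider's docs.
S3_ENDPOINTS = {
    "amazon_s3": "https://s3.{region}.amazonaws.com",
    "cloudflare_r2": "https://{account_id}.r2.cloudflarestorage.com",
    "backblaze_b2": "https://s3.{region}.backblazeb2.com",
    "digitalocean_spaces": "https://{region}.digitaloceanspaces.com",
    "wasabi": "https://s3.{region}.wasabisys.com",
    # MinIO is self-hosted: use your own deployment's URL.
}

def endpoint_for(provider: str, **params) -> str:
    """Fill in a provider's endpoint template, e.g. region or account ID."""
    return S3_ENDPOINTS[provider].format(**params)
```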