How to automatically transfer Twilio recordings to an S3 bucket

Twilio offers the ability to record calls automatically, but the recordings can pile up quickly. In this post, I describe how to create an AWS Lambda function that automatically transfers the recordings to an S3 bucket for archiving purposes.

1) Create an IAM role for the Lambda function

The first step is to create a role that the Lambda function can assume so that it can write to the proper S3 bucket. The role should trust the Lambda service (lambda.amazonaws.com), and should allow s3:PutObject on the target bucket as well as the usual CloudWatch Logs permissions so the function can write its logs.

Trust Policy

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "lambda.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}

Permission Policy

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<BUCKET_NAME>/<PREFIX>/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "*"
        }
    ]
}

2) Create a Lambda function to transfer the recordings

The second step is to create a Lambda function running on Node.js 12.x. The Lambda function should use the role from step 1 as its execution role. The body of the function goes in an index.js file. You need to adjust ACCOUNT_SID and AUTH_TOKEN to match your Twilio credentials, and S3_BUCKET and S3_PREFIX to indicate where to save the recordings.

'use strict';
const axios = require('axios');
const AWS = require('aws-sdk');
const S3UploadStream = require('s3-upload-stream');
const Twilio = require('twilio');

const ACCOUNT_SID = "AC2xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
const AUTH_TOKEN = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
const S3_BUCKET = "<BUCKET_NAME>";
const S3_PREFIX = "Recordings";

// Pads a number to two digits (e.g. 7 -> "07")
function add_leading_zero(number) {
    let n = number.toString();
    if (n.length < 2) { n = '0' + n; }
    return n;
}

function get_download_url(recording) {
    const apiVersion = recording.apiVersion;
    const recordingSID = recording.sid;
    return `https://api.twilio.com/${apiVersion}/Accounts/${ACCOUNT_SID}/Recordings/${recordingSID}.mp3`;
}

function get_upload_path(recording, from_number, to_number) {
    const recordingDate = new Date(recording.dateCreated);
    const year = recordingDate.getFullYear();
    const month = add_leading_zero(recordingDate.getMonth() + 1);
    const day = add_leading_zero(recordingDate.getDate());
    const hour = add_leading_zero(recordingDate.getHours());
    const minute = add_leading_zero(recordingDate.getMinutes());
    const second = add_leading_zero(recordingDate.getSeconds());
    const duration = add_leading_zero(recording.duration);
    return `${S3_PREFIX}/${year}/${month}/${year}.${month}.${day}-${hour}.${minute}.${second}_${from_number}_${to_number}_${duration}s.mp3`;
}

async function transfer_recording(download_url, upload_stream) {
    const response = await axios({method: 'GET', url: download_url, responseType: 'stream'});
    return new Promise((resolve, reject) => {
        upload_stream.on('uploaded', resolve);
        upload_stream.on('error', reject);
        // Pipe the downloaded recording directly into the S3 upload stream
        response.data.pipe(upload_stream);
    });
}

module.exports.handler = async function(event, context, callback) {
    const client = Twilio(ACCOUNT_SID, AUTH_TOKEN);

    // Retrieving all recordings, then transferring and deleting each one
    const recordings = await client.recordings.list();
    for (const recording of recordings) {
        if (recording.status !== "completed") { continue; }

        // Getting the upload and download paths
        const call = await client.calls(recording.callSid).fetch();
        const download_url = get_download_url(recording);
        const upload_path = get_upload_path(recording, call.from, call.to);
        let s3Stream = S3UploadStream(new AWS.S3());

        // Alternatively, one could download a ".wav" by using ".wav" in the
        // download_url and a ContentType of "audio/x-wav"
        let upload_stream = s3Stream.upload({Bucket: S3_BUCKET, Key: upload_path, ContentType: 'audio/mpeg'});

        // Transferring to S3
        console.log(`Transferring ${recording.callSid} to ${upload_path}`);
        await transfer_recording(download_url, upload_stream);

        // Deleting the recording from Twilio
        await client.recordings(recording.sid).remove();
    }
};
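As a sanity check on the naming scheme, here is a standalone sketch of the key that get_upload_path produces, using a fabricated date and fabricated phone numbers purely for illustration:

```javascript
// Standalone sketch of the S3 key format used above.
// The date and phone numbers below are made up for illustration.
function addLeadingZero(number) {
    let n = number.toString();
    if (n.length < 2) { n = '0' + n; }
    return n;
}

function uploadPath(prefix, date, from, to, duration) {
    const year = date.getFullYear();
    const month = addLeadingZero(date.getMonth() + 1);
    const day = addLeadingZero(date.getDate());
    const hour = addLeadingZero(date.getHours());
    const minute = addLeadingZero(date.getMinutes());
    const second = addLeadingZero(date.getSeconds());
    return `${prefix}/${year}/${month}/${year}.${month}.${day}-${hour}.${minute}.${second}_${from}_${to}_${duration}s.mp3`;
}

// Example: a 42-second call recorded on 2020-01-15 at 09:05:03 local time.
const key = uploadPath('Recordings', new Date(2020, 0, 15, 9, 5, 3),
                       '+15145550001', '+15145550002', '42');
console.log(key);
// Recordings/2020/01/2020.01.15-09.05.03_+15145550001_+15145550002_42s.mp3
```

Grouping keys by year and month this way keeps the bucket browsable even after thousands of recordings accumulate.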

Because the Lambda function depends on other npm packages, we need to install these packages locally and then upload a zip archive to Lambda. In an empty directory on your computer, create the index.js file, then create an npm package and install the dependencies with:

$ npm init
$ npm install aws-sdk axios s3-upload-stream twilio
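After these commands, the generated package.json should list the four dependencies along these lines (the package name and version numbers here are illustrative):

```json
{
    "name": "twilio-recordings-to-s3",
    "version": "1.0.0",
    "main": "index.js",
    "dependencies": {
        "aws-sdk": "^2.0.0",
        "axios": "^0.19.0",
        "s3-upload-stream": "^1.0.7",
        "twilio": "^3.39.0"
    }
}
```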

The resulting directory should look something like this:

    index.js
    node_modules/
    package-lock.json
    package.json

You can then create a zip archive from these files, and create your Lambda function by uploading the zip file.

I use 512 MB of memory and a timeout of 5 minutes, but you can adjust these depending on your workload.

3) Trigger automatically using Cloudwatch Events

The final step is to add a trigger to the function. You can trigger the function once per hour by clicking Add Trigger on the Lambda page, choosing CloudWatch Events, and creating a new rule with the schedule expression rate(1 hour).

At each invocation, the Lambda function should transfer the Twilio recordings to the specified S3 bucket and then delete them from Twilio. The S3 bucket should then contain files named following the pattern Recordings/<year>/<month>/<year>.<month>.<day>-<hour>.<minute>.<second>_<from>_<to>_<duration>s.mp3.

Congratulations! You now have an AWS Lambda function that automatically transfers recordings from Twilio to S3 on a schedule. You can now set lifecycle policies to transition and expire your recordings.
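As an example, a bucket lifecycle configuration like the following would transition recordings to Glacier after 30 days and expire them after a year (both durations are illustrative values you should adjust to your retention requirements):

```json
{
    "Rules": [
        {
            "ID": "ArchiveRecordings",
            "Filter": { "Prefix": "Recordings/" },
            "Status": "Enabled",
            "Transitions": [
                { "Days": 30, "StorageClass": "GLACIER" }
            ],
            "Expiration": { "Days": 365 }
        }
    ]
}
```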





Copyright © 2020 - Attrava Inc. - All Rights Reserved