Amazon S3 Destination

Segment makes it easy to send your data to Amazon S3 (and lots of other destinations). Once you've tracked your data through our open source libraries, we'll translate and route it to Amazon S3 in the format the service understands. Learn more about how to use Amazon S3 with Segment.

Getting Started

The Amazon S3 destination puts the raw logs of the data we’re receiving into your personal S3 bucket.

Note: The data is copied into your bucket every hour, around the :40 minute mark. You may see multiple files over a period of time, depending on the volume of data copied.

Required Steps

  • Create a bucket with its region set to us-east-1 (US Standard).
  • Create a folder “segment-logs” inside the bucket.
  • Edit your bucket policy to allow Segment to copy files into the bucket:
    {
        "Version": "2008-10-17",
        "Id": "Policy1425281770533",
        "Statement": [
            {
                "Sid": "Stmt1425281765688",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::107630771604:user/s3-copy"
                },
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/segment-logs/*"
            }
        ]
    }

Note: the Resource property string must end with /*.

Specifically, this grants the Segment s3-copy user the s3:PutObject permission on your bucket.

You can edit your bucket policy in the AWS management console by right-clicking the bucket and then selecting the “edit policy” option.
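If you prefer the command line, the same policy can be applied with the AWS CLI's `put-bucket-policy` command. A sketch, assuming a `policy.json` file like the one above (YOUR_BUCKET_NAME is a placeholder you must substitute):

```shell
# Write the policy to a file. YOUR_BUCKET_NAME is a placeholder -- substitute your bucket.
cat > policy.json <<'EOF'
{
    "Version": "2008-10-17",
    "Id": "Policy1425281770533",
    "Statement": [
        {
            "Sid": "Stmt1425281765688",
            "Effect": "Allow",
            "Principal": { "AWS": "arn:aws:iam::107630771604:user/s3-copy" },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/segment-logs/*"
        }
    ]
}
EOF

# Sanity-check that the file is valid JSON before applying it.
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"

# Apply it (uncomment once YOUR_BUCKET_NAME is filled in):
# aws s3api put-bucket-policy --bucket YOUR_BUCKET_NAME --policy file://policy.json
```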

Lastly, enable the Amazon S3 destination in your Segment destination catalog, and enter your bucket name in the destination settings. It will take about an hour before data starts arriving.

Data format

Your logs will be stored as gzipped, newline-separated JSON containing the full call information. For a list of supported properties, you’ll want to check out our Spec docs.

The logs themselves are binned by day, and named according to the following file format:

    s3://{bucket}/segment-logs/{source-id}/{received-day}/{filename}.gz

The received-day refers to the UTC day that the files were received by our API, which makes it easy to find all calls received within a certain timeframe.
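Once you've downloaded a log file (see below), the gzipped, newline-separated JSON can be inspected with standard tools. A sketch, using a fabricated two-line sample file in place of a real log:

```shell
# Fabricate a tiny sample log in the same shape: one JSON object per line, gzipped.
printf '%s\n' \
  '{"type":"track","event":"Signed Up","userId":"user-1"}' \
  '{"type":"identify","userId":"user-1","traits":{"plan":"pro"}}' \
  | gzip > sample.gz

# Decompress and pull out the call types.
# (python3 stands in for jq here if you don't have jq installed.)
# prints "track" then "identify"
zcat sample.gz | python3 -c '
import json, sys
for line in sys.stdin:
    print(json.loads(line)["type"])
'
```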

How can I download the data from my bucket?

We’ve had the most luck using the AWS CLI and writing a short script to download particular days, one at a time. We’ve found AWS CLI to be significantly faster than s3cmd because it downloads files in parallel.

NOTE: S3 transparently decompresses the files for most clients. However, if you would like to access the raw gzipped data for whatever reason, you can programmatically download the file using their SDK and setting ResponseContentEncoding: none (doesn’t work on the CLI). You can also manually remove the metadata on the file (Content-Type: text/plain and Content-Encoding: gzip) through the AWS interface, which will allow you to download the file as gzipped.

To set up AWS CLI, you'll need to first install it. There are detailed instructions here, or the following will generally work on Linux machines:

$ sudo apt-get install awscli

Then you’ll need to configure AWS CLI with your Access Key ID and Secret Access Key. You can create or find these keys in your Amazon IAM user management console. Then run the following command which will prompt you for the access keys:

$ aws configure

Now you’re ready to download some logs!

To see a list of the most recent log folders:

$ aws s3 ls s3://{bucket}/segment-logs/{source-id}/ | tail -10

To download the files for a specific day:

$ aws s3 sync s3://{bucket}/segment-logs/{source-id}/{received-day} .

Or to download all files for a source:

$ aws s3 sync s3://{bucket}/segment-logs/{source-id} .

To put the files in a specific folder, replace the . at the end ("current directory") with the desired directory, like ~/Downloads/logs.
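The "short script to download particular days, one at a time" mentioned above can be sketched as a plain shell loop. The bucket name, source ID, and days below are placeholders, and the `echo` makes this a dry run, printing each sync command instead of executing it; drop the `echo` to actually download:

```shell
#!/bin/sh
# Placeholders -- substitute your own bucket, source ID, and days of interest.
BUCKET=my-bucket
SOURCE_ID=my-source-id
DAYS="2015-03-01 2015-03-02 2015-03-03"

for day in $DAYS; do
  mkdir -p "$HOME/Downloads/logs/$day"
  # echo makes this a dry run; remove it to run the real sync.
  echo aws s3 sync "s3://$BUCKET/segment-logs/$SOURCE_ID/$day" "$HOME/Downloads/logs/$day"
done
```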

Supported Sources and Connection Modes

📱 Device-based
☁️ Cloud-based

To learn more about Connection Modes and what dictates which we support, see here.


Settings

Segment lets you change these destination settings via your Segment dashboard without having to touch any code.

Bucket Name

Your S3 bucket name.

If you have any questions or see anywhere we can improve our documentation, please let us know or kick off a conversation in the Segment Community!