Storing analytics data

The Amazon Kinesis Firehose analytics provider allows you to send analytics data to an Amazon Kinesis Firehose delivery stream for reliable storage.

Installation and Configuration

Ensure you have set up IAM permissions for firehose:PutRecordBatch.

Example IAM policy for Amazon Kinesis Firehose:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "firehose:PutRecordBatch",
    // replace the template fields
    "Resource": "arn:aws:firehose:<Region>:<AccountId>:deliverystream/<StreamName>"
  }]
}
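
If you manage infrastructure as code, one possible way to attach this permission is with the AWS CDK. The sketch below is illustrative, not part of the Amplify setup: the role and delivery stream ARNs are placeholders you would replace with your own values (for example, the unauthenticated role of your Cognito identity pool).

import { Stack, StackProps } from 'aws-cdk-lib';
import * as iam from 'aws-cdk-lib/aws-iam';
import { Construct } from 'constructs';

// Hypothetical stack: grants firehose:PutRecordBatch on one delivery stream
// to the IAM role that your app's credentials assume.
export class FirehoseAnalyticsPolicyStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Placeholder ARN - replace with the role used by your app
    const appRole = iam.Role.fromRoleArn(
      this,
      'AppRole',
      'arn:aws:iam::<AccountId>:role/<RoleName>'
    );

    appRole.addToPrincipalPolicy(
      new iam.PolicyStatement({
        effect: iam.Effect.ALLOW,
        actions: ['firehose:PutRecordBatch'],
        // Placeholder ARN - replace with your delivery stream
        resources: [
          'arn:aws:firehose:<Region>:<AccountId>:deliverystream/<StreamName>'
        ]
      })
    );
  }
}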

Configure Kinesis Firehose:

import { Amplify } from 'aws-amplify';

Amplify.configure({
  ...Amplify.getConfig(),
  Analytics: {
    KinesisFirehose: {
      // REQUIRED - Amazon Kinesis Firehose service region
      region: 'us-east-1',

      // OPTIONAL - The buffer size for events in number of items.
      bufferSize: 1000,

      // OPTIONAL - The number of events to be deleted from the buffer when flushed.
      flushSize: 100,

      // OPTIONAL - The interval in milliseconds to perform a buffer check and flush if necessary.
      flushInterval: 5000, // 5s

      // OPTIONAL - The limit for failed recording retries.
      resendLimit: 5
    }
  }
});
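
Sending records to Kinesis Firehose also requires valid AWS credentials. A minimal sketch, assuming you obtain guest credentials from an Amazon Cognito identity pool (the identity pool ID below is a placeholder):

import { Amplify } from 'aws-amplify';

Amplify.configure({
  ...Amplify.getConfig(),
  Auth: {
    Cognito: {
      // Placeholder - replace with your Cognito identity pool ID
      identityPoolId: 'us-east-1:xxxx-xxxx-xxxx',
      // Allow unauthenticated (guest) identities to obtain credentials
      allowGuestAccess: true
    }
  }
});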

Storing data

You can send data to a Kinesis Firehose stream with the record API. Any data is accepted; streamName is required:

import { record } from 'aws-amplify/analytics/kinesis-firehose';

record({
  data: {
    // The data blob to put into the record
  },
  streamName: 'myKinesisStream'
});
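
For example, a sketch that records a structured page-view event (the event fields and stream name are illustrative, not a required schema):

import { record } from 'aws-amplify/analytics/kinesis-firehose';

// Illustrative payload - field names are arbitrary examples.
record({
  data: {
    eventType: 'pageView',
    path: '/home',
    occurredAt: new Date().toISOString()
  },
  streamName: 'myKinesisStream'
});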

Flush events

The recorded events are saved in a buffer and sent to the remote server periodically (you can tune the interval with the flushInterval option). If needed, you can manually flush all the events from the buffer by using the flushEvents API.

import { flushEvents } from 'aws-amplify/analytics/kinesis-firehose';

flushEvents();
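
For example, a sketch that flushes buffered events when the page is about to be hidden, so short sessions are not left waiting for the next flush interval (assumes a browser environment):

import { flushEvents } from 'aws-amplify/analytics/kinesis-firehose';

// Flush any buffered records before the user leaves the page.
window.addEventListener('pagehide', () => {
  flushEvents();
});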