Upload files

uploadData

The uploadData method uploads files to Amazon S3.

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: filename,
    data: file,
    options: {
      accessLevel: 'guest', // defaults to `guest` but can be 'private' | 'protected' | 'guest'
      onProgress // Optional progress callback.
    }
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Public level

The accessLevel option defaults to guest:

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: filename,
    data: file
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Protected level

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: filename,
    data: file,
    options: {
      accessLevel: 'protected'
    }
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Private level

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: filename,
    data: file,
    options: {
      accessLevel: 'private'
    }
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Monitor progress of upload

To track the progress of your upload, you can use the onProgress option:

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: filename,
    data: file,
    options: {
      onProgress: ({ transferredBytes, totalBytes }) => {
        if (totalBytes) {
          console.log(
            `Upload progress ${Math.round(
              (transferredBytes / totalBytes) * 100
            )} %`
          );
        }
      }
    }
  }).result;
  console.log('Key from Response: ', result.key);
} catch (error) {
  console.log('Error : ', error);
}

Pause and resume upload

The task returned by uploadData exposes methods for pausing, resuming, and cancelling the request.

import { uploadData } from 'aws-amplify/storage';

// Pause and resume a task
const uploadTask = uploadData({ key, data: file });
//...
uploadTask.pause();
//...
uploadTask.resume();
//...
await uploadTask.result;

Cancel upload

import { uploadData, isCancelError } from 'aws-amplify/storage';

const uploadTask = uploadData({ key, data: file });
//...
uploadTask.cancel();
try {
  await uploadTask.result;
} catch (error) {
  if (isCancelError(error)) {
    // Handle error thrown by task cancellation
  }
}

Other options available are:

import { uploadData } from 'aws-amplify/storage';

uploadData({
  key,
  data: file,
  options: {
    contentType: 'text/html', // (String) The default content-type header value of the file when downloading it.
    contentEncoding: 'compress', // (String) The default content-encoding header value of the file when downloading it.
    contentDisposition: 'attachment', // (String) Specifies presentational information for the object.
    metadata: { key: 'value' }, // (map<String>) A map of metadata to store with the object in S3.
    useAccelerateEndpoint: true // (Boolean) Whether to use the S3 Transfer Acceleration endpoint.
  }
});

If a page refresh occurs during an upload, re-initializing the upload with the same file will resume it from the previous breakpoint.
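As a minimal sketch (assuming filename and file hold the same values used before the refresh):

import { uploadData } from 'aws-amplify/storage';

// Re-initializing with the same key and file after a refresh lets the
// upload continue from the previous breakpoint rather than starting over.
// `filename` and `file` are assumed to match the pre-refresh upload.
const resumedTask = uploadData({
  key: filename,
  data: file
});
await resumedTask.result;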

Uploads that were initiated over one hour ago will be cancelled automatically. There are instances (e.g. the device goes offline, or the user signs out) where the incomplete file remains in your S3 bucket. It is recommended to set up an S3 lifecycle rule to automatically clean up incomplete upload requests.
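As one hedged sketch, such a rule can be created with the AWS SDK for JavaScript v3; the bucket name, region, and rule ID below are placeholders, and the same rule can also be configured in the S3 console:

import {
  S3Client,
  PutBucketLifecycleConfigurationCommand
} from '@aws-sdk/client-s3';

// Placeholder region and bucket; substitute your Amplify storage bucket.
const client = new S3Client({ region: 'us-east-1' });

await client.send(
  new PutBucketLifecycleConfigurationCommand({
    Bucket: 'my-amplify-storage-bucket', // placeholder bucket name
    LifecycleConfiguration: {
      Rules: [
        {
          ID: 'abort-incomplete-multipart-uploads', // placeholder rule ID
          Status: 'Enabled',
          Filter: { Prefix: '' }, // apply to every object in the bucket
          // Abort multipart uploads still incomplete one day after initiation.
          AbortIncompleteMultipartUpload: { DaysAfterInitiation: 1 }
        }
      ]
    }
  })
);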

Browser uploads

Upload a file in the browser:

import { uploadData } from 'aws-amplify/storage';

const uploadDataInBrowser = async (event) => {
  if (event?.target?.files) {
    const file = event.target.files[0];
    uploadData({
      key: file.name,
      data: file
    });
  }
};
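You can then wire this handler to a file input, for example in a React component:

<input type="file" onChange={uploadDataInBrowser} />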