Upload files

uploadData

The uploadData method uploads files into Amazon S3.

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    path: 'public/album/2024/1.jpg',
    // Alternatively, path: ({identityId}) => `protected/${identityId}/album/2024/1.jpg`
    data: file,
    options: {
      onProgress // Optional progress callback.
    }
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Alternatively, you can upload using the key parameter together with an access level:

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: 'album/2024/1.jpg',
    data: file,
    options: {
      accessLevel: 'guest', // defaults to `guest` but can be 'private' | 'protected' | 'guest'
      onProgress // Optional progress callback.
    }
  }).result;
  console.log('Succeeded: ', result);
} catch (error) {
  console.log('Error : ', error);
}

Monitor progress of upload

To track the progress of your upload, you can use the onProgress option:

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    path: 'public/album/2024/1.jpg',
    // Alternatively, path: ({identityId}) => `protected/${identityId}/album/2024/1.jpg`
    data: file,
    options: {
      onProgress: ({ transferredBytes, totalBytes }) => {
        if (totalBytes) {
          console.log(
            `Upload progress ${Math.round((transferredBytes / totalBytes) * 100)} %`
          );
        }
      }
    }
  }).result;
  console.log('Path from Response: ', result.path);
} catch (error) {
  console.log('Error : ', error);
}

The same callback is available when uploading with key:

import { uploadData } from 'aws-amplify/storage';

try {
  const result = await uploadData({
    key: 'album/2024/1.jpg',
    data: file,
    options: {
      onProgress: ({ transferredBytes, totalBytes }) => {
        if (totalBytes) {
          console.log(
            `Upload progress ${Math.round((transferredBytes / totalBytes) * 100)} %`
          );
        }
      }
    }
  }).result;
  console.log('Key from Response: ', result.key);
} catch (error) {
  console.log('Error : ', error);
}
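
Where it fits your UI, the same callback can drive a progress indicator instead of logging. A minimal sketch, assuming the page has a <progress> element (its id is illustrative):

// Drive a <progress id="upload-progress"> element from the callback,
// then pass onProgress in the uploadData options exactly as shown above.
const progressBar = document.getElementById('upload-progress');

const onProgress = ({ transferredBytes, totalBytes }) => {
  if (progressBar && totalBytes) {
    progressBar.max = totalBytes;
    progressBar.value = transferredBytes;
  }
};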

Pause and resume upload

The task returned by uploadData exposes methods to pause, resume, and cancel an in-progress request.

import { uploadData } from 'aws-amplify/storage';
// Pause and resume a task
const uploadTask = uploadData({ path, data: file });
//...
uploadTask.pause();
//...
uploadTask.resume();
//...
await uploadTask.result;

The same methods are available when uploading with key:

import { uploadData } from 'aws-amplify/storage';
// Pause and resume a task
const uploadTask = uploadData({ key, data: file });
//...
uploadTask.pause();
//...
uploadTask.resume();
//...
await uploadTask.result;
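
As a usage sketch, the task methods can be wired to UI controls; the button ids below are illustrative:

// Let the user pause and resume the in-flight upload from the page.
document
  .getElementById('pause-button')
  ?.addEventListener('click', () => uploadTask.pause());
document
  .getElementById('resume-button')
  ?.addEventListener('click', () => uploadTask.resume());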

Cancel upload

import { uploadData, isCancelError } from 'aws-amplify/storage';

const uploadTask = uploadData({ path, data: file });
//...
uploadTask.cancel();
try {
  await uploadTask.result;
} catch (error) {
  if (isCancelError(error)) {
    // Handle error thrown by task cancellation
  }
}

Cancellation works the same way when uploading with key:

import { uploadData, isCancelError } from 'aws-amplify/storage';

const uploadTask = uploadData({ key, data: file });
//...
uploadTask.cancel();
try {
  await uploadTask.result;
} catch (error) {
  if (isCancelError(error)) {
    // Handle error thrown by task cancellation
  }
}

Other options are also available; all of the following are optional:

import { uploadData } from 'aws-amplify/storage';

uploadData({
  path: 'public/album/2024/1.jpg',
  // Alternatively, path: ({identityId}) => `protected/${identityId}/album/2024/1.jpg`
  data: file,
  options: {
    // (String) The default content-type header value of the file when downloading it.
    contentType: 'text/html',
    // (String) The default content-encoding header value of the file when downloading it.
    contentEncoding: 'compress',
    // (String) Specifies presentational information for the object.
    contentDisposition: 'attachment',
    // (map<String>) A map of metadata to store with the object in S3.
    metadata: { key: 'value' },
    // (Boolean) Whether to use the S3 Transfer Acceleration endpoint.
    useAccelerateEndpoint: true
  }
});

If a page refresh occurs during the upload, re-initializing the upload with the same file will continue from the previous break point, as sketched below.
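
A minimal sketch of this behavior, reusing the illustrative path from the first example and assuming the user re-selects the same file after the refresh:

import { uploadData } from 'aws-amplify/storage';

// After the page reloads, start a new task for the same path with the same file;
// per the note above, the upload continues from the previous break point.
const resumedTask = uploadData({
  path: 'public/album/2024/1.jpg',
  data: file // the same file, e.g. re-selected by the user after the refresh
});

await resumedTask.result;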

Uploads that were initiated over one hour ago will be cancelled automatically. In some cases (e.g. the device goes offline or the user signs out) the incomplete file can remain in your S3 bucket. It is recommended to set up an S3 lifecycle rule to automatically clean up incomplete upload requests, along the lines of the sketch below.
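
A minimal sketch of such a rule using the AWS CDK (the app, stack, and bucket names are illustrative; if your bucket is provisioned by Amplify, attach the equivalent rule to that bucket instead):

import { App, Duration, Stack } from 'aws-cdk-lib';
import { Bucket } from 'aws-cdk-lib/aws-s3';

const app = new App();
const stack = new Stack(app, 'UploadStorageStack');

new Bucket(stack, 'UploadBucket', {
  lifecycleRules: [
    {
      // Abort multipart uploads that have not completed within one day,
      // so leftover parts do not keep accruing storage.
      abortIncompleteMultipartUploadAfter: Duration.days(1)
    }
  ]
});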

Browser uploads

Upload a file in the browser:

import { uploadData } from 'aws-amplify/storage';

const uploadDataInBrowser = async (event) => {
  if (event?.target?.files) {
    const file = event.target.files[0];
    uploadData({
      path: file.name,
      data: file
    });
  }
};

Or, when uploading with key:

import { uploadData } from 'aws-amplify/storage';

const uploadDataInBrowser = async (event) => {
  if (event?.target?.files) {
    const file = event.target.files[0];
    uploadData({
      key: file.name,
      data: file
    });
  }
};
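
A minimal usage sketch, assuming the page contains a file input (the element id is illustrative):

// Wire the handler to a file input, e.g. <input id="file-picker" type="file" />
document
  .getElementById('file-picker')
  ?.addEventListener('change', uploadDataInBrowser);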