How To Upload Files Directly To S3 Using Presigned URLs
What are presigned URLs in S3 and what are they useful for?
Uploading files to Amazon S3 is quite simple. But you can’t securely upload files straight from your frontend app, because that would mean shipping your AWS credentials to the browser.
Instead, you typically route uploads through a serverless function — like AWS Lambda — or through a server with a dedicated upload endpoint.
Lambda is a popular choice for S3 uploads because of how tightly it integrates with S3 and other AWS services.
However, an issue you will face when routing uploads through Lambda is the 6MB limit Lambda imposes on invocation payloads.
So the question is: how do you upload files larger than 6MB to S3 using Lambda?
Presigned URLs
A presigned URL is a URL that grants access to an object in an S3 bucket for a limited time.
The URL embeds a signature generated from the creator’s AWS credentials, so whoever holds the URL can perform the signed action without needing AWS credentials or explicit access to the bucket.
You set an expiration time when generating the URL, and once that time passes the URL stops working.
Some use cases are:
Allow users to upload or download files without giving them full access to your bucket
Bypass the Lambda 6MB payload limit by uploading files directly to S3
Provide users temporary access to your bucket with a one-time download link
Uploading A File Using a Presigned URL
Let’s see how we can upload files, small or large, directly to S3 by generating a presigned URL.
Here’s the general overview:
Create a PutObjectCommand with your bucket name and object key
Pass it to the getSignedUrl method from the S3 request presigner to generate a presigned URL
Use the presigned URL in your frontend code to make a PUT request directly to S3
Create a new Lambda function and copy the following code to generate a presigned URL:
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3Client = new S3Client({ region: "us-east-1" });

export const handler = async () => {
  const bucketName = "<your-bucket-name>";
  const key = "my-cat-picture.png";

  const params = {
    Bucket: bucketName,
    Key: key,
  };

  try {
    // Sign a PutObject (upload) request; the URL is valid for 5 minutes
    const command = new PutObjectCommand(params);
    const presignedUrl = await getSignedUrl(s3Client, command, { expiresIn: 300 });

    return {
      statusCode: 200,
      body: JSON.stringify({ url: presignedUrl }),
    };
  } catch (err) {
    console.error(err);
    return {
      statusCode: 500,
      body: JSON.stringify({ error: "Could not generate presigned URL" }),
    };
  }
};
All this code does is pass a key (typically a file name) and a bucket name to a PutObjectCommand.
We then hand that command to the getSignedUrl method from the S3 request presigner, which returns a presigned URL granting temporary permission to upload that object.
The Lambda function returns this presigned URL to the frontend client that invoked it.
Once the frontend client receives the presigned URL, it can make a PUT request to that URL with fetch, using the file’s contents as the request body.
Here’s a simple demonstration:
// Get the presigned URL from your Lambda function and pass it in below
const uploadFileToS3 = async (file, presignedUrl) => {
  try {
    // A presigned PutObject URL must be used with an HTTP PUT request
    const response = await fetch(presignedUrl, {
      method: "PUT",
      body: file,
    });

    if (response.ok) {
      console.log("File uploaded successfully!");
    } else {
      console.error("File upload failed:", response.statusText);
    }
  } catch (error) {
    console.error("Error during file upload:", error);
  }
};
The function above accepts a file object and the presigned URL returned by the Lambda function.
Making a PUT request to the presigned URL with the file as the request body uploads the file directly to S3. Since the file never passes through Lambda, the 6MB payload limit no longer applies.
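Putting the two pieces together, here’s a hedged sketch of the full flow on the frontend. The presign endpoint URL is hypothetical and stands in for wherever your Lambda is exposed (for example, behind API Gateway):

```javascript
// Fetch a presigned URL from the Lambda, then PUT the file straight to S3.
// The endpoint argument is a hypothetical URL in front of the Lambda above.
const uploadViaPresignedUrl = async (file, presignEndpoint) => {
  // Step 1: ask the Lambda for a presigned URL
  const presignResponse = await fetch(presignEndpoint);
  const { url } = await presignResponse.json();

  // Step 2: upload the file directly to S3 with a PUT request
  const uploadResponse = await fetch(url, { method: "PUT", body: file });
  if (!uploadResponse.ok) {
    throw new Error(`Upload failed: ${uploadResponse.statusText}`);
  }
  return url;
};
```

Note that only the small JSON response from step 1 passes through Lambda; the file bytes in step 2 go directly to S3, which is what sidesteps the payload limit.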
Conclusion
Using presigned URLs to upload files directly to S3 lets you bypass Lambda’s 6MB payload limit for larger files.
Presigned URLs also let you grant users temporary, scoped access to your S3 bucket.
The result is a simpler and more secure way of handling file uploads from your frontend.
I just launched a brand new newsletter that gives you valuable tips on saving money and reducing costs in the cloud — I’d love to have you join:
https://thecloudeconomist.beehiiv.com/subscribe
👋 My name is Uriel Bitton and I’m committed to helping you master Serverless, Cloud Computing, and AWS.
🚀 If you want to learn how to build serverless, scalable, and resilient applications, you can also follow me on Linkedin for valuable daily posts.
Thanks for reading and see you in the next one!