How I Use Amazon S3 Storage Class Analysis To Save Money On My Storage
Storage Class Analysis gives me per-bucket insight into how my data is accessed so I can move it to other storage classes more efficiently
Amazon S3 Storage Class Analysis is a built-in S3 feature that gives you visibility into your storage access patterns.
It helps you make better decisions about moving data to cheaper storage classes.
In this article, I’ll show you how I’ve used this tool to help clients save hundreds of dollars on their S3 storage over time.
What Is S3 Storage Class Analysis?
See how [S3 Storage Class Analysis helps Canva save millions on storage costs](https://www.canva.dev/blog/engineering/optimising-s3-savings/).
S3 Storage Class Analysis is an AWS feature that monitors access patterns of the objects in your S3 buckets.
It helps identify data that is not being accessed frequently so you can consider moving it to a more cost-effective storage class such as S3 Standard-IA, S3 One Zone-IA, or S3 Glacier.
Instead of guessing which data is cold, you can rely on real access metrics provided by S3 itself.
Storage Class Analysis updates daily and can export its findings as CSV reports to an S3 bucket you choose, making it easy to track your data’s access patterns.
Here’s Why It Matters (For Costs)
Not all storage on S3 is the same.
AWS offers multiple S3 storage classes, each with different pricing models.
For example, the Standard class is optimized for frequent access but is also the most expensive per GB.
On the other hand, classes like Glacier are extremely cheap per GB but come with slower retrieval times and retrieval fees that add up if you access the data often.
Without understanding how often my data is accessed, I was hesitant to move files into lower-cost tiers. I didn’t want to risk making the wrong choice and paying more in retrieval fees than I’d save on storage.
With S3 Storage Class Analysis, you essentially remove the guesswork.
With it, you get per-bucket insight into which objects haven’t been accessed in 30, 60, or 90+ days.
How To Use S3 Storage Class Analysis
1. Set Up a Destination Bucket
The first step here is to create an S3 bucket where this analysis data will be stored.
In S3, create a new bucket and call it “my-analysis-bucket” (make sure the name is available).
The analysis data will be stored in CSV format.
2. Enable Storage Class Analysis
Next, navigate to the bucket you want to get analysis about.
Select the Metrics tab and scroll down to the Storage class analysis section.
Here you can click on the Create configuration button to create a new configuration.
Below, name your configuration and choose your filter type. Keeping the default selected lets you add fine-grained filtering by prefix or tags.
A good example is to limit the analysis to objects under a prefix such as “logs/” or “backups/”.
Enter the prefix you need, or use tags so the analysis runs only on objects with specific tags.
Next, under Export CSV, select the Enable radio button and choose a bucket to receive the analytics report (i.e. the bucket we created in step 1).
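If you prefer to script this step, the same configuration can be created through the S3 API. Here’s a minimal sketch of the payload for boto3’s `put_bucket_analytics_configuration` — the bucket names, configuration ID, and prefix are placeholder values from this walkthrough:

```python
# Builds the AnalyticsConfiguration payload mirroring the console steps above.
# Bucket names, the "logs-analysis" ID, and the prefixes are example values.

def build_analytics_config(config_id: str, prefix: str, dest_bucket: str) -> dict:
    """Payload for s3.put_bucket_analytics_configuration()."""
    return {
        "Id": config_id,
        # Restrict the analysis to objects under this prefix.
        "Filter": {"Prefix": prefix},
        "StorageClassAnalysis": {
            "DataExport": {
                "OutputSchemaVersion": "V_1",
                "Destination": {
                    "S3BucketDestination": {
                        "Format": "CSV",  # reports are exported as CSV
                        "Bucket": f"arn:aws:s3:::{dest_bucket}",
                        "Prefix": "analysis-reports/",
                    }
                },
            }
        },
    }

config = build_analytics_config("logs-analysis", "logs/", "my-analysis-bucket")

# To apply it (requires boto3 and AWS credentials):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_analytics_configuration(
#     Bucket="my-source-bucket",  # the bucket you want analyzed
#     Id="logs-analysis",
#     AnalyticsConfiguration=config,
# )
```

Scripting this is handy when you want the same analysis configuration applied consistently across many buckets.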
3. Monitor Trends
I typically review the reports weekly to look for patterns.
For example, if I see that certain prefixes (like “/logs/2024/”) haven’t been accessed in 90 days, I flag them for storage class transition.
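Since the reports land as CSV files, this review is easy to automate. Below is a sketch of the idea: scan the report for age buckets that hold data but show zero retrievals. The column names and sample rows here are simplified illustrations, not the exact export schema:

```python
import csv
import io

# Hypothetical two-metric extract from a Storage Class Analysis report.
# The real export has more columns, but the pattern is the same: look
# for age buckets where data sits in storage but is never retrieved.
sample_report = """\
ObjectAge,StorageMB,DataRetrievedMB
000-014,1200,340
015-029,900,85
030-044,850,2
045-089,700,0
090-119,650,0
"""

def cold_age_buckets(report_csv: str) -> list[str]:
    """Return age buckets with stored data but zero retrievals."""
    rows = csv.DictReader(io.StringIO(report_csv))
    return [
        row["ObjectAge"]
        for row in rows
        if float(row["StorageMB"]) > 0 and float(row["DataRetrievedMB"]) == 0
    ]

print(cold_age_buckets(sample_report))  # → ['045-089', '090-119']
```

Buckets that keep showing up in that list are the ones I flag for a lifecycle transition.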
4. Automate With Lifecycle Rules
Once I’m confident about the access patterns, I implement S3 Lifecycle Policies to automatically transition stale data to lower-cost classes.
For example, move objects to Standard-IA after 30 days, then to Glacier after 180 days. This eliminates the need for manual intervention.
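The transitions above map directly onto an S3 lifecycle rule. A minimal sketch of the rule payload for boto3’s `put_bucket_lifecycle_configuration` — the rule ID, prefix, and bucket name are example values:

```python
# One lifecycle rule matching the transitions described above:
# Standard-IA after 30 days, then Glacier after 180 days.

def build_lifecycle_rule(prefix: str) -> dict:
    """Build one rule for s3.put_bucket_lifecycle_configuration()."""
    return {
        # Example rule ID derived from the prefix.
        "ID": f"tier-down-{prefix.strip('/').replace('/', '-')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},
            {"Days": 180, "StorageClass": "GLACIER"},
        ],
    }

rule = build_lifecycle_rule("logs/2024/")

# To apply it (requires boto3 and AWS credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-source-bucket",
#     LifecycleConfiguration={"Rules": [rule]},
# )
```

Once the rule is in place, S3 handles the tiering on its own, so the savings compound without any manual work.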
5. Review and Iterate
Don’t just “set it and forget it.”
This is a periodic process: revisit the storage class analysis reports and tune your lifecycle rules.
Your app’s access patterns will change over time, and so should your storage strategy.
👋 My name is Uriel Bitton and I’m committed to helping you master Serverless, Cloud Computing, and AWS.
🚀 If you want to learn how to build serverless, scalable, and resilient applications, you can also follow me on LinkedIn for valuable daily posts.
Thanks for reading and see you in the next one!