Navigating S3 Intelligent-Tiering Pricing: What You Need to Know

When setting up your account, it’s easy to see why S3 Intelligent-Tiering pricing is such an attractive option. For a small fee, you can hand off storage-tier management for your objects to Amazon and automatically reduce your costs over time. What’s the problem?

Well, that’s exactly what we’re going to explore today.

S3 Intelligent-Tiering is a strong option for anyone who isn’t familiar with S3’s pricing plans or for when you don’t have the time to manage objects yourself to reduce costs where possible. The problem arises when it’s treated as the set-it-and-forget-it option that many (including Amazon) seem to think it is.

It’s a great way to pay standard storage costs plus an overhead per 1,000 objects for the privilege.

But we’re getting ahead of ourselves. Before we can dive into when S3 Intelligent-Tiering pricing makes sense to go for and when it will instead inflate your costs, we need to cover the basics.

That’s why this post will guide you through:

  • What is S3 Intelligent-Tiering?
  • S3 Intelligent-Tiering pricing
  • Managing S3 Intelligent-Tiering costs
  • How to group and manage all of your S3 costs

Let’s get started.

What is S3 Intelligent-Tiering?


Like many other AWS products, S3 can be incredibly difficult to manage when it comes to getting the best deal for what you need. The pricing plans themselves are fairly simple, but it’s all too easy to miss the added costs that can quickly build up and the nuances that make each plan more cost-efficient for specific use cases.

For example, S3 Standard is the most expensive plan when it comes to raw storage but is the cheapest for requests to your bucket. That means that it’s ideal for smaller amounts of data that are frequently (if not constantly) being actively used. S3 Glacier Deep Archive is the opposite, with the cheapest storage-per-GB costs but some of the most expensive retrieval request costs.

There are a few nuances, such as the data retrieval costs of S3 Glacier Flexible Retrieval - Expedited being orders of magnitude above those of Glacier Deep Archive, but these mostly reflect differences in how quickly the data is retrieved. Each plan is a balancing act between storage, requests, and retrieval. Generally, the more expensive the storage, the lower the request costs and the shorter the retrieval time.

S3 Intelligent-Tiering is the most hands-off approach to managing your S3 buckets to try to optimize your pricing plans. Essentially, it’s a way to have Amazon automatically analyze your buckets, assess whether you’d pay less on a different tier, and move your objects onto that tier for you.

There is a small fee for having your S3 pricing plans managed automatically, but if you’re not experienced enough to know which plan you need (or which would be best for you), or if you simply don’t have time to optimize your plans yourself, it’s a great solution. The money you save by switching plans can far outweigh what you pay for S3 Intelligent-Tiering. That said, if your access patterns have stayed static for a long time and you don’t expect that to change anytime soon, you could simply move to a specific plan instead.

If you haven’t used S3 before, or you aren’t sure how often you’ll have the time to switch to a cheaper pricing plan yourself, it’s your best option for at least testing the waters.


Before we move on to the specifics of S3 Intelligent-Tiering pricing, you need to know how this plan works, because it’s not as simple as “your buckets will be transferred to the plan that lowers your bills as much as possible”. Instead, it’s entirely based on how long it’s been since each object was last accessed.

All data will start out on the Frequent Access tier. If objects aren’t accessed for 30 consecutive days, they will be moved on to the Infrequent Access tier. After 90 consecutive days of not being accessed, objects will be moved to the Archive Instant Access tier.

You also have the option to activate two asynchronous Archive Access tiers of S3 Intelligent-Tiering - Archive Access and Deep Archive Access. If you opt in to the former, all objects not accessed for 90 days will instead be moved directly onto the Archive Access tier. If you opt in to the latter, objects will be transferred to the Deep Archive Access tier after 180 days of not being accessed.

The key thing to remember with all of this is that accessing an object at any time will instantly put it back onto the Frequent Access tier, effectively resetting its timer.
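For reference, here is a minimal sketch of what opting in to those two asynchronous tiers looks like with boto3; the bucket name and configuration ID below are placeholders you would swap for your own values.

import boto3

s3 = boto3.client("s3")

# Opt in to the optional asynchronous tiers: objects untouched for 90 days
# move to Archive Access, and after 180 days to Deep Archive Access.
# "my-bucket" and the Id are illustrative assumptions, not real resources.
s3.put_bucket_intelligent_tiering_configuration(
    Bucket="my-bucket",
    Id="archive-after-90-and-180-days",
    IntelligentTieringConfiguration={
        "Id": "archive-after-90-and-180-days",
        "Status": "Enabled",
        "Tierings": [
            {"Days": 90, "AccessTier": "ARCHIVE_ACCESS"},
            {"Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS"},
        ],
    },
)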

S3 Intelligent-Tiering pricing

It’s time to get into the meat of S3 Intelligent-Tiering pricing, and there is one fact that needs to be held in mind at all times.

Bar one exception (which we’ll get into later on), there are no data retrieval costs. This is because as soon as objects are accessed, they are moved straight back onto the Frequent Access tier, which has no retrieval costs. That effectively means you can discard retrieval from your cost calculations.

First off, there’s a flat charge of $0.0025 per 1,000 objects (where objects are larger than 128 KB) to cover the management and automation of this plan. This applies no matter what, so it’s the first cost that you should note.

By default, all objects will initially be placed on the Frequent Access tier. This S3 Intelligent-Tiering pricing plan mirrors S3 Standard in almost every way. Storage on this tier costs $0.023 per GB per month for the first 50 TB, $0.022 per GB per month for the next 450 TB, and $0.021 per GB per month for anything beyond 500 TB. These figures do not include the free 5 GB of storage per month from AWS’ free tier.

Frequent Access also dictates the cost of any requests you make, since accessing an object instantly moves it back onto this tier. PUT, COPY, POST, and LIST requests cost $0.005 per 1,000 requests; GET, SELECT, and all other requests cost $0.0004 per 1,000 requests; and Lifecycle Transition requests cost $0.01 per 1,000 requests.
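To make those numbers concrete, here is a rough back-of-the-envelope estimator in Python that combines the monitoring fee, the tiered storage rates, and the request prices quoted above; the usage figures in the example call are invented purely for illustration.

def frequent_access_monthly_cost(storage_gb, objects, put_requests, get_requests):
    # Monitoring and automation fee: $0.0025 per 1,000 objects (larger than 128 KB).
    monitoring = objects / 1_000 * 0.0025

    # Tiered storage: $0.023/GB for the first 50 TB, $0.022/GB for the next
    # 450 TB, $0.021/GB beyond 500 TB (treating 1 TB as 1,024 GB).
    bands = [(50 * 1024, 0.023), (450 * 1024, 0.022), (float("inf"), 0.021)]
    storage = 0.0
    remaining = storage_gb
    for band_gb, price in bands:
        in_band = min(remaining, band_gb)
        storage += in_band * price
        remaining -= in_band
        if remaining <= 0:
            break

    # Requests: $0.005 per 1,000 PUT/COPY/POST/LIST, $0.0004 per 1,000 GET/SELECT.
    requests = put_requests / 1_000 * 0.005 + get_requests / 1_000 * 0.0004

    return monitoring + storage + requests

# Example (made-up workload): 10 TB across 2 million objects,
# 500,000 writes and 5 million reads in a month.
print(round(frequent_access_monthly_cost(10 * 1024, 2_000_000, 500_000, 5_000_000), 2))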

After 30 days of not being accessed, objects will be moved to the Infrequent Access tier. This costs $0.0125 per GB per month for storage and, as mentioned above, has no extra costs for data retrieval or requests.


After 90 days, objects shift onto the Archive Instant Access tier, which costs $0.004 per GB per month for storage. There are no extra charges for requests or data retrieval as these actions immediately put the object back onto Frequent Access.

You can opt in to two asynchronous access tiers too. The first, the Archive Access tier, overrides Archive Instant Access and takes in objects not accessed for 90 consecutive days. Storage costs are $0.0036 per GB per month, but you can incur data retrieval costs here if you choose to expedite retrievals. Expedited data retrieval costs $10.00 per 1,000 requests plus $0.03 per GB retrieved, both of which are well above the average for other plans where these charges apply.
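As a quick illustration of how those expedited charges add up, here is a small Python sketch; the request count and retrieval volume are assumed example figures, not real usage.

# Expedited retrieval from the Archive Access tier, using the rates above.
expedited_requests = 50   # assumed number of expedited retrieval requests
retrieved_gb = 100        # assumed amount of data retrieved, in GB

cost = expedited_requests / 1_000 * 10.00 + retrieved_gb * 0.03
print(f"Expedited retrieval cost: ${cost:.2f}")  # $0.50 in requests + $3.00 per GB = $3.50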

The second optional tier is Deep Archive Access, to which objects will be moved after 180 days of not being accessed if the tier is active. Storage costs here are $0.00099 per GB per month, and there are no additional costs for requests or data retrieval.

There are additional charges associated with data transfer (particularly data transfer out of your S3 buckets to the internet), storage management and analytics features such as Amazon S3 Inventory, data replication and S3 Object Lambda. In the interest of keeping this article focused on S3 Intelligent-Tiering Pricing, however, we’re going to skim over these. You can see the full list of potential bonus charges for S3 usage in our wider S3 pricing guide.

Managing S3 Intelligent-Tiering costs


The most important thing to remember when trying to manage and optimize your S3 Intelligent-Tiering pricing is how your objects are moved from one tier to another. The longer you go without accessing an object, the lower your base storage costs will be according to the tier it’s switched to (after 30 and 90 days without access, or 180 days if you’ve enabled the optional archive tiers).

As soon as you access an object it will immediately be switched back to the Frequent Access tier which will increase storage costs back to their maximum on Intelligent-Tiering. So, if you’re going to be accessing an object at least once every 30 days, you’ll end up paying the same price as S3 Standard but with the general management fee on top, making it pointless. At that point you’re better off switching to the Standard plan.

However, this S3 pricing plan suits use cases such as data lakes, analytics, and even user-generated content rather well. Items such as these are subject to unpredictable requests and access patterns, meaning that you won’t be able to reliably anticipate which plan will be the cheapest. Although you have to pay a flat rate of $0.0025 per 1,000 objects, it can still be worthwhile to have your objects automatically move to a cheaper storage plan based on how long it’s been since they were last accessed.
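To see how that trade-off plays out, here is a minimal break-even sketch comparing Intelligent-Tiering against S3 Standard for a single month; the object count and the share of data sitting in Infrequent Access are assumptions you would replace with your own usage statistics.

# Assumed workload: 1 TB spread over 1 million objects.
storage_gb = 1024
objects = 1_000_000
infrequent_share = 0.5  # assumed fraction of data untouched for 30+ days

standard = storage_gb * 0.023  # S3 Standard storage only

monitoring_fee = objects / 1_000 * 0.0025  # $2.50/month at this object count
intelligent = (
    storage_gb * (1 - infrequent_share) * 0.023   # still on Frequent Access
    + storage_gb * infrequent_share * 0.0125      # moved to Infrequent Access
    + monitoring_fee
)

print(f"S3 Standard:         ${standard:.2f}/month")
print(f"Intelligent-Tiering: ${intelligent:.2f}/month")
# If almost every object is touched within 30 days, infrequent_share drops
# toward 0 and Intelligent-Tiering becomes Standard plus the monitoring fee.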

If you don’t have the team size, time, or expertise necessary to confidently adjust the pricing plan of your objects based on their access patterns, S3 Intelligent-Tiering is a good way to still make use of the reduced storage costs of pricing plans based on infrequent access. However, you should always keep an eye on your usage statistics to make sure that you’re not paying extra for objects that could otherwise be stored on a static plan that has the same storage costs, but without the flat charge for automatic management.

How to group and manage all of your S3 costs


The core takeaway from analyzing S3 Intelligent-Tiering pricing is that you can usually save money by manually shifting objects to their ideal access plan instead of letting S3 automatically do it and paying for the privilege. 

What if you don’t have the time or knowledge to do that? How can you cut costs without having to track and manage every last object you store in S3?

That’s where Aimably’s Cost Reduction Assessment comes in.

Our assessment will take all of your Cost and Usage Report (CUR) data and analyze it for you. By combining your statistics with our knowledge of anything and everything AWS related, we can present you with a list of every action you can take to cut your AWS costs and the risk associated with each one, and tie everything back to your business’ goals and performance.

For example, the Cost Reduction Assessment will do the same job as S3 Intelligent-Tiering in showing you the optimal pricing plans for your S3 objects without having to pay for the privilege of automatically moving them to their ideal plan. With static plans you also won’t be penalized as much for accessing an object - it won’t take at least 30 days for your basic storage costs to start to drop again.

Not to mention that our assessment will look at your entire AWS account and offer suggestions, rather than just your S3 usage. There really is no better way to see the ideal setup to get the most out of AWS without having to pay through the nose for the privilege.

Reduce your S3 costs today by getting our Cost Reduction Assessment!
