S3 Glacier retrieval pricing per GB and per request
Glacier retrieval pricing is a two-part bill: a per-GB charge for the data retrieved and a per-request charge for each object restored. Understanding both is essential if you archive many small objects or restore frequently.
The two retrieval price components
- GB retrieved: volume of data restored and read.
- Retrieval requests: number of objects restored, billed per request (typically quoted per 1,000 requests).
Small-object amplification (why requests matter)
- Restoring 1 TB as 1,000,000 objects creates far more request fees than a few large objects.
- If you archive many small files, request fees can dominate retrieval cost.
- Packaging or batching into fewer objects can materially reduce request fees.
Tier choice changes cost
- Expedited tiers: higher $/GB and $/request pricing, but the shortest latency.
- Standard tiers: a balance between cost and latency.
- Bulk tiers: lowest cost, but slowest restores.
How to estimate retrieval pricing (quick method)
- Estimate retrieval GB/month (restores per month x avg restore size).
- Estimate retrieval requests/month (objects restored).
- Apply your tier pricing to both GB and requests.
- Run a peak scenario for backfills or audits.
Pricing formula (simple and defensible)
- Retrieval GB cost = retrieved GB/month x $/GB for the chosen tier.
- Request cost = (retrieval requests/month / 1,000) x $ per 1,000 requests.
- Total retrieval = GB cost + request cost.
Keep the two units separate. Even if GB cost is the larger line item, requests can spike during restores of many small objects.
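The formula above can be sketched as a small estimator. The tier prices in the example are placeholders, not current AWS rates; substitute the numbers from your region's pricing page.

```python
def retrieval_cost(gb_per_month, requests_per_month,
                   usd_per_gb, usd_per_1000_requests):
    """Two-part retrieval bill: GB retrieved + restore requests."""
    gb_cost = gb_per_month * usd_per_gb
    request_cost = (requests_per_month / 1000) * usd_per_1000_requests
    return gb_cost + request_cost

# Placeholder prices (not quoted AWS rates): 500 GB restored as
# 100,000 objects at $0.01/GB and $0.05 per 1,000 requests.
total = retrieval_cost(500, 100_000, usd_per_gb=0.01, usd_per_1000_requests=0.05)
# GB fees: 500 x 0.01 = $5.00; request fees: 100 x 0.05 = $5.00.
print(f"${total:.2f}")  # $10.00
```

Keeping the function signature in these two units (GB and requests) makes it easy to see which component dominates when inputs change.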
Worked example: small objects vs large objects
Suppose you restore 500 GB in a month. If those 500 GB are split into 2 million small objects, request fees can be significant. If the same 500 GB are stored as 5,000 large objects, request fees are tiny by comparison. The data volume is identical, but the request count is not. That is why object packaging and batch size matter.
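The comparison above in code, using a placeholder request price of $0.05 per 1,000 restore requests (an assumption for illustration, not a quoted AWS rate):

```python
USD_PER_1000_REQUESTS = 0.05  # placeholder rate, not a quoted AWS price

def request_fees(object_count):
    """Request-side cost only; GB fees are identical in both scenarios."""
    return (object_count / 1000) * USD_PER_1000_REQUESTS

# Same 500 GB restored two ways:
small = request_fees(2_000_000)  # 2,000,000 small objects
large = request_fees(5_000)      # 5,000 large objects
print(small, large)  # 100.0 0.25
```

At this placeholder rate the small-object layout pays 400x more in request fees for the same data volume, which is the amplification the section describes.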
Validation checklist
- Confirm retrieval GB from billing exports or restore job summaries (not stored GB).
- Count objects restored per month (or average objects per restore x restores/month).
- Separate routine restores from rare backfills or audits and model them as a peak scenario.
- Track the chosen retrieval tier and latency expectations for each workflow.
Estimate requests from object size (fast method)
- Requests/month ~= retrieval GB/month / avg object size (GB).
- Example: 500 GB retrieved with 5 MB objects -> about 102,400 objects.
- Use inventory reports to get real object size distribution if possible.
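The fast method above, using binary units so it reproduces the 5 MB example:

```python
GB = 1024**3  # binary units, matching the 5 MB -> 102,400 example
MB = 1024**2

def objects_restored(retrieval_gb, avg_object_size_bytes):
    """Approximate restore-request count from retrieval volume and
    the average object size (a rough proxy for the real distribution)."""
    return (retrieval_gb * GB) / avg_object_size_bytes

print(objects_restored(500, 5 * MB))  # 102400.0
```

An average hides skew, so prefer the real size distribution from S3 Inventory when you have it.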
Tier selection notes (cost vs urgency)
- Expedited tiers reduce latency but increase both GB and request pricing.
- Bulk tiers are cheapest for large restores that are not time critical.
- If you run both routine and emergency restores, model them separately.
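One way to model routine and emergency restores separately is to price each scenario with its own tier. The per-tier rates below are illustrative placeholders, not current AWS prices:

```python
# Illustrative placeholder rates per tier (not quoted AWS prices)
TIERS = {
    "expedited": {"usd_per_gb": 0.03,   "usd_per_1000_req": 10.0},
    "standard":  {"usd_per_gb": 0.01,   "usd_per_1000_req": 0.05},
    "bulk":      {"usd_per_gb": 0.0025, "usd_per_1000_req": 0.025},
}

def scenario_cost(gb, requests, tier):
    """Price one restore scenario against its chosen tier."""
    p = TIERS[tier]
    return gb * p["usd_per_gb"] + (requests / 1000) * p["usd_per_1000_req"]

# Hypothetical monthly workload: large, patient restores via bulk;
# small, urgent restores via expedited.
routine = scenario_cost(gb=50, requests=5_000, tier="bulk")
emergency = scenario_cost(gb=20, requests=200, tier="expedited")
print(routine, emergency)
```

Summing the scenarios gives the monthly total while keeping each workflow's tier choice visible for optimization.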
Where to get reliable inputs
- Restore jobs: track how many objects are restored per job and total bytes restored.
- S3 Inventory: use inventory reports to see object count and size distribution.
- Billing exports: confirm which usage types are driving charges (GB vs requests).
Optimization ideas that reduce request fees
- Bundle small files into larger archives before moving them to Glacier.
- Keep a manifest so you can restore only the needed subsets.
- Use a staging bucket to avoid repeated restore requests for the same objects.
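Bundling can be as simple as packing small files into tar archives before upload and keeping a manifest of which archive holds which file. A minimal sketch (file names and bundle naming are hypothetical):

```python
import io
import tarfile

def bundle(files: dict[str, bytes], bundle_name: str) -> tuple[bytes, dict]:
    """Pack small files into one gzipped tar archive; return the archive
    bytes plus a manifest mapping each file to its containing bundle."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))
    manifest = {name: bundle_name for name in files}
    return buf.getvalue(), manifest

archive, manifest = bundle({"a.log": b"alpha", "b.log": b"beta"},
                           "bundle-0001.tar.gz")
# Restoring "a.log" later means one restore request for bundle-0001.tar.gz
# instead of one request per small file.
print(manifest)
```

The manifest is what lets you restore only the bundles that contain the files you actually need, instead of rehydrating the whole archive set.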
Glossary (terms that affect cost)
- Restore request: the API call that initiates a rehydration job.
- Rehydration: the process of making archive objects readable again.
- Restore window: the temporary period when restored data stays accessible.
- Object fan-out: many small objects creating a large request count.
Common mistakes
- Modeling only GB retrieved and ignoring request fees.
- Using stored GB as a proxy for retrieval GB.
- Forgetting that retrieval tiers change both latency and price.
Related guides
Estimate Glacier/Deep Archive retrieval volume (GB and requests)
How to estimate archival retrieval costs: model GB restored per month and the number of objects retrieved (requests), plus common drivers like restores, rehydration, and analytics.
S3 Glacier Pricing & Cost Guide (storage, retrieval, Deep Archive)
Practical S3 Glacier cost model: storage GB-month, retrieval volume and requests, and minimum duration fees.
Estimate Secrets Manager API calls per month (GetSecretValue volume)
A practical workflow to estimate Secrets Manager API request volume (especially GetSecretValue): measure and scale when possible, model from runtime churn when not, and validate with CloudTrail so your budget survives peaks.
Glacier/Deep Archive cost optimization (reduce restores and requests)
A practical playbook to reduce archival storage costs: reduce restores, reduce small-object request volume, and avoid minimum duration penalties. Includes validation steps and related tools.
S3 request costs: when GET/PUT/LIST becomes meaningful
S3 request costs are often tiny, but they can matter for workloads with many small objects or high metadata churn. Learn how to estimate request fees and when to care.
S3 Glacier retrieval time: how long restores take by tier
A practical guide to S3 Glacier retrieval time: how restore tiers map to latency, what drives delays, and how to plan workflows without surprises.
Related calculators
RPS to Monthly Requests Calculator
Estimate monthly request volume from RPS, hours/day, and utilization.
API Request Cost Calculator
Estimate request-based charges from monthly requests and $ per million.
CDN Request Cost Calculator
Estimate CDN request fees from monthly requests and $ per 10k/1M pricing.
FAQ
Why do retrieval costs spike for small objects?
Retrieval pricing usually includes both GB retrieved and request fees. Many small objects can create large request counts even when GB is modest.
Do retrieval tiers change pricing?
Yes. Faster tiers are typically more expensive. Choose the slowest tier that meets your workflow latency needs.
What inputs should I track?
Track retrieval GB per month and objects retrieved per month. These two inputs explain most retrieval bills.
Last updated: 2026-01-30