
    Slashing Our S3 Bill by 40%: The Magic of Lifecycle Policies

    Our data lake costs were spiraling, but most of the data was rarely touched. Learn how we used S3 Lifecycle Policies to automatically move cold data to cheaper storage tiers like Glacier, cutting our bill by 40% with zero code changes.

    8/11/2025, 8:00:00 PM
    AWS, S3, Cost Optimization, FinOps, Glacier

    Paying to Store Data Nobody Looks At

    Our data lake was growing—fast. Every day, new datasets landed in our S3 bucket, ready for analysis. But here's the catch: after about 30 days, most of that data was never touched again. Yet, we were still paying premium S3 Standard prices to store it. Our S3 bill was becoming one of our largest cloud expenses.

    We needed a way to automatically move this "cold" data to cheaper, long-term storage without an ongoing manual effort. The answer? S3 Lifecycle Policies.

    Set It and Forget It: Automated Storage Tiering

    S3 Lifecycle Policies are a powerful feature that lets you define rules to automatically transition objects to different storage classes based on their age. This means you can create a cost-optimization pipeline that runs on autopilot.

    Here's the lifecycle we defined for our data lake:

    graph TD
        A["S3 Standard<br>(Hot / Frequent Access)"] -- 30 Days --> B("S3 Standard-IA<br>(Warm / Infrequent Access)");
        B -- 90 Days --> C("S3 Glacier Instant Retrieval<br>(Cool / Archive)");
        C -- 180 Days --> D("S3 Glacier Deep Archive<br>(Cold / Long-Term Archive)");
        D -- 7 Years --> E(("Delete"));
    

    The Lifecycle Rule in Detail

    We translated this logic into a single lifecycle rule applied to our datasets/ prefix. The rule handles the transitions automatically and also cleans up old, noncurrent object versions to save even more money.

    Here is the exact JSON configuration we applied to the bucket (the 2555-day expiration works out to roughly seven years):

    {
      "Rules": [
        {
          "ID": "DataLakeTieringAndExpiration",
          "Status": "Enabled",
          "Filter": {
            "Prefix": "datasets/"
          },
          "Transitions": [
            {
              "Days": 30,
              "StorageClass": "STANDARD_IA"
            },
            {
              "Days": 90,
              "StorageClass": "GLACIER_IR"
            },
            {
              "Days": 180,
              "StorageClass": "DEEP_ARCHIVE"
            }
          ],
          "NoncurrentVersionTransitions": [
            {
              "NoncurrentDays": 30,
              "StorageClass": "ONEZONE_IA"
            }
          ],
          "Expiration": {
            "Days": 2555 // Approx. 7 years
          },
          "NoncurrentVersionExpiration": {
            "NoncurrentDays": 60
          }
        }
      ]
    }
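
    If you'd rather apply the rule from code than click through the console, boto3 exposes the same API. Here's a minimal sketch, assuming the JSON above is saved locally as lifecycle.json and that my-data-lake-bucket is a placeholder for your bucket name:

    import json
    import boto3

    BUCKET = "my-data-lake-bucket"  # placeholder -- substitute your data lake bucket

    s3 = boto3.client("s3")

    # Load the lifecycle rules shown above (saved locally as lifecycle.json).
    with open("lifecycle.json") as f:
        lifecycle = json.load(f)

    # Note: this call replaces the bucket's entire lifecycle configuration,
    # so include every rule you want to keep in the document.
    s3.put_bucket_lifecycle_configuration(
        Bucket=BUCKET,
        LifecycleConfiguration=lifecycle,
    )

    One call, and S3 takes over the tiering from there.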
    

    The Impact: 40% Savings with Zero Effort

    • 40% Cost Reduction: Within two months, our S3 storage costs for this bucket dropped by 40%, saving us thousands of dollars annually.
    • Zero Code Changes: We didn't have to modify a single line of application code. Reads stay transparent for objects in Standard-IA and Glacier Instant Retrieval; only data that has aged into Deep Archive needs a restore request before it can be read, an acceptable trade-off for data nobody has touched in six months. You can confirm objects are actually moving tiers with a quick listing, as sketched after this list.
    • Compliance and Cleanup: The policy ensures we automatically delete data after its retention period (7 years), helping with compliance and preventing indefinite data growth.
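
    To spot-check that transitions are happening, a quick tally of storage classes under the prefix is enough. Here's a minimal boto3 sketch (the bucket name is again a placeholder; for very large buckets, an S3 Inventory report or Storage Lens is the better tool):

    from collections import Counter
    import boto3

    BUCKET = "my-data-lake-bucket"  # placeholder

    s3 = boto3.client("s3")
    counts = Counter()

    # Walk every object under datasets/ and tally its current storage class.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix="datasets/"):
        for obj in page.get("Contents", []):
            counts[obj.get("StorageClass", "STANDARD")] += 1

    for storage_class, count in counts.most_common():
        print(f"{storage_class}: {count} objects")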

    S3 Lifecycle Policies are one of the most effective tools for controlling AWS storage costs. By analyzing our data access patterns and implementing a simple rule, we achieved significant savings with minimal effort.