AWS S3: Blank out your Bucket

Simple way to give your S3 bucket a fresh start when it is too filled with useless files.

Emptying an AWS S3 bucket is a bit of a chore; these are the easiest ways to do it (in my opinion).


AWS S3: Lifecycle Policy

If you are a patient soul (unlike me), then I’d suggest you create a lifecycle policy like the following:
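
As a rough sketch, here is an equivalent rule applied with boto3. It assumes you want everything (current objects, old versions, and incomplete multipart uploads) to expire after one day; the bucket name is a placeholder:

import boto3

s3_client = boto3.client('s3')

# Placeholder bucket name; swap in your own.
bucket_name = "your_bucket_name"

# Expire current objects after one day, and clean up old versions
# and incomplete multipart uploads on the same schedule.
s3_client.put_bucket_lifecycle_configuration(
    Bucket=bucket_name,
    LifecycleConfiguration={
        'Rules': [
            {
                'ID': 'empty-bucket',
                'Status': 'Enabled',
                'Filter': {'Prefix': ''},
                'Expiration': {'Days': 1},
                'NoncurrentVersionExpiration': {'NoncurrentDays': 1},
                'AbortIncompleteMultipartUpload': {'DaysAfterInitiation': 1},
            }
        ]
    }
)

The same rule can also be created by hand from the bucket's Management tab in the S3 console.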

I tried it out, and it deleted all of my files within 24 hours (not the quickest method, though).


AWS Lambda

Create an AWS Lambda function with the following Python code, which iterates through the S3 bucket, including any versioned files, and deletes everything in it:

import os

import boto3

REGION = os.environ['AWS_REGION']
s3_client = boto3.client('s3', region_name=REGION)


def lambda_handler(event, context):
    bucket_name = "sns-query-result-jean"  # replace with your bucket name
    return delete_bucket(bucket_name)


def delete_bucket(bucket_name):
    # Check whether versioning is enabled
    status = versioning_status(bucket_name)

    # Delete the bucket's files
    delete_bucket_files(bucket_name, status)
    return {"status": "Success"}


# Returns the bucket's versioning status ('Enabled', 'Suspended', or '')
def versioning_status(bucket_name):
    response = s3_client.get_bucket_versioning(Bucket=bucket_name)
    return response.get('Status', '')


# Deletes all S3 bucket files, including old versions and delete markers
def delete_bucket_files(bucket_name, status):
    # Suspend versioning if it is enabled, so no new versions accumulate
    if status == 'Enabled':
        s3_client.put_bucket_versioning(
            Bucket=bucket_name,
            VersioningConfiguration={'Status': 'Suspended'}
        )

    # List every object version and delete marker in the bucket
    page_iterator = get_paginator('list_object_versions', bucket_name)

    # Loop through each page, deleting markers and versions individually
    for page in page_iterator:
        for marker in page.get('DeleteMarkers', []):
            s3_client.delete_object(Bucket=bucket_name, Key=marker['Key'],
                                    VersionId=marker['VersionId'])
        for version in page.get('Versions', []):
            s3_client.delete_object(Bucket=bucket_name, Key=version['Key'],
                                    VersionId=version['VersionId'])

    # Sweep up any remaining (unversioned) objects
    page_iterator_v2 = get_paginator('list_objects_v2', bucket_name)
    for page in page_iterator_v2:
        for content in page.get('Contents', []):
            s3_client.delete_object(Bucket=bucket_name, Key=content['Key'])
    return "Success"


def get_paginator(element, bucket_name):
    paginator = s3_client.get_paginator(element)
    return paginator.paginate(Bucket=bucket_name)

Note: you must attach the following permissions to your AWS Lambda execution role (ideally with "Resource" scoped down to your bucket's ARN instead of "*"):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:DeleteObjectVersion",
                "s3:ListBucketVersions",
                "s3:ListBucket",
                "s3:GetBucketVersioning",
                "s3:DeleteObject",
                "s3:PutBucketVersioning"
            ],
            "Resource": "*"
        }
    ]
}
            
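Once the function is deployed, you can trigger it by hand; here is a minimal sketch with boto3, where empty-s3-bucket is a hypothetical function name:

import json

import boto3

lambda_client = boto3.client('lambda')

# 'empty-s3-bucket' is a hypothetical function name; use your own.
response = lambda_client.invoke(FunctionName='empty-s3-bucket')

# Should print {'status': 'Success'} once the bucket is emptied
print(json.loads(response['Payload'].read()))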


AWS CLI

The following command is probably the easiest method of all the previous options:

aws s3 rm --recursive s3://your_bucket_name
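One caveat: on a version-enabled bucket, this command only adds delete markers and leaves the old versions behind. If that's your case, here is a short boto3 sketch that purges everything, versions included (the bucket name is again a placeholder):

import boto3

# Placeholder bucket name; swap in your own.
bucket = boto3.resource('s3').Bucket('your_bucket_name')

# Delete every object version and delete marker (versioned buckets)
bucket.object_versions.delete()

# Delete any remaining objects (unversioned buckets)
bucket.objects.all().delete()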


Conclusion

These are simple but effective ways to give your AWS S3 buckets a fresh start, especially if you're like me and leave the clutter around for way too long.

Hope this was helpful. If you have other recommendations, please leave a comment.

As always, Thank you and Gracias!!!

