Clean up expired #S3 files in #AWS using C#

If you use Amazon S3 to store temporary files, perhaps autogenerated and emailed to customers, or otherwise obsolete after a few hours, then it's a good idea to delete them: it keeps your storage costs low and prevents leaks of sensitive data later down the line.

Here is some C# code that lists all the files in a specified bucket and deletes any that are more than 12 hours old. This example is hard-coded to the EU West region (Ireland).

First, you need to install the NuGet package:

Install-package AWSSDK.S3
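
(If you prefer the .NET CLI to the Package Manager Console, dotnet add package AWSSDK.S3 does the same thing.)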

Then create a class called S3 with the following code. I've omitted the access keys, which you can get from IAM:

using Amazon.S3;
using Amazon.S3.Transfer;
using System;
using System.Collections.Generic;
using System.IO;
using Amazon.S3.Model;

namespace S3Cleanup
{
    /// <summary>
    /// Amazon S3 functionality
    /// </summary>
    public static class S3
    {
        private static readonly AmazonS3Config Cfg = new AmazonS3Config { RegionEndpoint = Amazon.RegionEndpoint.EUWest1 };
        private static readonly AmazonS3Client S3Client = new AmazonS3Client("xxxx", "xxx", Cfg);

        /// <summary>
        /// Lists every object in the bucket, following continuation markers
        /// for buckets that hold more than one page of results.
        /// </summary>
        public static IEnumerable<S3Object> ListBucket(string bucket)
        {
            var request = new ListObjectsRequest { BucketName = bucket };
            var lS3 = new List<S3Object>();
            do
            {
                var response = S3Client.ListObjects(request);
                lS3.AddRange(response.S3Objects);
                if (response.IsTruncated)
                {
                    // More results remain; continue listing from the marker.
                    request.Marker = response.NextMarker;
                }
                else
                {
                    request = null;
                }
            } while (request != null);
            return lS3;
        }

        /// <summary>
        /// Deletes a single object from the bucket.
        /// </summary>
        public static void DeleteObject(string bucket, S3Object s3Object)
        {
            var request = new DeleteObjectRequest { BucketName = bucket, Key = s3Object.Key };
            S3Client.DeleteObject(request);
        }

        /// <summary>
        /// Uploads a byte array under a random GUID filename and returns its public URL.
        /// </summary>
        public static string Upload(string bucket, byte[] data, string extension)
        {
            var ms = new MemoryStream(data);
            var filename = Guid.NewGuid().ToString("D") + extension;
            var fileTransferUtility = new TransferUtility(S3Client);
            var fileTransferUtilityRequest = new TransferUtilityUploadRequest
            {
                BucketName = bucket,
                InputStream = ms,
                StorageClass = S3StorageClass.ReducedRedundancy,
                Key = filename,
                CannedACL = S3CannedACL.PublicRead
            };
            fileTransferUtility.Upload(fileTransferUtilityRequest);
            return "https://s3-eu-west-1.amazonaws.com/" + bucket + "/" + filename;
        }
    }
}
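
As an aside, rather than embedding keys in the source, the AWS SDK can also resolve credentials automatically, from environment variables, the shared credentials file, or an IAM role. A minimal sketch of that approach, assuming the default credential chain is configured on the machine:

// Sketch: the keys come from the SDK's default credential chain
// (environment variables, ~/.aws/credentials, or an attached IAM role).
var client = new AmazonS3Client(Amazon.RegionEndpoint.EUWest1);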

Then you can write some code in your Main() method to do the following (note that the Count() call requires a using System.Linq; directive):

if (args.Length == 0)
{
    Console.WriteLine("Pass bucket name as argument");
    return;
}
var strBucketName = args[0];
var s3Objects = S3.ListBucket(strBucketName);
Console.WriteLine("Found " + s3Objects.Count() + " objects in bucket");
foreach (var s3object in s3Objects)
{
    // Skip anything modified within the last 12 hours; delete the rest.
    var age = DateTime.Now - s3object.LastModified;
    if (age.TotalHours <= 12) continue;
    Console.WriteLine("Deleting " + s3object.Key);
    S3.DeleteObject(strBucketName, s3object);
}

Then, if you set that up as a scheduled task, it will delete your old S3 files on a regular basis and keep your costs down.
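
As a rough sketch, on Windows you could register the compiled executable with Task Scheduler to run every hour; the path, task name, and bucket name below are placeholders:

schtasks /Create /SC HOURLY /TN "S3Cleanup" /TR "C:\Tools\S3Cleanup.exe my-bucket"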
