The purpose of this blog post is to build a real-world DevOps automation project using Python, AWS Lambda, and Terraform. If you are interested in learning by building a practical DevOps project, this is for you.
A real-world DevOps automation project with AWS Lambda that you can add to your resume
Scenario
Our team receives numerous files from a third-party vendor, who uploads them to an S3 bucket. These files are suffixed with date stamps. Over time, we accumulated over a thousand files, which presented a challenge: the S3 console doesn't allow sorting objects by date once a path holds more than 1,000 objects.
Our team performs daily checks, downloading the current day’s file to process the information. However, they struggled to sort and locate the latest files efficiently. To address this issue, we developed a Lambda function that organizes files in a specific path into folders structured by year/month/day.
Implementation
- I will use Terraform to provision the Lambda function.
- I will use Python as the Lambda runtime.
- The Python script will pick up files uploaded to a path and move them into their respective year/month/day folders.
- An S3 event notification will trigger the Lambda whenever a new file is uploaded to that path in the bucket.
Prerequisites:
- Basic understanding of AWS services such as Lambda, S3, IAM, etc.
- Basic understanding of Python and the boto3 SDK
- Basic knowledge of Terraform
Project setup
Create the file structure as below. The .tf files will store the Terraform code, the Python code will live in lambda_functions/main.py, and pre-setup-script.sh will be used to test the Lambda function. This script creates random files in the bucket, and that upload event triggers the Lambda function.
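The layout could look like this (the project folder name is my own placeholder):

devops-lambda-project/
├── versions.tf
├── lambda.tf
├── lambda_functions/
│   └── main.py
└── pre-setup-script.sh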

Before we write Terraform, let me create a bucket with the name inbound-bucket-custome and a folder incoming. I will also create a bucket to store the Terraform state: my-backend-devops101-terraform.
Python script for Lambda function
lambda_functions/main.py
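Here is a minimal sketch of such a function. It assumes file names end with a yyyy-mm-dd date stamp (as the test script below produces) and that BUCKET_PATH holds a bucket-name/prefix value; the handler and variable names are my own placeholders:

import os

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # BUCKET_PATH is expected in the form "bucket-name/prefix"
    bucket, prefix = os.getenv("BUCKET_PATH").split("/", 1)

    # Paginate so we see all objects, even past the 1,000-object page limit
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=f"{prefix}/"):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            relative = key[len(prefix) + 1:]
            # Skip folder placeholders and files already moved into date folders
            if not relative or "/" in relative:
                continue

            # File names look like filename-123-2025-01-31.txt; the last three
            # dash-separated parts of the stem are year, month, and day
            stem, _ext = os.path.splitext(relative)
            parts = stem.rsplit("-", 3)
            if len(parts) < 4:
                continue  # no date stamp in the name, leave it alone
            _, year, month, day = parts

            # Copy the object into the year/month/day folder, then delete the original
            new_key = f"{prefix}/{year}/{month}/{day}/{relative}"
            s3.copy_object(
                Bucket=bucket,
                CopySource={"Bucket": bucket, "Key": key},
                Key=new_key,
            )
            s3.delete_object(Bucket=bucket, Key=key)

    return {"statusCode": 200}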

This code will perform the below functions:
- Retrieves the S3 bucket and file path from an environment variable (BUCKET_PATH).
- Lists all files in a specific directory (prefix) within the S3 bucket.
- For each file:
  - Extracts the year, month, and date from the file name.
  - Constructs a new path for the file based on this date information.
  - Copies the file to the new path.
  - Deletes the original file.
Terraform to deploy an AWS Lambda function with a Python runtime
Setting up the Terraform providers and S3 backend
versions.tf
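A minimal versions.tf could look like the sketch below. The region, state key, and version constraints are my assumptions; adjust them to your setup:

terraform {
  required_version = ">= 1.5"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
    archive = {
      source  = "hashicorp/archive"
      version = "~> 2.4"
    }
  }

  # Remote state in the bucket created earlier
  backend "s3" {
    bucket = "my-backend-devops101-terraform"
    key    = "lambda-file-organizer/terraform.tfstate"
    region = "us-east-1"
  }
}

provider "aws" {
  region = "us-east-1"
}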

Package the Python code as a zip file
The Lambda function requires the Python code as a zip file. I have used the Terraform data source archive_file to zip the Python function code; we provide the path to the Lambda function code and the destination where the zip file should be stored.
lambda.tf
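A sketch of the data source, assuming the paths from the project layout above:

data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/lambda_functions/main.py"
  output_path = "${path.module}/lambda_functions/main.zip"
}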

Create the Lambda function Terraform resource
- runtime -> python3.11
- filename -> location of the zip file with the Python code
- source_code_hash -> allows Terraform to update the Lambda when the code changes
- We will pass the bucket name and path as an environment variable; the Python code will access it using os.getenv("BUCKET_PATH")
lambda.tf
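A sketch of the resource; the function and role names are my own placeholders, and the role itself is defined in the next step:

resource "aws_lambda_function" "file_organizer" {
  function_name    = "s3-file-organizer"
  role             = aws_iam_role.lambda_exec_role.arn
  runtime          = "python3.11"
  handler          = "main.lambda_handler"
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256

  environment {
    variables = {
      # bucket-name/prefix, read by the Python code via os.getenv("BUCKET_PATH")
      BUCKET_PATH = "inbound-bucket-custome/incoming"
    }
  }
}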

Create a Lambda execution role
lambda.tf
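A sketch of the execution role, with a trust policy that lets the Lambda service assume it (the role name is a placeholder):

resource "aws_iam_role" "lambda_exec_role" {
  name = "lambda-file-organizer-role"

  # Trust policy: allow the Lambda service to assume this role
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}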

IAM policy that allows Lambda to access the files in the bucket
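A sketch of the policy; it grants list/get/put/delete on the bucket plus CloudWatch logging, and the policy name is a placeholder:

resource "aws_iam_policy" "lambda_s3_policy" {
  name = "lambda-file-organizer-policy"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["s3:ListBucket"]
        Resource = "arn:aws:s3:::inbound-bucket-custome"
      },
      {
        Effect   = "Allow"
        Action   = ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"]
        Resource = "arn:aws:s3:::inbound-bucket-custome/*"
      },
      {
        # Allow the function to write its logs to CloudWatch
        Effect   = "Allow"
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "*"
      }
    ]
  })
}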

Attach the policy with the required permissions to the Lambda execution role
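A sketch of the attachment, using the role and policy defined above:

resource "aws_iam_role_policy_attachment" "lambda_s3_attach" {
  role       = aws_iam_role.lambda_exec_role.name
  policy_arn = aws_iam_policy.lambda_s3_policy.arn
}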

Create an S3 trigger for the Lambda function
Lambda needs explicit permission to be invoked by S3, so we will create the Lambda permission before creating the S3 trigger.
lambda.tf
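A sketch of both resources; the depends_on ensures the permission exists before the notification is created, and the incoming/ filter matches the upload path:

resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowExecutionFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.file_organizer.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = "arn:aws:s3:::inbound-bucket-custome"
}

resource "aws_s3_bucket_notification" "incoming_trigger" {
  bucket = "inbound-bucket-custome"

  lambda_function {
    lambda_function_arn = aws_lambda_function.file_organizer.arn
    events              = ["s3:ObjectCreated:*"]
    filter_prefix       = "incoming/"
  }

  depends_on = [aws_lambda_permission.allow_s3]
}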

Create a bucket policy that will allow Lambda to get the bucket objects for the S3 notification trigger
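A sketch of the bucket policy, granting the execution role read access to the bucket and its objects:

resource "aws_s3_bucket_policy" "allow_lambda_access" {
  bucket = "inbound-bucket-custome"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { AWS = aws_iam_role.lambda_exec_role.arn }
      Action    = ["s3:GetObject", "s3:ListBucket"]
      Resource = [
        "arn:aws:s3:::inbound-bucket-custome",
        "arn:aws:s3:::inbound-bucket-custome/*",
      ]
    }]
  })
}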

Configure the AWS credentials
- Create an IAM user with only CLI access
- Create an access key and secret access key
- Install the AWS CLI
- Run aws configure
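The command prompts for the credentials and defaults; the region and output format below are example values:

$ aws configure
AWS Access Key ID [None]: <your-access-key-id>
AWS Secret Access Key [None]: <your-secret-access-key>
Default region name [None]: us-east-1
Default output format [None]: json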

Note: We should never expose sensitive information such as access keys or secret keys in the repo or any public place. I will remove these credentials before publishing the blog post.
Deploy the Terraform
# Initialize Terraform
terraform init
# Review the plan
terraform plan
# Apply the changes
terraform apply

This will deploy all the resources.
Let’s test it. As you can see in the screenshots below, there are no files in the bucket path yet.


Let me create some objects in the bucket, which should trigger the Lambda function.
I have created a short script that does this for us: pre-setup-script.sh
## pre-setup-script.sh ##
#!/bin/bash

# Define the S3 bucket name
S3_BUCKET="inbound-bucket-custome"

# Create 10 files with the format filename-randomnumber-yyyy-mm-dd
for i in {1..10}; do
  RANDOM_NUMBER=$((1 + RANDOM % 1000))
  FILENAME="filename-$RANDOM_NUMBER-$(date +%Y-%m-%d).txt"
  echo "This is file number $i" > "$FILENAME"
  aws s3 cp "$FILENAME" "s3://$S3_BUCKET/incoming/"
  rm "$FILENAME"  # clean up the local copy
done
We will run this script to upload files to the bucket.

This will trigger the Lambda function, which will move these files into the new folder structure.


That is all for this blog post. I hope you found it useful.
Connect with me on LinkedIn: https://www.linkedin.com/in/akhilesh-mishra-0ab886124/