Using Terraform to Create an Image Resizer with AWS Lambda and S3

Terraform is an open-source Infrastructure as Code (IaC) tool that allows you to define and provision cloud resources using a declarative language. Learn how Terraform, together with AWS, simplifies infrastructure management.

In today's cloud era, serverless architectures are increasingly favored for building scalable and cost-efficient applications. A typical use case is creating an image resizer service that processes images on-the-fly, which is especially beneficial for web applications that need to serve images in various sizes to enhance performance and user experience. This article will guide you through setting up an image resizer using AWS Lambda and S3, with Terraform managing the infrastructure.

Prerequisites

Before we start, ensure you have the following:

  1. AWS Account: An active AWS account.
  2. Terraform: Installed on your local machine. You can download it from the official Terraform website.
  3. AWS CLI: Installed and configured with your AWS credentials (see the quick check after this list).
  4. Basic understanding of AWS services and Terraform concepts.
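
To make sure your credentials and tooling are picked up correctly, you can run a quick sanity check from the terminal; both commands should succeed, and the first should print your AWS account ID:

aws sts get-caller-identity
terraform version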

Architecture overview

This is the overall architecture that we are going to deploy through Terraform, which involves both S3 and AWS Lambda. This serverless architecture allows for scalable, cost-efficient image processing without the need to manage underlying infrastructure. Our architecture encompasses the following components:

  • AWS S3 Buckets: Two S3 buckets will be created: one for storing the original images and another for the resized images. These buckets serve as the primary storage for our image processing pipeline. Two separate buckets are used to avoid an infinite resizing loop: if resized images were written back to the source bucket, each write would trigger the function again.
  • AWS Lambda Function: A Lambda function will be implemented to handle the resizing of images. This function will be triggered automatically whenever a new image is uploaded to the original image bucket.
  • IAM Roles and Policies: Appropriate IAM roles and policies will be configured to grant the necessary permissions for the Lambda function to interact with the S3 buckets.
  • S3 Bucket Notification: The original image S3 bucket will be set up to trigger the Lambda function upon new uploads, ensuring seamless and automated image processing.

Step 1: Set Up the S3 Buckets

We’ll create two S3 buckets: one for storing the original images and another for the resized images. Terraform makes this process straightforward.

Create a new directory for your Terraform project and a file named main.tf:

provider "aws" {
  region = "REGION"
}

resource "aws_s3_bucket" "original_images" {
  bucket = "your-original-images-bucket"
}

resource "aws_s3_bucket" "resized_images" {
  bucket = "your-resized-images-bucket"
}
# Replace "your-original-images-bucket" and "your-resized-images-bucket" with globally unique bucket names, and "REGION" with your AWS region.
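
If you prefer not to hardcode these values, you can optionally promote them to Terraform input variables and reference them as var.region, var.original_bucket_name, and var.resized_bucket_name in the blocks above. A minimal sketch (the variable names are illustrative):

variable "region" {
  description = "AWS region to deploy into"
  type        = string
}

variable "original_bucket_name" {
  description = "Globally unique name for the original images bucket"
  type        = string
}

variable "resized_bucket_name" {
  description = "Globally unique name for the resized images bucket"
  type        = string
}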

Step 2: Create the Lambda Function

Next, we’ll create a Lambda function that will resize the images. This function will be triggered whenever a new image is uploaded to the original images bucket. First, let’s write the Lambda function code in Python:

Create a file named lambda_function.py:

import io
from urllib.parse import unquote_plus

import boto3
from PIL import Image

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # The S3 event contains the bucket name and the (URL-encoded) object key.
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = unquote_plus(event['Records'][0]['s3']['object']['key'])

    # Download the original image and load it into Pillow.
    response = s3.get_object(Bucket=bucket, Key=key)
    image = Image.open(io.BytesIO(response['Body'].read()))

    # Convert to RGB (JPEG has no alpha channel) and resize to 100x100 pixels.
    resized_image = image.convert('RGB').resize((100, 100))

    # Write the resized image to an in-memory buffer.
    buffer = io.BytesIO()
    resized_image.save(buffer, 'JPEG')
    buffer.seek(0)

    # Replace 'your-resized-images-bucket' with the name of your resized images bucket.
    s3.put_object(
        Bucket='your-resized-images-bucket',
        Key=key,
        Body=buffer,
        ContentType='image/jpeg'
    )

    return {
        'statusCode': 200,
        'body': 'Image resized successfully'
    }

Since the function depends on an external package (Pillow), create a requirements.txt file with the following entry:

Pillow

Because Pillow is a third-party library with native dependencies, we package it as a Lambda layer. The layer can be built using the official AWS SAM build image for Python 3.9:

docker run -v "$PWD":/var/task public.ecr.aws/sam/build-python3.9 /bin/sh -c '
  mkdir -p packaged layer/python && \
  pip install -r requirements.txt -t layer/python && \
  cd layer && zip -q -r9 ../packaged/pil.zip python; \
  cd .. && rm -rf layer
'
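
If you would rather not run this command by hand, one option is to let Terraform drive the same Docker build through a null_resource with a local-exec provisioner. This is only a sketch, and it assumes Docker is available on the machine running Terraform; the layer resource defined in the next step would then also need depends_on = [null_resource.build_layer], and the zip must still exist before the first plan, since the layer resource reads it with filebase64sha256.

# Optional: rebuild the Pillow layer whenever requirements.txt changes.
resource "null_resource" "build_layer" {
  triggers = {
    requirements = filesha256("${path.root}/requirements.txt")
  }

  provisioner "local-exec" {
    command = <<-EOT
      docker run -v "$PWD":/var/task public.ecr.aws/sam/build-python3.9 /bin/sh -c '
        mkdir -p packaged layer/python && \
        pip install -r requirements.txt -t layer/python && \
        cd layer && zip -q -r9 ../packaged/pil.zip python; \
        cd .. && rm -rf layer'
    EOT
  }
}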

Add the following to your main.tf to package the function code into a ZIP file using the archive_file data source:

# This data source generates an archive from content, a file, or a directory of files.
data "archive_file" "function_package" {
  type        = "zip"
  source_file = "${path.root}/lambda_function.py"
  output_path = "${path.root}/packaged/lambda_function.zip"
}
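
The archive_file data source comes from the separate hashicorp/archive provider. Terraform installs it automatically during terraform init, but if you want to pin provider versions explicitly, you could add a block like the following (the version constraints are just examples):

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
    archive = {
      source  = "hashicorp/archive"
      version = "~> 2.0"
    }
  }
}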

Step 3: Define the Lambda Function in Terraform

Add the following to your main.tf file to create the Lambda function and the necessary IAM role:

# Lambda layer containing Python package
resource "aws_lambda_layer_version" "pillow_layer" {
  filename            = "${path.root}/packaged/pil.zip"
  layer_name          = "image-resizer-layer"
  source_code_hash    = filebase64sha256("${path.root}/packaged/pil.zip")
  compatible_runtimes = ["python3.9"]
}

# Lambda function Role definition
resource "aws_iam_role" "lambda_role" {
  name = "lambda_s3_exec_role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Action = "sts:AssumeRole",
        Effect = "Allow",
        Sid    = "",
        Principal = {
          Service = "lambda.amazonaws.com",
        },
      },
    ],
  })
}
resource "aws_iam_role_policy_attachment" "lambda_s3_policy" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}

# Custom policy granting the function read access to the original bucket
# and write access to the resized bucket
resource "aws_iam_policy" "policy" {
  name        = "lambda_s3_policy"
  description = "Allows the image resizer Lambda to read original images and write resized ones"

  policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Effect   = "Allow",
        Action   = ["s3:GetObject"],
        Resource = "${aws_s3_bucket.original_images.arn}/*"
      },
      {
        Effect   = "Allow",
        Action   = ["s3:PutObject"],
        Resource = "${aws_s3_bucket.resized_images.arn}/*"
      }
    ]
  })
}

# Attach the custom S3 policy to the Lambda role
resource "aws_iam_role_policy_attachment" "lambda_s3_policy" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.policy.arn
}

resource "aws_lambda_function" "image_resizer" {
  filename         = data.archive_file.function_package.output_path
  function_name    = "image-resizer"
  role             = aws_iam_role.lambda_role.arn
  handler          = "lambda_function.lambda_handler"
  runtime          = "python3.9"
  layers           = [aws_lambda_layer_version.pillow_layer.arn]
  source_code_hash = data.archive_file.function_package.output_base64sha256
}
# This allows S3 to invoke the Lambda function.
resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowExecutionFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.image_resizer.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.original_images.arn
}
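
As an aside, if you would rather not hardcode the destination bucket name inside lambda_function.py, the aws_lambda_function resource supports an environment block. A variant of the resource above that injects the bucket name (the handler could then read it with os.environ["RESIZED_BUCKET"], an illustrative variable name) might look like this:

resource "aws_lambda_function" "image_resizer" {
  filename         = data.archive_file.function_package.output_path
  function_name    = "image-resizer"
  role             = aws_iam_role.lambda_role.arn
  handler          = "lambda_function.lambda_handler"
  runtime          = "python3.9"
  layers           = [aws_lambda_layer_version.pillow_layer.arn]
  source_code_hash = data.archive_file.function_package.output_base64sha256

  environment {
    variables = {
      # Illustrative name; read in Python with os.environ["RESIZED_BUCKET"]
      RESIZED_BUCKET = aws_s3_bucket.resized_images.bucket
    }
  }
}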

Step 4: Configure S3 to Trigger Lambda

Finally, configure the original images S3 bucket to trigger the Lambda function on new uploads. Add the following to your main.tf file:

resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.original_images.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.image_resizer.arn
    events              = ["s3:ObjectCreated:*"]
  }

  depends_on = [aws_lambda_permission.allow_s3]
}
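
Optionally, you can also add outputs so the bucket names are printed after the apply, which makes the test step below a bit easier:

output "original_images_bucket" {
  value = aws_s3_bucket.original_images.bucket
}

output "resized_images_bucket" {
  value = aws_s3_bucket.resized_images.bucket
}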

Step 5: Deploy the Infrastructure

Now, let’s deploy the infrastructure. Run the following commands:

terraform init
terraform apply
# Confirm the apply by typing "yes" when prompted.

Note: Terraform will create a state file to track the resources it manages. Ensure this file is stored securely and not committed to version control.

Step 6: Test the Image Resizer

Simply upload an image to the original images S3 bucket. You should see a resized image appear in the resized images S3 bucket shortly after.
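
For example, assuming a local file named sample.jpg and the bucket names used earlier, you could test from the AWS CLI:

aws s3 cp sample.jpg s3://your-original-images-bucket/sample.jpg
# Give the function a few seconds, then list the destination bucket
aws s3 ls s3://your-resized-images-bucket/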

In our test, the original image was 79.5 KB, while the resized version written to the destination bucket was only 2.8 KB.

For more information on Terraform and AWS, check out the official Terraform documentation and AWS Terraform provider documentation.
