Event-Driven Automation: Creating a Text File upon EC2 Instance Lifecycle Events

Introduction

In the world of cloud computing, Amazon Elastic Compute Cloud (EC2) instances are the backbone of countless applications and services. As the need for automation and orchestration grows, it becomes essential to leverage the capabilities offered by EC2 instance lifecycle events. These events, triggered during various stages of an instance’s lifecycle, provide a powerful mechanism to perform custom actions and streamline operations.

In this blog post, we will explore how to harness the potential of EC2 instance lifecycle events by focusing on a specific use case: creating a text file in an S3 bucket when a lifecycle event occurs. We will dive into the underlying concepts, walk through the necessary setup, and demonstrate step by step how to achieve this automation using AWS services.

By the end of this tutorial, you will have a solid understanding of creating an event-driven trigger that can be used to kick off an action. In this simple example, we will use EC2 instance lifecycle events as the trigger, with the action of writing key fields of the event to a text file in S3.

In keeping with the Don't Repeat Yourself principle from software development, we will build everything below in Terraform.

Logical Flow

Tasks

  • Create S3
  • Create Lambda
  • Modify Lambda permission to write to S3
  • Create SNS Topic
  • Create SNS subscription to Lambda
  • Create Eventbridge rule to trigger action to SNS
  • Create an EC2 instance and confirm a successful workflow

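Everything that follows assumes a Terraform configuration with the AWS, Random, and Archive providers declared. A minimal sketch is below; the version constraint and region are assumptions, so adjust them to your environment.

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
    random = {
      source = "hashicorp/random"
    }
    archive = {
      source = "hashicorp/archive"
    }
  }
}

#region is an assumption - use whichever region you deploy to
provider "aws" {
  region = "us-east-1"
}
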
S3 Creation

We will create an S3 bucket named with a random 16-character lowercase alphanumeric string plus an ‘events’ suffix.

resource "random_string" "bucket_name" {
  length  = 16
  special = false
  #lower   = true
  upper = false
}

resource "aws_s3_bucket" "my_bucket" {
  bucket = "${random_string.bucket_name.result}-events"
}
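
Because the bucket name is partly random, it helps to expose the generated name so it can be dropped into the Lambda code later. An output block like this (an optional addition) does the trick:

output "events_bucket_name" {
  value = aws_s3_bucket.my_bucket.bucket
}

After terraform apply, the printed value is what goes into BUCKET_NAME in the Python file below.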

Lambda

Since Lambda requires code, I will be using Python to create a JSON file from the incoming event and send that file to the S3 bucket. I will start by zipping up the Python file and then upload that zip archive to the Lambda function that I will create.

Python Code To Be Saved As lambda_function.py

import json
import boto3

s3 = boto3.client('s3')
# replace with the name of the bucket created above (see the events_bucket_name output)
BUCKET_NAME = 'li6ofs6frg2j95w2-events'


def lambda_handler(event, context):
    # the event delivered by SNS is an envelope; the EC2 state-change detail
    # is a JSON string under Records[0]['Sns']['Message']. We store the whole
    # envelope as-is, keyed by the unique Lambda request ID.
    message = event
    file_name = f"{context.aws_request_id}.json"

    # write the event to the bucket
    s3.put_object(
        Body=json.dumps(message),
        Bucket=BUCKET_NAME,
        Key=file_name
    )

    return {
        'statusCode': 200,
        'body': json.dumps(f'Saved SNS message to {BUCKET_NAME}/{file_name}')
    }

Zip Python File

data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "lambda_function.py"
  output_path = "lambda_function.zip"
}

Lambda Function Creation

#variable used in multiple locations
variable "lambda_function_name" {
  default = "my-lambda-function"
}

#creation of lambda function with attributes
resource "aws_lambda_function" "test_lambda" {
  function_name    = var.lambda_function_name
  handler          = "lambda_function.lambda_handler"
  runtime          = "python3.9"
  role             = aws_iam_role.lambda_role.arn
  timeout          = 10
  memory_size      = 128
  filename         = "lambda_function.zip"
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256

  depends_on = [
    aws_iam_role_policy_attachment.lambda_logs,
    aws_cloudwatch_log_group.lambda_cwlog,
  ]
}

#Role for Lambda function
resource "aws_iam_role" "lambda_role" {
  name = "my-lambda-role"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
}

#creation of a log group for lambda function
resource "aws_cloudwatch_log_group" "lambda_cwlog" {
  name              = "/aws/lambda/${var.lambda_function_name}"
  retention_in_days = 14
}


#policy document allowing the Lambda to write to CloudWatch Logs
data "aws_iam_policy_document" "lambda_logging" {
  statement {
    effect = "Allow"

    actions = [
      "logs:CreateLogGroup",
      "logs:CreateLogStream",
      "logs:PutLogEvents",
    ]

    resources = ["arn:aws:logs:*:*:*"]
  }
}

#creating the logging policy from the policy document
resource "aws_iam_policy" "lambda_logging" {
  name        = "lambda_logging"
  path        = "/"
  description = "IAM policy for logging from a lambda"
  policy      = data.aws_iam_policy_document.lambda_logging.json
}

#attaching the policy for logging
resource "aws_iam_role_policy_attachment" "lambda_logs" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.lambda_logging.arn
}

#policy document granting the Lambda the ability to put objects into the S3 bucket
data "aws_iam_policy_document" "lambda_s3" {
  statement {
    effect = "Allow"
    actions = [
      "s3:PutObject"
    ]
    resources = ["${aws_s3_bucket.my_bucket.arn}/*"]
  }
}

#creating the S3 policy from the policy document
resource "aws_iam_policy" "lambda_s3" {
  name        = "lambda_s3"
  path        = "/"
  description = "IAM policy for s3 from a lambda"
  policy      = data.aws_iam_policy_document.lambda_s3.json
}

#attaching the policy for s3
resource "aws_iam_role_policy_attachment" "lambda_s3" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = aws_iam_policy.lambda_s3.arn
}

#allowing SNS to execute lambda function
resource "aws_lambda_permission" "with_sns" {
  statement_id  = "AllowExecutionFromSNS"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.test_lambda.function_name
  principal     = "sns.amazonaws.com"
  source_arn    = aws_sns_topic.my_topic.arn
}

Create SNS Topic and Subscription

#obtaining the current Account ID
data "aws_caller_identity" "current" {}

#create SNS topic
resource "aws_sns_topic" "my_topic" {
  name = "my-topic"
}

#create a subscription under the topic for lambda
resource "aws_sns_topic_subscription" "my_subscription" {
  topic_arn = aws_sns_topic.my_topic.arn
  protocol  = "lambda"
  endpoint  = aws_lambda_function.test_lambda.arn
}

#topic policy allowing the account owner to manage the topic (sid 1) and EventBridge to publish to it (sid 2)
resource "aws_sns_topic_policy" "my_sns_topic_policy" {
  arn    = aws_sns_topic.my_topic.arn
  policy = data.aws_iam_policy_document.my_custom_sns_policy_document.json
}

#policy data for SNS
data "aws_iam_policy_document" "my_custom_sns_policy_document" {
  policy_id = "__default_policy_ID"

  statement {
    sid = "1"
    actions = [
      "SNS:Subscribe",
      "SNS:SetTopicAttributes",
      "SNS:RemovePermission",
      "SNS:Receive",
      "SNS:ListSubscriptionsByTopic",
      "SNS:GetTopicAttributes",
      "SNS:DeleteTopic",
      "SNS:AddPermission",
    ]

    condition {
      test     = "StringEquals"
      variable = "AWS:SourceOwner"

      values = [
        data.aws_caller_identity.current.account_id
      ]
    }

    effect = "Allow"

    principals {
      type        = "AWS"
      identifiers = ["*"]
    }

    resources = [
      aws_sns_topic.my_topic.arn,
    ]

  }

  statement {
    sid = "2"
    actions = [
      "sns:Publish"
    ]
    effect  = "Allow"
    principals {
      type        = "Service"
      identifiers = ["events.amazonaws.com"]
    }
    resources = [
      aws_sns_topic.my_topic.arn
    ]
  }

}

Create EventBridge Rule

#create a rule to find changes to any EC2 state
resource "aws_cloudwatch_event_rule" "console" {
  event_pattern = jsonencode(
    {
      detail = {
        state = [
          "pending",
          "shutting-down",
          "stopping",
          "terminate",
          "stopped",
        ]
      }
      detail-type = [
        "EC2 Instance State-change Notification",
      ]
      source = [
        "aws.ec2",
      ]
    }
  )
  name = "ar"
}

#create a target to SNS whenever this rule is triggered
resource "aws_cloudwatch_event_target" "my_target" {
  rule      = aws_cloudwatch_event_rule.console.name
  target_id = "my-target"
  arn       = aws_sns_topic.my_topic.arn
}

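Create an EC2 Instance and Confirm the Workflow

To test the pipeline end to end, launch an instance and watch the bucket. The sketch below is one way to do it; the AMI ID is a placeholder and the instance type is only an assumption, so substitute values valid in your region.

resource "aws_instance" "test_instance" {
  ami           = "ami-xxxxxxxxxxxxxxxxx" #placeholder - replace with a valid AMI for your region
  instance_type = "t2.micro"

  tags = {
    Name = "lifecycle-event-test"
  }
}

Shortly after the instance enters the pending state (and again when it is stopped or terminated), a new JSON object should appear in the events bucket. A quick aws s3 ls against the bucket will confirm the workflow.
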
Conclusion

In conclusion, leveraging lifecycle events provides a powerful means of automation and orchestration in cloud computing. In this blog post, we explored a specific use case: creating a text file in an S3 bucket when an EC2 lifecycle event occurs. By understanding the underlying concepts, setting up the necessary components, and following the step-by-step guide using AWS services, we demonstrated how to achieve this automation. The knowledge gained here empowers you to create event-driven triggers, take actions based on the lifecycle events of other services, and integrate with other applications. By embracing automation and eliminating repetition through tools like Terraform, you can enhance the efficiency and scalability of your applications in the cloud.
