Serverless Setup in AWS with SQS, SNS, API Gateway & Lambda

Chinelo Osuji
Sep 17, 2023 · 18 min read

What is a Serverless Setup in AWS?

A serverless setup, or infrastructure, in AWS refers to the design and deployment of applications and services without the need to manage traditional servers or infrastructure components. Instead, serverless computing shifts the operational responsibilities from the user to the cloud provider, allowing developers to focus solely on writing code and building applications. With serverless computing, you pay only for the actual compute time used, which can lead to significant cost savings compared to maintaining traditional servers.

What Services Can I Use For This Kind of Setup?

AWS offers several serverless services and tools that enable this type of architecture, including but not limited to:

AWS Lambda

A serverless compute service that runs your code in response to events and automatically manages the compute resources. It eliminates the need for server provisioning, scales automatically, and charges you only for the compute time used.

Amazon API Gateway

A serverless API management service that enables you to create and publish RESTful APIs for your serverless functions. It provides features like API versioning, throttling, security, and integrates seamlessly with AWS Lambda for building serverless API backends.

Amazon DynamoDB

A fully managed, serverless NoSQL database service designed for applications that require single-digit millisecond latency. It scales automatically with your workload and offers seamless integration with AWS Lambda for real-time data processing.

Amazon SNS

Also called Simple Notification Service, SNS is a serverless messaging service that enables decoupled communication between microservices and distributed systems. It integrates seamlessly with AWS Lambda for event-driven processing, reacting to changes and notifications in real time.

Amazon SQS

Also called Simple Queue Service, SQS is a serverless message queuing service that decouples the components of cloud applications. In other words, it helps different parts of an application communicate with each other by storing messages and making sure they get delivered reliably.

AWS Step Functions

A serverless orchestration service that coordinates and manages workflows with visual state machines. It simplifies complex application workflows, provides detailed visibility, and handles retries and error handling. Step Functions are especially known for orchestrating multiple Lambda functions in serverless workflows.

Scenario

Let’s say we have a pharmaceutical company that runs their business through an online pharmacy. The company aims to ensure that their customers have access to the medicines they need, when they need them. As part of their commitment to customer care, they recognize the importance of timely medication refills and avoiding any gaps in their treatment due to expired prescriptions. To address this challenge, they have decided to implement a system that not only tracks each customer’s prescription details but also automatically sends out notifications as their prescriptions approach expiration.

Given the vast number of resources offered by AWS, the pharmaceutical company chooses to utilize its services for this project. They decide to deploy a serverless infrastructure consisting of Lambda functions with HTTP API triggers and an Oracle event source to receive events (prescription updates). These events can originate from an external system, such as the company’s online pharmacy website or an Oracle database that holds prescription information. They also implement an SQS queue to receive the messages (prescription information) from those events, SNS Topics to publish those messages as notifications to subscribers (in this case, customers), and a DynamoDB table to store the output of the SNS Topics. To automate the workflow, an AWS Step Function runs the Lambda functions in sequence, and an EventBridge rule schedules the Step Function to run at a specific time every 24 hours.

And if the company is using Oracle CRM for managing customer interactions, integrating prescription information into the same Oracle environment can streamline operations. Having prescription data in the same database can help to provide a dynamic view of each customer, including their prescription history, order status, and communication history.

With this architecture, the pharmaceutical company can ensure timely notifications to its customers, leading to increased customer trust and potentially more business as customers are more likely to get their prescriptions refilled before running out of medication.

Let’s get started…

The code below is our starting point. It sets up and configures an SQS environment for handling prescription updates. It defines two functions, check_expiry and check_refills, which check whether a prescription has expired and whether it has any refills remaining.

The main function creates an SQS client and attempts to create a Dead Letter Queue (DLQ), which is a specific queue that handles messages that cannot be successfully processed due to issues such as data format errors, code bugs, or temporary service outages. If the DLQ creation is successful, it extracts the DLQ URL and ARN and logs them.

The function also defines a Redrive policy for the primary queue, which means that if a message fails to be processed a certain number of times (in this case 5 times), it will be moved to the DLQ. It then creates the main SQS queue, extracts its URL and ARN, and logs them. Finally, it sets these queue URLs and ARNs as environment variables (QUEUE_URL, QUEUE_ARN, and DLQ_ARN) for later use, and exception handling is in place to log errors appropriately.
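A minimal sketch of such a script might look like this (the queue names PrescriptionUpdatesQueue and PrescriptionUpdatesDLQ are assumptions; the maxReceiveCount of 5 and the environment variable names match the description above):

import json
import logging
import os
from datetime import date, datetime

import boto3

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def check_expiry(expiration_date):
    # True if the prescription's expiration date has already passed.
    return datetime.strptime(expiration_date, "%Y-%m-%d").date() < date.today()

def check_refills(refills_remaining):
    # True if the prescription has no refills left.
    return int(refills_remaining) <= 0

def main():
    sqs = boto3.client("sqs")
    try:
        # Create the Dead Letter Queue first so its ARN can be referenced
        # in the main queue's Redrive policy.
        dlq_url = sqs.create_queue(QueueName="PrescriptionUpdatesDLQ")["QueueUrl"]
        dlq_arn = sqs.get_queue_attributes(
            QueueUrl=dlq_url, AttributeNames=["QueueArn"]
        )["Attributes"]["QueueArn"]
        logger.info("DLQ URL: %s | DLQ ARN: %s", dlq_url, dlq_arn)

        # After 5 failed processing attempts, a message moves to the DLQ.
        redrive_policy = json.dumps({"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "5"})

        queue_url = sqs.create_queue(
            QueueName="PrescriptionUpdatesQueue",
            Attributes={"RedrivePolicy": redrive_policy},
        )["QueueUrl"]
        queue_arn = sqs.get_queue_attributes(
            QueueUrl=queue_url, AttributeNames=["QueueArn"]
        )["Attributes"]["QueueArn"]
        logger.info("Main Queue URL: %s | Main Queue ARN: %s", queue_url, queue_arn)

        # Only persists for this process; the values are also logged above
        # so they can be pasted into the CloudFormation templates later.
        os.environ["QUEUE_URL"] = queue_url
        os.environ["QUEUE_ARN"] = queue_arn
        os.environ["DLQ_ARN"] = dlq_arn
    except Exception:
        logger.exception("Failed to create the SQS queues")

if __name__ == "__main__":
    main()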

We’re going to run this script in AWS Cloud9 IDE along with a few others throughout this project.

Go to Cloud9 in the AWS Console and create an environment.

Once you’re connected to your Cloud9 environment, open a terminal.

Before we do anything, let’s check if boto3 is installed by using the pip package manager.

Run pip show boto3 to get information about the boto3 package installed in your Cloud9 environment.

The output below indicates that the boto3 package is not installed in my environment.

So, I ran pip install boto3 to install the boto3 package.

I also ran pip show boto3 to verify that the package has been installed.

Now let’s create a New File for the Python script.

Copy and paste the script for creating the SQS Queue in the open field.

Save the file with the name of your choice.

Now run python <filename>.py to execute the script.

Below is an example of the output.

The output includes the DLQ URL, DLQ ARN, Main Queue URL, and Main Queue ARN. These values will be used later throughout this project.

Also, we can go to Simple Queue Service in the AWS Console and here we will see the Main SQS Queue and the DLQ that was created from executing the Python script in the Cloud9 environment.

Now let’s create the Python script we’re going to use for our 1st Lambda function. The code below initializes an SQS client and retrieves the URLs of the SQS queue and DLQ from environment variables. Depending on the event source, whether it’s an HTTP API request or an Oracle database event, it extracts specific data fields from the event, including email, phone number, prescription details, and customer information. It constructs a message body with this data and adds a timestamp. Next, it sends this message to the SQS queue specified by QUEUE_URL. If the process is successful, it logs a message indicating success and returns a response confirming the message's delivery. If any errors occur, it logs an error message and returns an error response.
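A minimal sketch of such a handler might look like the following (the payload field names mirror the test request used at the end of this article; QUEUE_URL is assumed to be set on the function as an environment variable by the CloudFormation template, and everything else is an assumption):

import json
import logging
import os
from datetime import datetime, timezone

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

sqs = boto3.client("sqs")
QUEUE_URL = os.environ["QUEUE_URL"]

def lambda_handler(event, context):
    try:
        # HTTP API requests arrive with the payload in "body"; an Oracle-style
        # event is assumed to pass the fields at the top level of the event.
        payload = json.loads(event["body"]) if "body" in event else event

        message = {
            "email_address": payload["email_address"],
            "phone_number": payload["phone_number"],
            "prescription_id": payload["prescription_id"],
            "medication_name": payload["medication_name"],
            "expiration_date": payload["expiration_date"],
            "refills_remaining": payload["refills_remaining"],
            "customer_name": payload["customer_name"],
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

        sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(message))
        logger.info("Prescription update sent to SQS")
        return {"statusCode": 200, "body": json.dumps({"status": "Message sent to SQS"})}
    except Exception as exc:
        logger.error("Failed to send message to SQS: %s", exc)
        return {"statusCode": 500, "body": json.dumps({"error": str(exc)})}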

Copy and paste the code below in your Notepad and save the file with the extension “.py”.

Then, create an S3 Bucket to store the .py file.

If you need assistance with this, please refer to my previous article on creating S3 Buckets.

Pack the .py file into a .zip file and upload it to the S3 bucket that you created.

Now we’re going to create the CloudFormation template that will use the Python script stored in the S3 bucket as the source code for the Lambda function.

To be more specific, the template creates a Lambda function that handles messages, specifying its runtime (Python 3.7), code location (the S3 bucket), execution role, and environment variables. It also creates an IAM role with a policy allowing the Lambda function to send messages to an SQS queue. Additionally, it defines an HTTP API using API Gateway, with a route that specifies how to handle POST requests. The integration connects the API Gateway to the Lambda function, invoking it when requests are made. And it grants the necessary permissions for the API Gateway to invoke the Lambda function. I don’t have an actual Oracle event source for this project, so the HTTP API will do the job.

Keep in mind that setting environment variables at runtime in the Python script isn’t permanent across Lambda executions; it only sets them for the current process. This means values assigned via os.environ in the script won’t persist across Lambda invocations.
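For example (the queue URL below is just a placeholder):

import os

# Only visible to the current process. A later Lambda invocation starts with
# its own environment, which is why the CloudFormation template passes
# QUEUE_URL to the function as a real Lambda environment variable instead.
os.environ["QUEUE_URL"] = "https://sqs.us-east-1.amazonaws.com/123456789012/ExampleQueue"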

After running the main function from the provided Python script, you’ll receive logging output for the created DLQ URL, DLQ ARN, and the main SQS queue URL and ARN. You will need to replace PLACE_YOUR_QUEUE_URL_HERE, PLACE_YOUR_MAIN_QUEUE_ARN_HERE, PLACE_YOUR_TOPIC_ARN_HERE, and PLACE_YOUR_DLQ_ARN_HERE in the CloudFormation template below with the appropriate values from the Python script’s logs. This means that the SNS Topic has to be created before deploying this template. I’m covering this next.

Also remember to replace <YOUR S3 BUCKET NAME GOES HERE> with the name of your S3 Bucket.

And replace <FILENAME OF 1ST LAMBDA FUNCTION FILE STORED IN S3 BUCKET> with the name of the Lambda function’s .zip file in your S3 Bucket.

Copy the code below and paste it in your Notepad and save the file with the extension “.yaml”.

Now let’s upload the template to CloudFormation.
Go to AWS CloudFormation and click Create stack.

On Step 1 page select Template is ready and Upload a template file.
Once you’ve selected the file, click Next.

On Step 2 page enter a Stack name and click Next.

On Step 3 page keep all default stack options and click Next.

On Step 4 page, scroll down and check the box next to I acknowledge that AWS CloudFormation might create IAM resources, then click Submit.

On the next page, we will see the stack creation in progress. Wait a few minutes for completion.

Next let’s create the SNS Topic that will be used to publish messages from the SQS Queue. The code below creates an SNS topic named PrescriptionNotifications and extracts its ARN. The ARN is stored as an environment variable for future use. Subsequently, the code sets an attribute for the SNS topic, specifying the display name as Prescription Notifications. Any successful actions are logged, while any exceptions during topic creation or attribute setting are caught and logged for error handling and debugging.
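A minimal sketch of that script might look like this (the topic name and display name come from the description above; storing the ARN in an environment variable only lasts for the current process, as noted earlier):

import logging
import os

import boto3

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

sns = boto3.client("sns")

try:
    # Create the topic and capture its ARN for use in the CloudFormation templates.
    topic_arn = sns.create_topic(Name="PrescriptionNotifications")["TopicArn"]
    os.environ["TOPIC_ARN"] = topic_arn
    logger.info("SNS Topic ARN: %s", topic_arn)

    # Friendly display name shown to subscribers.
    sns.set_topic_attributes(
        TopicArn=topic_arn,
        AttributeName="DisplayName",
        AttributeValue="Prescription Notifications",
    )
    logger.info("Display name set for topic %s", topic_arn)
except Exception:
    logger.exception("Failed to create or configure the SNS topic")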

Go back to your Cloud9 environment.

Create a New file for the script.

Copy and paste the script for creating the SNS Topic in the open field.

Save the file with the name of your choice.

Now run python <filename>.py to execute the script.

Below is an example of the output.

The output includes the SNS Topic’s ARN. We must use this value in our CloudFormation template as previously mentioned.

And if we go to Amazon SNS in the AWS Console, we can see the SNS Topic that was created from executing the script.

Now let’s create the Python script we’re going to use for our 2nd Lambda function. The code below defines the Lambda function that processes messages from the SQS Queue, declares custom exceptions for handling invalid phone numbers and email addresses, sets up clients for SNS and SQS, retrieves the necessary environment variables, and processes the prescription-related messages received from the queue.

To be more specific, this Lambda function is triggered by the messages that the first Lambda function sends to the SQS Queue. It receives these messages and processes them: it extracts the necessary data, such as prescription details, and publishes the original message to an SNS Topic. It also validates phone numbers and email addresses using external libraries and raises custom exceptions with specific error messages for invalid data. After processing, it deletes each message from the SQS Queue. Additionally, it performs checks on the prescription, such as expiration and refills, and sends notifications to the SNS Topic if certain conditions are met.
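A minimal sketch of that handler might look like the following (the regex checks stand in for the external validation libraries mentioned above, the QUEUE_URL and TOPIC_ARN environment variable names are assumptions, and the notification conditions are assumptions based on the expiry and refill checks described earlier):

import json
import logging
import os
import re
from datetime import date, datetime

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

sns = boto3.client("sns")
sqs = boto3.client("sqs")

QUEUE_URL = os.environ["QUEUE_URL"]
TOPIC_ARN = os.environ["TOPIC_ARN"]

class InvalidPhoneNumberError(Exception):
    pass

class InvalidEmailError(Exception):
    pass

def validate(message):
    # Simple regex checks stand in for the external validation libraries.
    if not re.fullmatch(r"\+\d{10,15}", message["phone_number"]):
        raise InvalidPhoneNumberError(f"Invalid phone number: {message['phone_number']}")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", message["email_address"]):
        raise InvalidEmailError(f"Invalid email address: {message['email_address']}")

def lambda_handler(event, context):
    for record in event.get("Records", []):
        message = json.loads(record["body"])
        try:
            validate(message)

            # Notify only when the prescription is expired or out of refills.
            expired = datetime.strptime(message["expiration_date"], "%Y-%m-%d").date() < date.today()
            out_of_refills = int(message["refills_remaining"]) <= 0
            if expired or out_of_refills:
                sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(message))
                logger.info("Published prescription %s to SNS", message["prescription_id"])

            # Remove the processed message from the queue.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=record["receiptHandle"])
        except Exception as exc:
            logger.error("Failed to process message: %s", exc)
    return {"statusCode": 200}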

Copy and paste the code below in your Notepad and save the file with the extension “.py”.

Then pack the .py file into a .zip file and upload it to the same S3 bucket that’s storing the first Lambda function.

Next, we’re going to create the CloudFormation template that will use the Python script stored in the S3 bucket as the source code for the 2nd Lambda function.

The template below creates a Lambda function responsible for handling messages from the SQS Queue, processing them, and publishing the results to the SNS Topic. The Lambda function is triggered by messages in the SQS Queue and is associated with an IAM role granting it permissions to interact with SQS and SNS. An HTTP API using API Gateway is defined with routes, integrations, and permissions to allow external HTTP POST requests to trigger the Lambda function.

This creates a communication flow where messages sent to the SQS Queue are processed by the Lambda function, resulting in notifications sent to customers via SNS. This architecture is commonly used for decoupling components in a microservices or event-driven system.

Keep in mind to replace <YOUR S3 BUCKET NAME GOES HERE> with the name of your S3 Bucket.

And replace <FILENAME OF 2ND LAMBDA FUNCTION FILE STORED IN S3 BUCKET> with the name of the Lambda function’s .zip file in your S3 Bucket.

Copy the code below and paste it in your Notepad and save the file with the extension “.yaml”.

Now let’s upload the template to CloudFormation.
Go to AWS CloudFormation and click Create stack.

On Step 1 page select Template is ready and Upload a template file.
Once you’ve selected the file, click Next.

On Step 2 page enter a Stack name and click Next.

On Step 3 page keep all default stack options and click Next.

On Step 4 page, scroll down and check the box next to I acknowledge that AWS CloudFormation might create IAM resources, then click Submit.

On the next page, we will see the stack creation in progress. Wait a few minutes for completion.

Next, let’s create the DynamoDB table that we will use to store the output of the SNS Topics.

The Python script below begins by establishing a connection to DynamoDB using the boto3 library. Then, it creates the DynamoDB table with various settings, including the table name, key schema, attribute definitions, provisioned throughput, and global secondary indexes. The code also includes error handling to manage different scenarios: it checks if the table already exists and logs a warning if so, logs AWS SDK-specific errors, and logs any other exceptions that might occur during the table creation process. Finally, it prints a confirmation message, including the table’s ARN, if the table is created successfully, and it uses a waiter to wait until the table creation process is finished.
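A minimal sketch of such a script might look like this (the table name, key schema, and index are assumptions; the error handling and waiter mirror the description above):

import logging

import boto3
from botocore.exceptions import ClientError

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

dynamodb = boto3.client("dynamodb")

TABLE_NAME = "PrescriptionUpdates"  # assumed table name

try:
    response = dynamodb.create_table(
        TableName=TABLE_NAME,
        KeySchema=[
            {"AttributeName": "prescription_id", "KeyType": "HASH"},
            {"AttributeName": "timestamp", "KeyType": "RANGE"},
        ],
        AttributeDefinitions=[
            {"AttributeName": "prescription_id", "AttributeType": "S"},
            {"AttributeName": "timestamp", "AttributeType": "S"},
            {"AttributeName": "customer_name", "AttributeType": "S"},
        ],
        ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
        GlobalSecondaryIndexes=[
            {
                "IndexName": "CustomerNameIndex",
                "KeySchema": [{"AttributeName": "customer_name", "KeyType": "HASH"}],
                "Projection": {"ProjectionType": "ALL"},
                "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
            }
        ],
    )
    print(f"Table created: {response['TableDescription']['TableArn']}")

    # Block until the table is fully created.
    dynamodb.get_waiter("table_exists").wait(TableName=TABLE_NAME)
except dynamodb.exceptions.ResourceInUseException:
    logger.warning("Table %s already exists", TABLE_NAME)
except ClientError:
    logger.exception("AWS SDK error while creating the table")
except Exception:
    logger.exception("Unexpected error while creating the table")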

Go back to your Cloud9 environment.

Create a New file for the script.

Copy and paste the script for creating the DynamoDB table in the open field.

Save the file with the name of your choice.

Now run python <filename>.py to execute the script.

Below is an example of the output. The output includes the DynamoDB table’s ARN.

We can go to DynamoDB in the AWS Console and here we will see the DynamoDB table we created from executing the script.

And here we can see the Global Secondary Indexes created by the script. They allow you to query and retrieve data using attributes other than the table’s primary key.
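For example, with the assumed table and index names from the sketch above, a Global Secondary Index lets you pull every update for a given customer without knowing the prescription IDs:

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("PrescriptionUpdates")  # assumed table name

# Query by customer name via the GSI instead of the table's primary key.
response = table.query(
    IndexName="CustomerNameIndex",  # assumed index name
    KeyConditionExpression=Key("customer_name").eq("Chinelo Osuji"),
)
print(response["Items"])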

Now let’s create the Python script we’re going to use for our 3rd Lambda function.

This Lambda function is designed to store prescription updates in DynamoDB. It processes messages from an SNS topic, extracts data from the messages, checks the prescription status (expiration and refills), and stores the prescription update in a DynamoDB table. It maintains counts of successfully processed messages and failed operations.

To be more specific, it starts by initializing a connection to DynamoDB and defining the name of the DynamoDB table to be used. The lambda_handler function is the entry point for processing events, which in this case are triggered by messages published to the SNS topic. Within this function, it iterates over each message in the event and attempts to extract various fields from the JSON content of each message. These extracted fields are related to prescription updates.

The store_prescription_update function handles the actual insertion of data into the DynamoDB table and ensures that timestamps are in the correct ISO 8601 format. Any errors during this process are logged. It also keeps track of the number of successfully processed messages and any failures.
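A minimal sketch of that handler might look like this (the table name matches the assumption used earlier; the exact fields extracted from each SNS message and the derived status value are assumptions based on the payload described above):

import json
import logging
from datetime import date, datetime, timezone

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("PrescriptionUpdates")  # assumed table name

def store_prescription_update(update):
    # Store the update with an ISO 8601 timestamp, as described above.
    update["timestamp"] = datetime.now(timezone.utc).isoformat()
    table.put_item(Item=update)

def lambda_handler(event, context):
    processed, failed = 0, 0
    for record in event.get("Records", []):
        try:
            # SNS delivers the published message as a JSON string.
            message = json.loads(record["Sns"]["Message"])

            # Re-check the prescription status before storing it.
            expired = datetime.strptime(message["expiration_date"], "%Y-%m-%d").date() < date.today()
            out_of_refills = int(message["refills_remaining"]) <= 0

            store_prescription_update({
                "prescription_id": message["prescription_id"],
                "customer_name": message["customer_name"],
                "medication_name": message["medication_name"],
                "expiration_date": message["expiration_date"],
                "refills_remaining": message["refills_remaining"],
                "status": "expired" if expired else ("no_refills" if out_of_refills else "active"),
            })
            processed += 1
        except Exception as exc:
            logger.error("Failed to store prescription update: %s", exc)
            failed += 1
    logger.info("Processed %d message(s), %d failure(s)", processed, failed)
    return {"processed": processed, "failed": failed}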

Copy and paste the code below in your Notepad and save the file with the extension “.py”.

Then pack the .py file into a .zip file and upload it to the same S3 bucket that’s storing the first 2 Lambda functions.

Below is an example of how your S3 Bucket Objects list should look at this point.

Now we’re going to create the CloudFormation template that will use the Python script stored in the S3 bucket as the source code for the 3rd Lambda function.

This template includes a Lambda function that’s configured to respond to events from both an SNS topic and an HTTP API endpoint. The Lambda function has an associated IAM role with specific permissions to access DynamoDB and SNS. An API Gateway is created to manage HTTP requests and route them to the Lambda function. The integration between the API Gateway and Lambda is specified in the SnsToDynamoHttpApiIntegration resource. Lastly, LambdaApiGatewayPermission grants permission for the API Gateway to invoke the Lambda function. This code sets up the infrastructure for the serverless application to handle events from both SNS and HTTP requests.

Keep in mind to replace <YOUR S3 BUCKET NAME GOES HERE> with the name of your S3 Bucket.

And replace <FILENAME OF 3RD LAMBDA FUNCTION FILE STORED IN S3 BUCKET> with the name of the Lambda function’s .zip file in your S3 Bucket.

Copy the code below and paste it in your Notepad and save the file with the extension “.yaml”.

Now let’s upload the template to CloudFormation.
Go to AWS CloudFormation and click Create stack.

On Step 1 page select Template is ready and Upload a template file.
Once you’ve selected the file, click Next.

On Step 2 page enter a Stack name and click Next.

On Step 3 page keep all default stack options and click Next.

On Step 4 page, scroll down and check the box next to I acknowledge that AWS CloudFormation might create IAM resources, then click Submit.

On the next page, we will see the stack creation in progress. Wait a few minutes for completion.

Now, to top it off, we can create a Step Function that coordinates the execution of these Lambda functions, handling prescription updates from initial processing through notification and storage in a structured, sequential manner and ensuring that each step of the workflow is executed as intended.

The Step Function starts with the execution of the 1st Lambda Function. Once the 1st Lambda Function completes its processing and returns a response, the Step Function moves on to the 2nd Lambda Function.

The 2nd Lambda Function is triggered by the messages that the 1st Lambda Function sends to the SQS queue. It processes these messages, publishes them to an SNS topic, performs additional checks, and deletes the processed messages from the SQS queue. After the 2nd Lambda Function completes its execution, the Step Function proceeds to the 3rd Lambda Function.

The 3rd Lambda Function processes messages from an SNS topic, stores prescription updates in DynamoDB, and keeps track of successful and failed operations.

Also, the template creates an EventBridge Rule named DailyPrescriptionUpdatesRule. This rule is scheduled to run at a specific time daily using a cron expression. In this case, 'cron(0 12 * * ? *)' means the rule triggers every day at 12:00 PM UTC. To convert this time to EST, you would subtract 5 hours, making it 7:00 AM EST. The target is the State Machine of the Step Function, which effectively schedules the execution of the State Machine.

Below is the CloudFormation template that we will use to create the Step Function and EventBridge Rule.

Copy the code below and paste it in your Notepad and save the file with the extension “.yaml”.

Keep in mind to replace the Resource for Lambda 1, 2, and 3 in the Step Function with the ARNs of the Lambda functions. You can go to Lambda in the AWS Console to copy and paste the ARNs in the code as shown below.
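For reference, the state-machine definition inside the template follows the Amazon States Language. A rough sketch of the three-step sequence, with placeholder ARNs and assumed state names, looks like this (shown as a Python dict purely for illustration):

import json

# Placeholder ARNs -- replace them with the ARNs copied from the Lambda console.
state_machine_definition = {
    "Comment": "Runs the three prescription-update Lambda functions in sequence",
    "StartAt": "SendPrescriptionUpdate",
    "States": {
        "SendPrescriptionUpdate": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:Lambda1",
            "Next": "ProcessQueueAndNotify",
        },
        "ProcessQueueAndNotify": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:Lambda2",
            "Next": "StoreUpdateInDynamoDB",
        },
        "StoreUpdateInDynamoDB": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:Lambda3",
            "End": True,
        },
    },
}

print(json.dumps(state_machine_definition, indent=2))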

Now let’s create another CloudFormation stack.

On Step 1 page select Template is ready and Upload a template file.
Once you’ve selected the file, click Next.

On Step 2 page enter a Stack name and click Next.

On Step 3 page keep all default stack options and click Next.

On Step 4 page, scroll down and check the box next to I acknowledge that AWS CloudFormation might create IAM resources, then click Submit.

On the next page, we will see the stack creation in progress. Wait a few minutes for completion.

We can go to Step Functions in the AWS Console and here we will see the State Machine we created from deploying the stack.

And we can go to EventBridge in the AWS Console and here we will see the Rule we created. If you click on the Rule, you can view the Event schedule based on your local time zone.

We’re almost done!

Now let’s test the Lambda function’s HTTP API trigger. I’m using Windows, so I used the PowerShell script below to send HTTP POST requests to the API Gateway endpoint with a JSON payload containing prescription-related details.

Keep in mind to replace <YOUR_API_GATEWAY_URL> with the Invoke URL of the Stage of the API.

$env:API_URL = "<YOUR_API_GATEWAY_URL>/send"

$JSON_PAYLOAD = @'
{
  "email_address": "example@example.com",
  "phone_number": "+17864444444",
  "prescription_id": "4444",
  "medication_name": "MedicineName",
  "expiration_date": "2023-09-15",
  "refills_remaining": 0,
  "pet_name": "Fluffy",
  "update_type": "expired",
  "customer_name": "Chinelo Osuji"
}
'@

$headers = @{
  "Content-Type" = "application/json"
}

try {
  $response = Invoke-RestMethod -Method Post -Uri $env:API_URL -Headers $headers -Body $JSON_PAYLOAD
  Write-Host "API Response: $response"
} catch {
  Write-Host "API Request Failed: $($_.Exception.Message)"
}

To retrieve the Invoke URL, go to API Gateway in the AWS Console and click on the API that’s associated with the 1st Lambda function.

On the left, click Stages and select prod.

On the right is the Invoke URL in the Stage details. Copy the URL and paste it in place of <YOUR_API_GATEWAY_URL>.

Now copy the entire script and paste it in the PowerShell terminal.

When we run the script, the output will indicate the message was sent to the SQS Queue.

And when we go to SQS in the AWS Console, we can now see the messages available in the queue as a result of running the script.

So, to conclude, we created a prescription management system where prescription updates are received via HTTP API or an Oracle event source. SQS is used as an intermediate message queue for handling updates. These updates are processed and stored in DynamoDB, and notifications are sent to customers through SNS when prescriptions are expired, about to expire, or out of refills. For a smooth workflow, we used a Step Function to run the Lambda functions in sequence and an EventBridge rule to schedule the Step Function to run daily at a specific time, automating the delivery of prescription updates to customers every day.

Keep in mind to delete the CloudFormation stacks so that you’re not charged for resources you don’t need. Since the DeletionPolicy was set to Delete for all resources, once we delete the stack, we do not have to go and delete each resource individually. You have the option to set the DeletionPolicy to Retain for any resources you do not want to have automatically deleted.

Thank you for taking the time to go through this process with me!

I hope you enjoyed this one because I sure did!

