
AWS Serverless Services using Spring Boot

In this project, we will implement a fully serverless solution using AWS services, with Java Spring Boot as the processing framework. The solution works as follows:

The user invokes an API Gateway REST API to upload a file to S3.

An S3 event notification triggers a Lambda function.

The Lambda function processes the records in the file and pushes them to SQS.

Another Lambda function consumes the records from SQS and saves them into DynamoDB.

Here is the architecture diagram:

(Architecture diagram)

Step 1: Create an S3 bucket

Navigate to the AWS Console and search for S3. Click Create bucket.

(Screenshot: S3 console, Create bucket)

Give the S3 bucket a name and uncheck Block all public access. Keep all other settings as default.

(Screenshot: bucket name and public access settings)

This will create our S3 bucket.

Step 2: Create an API Gateway REST API

In the AWS Console, search for API Gateway, select REST API, and click Build.

(Screenshot: API Gateway REST API option)

Give it a name and a description, and click Create API.

(Screenshot: Create API form)

Once the API is created, click Create Resource.

Leave the resource path as it is and set the resource name to {bucket}. This means we will pass our bucket name as a path parameter.

Click Create Resource again and enter the resource name as {filename}, so the full path becomes /{bucket}/{filename}.

(Screenshot: resource tree with /{bucket}/{filename})

Let's now create an IAM role that allows API Gateway to access S3.

Navigate to IAM, click Create role, and select AWS service as the trusted entity type with API Gateway as the use case.

(Screenshot: IAM role creation for API Gateway)

Click Next through the wizard, enter the role name, and submit.

Navigate to the newly created role, open the Permissions tab, click Add permissions, and add the JSON below:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
```

Now, let us go back to API Gateway and click Create Method.

Select the method type as PUT and the AWS service as S3, as shown in the screenshot. All other details should be left as default.

(Screenshot: PUT method with S3 integration)

Now go to Integration request and edit the path variables so that the {bucket} and {filename} path parameters are mapped through to the S3 path.

(Screenshot: integration request path variables)

One last thing: go to the API settings. We need to accept binary file types, so add `*/*` under Binary Media Types.

(Screenshot: binary media types setting)

Now click Deploy API and enter the stage name as dev.

Once it is deployed successfully, we have now integrated API Gateway with S3.

Step 3: Create a Lambda function that triggers when a file gets uploaded to S3 and sends the data to Amazon SQS

Let's create a Lambda function using Spring Boot. Navigate to https://start.spring.io and download a skeleton project.

Add the below dependencies in pom.xml:

```xml
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.2.3</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>3.11.5</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.12.705</version>
</dependency>
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-csv</artifactId>
    <version>1.13.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-sqs</artifactId>
    <version>1.12.705</version>
</dependency>
```

Also, we need to add the maven-shade-plugin to build a fat jar for Lambda:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

Next, create a LambdaHandler class and add the code as shown below:

(Screenshots: LambdaHandler code)

This shows that we receive the file from S3 via the S3Event object, parse it into Java objects using the Commons CSV parser (the commons-csv dependency above), and then send each record to the SQS queue.
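Since the handler code appears only in screenshots, here is a minimal sketch of what such a handler could look like. It is an illustration under assumptions, not the repository's exact code: the package and class names, the QUEUE_URL environment variable, and the comma-separated message format are all hypothetical.

```java
package com.example;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class LambdaHandler implements RequestHandler<S3Event, String> {

    // The queue URL is assumed to arrive via an environment variable.
    private static final String QUEUE_URL = System.getenv("QUEUE_URL");

    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // One event can describe several uploaded objects; handle each of them.
        event.getRecords().forEach(record -> {
            String bucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey(); // note: keys arrive URL-encoded
            try (S3Object object = s3.getObject(bucket, key);
                 CSVParser parser = CSVParser.parse(
                         new InputStreamReader(object.getObjectContent(), StandardCharsets.UTF_8),
                         CSVFormat.DEFAULT.builder().setHeader().setSkipHeaderRecord(true).build())) {
                for (CSVRecord row : parser) {
                    // Push each CSV row to SQS as one comma-separated message.
                    sqs.sendMessage(QUEUE_URL, String.join(",", row));
                }
            } catch (Exception e) {
                throw new RuntimeException("Failed to process s3://" + bucket + "/" + key, e);
            }
        });
        return "processed";
    }
}
```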

In the AWS Console, navigate to Lambda and click Create function. Select the runtime as Java 17, enter the function name, and click Create.

(Screenshot: Lambda Create function form)

Now, let's upload the jar file: click on the Code tab and choose to upload a .zip or .jar file.

(Screenshot: Lambda code upload)

Upload our jar file and click Save.

One last thing: we need to edit the runtime settings and enter the fully qualified name of our handler class and method (of the form com.example.LambdaHandler::handleRequest), as shown in the screenshot:

(Screenshot: runtime settings with handler name)

Now, let's create the trigger for S3. We can do this either way: create an event notification in S3, or add a trigger in Lambda.

Let's create it in Lambda itself. Click Add trigger and select S3 as the source.

(Screenshot: Add trigger dialog with S3)

We can verify that it is now also reflected in the S3 event notifications:

(Screenshot: S3 event notification entry)

Let us now create the SQS queue where Lambda can push messages.

Click Create queue, enter the queue name, and choose the Standard queue type. Leave everything else as default.

(Screenshot: SQS Create queue form)

With this, Step 3 is completed, and we should now be able to invoke API Gateway and upload a file to S3. This in turn should trigger the Lambda function, which should push messages to SQS.
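To confirm that messages are actually arriving, a quick poll of the queue can help. This is a minimal sketch assuming default credentials and region configuration; the queue name my-records-queue is a placeholder for the one created above.

```java
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

public class QueuePeek {
    public static void main(String[] args) {
        AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
        // Resolve the queue URL from its name, then read a message.
        String queueUrl = sqs.getQueueUrl("my-records-queue").getQueueUrl();
        // receiveMessage returns at most one message by default; the message only
        // becomes invisible for the visibility timeout, so this peek is non-destructive.
        sqs.receiveMessage(queueUrl).getMessages()
           .forEach(m -> System.out.println(m.getBody()));
    }
}
```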

Step 4: Lambda to consume messages from SQS and save them into DynamoDB

We need to create a Lambda function that will pick up messages from SQS. Let's create a role for Lambda that grants permissions for SQS execution.

Click Create role and attach the AWSLambdaSQSQueueExecutionRole managed policy.

Also, as this Lambda function will write to DynamoDB, we need to add a customer-managed policy as shown below:

(Screenshot: role with attached policies)

Here is the custom policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DynamoDBIndexAndStreamAccess",
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetShardIterator",
        "dynamodb:Scan",
        "dynamodb:Query",
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords",
        "dynamodb:ListStreams"
      ],
      "Resource": [
        "arn:aws:dynamodb:us-east-1:215472211497:table/Books/index/*",
        "arn:aws:dynamodb:us-east-1:215472211497:table/Books/stream/*"
      ]
    },
    {
      "Sid": "DynamoDBTableAccess",
      "Effect": "Allow",
      "Action": [
        "dynamodb:BatchGetItem",
        "dynamodb:BatchWriteItem",
        "dynamodb:ConditionCheckItem",
        "dynamodb:PutItem",
        "dynamodb:DescribeTable",
        "dynamodb:DeleteItem",
        "dynamodb:GetItem",
        "dynamodb:Scan",
        "dynamodb:Query",
        "dynamodb:UpdateItem"
      ],
      "Resource": "arn:aws:dynamodb:us-east-1:215472211497:table/*"
    },
    {
      "Sid": "DynamoDBDescribeLimitsAccess",
      "Effect": "Allow",
      "Action": "dynamodb:DescribeLimits",
      "Resource": [
        "arn:aws:dynamodb::123456789012:table/*",
        "arn:aws:dynamodb::123456789012:table/*/index/*"
      ]
    }
  ]
}
```

Now, let's create a DynamoDB table. Enter the table name and set the partition key to "id". Everything else we can keep as default.

(Screenshot: DynamoDB Create table form)

Let's now create a Lambda function using Spring Boot. Navigate to https://start.spring.io and download a skeleton project.

Add the below dependencies:

```xml
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.2.3</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>3.11.5</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-sqs</artifactId>
    <version>1.12.705</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
    <version>1.12.705</version>
</dependency>
```

We will add the Lambda event handler to process the messages as shown below. We also create a DynamoDBMapper using the endpoint and region details:

(Screenshot: SQS event handler code)
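As with the first function, the code lives in screenshots, so here is a minimal sketch of what this handler could look like. The class name, region, and message format are assumptions; the Employee entity it saves is sketched after the next screenshot.

```java
package com.example;

import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

public class SqsDynamoHandler implements RequestHandler<SQSEvent, Void> {

    // Region assumed from the policy ARNs above; adjust to your account's setup.
    private final AmazonDynamoDB dynamo = AmazonDynamoDBClientBuilder.standard()
            .withRegion(Regions.US_EAST_1)
            .build();
    private final DynamoDBMapper mapper = new DynamoDBMapper(dynamo);

    @Override
    public Void handleRequest(SQSEvent event, Context context) {
        for (SQSEvent.SQSMessage message : event.getRecords()) {
            // Assuming the first Lambda sent comma-separated rows, split them back apart.
            String[] fields = message.getBody().split(",", -1);
            Employee employee = new Employee(); // entity class sketched below
            employee.setId(fields[0]);
            employee.setName(fields[1]);
            mapper.save(employee); // DynamoDBMapper writes the item to the table
        }
        return null;
    }
}
```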

We save the records through DynamoDBMapper, which gives repository-style save methods much like Spring Data JPA. An Employee entity class is created and annotated with the DynamoDB table annotations:

(Screenshot: Employee entity class)
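A minimal sketch of such an entity is shown below; the table name "Employee" and the attribute names are assumptions, but the partition key "id" matches the table created earlier.

```java
package com.example;

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

@DynamoDBTable(tableName = "Employee")
public class Employee {

    private String id;
    private String name;

    @DynamoDBHashKey(attributeName = "id") // matches the partition key chosen above
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @DynamoDBAttribute(attributeName = "name")
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```

DynamoDBMapper also requires a public no-argument constructor, which the compiler provides here by default.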

We can build the code. Next, go to Lambda, create a new function, and upload the jar file. Then update the runtime settings as explained for the earlier Lambda function.

We can now deploy the Lambda function.

It's time to test the end-to-end functionality:

Let's invoke the API Gateway endpoint using Postman (in the real world, this would be called by a front-end application) and upload the CSV file with a PUT to the dev stage's /{bucket}/{filename} path.

(Screenshot: Postman PUT request)
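As an alternative to Postman, a small Java 11+ client can make the same call; the API id, region, bucket, and file name in the URL below are placeholders to substitute with your own values.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class UploadClient {
    public static void main(String[] args) throws Exception {
        String url = "https://<api-id>.execute-api.<region>.amazonaws.com/dev/<bucket>/employees.csv";
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Content-Type", "text/csv")
                .PUT(HttpRequest.BodyPublishers.ofFile(Path.of("employees.csv")))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // A 200 response means API Gateway accepted the upload and wrote the object to S3.
        System.out.println(response.statusCode());
    }
}
```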

Now, let's examine the records in the DynamoDB table:

(Screenshot: DynamoDB items after the upload)

Congratulations! We have now successfully built a scalable, fully serverless, asynchronous, event-based system using AWS services.

Thank you

Rahul Ahuja
