In this project, we will implement a full serverless solution using AWS services, with Java Spring Boot as the framework for the processing. We will implement the below solution :
User invokes the API Gateway REST API and uploads a file to S3.
The S3 event notification triggers a Lambda function.
The Lambda function processes the records in the file and pushes them to SQS.
Another Lambda function consumes the records from SQS and saves them into DynamoDB.
Here is the architecture diagram for the same :
Step 1 : Create the S3 bucket.
Navigate to the AWS Console and search for S3. Click Create Bucket.
Give the bucket a name and uncheck Block all public access. Keep all other settings as default.
This will create our S3 bucket.
Step 2 : Create the API Gateway REST API which will upload files to S3.
In the AWS Console, search for API Gateway, then select REST API and click Build.
Give the name and description and click Create API.
Once the API is created, click Create Resource :
The resource path should be left as it is, and the resource name should be {bucket}. This means we will pass our bucket name as a path parameter.
Click Create Resource again and enter the resource name as {filename}.
Let's now create an API Gateway role to access S3.
Navigate to IAM and select Create Role, choosing AWS service as the trusted entity type and API Gateway as the service.
Click Next, Next, then enter the role name and hit Submit.
Navigate to the newly created role and open the Permissions tab. Click Add permissions and add the below JSON :
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": "s3:", "Resource": "" } ] }
Now, let us go back to API Gateway and click Create Method.
Select the method as PUT and the AWS Service as S3, as shown in the screenshot. All other details should be left as default.
Now go to Integration Request and edit the path variables.
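The mapping typically looks like the following (assuming the integration's Path override was set to {bucket}/{filename} when creating the PUT method, as in the standard S3 proxy setup):

Path override : {bucket}/{filename}
URL path parameters :
    bucket   mapped from method.request.path.bucket
    filename mapped from method.request.path.filename

This substitutes the caller's path parameters into the S3 bucket name and object key.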
One last thing: go to Settings — we need to accept binary file types, so add */* under Binary Media Types.
Now click Deploy API and enter the stage name as dev.
Once it is deployed successfully, we have now integrated API Gateway and S3.
Step 3 : Create a Lambda function which will trigger when a file gets uploaded to S3 and send the data to Amazon SQS.
Let's create a Lambda function using Spring Boot. Navigate to https://start.spring.io and download the skeleton project.
Add the below dependencies in pom.xml :

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.2.3</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>3.11.5</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.12.705</version>
</dependency>
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-csv</artifactId>
    <version>1.13.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-sqs</artifactId>
    <version>1.12.705</version>
</dependency>
Also, we need to add the maven-shade-plugin to build a deployable jar for Lambda :

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <createDependencyReducedPom>false</createDependencyReducedPom>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>

Next, create a LambdaHandler class and add the code as shown below :
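Here is a minimal sketch of such a handler; it assumes the queue URL is supplied through a QUEUE_URL environment variable and that the uploaded CSV has a header row (the package name, class names, and message format are illustrative, not the exact code from the screenshots):

package com.example;

import java.nio.charset.StandardCharsets;

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

public class LambdaHandler implements RequestHandler<S3Event, String> {

    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
    private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
    // Assumption: the queue URL is passed in as a Lambda environment variable
    private final String queueUrl = System.getenv("QUEUE_URL");

    @Override
    public String handleRequest(S3Event event, Context context) {
        // One S3 event can carry multiple records; handle each uploaded object
        for (var rec : event.getRecords()) {
            String bucket = rec.getS3().getBucket().getName();
            String key = rec.getS3().getObject().getKey(); // may be URL-encoded

            try (S3Object obj = s3.getObject(bucket, key);
                 CSVParser parser = CSVParser.parse(
                         obj.getObjectContent(), StandardCharsets.UTF_8,
                         CSVFormat.DEFAULT.builder()
                                 .setHeader()
                                 .setSkipHeaderRecord(true)
                                 .build())) {
                for (CSVRecord row : parser) {
                    // Push each parsed record to SQS as a comma-joined message body
                    sqs.sendMessage(queueUrl, String.join(",", row.toList()));
                }
            } catch (Exception e) {
                throw new RuntimeException("Failed to process s3://" + bucket + "/" + key, e);
            }
        }
        return "Processed " + event.getRecords().size() + " file(s)";
    }
}

Sending one SQS message per CSV row keeps the downstream consumer simple and lets SQS absorb spikes when large files arrive.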
This shows we receive the file from S3 via the S3Event object, process it, and turn each row into a Java object by parsing it with the Apache Commons CSV parser. Once a Java object is formed, it is sent to the SQS queue.
In the AWS Console, navigate to Lambda and click Create Function. Select the runtime as Java 17, enter the function name, and click Create.
Now, let's upload the jar file. Click on the Code tab and choose to upload a .zip or .jar file.
Upload our jar file and click Save.
One last thing: we need to edit the runtime settings and enter the fully qualified name of our handler class and method, as shown in the screenshot (for the sketch above, that would be com.example.LambdaHandler::handleRequest).
Now, let's create the trigger for S3. We can do this either way: create an event notification in S3, or add a trigger in Lambda.
Let's create it in Lambda itself. Click Add trigger and select S3 as the source service.
We can verify that it is now also reflected in the S3 event notifications :
Let us now create the SQS queue where Lambda can push messages.
Click Create Queue. Enter the queue name and select the Standard queue type. Leave everything else as default.
With this, Step 3 is completed, and we should now be able to invoke API Gateway and upload a file to S3. This in turn should trigger the Lambda function, which should push messages to SQS.
Step 4 : Create a Lambda function which will pick messages from SQS and save them to DynamoDB.
Let's create a role for Lambda which gives permissions for SQS execution.
In IAM, click Create Role and attach the AWSLambdaSQSQueueExecutionRole managed policy.
Also, as this Lambda function will write to DynamoDB, we need to add one customer managed policy as well.
Here is the custom policy :
{ "Version": "2012-10-17", "Statement": [ { "Sid": "DynamoDBIndexAndStreamAccess", "Effect": "Allow", "Action": [ "dynamodb:GetShardIterator", "dynamodb:Scan", "dynamodb:Query", "dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:ListStreams" ], "Resource": [ "arn:aws:dynamodb:us-east-1:215472211497:table/Books/index/", "arn:aws:dynamodb:us-east-1:215472211497:table/Books/stream/" ] }, { "Sid": "DynamoDBTableAccess", "Effect": "Allow", "Action": [ "dynamodb:BatchGetItem", "dynamodb:BatchWriteItem", "dynamodb:ConditionCheckItem", "dynamodb:PutItem", "dynamodb:DescribeTable", "dynamodb:DeleteItem", "dynamodb:GetItem", "dynamodb:Scan", "dynamodb:Query", "dynamodb:UpdateItem" ], "Resource": "arn:aws:dynamodb:us-east-1:215472211497:table/" }, { "Sid": "DynamoDBDescribeLimitsAccess", "Effect": "Allow", "Action": "dynamodb:DescribeLimits", "Resource": [ "arn:aws:dynamodb::123456789012:table/", "arn:aws:dynamodb::123456789012:table//index/" ] } ] }
Now, let's create a DynamoDB table. Enter the table name and the partition key as "id". Everything else we can keep as default.
Let's now create the second Lambda function using Spring Boot. Navigate to https://start.spring.io and download the skeleton project.
Add the below dependencies :
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.2.3</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>3.11.5</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-sqs</artifactId>
    <version>1.12.705</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
    <version>1.12.705</version>
</dependency>
We will add the Lambda event handler to process the messages as shown below. We have also created a DynamoDB mapper using the endpoint and region details :
We have used Spring Boot JPA-style repository methods to save the details. An Employee entity class is created and annotated with the DynamoDB table annotations.
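As a reference, here is a minimal sketch of the consumer; it uses DynamoDBMapper directly rather than Spring repositories, and it assumes the producer Lambda sent comma-joined rows with id, name, and department columns (class, table, and field names are illustrative):

package com.example;

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;

public class SqsToDynamoHandler implements RequestHandler<SQSEvent, Void> {

    // Assumption: the table lives in us-east-1, matching the policy ARNs above
    private final AmazonDynamoDB dynamo =
            AmazonDynamoDBClientBuilder.standard().withRegion("us-east-1").build();
    private final DynamoDBMapper mapper = new DynamoDBMapper(dynamo);

    @Override
    public Void handleRequest(SQSEvent event, Context context) {
        for (SQSEvent.SQSMessage msg : event.getRecords()) {
            // Assumption: the producer sent "id,name,department" per message
            String[] parts = msg.getBody().split(",", -1);
            Employee e = new Employee();
            e.setId(parts[0]);
            e.setName(parts[1]);
            e.setDepartment(parts[2]);
            mapper.save(e); // writes one item into the table
        }
        return null;
    }

    // Illustrative entity; use your real table and attribute names
    @DynamoDBTable(tableName = "Employee")
    public static class Employee {
        private String id;
        private String name;
        private String department;

        @DynamoDBHashKey(attributeName = "id")
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }

        @DynamoDBAttribute(attributeName = "name")
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }

        @DynamoDBAttribute(attributeName = "department")
        public String getDepartment() { return department; }
        public void setDepartment(String department) { this.department = department; }
    }
}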
We can build the code. Next, go to Lambda, create a new function, and upload the jar file. Then update the runtime settings as explained for the earlier Lambda function.
We can now deploy the Lambda function.
It's time to test the end-to-end functionality :
Let's execute the API Gateway endpoint using Postman (in the real world, this will be called by a front-end application) and upload the CSV file.
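For reference, the request follows this pattern (the api-id placeholder comes from your own API Gateway console, and the sample rows match the id/name/department columns assumed in the sketches above):

PUT https://<api-id>.execute-api.us-east-1.amazonaws.com/dev/<bucket-name>/employees.csv

id,name,department
1,Alice,Engineering
2,Bob,Finance

In Postman, attach the CSV content as the binary body of the PUT request.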
Now, let's examine the records in the DynamoDB table :
Congratulations! We have now successfully built a scalable, fully serverless, asynchronous, event-based system using AWS services.
Thank you
Rahul Ahuja
