Thursday, August 17, 2017

Creating a Thumbnail using AWS Lambda (Serverless Architecture)

Introduction to AWS Lambda

In an earlier blog here, we discussed AWS Lambda, which is a FaaS (Function as a Service) offering, with a simple example. In Lambda the granularity is at the function level, and the pricing is based on the number of times a function is called, so the cost is directly proportional to the growth of the business. We don't need to think in terms of servers (serverless architecture); AWS automatically scales the resources as the number of calls to the Lambda function increases. We specify the amount of memory to allocate to the Lambda function, and the function is automatically allocated a proportional amount of CPU. Here is the FAQ on Lambda.
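Since the pricing is per invocation plus compute time, a back-of-the-envelope cost estimate is straightforward. The sketch below uses illustrative rates (the per-request and per-GB-second prices are assumptions for the example; check the current AWS pricing page), just to show that cost grows linearly with usage:

```java
public class LambdaCostSketch {
    // Illustrative prices (assumptions, not official figures):
    // $0.20 per million requests, $0.00001667 per GB-second of compute.
    static final double PRICE_PER_MILLION_REQUESTS = 0.20;
    static final double PRICE_PER_GB_SECOND = 0.00001667;

    // Monthly cost for a function, given invocation count, allocated memory
    // in GB and average execution time in seconds. Both terms scale
    // linearly with the number of invocations.
    public static double monthlyCost(long invocations, double memoryGb, double avgSeconds) {
        double requestCost = invocations / 1_000_000.0 * PRICE_PER_MILLION_REQUESTS;
        double computeCost = invocations * memoryGb * avgSeconds * PRICE_PER_GB_SECOND;
        return requestCost + computeCost;
    }
}
```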

Amazon has published a nice article here on how to create a Lambda function that gets triggered by an image uploaded to S3 and then automatically creates a thumbnail of it in a different S3 bucket. The same approach can be used for a photo-sharing site like Flickr or Picasa. The article is detailed, but there are a lot of steps involving the CLI (Command Line Interface), which is not a piece of cake for those who are just starting with AWS. In this blog we will look at the sequence of steps for the same using the AWS Web Management Console.


Sequence of steps for creating an AWS Lambda function

Here we go with the assumption that Eclipse and the Java 8 SDK have already been set up on the system and that the participants already have an AWS account. For the purposes of this article, the IAM, S3 and Lambda resources consumed fall under the AWS Free Tier.

Step 1: Start the Eclipse. Go to 'File -> New -> Project ...' and choose 'Maven Project' and click on Next.


Step 2: Choose to create a simple project and click on Next.

creating maven project in eclipse

Step 3: Type the following artifact information and click on Finish.

Group Id: doc-examples
Artifact Id: lambda-java-example
Version: 0.0.1-SNAPSHOT
Packaging: jar
Name: lambda-java-example

creating maven project in eclipse

The project will be created and the Package Explorer will be as below.

maven project structure in eclipse

Step 4: Replace the pom.xml content with the below content. This Maven file has all the dependencies and the plugins used in this project.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>doc-examples</groupId>
  <artifactId>lambda-java-example</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>lambda-java-example</name>
  <dependencies>
   <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.1.0</version>
   </dependency>
   <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>1.3.0</version>
   </dependency>
  </dependencies>
  <build>
   <plugins>
    <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-shade-plugin</artifactId>
     <version>2.3</version>
    </plugin>
   </plugins>
  </build>
</project> 

Step 5: Create the example package and then add the S3EventProcessorCreateThumbnail Java code from here into the S3EventProcessorCreateThumbnail.java file.
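The linked handler downloads the uploaded image from the source bucket, scales it down and writes the result to the target bucket. The core scaling step can be sketched with plain Java 2D as below; this is a simplified stand-in to show the idea (the 100 px maximum dimension and the interpolation choice are assumptions), not the exact code from the AWS sample:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class ThumbnailSketch {

    // Scale the image so that its larger side equals maxDim,
    // preserving the aspect ratio.
    public static BufferedImage resize(BufferedImage src, int maxDim) {
        double scale = Math.min((double) maxDim / src.getWidth(),
                                (double) maxDim / src.getHeight());
        int w = Math.max(1, (int) (src.getWidth() * scale));
        int h = Math.max(1, (int) (src.getHeight() * scale));

        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        // Bilinear interpolation gives reasonable quality when downscaling.
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                           RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, w, h, null);
        g.dispose();
        return dst;
    }
}
```

In the actual sample, this scaling sits between an S3 GetObject call for the source image and a PutObject call that writes the thumbnail to the target bucket.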

maven project structure in eclipse

Step 6: Now, it's time to build and package the project. Right click on the project in the Package Explorer view and then go to 'Run As -> Maven build ....'. Enter 'package' in the Goals field as shown below and then click Run.

packaging the code using maven

Once the Maven build is complete, BUILD SUCCESS should be shown in the console at the bottom. The jar should also appear in the target folder after refreshing the project.

jar file in eclipse

Step 7: Build the project again using the above step, with the Goal as 'package shade:shade'. Make sure to change the name of the Maven run configuration to something other than the previous name as shown below.

packaging the code using maven

Once the Maven build is complete, BUILD SUCCESS should again be shown in the console at the bottom, and the standalone jar should appear in the target folder after refreshing the project.

jar file in eclipse
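As an aside, the two separate builds can be avoided: the shade goal can be bound to the package phase in pom.xml, so a plain 'package' build also produces the standalone jar. A possible plugin configuration (replacing the maven-shade-plugin entry shown earlier) would be:

```xml
<plugin>
 <groupId>org.apache.maven.plugins</groupId>
 <artifactId>maven-shade-plugin</artifactId>
 <version>2.3</version>
 <executions>
  <execution>
   <phase>package</phase>
   <goals>
    <goal>shade</goal>
   </goals>
  </execution>
 </executions>
</plugin>
```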

Step 8: An IAM role has to be created and attached to the Lambda function so that it can access the appropriate AWS resources. Go to the IAM AWS Management Console. Click on Roles and then 'Create new role'.

creating an iam role

Step 9: Select the AWS Lambda role type.

creating an iam role

Step 10: Filter for the AWSLambdaExecute policy and select it.

creating an iam role

Step 11: Give the role a name and click 'Create role'.

creating an iam role

The role will be created as shown below. The same role will be attached to the Lambda function later.

creating an iam role

Step 12: Go to the Lambda AWS Management Console, click on 'Create a function' and then select 'Author from scratch'.

creating a lambda function

creating a lambda function

Step 13: A trigger can be added to the S3 bucket later. Click on Next.

creating a lambda function

Step 14: Specify the following details for the Lambda function and click on Next.
  • Function name as 'CreateThumbnail'
  • Runtime as 'Java 8'
  • Upload the lambda-java-example-0.0.1-SNAPSHOT.jar file from the target Eclipse folder
  • Handler name as example.S3EventProcessorCreateThumbnail::handleRequest
  • Choose the role which has been created in the IAM

Step 15: Verify the details and click on 'Create function'.

creating a lambda function

Within a few seconds the success screen should be shown as below.

creating a lambda function

Clicking on the Functions link on the left will show the list of all the functions uploaded to Lambda from this account.

creating a lambda function

Now that the Lambda function has been created, it's time to create buckets in S3 and link the source bucket to the Lambda function.

Step 16: Go to the S3 AWS Management Console and create the source and target buckets. The name of the target bucket should be the source bucket name with the word 'resized' appended; the logic for this has been hard-coded in the Java code. There is no need to create the airline-dataset bucket; it was already present in S3.
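Based on the walkthrough's convention, the hard-coded naming logic amounts to something like the following (the "resized-" key prefix used for the object name is an assumption here, following the AWS sample):

```java
public class BucketNaming {
    // Destination bucket = source bucket name + "resized",
    // e.g. "myphotos" -> "myphotosresized".
    public static String targetBucket(String srcBucket) {
        return srcBucket + "resized";
    }

    // Destination object key = "resized-" + source key (assumption,
    // following the AWS sample code).
    public static String targetKey(String srcKey) {
        return "resized-" + srcKey;
    }
}
```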

creating buckets in s3

Step 17: Click on the S3 source bucket and then on Properties to associate it with the Lambda function created earlier, as shown below.



s3 attaching the lambda function

s3 attaching the lambda function

Step 18: Upload the image to the source bucket.

image in the source bucket

If everything goes well, the thumbnail image should appear in the target bucket within a few seconds. Note that the resized thumbnail is smaller than the original image.

image in the target bucket

Step 19: The log files for the Java Lambda function can be found in the CloudWatch AWS Management Console as shown below. If for some reason the resized image does not appear in the target S3 bucket, the cause can be found in the CloudWatch logs.

cloudwatch logs

cloudwatch logs

cloudwatch logs

Step 20: A few metrics, such as the invocation count and duration, can also be obtained from the AWS Lambda Management Console.

cloudwatch metrics

 

Conclusion

A few of these steps can be automated using the AWS Toolkit for Eclipse and the serverless.com framework. But they hide most of the details, so it's better to follow the above sequence of steps to know what happens behind the scenes in the AWS Lambda service. In future blogs, we will explore how to do the same with the AWS Toolkit for Eclipse and the serverless.com framework.

The AWS Lambda function can be fronted by AWS API Gateway, which provides a REST-based interface to create, publish, maintain, monitor, and secure APIs at any scale. This is again a topic for a future blog post.

Serverless architecture would be the future, as it removes the burden of infrastructure from the developer and moves it to a cloud vendor like AWS. DynamoDB also falls under the same category: while creating a DynamoDB table, we simply specify the Read and Write Capacity Units, and Amazon automatically provisions the appropriate resources, as in the case of Lambda.

This is one of the lengthiest blogs I have written, and I really enjoyed it. I plan to write more such detailed blogs in the future.
