
AWS Lambda 

AWS Lambda is a serverless, event-driven compute service that lets you run code for virtually any type of application or backend service without provisioning or managing servers. You can trigger Lambda from over 200 AWS services and software-as-a-service (SaaS) applications, and you only pay for what you use. Code is executed in response to events in AWS services, such as adding or removing files in an S3 bucket, updating Amazon DynamoDB tables, or an HTTP request from Amazon API Gateway.

Use cases

  • Process data at scale: Execute code at the capacity you need. Scale to match your data volume automatically and enable custom event triggers.
  • Run interactive web and mobile backends.
  • Create event-driven applications.


Serverless Framework

The Serverless Framework helps you develop and deploy your AWS Lambda functions, along with the AWS infrastructure resources they require. It’s a CLI that offers structure, automation and best practices out-of-the-box, allowing you to focus on building sophisticated, event-driven, serverless architectures, comprised of Functions and Events.

  • It manages your code as well as your infrastructure
  • It supports multiple languages (Node.js, Python, Java, and more)


Let's now jump into some coding. All the examples in this article use Python, and the source code is available in this GitHub repository if you want to play around with the code.

Install Docker on your machine. You can follow the steps mentioned in this official Docker installation link.

You also need Node.js installed on your machine, since the Serverless Framework is written in Node.js. To install Node.js, follow this Node.js installation guide by DigitalOcean.

Set Up the Serverless Framework

Open this link and sign up for a Serverless account, which is free for basic development.

Open a terminal and run the following command to install Serverless:

npm install -g serverless

Once the installation is complete, you can verify that Serverless is installed by running the following command:

serverless --version

Setting Up AWS for the Serverless Framework

To run serverless commands that interface with your AWS account, you need to set up AWS credentials for Serverless. In this article we will use an IAM user's access key and secret for the Serverless provider. There are also other ways to give Serverless access to your cloud provider account so it can create and manage resources on your behalf.

Once we have the access key and secret for the IAM user, log in to the Serverless dashboard.

  1. Navigate to org from the left-side menu.
  2. Click the providers tab, then click the Add button. This will open a pop-up window to add a provider.
  3. Click Access/Secret Keys inside the provider pop-up window.
  4. Choose a name, paste the AWS access key and AWS secret key into the fields, and click the Create AWS Provider button at the bottom of the Add provider pop-up window.

Our AWS provider setup is now done, so let's start writing the Lambda function. In this article we will write AWS Lambda functions in Python (you can choose any of the supported languages: Java, Go, PowerShell, Node.js, C#, Python, Ruby).

Create a project folder (e.g. AWS_serverless) and inside it create a folder for the Lambda function (e.g. S3_trigger_function). Since we are using Python, we need to set up a virtual environment; I prefer venv for Python environment management, which creates an isolated Python environment for our code.
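For example, using the folder names from this article, the setup could look like this:

mkdir -p AWS_serverless/S3_trigger_function
cd AWS_serverless/S3_trigger_function
python3 -m venv venv
source venv/bin/activate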

We will now start writing our Python function. We create a folder inside S3_trigger_function named lambda_function and create a Python file named lambda_handler.py. This file will contain a handler function (lambda_handler) which takes event and context as its two arguments. This handler function is where execution starts.

  • Event: This parameter has all the details of the trigger that invoked the function.
  • Context: This parameter has runtime details for the Lambda function's execution, such as the time left before AWS Lambda terminates the function (i.e., the timeout specified while creating the Lambda function), the name of the Lambda function, the CloudWatch log group name, the function ARN, and so on.

So the basic handler function definition looks like the following:
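A minimal sketch of that definition (the article's full code is in the linked repository):

def lambda_handler(event, context):
    # 'event' holds the trigger details, 'context' holds runtime information
    print(event)
    return {"statusCode": 200}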

We are going to write a Lambda function which triggers when an image file is uploaded to an S3 bucket and uploads the formatted/resized image file to a destination bucket.

For an S3 bucket event, the event object looks like the following:
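Here is an abbreviated example of that payload for an object upload (most fields omitted; the bucket and key names are illustrative):

{
  "Records": [
    {
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": {
          "name": "unmodified-images-bucket",
          "arn": "arn:aws:s3:::unmodified-images-bucket"
        },
        "object": {
          "key": "my-image.jpg",
          "size": 1024
        }
      }
    }
  ]
}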

We create a lambda_handler.py file inside the lambda_function folder and write the following code, which gets triggered when a file is uploaded to the source S3 bucket (unmodified-images-bucket), resizes it using the Pillow package, and uploads the resized image to the destination S3 bucket (modified-images-bucket) using the boto3 package.
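A sketch of what that handler could look like (the exact code is in the repository; the config import, target size, and key handling are assumptions):

import io

import boto3
from PIL import Image

from config import DESTINATION_BUCKET, TARGET_SIZE

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Download the original image from the source bucket
        obj = s3.get_object(Bucket=source_bucket, Key=key)
        image = Image.open(io.BytesIO(obj["Body"].read()))

        # Resize in place with Pillow, preserving the aspect ratio
        image.thumbnail(TARGET_SIZE)
        buffer = io.BytesIO()
        image.save(buffer, format=image.format or "JPEG")
        buffer.seek(0)

        # Upload the resized image to the destination bucket under the same key
        s3.put_object(Bucket=DESTINATION_BUCKET, Key=key, Body=buffer)

    return {"statusCode": 200}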

We also create a config.py file, whose contents look like the following.
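A minimal sketch of that config module (the values are placeholders following the bucket names mentioned above):

# config.py - configuration used by the handler above (placeholder values)
DESTINATION_BUCKET = "modified-images-bucket"
TARGET_SIZE = (512, 512)  # maximum width/height of the resized image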

The following is the requirements.txt file for the code above, placed inside the lambda_function folder.
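Based on the packages used above, it contains at least the following (pinning exact versions is a good idea in practice):

boto3
Pillow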

Now let's move on to deploying this Lambda function using the Serverless Framework.

The Serverless Framework deploys the code and infrastructure by picking up configuration settings from the serverless.yml file.

Along with deploying our infrastructure, we will also dive into two important plugins:

  • serverless-secrets-plugin: In a real-world deployment we don't want to expose our configuration secrets or check them into a version control system. We can create different secrets files for different environments (secrets.live.yml, secrets.stage.yml, secrets.test.yml) and encrypt them using the following command.

serverless encrypt --stage live --password '{your super secure password}'

This will result in an encrypted file, e.g. secrets.live.yml.encrypted. You can check the encrypted file into your version control system, e.g. Git. It's recommended to add your unencrypted file to .gitignore or similar so you and your colleagues can't check it in by accident.

  • serverless-python-requirements:
    • Install this plugin by running sls plugin install -n serverless-python-requirements
    • That's all that's needed for basic use! The plugin will bundle the Python dependencies specified in your requirements.txt or Pipfile when you run sls deploy.

We go back to our Serverless dashboard and click the apps button on the left-side menu. Then we click the create app button at the top right to create an app. This will open a page to select the template for our app. We select Lambda from the list and name the app aws_serverless.

After clicking the deploy button, we select the AWS provider which we configured in the beginning (ajay_provider_admin).

Going back to the apps page, we can see that our app has been created successfully.

Now let's create a serverless.yml file in the S3_trigger_function folder with the following contents, and then try to understand what this file does. We also create a secrets.live.yml file inside S3_trigger_function, where we put all the configuration and secrets used by the Lambda function.
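The complete file is in the article's repository; reconstructed from the walkthrough below, a sketch could look roughly like this (the org name, service name, runtime version, region, memory/timeout values, deployment bucket name, and function name are placeholders):

org: your-org                    # picked from the Serverless dashboard
app: aws_serverless              # the app created in the dashboard above
service: s3-trigger-function

provider:
  name: aws
  runtime: python3.8             # use the Python version you target
  memorySize: 256
  timeout: 30
  stage: live
  region: us-east-1
  iamRoleStatements:
    - Effect: Allow
      Action:
        - s3:*                   # full S3 access, as described below
        - lambda:InvokeFunction  # in case this function triggers another Lambda
      Resource: "*"
  deploymentBucket:
    name: my-serverless-deployments            # placeholder bucket name
    maxPreviousDeploymentArtifacts: 5

package:
  individually: true

custom:
  pythonRequirements:
    dockerizePip: true           # or 'non-linux' to only use Docker off Linux
  secrets: ${file(secrets.${opt:stage, self:provider.stage}.yml)}

functions:
  resize_image:
    handler: lambda_handler.lambda_handler
    module: lambda_function
    environment:
      DESTINATION_BUCKET: ${self:custom.secrets.DESTINATION_BUCKET}
    events:
      - s3:
          bucket: unmodified-images-bucket
          event: s3:ObjectCreated:*
          rules:
            - suffix: .jpg       # optional prefix/suffix filters
      # a second s3 event with s3:ObjectRemoved:* can be added the same way

plugins:
  - serverless-python-requirements
  - serverless-secrets-plugin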

Let's now try to understand the serverless.yml file.

At the top of the serverless.yml file we define org, app and service. The org and app values come from the Serverless dashboard, and service is the name under which this function's service will be created.

The next section, provider, tells the Serverless Framework the name of the provider (aws in our case), the runtime for the Lambda function, the memory size allocated to the function, the timeout for function execution, and the stage and region we want to deploy this function to. iamRoleStatements defines the policy that will be attached to the function's execution role ARN; here we grant full access to S3 and permission to invoke Lambda functions (in case this Lambda function wants to trigger another Lambda function). deploymentBucket defines which bucket the Serverless Framework should use to store the zipped deployment package, along with how many previous deployment artifacts to keep, specified via maxPreviousDeploymentArtifacts.

In the package section we specify that each function should be packaged individually, in case our serverless.yml contains definitions for more than one function.

Next, in the custom section, we configure pythonRequirements. We set dockerizePip to true because compiling non-pure-Python modules, or fetching their manylinux wheels, is supported on non-Linux OSs via the use of Docker and the docker-lambda image. Besides booleans, the dockerizePip option also supports the special value 'non-linux', which makes it dockerize only on non-Linux environments. With this enabled, serverless-python-requirements builds your Python packages in Docker using a Lambda-like environment and then zips them up, ready to be uploaded with the rest of your code.

You can enable two kinds of caching with this plugin, both of which are currently enabled by default: first, a download cache that caches the downloads pip needs to compile the packages; and second, a "static cache" that caches pip's output after compiling everything for your requirements file. Since requirements.txt files rarely change, you will often see large speed improvements from the static cache. These caches are shared between all your projects.

Next we define the secrets file which will be used to set the environment variables for the lambda function.
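For this function, secrets.live.yml might be as small as the following (the value is a placeholder):

DESTINATION_BUCKET: modified-images-bucket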

Next we look at the functions section, which contains the path to the handler function and the module where this handler resides. We also define the environment variables that this function should pick up from the secrets file.

And finally, the events we want to attach this function to. We define an s3 event which takes bucket (the name of the bucket that should trigger this function) and event (the object-created and object-removed events, s3:ObjectCreated:* and s3:ObjectRemoved:*). We can also define a suffix and prefix on the file name for which this Lambda function should trigger.

This link contains all the supported events which we can attach to a Lambda function.

And the last section is the plugins section, where we list all the plugins required by our Serverless Framework deployment.

Now, to deploy this function, we change to the S3_trigger_function folder. Before deploying, run the following command to obtain a Serverless Framework access token for the Serverless CLI; it will open the Serverless Framework login page, where you can log in.

sls login

and run the following command

sls deploy

Now let's write another function, triggered through API Gateway as a REST API POST method. We will send a city name to API Gateway in the request body, and the function fetches the weather report for us and returns a JSON response.

We create another folder in our project directory called API_gateway_trigger, again create a lambda_function folder inside it, and similarly create a serverless.yml file inside API_gateway_trigger.

We create a lambda_handler.py file inside the lambda_function folder with the following contents.
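A sketch of such a handler (the real code is in the repository; the weather API endpoint, request fields, and the WEATHER_API_KEY environment variable are assumptions):

import json
import os
import urllib.parse
import urllib.request

from config import WEATHER_API_URL  # endpoint defined in config.py below

def lambda_handler(event, context):
    # API Gateway (Lambda proxy integration) delivers the POST body as a JSON string
    body = json.loads(event.get("body") or "{}")
    city = body.get("city")
    if not city:
        return {"statusCode": 400, "body": json.dumps({"error": "city is required"})}

    # Query the weather API; WEATHER_API_KEY is set via the secrets file
    query = urllib.parse.urlencode({"q": city, "appid": os.environ["WEATHER_API_KEY"]})
    with urllib.request.urlopen(f"{WEATHER_API_URL}?{query}") as response:
        report = json.loads(response.read())

    return {"statusCode": 200, "body": json.dumps(report)}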

And similarly we create a config.py file inside the lambda_function folder with the following contents.
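A minimal sketch of that file (the endpoint is an assumption to match the handler above):

# config.py - static configuration for the weather function (placeholder endpoint)
WEATHER_API_URL = "https://api.openweathermap.org/data/2.5/weather"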

Next we look at the contents of the serverless.yml file, where we will learn how to attach this Lambda function to API Gateway as an HTTP POST REST API.

Compared to the previous serverless.yml file, there are two major differences to observe here.

First, we define an API key to secure this API and pick the API key value from the secrets file, as defined below.
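A sketch of that part of the provider section (the key and secret names are placeholders):

provider:
  name: aws
  # ...other provider settings as in the previous serverless.yml
  apiKeys:
    - name: weather-report-key                       # placeholder key name
      value: ${self:custom.secrets.API_GATEWAY_KEY}  # value picked up from the secrets file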

Second, we define the event for this Lambda function using http and provide the path (weather-report), method (post), and private (true), as defined below.
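A sketch of the corresponding functions section (the function name is a placeholder):

functions:
  weather_report:
    handler: lambda_handler.lambda_handler
    module: lambda_function
    events:
      - http:
          path: weather-report
          method: post
          private: true   # requires a valid API key (x-api-key header)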

Now, to deploy this function, we change to the API_gateway_trigger folder and run the following command.

sls deploy

Now let's write another function, triggered when a message is published to an SNS topic. We will publish tractor sensor data to the SNS topic, and the function will save the data in an RDS database.

We create another folder in our project directory called SNS_trigger_function, again create a lambda_function folder inside it, and similarly create a serverless.yml file inside SNS_trigger_function.

We create a lambda_handler.py file inside the lambda_function folder with the following contents.
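A sketch of such a handler (the real code is in the repository; the database engine, table layout, and environment variable names are assumptions, with pymysql standing in for whatever RDS driver the project actually uses):

import json
import os

import pymysql  # assumption: a MySQL-compatible RDS instance; add pymysql to requirements.txt

from config import SENSOR_TABLE

# Create the connection outside the handler so warm invocations can reuse it
connection = pymysql.connect(
    host=os.environ["DB_HOST"],
    user=os.environ["DB_USER"],
    password=os.environ["DB_PASSWORD"],
    database=os.environ["DB_NAME"],
)

def lambda_handler(event, context):
    # Each SNS record carries the published message as a JSON string
    for record in event["Records"]:
        data = json.loads(record["Sns"]["Message"])
        with connection.cursor() as cursor:
            cursor.execute(
                f"INSERT INTO {SENSOR_TABLE} (tractor_id, payload) VALUES (%s, %s)",
                (data.get("tractor_id"), json.dumps(data)),
            )
    connection.commit()
    return {"statusCode": 200}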

And similarly we create a config.py file inside the lambda_function folder with the following contents.
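A minimal sketch of that file (the table name is a placeholder matching the handler above):

# config.py - static settings for the SNS handler (placeholder values)
SENSOR_TABLE = "tractor_sensor_data"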

Next we look at the contents of the serverless.yml file, where we will learn how to attach this Lambda function so it listens to messages published to the SNS topic.

We define the event for this Lambda function using sns and provide the ARN of the SNS topic, which is defined in our secrets.live.yml.
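A sketch of the corresponding functions section (the function name and secret key are placeholders):

functions:
  save_sensor_data:
    handler: lambda_handler.lambda_handler
    module: lambda_function
    events:
      - sns:
          arn: ${self:custom.secrets.SNS_TOPIC_ARN}   # ARN of the existing topic, kept in secrets.live.yml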

Now, to deploy this function, we change to the SNS_trigger_function folder and run the following command.

sls deploy

Advantages of AWS Lambda functions

Saves Time and Effort: The foremost benefit of going serverless with AWS Lambda is the time and effort saved by not having to create and maintain the infrastructure of your application.

  • AWS saves you a lot of effort by provisioning and managing the infrastructure your Lambda functions run on. 
  • It automatically scales the instances to handle times of excessive load and implements a proper logging and error handling system. Anyone who has been involved in creating or maintaining infrastructure will understand the weight of this advantage.
  • Not only does it save you time while building the application, but it will also save you an ample amount of time required to maintain an already established system as your application evolves and scales up in the future. 
  • Time saved means quicker production and greater agility in launching your product in the market, giving you an edge over your competition.

Helps in building a scalable solution: The reason why AWS Lambda has become one of the most popular solutions in such a short span of time is its ability to accommodate applications at scale as well as applications in early stages. 

  • For applications having large amounts of load, AWS runs your Lambda function simultaneously with other Lambda functions. This means that you do not have to worry about clogged up queues. 
  • In addition to this, multiple instances of the same Lambda function can be provisioned and run at the same time. 

You Pay For What You Need: Another great advantage of using AWS Lambda is that you only pay for what you use. Lambda can accommodate applications with differing loads and charges you for the number of requests your Lambda functions receive and for their execution time, billed in 100 ms increments.

Disadvantages of AWS lambda functions

Cold Start: The serverless architecture executes functions in temporarily created containers. Consider a simple function such as a client registering their details. As soon as the client submits their information to the Lambda function, AWS will create a temporary container, deploy all the dependencies involved, and run the required code. Once the request is completed, the container is destroyed. If another request arrives after some time, the same process is followed.

The drawback lies in the time taken by Lambda to create this temporary container. This is usually between 100 milliseconds and 2 minutes and is known as a cold start. However, there is a workaround, which we recently implemented in our application: ping your Lambda every 5 minutes to make sure the container is not destroyed. This way, only the first request takes some time to process; all subsequent ones are processed much faster, without the cold-start delay.
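With the Serverless Framework, one simple way to implement such a keep-warm ping (a sketch; the function name is a placeholder) is a scheduled event:

functions:
  my_function:
    handler: lambda_handler.lambda_handler
    events:
      - schedule: rate(5 minutes)   # invokes the function every 5 minutes to keep a container warm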

Computational Restrictions: Functions that require heavy, long-running processing are not a good fit for AWS Lambda, which limits memory and execution time per invocation. However, there is a workaround here as well: you can create a queuing process to break your work up into a series of smaller functions and execute them simultaneously using AWS Lambda.

Vendor control and vendor lock-in: Another issue with AWS Lambda is that AWS decides which third-party applications and runtimes are available, which means you give up a good deal of control over your application. Another pain point is vendor lock-in: once your operations are built around a serverless application, it can be costly, time-consuming, and difficult to port them elsewhere.

Issues with working in a Virtual Private Cloud: Many clients use a Virtual Private Cloud (VPC) for an extra layer of security. Using Lambda with any such function can entail some additional delay over and above the cold start mentioned above. This too can be dealt with using a warm start.
