
Introduction

An AWS Lambda function's code consists of scripts or compiled programs and their dependencies. A deployment package is used to deploy the function code to Lambda, and Lambda supports two types of deployment packages: container images and .zip file archives. In this tutorial we use the updateLambdaFunctionCode operation to deploy the function code. The operation can deploy function code from AWS ECR, an S3 bucket, or a local archive directory. We can choose to publish a new version; by default the operation does not publish one. Using an environment variables file or an input argument, we can also set the function's environment variables, and the operation supports encrypting secured variables with an AWS KMS key. The operation uses the configured AWS cloud account to perform its work.
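At the AWS API level, these three code sources map to different parameters of the same update call. The boto3 sketch below is only an illustration of that mapping, using hypothetical function, bucket, and key names; it is not the plugin's implementation.

  import boto3

  lambda_client = boto3.client("lambda")

  # One of three mutually exclusive code sources can be supplied:
  #   ImageUri         - container image in Amazon ECR
  #   S3Bucket / S3Key - .zip archive stored in an S3 bucket
  #   ZipFile          - .zip archive read from a local directory
  response = lambda_client.update_function_code(
      FunctionName="my-sample-function",  # hypothetical function name
      S3Bucket="my-code-bucket",          # hypothetical bucket name
      S3Key="function-code.zip",          # hypothetical key name
      Publish=False,                      # a new version is not published by default
  )
  print(response["LastModified"])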

Objective

The goal of this tutorial is to perform a Blue/Green deployment in AWS Lambda. We will use function code stored in an S3 bucket and an environment file kept in a Git repository, and we will encrypt secured variables with an AWS KMS key. The AWS plugin provides the updateLambdaFunctionCode, getLambdaAlias, and upsertLambdaAlias operations, which together make Blue/Green deployment straightforward. In a Blue/Green deployment we run two versions of the application: a stable version and a version containing a new feature or bug fix, and we forward a percentage of production traffic to the second version to verify that everything works. The Blue environment represents the currently active version of the Lambda function, while the Green environment is the version where new changes are deployed and tested. Once the changes in the Green environment are verified, the Green deployment is promoted to Blue, enabling seamless, zero-downtime deployments. With Blue/Green deployment we can test the application with real users without replacing the production workload completely.

  • Configure the properties, e.g. Cloud account and CLI path.

  • Clone the environment file from the Git repository.

  • Create an alias to maintain the Blue/Green deployment (the alias maps to the stable version, i.e. Blue).

  • Deploy the function code with the environment variables and publish a new version (Green).

  • Update the alias to map the new version (Green), weighted at some X% of traffic (with the Blue version at (100-X)%).

  • Verify that the new version is healthy.

Detail of Blue/Green Deployment

Blue/Green deployment in AWS Lambda involves two services, API Gateway and AWS Lambda. We use API Gateway's Lambda integration together with a Lambda alias to implement the Blue/Green pattern; here the Lambda function has two different but otherwise identical environments, called Blue and Green.

These two Lambda versions are mapped to a single Lambda alias, which is a pointer to one version of the function (optionally with an additional weighted version). Lambda versions are revisions of our code; we can create new versions without disturbing the production workload.
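To make the weighted routing concrete, the hedged boto3 sketch below points the alias at the stable Blue version while sending 10% of invocations to the new Green version. The function name and version numbers are hypothetical; the plugin operations perform the equivalent calls for you.

  import boto3

  lambda_client = boto3.client("lambda")

  # Keep the alias on the Blue version, but route 10% of traffic to the Green version.
  lambda_client.update_alias(
      FunctionName="my-sample-function",  # hypothetical function name
      Name="Dev",                         # alias used in this tutorial
      FunctionVersion="1",                # Blue (stable) version
      RoutingConfig={
          "AdditionalVersionWeights": {"2": 0.10}  # Green version receives 10% of traffic
      },
  )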

API Gateway: Allows us to specify a Lambda alias as a target, so we can point the API at an alias that has the Blue/Green routing configuration, i.e. traffic is routed to two different environments through a single API.

Blue Deployment: The primary, stable deployment that currently serves production.

Green Deployment: A clone of the Blue deployment that contains the additional changes. We can route part of the traffic to the Green deployment so that any issues can be found and fixed before it is promoted to Blue, reducing the chance of failures in the production environment.

Advantages of Blue-Green Deployment:

  • Zero Downtime: Blue-Green Deployment eliminates downtime during the deployment process since the switch from the blue to the green environment is instantaneous. This ensures uninterrupted service availability for users.

  • Fast Rollback: In case any issues or failures occur during the deployment of the new version in the green environment, rolling back to the stable version in the blue environment is quick and straightforward.

  • Reliable Testing: Blue-Green Deployment allows comprehensive testing of the new version in an environment that mirrors the production setup. This gives a higher level of confidence in the stability and compatibility of the new version before user traffic is directed to it.

Checklist

AWS Access Key – AWS Access Key of the user.

AWS Secret Key – Password for the Access Key.

AWS Default Region – Default region to use, e.g. ap-south-1.

AWS CLI installation – AWS CLI needs to be installed where the plugin operation will run (the FlexDeploy server).

AWS CLI in class path – AWS CLI should be added to the class path on the FlexDeploy server; alternatively, the path can be set as a FlexDeploy environment-level property.

AWS Lambda Function – The AWS Lambda function should already exist.

AWS KMS Key – AWS KMS key used to secure the environment variables.

Configure Cloud account

To connect to the AWS Lambda function, we need to configure a Cloud account with credential details. Configure the AWS Cloud account under Integrations. FlexDeploy will use it to connect to the Lambda function and add the environment variables.

  1. Navigate to the Integrations

  2. Select Cloud from the left-hand pane

  3. Create a new Cloud account with the “+” button. Create a new Cloud account of provider type “AWS”

The Cloud account should have an AWS Access Key and an AWS Secret Key, and the user must have the relevant access to the AWS Lambda function.

  1. AWS Secret Key is a password field and hence needs to be kept hidden. To update it, click on the pencil icon as shown below.

  2. Update the AWS Secret Key value under Secret Text. This ensures that no one else can retrieve the password.

After configuration, the Cloud account can be selected from the drop-down list.

Create AWS Lambda Function

AWS Lambda is a compute service that lets you run code without provisioning or managing servers. Lambda runs your code on a high-availability compute infrastructure and performs all of the administration of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, and logging. With Lambda, all you need to do is supply your code in one of the language runtimes that Lambda supports. Please refer to the link for more information https://docs.aws.amazon.com/lambda/latest/dg/welcome.html

To create the Lambda Function go to the AWS console

  1. Navigate to the Services

  2. Select Compute from the left-hand pane

  3. Now click on the Lambda service option

After selecting the Lambda service, a new window opens listing the details of all existing functions.

Now select the Create function option; it opens a window to create the function and configure its details.

By default AWS creates an execution role with basic Lambda permissions, but we can also select an existing role. In the above example we are using the existing role (basic-lambda-role). Please refer to the link for more information https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html

The role we select must have basic Lambda permissions. The role selected here also has permission to use the KMS key to decrypt the secured variables; if we use a KMS key to encrypt the secured variables, the role must be granted permission to use that key.

In the above role we can see one permissions policy named kms-access. This policy allows the function to use the KMS key to decrypt the variables that were encrypted with it.
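Purely as an illustration (not the actual policy used in this tutorial), an inline policy that lets the role decrypt with one specific KMS key could be attached roughly as in the boto3 sketch below; the key ARN and account ID are placeholders.

  import json
  import boto3

  iam = boto3.client("iam")

  # Hypothetical inline policy: allow decryption with one specific KMS key only.
  kms_access_policy = {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Action": ["kms:Decrypt"],
              "Resource": "arn:aws:kms:ap-south-1:111122223333:key/EXAMPLE-KEY-ID",
          }
      ],
  }

  iam.put_role_policy(
      RoleName="basic-lambda-role",  # role selected for the function
      PolicyName="kms-access",       # policy name used in this tutorial
      PolicyDocument=json.dumps(kms_access_policy),
  )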

Policy detail:

Trust relationships detail: (entities that can assume this role under specified conditions)

Details of the AWS Lambda function that we have created and are going to use for this tutorial:

If we check the Code details of the function, we find only the sample code. We will update the code using our AWS plugin operation.

On testing the code using the Test option provided by AWS Lambda, we get this response:

If we check the Environment variables details under Configuration, no environment variables are present yet. After a successful execution of the operation we should see the environment variables added.

Create AWS KMS Key

AWS Key Management Service (AWS KMS) is a managed service that makes it easy for us to create and control the cryptographic keys that are used to protect our data. Please refer to the link for more information https://aws.amazon.com/kms/

An AWS KMS key is required to encrypt the secured variables before adding them to the Lambda function. If we don't have any secured variables, we don't need to configure the KMS key details in the project. In our scenario we are adding both secured and non-secured variables to the Lambda function.
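Conceptually, encrypting a secured value with the KMS key looks like the boto3 sketch below; the key ARN and the plaintext value are placeholders, and the plugin operation performs this encryption for you.

  import base64
  import boto3

  kms = boto3.client("kms")

  # Encrypt a secured variable value with the KMS key (Key ID or Key ARN both work).
  result = kms.encrypt(
      KeyId="arn:aws:kms:ap-south-1:111122223333:key/EXAMPLE-KEY-ID",  # placeholder
      Plaintext=b"my-secret-value",                                    # placeholder value
  )
  ciphertext_b64 = base64.b64encode(result["CiphertextBlob"]).decode("utf-8")
  print(ciphertext_b64)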

To create the KMS key, go to the AWS console

  1. Navigate to the Services

  2. Select Security, Identity, & Compliance from the left-hand pane

  3. Now click on the Key Management Service option

Details of the KMS key which we are using for this tutorial:

We can use either the Key ID or the Key ARN value in the project to encrypt the variables; both are accepted.

Create AWS Alias

To create or update the Lambda alias we can use the upsertLambdaAlias operation available in the AWS plugin; please refer to the tutorial document for more information.

Create AWS S3 bucket

Amazon Simple Storage Service (Amazon S3) is an object storage service offering industry-leading scalability, data availability, security, and performance. Customers of all sizes and industries can store and protect any amount of data for virtually any use case, such as data lakes, cloud-native applications, and mobile apps. With cost-effective storage classes and easy-to-use management features, you can optimize costs, organize data, and configure fine-tuned access controls to meet specific business, organizational, and compliance requirements. Please refer to the link for more information https://aws.amazon.com/s3/

To create the S3 bucket, go to the AWS console

  1. Navigate to the Services

  2. Select Storage from the left-hand pane

  3. Now click on the S3 service option

After selecting the S3 service, a new window opens listing the details of all S3 buckets.

Now select the Create bucket option; it opens a window to create the S3 bucket and configure its details.

We can also enable Bucket Versioning; by default it is disabled. Please refer to the link for more information https://docs.aws.amazon.com/AmazonS3/latest/userguide/Versioning.html

We have created the S3 bucket; we can now see its details and upload the AWS Lambda function code.

Once we upload the object, we can see its details.

Since we have enabled object versioning, we can also see the details of the different versions.
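For reference, uploading the function archive to the versioned bucket and reading back the new object version can be done with boto3 roughly as follows; the bucket and key names are hypothetical.

  import boto3

  s3 = boto3.client("s3")

  # Upload the packaged function code; with bucket versioning enabled,
  # S3 returns a VersionId for each upload.
  with open("function-code.zip", "rb") as archive:
      response = s3.put_object(
          Bucket="my-code-bucket",   # hypothetical bucket name
          Key="function-code.zip",   # hypothetical key name
          Body=archive,
      )
  print("Object version:", response.get("VersionId"))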

Git repository structure

The Git repository contains the Environment file. The Sample Git repository structure is given below.

Pre-requisite

Configure IAM user

To access the Lambda function we need to create an AWS IAM user with the required permissions. To create the AWS IAM user, navigate to the AWS Identity and Access Management (IAM) service page and click on the Add users option. Next, assign the required permissions to access the Lambda function. Once the user is created, an AWS secret key can be generated; this is the key we configure in the Cloud account.

For more information about IAM users please refer to IAM users - AWS Identity and Access Management

CLI installation

  • AWS CLI should be installed on the machine where the plugin is to be executed. Preferably, add the AWS CLI path to the machine's classpath.

Build and Deploy Workflows

Navigate to the Workflows tab and create a workflow using the “+”(Click to create new Workflow) button as highlighted below.

Next, create one Build and Deploy workflow as shown below. The workflow Type field defines the type of workflow.
Build Workflow

  1. Navigate to the Workflows

  2. Select the “+” button from the left-hand pane to create a new workflow

Deploy Workflow

  1. Navigate to the Workflows

  2. Select the “+” button from the left-hand pane to create a new workflow

The Workflow Group and Subgroup define the folder hierarchy. Once both workflows are created, the setup should look like the screenshot below. There is no constraint on workflow or folder naming conventions.

The steps of the workflow execution can be configured through the Workflow Definition section.

Given below is a sample build workflow to copy the file from the Git repository.

Step-i: Clone Git Repository
This step will clone the Git repository codebase into the project execution working directory. The Git URL will be retrieved from Source Control configured under Project Configuration.

Step-ii: Copy the environment file
The step below will copy the environment file to the artifacts. Also check the Produces Artifact option to save the files as an artifact so that they can be used from the Deploy workflow.

Given below is a sample deploy workflow to deploy the Lambda function code from the AWS S3 bucket and update an already existing alias to point to the newly published version.

Step-i: updateLambdaFunctionCode

This step will deploy the Lambda function code and also publish the function version. We are setting a function version variable, which we will use in the upsertLambdaAlias operation.

The above configuration uses the following inputs.

Additional Arguments (FDAWS_LAMBDA_INP_ADD_ENV_VAR_ADDITIONAL_ARG) – String, optional. Literal key and value pairs, e.g. --region=us-east-1. For boolean-type arguments, give the option without any value, e.g. --publish --debug.

Environment Variables (FDAWS_LAMBDA_INP_ENV_VAR) – String, optional. Environment variables in an acceptable format.

Publish new version (FDAWS_LAMBDA_INP_PUBLISH_VERSION) – Boolean, optional. Select to publish a new version. Default value is false.
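Under the hood, this step amounts to something like the boto3 sketch below (names are hypothetical and the plugin performs the actual call): the code is updated from S3, a new version is published, and the returned version number is what we carry into the upsertLambdaAlias step.

  import boto3

  lambda_client = boto3.client("lambda")

  # Deploy code from S3 and publish a new (Green) version.
  response = lambda_client.update_function_code(
      FunctionName="my-sample-function",     # hypothetical function name
      S3Bucket="my-code-bucket",             # hypothetical bucket name
      S3Key="function-code.zip",             # hypothetical key name
      S3ObjectVersion="EXAMPLE-VERSION-ID",  # optional, used when bucket versioning is enabled
      Publish=True,
  )
  new_version = response["Version"]          # value saved into the function version variable
  print("New Green version:", new_version)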

Step-ii: upsertLambdaAlias

This step will update the given alias (Dev) with the newly published version.

The above configuration uses the following inputs.

Alias Name (FDAWS_LAMBDA_INP_ALIAS_NAME) – String, required. AWS Lambda alias name.

Alias Description (FDAWS_LAMBDA_INP_ALIAS_DESCR) – String, optional. Description of the alias.

Alias Additional Argument (FDAWS_LAMBDA_INP_ALIAS_ADDITIONAL_ARG) – String, optional. Literal key and value pairs, e.g. --region=us-east-1. For boolean-type arguments, give the option without any value, e.g. --publish --debug.

Alias Function Version (FDAWS_LAMBDA_INP_FUNCTION_VERSION) – String, required. Function version associated with the alias.
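The "upsert" behaviour can be pictured as: look the alias up, create it if it does not exist, otherwise update it to the new version. A hedged boto3 sketch of that logic follows; the function name and version are hypothetical.

  import boto3

  lambda_client = boto3.client("lambda")

  function_name = "my-sample-function"  # hypothetical function name
  alias_name = "Dev"                    # alias used in this tutorial
  new_version = "2"                     # version published by the previous step

  try:
      lambda_client.get_alias(FunctionName=function_name, Name=alias_name)
      # Alias exists: repoint it at the newly published version.
      lambda_client.update_alias(
          FunctionName=function_name, Name=alias_name, FunctionVersion=new_version
      )
  except lambda_client.exceptions.ResourceNotFoundException:
      # Alias does not exist yet: create it.
      lambda_client.create_alias(
          FunctionName=function_name, Name=alias_name, FunctionVersion=new_version
      )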

Project Configuration

Navigate to the Project tab and create a project with a logical name (AWS-Deploy-Lambda-Function-Using-S3).

Configure the Build and Deploy workflow that has been created in previous steps as shown below.

Source Control

Configure the Source SCM repository under Source Control as shown below.

  1. To configure project-specific Source Control, first navigate to the Project Configuration tab.

  2. Next, expand the SOURCE CONTROL option from the left-hand pane.

  3. Select the appropriate Source Control Type

  4. Configure the Source Repository. For detailed steps of Source Control configuration, please refer to Configure Source Control in FlexDeploy.

Project Properties


AWS Lambda Function Code Deploy Using Local Archive

Introduction

An AWS Lambda function's code consists of scripts or compiled programs and their dependencies. A deployment package is used to deploy the function code to Lambda, and Lambda supports two types of deployment packages: container images and .zip file archives. In this tutorial we use the updateLambdaFunctionCode operation to deploy the function code. The operation can deploy function code from AWS ECR, an S3 bucket, or a local archive directory. We can choose to publish a new version; by default the operation does not publish one. Using an environment variables file or an input argument, we can also set the function's environment variables, and the operation supports encrypting secured variables with an AWS KMS key. The operation uses the configured AWS cloud account to perform its work.

Objective

The goal of this tutorial is to deploy the function code using an archive file from the artifact directory and publish the function version. We will use the function code and environment file present in the Git repository. We are also going to add secured variables, and to encrypt these variables we will use an AWS KMS key.

  • Configure the properties, e.g. Cloud account and CLI path.

  • Clone the function code and create the archive file.

  • Clone the environment file from the Git repository.

  • Deploy the function code and add the environment variables to the Lambda function.

  • Verify the function code.

Checklist

AWS Access Key – AWS Access Key of the user.

AWS Secret Key – Password for the Access Key.

AWS Default Region – Default region to use, e.g. ap-south-1.

AWS CLI installation – AWS CLI needs to be installed where the plugin operation will run (the FlexDeploy server).

AWS CLI in class path – AWS CLI should be added to the class path on the FlexDeploy server; alternatively, the path can be set as a FlexDeploy environment-level property.

AWS Lambda Function – The AWS Lambda function should already exist.

AWS KMS Key – AWS KMS key used to secure the environment variables.

Configure Cloud Account

To connect to the AWS Lambda function, we need to configure a Cloud account with credential details. Configure the AWS Cloud account under Integrations. FlexDeploy will use it to connect to the Lambda function and add the environment variables.

  1. Navigate to the Integrations

  2. Select Cloud from the left-hand pane

  3. Create a new Cloud account with the “+” button. Create a new Cloud account of provider type “AWS”

The Cloud account should have an AWS Access Key and an AWS Secret Key, and the user must have the relevant access to the AWS Lambda function.

  1. AWS Secret Key is a password field and hence needs to be kept hidden. To update it, click on the pencil icon as shown below.

  2. Update the AWS Secret Key value under Secret Text. This ensures that no one else can retrieve the password.

After configuration, the Cloud account can be selected from the drop-down list.

Create AWS Lambda Function

AWS Lambda is a compute service that lets you run code without provisioning or managing servers. Lambda runs your code on a high-availability compute infrastructure and performs all of the administration of the compute resources, including server and operating system maintenance, capacity provisioning and automatic scaling, and logging. With Lambda, all you need to do is supply your code in one of the language runtimes that Lambda supports. Please refer to the link for more information What is AWS Lambda? - AWS Lambda

To create the Lambda Function go to the AWS console

  1. Navigate to the Services

  2. Select Compute from the left-hand pane

  3. Now click on the Lambda service option

After selecting the Lambda service, a new window opens listing the details of all existing functions.

Now select the Create function option; it opens a window to create the function and configure its details.

By default AWS creates an execution role with basic Lambda permissions, but we can also select an existing role. In the above example we are using the existing role (basic-lambda-role). Please refer to the link for more information IAM roles - AWS Identity and Access Management

The role we select must have basic Lambda permissions. The role selected here also has permission to use the KMS key to decrypt the secured variables; if we use a KMS key to encrypt the secured variables, the role must be granted permission to use that key.

In the above role we can see one permissions policy named kms-access. This policy allows the function to use the KMS key to decrypt the variables that were encrypted with it.

Policy detail:

Trust relationships detail: (entities that can assume this role under specified conditions)

Details of the AWS Lambda function that we have created and are going to use for this tutorial:

If we check the Code details of the function, we find only the sample code. We will update the code using our AWS plugin operation.

On testing the code using the Test option provided by AWS Lambda, we get this response:

If we check the Environment variables details under Configuration, no environment variables are present yet. After a successful execution of the operation we should see the environment variables added.

Create AWS KMS Key

AWS Key Management Service (AWS KMS) is a managed service that makes it easy for us to create and control the cryptographic keys that are used to protect our data. Please refer to the link for more information Encryption Cryptography Signing - AWS Key Management Service - AWS

An AWS KMS key is required to encrypt the secured variables before adding them to the Lambda function. If we don't have any secured variables, we don't need to configure the KMS key details in the project. In our scenario we are adding both secured and non-secured variables to the Lambda function.

To create the KMS key, go to the AWS console

  1. Navigate to the Services

  2. Select Security, Identity, & Compliance from the left-hand pane

  3. Now click on the Key Management Service option

Details of the KMS key which we are using for this tutorial:

We can use either the Key ID or the Key ARN value in the project to encrypt the variables; both are accepted.

Git Repository Structure

The Git repository contains the Environment file.

The Sample Git repository structure is given below.

Environment Variable File Structure

This is an example of the environment file with a JSON structure; please refer to the document for more details about the acceptable structure of environment variables.
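The exact accepted file structure is described in the referenced document. As a generic illustration of where the values end up, environment variables are ultimately applied to the function configuration, which in boto3 terms looks roughly like this (variable names and values are made up):

  import boto3

  lambda_client = boto3.client("lambda")

  # Made-up variables; secured values would first be encrypted with the KMS key.
  env_variables = {
      "APP_ENV": "dev",
      "DB_PASSWORD": "<encrypted-with-kms>",
  }

  lambda_client.update_function_configuration(
      FunctionName="my-sample-function",  # hypothetical function name
      Environment={"Variables": env_variables},
  )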

Pre-requisite

Configure IAM user

To access the Lambda function we need to create an AWS IAM user with the required permissions. To create the AWS IAM user, navigate to the AWS Identity and Access Management (IAM) service page and click on the Add users option. Next, assign the required permissions to access the Lambda function. Once the user is created, an AWS secret key can be generated; this is the key we configure in the Cloud account.

For more information about IAM users please refer to IAM users - AWS Identity and Access Management

CLI Installation

  • AWS CLI should be installed on the machine where the plugin is to be executed. Preferably, add the AWS CLI path to the machine's classpath.

Build and Deploy Workflows

Navigate to the Workflows tab and create a workflow using the “+”(Click to create new Workflow) button as highlighted below.

Next, create one Build and Deploy workflow as shown below. The workflow Type field defines the type of workflow.
Build Workflow

  1. Navigate to the Workflows

  2. Select the “+” button from the left-hand pane to create a new workflow

Deploy Workflow

  1. Navigate to the Workflows

  2. Select the “+” button from the left-hand pane to create a new workflow

The Workflow Group and Subgroup define the folder hierarchy. Once both workflows are created, the setup should look like the screenshot below. There is no constraint on workflow or folder naming conventions.

The steps of the workflow execution can be configured through the Workflow Definition section.

Given below is a sample build workflow to copy the file from the Git repository.

Step-i: Clone Git Repository
This step will clone the Git repository codebase into the project execution working directory. The Git URL will be retrieved from Source Control configured under Project Configuration.

 

Step-ii: Create Function archive and save as Artifact
The step below will create the function archive. Also check the Produces Artifact option to save the files as an artifact so that they can be used from the Deploy workflow.
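To picture what this step produces, creating the archive is roughly equivalent to zipping the cloned code directory, e.g. with Python's standard library; the directory and archive names below are hypothetical.

  import shutil

  # Zip the cloned function code; the resulting function-code.zip is saved as the artifact.
  shutil.make_archive(
      base_name="function-code",  # produces function-code.zip
      format="zip",
      root_dir="lambda-src",      # hypothetical directory containing the cloned code
  )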

Step-iii: Copy the environment file
The step below will copy the environment file so that it can be used from the Deploy workflow.

Given below is a sample deploy workflow to deploy the Lambda function code.

Step-i: updateLambdaFunctionCode

This step will deploy the Lambda function code and also publish the function version.

The above configuration uses the following inputs.

Additional Arguments (FDAWS_LAMBDA_INP_ADD_ENV_VAR_ADDITIONAL_ARG) – String, optional. Literal key and value pairs, e.g. --region=us-east-1. For boolean-type arguments, give the option without any value, e.g. --publish --debug.

Environment Variables (FDAWS_LAMBDA_INP_ENV_VAR) – String, optional. Environment variables in an acceptable format.

Publish new version (FDAWS_LAMBDA_INP_PUBLISH_VERSION) – Boolean, optional. Select to publish a new version. Default value is false.
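In AWS terms this step corresponds roughly to the boto3 call below, which uploads the archive bytes directly and publishes a version; the file and function names are hypothetical, and the plugin performs the actual call.

  import boto3

  lambda_client = boto3.client("lambda")

  # Read the archive produced by the build workflow and deploy it directly.
  with open("function-code.zip", "rb") as archive:
      response = lambda_client.update_function_code(
          FunctionName="my-sample-function",  # hypothetical function name
          ZipFile=archive.read(),
          Publish=True,
      )
  print("Published version:", response["Version"])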

Project Configuration

Navigate to the Project tab and create a project with a logical name (AWS-Deploy-Lambda-Function-Code).

Configure the Build and Deploy workflow that has been created in previous steps as shown below.

Source Control

Configure the Source SCM repository under Source Control as shown below.

  1. To configure project-specific Source Control, first navigate to the Project Configuration tab.

  2. Next, expand the SOURCE CONTROL option from the left-hand pane.

  3. Select the appropriate Source Control Type

  4. Configure the Source Repository. For detailed steps of Source Control configuration, please refer to Configure Source Control in FlexDeploy.

Project Properties

Lambda Function Name: Name of the Lambda function to deploy the code to. If the Lambda function name is not given, the S3 key name will be used as the function name.

Environment Variable File Path: Path of the file which contains the list of environment variables.

Please refer to the document for more details about the Lambda function name and the environment variable file path: AWS Lambda - Environment Variable File and zip File location options

KMS Detail: Key ID or Key ARN; both are accepted. Please refer to the document for more details: AWS Key Management Service - AWS Key Management Service

S3 Bucket Name: Name of the S3 bucket where the Lambda function code is stored.

S3 Key Name: Name of the S3 key.

To deploy the code from an S3 bucket, both the S3 bucket name and the S3 key are required.

S3 Object Version: Value of the object version; we can have multiple variants of an object. This is an optional property.

Target Properties

Select Topology from the menu and then select Targets. Select the target group and environment, and provide the property details according to the descriptions below.

Cloud Account – Optional. Select the Cloud account used to connect to the Lambda function.

CLI Path – Optional. Directory where the cloud CLI is installed.

AWS Region – Optional. Value of the AWS region.

Given below are the environment-specific values which need to be updated.

Cloud Account

The AWS Cloud account needs to be set here from the drop-down. It will show all Cloud accounts configured under Topology, as mentioned earlier.

CLI Path

The AWS CLI path can be set as an environment property; if it is not set, the plugin will by default look for the CLI on the system classpath.

Override Properties at Project level

Let's assume a scenario where we want to change the Cloud account for a specific project. Apart from setting it at the environment level, it can also be set in the project properties by using Override Property. Please follow the steps below.

  1. Navigate to the Project Configuration tab as shown above.

  2. Next, select the PROPERTIES option from the left-hand pane.

  3. Click on the OVERRIDE option.

  4. Select the Cloud Account option from Property.

  5. Select the Environment from the drop down list.

  6. Select the Target Group from the drop down list.

Build and Deploy Execution

For detailed steps on how to perform a build and deploy, please refer to the document Deploy through FlexDeploy for AWS plugin.

After Deploy Execution
