Search the Community
Showing results for tags 'aws cdk'.
Introduction

APIs are the key to implementing microservices, the building blocks of modern distributed applications. Launching a new API involves defining the behavior, implementing the business logic, and configuring the infrastructure to enforce the behavior and expose the business logic. Using OpenAPI, the AWS Cloud Development Kit (AWS CDK), and AWS Solutions Constructs to build your API lets you focus on each of these tasks in isolation, using a technology specific to each for efficiency and clarity.

The OpenAPI specification is a declarative language that allows you to fully define a REST API in a document completely decoupled from the implementation. The specification defines all resources, methods, query strings, request and response bodies, authorization methods, and any data structures passed in and out of the API. Since it is decoupled from the implementation and written in an easy-to-understand format, the specification can be socialized with stakeholders and developers to generate buy-in before development starts. Even better, since the specification is in a machine-readable syntax (JSON or YAML), it can be used to generate documentation, client code libraries, or mock APIs that mimic an actual API implementation. An OpenAPI specification can be used to fully configure an Amazon API Gateway REST API with custom AWS Lambda integration. Defining the API this way automates the complex task of configuring the API, and it offloads all enforcement of the API details to API Gateway and out of your business logic.

The AWS CDK provides a programming model above the static AWS CloudFormation template, representing all AWS resources as instantiated objects in a high-level programming language. When you instantiate CDK objects in your TypeScript (or other language) code, the CDK "compiles" those objects into a JSON template, then deploys that template with CloudFormation.
I'm not going to spend a lot of time extolling the many virtues of the AWS CDK here; suffice it to say that using programming languages such as TypeScript or Python rather than declarative YAML or JSON allows much more flexibility in defining your infrastructure.

AWS Solutions Constructs is a library of common architectural patterns built on top of the AWS CDK. These multi-service patterns allow you to deploy multiple resources with a single CDK construct. Solutions Constructs follow best practices by default, both for the configuration of the individual resources and for their interaction. While each Solutions Construct implements a very small architectural pattern, they are designed so that multiple constructs can be combined by sharing a common resource. For instance, a Solutions Construct that implements an Amazon Simple Storage Service (Amazon S3) bucket invoking a Lambda function can be deployed alongside a second Solutions Construct in which a Lambda function writes to an Amazon Simple Queue Service (Amazon SQS) queue, by sharing the same Lambda function between the two constructs. You can compose complex architectures by connecting multiple Solutions Constructs together, as you will see in this example.

Infrastructure as Code Abstraction Layers

In this article, you will build a robust, functional REST API based on an OpenAPI specification using the AWS CDK and AWS Solutions Constructs.

How it Works

This example is a microservice that saves and retrieves product orders. The behavior is fully defined by an OpenAPI specification and includes the following methods:

Method | Functionality | Authorization
POST /order | Accepts order attributes in the request body. Returns the orderId assigned to the new order. | AWS Identity and Access Management (IAM)
GET /order/{orderId} | Accepts an orderId as a path parameter. Returns the fully populated order object. | IAM

The architecture implementing the service is shown in the diagram below.
Each method integrates with a Lambda function that implements the interactions with an Amazon DynamoDB table. The API is protected by IAM authorization, and all input and output data is verified by API Gateway. All of this is fully defined in an OpenAPI specification that is used to configure the REST API.

The Two Solutions Constructs Making up the Service Architecture

Infrastructure as code is implemented with the AWS CDK and AWS Solutions Constructs. This example uses two Solutions Constructs:

- aws-lambda-dynamodb – This construct "connects" a Lambda function and a DynamoDB table. It grants the Lambda function the minimum IAM privileges to read and write from the table and provides the DynamoDB table name to the Lambda function code in an environment variable. A Solutions Constructs pattern creates its resources based on best practices by default, but a client can provide construct properties to override the default behaviors. A client can also choose not to have the pattern create a new resource by supplying a resource that already exists.
- aws-openapigateway-lambda – This construct deploys a REST API on API Gateway configured by the OpenAPI specification, integrating each method of the API with a Lambda function. The OpenAPI specification is stored as an asset in S3 and referenced by the CloudFormation template rather than embedded in the template. Once the Lambda functions in the stack have been created, a custom resource processes the OpenAPI asset and updates all the method specifications with the ARN of the associated Lambda function. An API can point to multiple Lambda functions, or a Lambda function can provide the implementation for multiple methods.

In this example you will create the aws-lambda-dynamodb construct first. This construct creates your Lambda function, which you then supply as an existing resource to the aws-openapigateway-lambda constructor.
Sharing this function between the constructs unites the two small patterns into a complete architecture.

Prerequisites

To deploy this example, you will need the following in your development environment:

- Node.js 18.0.0 or later
- TypeScript 3.8 or later (npm install -g typescript)
- AWS CDK 2.82.0 or later (npm install -g aws-cdk && cdk bootstrap)

The cdk bootstrap command launches an S3 bucket and other resources that the CDK requires into your default Region. You will need to bootstrap your account using a role with sufficient privileges; you may need an account administrator to run that command.

Tip – While AWS CDK 2.82.0 is the minimum required to make this example work, AWS recommends regularly updating your apps to use the latest CDK version.

To deploy the example stack, you will need to be running under an IAM role with the following privileges:

- Create API Gateway APIs
- Create IAM roles/policies
- Create Lambda functions
- Create DynamoDB tables
- GET/POST methods on API Gateway
- AWSCloudFormationFullAccess (managed policy)

Build the App

Somewhere on your workstation, create an empty folder named openapi-blog with these commands:

```shell
mkdir openapi-blog && cd openapi-blog
```

Now create an empty CDK application using this command:

```shell
cdk init -l=typescript
```

The application is built with two Solutions Constructs, aws-openapigateway-lambda and aws-lambda-dynamodb. Install them in your application using these commands:

```shell
npm install @aws-solutions-constructs/aws-openapigateway-lambda
npm install @aws-solutions-constructs/aws-lambda-dynamodb
```

Tip – If you get an error along the lines of npm ERR! Could not resolve dependency and npm ERR! peer aws-cdk-lib@"^2.130.0", you've installed a version of Solutions Constructs that depends on a newer version of the CDK. In package.json, update the aws-cdk-lib and aws-cdk dependencies to the version named in the peer error and run npm install, then try the above npm install commands again.
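To illustrate that tip concretely: if the peer error names aws-cdk-lib@"^2.130.0" (the version shown in the example error above), the relevant package.json entries after the update might look roughly like the fragment below. This is a sketch, not the exact file cdk init produces; other entries in your package.json stay as they are.

```json
{
  "dependencies": {
    "aws-cdk-lib": "2.130.0",
    "constructs": "^10.0.0"
  },
  "devDependencies": {
    "aws-cdk": "2.130.0"
  }
}
```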
The OpenAPI REST API specification will be in the api/openapi-blog.yml file. It defines the POST and GET methods, the format of incoming and outgoing data, and the IAM authorization for all HTTP calls. Create a folder named api under openapi-blog. Within the api folder, create a file called openapi-blog.yml with the following contents:

```yaml
---
openapi: 3.0.2
info:
  title: openapi-blog example
  version: '1.0'
  description: 'defines an API with POST and GET methods for an order resource'
# x-amazon-* values are OpenAPI extensions to define API Gateway specific configurations
# This section sets up 2 types of validation and defines params-only validation
# as the default.
x-amazon-apigateway-request-validators:
  all:
    validateRequestBody: true
    validateRequestParameters: true
  params-only:
    validateRequestBody: false
    validateRequestParameters: true
x-amazon-apigateway-request-validator: params-only
paths:
  "/order":
    post:
      x-amazon-apigateway-auth:
        type: AWS_IAM
      x-amazon-apigateway-request-validator: all
      summary: Create a new order
      description: Create a new order
      x-amazon-apigateway-integration:
        httpMethod: POST
        # "OrderHandler" is a placeholder that aws-openapigateway-lambda will
        # replace with the Lambda function when it is available
        uri: OrderHandler
        passthroughBehavior: when_no_match
        type: aws_proxy
      requestBody:
        description: Create a new order
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/OrderAttributes"
        required: true
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/OrderObject"
  "/order/{orderId}":
    get:
      x-amazon-apigateway-auth:
        type: AWS_IAM
      summary: Get Order by ID
      description: Returns order data for the provided ID
      x-amazon-apigateway-integration:
        httpMethod: POST
        # "OrderHandler" is a placeholder that aws-openapigateway-lambda will
        # replace with the Lambda function when it is available
        uri: OrderHandler
        passthroughBehavior: when_no_match
        type: aws_proxy
      parameters:
      - name: orderId
        in: path
        required: true
        schema:
          type: integer
          format: int64
      responses:
        '200':
          description: successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/OrderObject"
        '400':
          description: Bad order ID
        '404':
          description: Order ID not found
components:
  schemas:
    OrderAttributes:
      type: object
      additionalProperties: false
      required:
      - productId
      - quantity
      - customerId
      properties:
        productId:
          type: string
        quantity:
          type: integer
          format: int32
          example: 7
        customerId:
          type: string
    OrderObject:
      allOf:
      - "$ref": "#/components/schemas/OrderAttributes"
      - type: object
        additionalProperties: false
        required:
        - id
        properties:
          id:
            type: string
```

Most of the fields in this OpenAPI definition are explained in the OpenAPI specification, but the fields starting with x-amazon- are extensions unique to configuring API Gateway. In this case, the x-amazon-apigateway-auth values stipulate that the methods be protected with IAM authorization; the x-amazon-apigateway-request-validator values tell the API to validate request parameters by default, and both parameters and request body where appropriate; and the x-amazon-apigateway-integration section defines the custom integration of the method with a Lambda function. When using the Solutions Construct, this field does not identify a specific Lambda function; instead it holds a placeholder string ("OrderHandler") that is replaced with the correct function name during the launch.

While the API will accept and validate requests, you'll need some business logic to actually implement the functionality. Let's create a Lambda function with some rudimentary business logic:

1. Create a folder structure lambda/order under openapi-blog.
2. Within the order folder, create a file called index.js.
3. Paste the code from this file into your index.js file.

Our Lambda function is very simple, consisting of some relatively generic SDK calls to DynamoDB.
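The article links to the full index.js rather than listing it, so the sketch below is a hypothetical reconstruction of its general shape, not the actual file. The DynamoDB-style client is injected through a makeHandler factory (an assumption made here so the routing logic can be exercised without AWS credentials); the real code presumably calls the AWS SDK directly.

```javascript
// Hypothetical sketch of the order handler (not the article's actual index.js).
// The table name comes from the environment variable set by aws-lambda-dynamodb.
const orderTableName = process.env.DDB_TABLE_NAME;

// The DynamoDB-style client is injected so the routing can be tested in isolation.
const makeHandler = (ddb) => async (event) => {
  if (event.httpMethod === 'POST') {
    // POST /order – store the attributes under a freshly generated id
    const attributes = JSON.parse(event.body);
    const id = `ord${Date.now()}`;
    await ddb.put({ TableName: orderTableName, Item: { id, ...attributes } });
    return { statusCode: 200, body: JSON.stringify({ id }) };
  }
  // GET /order/{orderId} – look the order up by its path parameter
  const { Item } = await ddb.get({
    TableName: orderTableName,
    Key: { id: event.pathParameters.orderId },
  });
  return Item
    ? { statusCode: 200, body: JSON.stringify(Item) }
    : { statusCode: 404, body: JSON.stringify({ message: 'Order ID not found' }) };
};

module.exports = { makeHandler };
```

Because the client is injected, the same routing can be driven by a stub in unit tests and by the real DynamoDB DocumentClient in the deployed function.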
Depending upon the HTTP method passed in the event, it either creates a new order or retrieves (and returns) an existing order. Once the stack is deployed, you can look at the IAM role associated with the Lambda function and see that the construct also created a least-privilege policy for accessing the table. When the code is written, the DynamoDB table name is not known, but the aws-lambda-dynamodb construct creates an environment variable with the table name that will do nicely:

```javascript
// Excerpt from index.js
// Get the table name from the environment variable set by aws-lambda-dynamodb
const orderTableName = process.env.DDB_TABLE_NAME;
```

Now that the business logic and API definition are in the project, it's time to add the AWS CDK code that launches the application resources. Since the API definition and your business logic are the differentiated aspects of your application, it would be ideal if the infrastructure hosting your application could be deployed with a minimal amount of code. This is where Solutions Constructs help. First, open the lib/openapi-blog-stack.ts file.
Replace the contents with the following:

```typescript
import * as cdk from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { OpenApiGatewayToLambda } from '@aws-solutions-constructs/aws-openapigateway-lambda';
import { LambdaToDynamoDB } from '@aws-solutions-constructs/aws-lambda-dynamodb';
import { Asset } from 'aws-cdk-lib/aws-s3-assets';
import * as path from 'path';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as ddb from 'aws-cdk-lib/aws-dynamodb';

export class OpenapiBlogStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // This application is going to use a very simple DynamoDB table
    const simpleTableProps = {
      partitionKey: {
        name: "Id",
        type: ddb.AttributeType.STRING,
      },
      // Not appropriate for production, this setting is to ensure the demo can be easily removed
      removalPolicy: cdk.RemovalPolicy.DESTROY
    };

    // This Solutions Construct creates the Orders Lambda function
    // and configures the IAM policy and environment variables "connecting"
    // it to a new DynamoDB table
    const orderApparatus = new LambdaToDynamoDB(this, 'Orders', {
      lambdaFunctionProps: {
        runtime: lambda.Runtime.NODEJS_18_X,
        handler: 'index.handler',
        code: lambda.Code.fromAsset(`lambda/order`),
      },
      dynamoTableProps: simpleTableProps
    });

    // This Solutions Construct creates and configures the REST API,
    // integrating it with the new order Lambda function created by the
    // LambdaToDynamoDB construct above
    const newApi = new OpenApiGatewayToLambda(this, 'OpenApiGatewayToLambda', {
      // The OpenAPI spec is stored as an S3 asset where it can be accessed during the
      // CloudFormation Create Stack command
      apiDefinitionAsset: new Asset(this, 'ApiDefinitionAsset', {
        path: path.join(`api`, 'openapi-blog.yml')
      }),
      // The construct uses these records to integrate the methods in the OpenAPI spec
      // with Lambda functions in the CDK stack
      apiIntegrations: [
        {
          // These ids correspond to the placeholder values for uri in the OpenAPI spec
          id: 'OrderHandler',
          existingLambdaObj: orderApparatus.lambdaFunction
        }
      ]
    });

    // We output the URL of the resource for convenience here
    new cdk.CfnOutput(this, 'OrderUrl', {
      value: newApi.apiGateway.url + 'order',
    });
  }
}
```

Notice that the code to create the infrastructure is only about two dozen lines. The constructs provide best-practice defaults for all the resources they create; you just need to provide information unique to the use case (and any values that must override the defaults). For instance, while the LambdaToDynamoDB construct defines best-practice default properties for the table, the client needs to provide at least the partition key. So that the demo cleans up completely when we're done, there's a removalPolicy property that instructs CloudFormation to delete the table when the stack is deleted. These minimal table properties and the location of the Lambda function code are all you need to provide to launch the LambdaToDynamoDB construct.

The OpenApiGatewayToLambda construct must be told where to find the OpenAPI specification and how to integrate with the Lambda function(s). The apiIntegrations property is a mapping of the placeholder strings used in the OpenAPI spec to the Lambda functions in the CDK stack. This code maps OrderHandler to the Lambda function created by the LambdaToDynamoDB construct. APIs integrating with more than one function can do this by creating more placeholder strings.

Ensure all files are saved and build the application:

```shell
npm run build
```

Launch the CDK stack:

```shell
cdk deploy
```

You may see some AWS_SOLUTIONS_CONSTRUCTS_WARNING:'s here; you can safely ignore them in this case. The CDK displays any IAM changes before continuing, allowing you to review any IAM policies created in the stack before actually deploying. Enter y and press Enter to continue deploying the stack. When the deployment concludes successfully, you should see something similar to the following output:

```
...
OpenapiBlogStack: deploying... [1/1]
OpenapiBlogStack: creating CloudFormation changeset...

OpenapiBlogStack

Deployment time: 97.78s

Outputs:
OpenapiBlogStack.OpenApiGatewayToLambdaSpecRestApiEndpointD1FA5E3A = https://b73nx617gl.execute-api.us-east-1.amazonaws.com/prod/
OpenapiBlogStack.OrderUrl = https://b73nx617gl.execute-api.us-east-1.amazonaws.com/prod/order
Stack ARN:
arn:aws:cloudformation:us-east-1:123456789012:stack/OpenapiBlogStack/01df6970-dc05-11ee-a0eb-0a97cfc33817

Total time: 100.07s
```

Test the App

Let's test the new REST API using the API Gateway management console to confirm it's working as expected. We'll create a new order, then retrieve it.

1. Open the API Gateway management console and click APIs in the left-side menu.
2. Find the new REST API in the list of APIs; it will begin with OpenApiGatewayToLambda and have a Created date of today. Click on it to open it.
3. On the Resources page that appears, click POST under /order.
4. In the lower right panel, select the Test tab (if the Test tab is not shown, click the arrow to shift the displayed tabs). The POST must include order data in the request body that matches the OrderAttributes schema defined by the OpenAPI spec. Enter the following data in the Request body field:

```json
{
  "productId": "prod234232",
  "customerId": "cust203439",
  "quantity": 5
}
```

5. Click the orange Test button at the bottom of the page.

The API Gateway console displays the results of the REST API call. Key things to look for are a Status of 200 and a Response body resembling "{\"id\":\"ord1712062412777\"}" (this is the id of the new order created in the system; your value will differ). You could go to the DynamoDB console to confirm that the new order exists in the table, but it will be more fun to check by querying the API.
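The escaped Response body deserves a brief aside. With a Lambda proxy integration, the function returns its payload as a JSON string in the body field, which is why the console shows the order object JSON-encoded inside a quoted string. A client recovers the object by parsing that string, as this minimal illustration shows (the order id here is just the example value from above):

```javascript
// A proxy-integration response carries its payload as a JSON string in `body`,
// which is why the console displays it with escaped quotation marks.
const response = {
  statusCode: 200,
  body: JSON.stringify({ id: 'ord1712062412777' }),
};

// A client recovers the object by parsing the body string.
const order = JSON.parse(response.body);
console.log(order.id); // → ord1712062412777
```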
Use the GET method to confirm the new order was persisted:

1. Copy the id value from the Response body of the POST call – "{\"id\":\"ord1712062412777\"}". Tip – select just the text between the \" patterns (don't select the backslashes or quotation marks).
2. Select the GET method under /{orderId} in the resource list.
3. Paste the orderId you copied earlier into the orderId field under Path.
4. Click Test. This executes the GET method and returns the order you just created.

You should see a Status of 200 and a Response body with the full data from the order you created in the previous step:

```
"{\"id\":\"ord1712062412777\",\"productId\":\"prod234232\",\"quantity\":\"5\",\"customerId\":\"cust203439\"}"
```

Now let's see how API Gateway enforces the inputs of the API by going back to the POST method and intentionally sending an incorrect set of order attributes:

1. Click POST under /order.
2. In the lower right panel, select the Test tab.
3. Enter the following data in the Request body field (note "quality" in place of the required "quantity"):

```json
{
  "productId": "prod234232",
  "customerId": "cust203439",
  "quality": 5
}
```

4. Click the orange Test button at the bottom of the page.

This time you should see an HTTP error status of 400 and a Response body of {"message": "Invalid request body"}. Note that API Gateway caught the error, not any code in your Lambda function. In fact, the Lambda function was never invoked (you can take my word for it, or you can check for yourself in the Lambda management console).

Because you're invoking the methods directly from the console, you are circumventing IAM authorization. If you would like to test the API with an IAM-authorized call from a client, this video includes excellent instruction on how to accomplish this from Postman.

Cleanup

To clean up the resources in the stack, run this command:

```shell
cdk destroy
```

In response to Are you sure you want to delete: OpenApiBlogStack (y/n)?, type y (once again, you can safely ignore the warnings here).
Conclusion

Defining your API in a standalone definition file decouples it from your implementation, provides documentation and client benefits, and leads to more clarity for all stakeholders. Using that definition to configure your REST API in API Gateway creates a robust API that offloads enforcement of the API from your business logic to your tooling. Configuring a REST API that fully utilizes the functionality of API Gateway can be a daunting challenge, but defining the API behavior with an OpenAPI specification, then implementing that API using the AWS CDK and AWS Solutions Constructs, accelerates and simplifies the effort. The CloudFormation template that eventually launched this API is over 1,200 lines long, yet with the AWS CDK and AWS Solutions Constructs you were able to generate this template with roughly 25 lines of TypeScript.

This is just one example of how Solutions Constructs enable developers to rapidly produce high-quality architectures with the AWS CDK. At this writing there are 72 Solutions Constructs covering 29 AWS services; take a moment to browse what's available on the Solutions Constructs site. Introducing these in your CDK stacks accelerates your development, jump-starts your journey toward being well-architected, and helps keep you well-architected as best practices and technologies evolve in the future.

About the Author

Biff Gaut has been shipping software since 1983, from small startups to large IT shops. Along the way he has contributed to two books, spoken at several conferences, and written many blog posts. He has been with AWS for more than 10 years and is currently a Principal Engineer on the AWS Solutions Constructs team, helping customers deploy better architectures more quickly.

View the full article
Customers often ask for help with implementing Blue/Green deployments to Amazon Elastic Container Service (Amazon ECS) using AWS CodeDeploy. Their use cases usually involve cross-Region and cross-account deployment scenarios. These requirements are challenging enough on their own, but in addition to those, there are specific design decisions that need to be considered when using CodeDeploy. These include how to configure CodeDeploy, when and how to create CodeDeploy resources (such as Application and Deployment Group), and how to write code that can be used to deploy to any combination of account and Region. Today, I will discuss those design decisions in detail and how to use CDK Pipelines to implement a self-mutating pipeline that deploys services to Amazon ECS in cross-account and cross-Region scenarios. At the end of this blog post, I also introduce a demo application, available in Java, that follows best practices for developing and deploying cloud infrastructure using AWS Cloud Development Kit (AWS CDK)... View the full article
Tagged with: deployments, aws cdk (and 4 more)
AWS unveils new capabilities for cdk8s, allowing seamless synthesis of applications into Helm charts on one hand, and native import of existing Helm charts into cdk8s applications on the other. In addition, cdk8s can now interpret deploy-time tokens of the AWS CDK and CDK For Terraform, all during the cdk8s synthesis phase. Helm stands out as a widely embraced solution for the deployment and management of Kubernetes applications. By converging cdk8s and Helm, users can enjoy a unified workflow for creating and deploying Kubernetes manifests. With the recent addition to the "cdk8s synth" command, you can transform a cdk8s app directly into a Helm Chart, ready to be integrated with Helm deployments. View the full article
Tagged with: helm, helm charts (and 3 more)
GitHub Actions is a feature on GitHub's popular development platform that helps you automate your software development workflows in the same place you store code and collaborate on pull requests and issues. You can write individual tasks called actions and combine them to create a custom workflow. Workflows are custom automated processes that you can set up in your repository to build, test, package, release, or deploy any code project on GitHub.

A cross-account deployment strategy is a CI/CD pattern or model in AWS. In this pattern, you have a designated AWS account called tools, where all CI/CD pipelines reside. Deployment is carried out by these pipelines across other AWS accounts, which may correspond to dev, staging, or prod. For more information about a cross-account strategy in reference to CI/CD pipelines on AWS, see Building a Secure Cross-Account Continuous Delivery Pipeline.

In this post, we show you how to use GitHub Actions to deploy an AWS Lambda-based API to an AWS account and Region using the cross-account deployment strategy. Using GitHub Actions may have associated costs in addition to the cost associated with the AWS resources you create. For more information, see About billing for GitHub Actions.

Prerequisites

Before proceeding any further, you need to identify and designate the two AWS accounts required for the solution to work:

- Tools – Where you create an AWS Identity and Access Management (IAM) user for GitHub Actions to use to carry out deployment.
- Target – Where deployment occurs. You can treat this as your dev/stage/prod environment.

You also need to create two AWS account profiles in ~/.aws/credentials for the tools and target accounts, if you don't already have them. These profiles need to have sufficient permissions to run an AWS Cloud Development Kit (AWS CDK) stack. They should be your private profiles and only be used during the course of this use case, so it should be fine if you want to use admin privileges.
Don't share the profile details, especially if they have admin privileges. I recommend removing the profiles when you're finished with this walkthrough. For more information about creating an AWS account profile, see Configuring the AWS CLI.

Solution overview

You start by building the necessary resources in the tools account: an IAM user with permissions to assume a specific IAM role from the target account to carry out deployment. For simplicity, we refer to this IAM role as the cross-account role, as specified in the architecture diagram. You also create the cross-account role in the target account; it trusts the IAM user in the tools account and provides the permissions AWS CDK needs to bootstrap and initiate creating an AWS CloudFormation deployment stack in the target account. GitHub Actions uses the tools account IAM user credentials to assume the cross-account role to carry out deployment.

In addition, you create an AWS CloudFormation execution role in the target account, which the AWS CloudFormation service assumes in the target account. This role has permissions to create your API resources, such as a Lambda function and Amazon API Gateway, in the target account. This role is passed to the AWS CloudFormation service via AWS CDK.

You then configure your tools account IAM user credentials in your Git secrets and define the GitHub Actions workflow, which triggers upon pushing code to a specific branch of the repo. The workflow then assumes the cross-account role and initiates deployment.

The following diagram illustrates the solution architecture and shows AWS resources across the tools and target accounts.

Creating an IAM user

You start by creating an IAM user called git-action-deployment-user in the tools account. The user needs to have only programmatic access. Clone the GitHub repo aws-cross-account-cicd-git-actions-prereq and navigate to the folder tools-account.
Here you find the JSON parameter file src/cdk-stack-param.json, which contains the parameter CROSS_ACCOUNT_ROLE_ARN, the ARN for the cross-account role you create in the next step in the target account. In the ARN, replace <target-account-id> with the actual account ID of your designated AWS target account. Then run deploy.sh, passing the name of the tools AWS account profile you created earlier. The script compiles the code, builds a package, and uses the AWS CDK CLI to bootstrap and deploy the stack. See the following code:

```shell
cd aws-cross-account-cicd-git-actions-prereq/tools-account/
./deploy.sh "<AWS-TOOLS-ACCOUNT-PROFILE-NAME>"
```

You should now see two stacks in the tools account: CDKToolkit and cf-GitActionDeploymentUserStack. AWS CDK creates the CDKToolkit stack when you bootstrap the AWS CDK app; this creates an Amazon Simple Storage Service (Amazon S3) bucket needed to hold deployment assets such as the CloudFormation template and Lambda code package. cf-GitActionDeploymentUserStack creates the IAM user with permission to assume git-action-cross-account-role (which you create in the next step). On the Outputs tab of the stack, you can find the user access key and the AWS Secrets Manager ARN that holds the user secret. To retrieve the secret, go to Secrets Manager. Record the secret to use later.

Creating a cross-account IAM role

In this step, you create two IAM roles in the target account: git-action-cross-account-role and git-action-cf-execution-role. git-action-cross-account-role provides the required deployment-specific permissions to the IAM user you created in the last step.
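To make the trust relationship concrete: the cross-account role's trust policy names the tools-account user as a principal allowed to assume it. A minimal sketch of such a trust policy follows; this is an illustration, not the repo's actual template, and <tools-account-id> is a placeholder for your own account ID:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<tools-account-id>:user/git-action-deployment-user"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```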
The IAM user in the tools account can assume this role and perform the following tasks:

- Upload deployment assets such as the CloudFormation template and Lambda code package to a designated S3 bucket via AWS CDK
- Create a CloudFormation stack that deploys API Gateway and Lambda using AWS CDK

AWS CDK passes git-action-cf-execution-role to AWS CloudFormation to create, update, and delete the CloudFormation stack. It has permissions to create API Gateway and Lambda resources in the target account. To deploy these two roles using AWS CDK, complete the following steps:

1. In the already cloned repo from the previous step, navigate to the folder target-account. This folder contains the JSON parameter file cdk-stack-param.json, which contains the parameter TOOLS_ACCOUNT_USER_ARN, the ARN for the IAM user you previously created in the tools account. In the ARN, replace <tools-account-id> with the actual account ID of your designated AWS tools account.
2. Run deploy.sh, passing the name of the target AWS account profile you created earlier. The script compiles the code, builds the package, and uses the AWS CDK CLI to bootstrap and deploy the stack. See the following code:

```shell
cd ../target-account/
./deploy.sh "<AWS-TARGET-ACCOUNT-PROFILE-NAME>"
```

You should now see two stacks in your target account: CDKToolkit and cf-CrossAccountRolesStack. AWS CDK creates the CDKToolkit stack when you bootstrap the AWS CDK app; this creates an S3 bucket to hold deployment assets such as the CloudFormation template and Lambda code package. The cf-CrossAccountRolesStack creates the two IAM roles discussed at the beginning of this step. The IAM role git-action-cross-account-role now has the IAM user added to its trust policy. On the Outputs tab of the stack, you can find these roles' ARNs. Record these ARNs as you conclude this step.

Configuring secrets

One of the GitHub actions we use is aws-actions/configure-aws-credentials@v1.
This action configures AWS credentials and Region environment variables for use in the GitHub Actions workflow; the AWS CDK CLI detects these environment variables to determine the credentials and Region to use for deployment. For our cross-account deployment use case, aws-actions/configure-aws-credentials@v1 takes three pieces of sensitive information besides the Region: AWS_ACCESS_KEY_ID, AWS_ACCESS_KEY_SECRET, and CROSS_ACCOUNT_ROLE_TO_ASSUME. Secrets are the recommended way to store sensitive information in a GitHub repo; they keep the information in an encrypted format. For more information about referencing secrets in a workflow, see Creating and storing encrypted secrets.

Before we continue, you need your own empty GitHub repo to complete this step. Use an existing repo if you have one, or create a new repo. You configure secrets in this repo; in the next section, you check in the code provided by the post to deploy a Lambda-based API CDK stack into it.

1. On the GitHub console, navigate to your repo settings and choose the Secrets tab.
2. Add a new secret with the name TOOLS_ACCOUNT_ACCESS_KEY_ID. Copy the access key ID from the output OutGitActionDeploymentUserAccessKey of the stack GitActionDeploymentUserStack in the tools account and enter it in the Value field.
3. Repeat this step to add two more secrets:
   - TOOLS_ACCOUNT_SECRET_ACCESS_KEY (value retrieved from AWS Secrets Manager in the tools account)
   - CROSS_ACCOUNT_ROLE (value copied from the output OutCrossAccountRoleArn of the stack cf-CrossAccountRolesStack in the target account)

You should now have three secrets as shown below.

Deploying with GitHub Actions

As the final step, first clone your empty repo where you set up your secrets. Download and copy the code from the GitHub repo into your empty repo. The folder structure of your repo should mimic the folder structure of the source repo, as in the following screenshot. Let's take a detailed look at the code base.
First and foremost, we use TypeScript to deploy our Lambda API, so we need an AWS CDK app and an AWS CDK stack:

- The app is defined in app.ts under the repo root folder.
- The stack definition is located under the stack-specific folder src/git-action-demo-api-stack.
- The Lambda code is located under the Lambda-specific folder src/git-action-demo-api-stack/lambda/git-action-demo-lambda.
- The deployment script deploy.sh compiles the app and Lambda code, packages the Lambda code into a .zip file, bootstraps the app by copying the assets to an S3 bucket, and deploys the stack.

To deploy the stack, the AWS CDK has to pass CFN_EXECUTION_ROLE to AWS CloudFormation; this role is configured in src/params/cdk-stack-param.json. Replace <target-account-id> with your own designated AWS target account ID.

Finally, we define the GitHub Actions workflow under the .github/workflows/ folder, per the specifications defined by GitHub Actions. GitHub Actions automatically discovers workflows in this location and triggers them when their conditions match. Our workflow .yml files are named in the format cicd-workflow-<region>.yml, where <region> identifies the deployment Region in the target account. In our use case, we use us-east-1 and us-west-2, which are also defined as environment variables in the workflows.

A GitHub Actions workflow has a standard hierarchy: a workflow is a collection of jobs, and each job is a collection of one or more steps. Each job runs on a virtual machine called a runner, which can be either GitHub-hosted or self-hosted. We use the GitHub-hosted runner ubuntu-latest because it works well for our use case. For more information about GitHub-hosted runners, see Virtual environments for GitHub-hosted runners; for details on their preinstalled software, see Software installed on GitHub-hosted runners.

The workflow also has a trigger condition specified at the top.
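As an illustration (not part of this post's repo), a cron-based schedule trigger would look like this at the top of a workflow file:

```yaml
on:
  schedule:
    # Runs every day at 02:00 UTC (hypothetical schedule, for illustration only)
    - cron: '0 2 * * *'
```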
You can schedule the trigger based on cron settings or trigger it upon code pushed to a specific branch in the repo. See the following code:

```yaml
name: Lambda API CICD Workflow
# This workflow is triggered on pushes to the repository branch master.
on:
  push:
    branches:
      - master

# Initializes environment variables for the workflow
env:
  REGION: us-east-1 # Deployment Region

jobs:
  deploy:
    name: Build And Deploy
    # This job runs on Linux
    runs-on: ubuntu-latest
    steps:
      # Checkout code from git repo branch configured above, under folder $GITHUB_WORKSPACE.
      - name: Checkout
        uses: actions/checkout@v2
      # Sets up AWS profile.
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.TOOLS_ACCOUNT_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.TOOLS_ACCOUNT_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.REGION }}
          role-to-assume: ${{ secrets.CROSS_ACCOUNT_ROLE }}
          role-duration-seconds: 1200
          role-session-name: GitActionDeploymentSession
      # Installs CDK and other prerequisites
      - name: Prerequisite Installation
        run: |
          sudo npm install -g aws-cdk@1.31.0
          cdk --version
          aws s3 ls
      # Build and Deploy CDK application
      - name: Build & Deploy
        run: |
          cd $GITHUB_WORKSPACE
          ls -a
          chmod 700 deploy.sh
          ./deploy.sh
```

For more information about triggering workflows, see Triggering a workflow with events.

We have configured a single-job workflow for our use case; it runs on ubuntu-latest and is triggered upon a code push to the master branch. (When you create an empty repo, master becomes the default branch.) The workflow has four steps:

1. Check out the code from the repo using the standard action actions/checkout@v2. The code is checked out into a folder defined by the variable $GITHUB_WORKSPACE, which becomes the root location of our code.
2. Configure AWS credentials using aws-actions/configure-aws-credentials@v1, as explained in the previous section.
3. Install your prerequisites.
In our use case, the only prerequisite we need is the AWS CDK. After installing it, we run a quick test with the AWS Command Line Interface (AWS CLI) command aws s3 ls; if cross-account access was established successfully in the previous workflow step, this command returns the list of buckets in the target account. The final step navigates to the root location of the code, $GITHUB_WORKSPACE, and runs the deploy.sh script.

Now check the code into the master branch of your repo. This triggers the workflow, which you can monitor on the Actions tab of your repo. The commit message you provide is displayed for the respective run of the workflow, and you can choose the workflow link to monitor the log for each individual step. In the target account, you should now see the CloudFormation stack cf-GitActionDemoApiStack in us-east-1 and us-west-2. The API resource URL DocUploadRestApiResourceUrl is located on the Outputs tab of the stack; you can invoke your API by opening this URL in a browser.

Clean up

To remove all the resources from the target and tools accounts, complete the following steps in order:

1. Delete the CloudFormation stack cf-GitActionDemoApiStack from the target account. This removes the Lambda and API Gateway resources and their associated IAM roles.
2. Delete the CloudFormation stack cf-CrossAccountRolesStack from the target account. This removes the cross-account role and the CloudFormation execution role you created.
3. Go to the CDKToolkit stack in the target account and note the BucketName on the Outputs tab. Empty that bucket, then delete the stack.
4. Delete the CloudFormation stack cf-GitActionDeploymentUserStack from the tools account. This removes the cross-account-deploy-user IAM user.
5. Go to the CDKToolkit stack in the tools account and note the BucketName on the Outputs tab. Empty that bucket, then delete the stack.

Security considerations

Cross-account IAM roles are very powerful and need to be handled carefully.
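One safeguard is the trust policy on git-action-cross-account-role, which limits who can assume it. As an illustrative sketch (the exact user path is an assumption based on the cross-account-deploy-user mentioned in the cleanup steps, and <tools-account-id> is the same placeholder used in the parameter files), it has roughly this shape:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<tools-account-id>:user/cross-account-deploy-user"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Only the named principal can assume the role; any other caller is denied.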
For this post, we strictly limited the cross-account IAM role to specific Amazon S3 and CloudFormation permissions, which ensures that the role can perform only those actions. The actual creation of the Lambda, API Gateway, and Amazon DynamoDB resources happens via the AWS CloudFormation IAM role, which AWS CloudFormation assumes in the target account. Also make sure that you use secrets to store sensitive workflow configuration, as described in the section Configuring secrets.

Conclusion

In this post, we showed how you can use GitHub's popular software development platform to securely deploy to AWS accounts and Regions using GitHub Actions and the AWS CDK. Build your own GitHub Actions CI/CD workflow as shown in this post.

About the author

Damodar Shenvi Wagle is a Cloud Application Architect at AWS Professional Services. His areas of expertise include architecting serverless solutions, CI/CD, and automation.