Showing results for tags 'aws codepipeline'.
-
AWS CodePipeline V2 type pipelines now support stage-level rollback to help customers confidently deploy changes to their production environment. When a pipeline execution fails in a stage because one or more actions fail, customers can quickly return that stage to a known good state by rolling back to a previously successful pipeline execution in that stage. Customers can roll back changes in any stage, whether it succeeded or failed, except the Source stage. View the full article
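The announcement describes the console experience; the same rollback can also be started programmatically. Below is a minimal sketch (not part of the announcement) using boto3, assuming a recent SDK release that exposes the RollbackStage API; the pipeline name, stage name, and target execution ID are placeholders.

    import boto3

    codepipeline = boto3.client("codepipeline")

    # Roll the Deploy stage back to a previously successful execution.
    # "MyPipeline", "Deploy", and the execution ID are placeholders.
    response = codepipeline.rollback_stage(
        pipelineName="MyPipeline",
        stageName="Deploy",                           # any stage except Source
        targetPipelineExecutionId="0ab1c2d3-example",
    )
    print("Rollback started as execution:", response["pipelineExecutionId"])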
-
AWS CodePipeline is a managed continuous delivery service that automates your release pipelines for application and infrastructure updates. Today, CodePipeline adds triggers and new execution modes to support teams with various delivery strategies. These features give customers more choice in the pipelines they build. In this post, I am going to show you how to use triggers and pipeline execution modes together to create three pipeline designs, examples requested by customers that practice branch-based development or manage multiple projects within a monorepo.

Pipeline #1: Create a GitFlow (multi-branch) release pipeline.
Pipeline #2: Run a pipeline on all pull requests (PRs).
Pipeline #3: Run a pipeline on a single folder within a monorepo.

As I walk through each of the pipelines, you will learn more about these features and how to use them. After completing the blog, you can use triggers and execution modes to adapt these examples to your pipeline needs.

Pipeline #1 – Create a GitFlow (multi-branch) release pipeline

GitFlow is a development model that manages large projects with parallel development and releases using long-running branches. GitFlow uses two permanent branches, main and develop, along with supporting feature, release, and hotfix branches. Since I will cover triggering a pipeline from multiple branches, these concepts can be applied to simplify other multi-branch pipeline strategies such as GitHub flow.

I can create pipelines using the AWS Management Console, AWS CLI, AWS CloudFormation, or by writing code that calls the CodePipeline CreatePipeline API. In this blog, I will keep things simple by creating two pipelines: a release pipeline and a feature development pipeline.

I start by navigating to the CodePipeline console and choosing Create pipeline. In the first step, Pipeline settings, shown in Figure 1 below, you will now see options for the newly added execution modes, Queued and Parallel.

Figure 1. Example GitFlow release pipeline settings for Queued execution mode.

The execution mode of the pipeline determines how multiple executions are handled:

Superseded – an execution that started more recently can overtake one that began earlier. Before today, CodePipeline only supported the Superseded execution mode.
Queued – executions wait and do not overtake executions that have already started. Executions are processed one by one in the order in which they are queued.
Parallel – executions are independent of one another and do not wait for other executions to complete before starting.

The first pipeline, the release pipeline, will trigger for the main, develop, hotfix, and release branches. I select Queued, since I want every push to these branches to run in the order in which it triggered the pipeline. I make sure the Pipeline type chosen is V2 and I click Next (a short sketch of how these settings appear in the pipeline declaration follows this step).

Figure 2. Example source connection and repository.

In step two, Add source stage, I select my Source provider, Connection, Repository name, and Default branch. I need to use a source provider that uses a connection to my external code repository; in this example I am using GitHub, so I select Connect to GitHub. Connections authorize and establish configurations that associate a third-party provider such as GitHub with CodePipeline.

Now that I have my Source set up, I am going to configure a Trigger. Triggers define the event type that starts the pipeline, such as a code push or pull request. I select Specify filter from the Trigger types, since I want to add a filtered trigger.
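As an aside on the settings chosen above: the pipeline type and execution mode are plain fields on the pipeline declaration, so they can also be set outside the console. Here is a minimal boto3 sketch (not from the original post), assuming a pipeline named release-pipeline already exists; the name is a placeholder and the rest of the declaration is elided.

    import boto3

    codepipeline = boto3.client("codepipeline")

    # Fetch the existing declaration, switch it to a V2 Queued pipeline, and save it.
    declaration = codepipeline.get_pipeline(name="release-pipeline")["pipeline"]
    declaration["pipelineType"] = "V2"        # triggers and execution modes require V2
    declaration["executionMode"] = "QUEUED"   # alternatives: "SUPERSEDED" (default), "PARALLEL"

    codepipeline.update_pipeline(pipeline=declaration)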
For this pipeline, I select Push for the Event type. A push trigger starts a pipeline when a change is pushed to the source repository. The execution uses the files in the branch that is pushed to, the destination branch.

Next, I select the Filter type of Branch. The branch filter type specifies which branches in the connected GitHub repository the trigger monitors in order to know when to start an execution. There are two types of branch filters:

Include – the trigger will start a pipeline if the branch name matches the pattern.
Exclude – the trigger will NOT start a pipeline if the branch name matches the pattern.

Note: If Include and Exclude both have the same pattern, then the default is to exclude the pattern.

Branch patterns are entered in glob format, as detailed in Working with glob patterns in syntax. To specify the branches I want to trigger on, I enter main,develop,hotfix/**,release/** in Include and I leave Exclude empty.

Figure 3. Example GitFlow release pipeline for push event type and branch filters.

I am done configuring the filters and I click Next. To keep the focus of the blog on the pipeline and not the application, I will skip ahead to Create pipeline. If you are curious about my application and build step, I followed the example in AWS CodeBuild adds support for AWS Lambda compute mode.

Next, I create the feature development pipeline. The feature development pipeline will trigger for feature branches. This time, I select the Parallel execution mode, as developers should not be blocked by their peers working in other feature branches. I make sure the Pipeline type chosen is V2 and I click Next.

Figure 4. Example GitFlow feature pipeline settings for Parallel execution mode.

In Step 2, the source provider and connection are set up the same as for the previous release pipeline; see Figure 2 above. Once the Source step is complete, I configure my Trigger with an Event type of Push, but this time I only enter feature/** for Include.

Figure 5. Example GitFlow feature pipeline for push event type and branch filters.

I am done configuring the filters and I skip forward to Create pipeline. After the pipeline is finished creating, I can now see both of the pipelines I created, the release pipeline and the feature development pipeline.

Figure 6. Example GitFlow pipelines.

To verify my pipeline setup, I create and merge multiple code changes to feature branches and to the release branches – develop, release, and main. The Pipeline view now displays the executions that have been triggered by the matching branches. Note how these executions have been successfully added to the queue by the pipeline.

Figure 7. Example GitFlow release executions queued.

I have now implemented a GitFlow release pipeline. By using Branch filter types and Push event triggers, you can now extend this example to meet your branch-based development needs. A declaration-level sketch of the two triggers built in this section follows.
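The following is a minimal sketch (not from the original post) of the branch filters above expressed as trigger declarations on the pipeline, assuming each pipeline's source action is named "Source" and uses a CodeStar connection; the names are placeholders.

    # Release pipeline (Queued): trigger on pushes to the long-running branches.
    release_trigger = {
        "providerType": "CodeStarSourceConnection",
        "gitConfiguration": {
            "sourceActionName": "Source",
            "push": [
                {
                    # Same glob patterns entered in the console's Include field.
                    "branches": {"includes": ["main", "develop", "hotfix/**", "release/**"]},
                }
            ],
        },
    }

    # Feature development pipeline (Parallel): trigger on pushes to feature branches.
    feature_trigger = {
        "providerType": "CodeStarSourceConnection",
        "gitConfiguration": {
            "sourceActionName": "Source",
            "push": [{"branches": {"includes": ["feature/**"]}}],
        },
    }

    # Each would be attached to its pipeline with declaration["triggers"] = [...]
    # before calling update_pipeline, as in the earlier sketch.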
Pipeline #2 – Run a pipeline on all pull requests (PRs)

Before proceeding, I recommend you review the concepts covered in Pipeline #1, as you will build on that knowledge. Triggering a pipeline on a pull request (PR) is a common continuous integration pattern to catch build and test failures before the PR is merged into the branch. A PR pipeline is often faster and lighter than the full release pipeline, limiting tests like security scans, validation tests, or performance tests to the changes in the PR rather than running them on every commit. Having a single pipeline triggered for all PRs allows reviewing and validating any proposed changes to the repository before merging.

To start, I create a new pipeline by clicking Create pipeline. I change the Execution mode to Parallel. I choose Parallel because the development team will be working on multiple features at the same time and it is wasteful to wait for other executions to finish. I make sure the Pipeline type chosen is V2 and I click Next.

Figure 8. Example PR pipeline settings for Parallel execution mode.

As with the previous pipeline, the Source provider and connection are set up as shown in Figure 2 above. Once the Source step is set up, I configure my Pull request trigger. For this pipeline, I select Pull request for the Event type. A pull request trigger starts a pipeline when a pull request is opened, updated, or closed in the source repository. The execution uses the files in the branch that the change is being pulled from, the source branch.

Next, I select Pull request is created and New revision is made to pull request for Events for pull request. To match pull requests for all branches, I enter ** under Include for Branches and leave Exclude empty.

Figure 9. Example PR pipeline for pull request event type.

I will fast-forward to Create pipeline, skipping the details of the build and deploy steps, similar to what I did in Pipeline #1. Once the pipeline has finished creating, I open a few PRs in my GitHub repository as a test. Back in CodePipeline, when I click on my pipeline, I notice the pipeline takes me straight to the Execution history view. The reason I am redirected to the execution history is that the pipeline execution mode is Parallel and all executions are independent. From this view, I see the Trigger column displaying details about each pull request that has triggered the pipeline.

Figure 10. Example PR pipeline with executions in parallel.

Note: To view an individual execution's Pipeline view, click its Execution ID.

I have now implemented a PR validation pipeline for all PRs across branches. By using Pull request event triggers and Branch filter types, you can now extend this example to meet your PR pipeline needs. A sketch of this pull-request trigger as a declaration follows.
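As before, this is a minimal, assumed sketch (not from the original post) of the pull-request trigger configured above. The event names follow the CodePipeline pull request event types and are assumed here to be OPEN and UPDATED for "created" and "new revision"; the source action name is a placeholder.

    pr_trigger = {
        "providerType": "CodeStarSourceConnection",
        "gitConfiguration": {
            "sourceActionName": "Source",
            "pullRequest": [
                {
                    "events": ["OPEN", "UPDATED"],     # PR created / new revision pushed (assumed enum values)
                    "branches": {"includes": ["**"]},  # match pull requests for all branches
                }
            ],
        },
    }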
Pipeline #3 – Run a pipeline on a single folder within a monorepo

Before proceeding, I recommend you review the concepts covered in Pipeline #1, as you will build on that knowledge. A monorepo is a software development strategy in which a single repository contains the code for multiple projects. Running pipelines for every project in the monorepo on every commit can be inefficient, especially when each project requires different pipelines. For this pipeline example, I want to limit pipeline executions to changes inside the infrastructure folder on the main branch. This can reduce cost, speed up deployments, and optimize resource usage.

To start, I create a new pipeline by clicking Create pipeline. For this example, I keep the default Execution mode, Superseded, since I do not have any specific execution mode requirements. I make sure the Pipeline type chosen is V2 and I click Next. As with the previous pipelines, the Source provider and connection are set up as shown in Figure 2 above.

Once the Source step is complete, I configure my Trigger to focus on the infrastructure folder in the main branch. For this pipeline, I select Push for the Event type. Next, I select the Filter type of Branch. To match pushes to only main, I enter main under Include for Branches and leave Exclude empty. Under File paths, for Include, I enter infrastructure/** and I leave Exclude empty. The file paths filter type specifies file path names in the source repository that the trigger monitors in order to know when to start an execution. Similar to branch filters, I can specify file path name patterns in glob format under Include and Exclude. (A sketch of this trigger as a declaration appears at the end of this article.)

Figure 11. Example monorepo pipeline for push event type and file path filters.

I click Next, since I am done configuring the filters, and I jump ahead to Create pipeline, omitting the details of the build and deploy steps as I did in Pipeline #1. Once the pipeline has been created, I can test the pipeline trigger in GitHub by making changes on the main branch both inside and outside the infrastructure folder. To verify that only changes inside the infrastructure folder invoke the pipeline, I open the History for the pipeline in CodePipeline and confirm that only the executions I am expecting are running.

Figure 12. Example monorepo pipeline with only infrastructure executions.

I have now selectively invoked a pipeline based on repository changes in a monorepo. By using File paths filter types, you can now extend this example to meet your monorepo release pipeline needs.

Conclusion

AWS CodePipeline's new triggers and execution modes unlock new patterns for building pipelines on AWS. In this post, I discussed the new features and three popular pipeline patterns you can build. If you are adopting GitFlow or your own multi-branch strategy, CodePipeline simplifies managing release pipelines for multi-branch models. Whether you are using File path filter types for monorepos or leveraging Parallel execution mode to unblock developers, CodePipeline accelerates the delivery of your software. Check out the AWS CodePipeline User Guide and hands-on tutorials to automate your delivery workflows today.

Michael Ohde
Michael Ohde is a Senior Solutions Architect from Long Beach, CA. As a Product Acceleration Solutions Architect at AWS, he currently assists Independent Software Vendors (ISVs) in the GovTech and EdTech sectors in building modern applications using practices like serverless, DevOps, and AI/ML. View the full article
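Referenced from Pipeline #3 above: a minimal, assumed sketch of its trigger declaration, limiting executions to pushes on main that touch files under infrastructure/. The structure and names follow the same assumptions as the earlier sketches and are not taken from the original post.

    monorepo_trigger = {
        "providerType": "CodeStarSourceConnection",
        "gitConfiguration": {
            "sourceActionName": "Source",
            "push": [
                {
                    "branches": {"includes": ["main"]},
                    "filePaths": {"includes": ["infrastructure/**"]},
                }
            ],
        },
    }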
-
AWS CodePipeline announces the ability to start a pipeline execution with source revision overrides. Until today, a manually started pipeline execution would automatically select the "latest revision" for each source in the pipeline. The "latest revision" depended on the source action type. For example, for a CodeCommit source, the HEAD commit reference in the configured repository and branch was used, and for an Amazon Elastic Container Registry (ECR) source, the latest digest of the configured image repository and tag was used. Now, when you start a pipeline execution, you can override the source revisions for the source actions in your pipeline. View the full article
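As an illustration (not part of the announcement), here is a minimal boto3 sketch of starting an execution while pinning a source action to a specific revision instead of the latest one. It assumes an SDK release that includes the sourceRevisions parameter; the pipeline name, action name, and commit value are placeholders.

    import boto3

    codepipeline = boto3.client("codepipeline")

    codepipeline.start_pipeline_execution(
        name="MyPipeline",
        sourceRevisions=[
            {
                "actionName": "Source",            # the source action to override
                "revisionType": "COMMIT_ID",       # e.g. IMAGE_DIGEST for an ECR source
                "revisionValue": "8d3f0e1example", # the exact commit to run through the pipeline
            }
        ],
    )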
-
Tools and platforms form the backbone of seamless software delivery in the ever-evolving world of Continuous Integration and Continuous Deployment (CI/CD). For years, Jenkins has been the stalwart, powering countless deployment pipelines and standing as the go-to solution for many DevOps professionals. But as the tech landscape shifts towards cloud-native solutions, AWS CodePipeline emerges as a formidable contender. Offering deep integration with the expansive AWS ecosystem and the agility of a cloud-based platform, CodePipeline is redefining the standards of modern deployment processes. This article dives into the transformative power of AWS CodePipeline, exploring its advantages over Jenkins and showing why many are switching to this cloud-native tool.

Brief Background About CodePipeline and Jenkins

At its core, AWS CodePipeline is Amazon Web Services' cloud-native continuous integration and continuous delivery service, allowing users to automate the build, test, and deployment phases of their release process. Tailored to the vast AWS ecosystem, CodePipeline leverages other AWS services, making it a seamless choice for teams already integrated with AWS cloud infrastructure. It promises scalability, maintenance ease, and enhanced security, characteristics inherent to many managed AWS services.

On the other side of the spectrum is Jenkins – an open-source automation server with a storied history. Known for its flexibility, Jenkins has garnered immense popularity thanks to its extensive plugin system. It's a tool that has grown with the CI/CD movement, evolving from a humble continuous integration tool to a comprehensive automation platform that can handle everything from build to deployment and more. Together, these two tools represent two distinct eras and philosophies in the CI/CD domain. View the full article
Tagged with: jenkins, cloud-native (and 1 more)
-
Today, AWS CodePipeline announces support for retrying a pipeline execution from the first action in a stage that failed. This launch provides another remediation option for a failed pipeline execution in addition to the existing option of retrying a failed pipeline execution from the failed action(s). View the full article
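As an illustration (not part of the announcement), here is a minimal boto3 sketch of retrying a failed stage from its first action rather than only the failed actions. It assumes an SDK release whose RetryStageExecution call supports the ALL_ACTIONS retry mode; names and the execution ID are placeholders.

    import boto3

    codepipeline = boto3.client("codepipeline")

    codepipeline.retry_stage_execution(
        pipelineName="MyPipeline",
        stageName="Deploy",
        pipelineExecutionId="0ab1c2d3-example",  # the failed execution to retry
        retryMode="ALL_ACTIONS",                 # existing behavior: "FAILED_ACTIONS"
    )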
-
You can now use your GitLab.com source repository to build, test, and deploy code changes using AWS CodePipeline. Connect your GitLab.com account using AWS CodeStar Connections, and use the connection in your pipeline to automatically start a pipeline execution on changes in your repository. View the full article
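As an illustration (not part of the announcement), here is a minimal boto3 sketch of creating a GitLab.com connection and the shape of the source action configuration that would reference it in a pipeline. The connection and repository names are placeholders, and the connection still has to be authorized (pending to available) in the console before first use.

    import boto3

    connections = boto3.client("codestar-connections")

    connection = connections.create_connection(
        ProviderType="GitLab",
        ConnectionName="my-gitlab-connection",
    )

    # Configuration for a CodeStarSourceConnection source action using the new connection.
    source_action_configuration = {
        "ConnectionArn": connection["ConnectionArn"],
        "FullRepositoryId": "my-group/my-project",  # GitLab namespace/project (placeholder)
        "BranchName": "main",
    }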
-