Search the Community

Showing results for tags 'bitbucket'.

Found 6 results

  1. Atlassian, a leading provider of collaboration and productivity software, has recently rolled out a series of patches aimed at fortifying the security of its popular products. The fixes address vulnerabilities across several platforms, including Bamboo, Bitbucket, Confluence, and Jira. Let’s delve into the details of these fixes and understand their significance in protecting […] The post Atlassian Flaws Fixes: Critical Bamboo Patch Mitigates Risk appeared first on TuxCare and was republished on Security Boulevard. View the full article
  2. AWS Glue now supports GitLab and Bitbucket, alongside GitHub and AWS CodeCommit, broadening your toolset for managing data integration pipeline deployments. AWS Glue is a serverless data integration service that makes it simpler to discover, prepare, move, and integrate data from multiple sources for analytics, machine learning (ML), and application development. View the full article
  3. Atlassian today outlined its strategy to increase the appeal of the Bitbucket continuous integration/continuous delivery (CI/CD) cloud service among enterprise IT organizations. Atlassian said it plans to focus on four core pillars: performance, management, extensibility, and security and compliance, starting with an update scheduled for the third quarter. Dan Tao, head of engineering for Bitbucket […] The post Atlassian Bitbucket CI/CD Cloud Service Aims for the Enterprise appeared first on DevOps.com. View the full article
  4. This post was contributed by James Bland, Sr. Partner Solutions Architect, AWS; Jay Yeras, Head of Cloud and Cloud Native Solution Architecture, Snyk; and Venkat Subramanian, Group Product Manager, Bitbucket.

One of our goals at Atlassian is to make the software delivery and development process easier. This post explains how you can set up a software delivery pipeline using Bitbucket Pipelines and Snyk, a tool that finds and fixes vulnerabilities in open-source dependencies and container images, to deploy secured applications on Amazon Elastic Kubernetes Service (Amazon EKS). By presenting important development information directly on pull requests inside the product, you can proactively diagnose potential issues, shorten test cycles, and improve code quality.

Atlassian Bitbucket Cloud is a Git-based code hosting and collaboration tool built for professional teams. Bitbucket Pipelines is an integrated CI/CD service that allows you to automatically build, test, and deploy your code. With its best-in-class integrations with Jira, Bitbucket Pipelines allows different personas in an organization to collaborate and get visibility into deployments. Bitbucket Pipes are small chunks of code that you can drop into your pipeline to make it easier to build powerful, automated CI/CD workflows.

In this post, we go over the following topics:

  • The importance of security as practices shift left in DevOps
  • How embedding security into pull requests helps developer workflows
  • Deploying an application on Amazon EKS using Bitbucket Pipelines and Snyk

Shift-left on security

Security is usually an afterthought. Developers tend to focus on delivering software first and addressing security issues later, when IT Security, Ops, or InfoSec teams discover them. However, research from the 2016 State of DevOps Report shows that you can achieve better outcomes by testing for security earlier in the process, within a developer’s workflow. This concept is referred to as shift-left, where left indicates earlier in the process, as illustrated in the following diagram.

There are two main challenges in shifting security left to developers:

  • Developers aren’t security experts – They develop software in the most efficient way they know how, which can mean importing libraries to take care of lower-level details. Sometimes those libraries import other libraries of their own, and so on, which makes it almost impossible for a developer who is not a security expert to keep track of security.
  • It’s time-consuming – There is no automation. Developers have to run tests to understand what’s happening and then figure out how to fix it. This slows them down and takes them away from their core job: building software.

Enabling security in a developer’s workflow

Code Insights is a new feature in Bitbucket that provides contextual information as part of the pull request interface. It surfaces information relevant to a pull request so issues related to code quality or security vulnerabilities can be viewed and acted upon during the code review process. The following screenshot shows Code Insights on the pull request sidebar.

In the security space, we’ve partnered with Snyk, McAfee, Synopsys, and Anchore. When you use any of these integrations in your Bitbucket Pipeline, security vulnerabilities are automatically surfaced within your pull request, prompting developers to address them.
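These integrations plug in as Bitbucket Pipes: a pipe is added as a single entry in a step’s script, pinned to a version and configured through variables. The following generic sketch is ours, with an invented pipe name and variable, just to show the shape:

    # Generic shape of a pipe invocation in bitbucket-pipelines.yml;
    # the pipe identifier and variable below are placeholders, not a real pipe.
    - pipe: vendor/example-pipe:1.0.0
      variables:
        EXAMPLE_VARIABLE: "value"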
By bringing the vulnerability information into the pull request interface before the actual deployment, it’s much easier for code reviewers to assess the impact of the vulnerability and provide actionable feedback. When security issues are fixed as part of a developer’s workflow instead of post-deployment, there are fewer sev1 incidents, which saves developer time and IT resources down the line and leads to a better user experience for your customers.

Securing your Atlassian Workflow with Snyk

To demonstrate how you can easily introduce a few steps to your workflow that improve your security posture, we take advantage of the new Snyk integration to Atlassian’s Code Insights and other Snyk integrations to Bitbucket Cloud, Amazon Elastic Container Registry (Amazon ECR; for more information, see Container security with Amazon Elastic Container Registry (ECR): integrate and test), and Amazon EKS (for more information, see Kubernetes workload and image scanning). We reference sample code in a publicly available Bitbucket repository, where you can find resources such as a multi-stage build Dockerfile for a sample Java web application, a sample bitbucket-pipelines.yml configured to perform Snyk scans and push container images to Amazon ECR, and a reference Kubernetes manifest to deploy your application.

Prerequisites

You first need to have a few resources provisioned, such as an Amazon ECR repository and an Amazon EKS cluster. You can quickly create these using the AWS Command Line Interface (AWS CLI) by invoking the create-repository command and following the Getting started with eksctl guide. Next, make sure that you have enabled the new code review experience in your Bitbucket account.

To take a closer look at the bitbucket-pipelines.yml file, see the following code:

    script:
      - IMAGE_NAME="petstore"
      - docker build -t $IMAGE_NAME .
      - pipe: snyk/snyk-scan:0.4.3
        variables:
          SNYK_TOKEN: $SNYK_TOKEN
          LANGUAGE: "docker"
          IMAGE_NAME: $IMAGE_NAME
          TARGET_FILE: "Dockerfile"
          CODE_INSIGHTS_RESULTS: "true"
          SEVERITY_THRESHOLD: "high"
          DONT_BREAK_BUILD: "true"
      - pipe: atlassian/aws-ecr-push-image:1.1.2
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: "us-west-2"
          IMAGE_NAME: $IMAGE_NAME

In the preceding code, we invoke two Bitbucket Pipes to easily configure our pipeline and complete two critical tasks in just a few lines: scan our container image and push to our private registry. This saves time and allows for reusability across repositories, while an extensive catalog of integrations opens up innovative ways to automate our pipelines.
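The script section above is only a fragment of the full file. For orientation, here is a sketch of how it might sit inside a complete bitbucket-pipelines.yml; the build image, step name, and Docker service wiring are our assumptions rather than contents of the sample repository:

    # Sketch of the surrounding file structure; details are assumed.
    image: atlassian/default-image:2           # assumed build image

    pipelines:
      default:
        - step:
            name: Scan image and push to ECR   # assumed step name
            services:
              - docker                         # Docker daemon needed for 'docker build'
            script:
              - IMAGE_NAME="petstore"
              - docker build -t $IMAGE_NAME .
              # ...Snyk scan and ECR push pipes as listed above...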
Snyk pipe for Bitbucket Pipelines

In the following use case, we build a container image from the Dockerfile included in the Bitbucket repository and scan the image using the Snyk pipe. We also invoke the aws-ecr-push-image pipe to securely store our image in a private registry on Amazon ECR. When the pipeline runs, we see results as shown in the following screenshot. If we choose the available report, we can view the detailed results of our Snyk scan. In the following screenshot, we see detailed insights into the content of that report: three high, one medium, and five low-severity vulnerabilities were found in our container image.

Snyk scans of Bitbucket and Amazon ECR repositories

Because we use Snyk’s integration to Amazon ECR and Snyk’s Bitbucket Cloud integration to scan and monitor repositories, we can dive deeper into these results by linking the Dockerfile stored in our Bitbucket repository to the results of our last container image scan. By doing so, we can view recommendations for upgrading our base image, as in the following screenshot, and move past informational insights to actionable recommendations.

In the preceding screenshot, our current image, jboss/wildfly:11.0.0.Final, contains 76 vulnerabilities. We also see two recommendations: a major upgrade to jboss/wildfly:18.0.1.Final, which brings our total vulnerabilities down to 65, and an alternative upgrade, which is less desirable. We can investigate further by drilling down into the report to view additional context on how a potential vulnerability was introduced, and we can also create an issue in Atlassian Jira Software Cloud. The following screenshot shows a detailed report on the Issues tab. We can also explore the Dependencies tab for a list of all the direct dependencies, the transitive dependencies, and the vulnerabilities those may contain, as in the following screenshot.

Snyk scan of the Amazon EKS configuration

The final step in securing our workflow involves integrating Snyk with Kubernetes and deploying to Amazon EKS with Bitbucket Pipelines. Sample Kubernetes manifest files and a bitbucket-pipelines.yml are available for you to use in the accompanying Bitbucket repository for this post. Our bitbucket-pipelines.yml contains the following step:

    script:
      - pipe: atlassian/aws-eks-kubectl-run:1.2.3
        variables:
          AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
          AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
          AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
          CLUSTER_NAME: "my-kube-cluster"
          KUBECTL_COMMAND: "apply"
          RESOURCE_PATH: "java-app.yaml"

In the preceding code, we call the aws-eks-kubectl-run pipe and pass in a few repository variables we previously defined (see the following screenshot). For more information about generating the necessary access keys in AWS Identity and Access Management (IAM) to make programmatic requests to the AWS API, see Creating an IAM User in Your AWS Account.

Now that we have provisioned the supporting infrastructure and invoked kubectl apply -f java-app.yaml to deploy our pods using our container images in Amazon ECR, we can monitor our project details and view some initial results. The following screenshot shows that our initial configuration isn’t secure. The reason is that we didn’t explicitly define a few parameters in our Kubernetes manifest under securityContext. For example, parameters such as readOnlyRootFilesystem, runAsNonRoot, allowPrivilegeEscalation, and capabilities either aren’t defined or are set incorrectly in our template, so they appear in our findings with the FAIL flag. Hovering over these on the Snyk console provides specific insights on how to fix them, for example:

  • Run as non-root – Whether any containers in the workload have securityContext.runAsNonRoot set to false or unset
  • Read-only root file system – Whether any containers in the workload have securityContext.readOnlyRootFilesystem set to false or unset
  • Drop capabilities – Whether all capabilities are dropped and CAP_SYS_ADMIN isn’t added

A sketch of the kind of manifest that trips all three findings follows.
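This pared-down Deployment is purely illustrative; the resource names, port, and image URI are placeholders of ours and are not taken from the sample repository:

    # Hypothetical manifest: no securityContext is declared, so the container
    # may run as root with a writable root filesystem, privilege escalation
    # allowed, and no Linux capabilities dropped.
    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: java-app                   # placeholder name
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: java-app
      template:
        metadata:
          labels:
            app: java-app
        spec:
          containers:
            - name: java-app
              image: 123456789012.dkr.ecr.us-west-2.amazonaws.com/petstore:latest  # placeholder URI
              ports:
                - containerPort: 8080  # placeholder port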
To save you the trouble of researching this, we provide another sample template, java-app-snyk.yaml, which you can apply against your running pods. The difference in this template is that we have included the following lines in the manifest, which address the three failed findings in our report:

    securityContext:
      allowPrivilegeEscalation: false
      readOnlyRootFilesystem: true
      runAsNonRoot: true
      capabilities:
        drop:
          - all
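This block belongs at the container level of the pod template. In context (reusing the placeholder name and image from the sketch above, since the actual java-app-snyk.yaml may differ), the relevant section might look like this:

    # Sketch: the securityContext block nested under a container entry.
    containers:
      - name: java-app
        image: 123456789012.dkr.ecr.us-west-2.amazonaws.com/petstore:latest  # placeholder URI
        securityContext:
          allowPrivilegeEscalation: false   # forbid setuid-style privilege gains
          readOnlyRootFilesystem: true      # mount the root filesystem read-only
          runAsNonRoot: true                # refuse to start the container as root
          capabilities:
            drop:
              - all                         # drop all Linux capabilities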
After a subsequent scan, we can validate that our changes propagated successfully and our Kubernetes configuration is secure (see the following screenshot).

Conclusion

This post demonstrated how to proactively secure your entire flow with Atlassian Bitbucket Cloud and Snyk. Seamless integrations with Bitbucket Cloud provide you with actionable insights at each step of your development process. Get started for free with Bitbucket and Snyk, and learn more about the Bitbucket-Snyk integration. “The content and opinions in this post are those of the third-party author and AWS is not responsible for the content or accuracy of this post.” View the full article

  5. AWS CodeStar Connections is a new feature that allows services like AWS CodePipeline to access third-party source code providers. For example, you can now seamlessly connect your Atlassian Bitbucket Cloud source repository to AWS CodePipeline and automate the build, test, and deploy phases of your release process each time a code change occurs. This new feature is available in the following Regions: US East (Ohio), US East (N. Virginia), US West (N. California), US West (Oregon), Asia Pacific (Mumbai), Asia Pacific (Seoul), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Canada (Central), EU (Frankfurt), EU (Ireland), EU (London), EU (Paris), and South America (São Paulo).

The practice of tracking and managing changes to code, or source control, is a foundational element of the development process, which makes source control management systems an essential tool for any developer. In this post, we focus on one specific Git code management product: Atlassian Bitbucket. You can get started for free with Bitbucket Cloud. Atlassian provides detailed documentation on getting started with Bitbucket Cloud, covering topics such as setting up a team, creating a repository, and working with branches. For more information, see Get started with Bitbucket Cloud.

Prerequisites

For this use case, you use a Bitbucket account, repository, and Amazon Simple Storage Service (Amazon S3) bucket that we have already created. To follow along, you should have:

  • A working knowledge of Git and how to fork or clone within your source provider
  • Familiarity with hosting a static website on Amazon S3

You also need a sample page. The following simple HTML, saved as index.html and added to your repo, is enough:

    <html>
      <head>
        Example Header
      </head>
      <body>
        Example Body Text
      </body>
    </html>

Solution overview

For this use case, you deploy a Hugo website from your Bitbucket Cloud repository to your S3 bucket using CodePipeline, connecting your Bitbucket Cloud account to your AWS account so you can deploy code natively. The walkthrough contains the following steps:

  1. Set up CodeStar connections.
  2. Add a deployment stage.
  3. Use CI/CD to update your website.

Setting up CodeStar connections

When connecting CodePipeline to Bitbucket Cloud, it helps if you have already signed in to Bitbucket. After you sign in to Bitbucket Cloud, you perform the rest of the connection steps on the AWS Management Console.

  1. On the console, search for and choose CodePipeline.
  2. Choose Pipelines.
  3. Choose Create pipeline.
  4. For Pipeline name, enter a name.
  5. For Service role, select New service role.
  6. For Role name, enter a name for the service role.
  7. Choose Next.
  8. For Source provider, choose Bitbucket Cloud.
  9. For Connection, choose Connect to Bitbucket Cloud.
  10. For Connection name, enter a name.
  11. For Bitbucket Cloud apps, choose Install a new app. If this isn’t your first time making a connection, you can choose an existing connection.
  12. Choose Connect.
  13. Confirm you’re logged in as the correct user and choose Grant access.
  14. Choose Connect.
  15. For Repository name, choose your repository.
  16. For Branch name, choose your branch.
  17. For Output artifact format, select CodePipeline default.
  18. Choose Next.
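That completes the source stage. If you prefer to define pipelines as code, this stage corresponds to a CodeStarSourceConnection action; a rough CloudFormation-style sketch follows, in which the connection ARN, repository ID, branch, and names are placeholder values of ours:

    # Sketch of an equivalent source stage in an AWS::CodePipeline::Pipeline
    # resource; every identifier below is a placeholder assumption.
    - Name: Source
      Actions:
        - Name: BitbucketSource
          ActionTypeId:
            Category: Source
            Owner: AWS
            Provider: CodeStarSourceConnection
            Version: "1"
          Configuration:
            ConnectionArn: arn:aws:codestar-connections:us-east-1:111111111111:connection/example-id
            FullRepositoryId: my-workspace/my-repo   # Bitbucket workspace/repository
            BranchName: master                       # branch to track
          OutputArtifacts:
            - Name: SourceOutput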
Adding a deployment stage

Now that you have created a source stage, you can add a deployment stage.

  1. On the Add build stage page, choose Skip build stage. For this use case you skip the build stage, but if you need to build your own code, choose your build provider from the drop-down menu. You are prompted to confirm that you want to skip the build stage.
  2. Choose Skip.
  3. For Deploy provider, choose Amazon S3. If you have a different destination type or are hosting on traditional compute, you can choose other providers.
  4. For Region, choose the Region your S3 bucket is in.
  5. For Bucket, choose the bucket you are deploying to. Optionally, you can also choose a deploy path if you need to deploy to a sub-folder.
  6. Select Extract file before deploy.
  7. Choose Next.
  8. Review your configuration and choose Create pipeline.

If the settings are correct, you see a green success banner and the initial deployment of your pipeline runs successfully; the following screenshot shows our first deployment. Now that the pipeline shows the deployment was successful, you can check the S3 bucket to make sure the site is being hosted. You should see your static webpage, as in the following screenshot.

Using CI/CD to update your website

Now that you have created your pipeline, you can edit your website in your IDE, push the changes, and validate that those changes are automatically deployed to the website. For this step, I have already cloned my repository and have it open in my IDE.

  1. Open your code in your preferred IDE.
  2. Make the change to your code and push it to Bitbucket. The following screenshot shows that we updated the message that viewers see on our website and pushed our code.
  3. Look at the pipeline and make sure your code is being processed. The following screenshot shows that the stages were successful and the pipeline processed the correct commit.

After your pipeline is successful, you can check the end result. The following screenshot shows our static webpage.

Clean up

If you created any resources during this walkthrough that you do not plan on keeping, make sure you clean them up to avoid incurring costs associated with the services.

Summary

Letting your developers use their repository of choice can be important in your transition to the cloud. CodeStar connections makes it easy to set up Bitbucket Cloud as a source provider in the AWS Code Suite. Get started building your CI/CD pipeline using Bitbucket Cloud and the AWS Code Suite. View the full article