Showing results for tags 'labs'.
Found 2 results

  1. Whether you are an experienced IT professional or new to the industry, cloud computing skills are no longer just nice to have. They are mandatory, highly sought-after credentials that companies look for when hiring tech professionals, and the best way to kick off is a certification program. Amazon Web Services (AWS) certifications are a go-to upskilling resource for cloud computing practitioners, and AWS courses keep you up to date in the cloud domain. Passing them, however, requires in-depth knowledge of the AWS cloud platform and hands-on familiarity with the nitty-gritty of its services. AWS hands-on labs are the best way to get started. This blog runs through top hands-on labs for AWS and their features. But before that, let's glance at what AWS certifications are.

A sneak peek into AWS certifications

Amazon Web Services (AWS) certifications are a series of qualifications provided by AWS to verify and showcase your expertise in various aspects of AWS cloud computing. These certifications are held in high regard in the IT industry and are valuable for individuals and organizations alike, as they demonstrate the skills required to design, deploy, and manage applications and infrastructure on the AWS platform. Here are some of the key AWS certifications:

- AWS Certified Cloud Practitioner: This entry-level certification is designed for individuals seeking a foundational understanding of AWS and its cloud computing services. It is suitable for both technical and non-technical professionals.
- AWS Certified Solutions Architect – Associate: This certification is for individuals who design distributed systems on the AWS platform, focusing on best practices for building secure and scalable applications.
- AWS Certified Developer – Associate: Geared towards developers working with AWS services, this certification validates the ability to write, deploy, and debug code for serverless applications.
- AWS Certified SysOps Administrator – Associate: This certification concentrates on system administration tasks, highlighting the operational side of AWS, including managing and monitoring AWS resources.

Popular hands-on labs for AWS

To help you kickstart your AWS certification journey, here are a few popular hands-on labs.

Creating NAT Gateways to give private-subnet instances internet access, using Terraform

In this lab, you will be guided through setting up a NAT Gateway and enabling internet access for an instance in a private subnet using Terraform.

Task details:

- Log in to the AWS Management Console.
- Generate a key pair.
- Configure Visual Studio Code.
- Create a variable file.
- Define a VPC in the main.tf file.
- Define public and private subnets in the main.tf file.
- Create an Internet Gateway in the main.tf file.
- Create a public route table and configure it in the main.tf file.
- Launch public and private instances in the main.tf file.
- Create an output file.
- Verify the Terraform installation by checking the version.
- Apply the Terraform configuration.
- Review the AWS Console for the deployed resources.
- SSH into the public and private EC2 instances and test internet connectivity.
- Create a NAT Gateway in the main.tf file.
- Update the route table to route through the NAT Gateway in the main.tf file.
- Validate the internet connection from an instance inside the private subnet.
- Complete the lab validation.
- Remove the AWS resources when no longer needed.

Building a CRUD Todo app with Flutter and the Amplify DataStore to save and restore data in cloud databases

This project provides a comprehensive walkthrough for installing and configuring the Amplify Command Line Interface (CLI). After the initial setup, you will create a new Flutter project and define the data model your application will use.
Subsequently, you will integrate the Amplify DataStore with your application, gaining proficiency in using the generated data model to create, update, query, and delete Todo items. As you continue to develop the app, you will also create the backend with Amplify and keep it synchronized with the cloud, ending up with a robust, fully functional application.

Task details:

- Initiate the lab environment.
- Install and set up the Amplify CLI.
- Build a fresh Flutter application and incorporate Amplify.
- Define the data model for your application.
- Generate local data models.
- Integrate the Amplify DataStore into your application.
- Configure Amplify for your project.
- Create a new Todo item.
- Query Todos and monitor real-time updates.
- Manage updates and deletions of Todo items.
- Deploy the Amplify sandbox to the backend.
- Implement authentication in your application.
- Verify real-time cloud sync.
- Clean up resources.

Beginning with Docker – installation and setup

This lab walks you through understanding and setting up Docker on an EC2 instance. You will gain hands-on experience using Amazon Machine Images (AMIs) to launch Amazon EC2 instances and configure the Docker containerization platform on top of them.

Task details:

- Log in to the AWS Management Console.
- Create an EC2 instance with your preferred specifications.
- Securely access the EC2 instance via SSH using the provided key pair.
- Install Docker for the Linux platform.
- Start the Docker service and ensure it is enabled.
- Deploy a sample container and confirm its successful launch.

Deploying highly available feedback web apps combining server and serverless modalities

This project is a step-by-step guide to setting up a resilient static web application that lets users submit messages with attached images. The submissions are stored in a DynamoDB table, while the associated images are saved in an S3 bucket. The infrastructure includes an Application Load Balancer and an Auto Scaling group of Elastic Compute Cloud (EC2) instances, all operating within a custom VPC, and the website is deployed through CodePipeline. Users can input data and upload images; the images land in the S3 bucket while the accompanying data is recorded in the DynamoDB table.

Task details:

- Create a DynamoDB table.
- Create an S3 bucket for image storage and configure the bucket policy.
- Create a Lambda function.
- Develop a REST API.
- Define a resource and method for the API.
- Test the API.
- Enable CORS support and deploy the API.
- Create a CodeCommit repository.
- Configure Git credentials for secure HTTPS access.
- Set up IAM user permissions to access CodeCommit.
- Generate and download HTTPS Git credentials for AWS CodeCommit.
- Create an environment in CloudShell.
- Connect to CodeCommit and clone the repository.
- Add the web application code to your CodeCommit repository.
- Build a custom VPC.
- Create public and private subnets across two Availability Zones.
- Create an Internet Gateway and attach it to the VPC.
- Create and configure route tables.
- Create a NAT Gateway.
- Set up a bastion host.
- Create an Application Load Balancer.
- Create a security group for the EC2 instances.
- Define IAM roles for the launch template and CodeDeploy.
- Construct a launch template.
- Create an Auto Scaling group.
- Confirm the SNS subscription.
- Create an application in CodeDeploy.
- Create a deployment group for the CodeDeploy application.
- Create a CodePipeline.
- Test the web page.
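The feedback-handling Lambda at the heart of this project can be sketched roughly as follows. This is a minimal illustration, not the lab's actual code: the event fields and the injected `table` parameter are hypothetical, and in a real deployment `table` would be a boto3 DynamoDB Table resource created at module load time.

```python
import json
import uuid


def handler(event, context, table):
    """Store one feedback submission in a DynamoDB table.

    `table` is injected here for testability; in a real Lambda it
    would be boto3.resource("dynamodb").Table(...) created once,
    outside the handler.
    """
    body = json.loads(event["body"])
    item = {
        "id": str(uuid.uuid4()),
        "message": body["message"],
        # The image itself goes to the S3 bucket; only its key is
        # recorded alongside the message here.
        "image_key": body.get("image_key", ""),
    }
    table.put_item(Item=item)
    return {"statusCode": 200, "body": json.dumps({"id": item["id"]})}
```

In the lab, this function sits behind the REST API (with CORS enabled) while CodePipeline deploys the static front end that calls it.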
Automating EBS snapshots with CloudWatch Events and SNS notifications, using Terraform

In this lab, you will be guided through automating the generation of EBS snapshots using Terraform, CloudWatch, and SNS.

Task details:

- Log in to the AWS Management Console.
- Configure Visual Studio Code.
- Create a variable file.
- Define an EC2 instance in the main.tf file.
- Define an SNS topic in the main.tf file.
- Create a Lambda function in the main.tf file.
- Set up a CloudWatch Event rule in the main.tf file.
- Add an SNS destination to the Lambda function in the main.tf file.
- Attach a CloudWatch Event target to the Lambda function in the main.tf file.
- Create an output file.
- Verify the Terraform installation by checking the version.
- Apply the Terraform configuration.
- Examine the resources in the AWS Console.
- Validate the lab's functionality.
- Remove the AWS resources when they are no longer needed.

Also read: Top Hands-On Labs To Prepare For AWS Certified Cloud Practitioner Certification

Launching EC2 instances from an AWS Lambda function, using Terraform

This lab guides you through launching an EC2 instance from AWS Lambda. You will create a sample Lambda function with Terraform; once triggered, the function provisions an EC2 instance.

Task details:

- Log in to the AWS Management Console.
- Configure Visual Studio Code.
- Create a variable file.
- Define a Lambda function in the main.tf file.
- Create an output file.
- Verify the Terraform installation by checking the version.
- Apply the Terraform configuration.
- Review the resources in the AWS Console.
- Test the Lambda function.
- Verify the creation of the EC2 instance.
- Remove the AWS resources when no longer needed.

Creating Docker images with a Dockerfile

In this in-depth lab, you will work through understanding and setting up Docker on an EC2 instance, gaining hands-on experience using Amazon Machine Images (AMIs) to launch EC2 instances and configure the Docker containerization platform on top of them.

Task details:

- Log in to the AWS Management Console.
- Create an EC2 instance with your preferred specifications.
- Securely access the EC2 instance via SSH using the provided key pair.
- Install Docker for the Linux platform.
- Start the Docker service and ensure it is enabled.

Implementing static feedback web pages on a 100% serverless architecture

This project is a detailed guide to configuring a static web application that lets users submit messages with attached images. The submissions are collected in a DynamoDB table, while the corresponding images are stored in an S3 bucket. This project's infrastructure uses exclusively serverless services to deploy the web page. Users can input data and upload images, knowing the data is recorded in the DynamoDB table and the images are securely stored in the S3 bucket.

Task details:

- Initiate the lab environment.
- Create a DynamoDB table.
- Create an image-storage S3 bucket and apply a bucket policy.
- Create an S3 bucket to host your web page content.
- Host the static web page.
- Develop a Lambda function.
- Create a REST API.
- Define a resource and method for the API.
- Test the API.
- Enable CORS support and deploy the API.
- Update the index.html page to use the API Gateway endpoint.
- Test the web page functionality.
- Clean up resources.

Dive deep into Docker Compose

This extensive lab leads you through understanding and configuring Docker on an EC2 instance, using Amazon Machine Images (AMIs) to launch EC2 instances and set up the Docker containerization platform on top of them.

Task details:

- Sign in to the AWS Management Console.
- Create an EC2 instance with your preferred specifications.
- Securely access the EC2 instance via SSH using the provided key pair.
- Install Docker for the Linux platform.
- Start the Docker service and ensure it is enabled.
- Launch a sample container and confirm its functionality.

Securing API Gateway using Amazon Cognito user pools

This lab walks you through securing your API Gateway with Amazon Cognito user pools, ensuring that only authenticated and authorized users can access your APIs and providing a secure solution for user authentication and authorization in your applications.

Task details:

- Log in to the AWS Management Console.
- Create a Cognito user pool.
- Create a user account.
- Create a Lambda function.
- Create an API Gateway endpoint.
- Create a resource.
- Define a method.
- Deploy the API.
- Set up a Cognito authorizer.
- Verify the authorization token.
- Configure authorization to limit API access.
- Test the API.
- Validate the lab's completion.
- Remove the AWS resources when they are no longer needed.

Summary

We hope this blog helps you understand AWS certifications and discover some of the popular hands-on labs for AWS. These are just a few: you will find more than a hundred hands-on labs online, catering to AWS certifications from beginner to advanced level. However, grabbing any and every hands-on lab is not the logical way. Make sure the lab you are about to use is up to date, offers the latest cloud features to work with, is easy to use, and guides you at the right moments. Whizlabs has you covered: our minutely detailed AWS hands-on labs are curated by industry experts with years of experience and combine instructive training with hands-on techniques to help you gain the confidence and competence to begin your AWS journey.
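Several of the labs above share one pattern: a Lambda function, wired up by Terraform, that provisions a resource when triggered. A rough Python sketch of the EC2-launching function follows; the AMI ID, instance type, and injected `ec2` client are all hypothetical, and the real lab builds the function via Terraform rather than by hand.

```python
def launch_instance(ec2, ami_id="ami-12345678", instance_type="t2.micro"):
    """Ask EC2 for one instance and return its ID.

    `ec2` is injected so the logic can be exercised without AWS;
    in a real Lambda it would be boto3.client("ec2").
    """
    resp = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
    )
    return resp["Instances"][0]["InstanceId"]


def handler(event, context, ec2):
    # Each trigger of the function provisions a single instance.
    instance_id = launch_instance(ec2)
    return {"statusCode": 200, "body": instance_id}
```

The lab's validation step then checks the AWS Console for the instance this handler created.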
  2. AWS Secrets Manager is a centralized, user-friendly solution for managing access to all your secrets within the AWS cloud environment. It simplifies rotating, maintaining, and recovering items such as database credentials and API keys throughout their lifecycle. A solid grasp of AWS Secrets Manager is a valuable asset on the path to becoming an AWS Certified Developer. In this blog, you will see how to retrieve the secrets held in AWS Secrets Manager with the help of AWS Lambda, in a virtual lab setting. Let's dive in!

What is Secrets Manager in AWS?

AWS Secrets Manager is a tool that helps safeguard the confidential information required to access your applications, services, and IT assets. The service makes it simple to rotate, manage, and retrieve things like database credentials and API keys securely. As an example of AWS Secrets Manager in practice, users and applications retrieve secrets through specific APIs, eliminating the need to store sensitive data in plain text within the code. This enhances security and simplifies the management of secret information.

AWS Secrets Manager pricing

AWS Secrets Manager operates on a pay-as-you-go basis: your costs are determined by the number of secrets you store and the API calls you make. Pricing is transparent, with no hidden fees or long-term commitments. There is also a 30-day free tier, which begins when you store your first secret, letting you explore AWS Secrets Manager without charge. Once the free trial ends, you are billed $0.40 per secret per month and $0.05 per 10,000 API calls.

AWS Secrets Manager vs. Parameter Store

What are AWS Lambda functions?

AWS Lambda is a service for building applications that eliminates the need to manually provision or manage servers.
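The pricing model above is easy to sanity-check. A quick calculation using the rates quoted in this post ($0.40 per secret per month, $0.05 per 10,000 API calls; confirm current rates on the AWS pricing page):

```python
def monthly_cost(num_secrets, api_calls,
                 secret_rate=0.40, call_rate_per_10k=0.05):
    """Estimate a monthly Secrets Manager bill after the 30-day free tier."""
    return num_secrets * secret_rate + (api_calls / 10_000) * call_rate_per_10k


# For example, 10 secrets and 2 million API calls in a month:
print(monthly_cost(10, 2_000_000))  # 10 x $0.40 + 200 x $0.05 = 14.0
```

At low volumes the per-secret charge dominates; the API-call charge only matters once functions retrieve secrets on every invocation, which is one reason the caching patterns discussed later in this post exist.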
AWS Lambda functions frequently require access to sensitive information such as certificates, API keys, or database passwords. It is crucial to keep these secrets separate from the function code so they are never exposed in your application's source. Using an external secrets manager improves security and avoids accidental exposure, and secrets managers add benefits such as access control, auditing, and managed secret rotation. Do not store secrets in Lambda environment variables: they can be seen by anyone with permission to view the function's configuration.

Architecture for retrieving secrets in AWS Secrets Manager with AWS Lambda

When Lambda invokes your function for the first time, it creates a runtime environment. First it runs the function's initialization code, which is everything outside the main handler. Then Lambda executes the handler code, which receives the event payload and processes your application's logic. For subsequent invocations, Lambda can reuse the same runtime environment.

To access secrets, you have a couple of options. One is to retrieve the secret on every invocation, from within your handler code. This guarantees you always have the most up-to-date secret, but it can lead to longer execution times and higher costs, because you call Secrets Manager every time, and those retrievals may themselves incur charges. The other approach is to retrieve the secret during the function's initialization: you fetch it once when the runtime environment is set up and reuse it across subsequent invocations, improving cost efficiency and performance. The Serverless Land pattern example demonstrates retrieving a secret during the initialization phase using Node.js and top-level await.
If the secret might change between invocations, make sure your handler can verify the secret's validity and, if necessary, retrieve the updated value. Another way to optimize this process is to use Lambda extensions. An extension can fetch secrets from Secrets Manager, cache them, and automatically refresh the cache on a configurable interval. The extension retrieves the secret from Secrets Manager before the initialization process and exposes it via a local HTTP endpoint; your function then reads the secret from this local endpoint, which is faster than calling Secrets Manager directly. You can also share the extension among multiple functions, reducing code duplication. Because the extension refreshes the cache at the right intervals, your function always has access to the most recent secret, which improves reliability.

Guidelines to retrieve secrets stored in AWS Secrets Manager with AWS Lambda

To retrieve the secrets kept in AWS Secrets Manager with the help of AWS Lambda, follow these guided instructions. First, access the Whizlabs labs library. Click "Guided Labs" on the left side of the labs homepage and enter the lab name in the search tab to find this guided lab. Clicking the lab shows the lab overview section; after reviewing the lab instructions, start the lab by selecting the "Start Lab" option on the right side of the screen. The tasks involved in this guided lab are as follows:

Task 1: Sign in to the AWS Management Console

Access the AWS Management Console and set the region to N. Virginia. Ensure that you do not edit or remove the 12-digit Account ID in the AWS Console. Copy your username and password from the Lab Console, paste them into the IAM Username and Password fields in the AWS Console, and click "Sign in".

Task 2: Create a Lambda function

Navigate to the Lambda service. Create a new Lambda function named "WhizFunction" with the runtime set to Python 3.8. Configure the function's execution role to use the existing role named "Lambda_Secret_Access", and adjust the function's timeout to 2 minutes.

Task 3: Write a Lambda function with hard-coded access keys

Develop a Lambda function that creates a DynamoDB table and inserts items, with the access keys hard-coded. Download the code provided in the lab document and replace the existing code in "WhizFunction" with "Code1" from the downloaded zip file, changing the AWS Access Key and AWS Secret Access Key as instructed in the lab document. Deploy the code and configure a test event named "WhizEvent". Click the save button, then the test button, to execute the code; the DynamoDB table is created with some data fields.

Task 4: View the DynamoDB table in the console

Open the DynamoDB service by searching for it in the top left corner. In the "Tables" section you will find a table named "Whizlabs_stud_table1". View the items within the table by selecting it and clicking "Explore table items".

Task 5: Write Lambda code to return table data

Modify "WhizFunction" so it retrieves data from the DynamoDB table. Replace the existing code with "Code2" from the lab document, making the same AWS Access Key and AWS Secret Access Key changes. Deploy the code and run a test so the Lambda function returns the table's data.

Task 6: Create a secret to store the access keys

Open AWS Secrets Manager and make sure you are in the N. Virginia region. Create a new secret of type "Other type of secret". Enter the Access Key and Secret Access Key as key/value pairs and choose the default encryption key. Name the secret "whizsecret" and proceed with the default settings. Review and store the secret, and copy the secret ARN for later use.

Task 7: Write a Lambda function that creates DynamoDB items using Secrets Manager

Modify the Lambda function to create a new DynamoDB table and insert items by retrieving the access keys from Secrets Manager. Replace the code with "Code3" from the lab document, updating the secret ARN. Deploy the code and run a test to create the DynamoDB table and items securely.

Task 8: View the DynamoDB table in the console

Open the DynamoDB service. In the "Tables" section you will find a table named "Whizlabs_stud_table2". To view the items, select the table and click "Explore table items".

Task 9: Write Lambda code to view table items using Secrets Manager

Modify the Lambda function to fetch table items securely, using the access and secret keys stored in Secrets Manager. Replace the code with "Code4" from the lab document, updating the secret ARN. Deploy the code and run a test to securely access and view the table items.

Task 10: Clean up AWS resources

Delete the Lambda function "WhizFunction" and both DynamoDB tables. Delete the secret "whizsecret" from AWS Secrets Manager, scheduling its deletion with a 7-day waiting period. Finally, end the lab by signing out of the AWS Management Console.

Also read: Free AWS Developer Associate Exam Questions

FAQs

How much does the Parameter Store cost?

Parameter Store does not incur any extra cost. However, there is a maximum limit of 10,000 parameters that you can store.

What can be stored in AWS Secrets Manager?

AWS Secrets Manager is a versatile solution for storing and managing sensitive information. This includes, but is not limited to, database credentials, application credentials, OAuth tokens, API keys, and the various other secrets essential to your operations. Note that many AWS services integrate seamlessly with Secrets Manager to securely handle these confidential values throughout their lifecycle.

What is the length limit for an AWS Secrets Manager secret?

In the Secrets Manager console, data is stored as a JSON structure of key/value pairs that a Lambda rotation function can easily parse. A secret's value can range from 1 character to 65,536 characters. Also note that tag key names in Secrets Manager are case-sensitive.

What are the benefits of AWS Secrets Manager?

Secrets Manager provides a secure way to store and manage your credentials. It makes modifying or rotating credentials easy, without complex code or configuration changes. Instead of embedding credentials directly in your code or configuration files, you can store them safely in Secrets Manager.

What are the best practices for AWS Secrets Manager?

You can adhere to the following AWS Secrets Manager best practices to store secrets more safely:

- Make sure Secrets Manager encrypts data at rest using Key Management Service (KMS) Customer Master Keys (CMKs).
- Ensure automatic rotation is turned on for your Secrets Manager secrets.
- Confirm that the rotation schedule is set up correctly.

Conclusion

We hope this blog equips you with the knowledge and skills to manage secrets effectively within AWS and protect your critical data. Following the tutorial steps above lets you securely access sensitive information stored in Secrets Manager with the help of AWS Lambda.
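To condense the tutorial's central move, going from Task 3 to Task 7 means replacing hard-coded keys with a lookup in Secrets Manager. A minimal sketch of that pattern: the `sm_client` is injected for illustration (in a real Lambda it would be `boto3.client("secretsmanager")`), and the `AccessKey`/`SecretAccessKey` field names are hypothetical stand-ins for the key/value pairs created in Task 6.

```python
import json


def get_credentials(sm_client, secret_id):
    """Fetch and parse a JSON secret instead of hard-coding keys.

    Mirrors boto3's secretsmanager API: get_secret_value returns a
    dict whose "SecretString" field holds the JSON key/value pairs.
    """
    resp = sm_client.get_secret_value(SecretId=secret_id)
    secret = json.loads(resp["SecretString"])
    return secret["AccessKey"], secret["SecretAccessKey"]
```

The lab's "Code3" and "Code4" do essentially this before talking to DynamoDB, so no credentials ever appear in the function's source.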
You can also opt for AWS Sandbox to play around with the AWS platform.