Showing results for tags 'amazon kinesis'.

Found 4 results

  1. Welcome to March’s post announcing new training and certification updates, helping equip you and your teams with the skills to work with AWS services and solutions. This month we launched eight new digital training products on AWS Skill Builder, including four new AWS Builder Labs and a free learning plan called Generative AI Developer Kit. We also have three new and one updated AWS Classroom Training courses, two of which have AWS Partner versions, including Developing Generative AI Applications on AWS. A reminder: registration is now open for the new AWS Certified Data Engineer – Associate exam. You can begin preparing with curated exam prep resources, created by the experts at AWS, on AWS Skill Builder. Missed our February course update? Check it out here.

New AWS Skill Builder subscription features

AWS Skill Builder subscriptions are available globally, including Mainland China as of this month. They unlock enhanced AWS Certification exam prep and hands-on AWS Cloud training, including 1,000+ interactive learning and lab experiences such as AWS Cloud Quest, AWS Industry Quest, AWS Builder Labs, and AWS Jam challenges. Select plans offer access to AWS Digital Classroom courses to dive deep with expert instruction. Try a 7-day free trial of an Individual subscription. *Terms and conditions apply.

AWS Builder Labs

  • Migrate On-Premises Servers to AWS Using Application Migration Service (MGN) (60 min.) is an intermediate-level lab in which you learn how to use AWS Application Migration Service to migrate an existing workload to AWS.

  • Migrate On-Premises Databases to AWS Using AWS Database Migration Service (DMS) (75 min.) is an intermediate-level lab in which you learn how to use AWS Database Migration Service to migrate an existing database to Amazon Aurora.

  • Data Modeling for Amazon Neptune (60 min.) is an intermediate-level lab in which you explore the process of modeling data with Amazon Neptune to meet prescribed use cases.

  • Analyzing CloudWatch Logs with Kinesis Data Streams and Kinesis Data Analytics (4 hr.) is an advanced-level, challenge-based lab in which you learn how to use Amazon CloudWatch to collect Amazon Elastic Compute Cloud (Amazon EC2) system logs and use Amazon Kinesis to analyze the collected data.

AWS Certification exam preparation and updates

Now available: AWS Certified Data Engineer – Associate. Registration is open, so you can showcase your knowledge and skills in core data-related AWS services, implementing data pipelines, and providing high-quality data for business insights. Gain confidence going into exam day with trusted exam prep on AWS Skill Builder, including an Official Pretest, available now in all exam languages.

Free digital courses on AWS Skill Builder

The following digital courses on AWS Skill Builder are free to all learners, joining 600+ free digital courses and learning plans.

Digital learning plan

Generative AI Developer Kit (includes labs) (16 hr. 30 min.) is a collection of curated courses, labs, and challenges to develop the skills needed to build generative AI applications. Software developers interested in leveraging large language models without fine-tuning will benefit from this collection.
You’ll receive an overview of generative AI, learn to plan a generative AI project, get started with Amazon CodeWhisperer and Amazon Bedrock, learn the foundations of prompt engineering, and discover the architecture patterns for building generative AI applications using Amazon Bedrock and LangChain.

Digital courses

  • Decarbonization with AWS Introduction (15 min.) is a fundamental-level course about the AWS Customer Carbon Footprint Tool and other resources you can use to advance your sustainability goals. You’ll learn how businesses use the AWS Customer Carbon Footprint Tool, how it helps you reduce your carbon footprint and achieve decarbonization goals with AWS, and considerations for using the tool to optimize usage and cost savings.

  • Amazon Redshift Introduction (15 min.) is a fundamental-level course that provides an introduction to Amazon Redshift, including its common uses and benefits.

  • AWS Mainframe Modernization – Using Replatform Tools with Amazon AppStream (60 min.) is an intermediate-level course teaching the setup and usage of Micro Focus tools from OpenText, such as Enterprise Analyzer and Enterprise Developer, with Amazon AppStream 2.0.

AWS Classroom Training

  • Designing and Implementing Storage on AWS is a three-day, intermediate-level course that teaches you to select, design, implement, and optimize secure storage solutions to save time and cost, improve performance and scale, and accelerate innovation. You’ll explore AWS storage services and solutions for storing, accessing, and protecting your data. An expert AWS instructor will help you understand where, how, and when to take advantage of different storage services, and how to evaluate the storage options that best meet your use case and business requirements.

  • Build Modern Applications with AWS NoSQL Databases is a one-day, intermediate-level course that helps you understand how to build applications involving complex data characteristics and millisecond performance requirements. You’ll learn to use purpose-built databases to build typical modern applications with diverse access patterns and real-time scaling needs. An AWS Partner version is also available.

  • Running Containers on Amazon Elastic Kubernetes Service (Amazon EKS) is an updated, three-day, intermediate-level course, led by an expert AWS instructor, that teaches container management and orchestration for Kubernetes using Amazon EKS. You’ll build an Amazon EKS cluster, configure the environment, deploy the cluster, and add applications to it. You’ll also learn how to manage container images using Amazon Elastic Container Registry (Amazon ECR) and automate application deployment.

  • Developing Generative AI Applications on AWS is a two-day, advanced-level course that teaches you the basics, benefits, and associated terminology of generative AI. An expert AWS instructor will guide you through planning a generative AI project and the foundations of prompt engineering to develop generative AI applications with AWS services. By the end of the course, you’ll have the skills needed to build applications that can generate and summarize text, answer questions, and interact with users through a chatbot interface. An AWS Partner version is also available.
  2. A multi-account architecture on AWS is essential for enhancing security, compliance, and resource management: it isolates workloads, enables granular cost allocation, and facilitates collaboration across distinct environments. It also mitigates risk, improves scalability, and allows for advanced networking configurations.

In a streaming architecture, you may have event producers, stream storage, and event consumers in a single account or spread across different accounts, depending on your business and IT requirements. For example, your company may want to centralize its clickstream data or log data from many different producers across different accounts. Data consumers from marketing, product engineering, or analytics require access to the same streaming data across accounts, which calls for a multi-account streaming architecture.

To build a multi-account streaming architecture, you can use Amazon Kinesis Data Streams as the stream storage and AWS Lambda as the event consumer. Kinesis Data Streams enables real-time processing of streaming data at scale. Integrated with Lambda, it allows for serverless data processing, so you can analyze and react to data streams in real time without managing infrastructure. This integration supports use cases such as real-time analytics, log processing, and Internet of Things (IoT) data ingestion, making it valuable for businesses that require timely insights from their streaming data.

In this post, we demonstrate how you can process data ingested into a stream in one account with a Lambda function in another account. The recent launch of Kinesis Data Streams support for resource-based policies enables invoking a Lambda function from another account. With a resource-based policy, you can specify AWS accounts, AWS Identity and Access Management (IAM) users, or IAM roles and the exact Kinesis Data Streams actions for which you want to grant access. After access is granted, you can configure a Lambda function in another account to start processing the data stream belonging to your account. This reduces cost and simplifies the data processing pipeline, because you no longer have to copy streaming data using Lambda functions in both accounts. Sharing access to your data streams or registered consumers does not incur additional charges to your account; cross-account usage of Kinesis Data Streams resources continues to be billed to the resource owners.
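To give a feel for what such a grant looks like outside of the CloudFormation templates used below, here is a minimal sketch of attaching a stream resource policy with the AWS CLI put-resource-policy command. The Region, account IDs, and role name are hypothetical placeholders for illustration:

    # Run as the stream owner (Account 1); all ARNs below are placeholders
    aws kinesis put-resource-policy \
      --resource-arn arn:aws:kinesis:us-east-1:111111111111:stream/kds-cross-account-stream \
      --policy '{
        "Version": "2012-10-17",
        "Statement": [{
          "Sid": "CrossAccountReadSketch",
          "Effect": "Allow",
          "Principal": {"AWS": "arn:aws:iam::222222222222:role/consumer-lambda-execution-role"},
          "Action": [
            "kinesis:DescribeStreamSummary",
            "kinesis:ListShards",
            "kinesis:DescribeStream",
            "kinesis:GetRecords",
            "kinesis:GetShardIterator"
          ],
          "Resource": "arn:aws:kinesis:us-east-1:111111111111:stream/kds-cross-account-stream"
        }]
      }'

In the walkthrough that follows, you don’t need to author this by hand: the CloudFormation stack outputs generate equivalent commands for you.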
In this post, we use the Kinesis Data Streams enhanced fan-out feature, which gives consumers dedicated read throughput tailored to their applications. By default, Kinesis Data Streams offers shared read throughput of 2 MB/sec per shard across consumers; with enhanced fan-out, each consumer gets dedicated throughput of 2 MB/sec per shard. This flexibility lets you adapt Kinesis Data Streams to your specific requirements, choosing between enhanced fan-out for dedicated throughput and shared throughput according to your needs.

Solution overview

For our solution, we deploy Kinesis Data Streams in Account 1 and Lambda as the consumer in Account 2 to receive data from the data stream. The following diagram illustrates the high-level architecture.

The setup requires the following key elements:

  • A Kinesis data stream in Account 1 and a Lambda function in Account 2
  • Kinesis Data Streams resource policies in Account 1, allowing a cross-account Lambda execution role to perform operations on the Kinesis data stream
  • A Lambda execution role in Account 2, and an enhanced fan-out consumer resource policy in Account 1 allowing the cross-account Lambda execution role to perform operations on the Kinesis data stream

For the setup, you use three AWS CloudFormation templates to create the key resources:

  • CloudFormation template 1 creates the following key resources in Account 1:
    • Kinesis data stream
    • Kinesis data stream enhanced fan-out consumer
  • CloudFormation template 2 creates the following key resources in Account 2:
    • Consumer Lambda function
    • Consumer Lambda function execution role
  • CloudFormation template 3 creates the following resource in Account 2:
    • Consumer Lambda function event source mapping

The solution supports single-Region deployment; the CloudFormation templates must be deployed in the same Region across the different AWS accounts. In this solution, we use Kinesis Data Streams enhanced fan-out, which is a best practice for architectures that require high throughput across multiple consumers. Complete the steps in the following sections to deploy this solution.

Prerequisites

You should have two AWS accounts and the required permissions to run a CloudFormation template to create the services mentioned in the solution architecture. You also need the AWS Command Line Interface (AWS CLI) installed, version 2.15 or above.

Launch CloudFormation template 1

Complete the following steps to launch the first CloudFormation template:

  1. Sign in to the AWS Management Console as Account 1 and select the appropriate AWS Region.
  2. Download and launch CloudFormation template 1 where you want to deploy your Kinesis data stream.
  3. For LambdaConsumerAccountId, enter your Lambda consumer account ID and choose Submit.

The CloudFormation template deployment takes a few minutes to complete. When the stack is complete, on the AWS CloudFormation console, navigate to the stack’s Outputs tab and copy the values of the following parameters:

  • KinesisStreamArn
  • KinesisStreamEFOConsumerArn
  • KMSKeyArn

You will need these values in later steps.
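If you prefer to script this step, you can read the same outputs with the AWS CLI instead of the console. A sketch, assuming a hypothetical stack name of kds-cross-account-template-1:

    # Stack name is a placeholder; use the name you chose when launching template 1
    aws cloudformation describe-stacks \
      --stack-name kds-cross-account-template-1 \
      --query "Stacks[0].Outputs" \
      --output table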
Launch CloudFormation template 2

Complete the following steps to launch the second CloudFormation template:

  1. Sign in to the console as Account 2 and select the appropriate Region.
  2. Download and launch CloudFormation template 2 where you want to host the Lambda consumer.
  3. Provide the following input parameters captured from the previous step:
     • KinesisStreamArn
     • KinesisStreamEFOConsumerArn
     • KMSKeyArn

The CloudFormation template creates the following key resources:

  • Lambda consumer
  • Lambda execution role

The Lambda function’s execution role is an IAM role that grants the function permission to access AWS services and resources. Here, you create a Lambda execution role that has the required Kinesis Data Streams and Lambda invocation permissions.

The CloudFormation template deployment takes a few minutes to complete. When the stack is complete, on the AWS CloudFormation console, navigate to the stack’s Outputs tab and copy the values of the following parameters:

  • KinesisStreamCreateResourcePolicyCommand
  • KinesisStreamEFOConsumerCreateResourcePolicyCommand

Run these two AWS CLI commands in Account 1 using AWS CloudShell. We recommend CloudShell because it runs the latest version of the AWS CLI, which helps avoid version-related failures.

  • KinesisStreamCreateResourcePolicyCommand creates the resource policy in Account 1 for the Kinesis data stream. The following is a sample resource policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "StreamEFOReadStatementID",
                "Effect": "Allow",
                "Principal": {
                    "AWS": [
                        "arn:aws:iam::<AWS Lambda - Consumer account id>:role/kds-cross-account-stream-consumer-lambda-execution-role"
                    ]
                },
                "Action": [
                    "kinesis:DescribeStreamSummary",
                    "kinesis:ListShards",
                    "kinesis:DescribeStream",
                    "kinesis:GetRecords",
                    "kinesis:GetShardIterator"
                ],
                "Resource": "arn:aws:kinesis:<region id>:<Account 1 - Amazon KDS account id>:stream/kds-cross-account-stream"
            }
        ]
    }

  • KinesisStreamEFOConsumerCreateResourcePolicyCommand creates the resource policy for the enhanced fan-out consumer of the Kinesis data stream in Account 1. The following is a sample resource policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ConsumerEFOReadStatementID",
                "Effect": "Allow",
                "Principal": {
                    "AWS": [
                        "arn:aws:iam::<AWS Lambda - Consumer account id>:role/kds-cross-account-stream-consumer-lambda-execution-role"
                    ]
                },
                "Action": [
                    "kinesis:DescribeStreamConsumer",
                    "kinesis:SubscribeToShard"
                ],
                "Resource": "arn:aws:kinesis:<region id>:<Account 1 - Amazon KDS account id>:stream/kds-cross-account-stream/consumer/kds-cross-account-stream-efo-consumer:1706616477"
            }
        ]
    }

You can also view this policy on the Kinesis Data Streams console, under Enhanced fan-out, by choosing the consumer name and then Consumer sharing resource-based policy.

Launch CloudFormation template 3

Now that you have created resource policies in Account 1 for the Kinesis data stream and its enhanced fan-out consumer, you can create the Lambda event source mapping for the consumer Lambda function in Account 2. Complete the following steps:

  1. Sign in to the console as Account 2 and select the appropriate Region.
  2. Download and launch CloudFormation template 3 to update the stack you created using CloudFormation template 2.

The CloudFormation template creates the Lambda event source mapping.
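For reference, an event source mapping that reads from an enhanced fan-out consumer points at the consumer ARN rather than the stream ARN. The following is a hedged sketch of the equivalent AWS CLI call that template 3 automates; the function name is inferred from the log group shown later, and the consumer ARN reuses the sample value above:

    # Run in Account 2; --event-source-arn is the EFO consumer ARN from Account 1
    aws lambda create-event-source-mapping \
      --function-name kds-cross-account-stream-efo-consumer \
      --event-source-arn arn:aws:kinesis:<region id>:<Account 1 - Amazon KDS account id>:stream/kds-cross-account-stream/consumer/kds-cross-account-stream-efo-consumer:1706616477 \
      --starting-position LATEST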
Validate the solution

At this point, the deployment is complete. A Kinesis data stream is available to consume the messages, and a Lambda function receives these messages in the destination account. To send sample messages to the data stream in Account 1, run the following AWS CLI command using CloudShell:

    aws kinesis put-record --stream-name kds-cross-account-stream --data sampledatarecord --partition-key samplepartitionkey3 --region <region id>

The Lambda function in Account 2 receives the messages, which you can verify using Amazon CloudWatch Logs:

  1. On the CloudWatch console, choose Log groups in the navigation pane.
  2. Locate the log group /aws/lambda/kds-cross-account-stream-efo-consumer.
  3. Choose Search log group to view the relevant log messages. The following is an example message:

    "Records": [
        {
            "kinesis": {
                "kinesisSchemaVersion": "1.0",
                "partitionKey": "samplepartitionkey3",
                "sequenceNumber": "49648798411111169765201534322676841348246990356337393698",
                "data": "sampledatarecord",
                "approximateArrivalTimestamp": 1706623274.658
            },

Clean up

It’s a good practice to clean up all the resources you created as part of this post to avoid additional cost. Delete the respective CloudFormation stacks from Accounts 1 and 2, and stop the producer from pushing events to the Kinesis data stream, so that you are not charged unnecessarily.

Summary

In this post, we demonstrated how to configure a cross-account Lambda integration with Kinesis Data Streams using AWS resource-based policies, which enables processing of data ingested into a stream in one AWS account with a Lambda function in another account. To support customers who use a Kinesis data stream in their central account and have multiple consumers reading data from it, we used the Kinesis Data Streams enhanced fan-out feature. To get started, open the Kinesis Data Streams console or use the new PutResourcePolicy API to attach a resource policy to your data stream or consumer.

About the authors

Pratik Patel is a Sr. Technical Account Manager and streaming analytics specialist. He works with AWS customers, providing ongoing support and technical guidance to help them plan and build solutions using best practices, and proactively keeps their AWS environments operationally healthy.

Amar is a Senior Solutions Architect at AWS in the UK. He works with power, utilities, manufacturing, and automotive customers on strategic implementations, specializing in AWS streaming and advanced data analytics solutions that drive optimal business outcomes.
  3. Today, Amazon Kinesis Data Streams adds the ability for you to run SQL queries with one click in the AWS Management Console using Amazon Managed Service for Apache Flink. With this new capability, you can easily analyze and visualize the data in your streams in real time.
  4. You can now publish the Redis slow log from your Amazon ElastiCache for Redis clusters to Amazon CloudWatch Logs and Amazon Kinesis Data Firehose. The Redis slow log provides visibility into the execution time of commands in your Redis cluster, enabling you to continuously monitor the performance of these operations. You can choose to send these logs in either JSON or text format to Amazon CloudWatch Logs and Amazon Kinesis Data Firehose.
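As a hedged sketch of enabling this on an existing cluster, the ModifyReplicationGroup API accepts a log delivery configuration; the replication group ID and log group name below are placeholders:

    # Placeholder names; the target CloudWatch log group must already exist
    aws elasticache modify-replication-group \
      --replication-group-id my-redis-cluster \
      --apply-immediately \
      --log-delivery-configurations '[{
        "LogType": "slow-log",
        "DestinationType": "cloudwatch-logs",
        "DestinationDetails": {"CloudWatchLogsDetails": {"LogGroup": "/elasticache/my-redis-cluster/slow-log"}},
        "LogFormat": "json"
      }]'

Setting "DestinationType" to "kinesis-firehose" (with "KinesisFirehoseDetails" naming a delivery stream) targets Amazon Kinesis Data Firehose instead, and "LogFormat" accepts "text" as well as "json".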