Showing results for tags 'data streaming'.

Found 4 results

  1. Amazon Data Firehose (Firehose) now offers direct integration with Snowflake Snowpipe Streaming. Firehose enables customers to reliably capture, transform, and deliver data streams into Amazon S3, Amazon Redshift, Splunk, and other destinations for analytics. With this new feature, customers can stream clickstream, application, and AWS service logs from multiple sources, including Kinesis Data Streams, to Snowflake. With a few clicks, customers can set up a Firehose stream to deliver data to Snowflake. Firehose automatically scales to stream gigabytes of data, and records are available in Snowflake within seconds. View the full article
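
     For readers who would rather script this setup than click through the console, here is a minimal sketch using boto3. The stream names, ARNs, and credentials are placeholders, and the SnowflakeDestinationConfiguration field names are taken from recent SDK releases, so verify them against your boto3 version.

     ```python
     # Hypothetical sketch: create a Firehose stream that reads from a Kinesis
     # data stream and delivers records to Snowflake via Snowpipe Streaming.
     # All names, ARNs, and credentials below are placeholders.
     import boto3

     firehose = boto3.client("firehose", region_name="us-east-1")

     firehose.create_delivery_stream(
         DeliveryStreamName="clickstream-to-snowflake",
         DeliveryStreamType="KinesisStreamAsSource",
         KinesisStreamSourceConfiguration={
             "KinesisStreamARN": "arn:aws:kinesis:us-east-1:123456789012:stream/clickstream",
             "RoleARN": "arn:aws:iam::123456789012:role/firehose-source-role",
         },
         # Field names follow recent boto3 documentation; check your SDK version.
         SnowflakeDestinationConfiguration={
             "AccountUrl": "https://myaccount.snowflakecomputing.com",
             "User": "FIREHOSE_USER",
             "PrivateKey": "<key-pair private key>",
             "Database": "ANALYTICS",
             "Schema": "PUBLIC",
             "Table": "CLICKSTREAM_EVENTS",
             "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
             "S3BackupMode": "FailedDataOnly",  # back up only records that fail delivery
             "S3Configuration": {
                 "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
                 "BucketARN": "arn:aws:s3:::clickstream-backup",
             },
         },
     )
     ```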
  2. Executives across industries are under pressure to reach insights and make decisions quickly. This is driving the importance of streaming data and analytics, which enable better-informed decisions and, in turn, faster and better outcomes. While traditional systems store and process data in batches, streaming data is generated continuously from a variety of sources. Data streaming, the process of continuously capturing and processing data at a certain speed, can encompass a broad range of latencies depending on business needs. Contrary to the common perception that "streaming must be in milliseconds," streaming is a spectrum of latency that handles continuous streams of data in seconds, minutes or even hours.

     Data streaming also presents challenges. For data to have business value, it needs to be ingested from diverse sources at low latency, but low latency often brings high complexity and cost, forcing organizations to strike a balance between the two. Leaders across industries are addressing these challenges, and preparing for future business requirements, with Snowflake's simplified solution, which combines streaming and batch pipelines in a single architectural layer. By leveraging Snowflake and collaborating with partners such as AWS to centralize their data in a single platform, organizations are achieving more cost-effective streaming.

     An Industry View of Streaming Use Cases and Architectures

     In our new ebook, The Modern Data Streaming Pipeline, Snowflake engaged with dozens of customers across seven diverse sectors (financial services, manufacturing, healthcare, cybersecurity, retail, advertising and telecommunications) to explore their most common streaming use cases and examine their architecture choices for optimizing performance and efficiency.

     Let's use manufacturing as an example. Data streaming helps manufacturing companies ingest critical data from across the value chain, such as sensor readings from production equipment, inventory levels, supplier performance metrics and customer demand patterns. This continuous flow of data opens opportunities for manufacturers, enabling them to generate new revenue streams with user behavior data, reduce operating costs by detecting potential issues and improve product quality through equipment performance data.

     Here's a manufacturing reference architecture for IoT analytics, illustrating what this can look like in Snowflake:

     • Smart devices, sensors and other IoT devices generate continuous streaming data. Because internet connectivity is frequently unreliable, the devices communicate over IoT protocols such as MQTT with an IoT message broker. The broker uses a publish/subscribe mechanism: other services subscribe to specific topics within the broker to access device data.
     • A streaming service ingests and buffers real-time device data, and Snowpipe Streaming delivers the row-set data reliably to a staging table in Snowflake.
     • For strong orchestration needs, Snowflake's Streams and Tasks features automate the workflows required to aggregate incoming data. Optionally, Dynamic Tables (currently in public preview) can aggregate and materialize the results, automating incremental processing for continuous data transformation.
     • Snowpark can be used to further enrich and validate the data, which is then transformed with business logic and/or used for machine learning training.

     Manufacturing partners such as DXC, Infosys, Kipi and LTI offer IoT solutions integrated with Snowflake's streaming capabilities.

     And that's just the beginning. Customers across this and the other six industries shared how they're using Snowflake for their top streaming analytics use cases, from advanced personalization in retail, to medical IoT device ingestion in healthcare, to regulatory reporting in financial services. Their recommended streaming reference architectures can help optimize performance and efficiency for these high-demand use cases.

     To discover how organizations in your industry are using Snowflake for streaming analytics, and realizing significant gains in the process, download our ebook: The Modern Data Streaming Pipeline: Top Streaming Architectures and Use Cases Across 7 Industries.

     The post The Modern Data Streaming Pipeline: Streaming Reference Architectures and Use Cases Across 7 Industries appeared first on Snowflake. View the full article
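
     To make the orchestration step concrete, here is an illustrative sketch of the Snowflake side of this architecture using the snowflake-connector-python package. The table, stream, task, and warehouse names are hypothetical, and the staging table is assumed to be populated by Snowpipe Streaming.

     ```python
     # Illustrative sketch of the Snowflake side of the IoT pipeline above.
     # Object names are hypothetical; raw_readings is assumed to be the staging
     # table fed by Snowpipe Streaming.
     import snowflake.connector

     conn = snowflake.connector.connect(
         account="myaccount", user="PIPELINE_USER", password="...",
         warehouse="TRANSFORM_WH", database="IOT", schema="PUBLIC",
     )
     cur = conn.cursor()

     # Track new rows as they land in the staging table.
     cur.execute("CREATE STREAM IF NOT EXISTS readings_stream ON TABLE raw_readings")

     # A task that runs only when the stream has data, aggregating new readings.
     cur.execute("""
         CREATE TASK IF NOT EXISTS aggregate_readings
           WAREHOUSE = TRANSFORM_WH
           SCHEDULE = '1 MINUTE'
           WHEN SYSTEM$STREAM_HAS_DATA('READINGS_STREAM')
         AS
           INSERT INTO device_minute_agg
           SELECT device_id, DATE_TRUNC('minute', reading_ts), AVG(temperature)
           FROM readings_stream
           GROUP BY 1, 2
     """)
     cur.execute("ALTER TASK aggregate_readings RESUME")

     # Alternatively, a dynamic table manages the incremental refresh itself.
     cur.execute("""
         CREATE OR REPLACE DYNAMIC TABLE device_minute_dt
           TARGET_LAG = '5 minutes'
           WAREHOUSE = TRANSFORM_WH
         AS
           SELECT device_id, DATE_TRUNC('minute', reading_ts) AS minute_ts,
                  AVG(temperature) AS avg_temp
           FROM raw_readings
           GROUP BY 1, 2
     """)
     conn.close()
     ```

     The stream-plus-task pair and the dynamic table are alternatives here: the former gives explicit control over the workflow, while the latter lets Snowflake handle incremental refresh against a target lag.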
  3. Today, Amazon Kinesis Data Streams adds the ability for you to run SQL queries with one click in the AWS Management Console using Amazon Managed Service for Apache Flink. With this new capability, you can easily analyze and visualize the data in your streams in real time. View the full article
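
     The console generates and runs the Flink SQL for you, but the same kind of query can be expressed with PyFlink. Below is a hypothetical sketch of a tumbling-window aggregation over a Kinesis stream; the stream name, schema, and connector options are illustrative, and the Flink Kinesis connector jar must be on the classpath.

     ```python
     # Hypothetical PyFlink sketch: count clicks per user in one-minute
     # tumbling windows over a Kinesis stream. Names and schema are made up.
     from pyflink.table import EnvironmentSettings, TableEnvironment

     t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

     # Register the Kinesis stream as a table (requires the Kinesis connector jar).
     t_env.execute_sql("""
         CREATE TABLE clicks (
             user_id STRING,
             url     STRING,
             ts      TIMESTAMP(3),
             WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
         ) WITH (
             'connector' = 'kinesis',
             'stream' = 'clickstream',
             'aws.region' = 'us-east-1',
             'scan.stream.initpos' = 'LATEST',
             'format' = 'json'
         )
     """)

     # A continuous aggregation over one-minute tumbling event-time windows.
     result = t_env.execute_sql("""
         SELECT user_id,
                TUMBLE_START(ts, INTERVAL '1' MINUTE) AS window_start,
                COUNT(*) AS clicks
         FROM clicks
         GROUP BY user_id, TUMBLE(ts, INTERVAL '1' MINUTE)
     """)
     result.print()
     ```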
  4. Streaming data pipelines have become an essential component of modern data-driven organizations. These pipelines enable real-time data ingestion, processing, transformation, and analysis. In this article, we will delve into the architecture and essential details of building a streaming data pipeline.

     Data Ingestion

     Data ingestion is the first stage of a streaming data pipeline. It involves capturing data from various sources such as Kafka, MQTT, log files, or APIs. Common techniques for data ingestion include: View the full article
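
     As a taste of what such an ingestion stage can look like, here is a minimal sketch for the Kafka case using the kafka-python package; the topic, brokers, and message shape are hypothetical.

     ```python
     # Minimal ingestion sketch: consume JSON events from a Kafka topic.
     # Topic, brokers, and message shape are hypothetical.
     import json
     from kafka import KafkaConsumer

     consumer = KafkaConsumer(
         "sensor-events",                      # topic to ingest from
         bootstrap_servers=["localhost:9092"],
         auto_offset_reset="earliest",         # start from the oldest record
         value_deserializer=lambda b: json.loads(b.decode("utf-8")),
     )

     # Each message arrives as a dict, ready for downstream processing.
     for message in consumer:
         event = message.value
         print(message.topic, message.partition, message.offset, event)
     ```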