Showing results for tags 'workflows'.

Found 5 results

  1. ETL, or Extract, Transform, Load, serves as the backbone of data-driven decision-making in today's rapidly evolving business landscape. However, traditional ETL processes often suffer from challenges like high operational costs, error-prone execution, and difficulty scaling. Enter automation: not merely a facilitator but a necessity for alleviating these burdens. So, let's dive into the transformative impact of automating ETL workflows, the tools that make it possible, and the methodologies that ensure robustness. The Evolution of ETL: Gone are the days when ETL processes were relegated to batch jobs that ran in isolation, churning through records in an overnight slog. The advent of big data and real-time analytics has fundamentally altered expectations of ETL processes. As Doug Cutting, the co-creator of Hadoop, aptly said, "The world is one big data problem." This statement resonates more than ever as we are bombarded with diverse, voluminous, and fast-moving data from myriad sources. (A generic sketch of the extract, transform, load steps follows this list.) View the full article
  2. Git workflows are powerful tools that help streamline the software development process. Here are five of the most popular. View the full article
  3. This post demonstrates a proof-of-concept implementation that uses Kubernetes to execute code in response to an event. (A hedged sketch of the pattern follows this list.) View the full article
  4. You can now run graph analytics and machine learning tasks on graph data stored in Amazon Neptune using an open-source Python integration that simplifies data science and ML workflows. With this integration, you can read and write graph data stored in Neptune using Pandas DataFrames in any Python environment, such as a local Jupyter notebook instance, Amazon SageMaker Studio, AWS Lambda, or other compute resources. From there, you can run graph algorithms, such as PageRank and Connected Components, using open-source libraries like iGraph, NetworkX, and cuGraph. (A sketch of the analytics step follows this list.) View the full article
  5. AWS Step Functions now supports synchronous execution of Express Workflows, allowing you to easily build web-based applications and orchestrate high-volume, short-duration microservices. (A sketch of a synchronous invocation follows this list.) View the full article
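For item 1, the teaser itself contains no code, but as a point of reference, here is a minimal, generic sketch of the three ETL steps it describes. The CSV file, table name, and field names are hypothetical, and this is not the article's tooling:

```python
# A minimal, generic ETL sketch: extract rows from a source file,
# transform them, and load them into a target database.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Extract: read raw records from a CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalise fields and drop incomplete records."""
    return [
        (r["id"], r["name"].strip().lower(), float(r["amount"]))
        for r in rows
        if r.get("id") and r.get("amount")
    ]

def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
    """Load: write the cleaned records into a target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS sales (id TEXT, name TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)

if __name__ == "__main__":
    # One automated end-to-end run; "sales.csv" is a placeholder source.
    load(transform(extract("sales.csv")))
```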
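For item 3, the linked post describes its own proof of concept; the following is only a hedged sketch of the general pattern, using the official Kubernetes Python client to launch a Job when an event arrives. The event ID, container image, and namespace are placeholders:

```python
# Sketch: create a short-lived Kubernetes Job in response to an event.
from kubernetes import client, config

def run_job_for_event(event_id: str, image: str = "busybox:latest") -> None:
    """Create a Job whose container handles a single event."""
    config.load_kube_config()  # use config.load_incluster_config() inside a pod

    container = client.V1Container(
        name="event-handler",
        image=image,
        command=["sh", "-c", f"echo handling event {event_id}"],
    )
    pod_spec = client.V1PodSpec(restart_policy="Never", containers=[container])
    job = client.V1Job(
        api_version="batch/v1",
        kind="Job",
        metadata=client.V1ObjectMeta(name=f"event-{event_id}"),
        spec=client.V1JobSpec(
            template=client.V1PodTemplateSpec(spec=pod_spec),
            backoff_limit=1,
        ),
    )
    client.BatchV1Api().create_namespaced_job(namespace="default", body=job)

if __name__ == "__main__":
    run_job_for_event("demo-123")
```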
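For item 4, a sketch of the downstream analytics step: once graph data has been read from Neptune into a Pandas DataFrame (the Neptune read itself is elided here, and the edge list below is hypothetical), PageRank and connected components can be run with python-igraph:

```python
# Sketch: run PageRank and connected components on an edge-list DataFrame.
import pandas as pd
import igraph as ig

# Hypothetical edge list as it might look after reading from Neptune.
edges = pd.DataFrame(
    {"source": ["a", "b", "c", "d"], "target": ["b", "c", "a", "e"]}
)

# Build a directed graph from (source, target) tuples.
g = ig.Graph.TupleList(edges.itertuples(index=False), directed=True)

pagerank = dict(zip(g.vs["name"], g.pagerank()))
components = g.clusters(mode="weak")

print("PageRank:", pagerank)
print("Weakly connected components:", len(components))
```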
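For item 5, a minimal sketch of a synchronous Express Workflow invocation with boto3's StartSyncExecution API; the state machine ARN and input payload are placeholders:

```python
# Sketch: invoke an Express workflow synchronously and read its result.
import json
import boto3

sfn = boto3.client("stepfunctions")

response = sfn.start_sync_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:my-express-workflow",
    input=json.dumps({"orderId": "42"}),
)

# For Express workflows this call blocks until the execution finishes,
# so the result is available directly in the response.
print(response["status"])       # e.g. "SUCCEEDED"
print(response.get("output"))   # JSON string produced by the workflow
```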