Databases
Database Design
SQL Optimization
Database Administration
NoSQL Databases
Data Warehousing
Performance Tuning
Cloud Databases
Query Troubleshooting
200 topics in this forum
At Snowflake, we are committed to providing our customers with industry-leading LLMs. We’re pleased to bring Meta’s latest Llama 4 models to Snowflake Cortex AI! Llama 4 models deliver performant inference so customers can build enterprise-grade generative AI applications and deliver personalized experiences. The Llama 4 Maverick and Llama 4 Scout models can be accessed within the secure Snowflake perimeter on Cortex AI. According to Meta, Llama 4 Scout is the best multimodal model in its class and supports an industry-leading context window of up to 10M tokens; the models are trained with large amounts of unlabeled text, image and video…
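Once the models are enabled in your account, the call shape is Cortex's COMPLETE function. A minimal sketch via the Snowflake Python connector, assuming a model identifier along the lines of llama4-maverick (check the Cortex docs for the exact name and region availability; connection parameters are placeholders):

```python
# Hypothetical sketch: calling a Llama 4 model through SNOWFLAKE.CORTEX.COMPLETE.
# The model identifier and connection parameters are assumptions, not confirmed names.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")  # placeholders
cur = conn.cursor()
cur.execute(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
    ("llama4-maverick", "Summarize our Q3 sales notes in three bullet points."),
)
print(cur.fetchone()[0])
```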
- 0 replies
- 41 views
Amazon Relational Database Service (Amazon RDS) Custom for SQL Server now supports a new minor version for SQL Server 2019 (CU32 - 15.0.4430.1). This minor version includes performance improvements and bug fixes, and is available for SQL Server Developer, Web, Standard, and Enterprise editions. Review the Microsoft release notes for CU32 for details. We recommend that you upgrade to the latest minor version to benefit from the performance improvements and bug fixes. You can upgrade with just a few clicks in the Amazon RDS Management Console or by using the AWS SDK or CLI. Learn more about upgrading your database instances from the Amazon RDS Custom User Guide. This …
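For the SDK route, a minimal boto3 sketch might look like the following; the instance identifier and engine-version string are placeholders, so confirm the exact CU32 version string with describe_db_engine_versions or in the console:

```python
# Hypothetical sketch: kicking off an RDS minor-version upgrade with boto3.
import boto3

rds = boto3.client("rds")
response = rds.modify_db_instance(
    DBInstanceIdentifier="my-sqlserver-instance",  # placeholder
    EngineVersion="15.00.4430.1.v1",               # placeholder CU32 version string
    ApplyImmediately=True,                         # or defer to the maintenance window
)
print(response["DBInstance"].get("PendingModifiedValues"))
```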
- 0 replies
- 20 views
Today, AWS announces an updated service level agreement (SLA) for Amazon Neptune, increasing the Monthly Uptime Percentage for Multi-AZ DB Instance, Multi-AZ DB Cluster, and Multi-AZ Graph from 99.90% to 99.99%. This enhancement reflects AWS’s continued commitment to providing a highly available and reliable graph database service for your mission-critical applications. With this new SLA, AWS will use commercially reasonable efforts to make each Amazon Neptune Multi-AZ DB Instance, Multi-AZ DB Cluster, and Multi-AZ Graph available with a Monthly Uptime Percentage, during any monthly billing cycle, of at least 99.99%. If Neptune does not meet this Service Commitment, …
- 0 replies
- 29 views
Amazon Relational Database Service (RDS) for PostgreSQL announces Amazon RDS Extended Support minor version 11.22-rds.20250220 and 12.22-rds.20250220. We recommend that you upgrade to this version to fix known security vulnerabilities and bugs in prior versions of PostgreSQL. Amazon RDS Extended Support provides you more time, up to three years, to upgrade to a new major version to help you meet your business requirements. During Extended Support, Amazon RDS will provide critical security and bug fixes for your RDS for PostgreSQL databases after the community ends support for a major version. You can run your PostgreSQL databases on Amazon RDS with Extended Support for…
- 0 replies
- 21 views
Data storage has been evolving, from databases to data warehouses and expansive data lakes, with each architecture responding to different business and data needs. Traditional databases excelled at structured data and transactional workloads but struggled with performance at scale as data volumes grew. The data warehouse solved for performance and scale but, much like the databases that preceded it, relied on proprietary formats to build vertically integrated systems. Data lake systems moved to more open formats but lacked the functional benefits that warehouses provide, such as ACID-compliant transactions, comprehensive governance and more. Ultimately, users found themse…
- 0 replies
- 32 views
Despite the best efforts of many ML teams, most models still never make it to production due to disparate tooling, which often leads to fragmented data and ML pipelines and complex infrastructure management. Snowflake has continuously focused on making it easier and faster for customers to bring advanced models into production. In 2024, we launched over 200 AI features, including a full suite of end-to-end ML features in Snowflake ML, our integrated set of capabilities for machine learning model development, inference and operationalization. We are thrilled to continue the momentum this year by announcing that the following capabilities for GPU-powered ML workflows are no…
- 0 replies
- 24 views
As privacy standards continue to evolve, businesses face a dual challenge: to uphold ethical standards for data use while seizing the opportunities offered by data collaboration. Enter data clean rooms: a privacy-enhancing solution that allows organizations to share valuable insights without compromising compliance. If you're new to data clean rooms, our recent blog post “Data Clean Rooms Explained: What You Need to Know About Privacy-First Collaboration” breaks down the fundamentals. While the potential of data clean rooms is vast, implementing them successfully can be as complex as any enterprise technology project. Between navigating regulatory policies and addressing…
- 0 replies
- 22 views
Relational databases like Oracle have been the backbone of enterprise data management for years. However, as data volumes grow and the need for flexibility, scalability, and advanced analytics increases, modern solutions like Apache Iceberg are becoming essential. Iceberg’s open architecture and advanced features make it a compelling choice for organizations looking to optimize their data […]View the full article
- 0 replies
- 22 views
Does your organization rely on real-time analytics for decision-making, or is your product itself a real-time application? Either way, systems fail when the database can’t keep up. That’s why Amazon introduced DynamoDB, a serverless cloud database that tracks data modifications in real time through change data capture (CDC). In this article, we’ll discuss DynamoDB CDC, […]View the full article
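As a rough illustration of the mechanism before diving in: DynamoDB surfaces CDC through DynamoDB Streams. The sketch below enables a stream and polls one shard with boto3; the table name is a placeholder, and in production a Lambda trigger is the usual consumer rather than manual polling.

```python
# Hypothetical sketch: enabling and reading a DynamoDB change stream (CDC).
import boto3

dynamodb = boto3.client("dynamodb")
streams = boto3.client("dynamodbstreams")

# Capture both the old and new item images on every change.
dynamodb.update_table(
    TableName="orders",  # placeholder
    StreamSpecification={"StreamEnabled": True, "StreamViewType": "NEW_AND_OLD_IMAGES"},
)

stream_arn = dynamodb.describe_table(TableName="orders")["Table"]["LatestStreamArn"]
shard = streams.describe_stream(StreamArn=stream_arn)["StreamDescription"]["Shards"][0]
iterator = streams.get_shard_iterator(
    StreamArn=stream_arn,
    ShardId=shard["ShardId"],
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest retained change
)["ShardIterator"]

for record in streams.get_records(ShardIterator=iterator)["Records"]:
    print(record["eventName"], record["dynamodb"].get("NewImage"))
```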
- 0 replies
- 23 views
Relational databases like Postgres have been the backbone of enterprise data management for years. However, as data volumes grow and the need for flexibility, scalability, and advanced analytics increases, modern solutions like Apache Iceberg are becoming essential. Iceberg’s open architecture and advanced features make it a compelling choice for organizations looking to optimize their data […]View the full article
- 0 replies
- 18 views
It is the 21st century and you are leading a fast-growing fintech startup that is about to hit a breaking point. The data team has doubled in size over six months, but chaos reigns: analysts are wasting hours reconciling conflicting reports, engineers are scrambling to fix broken pipelines, and leaders can’t agree on priorities. […]View the full article
- 0 replies
- 17 views
You work with data to gain insights, improve decisions, and develop new ideas. With more and more data coming from all sorts of places, a solid data strategy matters more than ever. That’s where big data integration comes in: combining data from different sources to get a complete picture. For today’s […]View the full article
- 0 replies
- 17 views
Big data is now crucial for driving business decisions. Companies are tapping into it to gain valuable insights and make smarter moves. To unlock this power, they’re using tools like data warehouses, BI tools, and cloud storage. One key innovation? The semantic layer. Its role is simple: standardize data definitions, making them more accessible and easier […]View the full article
- 0 replies
- 17 views
Whether in healthcare or retail, every business needs data to succeed. Data drives clear decisions and helps businesses understand people and their needs. That is why data integration in business intelligence is so important. In this blog, you will explore data integration in business intelligence, its frameworks and components, its […]View the full article
- 0 replies
- 15 views
Today’s data-centric environment is changing how organizations handle business information through cloud data integration. By seamlessly connecting different data sources, companies gain real-time insights that drive smarter decisions and improve daily operations. Netflix, for example, relies on advanced cloud strategies to process a staggering 550 billion events every day, generating 1.3 petabytes of […]View the full article
- 0 replies
- 17 views
Choosing the right data transformation tool can make all the difference for efficient data workflows. Coalesce and dbt are two of the most popular choices that bring unique features to the table for data teams. While dbt is known for its SQL-based, modular approach to transformations, Coalesce provides a low-code, column-aware interface with automation capabilities. […]View the full article
- 0 replies
- 17 views
In the era of big data, organizations produce and analyze enormous amounts of data daily, relying on tools that streamline data ingestion, transformation, and analysis to make sense of it all. Two of the most popular tools on the modern data stack, dbt (Data Build Tool) and Hevo, occupy different but complementary spaces. […]View the full article
- 0 replies
- 18 views
With growing businesses, marketing teams are flooded with a wealth of data from various platforms such as social media, email campaigns, customer feedback, websites, and offline in-store channels. The real challenge lies in “how to integrate this data into a unified structure in a meaningful way?”. This is where “Marketing Data Integration” comes into play. […]View the full article
- 0 replies
- 17 views
Is your business held back by slow, unreliable data pipelines in today’s hyper-competitive environment? Data pipelines are the backbone that guarantees real-time access to critical information for informed, quicker decisions. The data pipeline market is set to grow from USD 6.81 billion in 2022 to USD 33.87 billion by 2030 at a CAGR […]View the full article
- 0 replies
- 14 views
Digital tools and technologies help organizations generate large amounts of data daily, requiring efficient governance and management. This is where the AWS data lake comes in. With the AWS data lake, organizations and businesses can store, analyze, and process structured and unstructured data of any size. This article will focus on how businesses and organizations […]View the full article
- 0 replies
- 15 views
The right data integration platform is crucial for the effective management and analysis of data. Rivery offers robust capabilities in data integration and transformation, but it may not fit every business’s unique needs. Fortunately, there are several Rivery alternatives available, each with distinct features, pricing, and use cases. Explore these options and find the perfect […]View the full article
- 0 replies
- 13 views
Every year, Gartner rolls out its Magic Quadrant for Data Integration Tools, a trusted guide for data leaders on the hunt for the perfect integration tool. Think of it as a cheat sheet that cuts through the noise—evaluating tools based on how well they perform, how innovative they are, and how clear their vision is […]View the full article
- 0 replies
- 14 views
Let’s face it: Data engineering is like playing Tetris, always moving objects around to fit them into the right places. The data is never static; pipelines, schemas, transformations, and workflows are puzzles that must constantly be solved. Yes, it is a demanding kind of job; however, let me assure you, it is not always […]View the full article
- 0 replies
- 15 views
One of the most common patterns in data analytics is different end users running the same queries over and over again, at various times, against various snapshots of the data. This makes warehousing solutions both expensive and time-consuming. How about we store the results of such expensive […]View the full article
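The idea being teased is essentially a materialized view: compute the expensive aggregate once, persist the result, and let repeat queries read the stored copy. A toy, self-contained sketch of the principle (SQLite has no MATERIALIZED VIEW, so a plain table stands in for one here; warehouses such as PostgreSQL and Snowflake support CREATE MATERIALIZED VIEW natively):

```python
# Toy sketch of result materialization: run the costly aggregate once, store it,
# and serve every repeat query from the small precomputed table.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_date TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("2025-01-01", 10.0), ("2025-01-01", 5.0), ("2025-01-02", 7.5)])

# The expensive aggregate, computed once and persisted ("materialized").
con.execute("""CREATE TABLE daily_revenue AS
               SELECT order_date, SUM(amount) AS revenue
               FROM orders GROUP BY order_date""")

# Repeat queries scan the tiny precomputed table instead of the raw data.
for row in con.execute("SELECT * FROM daily_revenue ORDER BY order_date"):
    print(row)
```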
- 0 replies
- 15 views
Have you ever felt like data engineering is evolving at the speed of light? With new tech emerging almost daily, it’s no surprise that staying ahead of the curve is harder than ever. As we step into 2025, the rate at which data engineering changes is at an all-time high. New […] View the full article
- 0 replies
- 19 views
In today’s data-driven world, data lakes have emerged as the data architecture of choice when storing and analyzing large volumes of data. However, implementing a successful data lake requires diligent planning and design, as it can quickly become a data swamp with no additional value. This blog post will delve into data lake best practices, […]View the full article
- 0 replies
- 14 views
Oracle Database is used by many companies around the world as the foundation for storing and processing information. It is well adopted across all markets, including the financial sector and healthcare, where data security and management are critical. Many companies rely on Oracle Database to store and manage their critical […]View the full article
- 0 replies
- 14 views
Amazon Aurora PostgreSQL is now available as a quick create vector store in Amazon Bedrock Knowledge Bases. With the new Aurora quick create option, developers and data scientists building generative AI applications can select Aurora PostgreSQL as their vector store with one click to deploy an Aurora Serverless cluster preconfigured with pgvector in minutes. Aurora Serverless is an on-demand, autoscaling configuration where capacity is adjusted automatically based on application demand, making it ideal as a developer vector store. Knowledge Bases securely connects foundation models (FMs) running in Bedrock to your company data sources for Retrieval Augmented Generation…
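For a sense of the primitives underneath such a vector store, here is a hypothetical sketch of the pgvector operations involved, run directly against an Aurora PostgreSQL endpoint with psycopg2. The DSN, table layout, and tiny 3-dimensional embeddings are placeholders; real embedding models emit hundreds or thousands of dimensions.

```python
# Hypothetical sketch: basic pgvector usage on an Aurora PostgreSQL cluster.
import psycopg2

conn = psycopg2.connect("postgresql://user:pass@aurora-endpoint:5432/mydb")  # placeholder DSN
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute("""CREATE TABLE IF NOT EXISTS docs (
                   id serial PRIMARY KEY, body text, embedding vector(3))""")
cur.execute("INSERT INTO docs (body, embedding) VALUES (%s, %s)",
            ("hello world", "[0.1, 0.2, 0.3]"))

# Nearest-neighbour retrieval: <-> is pgvector's L2-distance operator.
cur.execute("SELECT body FROM docs ORDER BY embedding <-> %s LIMIT 5;",
            ("[0.1, 0.2, 0.25]",))
print(cur.fetchall())
conn.commit()
```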
- 0 replies
- 11 views
We know just how hard it is to manage your marketing data. The variety of campaigns running through platforms such as Google Ads, Facebook, and HubSpot generates the kind of information that can quickly flood you. This is why today we talk about a marketing data warehouse powerful enough to make […]View the full article
- 0 replies
- 12 views
Matillion is a cloud-based ETL tool known for its user-friendly, low-code interface. It’s great for teams that want to get pipelines up and running quickly without heavy coding. It also integrates seamlessly with cloud platforms like Snowflake, BigQuery, and Redshift, making it a solid choice for companies already working in the cloud. Airflow, on the other […]View the full article
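To make the contrast concrete: "code-first" on the Airflow side means a pipeline is simply a Python file defining a DAG. A minimal sketch, with DAG id, task names, and schedule as placeholders:

```python
# Minimal Airflow DAG sketch: two Python tasks wired extract -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling rows from the source")

def load():
    print("loading rows into the warehouse")

with DAG(dag_id="example_etl", start_date=datetime(2025, 1, 1),
         schedule="@daily", catchup=False) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```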
- 0 replies
- 12 views
When it comes to data integration, Fivetran has established a solid reputation as one of the industry leaders. With its robust feature set, Fivetran has become a go-to option for many enterprises. But there’s a catch: Fivetran’s pricing model. It’s unique and usage-based, and it can be confusing and complex for […]View the full article
- 0 replies
- 11 views
With so many data integration tools available these days, it can become very overwhelming to choose one that best suits your needs. In this blog post, I have broken down a comprehensive comparison of two leading platforms: Airbyte vs. Stitch. So, let’s get into the main discussion. Whether you seek flexibility, scalability, or ease […]View the full article
- 0 replies
- 11 views
When searching for a reliable data integration platform, many options might cross your mind. However, Hevo Data stands out as a no-code, fully managed solution. Recognized in G2’s Fall 2021 report, Hevo delivers unmatched ease of use, setup simplicity, and comprehensive support. Trusted by 2,000+ companies, including brands like Postman and Thoughtspot, Hevo enables […]View the full article
- 0 replies
- 12 views
At Snowflake BUILD, we are introducing powerful new features designed to accelerate building and deploying generative AI applications on enterprise data, while helping you ensure trust and safety. These new tools streamline workflows, deliver insights at scale, and get AI apps into production quickly. Customers such as Skai have used these capabilities to bring their generative AI solution into production in just two days instead of months. Here’s how Snowflake Cortex AI and Snowflake ML are accelerating the delivery of trusted AI solutions for the most critical generative AI applications... View the full article
- 0 replies
- 27 views
Amazon Redshift is a petabyte-scale cloud data warehouse service. It is dedicated to enterprise use, collecting large amounts of data and extracting analysis and insights from it, and it helps organizations query large databases in near real time. Redshift offers flexible performance, provided costs are managed carefully to minimize cloud expenses. In this […]View the full article
- 0 replies
- 13 views
Every business built on data-driven insights in the modern data ecosystem needs effective ETL tools. Your choice of ETL tool goes a long way in determining the efficiency, speed, and cost of your data operations. Among well-recognized ETL tools, Hevo and Matillion offer different capabilities, which makes it important to understand their features, […]View the full article
- 0 replies
- 12 views
As the dependency on high-quality, real-time data grows, event/data streaming tools become increasingly crucial. Apache Kafka has become one of the most popular event streaming platforms, and its popularity has led to wide organizational adoption across functions involving large-scale real-time data streams. Today, we will explore the Top 5 […]View the full article
- 0 replies
- 12 views
If you have decided to start your journey with cloud databases, you probably have encountered AWS RDS – Amazon Web Services Relational Database Service, and CDC – Change Data Capture. In this blog, you will learn about AWS RDS, what CDC is, and how to integrate AWS RDS CDC into your data operations. If you […]View the full article
- 0 replies
- 12 views
In today’s fast-paced data environment, Change Data Capture (CDC) transforms how organizations handle and synchronize their expanding data volumes. According to the Market Analysis Report, the global data management market size was valued at USD 89.34 billion in 2022 and is expected to grow at a compound annual growth rate (CAGR) of 12.1% from 2023 […]View the full article
- 0 replies
- 12 views
In today’s world of big data, it’s important for companies to quickly and effectively transform and analyze large data sets to get useful information. Businesses need tools that help them gather, transform, and use data easily so they can make smart decisions based on that data. Among the many ETL/ELT tools available, Matillion and dbt […]View the full article
- 0 replies
- 12 views
Imagine putting hours into manually handling data tasks only to discover that one small mistake has caused the entire process to fail. Yes, it is frustrating. This is why automation is important: it is essential to ensure efficiency and data integrity in businesses. A McKinsey report states that task automation can spare up […]View the full article
- 0 replies
- 12 views
Optimization is crucial in data engineering, where high-volume, complex data demands efficient handling and querying. In platforms like Databricks, built around speed and performance, knowledge of query optimization helps organizations leverage their data by accelerating processes. In this world of data engineering, optimization is not just a buzzword but a mandate. As the […]View the full article
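As a small illustration of where that optimization work usually starts on Spark-based platforms such as Databricks, you can ask Spark to print the plans its Catalyst optimizer produces before touching any knobs; a minimal PySpark sketch:

```python
# Minimal sketch: inspect the plans Spark's Catalyst optimizer generates for a query.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(1_000_000).filter("id % 2 = 0").selectExpr("id * 10 AS scaled")
df.explain(True)  # prints the parsed, analyzed, optimized, and physical plans
```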
- 0 replies
- 12 views
Today’s world is all about data; hence, in choosing the right Integration Platform as a Service (iPaaS), enterprises seek streamlined operations, better data quality, and ease in connecting diverse systems. Among the leading iPaaS vendors, Boomi and Informatica offer unique features and capabilities that suit different enterprise needs. A report by Gartner tells […]View the full article
- 0 replies
- 12 views
AWS Glue is a fully managed serverless ETL service that simplifies preparing and loading data for analytics. But how does it work? To answer that question, we need to understand its architecture. In this blog, we will discuss the AWS Glue architecture so you can fully understand how it works and optimize your data better. […]View the full article
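For orientation before the architecture discussion: the standard Glue job boilerplate already shows the main pieces fitting together, with Spark underneath, GlueContext layered on top, and the Data Catalog supplying table metadata. A hypothetical sketch with placeholder database, table, and bucket names:

```python
# Hypothetical sketch of a standard Glue ETL job script.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read table metadata from the Data Catalog instead of hard-coding paths.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders")  # placeholders
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/clean/orders/"},  # placeholder bucket
    format="parquet",
)
job.commit()
```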
- 0 replies
- 11 views
More than ever, organizations face increasing challenges in maintaining data quality as their data grows exponentially in size and complexity. They must now rely on efficient tools and services to ensure their data is accurate, consistent, and free of anomalies. Quality data is essential for deriving accurate insights and making informed decisions. Poor data can lead to inaccurate insights, which, […]View the full article
- 0 replies
- 12 views