  • Introduction
    • About Hevo
    • Hevo Features
    • Hevo System Architecture
      • Hevo Pipeline Framework
    • Core Concepts
      • ETL and ELT
        • ETL Cost Monitor
      • Data Pipelines
      • Sources and Destinations
        • Types of Sources
        • Types of Destinations
      • Events
      • Objects and Event Types
      • Data Replication
      • Data Transformation
      • Data Types
    • Free Trials
      • Free Product Trial
      • Free Trial of Unused Sources
    • Security
      • Customer Data Retention and Encryption
      • Infrastructure Security
      • Company Security Standards and Practices
      • Password and Account Lockout Policy
    • Regulatory Compliance
    • Scheduling a Demo
    • Hevo Support
    • General FAQs
      • Where can I suggest a new feature in Hevo?
  • Getting Started
    • Creating an Account in Hevo
      • Selecting your Hevo Region
      • Creating your Account
      • User Roles and Workspaces
        • Viewing User Roles and Permissions
        • Changing the Role of the User
    • Subscribing to Hevo via AWS Marketplace
      • Subscribing to a Private Offer
      • Subscribing to a Public Offer
      • Modifying an AWS Contract
      • Switching to Stripe-based Invoicing
    • Connection Options
      • Connecting Through SSH
        • Configuring an SSH Tunnel
        • Troubleshooting SSH Connection Errors
          • Unable to Verify SSH Details
      • Connecting Through Reverse SSH Tunnel
      • Connecting Through VPN
      • Connecting Through Amazon Web Services
        • Connecting Through Mongo PrivateLink
        • Connecting Through AWS Transit Gateway
        • Connecting Through AWS VPC Endpoint
        • Connecting Through AWS VPC Peering
        • AWS Regions Supported by Hevo
      • Using Google Account Authentication
        • Authentication for Google Workspace Applications
        • Authentication for GCP-hosted Services
        • Migrating User Account-Based Pipelines to Service Account
      • How Hevo Authenticates Sources and Destinations using OAuth
      • Reauthorizing an OAuth Account
    • Familiarizing with the UI
      • Navigation Bar
      • Global Search
      • User Information Panel
      • Activity Graphs
      • Side-by-Side Setup Guide
      • Keyboard Shortcuts
      • Switching to Dark Mode
      • UI Elements and Terms Reference
    • Creating your First Pipeline
      • Creating a Database Pipeline
      • Creating a SaaS Pipeline
      • Creating a Webhook Pipeline
    • Data Loss Prevention and Recovery
  • Data Ingestion
    • Types of Data Synchronization
    • Ingestion Modes and Query Modes for Database Sources
      • Ingestion Modes
      • Query Modes
    • Ingestion and Loading Frequency
    • Data Ingestion Statuses
    • Deferred Data Ingestion
    • Handling of Primary Keys
    • Handling of Updates
    • Handling of Deletes
    • Hevo-generated Metadata
      • Metadata Column __hevo_database_name
      • Metadata Column __hevo_id
      • Metadata Column __hevo_source_modified_at
    • Best Practices to Avoid Reaching Source API Rate Limits
  • Edge
    • Getting Started
      • Familiarizing with the UI
      • Upgrading a Pipeline from Standard to Edge
    • Data Ingestion
      • Types of Data Synchronization
      • Sync Frequency
        • Sync Frequency and Jobs
      • Rate Limit Exceptions
        • Handling Rate Limit Exceptions
        • Best Practices to Avoid Reaching Source API Rate Limits
      • Hevo-generated Metadata
    • Core Concepts
      • Events
      • Objects
      • Data Replication
    • Pipelines
      • Familiarizing with the Pipelines UI (Edge)
      • Creating an Edge Pipeline
      • Working with Edge Pipelines
        • Editing Edge Pipelines
        • Syncing Data On Demand in Edge Pipelines
        • Activating Edge Pipelines
        • Modifying the Source and Destination Configuration
        • Deleting Edge Pipelines
      • Object and Schema Management
        • Familiarizing with the Object Configuration UI
        • Managing Objects in Pipelines
        • Pipeline Schema Management
      • Pipeline Job History
        • Job Types
        • Job and Object Statuses
        • Viewing Pipeline Job Details
        • Offset and Latency of Jobs and Objects
        • Canceling Jobs
        • Downloading Session Logs
    • Sources
      • PostgreSQL
        • Amazon Aurora PostgreSQL
        • Amazon RDS PostgreSQL
        • Azure PostgreSQL
        • Generic PostgreSQL
        • Google Cloud PostgreSQL
        • Troubleshooting PostgreSQL
          • Errors During Pipeline Creation
            • Connection Attempt Failed
            • Invalid Publication Key
      • Oracle
        • Amazon RDS Oracle
        • Generic Oracle
      • MySQL
        • Amazon Aurora MySQL
        • Amazon RDS MySQL
        • Azure MySQL
        • Generic MySQL
        • Google Cloud MySQL
      • SQL Server
        • Amazon RDS SQL Server
        • Azure SQL Server
        • Generic SQL Server
        • Google Cloud SQL Server
        • Troubleshooting SQL Server
          • Errors During Pipeline Creation
            • Connection Fails Through SSH Tunnel
            • Database User Does Not Have Required Permissions
      • Troubleshooting Database Sources
        • Errors During Pipeline Creation
          • Authentication Failure
          • Connection Settings Errors
      • Salesforce Bulk API V2
        • Handling Formula Fields
    • Destinations
      • Familiarizing with the Destinations UI
      • Naming Conventions for Destination Data Entities
      • Amazon Redshift
        • Troubleshooting Amazon Redshift Destination
          • Database Not Found
          • Connection Attempt Failed
      • Google BigQuery
        • Authentication for GCP-hosted Services
        • Supporting Partitioned Tables in BigQuery
      • Snowflake
        • Troubleshooting Snowflake
          • Snowflake Account Locked
          • Snowflake Database Not Found
          • Unable to Read Private Key
      • Working with Destinations
        • Deleting Destinations
    • Alerts
      • Working with Alerts
        • Managing Alert Recipients
          • Integrating the Hevo App with Slack
        • Subscribing to Alerts
        • Editing Alert Preferences
    • Custom Connectors
      • Creating your First Connector
        • Creating the UI Components
        • Creating the Connection Logic
      • Testing your Custom Connector
    • Releases
      • Edge Release Notes - December 08, 2025
      • Edge Release Notes - December 01, 2025
      • Edge Release Notes - November 05, 2025
      • Edge Release Notes - October 30, 2025
      • Edge Release Notes - September 22, 2025
      • Edge Release Notes - August 11, 2025
      • Edge Release Notes - July 09, 2025
      • Edge Release Notes - November 21, 2024
  • Data Loading
    • Loading Data in a Database Destination
    • Loading Data to a Data Warehouse
    • Optimizing Data Loading for a Destination Warehouse
    • Deduplicating Data in a Data Warehouse Destination
    • Manually Triggering the Loading of Events
    • Scheduling Data Load for a Destination
    • Loading Events in Batches
    • Data Loading Statuses
    • Data Spike Alerts
    • Name Sanitization
    • Table and Column Name Compression
    • Parsing Nested JSON Fields in Events
  • Pipelines
    • Data Flow in a Pipeline
    • Familiarizing with the Pipelines UI
    • Working with Pipelines
      • Best Practices for Creating Database Pipelines
      • Creating a Pipeline
        • Draft Pipelines
      • Connectivity Check for RDBMS Sources
      • Scheduling a Pipeline
      • Modifying a Pipeline
      • Prioritizing a Pipeline
      • Viewing Pipeline Progress
      • Pausing and Deleting a Pipeline
      • Log-based Pipelines
        • Pausing a Log-based Pipeline
        • Handling Deletes in Log-based Pipelines
      • Troubleshooting Data Replication Errors
    • Managing Objects in Pipelines
      • Database Objects and Actions
      • Optimizing Query Modes for Objects
      • SaaS Objects and Actions
      • Bulk Actions in Pipeline Objects
    • Pipeline Jobs
    • Transformations
      • Python Code-Based Transformations
        • Supported Python Modules and Functions
        • Transformation Methods in the Event Class
          • Create an Event
          • Retrieve the Event Name
          • Rename an Event
          • Retrieve the Properties of an Event
          • Modify the Properties for an Event
          • Fetch the Primary Keys of an Event
          • Modify the Primary Keys of an Event
          • Fetch the Data Type of a Field
          • Check if the Field is a String
          • Check if the Field is a Number
          • Check if the Field is Boolean
          • Check if the Field is a Date
          • Check if the Field is a Time Value
          • Check if the Field is a Timestamp
        • TimeUtils
          • Convert Date String to Required Format
          • Convert Date to Required Format
          • Convert Datetime String to Required Format
          • Convert Epoch Time to a Date
          • Convert Epoch Time to a Datetime
          • Convert Epoch to Required Format
          • Convert Epoch to a Time
          • Get Time Difference
          • Parse Date String to Date
          • Parse Date String to Datetime Format
          • Parse Date String to Time
        • Utils
          • Check if an Object is a Dictionary Object
          • Retrieve the Type for a Value
          • Replace Unicode Values
          • Round off BigDecimal Values
          • Trim Unicode Values
          • Convert JSON String to Dictionary Object
          • Convert Python Dictionary Object to JSON String
        • Examples of Python Code-based Transformations
          • Adding Time to a Timestamp Value
          • Splitting an Event into Multiple Event Types
          • Splitting Multiple Values in a Key into Separate Events
          • Splitting Nested Events into Multiple Events
      • Drag and Drop Transformations
        • Special Keywords
        • Transformation Blocks and Properties
          • Add a Field
          • Change Datetime Field Values
          • Change Field Values
          • Drop Events
          • Drop Fields
          • Find & Replace
          • Flatten JSON
          • Format Date to String
          • Format Number to String
          • Hash Fields
          • If-Else
          • Mask Fields
          • Modify Text Casing
          • Parse Date from String
          • Parse JSON from String
          • Parse Number from String
          • Rename Events
          • Rename Fields
          • Round-off Decimal Fields
          • Split Fields
        • Examples of Drag and Drop Transformations
      • Effect of Transformations on the Destination Table Structure
      • Transformation Reference
        • Datetime Field Values
        • Operators
        • Rounding Modes
        • Text Casing
        • Values and Formulas
      • Transformation FAQs
        • Why can't I see my changed Source data?
    • Schema Mapper
      • Using Schema Mapper
      • Mapping Statuses
      • Auto Mapping Event Types
      • Manually Mapping Event Types
      • Modifying Schema Mapping for Event Types
      • Schema Mapper Actions
      • Fixing Unmapped Fields
      • Resolving Incompatible Schema Mappings
      • Resizing String Columns in the Destination
      • Changing the Data Type of a Destination Table Column
      • Schema Mapper Compatibility Table
      • Limits on the Number of Destination Columns
    • File Log
    • Troubleshooting Failed Events in a Pipeline
      • Reasons for Event Failures
      • Resolving Event Failures
      • Invalid Timestamp Value in Source Fields
    • Mismatch in Events Count in Source and Destination
    • Audit Tables
      • Configuring Audit Tables
      • Modifying Audit Table Settings
      • Viewing Audit History
        • Querying Audit Tables
    • Activity Log
      • Activity Logs - CloudWatch Sync
    • Pipeline FAQs
      • Can multiple Sources connect to one Destination?
      • What happens if I re-create a deleted Pipeline?
      • Why is there a delay in my Pipeline?
      • Can I change the Destination post-Pipeline creation?
      • Why is my billable Events count high with Delta Timestamp mode?
      • Can I drop multiple Destination tables in a Pipeline at once?
      • How does Run Now affect scheduled ingestion frequency?
      • Will pausing some objects increase the ingestion speed?
      • Can I see the historical load progress?
      • Why is my Historical Load Progress still at 0%?
      • Why is historical data not getting ingested?
      • How do I set a field as a primary key?
      • How do I ensure that records are loaded only once?
  • Events Usage
    • Understanding Events Usage for Billing
    • Events Quota Reset Date
    • Viewing Events Usage
    • Pipeline Usage Summary
    • Factors Affecting Event Usage
      • Query Modes
      • Pipeline Frequency
      • Conversion Window and Pipeline Frequency in Ad-based Sources
  • Sources
    • Free Sources
    • Databases and File Systems
      • Data Warehouses
        • Amazon Redshift
        • Google BigQuery
      • Databases
        • Connecting to a Local Database
        • Amazon DocumentDB
        • Amazon DynamoDB
          • Troubleshooting Amazon DynamoDB
            • Errors Post-Pipeline Creation
              • Error 2008 - Streams disabled for table
              • Error 2011 - Invalid Stream View Type
        • Elasticsearch
          • Configuration Changes in Elasticsearch
        • MongoDB
          • Generic MongoDB
          • MongoDB Atlas
          • Support for Multiple Data Types for the _id Field
          • Example - Merge Collections Feature
          • Troubleshooting MongoDB
            • Errors During Pipeline Creation
              • Error 1001 - Incorrect credentials
              • Error 1005 - Connection timeout
              • Error 1006 - Invalid database hostname
              • Error 1007 - SSH connection failed
              • Error 1008 - Database unreachable
              • Error 1011 - Insufficient access
              • Error 1028 - Primary/Master host needed for OpLog
              • Error 1029 - Version not supported for Change Streams
              • SSL 1009 - SSL Connection Failure
            • Troubleshooting MongoDB Change Streams Connection
            • Troubleshooting MongoDB OpLog Connection
        • SQL Server
          • Amazon RDS SQL Server
          • Azure SQL Server
          • Generic SQL Server
          • Google Cloud SQL Server
          • Troubleshooting SQL Server
            • Errors During Pipeline Creation
              • Error 1003 - Authentication error
              • Error 1005 - Connection timeout
              • Error 1006 - Insufficient access
              • Error 1011 - Access denied
          • SQL Server FAQs
            • What if a deleted record is reinserted in the Source?
            • How can I change the Data Type of a Primary Key Column in SQL Server?
        • MySQL
          • Amazon Aurora MySQL
          • Amazon RDS MySQL
          • Azure MySQL
          • Generic MySQL
          • Google Cloud MySQL
          • MariaDB MySQL
          • Troubleshooting MySQL
            • Errors During Pipeline Creation
              • Error 1003 - Connection to host failed
              • Error 1006 - Connection to host failed
              • Error 1007 - SSH connection failed
              • Error 1011 - Access denied
              • Error 1012 - Replication access denied
              • Error 1017 - Connection to host failed
              • Error 1026 - Failed to connect to database
              • Error 1027 - Unsupported BinLog format
              • Failed to determine binlog filename/position
              • Schema 'xyz' is not tracked via bin logs
            • Errors Post-Pipeline Creation
              • Communications Link Failure from SELECT Queries
              • Insufficient max_allowed_packet Size
              • Out of Sort Memory
              • Pipeline failure due to BinLog expiry
          • MySQL FAQs
            • Why is there no activity in my BinLog Pipeline?
        • Oracle
          • Amazon RDS Oracle
          • Generic Oracle
          • Troubleshooting Oracle
            • Errors Post-Pipeline Creation
              • Pipeline failure due to Redo Log expiry
              • Pipeline failure due to deleted log files
        • PostgreSQL
          • Amazon Aurora PostgreSQL
          • Amazon RDS PostgreSQL
          • Azure PostgreSQL
          • Generic PostgreSQL
          • Google Cloud PostgreSQL
          • Heroku PostgreSQL
          • Troubleshooting PostgreSQL
            • Errors During Pipeline Creation
              • Error 1003 - Authentication failure
              • Error 1006 - Connection settings errors
              • Error 1011 - Access role issue for logical replication
              • Error 1012 - Access role issue for logical replication
              • Error 1014 - Database does not exist
              • Error 1017 - Connection settings errors
              • Error 1023 - No pg_hba.conf entry
              • Error 1024 - Number of requested standby connections
            • Errors Post-Pipeline Creation
              • Pipeline failure due to replication slot errors
              • Values for columns in table "table_name" are incompatible
          • PostgreSQL FAQs
            • Can I track updates to existing records in PostgreSQL?
            • How can I migrate a Pipeline created with one PostgreSQL Source variant to another variant?
            • How can I prevent data loss when migrating or upgrading my PostgreSQL database?
            • Why do FLOAT4 and FLOAT8 values in PostgreSQL show additional decimal places when loaded to BigQuery?
            • Why is data not being ingested from PostgreSQL Source objects?
        • Troubleshooting Database Sources
          • Events not found for future timestamp values in CDC mode
          • Historical data ingestion failing in Full Load mode
          • Query timed out for table/query
        • Database Source FAQs
          • How can I migrate a Pipeline created with one Database Source variant to another?
      • File Storage
        • Amazon S3
          • Amazon S3 FAQs
            • How do I load Amazon S3 folders as separate Event Types?