Overview

Publishing is the process of deploying your data mappings to production. When you publish, Entegrata generates Delta Live Tables (DLT) scripts and deploys them to your Databricks environment, where they run as automated data pipelines. This guide covers the complete publishing process, from pre-publish validation to monitoring deployed pipelines.

Understanding Publishing

What Happens When You Publish?

When you click Publish, Entegrata:
  1. Validates your mapping configuration
  2. Generates Delta Live Tables (DLT) Python code (see the sketch after this list)
  3. Deploys the code to your Databricks workspace
  4. Creates or updates the DLT pipeline
  5. Schedules the pipeline to run automatically
  6. Starts initial data processing
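
To make step 2 concrete, the following is a minimal sketch of what a generated DLT table definition could look like. It is not Entegrata's actual generated output; the table, schema, and column names are hypothetical placeholders.

    # Illustrative sketch only -- not Entegrata's actual generated output.
    # Table, schema, and column names below are hypothetical placeholders.
    import dlt
    from pyspark.sql import functions as F


    @dlt.table(
        name="customers_mapped",  # hypothetical target table
        comment="Mapped customer data generated from an Entegrata mapping",
    )
    def customers_mapped():
        # `spark` is provided by the Databricks runtime inside a DLT pipeline.
        source = spark.read.table("main.raw.customers")  # hypothetical source table
        # Apply the configured column mappings.
        return source.select(
            F.col("customer_id").alias("CustomerID"),
            F.col("region").alias("Region"),
            F.col("created_at").cast("date").alias("CreatedDate"),
        )

When the pipeline runs, Databricks materializes each decorated function as a managed table and tracks dependencies between datasets automatically.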

Publishing Requirements

Before publishing, your mapping must meet the following requirements:
  • At least one data source configured
  • No validation errors
  • Valid join conditions (if using related sources)
  • Proper permissions to publish
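
As a conceptual illustration of these requirements (not Entegrata's actual validation logic; the mapping structure, field names, and function are hypothetical), a pre-publish check might look like this:

    # Conceptual sketch of the pre-publish checks listed above.
    # The mapping structure and field names are hypothetical.
    def validate_for_publish(mapping: dict) -> list[str]:
        errors = []
        sources = mapping.get("sources", [])

        # At least one data source must be configured.
        if not sources:
            errors.append("At least one data source is required.")

        # Related sources need valid join conditions.
        if len(sources) > 1 and not mapping.get("join_conditions"):
            errors.append("Related sources require join conditions.")

        # Any outstanding validation errors block publishing.
        errors.extend(mapping.get("validation_errors", []))

        # Permission checks are performed by the platform and are not shown here.
        return errors

A mapping would only be publishable when a check like this returns no errors.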

Publishing Your Mapping

Step 1: Save Your Work

Click Save to ensure all changes are persisted. Publishing will fail if there are unsaved changes.
(Screenshot: Save button and saved indicator)
Step 2: Click Publish

Click the Publish button in the top toolbar.
Step 3: Verify Success

When publishing completes, you’ll see a success message confirming the deployment.
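
Beyond the in-app success message, you can also confirm the deployment from the Databricks side. The sketch below uses the Databricks SDK for Python (databricks-sdk) to list DLT pipelines and print the state of a matching pipeline; the pipeline name is a hypothetical placeholder, and workspace credentials (for example, DATABRICKS_HOST and DATABRICKS_TOKEN) are assumed to be configured.

    # Sketch: confirm the deployed pipeline exists in Databricks and check its state.
    # Assumes the databricks-sdk package is installed and credentials are configured.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()

    # The pipeline name below is a hypothetical placeholder.
    for p in w.pipelines.list_pipelines():
        if p.name == "entegrata_customer_mapping":
            print(p.pipeline_id, p.state)

The same information is visible in the Databricks pipelines UI.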