Tag: #azuredatabricks

  • Triggering Azure Data Factory (ADF) Pipelines from Databricks Notebooks

Overview: In modern data workflows, it's common to combine the orchestration capabilities of Azure Data Factory (ADF) with the powerful data processing of Databricks. This blog demonstrates how to trigger an ADF pipeline directly from a Databricks notebook using the REST API and Python. We'll cover required configurations and widgets, Azure AD authentication, pipeline trigger logic, …
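
    A minimal sketch of the flow the post describes, assuming a service principal authorized to run the factory. All IDs and names below are placeholders; in the post itself these values come from notebook widgets rather than being hard-coded:

    ```python
    import requests

    # Placeholder values; the post reads these from Databricks widgets
    # (dbutils.widgets.get) rather than hard-coding them.
    tenant_id = "<tenant-id>"
    client_id = "<service-principal-client-id>"
    client_secret = "<service-principal-secret>"
    subscription_id = "<subscription-id>"
    resource_group = "<resource-group>"
    factory_name = "<data-factory-name>"
    pipeline_name = "<pipeline-name>"

    # Azure AD authentication: client-credentials flow against the
    # Microsoft identity platform, scoped to Azure Resource Manager.
    token_resp = requests.post(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": "https://management.azure.com/.default",
        },
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # Pipeline trigger logic: the ADF createRun REST endpoint starts the
    # pipeline and returns a runId.
    run_url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun?api-version=2018-06-01"
    )
    run_resp = requests.post(
        run_url,
        headers={"Authorization": f"Bearer {access_token}"},
        json={},  # pipeline parameters, if any, go here
    )
    run_resp.raise_for_status()
    print("Started pipeline run:", run_resp.json()["runId"])
    ```

    The returned runId can then be polled via ADF's pipeline-runs endpoint if the notebook needs to wait for the run to finish.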

  • A Unifying Tool For Deployment Of Databricks

Overview: Databricks Asset Bundles are a way to develop, package, version, and deploy Databricks workspace artifacts (like notebooks, workflows, and libraries) using YAML-based configuration files. This allows for CI/CD integration and reproducible deployments across environments (dev/test/prod). What are Databricks Asset Bundles? Databricks Asset Bundles are an infrastructure-as-code (IaC) approach to managing your Databricks projects. …
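
    For context, a minimal databricks.yml along the lines the post describes might look like the sketch below; the bundle, job, notebook path, and workspace hosts are all illustrative:

    ```yaml
    # databricks.yml - minimal bundle definition (all names illustrative)
    bundle:
      name: my_project

    resources:
      jobs:
        nightly_etl:
          name: nightly-etl
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/etl_notebook
              new_cluster:
                spark_version: 15.4.x-scala2.12
                node_type_id: Standard_DS3_v2
                num_workers: 1

    # Per-environment targets make dev/test/prod deployments reproducible.
    targets:
      dev:
        mode: development
        workspace:
          host: https://adb-1111111111111111.1.azuredatabricks.net
      prod:
        mode: production
        workspace:
          host: https://adb-2222222222222222.2.azuredatabricks.net
    ```

    Deploying to a given environment is then a single CLI step, e.g. `databricks bundle deploy -t dev`, which is what makes bundles a natural fit for CI/CD pipelines.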