Databricks

Databricks is the lakehouse company, helping data teams solve the world’s toughest problems.

Integrate the Databricks API with the Schedule API

Set up the Databricks API trigger to run a workflow that integrates with the Schedule API. Pipedream's integration platform allows you to integrate Databricks and Schedule remarkably fast. Free for developers.

Get Run Output with Databricks API on Custom Interval from Schedule API
Get Run Output with Databricks API on Daily schedule from Schedule API
Get Run Output with Databricks API on Monthly Schedule from Schedule API
Get Run Output with Databricks API on Weekly schedule from Schedule API
List Runs with Databricks API on Custom Interval from Schedule API

Custom Interval from the Schedule API

Trigger your workflow every N hours, minutes, or seconds.

Daily schedule from the Schedule API

Trigger your workflow every day.

Monthly Schedule from the Schedule API

Trigger your workflow on one or more days each month at a specific time (with timezone support).

Weekly schedule from the Schedule API

Trigger your workflow on one or more days each week at a specific time (with timezone support).

Get Run Output with the Databricks API

Retrieve the output and metadata of a single task run. See the documentation.

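For reference, here is a minimal sketch of calling the underlying Jobs API endpoint directly from a Pipedream code step. It reuses the auth pattern from the Connect Databricks example below, and the runId prop is a hypothetical placeholder for the task run you want to inspect.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    runId: {
      type: "string",
      label: "Run ID",
      description: "Identifier of the task run to fetch output for",
    },
  },
  async run({steps, $}) {
    // GET /api/2.1/jobs/runs/get-output returns the output and metadata of a single task run
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/runs/get-output`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      params: {
        run_id: this.runId,
      },
    })
  },
})
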
List Runs with the Databricks API

List all runs available to the user. See the documentation.

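A comparable sketch for listing runs, assuming the same connected Databricks account; the limit value here is only illustrative, and the 2.1 Jobs API also accepts filters such as job_id.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({steps, $}) {
    // GET /api/2.1/jobs/runs/list returns runs visible to the authenticated user, newest first
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/runs/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      params: {
        limit: 25,
      },
    })
  },
})
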
Run Job Now with the Databricks API

Run a job now and return the ID of the triggered run. See the documentation.

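And a sketch of triggering a job on demand; jobId is a hypothetical placeholder for an existing job in your workspace, and the response body includes the run_id of the run that was started.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    jobId: {
      type: "integer",
      label: "Job ID",
      description: "Identifier of the job to trigger",
    },
  },
  async run({steps, $}) {
    // POST /api/2.1/jobs/run-now starts a run of an existing job and returns its run_id
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: this.jobId,
      },
    })
  },
})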

Overview of Databricks

The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.

Connect Databricks

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    }
  },
  async run({steps, $}) {
    // Make an authenticated request against the workspace REST API to list clusters
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/clusters/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})

Overview of Schedule

The Schedule app in Pipedream is a powerful tool that allows you to trigger workflows at regular intervals, ranging from every minute to once a year. This enables the automation of repetitive tasks and the scheduling of actions to occur without manual intervention. By leveraging this API, you can execute code, run integrations, and process data on a reliable schedule, all within Pipedream's serverless environment.