Databricks

Databricks is the lakehouse company, helping data teams solve the world’s toughest problems.

Integrate the Databricks API with the MySQL API

Set up the Databricks API trigger to run a workflow that integrates with the MySQL API. Pipedream's integration platform allows you to integrate Databricks and MySQL remarkably fast. Free for developers.

Get Run Output with Databricks API on New Column from MySQL API
MySQL + Databricks
Try it

Get Run Output with Databricks API on New or Updated Row from MySQL API
MySQL + Databricks
Try it

Get Run Output with Databricks API on New Row (Custom Query) from MySQL API
MySQL + Databricks
Try it

Get Run Output with Databricks API on New Row from MySQL API
MySQL + Databricks
Try it

Get Run Output with Databricks API on New Table from MySQL API
MySQL + Databricks
Try it
New Column from the MySQL API

Emit new event when you add a new column to a table. See the docs here

 
Try it
New or Updated Row from the MySQL API

Emit new event when you add a new row to a table or modify an existing one. See the docs here

 
Try it
New Row from the MySQL API

Emit new event when you add a new row to a table. See the docs here

 
Try it
New Row (Custom Query) from the MySQL API

Emit new event when new rows are returned from a custom query. See the docs here

 
Try it
New Table from the MySQL API

Emit new event when a new table is added to a database. See the docs here

 
Try it
Get Run Output with the Databricks API

Retrieve the output and metadata of a single task run. See the documentation

 
Try it
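
If you'd rather make this call yourself from a code step instead of using the prebuilt action, a minimal sketch might look like the following. It assumes the Databricks Jobs API 2.1 runs/get-output endpoint and a hypothetical runId prop; the auth fields match the Connect Databricks snippet below.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    // Hypothetical prop: the ID of the task run to inspect
    runId: {
      type: "integer",
      label: "Run ID",
    },
  },
  async run({steps, $}) {
    // Retrieve the output and metadata of a single task run
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/runs/get-output`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      params: {
        run_id: this.runId,
      },
    })
  },
})
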
Create Row with the MySQL API

Adds a new row. See the docs here

 
Try it
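
As a rough sketch, the same insert can be done from a code step with the MySQL component's executeQuery helper shown in the Connect MySQL example below; the users table and its columns here are hypothetical, so adjust them to your schema.

import mysql from '@pipedream/mysql';

export default defineComponent({
  props: {
    mysql,
  },
  async run({steps, $}) {
    // Insert one row; the `?` placeholders are filled from the values array
    const queryObj = {
      sql: "INSERT INTO users (name, email) VALUES (?, ?)",
      values: ["Ada Lovelace", "ada@example.com"],
    };
    return await this.mysql.executeQuery(queryObj);
  },
});
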
List Runs with the Databricks API

Lists all runs available to the user. See the documentation

 
Try it
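
A minimal code-step sketch of the same call, assuming the Databricks Jobs API 2.1 runs/list endpoint; the limit value is just an example page size.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({steps, $}) {
    // List job runs visible to the authenticated user
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/runs/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      params: {
        limit: 25,
      },
    })
  },
})
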
Delete Row with the MySQL API

Delete an existing row. See the docs here

 
Try it
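
A comparable code-step sketch using executeQuery; the table name and WHERE clause are hypothetical.

import mysql from '@pipedream/mysql';

export default defineComponent({
  props: {
    mysql,
  },
  async run({steps, $}) {
    // Delete the rows matching the placeholder value
    const queryObj = {
      sql: "DELETE FROM users WHERE id = ?",
      values: [42],
    };
    return await this.mysql.executeQuery(queryObj);
  },
});
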
Run Job Now with the Databricks API

Run a job now and return the id of the triggered run. See the documentation

 
Try it
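
Sketched as a code step, assuming the Databricks Jobs API 2.1 run-now endpoint and a hypothetical jobId prop; the response contains the id of the triggered run.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    // Hypothetical prop: the ID of the job to trigger
    jobId: {
      type: "integer",
      label: "Job ID",
    },
  },
  async run({steps, $}) {
    // Trigger the job and return the response, which includes the run_id
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: this.jobId,
      },
    })
  },
})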

Overview of Databricks

The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.

Connect Databricks

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    }
  },
  async run({steps, $}) {
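    // List the clusters in the workspace as a simple test of the connection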
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/clusters/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})
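
In this snippet, the domain and access_token values come from your connected Databricks account, and the request simply lists clusters to confirm the connection works; any other Databricks REST endpoint can be called the same way by changing the url.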

Overview of MySQL

The MySQL application on Pipedream enables direct interaction with your MySQL databases, allowing you to perform CRUD operations—create, read, update, delete—on your data with ease. You can leverage these capabilities to automate data synchronization, report generation, and event-based triggers that kick off workflows in other apps. With Pipedream's serverless platform, you can connect MySQL to hundreds of other services without managing infrastructure, crafting complex code, or handling authentication.

Connect MySQL

import mysql from '@pipedream/mysql';

export default defineComponent({
  props: {
    mysql,
  },
  async run({steps, $}) {
    // Component source code:
    // https://github.com/PipedreamHQ/pipedream/tree/master/components/mysql

    const queryObj = {
      sql: "SELECT NOW()",
      values: [], // Ignored since query does not contain placeholders
    };
    const { rows } = await this.mysql.executeQuery(queryObj);
    return rows;
  },
});
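
The values array comes into play when the SQL contains ? placeholders, as in the Create Row and Delete Row sketches above; here it is empty because SELECT NOW() takes no parameters.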