What can you do with Databricks and Airbyte?

Pipedream exposes prebuilt Databricks actions, including:

- Retrieve the output and metadata of a single task run. See the documentation.
- Run a job now and return the id of the triggered run. See the documentation. (A hedged sketch of this call follows the cluster-list example below.)
The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    // List the clusters in the workspace to confirm the connection works
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/clusters/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})
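The same pattern extends to the job actions listed above. Below is a minimal sketch, not Pipedream's published action code: it assumes a workspace with an existing job, and the jobId prop is a placeholder you would supply. It triggers the job through the Jobs API's run-now endpoint, then fetches the run's metadata and output with runs/get-output (the output fields populate once the run completes; for multi-task jobs, get-output expects the id of an individual task run rather than the parent run).

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    // Hypothetical prop: the numeric ID of an existing Databricks job
    jobId: {
      type: "integer",
      label: "Job ID",
    },
  },
  async run({ steps, $ }) {
    const base = `https://${this.databricks.$auth.domain}.cloud.databricks.com`
    const headers = {
      Authorization: `Bearer ${this.databricks.$auth.access_token}`,
    }
    // Trigger the job and capture the id of the newly created run
    const { run_id } = await axios($, {
      method: "POST",
      url: `${base}/api/2.1/jobs/run-now`,
      headers,
      data: { job_id: this.jobId },
    })
    // Retrieve the run's metadata and, once it finishes, its output
    return await axios($, {
      url: `${base}/api/2.1/jobs/runs/get-output`,
      headers,
      params: { run_id },
    })
  },
})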
The Airbyte API allows for creating and managing data integration pipelines between various sources and destinations, automating data synchronization tasks, and monitoring the status of those pipelines. On Pipedream, you can leverage the Airbyte API to build intricate workflows that react to data events, manipulate and store data, and connect to other services to create rich, automated data pipelines.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    https_airbyte_com: {
      type: "app",
      app: "https_airbyte_com",
    },
  },
  async run({ steps, $ }) {
    // List the connections in the Airbyte workspace to confirm the connection works
    return await axios($, {
      url: `${this.https_airbyte_com.$auth.url}/v1/connections`,
      headers: {
        Authorization: `Bearer ${this.https_airbyte_com.$auth.api_key}`,
      },
    })
  },
})
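To go beyond listing connections, the same credentials can drive the automation and monitoring tasks described above via Airbyte's jobs endpoints. Here is a minimal sketch under those assumptions: the connectionId prop is a placeholder for the UUID of one of your connections. It kicks off a sync with POST /v1/jobs and returns the created job, whose status can later be polled with GET /v1/jobs/{jobId}.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    https_airbyte_com: {
      type: "app",
      app: "https_airbyte_com",
    },
    // Hypothetical prop: the UUID of the Airbyte connection to sync
    connectionId: {
      type: "string",
      label: "Connection ID",
    },
  },
  async run({ steps, $ }) {
    // Trigger a sync job for the given connection; the response includes
    // a jobId that can be polled via GET /v1/jobs/{jobId} to monitor status
    return await axios($, {
      method: "POST",
      url: `${this.https_airbyte_com.$auth.url}/v1/jobs`,
      headers: {
        Authorization: `Bearer ${this.https_airbyte_com.$auth.api_key}`,
      },
      data: {
        connectionId: this.connectionId,
        jobType: "sync",
      },
    })
  },
})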