What can you build with Databricks and Celonis EMS?
This integration includes prebuilt Databricks actions:
- Creates a new SQL Warehouse in Databricks. See the documentation.
- Edits the configuration of an existing SQL Warehouse. See the documentation.
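These actions correspond to calls against the Databricks SQL Warehouses REST API. As a rough sketch (not the action's actual implementation), creating a warehouse directly from a Pipedream code step could look like the component below; the warehouse name, cluster size, and auto-stop values are placeholders to adjust.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    }
  },
  async run({steps, $}) {
    // Sketch: create a SQL Warehouse. Field values are illustrative;
    // check the SQL Warehouses API reference for the full schema.
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/sql/warehouses`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        name: "pipedream-warehouse", // placeholder name
        cluster_size: "2X-Small",    // placeholder size
        auto_stop_mins: 30,          // placeholder idle auto-stop
      },
    })
  },
})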
The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage this API to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    }
  },
  async run({steps, $}) {
    // Test the connection by fetching the authenticated user's SCIM profile
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/preview/scim/v2/Me`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})
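The same pattern extends to the job scenarios mentioned above. For instance, a step can trigger an existing job run whenever an upstream event fires; the sketch below assumes a jobId prop that you supply and uses the Jobs API run-now endpoint.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    // ID of an existing Databricks job to trigger (supplied by you)
    jobId: {
      type: "integer",
      label: "Job ID",
    }
  },
  async run({steps, $}) {
    // Sketch: kick off a run of an existing job via the Jobs API
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: this.jobId,
      },
    })
  },
})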
The Celonis EMS API allows you to harness the power of process mining and execution management within your workflows. Used within Pipedream, it lets you automate actions based on process insights, such as identifying bottlenecks and initiating corrective measures. You can trigger workflows from Celonis data, send data back into Celonis for deeper analysis, or combine data from multiple sources for rich, actionable insights.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    celonis_ems: {
      type: "app",
      app: "celonis_ems",
    }
  },
  async run({steps, $}) {
    // Test the connection by listing Knowledge Models from the EMS Intelligence API
    return await axios($, {
      url: `https://${this.celonis_ems.$auth.team}.${this.celonis_ems.$auth.cluster}.celonis.cloud/intelligence/api/knowledge-models`,
      headers: {
        Authorization: `Bearer ${this.celonis_ems.$auth.api_key}`,
      },
    })
  },
})
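Because a single Pipedream step can reference both connected apps, you can also combine them, for example by checking Celonis Knowledge Models and kicking off a Databricks job when a model of interest exists. The sketch below is illustrative only: the model name, the jobId prop, and the assumed shape of the Knowledge Models response are all placeholders to adapt.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    celonis_ems: {
      type: "app",
      app: "celonis_ems",
    },
    databricks: {
      type: "app",
      app: "databricks",
    },
    // ID of the Databricks job to run when the model is found (placeholder prop)
    jobId: {
      type: "integer",
      label: "Databricks Job ID",
    }
  },
  async run({steps, $}) {
    // List Knowledge Models from Celonis EMS (same endpoint as the example above)
    const models = await axios($, {
      url: `https://${this.celonis_ems.$auth.team}.${this.celonis_ems.$auth.cluster}.celonis.cloud/intelligence/api/knowledge-models`,
      headers: {
        Authorization: `Bearer ${this.celonis_ems.$auth.api_key}`,
      },
    })

    // "order-to-cash" is an illustrative name; adjust the check to match
    // how Knowledge Models are identified in your team's response
    const found = Array.isArray(models)
      && models.some((m) => `${m.name || m.id || ""}`.includes("order-to-cash"))

    if (!found) {
      $.export("summary", "No matching Knowledge Model found; skipping Databricks job.")
      return
    }

    // Trigger the Databricks job via the Jobs API run-now endpoint
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: this.jobId,
      },
    })
  },
})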