What can you build with Databricks and PagerDuty?
Creates a new SQL Warehouse in Databricks. See the documentation; a sketch of the underlying API call is shown below.
Finds the user on call for a specific schedule. See the docs; a matching API sketch follows the PagerDuty example further down.
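If the prebuilt Create SQL Warehouse action doesn't expose a setting you need, you can call Databricks' SQL Warehouses endpoint directly from a Node.js code step in Pipedream. The sketch below is illustrative only: the warehouse name, size, and auto-stop values are placeholders, and the request body follows the public POST /api/2.0/sql/warehouses endpoint, so adjust the fields to match your workspace.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    // Create a small SQL Warehouse; the name, size, and limits below are placeholders.
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/sql/warehouses`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        name: "pipedream-warehouse", // placeholder name
        cluster_size: "2X-Small",    // smallest size tier
        auto_stop_mins: 30,          // stop when idle to control cost
        max_num_clusters: 1,
      },
    })
  },
})
The response should include the new warehouse's ID, which later steps in the workflow can read from this step's exports.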
The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    // Verify the connection by fetching the authenticated user's SCIM profile.
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/preview/scim/v2/Me`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})
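The same authenticated request pattern works for any other Databricks REST endpoint. For example, to run a job in response to an external trigger, you might call the Jobs API's run-now endpoint; in this sketch the job_id is a placeholder for an existing job in your workspace.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    // Trigger an existing Databricks job (Jobs API 2.1); replace job_id with your own.
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: 123456789, // placeholder: the numeric ID of a job in your workspace
      },
    })
  },
})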
The PagerDuty API offers a powerful interface to automate your digital operations management. By leveraging its capabilities on Pipedream, you can create workflows that respond to incidents, automate alerts, and synchronize incident data across various platforms. PagerDuty's API enables you to manage services, teams, and incidents, ensuring that your systems remain operational and that the right people are notified at the right time.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    pagerduty: {
      type: "app",
      app: "pagerduty",
    },
  },
  async run({ steps, $ }) {
    // Confirm the connection by fetching the authenticated PagerDuty user.
    return await axios($, {
      url: `https://api.pagerduty.com/users/me`,
      headers: {
        Authorization: `Bearer ${this.pagerduty.$auth.oauth_access_token}`,
        "Accept": `application/vnd.pagerduty+json;version=2`,
      },
    })
  },
})
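Building on that connection, the on-call lookup described near the top of this page boils down to a query against PagerDuty's /oncalls endpoint. The schedule ID below is a placeholder; the response lists the on-call entries for that schedule, each with the associated user, escalation level, and time window.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    pagerduty: {
      type: "app",
      app: "pagerduty",
    },
  },
  async run({ steps, $ }) {
    // Look up who is currently on call for one schedule; replace the placeholder schedule ID.
    return await axios($, {
      url: `https://api.pagerduty.com/oncalls`,
      headers: {
        Authorization: `Bearer ${this.pagerduty.$auth.oauth_access_token}`,
        "Accept": `application/vnd.pagerduty+json;version=2`,
      },
      params: {
        "schedule_ids[]": "PXXXXXX", // placeholder schedule ID
      },
    })
  },
})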