What can you do with Google Cloud and Databricks?
Cancel all active runs for a job. The runs are canceled asynchronously, so cancellation doesn't prevent new runs from being started. See the documentation, and the sketch after this list.
Cancel a job run. The run is canceled asynchronously, so it may still be running when this request completes. See the documentation.
Insert rows into a BigQuery table. See the documentation; a BigQuery sketch also appears below.
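As a rough illustration, here is a minimal sketch of how both cancel calls might look from a Pipedream step, assuming the Databricks Jobs API 2.1 endpoints runs/cancel-all and runs/cancel; the job_id and run_id values are hypothetical placeholders you would replace with your own.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    const base = `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1`
    const headers = {
      Authorization: `Bearer ${this.databricks.$auth.access_token}`,
    }

    // Cancel every active run for a job. Cancellation is asynchronous,
    // so new runs can still be started afterwards.
    await axios($, {
      method: "POST",
      url: `${base}/jobs/runs/cancel-all`,
      headers,
      data: { job_id: 123 }, // hypothetical job ID
    })

    // Cancel a single run. The run may still be executing when this
    // request returns.
    await axios($, {
      method: "POST",
      url: `${base}/jobs/runs/cancel`,
      headers,
      data: { run_id: 456 }, // hypothetical run ID
    })
  },
})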
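And here is a minimal sketch of a BigQuery streaming insert using the @google-cloud/bigquery client, assuming the same key_json service account credentials used in the Storage example below; the my_dataset and my_table names are hypothetical.

module.exports = defineComponent({
  props: {
    google_cloud: {
      type: "app",
      app: "google_cloud",
    },
  },
  async run({ steps, $ }) {
    // Same workaround the Storage example below uses; it may also be
    // needed for @google-cloud/bigquery (assumption)
    require("@dylburger/umask")()

    const { BigQuery } = require("@google-cloud/bigquery")

    // Parse the service account key JSON from the connected account
    const key = JSON.parse(this.google_cloud.$auth.key_json)
    const bigquery = new BigQuery({
      projectId: key.project_id,
      credentials: {
        client_email: key.client_email,
        private_key: key.private_key,
      },
    })

    // Streaming insert: each object in the array becomes one row.
    // The dataset and table names are hypothetical placeholders.
    await bigquery
      .dataset("my_dataset")
      .table("my_table")
      .insert([{ name: "example", created_at: new Date().toISOString() }])

    return { status: "rows inserted" }
  },
})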
The Google Cloud API opens a world of possibilities for enhancing cloud operations and automating tasks. It empowers you to manage, scale, and fine-tune various services within the Google Cloud Platform (GCP) programmatically. With Pipedream, you can harness this power to create intricate workflows, trigger cloud functions based on events from other apps, manage resources, and analyze data, all in a serverless environment. The ability to interconnect GCP services with numerous other apps enriches automation, making it easier to synchronize data, streamline development workflows, and deploy applications efficiently.
module.exports = defineComponent({
  props: {
    google_cloud: {
      type: "app",
      app: "google_cloud",
    },
  },
  async run({ steps, $ }) {
    // Required workaround to get the @google-cloud/storage package
    // working correctly on Pipedream
    require("@dylburger/umask")()

    const { Storage } = require("@google-cloud/storage")

    // Parse the service account key JSON stored on the connected account
    const key = JSON.parse(this.google_cloud.$auth.key_json)

    // Instantiate a Storage client with the service account credentials
    const storage = new Storage({
      projectId: key.project_id,
      credentials: {
        client_email: key.client_email,
        private_key: key.private_key,
      },
    })

    // Fetch the credentials to confirm authentication succeeded
    await storage.authClient.getCredentials()

    return {
      status: "success",
      authenticated: true,
      projectId: key.project_id,
      serviceAccount: key.client_email,
    }
  },
})

The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({ steps, $ }) {
    // Call the SCIM Me endpoint to confirm the token works and
    // identify the authenticated user
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/preview/scim/v2/Me`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})