dbt is an intuitive, collaborative platform that lets you reliably transform data using SQL and Python code.
Retrieve information about an environment. See the documentation
Write custom Node.js code and use any of the 400k+ npm packages available. Refer to the Pipedream Node docs to learn more.
Retrieve information about a run artifact. See the documentation
Trigger a specified job to begin running. See the documentation
The dbt Cloud API allows users to initiate jobs, check on their status, and interact with dbt Cloud programmatically. On Pipedream, you can harness this functionality to automate workflows, such as triggering dbt runs, monitoring your data transformation jobs, and integrating dbt Cloud with other data services. By leveraging Pipedream's serverless platform, you can create custom workflows that act on dbt Cloud events or use the dbt Cloud API to manage your data transformation processes seamlessly.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    dbt: {
      type: "app",
      app: "dbt",
    },
  },
  async run({ steps, $ }) {
    // Use the account-specific access URL if one is configured; otherwise build
    // the base URL from the configured region
    const baseUrl = this.dbt.$auth.access_url || `https://${this.dbt.$auth.region}.com/`
    // List the accounts the authenticated dbt Cloud API token can access
    return await axios($, {
      url: `${baseUrl}api/v3/accounts/`,
      headers: {
        "Authorization": `Token ${this.dbt.$auth.api_key}`,
        "Accept": "application/json",
      },
    })
  },
})
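The same pattern can call other dbt Cloud endpoints, such as the job-trigger action described above. The sketch below is illustrative rather than a drop-in component: the accountId and jobId props and the cause message are assumptions you would replace with your own values, and the path follows the dbt Cloud v2 job run route; check the dbt Cloud API documentation for the exact fields your account requires.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    dbt: {
      type: "app",
      app: "dbt",
    },
    // Illustrative props: supply your own dbt Cloud account and job IDs
    accountId: {
      type: "string",
      label: "Account ID",
    },
    jobId: {
      type: "string",
      label: "Job ID",
    },
  },
  async run({ steps, $ }) {
    const baseUrl = this.dbt.$auth.access_url || `https://${this.dbt.$auth.region}.com/`
    // Trigger the job; the v2 run endpoint expects a JSON body with a `cause`
    return await axios($, {
      method: "POST",
      url: `${baseUrl}api/v2/accounts/${this.accountId}/jobs/${this.jobId}/run/`,
      headers: {
        "Authorization": `Token ${this.dbt.$auth.api_key}`,
        "Accept": "application/json",
      },
      data: {
        cause: "Triggered from a Pipedream workflow",
      },
    })
  },
})

Once a run is started this way, a later step (or a separate scheduled workflow) can check on its status through the API, as described in the overview above.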
Develop, run, and deploy your Node.js code in Pipedream workflows, using it between no-code steps, with connected accounts, or alongside Data Stores and File Stores. This includes installing npm packages within your code, without having to manage a package.json file or run npm install.

Below is an example of installing the axios package in a Pipedream Node.js code step. Pipedream imports the axios package, performs the API request, and shares the response with subsequent workflow steps:
// To use previous step data, pass the `steps` object to the run() function
// Importing an npm package makes Pipedream install it automatically
import axios from "axios"

export default defineComponent({
  async run({ steps, $ }) {
    // Perform the API request (the endpoint here is only an illustrative example)
    const { data } = await axios.get("https://api.github.com/repos/dbt-labs/dbt-core")
    // Return data to use it in future steps
    return data
  },
})
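The Data Stores mentioned above work the same way: you attach a store to the code step as a prop and read or write keys from your code. A minimal sketch, assuming a data store prop named store and an illustrative run_count key:

export default defineComponent({
  props: {
    // Attach a Pipedream Data Store to persist values between executions
    store: {
      type: "data_store",
    },
  },
  async run({ steps, $ }) {
    // Read the previous count (undefined on the first run), then increment it
    const runs = (await this.store.get("run_count")) ?? 0
    await this.store.set("run_count", runs + 1)
    // Return the updated count so later steps can use it
    return runs + 1
  },
})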