What can you do with dbt Cloud and WebScraper.IO?

Example actions available in this integration:
- Retrieve information about an environment (dbt Cloud)
- Create a scraping job that scrapes a sitemap (WebScraper.IO)
- Create a sitemap for a selected website (WebScraper.IO)
- Retrieve information about a run artifact (dbt Cloud)
The dbt Cloud API allows users to initiate jobs, check on their status, and interact with dbt Cloud programmatically. On Pipedream, you can harness this functionality to automate workflows, such as triggering dbt runs, monitoring your data transformation jobs, and integrating dbt Cloud with other data services. By leveraging Pipedream's serverless platform, you can create custom workflows that act on dbt Cloud events or use the dbt Cloud API to manage your data transformation processes seamlessly.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    dbt: {
      type: "app",
      app: "dbt",
    },
  },
  async run({ steps, $ }) {
    // Use the account-specific access URL if one is set; otherwise fall back
    // to the regional dbt Cloud host
    const baseUrl = this.dbt.$auth.access_url || `https://${this.dbt.$auth.region}.com/`
    // List the dbt Cloud accounts this API token can access
    return await axios($, {
      url: `${baseUrl}api/v3/accounts/`,
      headers: {
        "Authorization": `Token ${this.dbt.$auth.api_key}`,
        "Accept": "application/json",
      },
    })
  },
})
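The same pattern extends to the rest of the dbt Cloud API. As a minimal sketch of the "triggering dbt runs" use case mentioned above, the component below POSTs to dbt Cloud's v2 trigger-run endpoint; the accountId and jobId props are hypothetical placeholders you would fill with IDs from your own dbt Cloud account.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    dbt: {
      type: "app",
      app: "dbt",
    },
    // Hypothetical props: supply your own account and job IDs from dbt Cloud
    accountId: {
      type: "string",
      label: "dbt Cloud account ID",
    },
    jobId: {
      type: "string",
      label: "dbt Cloud job ID",
    },
  },
  async run({ steps, $ }) {
    const baseUrl = this.dbt.$auth.access_url || `https://${this.dbt.$auth.region}.com/`
    // POST to the v2 trigger-run endpoint; "cause" is a required,
    // human-readable label for why the run was started
    return await axios($, {
      method: "POST",
      url: `${baseUrl}api/v2/accounts/${this.accountId}/jobs/${this.jobId}/run/`,
      headers: {
        "Authorization": `Token ${this.dbt.$auth.api_key}`,
        "Accept": "application/json",
      },
      data: {
        cause: "Triggered from a Pipedream workflow",
      },
    })
  },
})

The response includes the queued run, whose ID a later step could use to poll for status.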
The WebScraper.IO API allows you to programmatically perform web scraping tasks, extracting structured data from websites. With the API, you can automate the gathering of web content for analysis, monitoring, and integration with other data sources. In Pipedream, you can leverage this API to build workflows that process, analyze, and act on the data you scrape without writing code for backend infrastructure.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    webscraper_io: {
      type: "app",
      app: "webscraper_io",
    },
  },
  async run({ steps, $ }) {
    // List all sitemaps in the connected WebScraper.IO account;
    // the API authenticates via the api_token query parameter
    return await axios($, {
      url: "https://api.webscraper.io/api/v1/sitemaps",
      params: {
        api_token: this.webscraper_io.$auth.api_key,
      },
    })
  },
})
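To go beyond listing sitemaps, here is a sketch of the "create a scraping job" action listed above, using WebScraper.IO's scraping-job endpoint; the sitemapId prop is a hypothetical placeholder, and the driver and delay values are illustrative defaults rather than required settings.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    webscraper_io: {
      type: "app",
      app: "webscraper_io",
    },
    // Hypothetical prop: the ID of an existing sitemap in your account
    sitemapId: {
      type: "integer",
      label: "Sitemap ID",
    },
  },
  async run({ steps, $ }) {
    // Start a scraping job for the given sitemap
    return await axios($, {
      method: "POST",
      url: "https://api.webscraper.io/api/v1/scraping-job",
      params: {
        api_token: this.webscraper_io.$auth.api_key,
      },
      data: {
        sitemap_id: this.sitemapId,
        driver: "fast", // "fast" for static pages, "fulljs" for JavaScript-rendered pages
        page_load_delay: 2000,
        request_interval: 2000,
      },
    })
  },
})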