How do I use FireCrawl with the Database API on Pipedream?
Crawls a given URL and returns the contents of sub-pages. See the documentation
Extracts structured data from one or more URLs. See the documentation
Obtains the status and data from a previous crawl operation. See the documentation
Obtains the status and data from a previous extract operation. See the documentation
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    firecrawl: {
      type: "app",
      app: "firecrawl",
    },
  },
  async run({ steps, $ }) {
    // Start a crawl of the target URL; Firecrawl queues a crawl job and returns its ID
    const data = {
      url: "https://pipedream.com",
    }
    return await axios($, {
      method: "post",
      url: "https://api.firecrawl.dev/v0/crawl",
      headers: {
        Authorization: `Bearer ${this.firecrawl.$auth.api_key}`,
      },
      data,
    })
  },
})
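The crawl request above returns a job ID rather than the crawled pages themselves. The following sketch obtains the status and data of a previous crawl, as described in the operations listed earlier. It assumes the v0 status endpoint at https://api.firecrawl.dev/v0/crawl/status/{jobId}, and the step name start_crawl and the jobId response field are placeholders to adapt to your own workflow.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    firecrawl: {
      type: "app",
      app: "firecrawl",
    },
  },
  async run({ steps, $ }) {
    // The step name "start_crawl" and the "jobId" field are placeholders; adjust to match your workflow
    const jobId = steps.start_crawl.$return_value.jobId;
    // Fetch the crawl's status and, once it has finished, its data
    return await axios($, {
      method: "get",
      url: `https://api.firecrawl.dev/v0/crawl/status/${jobId}`,
      headers: {
        Authorization: `Bearer ${this.firecrawl.$auth.api_key}`,
      },
    })
  },
})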
The Database API on Pipedream allows users to execute SQL commands directly within workflows, enabling rich and dynamic data manipulation and storage. It supports PostgreSQL, MySQL, and SQLite, making it a versatile option for managing data across various database systems. With this API, users can perform tasks such as data insertion, querying, updates, and deletions directly within their automations, facilitating real-time data processing and integration across multiple platforms.
import postgresql from "@pipedream/postgresql";

export default {
  props: {
    postgresql,
    sql: {
      type: "sql",
      auth: {
        app: "postgresql",
      },
      label: "PostgreSQL Query",
    },
  },
  async run({ $ }) {
    // Convert the SQL prop's value into the arguments expected by executeQuery
    const args = this.postgresql.executeQueryAdapter(this.sql);
    // Run the query against the connected PostgreSQL database and return the rows
    const data = await this.postgresql.executeQuery(args);
    return data;
  },
};
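The sql prop above already captures both the query text and any parameters the user supplies in the workflow builder. For a query built in code instead, such as inserting a row, the following sketch passes a parameterized query to the same app. It assumes executeQuery also accepts a { text, values } object in the node-postgres style, and the pages table and its columns are hypothetical.

import postgresql from "@pipedream/postgresql";

export default {
  props: {
    postgresql,
  },
  async run({ $ }) {
    // Hypothetical table and columns; values are bound as parameters rather than interpolated into the SQL string
    const query = {
      text: "INSERT INTO pages (url, content) VALUES ($1, $2) RETURNING id",
      values: ["https://pipedream.com", "example content"],
    };
    return await this.postgresql.executeQuery(query);
  },
};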