Web Scraping API & Rotating Proxy Servers. Turn Any Website Into Data with Undetected Scraping Technology 🔥.
Create a new document in a MongoDB collection of your choice.
Scrape the HTML of a URL with CSS selectors using ZenRows.
The ZenRows API specializes in web scraping, handling obstacles like CAPTCHAs and anti-bot measures while taking care of JavaScript rendering and proxy rotation to ensure successful data extraction. In Pipedream, you can pair the ZenRows API with numerous other services to create automated workflows that respond to events, process and analyze scraped data, or even trigger actions based on the data collected. Whether you need to monitor changes on web pages, aggregate content for analysis, or feed scraped data into other applications, ZenRows' integration on Pipedream simplifies these tasks.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    zenrows: {
      type: "app",
      app: "zenrows",
    },
  },
  async run({ steps, $ }) {
    // Call the ZenRows API with the connected account's API key and the target URL
    return await axios($, {
      url: "https://api.zenrows.com/v1/",
      params: {
        apikey: this.zenrows.$auth.api_key,
        url: "https://httpbin.io/anything",
      },
    })
  },
})
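The same endpoint also powers the "Scrape the HTML of a URL with CSS selectors" action listed above: extra query parameters control rendering and extraction. The sketch below is a minimal variation of the previous step, assuming ZenRows' js_render and css_extractor parameters and using a placeholder target URL and selector; adjust both to your use case.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    zenrows: {
      type: "app",
      app: "zenrows",
    },
  },
  async run({ steps, $ }) {
    return await axios($, {
      url: "https://api.zenrows.com/v1/",
      params: {
        apikey: this.zenrows.$auth.api_key,
        // Placeholder target page; replace with the URL you want to scrape
        url: "https://httpbin.io/html",
        // Render the page in a headless browser before extraction (assumed ZenRows parameter)
        js_render: "true",
        // JSON map of output keys to CSS selectors (assumed ZenRows parameter)
        css_extractor: JSON.stringify({ headings: "h1" }),
      },
    })
  },
})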
The MongoDB API provides powerful capabilities to interact with a MongoDB database, allowing you to perform CRUD (Create, Read, Update, Delete) operations, manage databases, and execute sophisticated queries. With Pipedream, you can harness these abilities to automate tasks, sync data across various apps, and react to events in real-time. It’s a combo that’s particularly potent for managing data workflows, syncing application states, or triggering actions based on changes to your data.
import mongodb from "mongodb"

export default defineComponent({
  props: {
    mongodb: {
      type: "app",
      app: "mongodb",
    },
    collection: {
      type: "string",
    },
    filter: {
      type: "object",
    },
  },
  async run({ steps, $ }) {
    const { MongoClient } = mongodb
    const {
      database,
      hostname,
      username,
      password,
    } = this.mongodb.$auth
    // Build the connection string from the connected account's credentials,
    // encoding the username and password in case they contain special characters
    const url = `mongodb+srv://${encodeURIComponent(username)}:${encodeURIComponent(password)}@${hostname}/${database}?retryWrites=true&w=majority`
    // The legacy useNewUrlParser/useUnifiedTopology options are no-ops on driver v4+ and can be dropped
    const client = await MongoClient.connect(url)
    try {
      // Query the selected collection with the given filter and export the matching documents
      const db = client.db(database)
      const results = await db.collection(this.collection).find(this.filter).toArray()
      $.export("results", results)
      return results
    } finally {
      // Always close the connection, even if the query fails
      await client.close()
    }
  },
})
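To pair the two integrations in one workflow, a later step can write the scraped payload into MongoDB with insertOne, which corresponds to the "Create a new document in a collection of your choice" action listed above. The sketch below makes the same auth assumptions as the step above; steps.zenrows_scrape is a hypothetical name for the earlier ZenRows step, so substitute the real step name from your workflow.

import mongodb from "mongodb"

export default defineComponent({
  props: {
    mongodb: {
      type: "app",
      app: "mongodb",
    },
    collection: {
      type: "string",
    },
  },
  async run({ steps, $ }) {
    const { MongoClient } = mongodb
    const { database, hostname, username, password } = this.mongodb.$auth
    const url = `mongodb+srv://${encodeURIComponent(username)}:${encodeURIComponent(password)}@${hostname}/${database}?retryWrites=true&w=majority`
    const client = await MongoClient.connect(url)
    try {
      // Store the scraped payload from a previous step; "zenrows_scrape" is a placeholder step name
      const doc = {
        scrapedAt: new Date(),
        data: steps.zenrows_scrape.$return_value,
      }
      const { insertedId } = await client.db(database).collection(this.collection).insertOne(doc)
      $.export("insertedId", insertedId)
      return insertedId
    } finally {
      await client.close()
    }
  },
})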