What can you build with WebScraper.IO and Inoreader?
Emit new event when a new article is added to a folder. See the documentation.
Emit new event when a new broadcasted article is added. See the documentation.
Emit new event when a new starred article is added. See the documentation.
Emit new event when a page scraping job has completed. See the documentation.
Emit new event when a new subscription is added. See the documentation.
Creates a scraping job (scrapes a sitemap). See the documentation. A sketch of this call appears after this list.
Creates a sitemap for the selected website. See the documentation.
Retrieves a list of scraping jobs for a sitemap. See the documentation.
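As a rough illustration of the "create scraping job" action above, the sketch below posts to the WebScraper.IO Cloud API from a Pipedream code step. The endpoint path, the sitemapId prop, and the driver/delay settings in the request body are assumptions based on WebScraper.IO's public API documentation, not values taken from this page; adjust them to match your own account and sitemap.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    webscraper_io: {
      type: "app",
      app: "webscraper_io",
    },
    // Hypothetical prop: the ID of an existing sitemap to scrape
    sitemapId: {
      type: "integer",
      label: "Sitemap ID",
    },
  },
  async run({ steps, $ }) {
    // Assumed endpoint: POST /api/v1/scraping-job starts a scraping job for a sitemap
    return await axios($, {
      method: "POST",
      url: "https://api.webscraper.io/api/v1/scraping-job",
      params: {
        api_token: this.webscraper_io.$auth.api_key,
      },
      data: {
        sitemap_id: this.sitemapId,
        driver: "fast",          // assumed option: "fast" or "fulljs"
        page_load_delay: 2000,   // assumed defaults, in milliseconds
        request_interval: 2000,
      },
    })
  },
})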
The WebScraper.IO API lets you programmatically run web scraping tasks and extract structured data from websites. With the API, you can automate the gathering of web content for analysis, monitoring, and integration with other data sources. In Pipedream, you can use this API to build workflows that process, analyze, and act on the data you scrape without managing any backend infrastructure.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    webscraper_io: {
      type: "app",
      app: "webscraper_io",
    },
  },
  async run({ steps, $ }) {
    // List the sitemaps in your WebScraper.IO account to confirm the connection works
    return await axios($, {
      url: `https://api.webscraper.io/api/v1/sitemaps`,
      params: {
        api_token: `${this.webscraper_io.$auth.api_key}`,
      },
    })
  },
})
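Once a scraping job finishes, its results can be pulled into the same workflow for processing. The sketch below downloads a job's scraped data; the /scraping-job/{id}/json endpoint and the newline-delimited JSON response format are assumptions drawn from WebScraper.IO's Cloud API documentation, and the scrapingJobId prop is a hypothetical input.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    webscraper_io: {
      type: "app",
      app: "webscraper_io",
    },
    // Hypothetical prop: the ID of a completed scraping job
    scrapingJobId: {
      type: "integer",
      label: "Scraping Job ID",
    },
  },
  async run({ steps, $ }) {
    // Assumed endpoint: returns the scraped records as newline-delimited JSON
    const raw = await axios($, {
      url: `https://api.webscraper.io/api/v1/scraping-job/${this.scrapingJobId}/json`,
      params: {
        api_token: this.webscraper_io.$auth.api_key,
      },
    })
    // Parse each non-empty line into an object so later steps can work with structured rows
    return String(raw)
      .split("\n")
      .filter((line) => line.trim())
      .map((line) => JSON.parse(line))
  },
})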
The Inoreader API exposes the functionality of the Inoreader content reader, letting you automate tasks like subscribing to new feeds, listing articles, or marking items as read. In Pipedream, you can use it to build custom workflows that integrate with other apps, trigger actions when new content arrives, and manage content consumption more efficiently.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    inoreader: {
      type: "app",
      app: "inoreader",
    },
  },
  async run({ steps, $ }) {
    // Fetch the authenticated user's profile to confirm the OAuth connection works
    return await axios($, {
      url: `https://www.inoreader.com/reader/api/0/user-info`,
      headers: {
        Authorization: `Bearer ${this.inoreader.$auth.oauth_access_token}`,
      },
    })
  },
})
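To go beyond the connection test, a workflow might pull recent articles and hand them to later steps (for example, to queue URLs for scraping). The sketch below reads items from the reading-list stream; the stream/contents path and the n parameter follow Inoreader's Google Reader-compatible API, but the exact stream ID (such as a folder label) is an assumption you would replace with your own.

import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    inoreader: {
      type: "app",
      app: "inoreader",
    },
  },
  async run({ steps, $ }) {
    // Assumed stream ID: the user's combined reading list; a folder would use
    // a label stream such as "user/-/label/Tech" instead
    const streamId = "user/-/state/com.google/reading-list"
    const res = await axios($, {
      url: `https://www.inoreader.com/reader/api/0/stream/contents/${encodeURIComponent(streamId)}`,
      headers: {
        Authorization: `Bearer ${this.inoreader.$auth.oauth_access_token}`,
      },
      params: {
        n: 20, // number of items to return
      },
    })
    // Return just the fields later steps are likely to need
    return (res.items || []).map((item) => ({
      title: item.title,
      url: item.canonical?.[0]?.href,
      published: item.published,
    }))
  },
})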