Notion is a tool that blends your everyday work apps into one: the all-in-one workspace for you and your team.
Emit new event when a new comment is created in a page or block. See the documentation
Emit new event when a database is created. See the documentation
Emit new event when a page is created or updated in the selected database. See the documentation
Emit new event when a page is created in the selected database. See the documentation
Emit new event when the selected page or one of its sub-pages is updated. See the documentation
Append new and/or existing blocks to the specified parent. See the documentation
Retrieve data from a social media scraping job by responseId. See the documentation
Create a comment in a page or existing discussion thread (a minimal sketch of the underlying API call follows this list). See the documentation
Use ScrapingBot API to initiate scraping data from a social media site. See the documentation
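As a concrete illustration of the actions above, the "create a comment" operation maps to a single Notion API call. The sketch below is illustrative, not the prebuilt component: the pageId prop and the sample comment text are placeholders, and the comments endpoint requires Notion API version 2022-06-28 or later.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
    // Placeholder: the ID of the page to comment on
    pageId: {
      type: "string",
      label: "Page ID",
    },
  },
  async run({ steps, $ }) {
    // Create a top-level comment on the page; commenting in an existing
    // discussion thread would pass a discussion_id instead of a parent
    return await axios($, {
      method: "post",
      url: `https://api.notion.com/v1/comments`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2022-06-28`,
        "Content-Type": "application/json",
      },
      data: {
        parent: { page_id: this.pageId },
        rich_text: [{ text: { content: "Hello from Pipedream!" } }],
      },
    })
  },
})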
Notion's API allows for the creation, reading, updating, and deleting of pages, databases, and their contents within Notion. Using Pipedream's platform, you can build workflows that connect Notion with various other services to automate tasks such as content management, task tracking, and data synchronization. With Pipedream's serverless execution, you can trigger these workflows on a schedule, or by external events from other services, without managing any infrastructure.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
  },
  async run({ steps, $ }) {
    // Fetch the bot user tied to the connected OAuth token to confirm the connection works
    return await axios($, {
      url: `https://api.notion.com/v1/users/me`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2021-08-16`,
      },
    })
  },
})
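The same pattern extends to the other Notion operations described above, such as reading pages out of a database. The sketch below is an illustration under stated assumptions: the databaseId prop is a placeholder you would supply, and the page_size value is arbitrary.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
    // Placeholder: the ID of the database to query
    databaseId: {
      type: "string",
      label: "Database ID",
    },
  },
  async run({ steps, $ }) {
    // Query the database for up to 10 pages; filters and sorts can be added to the body as needed
    return await axios($, {
      method: "post",
      url: `https://api.notion.com/v1/databases/${this.databaseId}/query`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2021-08-16`,
        "Content-Type": "application/json",
      },
      data: {
        page_size: 10,
      },
    })
  },
})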
ScrapingBot API on Pipedream allows you to scrape websites without getting blocked, fetching crucial information while bypassing common defenses. Whether you're extracting product details, real estate listings, or automating competitor research, this API combined with Pipedream's serverless platform offers you the tools to automate these tasks efficiently. Pipedream's ability to trigger workflows via HTTP requests, schedule them, or react to events, means you can create robust scraping operations that integrate seamlessly with hundreds of other apps.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    scrapingbot: {
      type: "app",
      app: "scrapingbot",
    },
  },
  async run({ steps, $ }) {
    // URL of the page to scrape (left empty here as a placeholder)
    const data = {
      "url": ``,
    }
    // ScrapingBot authenticates with HTTP Basic auth: account username plus API key
    return await axios($, {
      method: "post",
      url: `http://api.scraping-bot.io/scrape/raw-html`,
      headers: {
        "Content-Type": `application/json`,
      },
      auth: {
        username: `${this.scrapingbot.$auth.username}`,
        password: `${this.scrapingbot.$auth.api_key}`,
      },
      data,
    })
  },
})
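As a companion to the raw-html example above, the social media actions listed earlier start a scraping job and then fetch its result by responseId. The endpoint path, query parameters, and the instagramProfile scraper name below are assumptions drawn from ScrapingBot's data-scraper workflow rather than from this page, so verify them against the ScrapingBot documentation.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    scrapingbot: {
      type: "app",
      app: "scrapingbot",
    },
    // Placeholder: the responseId returned when the scraping job was started
    responseId: {
      type: "string",
      label: "Response ID",
    },
  },
  async run({ steps, $ }) {
    // Assumed endpoint: poll the data-scraper job until results are ready.
    // The "scraper" value must match the one used when the job was created.
    return await axios($, {
      method: "get",
      url: `http://api.scraping-bot.io/scrape/data-scraper-response`,
      params: {
        scraper: "instagramProfile",
        responseId: this.responseId,
      },
      auth: {
        username: this.scrapingbot.$auth.username,
        password: this.scrapingbot.$auth.api_key,
      },
    })
  },
})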