Notion is a new tool that blends your everyday work apps into one. It's the all-in-one workspace for you and your team.
Emit new event when a database is created. Note: Databases must be shared with your Pipedream Integration to trigger events.
Emit new event when a page or one of its sub-pages is updated.
Emit new event when a page in a database is updated. To select a specific page, use Updated Page ID instead.
Creates and appends blocks to the specified parent. See the documentation
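As a rough illustration of this kind of action, the sketch below calls Notion's block-children endpoint directly from a Pipedream component. It is a minimal example, not the action's actual implementation: the parent block ID and the paragraph text are placeholders, and it assumes the 2022-06-28 API version rather than the older version used in the example further down.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
  },
  async run({ steps, $ }) {
    // Append a single paragraph block to a parent page or block.
    // "PARENT_BLOCK_ID" is a placeholder for a real page or block ID.
    return await axios($, {
      method: "patch",
      url: `https://api.notion.com/v1/blocks/PARENT_BLOCK_ID/children`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2022-06-28`,
        "Content-Type": `application/json`,
      },
      data: {
        children: [
          {
            object: "block",
            type: "paragraph",
            paragraph: {
              rich_text: [{ type: "text", text: { content: "Hello from Pipedream" } }],
            },
          },
        ],
      },
    })
  },
})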
Initiate the scraping process for a specific endpoint. See the documentation here.
Creates a page from a parent page. The only valid property is title. See the documentation
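A minimal sketch of the underlying API call follows; it creates a page under an existing parent page with only a title property, which is the sole property a page parent accepts. The parent page ID and title text are placeholders, and the API version is assumed to be 2022-06-28.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
  },
  async run({ steps, $ }) {
    // Create a page under an existing page; "PARENT_PAGE_ID" is a placeholder.
    // With a page parent, title is the only valid property.
    return await axios($, {
      method: "post",
      url: `https://api.notion.com/v1/pages`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2022-06-28`,
        "Content-Type": `application/json`,
      },
      data: {
        parent: { page_id: "PARENT_PAGE_ID" },
        properties: {
          title: {
            title: [{ type: "text", text: { content: "New page created via Pipedream" } }],
          },
        },
      },
    })
  },
})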
Creates a new page copied from an existing page block. See the docs
Notion's API lets you create, read, update, and delete pages, databases, and their contents within Notion. Using Pipedream's platform, you can build workflows that connect Notion with various other services to automate tasks such as content management, task tracking, and data synchronization. With Pipedream's serverless execution, you can trigger these workflows on a schedule or from events in other services, without managing any infrastructure.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    notion: {
      type: "app",
      app: "notion",
    },
  },
  async run({ steps, $ }) {
    // Fetch the bot user associated with the connected Notion integration
    return await axios($, {
      url: `https://api.notion.com/v1/users/me`,
      headers: {
        Authorization: `Bearer ${this.notion.$auth.oauth_access_token}`,
        "Notion-Version": `2021-08-16`,
      },
    })
  },
})
The Scrape-It.Cloud API allows you to automate the extraction of data from websites. It can parse, scrape, and retrieve content without the need for manual intervention. With this API on Pipedream, you can build workflows that trigger on various events and use the scraped data for numerous applications like data analysis, lead generation, and content aggregation.
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    scrape_it_cloud: {
      type: "app",
      app: "scrape_it_cloud",
    },
  },
  async run({ steps, $ }) {
    // Request a scrape of the target URL through the Scrape-It.Cloud API
    const data = {
      "url": `https://pipedream.com`,
    }
    return await axios($, {
      method: "post",
      url: `https://api.scrape-it.cloud/scrape`,
      headers: {
        "Content-Type": `application/json`,
        "x-api-key": `${this.scrape_it_cloud.$auth.api_key}`,
      },
      data,
    })
  },
})