A very simple Open Graph API. Don't waste time and resources scraping sites or trying to unfurl URLs.
Scrape OpenGraph data from a list of URLs at once to process multiple websites simultaneously. See the docs here
Write Python and use any of the 350k+ PyPI packages available. Refer to the Pipedream Python docs to learn more.
Extract specific OpenGraph properties from a specified URL, such as title, image, or description. See the docs here
Retrieve OpenGraph data from a specified URL using the OpenGraph.io API. See the docs here
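To illustrate the "extract specific properties" action, here is a minimal sketch of picking title, image, and description out of an OpenGraph.io-style response. The `hybridGraph` key reflects the shape OpenGraph.io documents for its responses, but the helper function and the sample payload below are illustrative assumptions, not the integration's actual code:

```python
# Sketch: pull selected properties from an OpenGraph.io-style response.
# The "hybridGraph" key and sample payload are assumptions for illustration.
def extract_og_properties(response, properties=("title", "image", "description")):
    graph = response.get("hybridGraph", {})
    return {prop: graph.get(prop) for prop in properties}

# Hypothetical response payload, shaped like OpenGraph.io's documented output
sample = {
    "hybridGraph": {
        "title": "Pipedream",
        "description": "Connect APIs, remarkably fast",
        "image": "https://pipedream.com/og.png",
    }
}

print(extract_og_properties(sample))
```

Missing properties simply come back as `None`, so downstream steps can handle pages that lack a given Open Graph tag.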
The OpenGraph.io API enables you to fetch Open Graph data from websites, which is useful for previewing content like images, titles, and descriptions the way social platforms do. Integrating OpenGraph.io with Pipedream allows you to automate the retrieval of this data in response to various triggers and to use it in workflows that involve other apps and services.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    opengraph_io: {
      type: "app",
      app: "opengraph_io",
    },
  },
  async run({ steps, $ }) {
    return await axios($, {
      url: `https://opengraph.io/api/1.1/site/https%3A%2F%2Fpipedream.com`,
      params: {
        app_id: `${this.opengraph_io.$auth.api_key}`,
      },
    })
  },
})
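Note that the component above URL-encodes the target address directly into the request path (`https%3A%2F%2Fpipedream.com`). If you build the same request from a Python step instead, the encoding can come from the standard library rather than being hand-written; the target URL here is just an example:

```python
from urllib.parse import quote

# Percent-encode the target URL so it can be embedded in the API path.
# safe="" forces "/" and ":" to be encoded as well.
target = "https://pipedream.com"
encoded = quote(target, safe="")

url = f"https://opengraph.io/api/1.1/site/{encoded}"
print(url)
```

This avoids subtle bugs when the target URL itself contains query strings or fragments.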
Develop, run, and deploy your Python code in Pipedream workflows. Integrate seamlessly with no-code steps and connected accounts, work with Data Stores, and manipulate files within a workflow.
This includes installing PyPI packages within your code, without having to manage a requirements.txt file or run pip.
Below is an example of using Python to access data from the trigger of the workflow, and sharing it with subsequent workflow steps:
def handler(pd: "pipedream"):
    # Reference data from previous steps
    print(pd.steps["trigger"]["context"]["id"])
    # Return data for use in future steps
    return {"foo": {"test": True}}
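As a rough illustration of the mechanics outside an actual Pipedream runtime: the dictionary a step returns becomes addressable to later steps under `pd.steps`, keyed by the step's name. The `FakePipedream` class and the step name `"code"` below are stand-ins for illustration, not part of the Pipedream SDK:

```python
# Stand-in for the object Pipedream injects into a Python step's handler;
# only the .steps attribute used here is modeled.
class FakePipedream:
    def __init__(self, steps):
        self.steps = steps

def handler(pd):
    # A later step reading data exported by an earlier step named "code"
    return pd.steps["code"]["foo"]["test"]

# Simulate the earlier step having returned {"foo": {"test": True}}
pd = FakePipedream({"code": {"foo": {"test": True}}})
print(handler(pd))
```

In a real workflow the runtime populates `pd.steps` for you; the fake object exists only to show the lookup path.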