Scrapfly Web Scraping API for developers
Automate content extraction from any text-based source using AI, LLM, and custom parsing. See the documentation
Write Python and use any of the 350k+ PyPi packages available. Refer to the Pipedream Python docs to learn more.
Retrieve current subscription and account usage details from Scrapfly. See the documentation
import { axios } from "@pipedream/platform"
export default defineComponent({
  props: {
    scrapfly: {
      type: "app",
      app: "scrapfly",
    },
  },
  async run({ steps, $ }) {
    return await axios($, {
      url: `https://api.scrapfly.io/scrape`,
      params: {
        key: `${this.scrapfly.$auth.api_key}`,
        url: `https://pipedream.com`,
        render_js: `true`,
        asp: `true`,
      },
    })
  },
})
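The subscription and account-usage lookup mentioned above can be called in the same keyed-request style. A minimal Python sketch that only builds the request URL, assuming the endpoint is `https://api.scrapfly.io/account` with the API key passed as the `key` query parameter (verify both against the Scrapfly documentation):

```python
from urllib.parse import urlencode

def build_account_url(api_key: str) -> str:
    """Build the Scrapfly account-usage request URL.

    Assumes the endpoint path and the `key` parameter name; fetch the
    result with any HTTP client (e.g. requests.get(url)).
    """
    base = "https://api.scrapfly.io/account"
    return f"{base}?{urlencode({'key': api_key})}"

# Example:
# build_account_url("scp-test-key")
# -> "https://api.scrapfly.io/account?key=scp-test-key"
```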
Develop, run and deploy your Python code in Pipedream workflows. Integrate seamlessly between no-code steps, with connected accounts, or integrate Data Stores and manipulate files within a workflow.
This includes installing PyPI packages from within your code, without having to manage a requirements.txt file or run pip.
Below is an example of using Python to access data from the workflow's trigger and share it with subsequent workflow steps:
def handler(pd: "pipedream"):
    # Reference data from previous steps
    print(pd.steps["trigger"]["context"]["id"])
    # Return data for use in future steps
    return {"foo": {"test": True}}
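The dict returned by a handler becomes available to later steps under that step's name in `pd.steps`. A minimal sketch, using a stand-in for Pipedream's `pd` object to show how a downstream step would read the value (the step name "python" here is an assumption, not something fixed by Pipedream):

```python
class FakePD:
    """Stand-in for Pipedream's pd object, for illustration only."""
    def __init__(self, steps):
        self.steps = steps

# Simulated workflow state: the earlier handler's return value,
# keyed by its (assumed) step name "python".
pd = FakePD(steps={"python": {"foo": {"test": True}}})

def downstream_handler(pd: "pipedream"):
    # Read what the earlier step returned
    return pd.steps["python"]["foo"]["test"]

# downstream_handler(pd) -> True
```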