
Crawler with Scrapeless API

Pipedream makes it easy to connect APIs for Scrapeless and 2,700+ other apps remarkably fast.

Trigger workflow on HTTP requests, schedules, and app events.
Next, do this: Crawler with the Scrapeless API.

Trusted by 1,000,000+ developers from startups to Fortune 500 companies


Getting Started

Create a workflow to Crawler with the Scrapeless API. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Configure the Crawler action
    1. Connect your Scrapeless account
    2. Select an API server (Crawl or Scrape)
  2. Select a trigger to run your workflow on HTTP requests, schedules or app events
  3. Deploy the workflow
  4. Send a test event to validate your setup
  5. Turn on the trigger
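
The values collected in step 1 map directly to the props defined in the component code further down this page. A hypothetical configuration, expressed as the plain object the action works with at runtime, might look like this (the URL and page limit are placeholder values):

// Placeholder values for illustration only
{
  apiServer: "crawl",          // or "scrape" to fetch a single page
  url: "https://example.com",  // URL to Crawl
  limitCrawlPages: 5           // Number Of Subpages (crawl mode only)
}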

Integrations

Crawler with Scrapeless API on New Requests (Payload Only) from HTTP / Webhook API
HTTP / Webhook + Scrapeless

Crawler with Scrapeless API on New Submission from Typeform API
Typeform + Scrapeless

Crawler with Scrapeless API on New Submission (Instant) from Jotform API
Jotform + Scrapeless

Crawler with Scrapeless API on New Scheduled Tasks from Pipedream API
Pipedream + Scrapeless

Crawler with Scrapeless API on New Download Counts from npm API
npm + Scrapeless

Details

This is a pre-built, source-available component from Pipedream's GitHub repo. The component is developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with the quickstarts for trigger and action development, and then review the component API reference.

Crawler on Scrapeless
Description: Crawl any website at scale and say goodbye to blocks. [See the documentation](https://apidocs.scrapeless.com/api-17509010).
Version: 0.0.2
Key: scrapeless-crawler

Code

import scrapeless from "../../scrapeless.app.mjs";

export default {
  key: "scrapeless-crawler",
  name: "Crawler",
  description: "Crawl any website at scale and say goodbye to blocks. [See the documentation](https://apidocs.scrapeless.com/api-17509010).",
  version: "0.0.2",
  type: "action",
  props: {
    scrapeless,
    apiServer: {
      type: "string",
      label: "Please select a API server",
      description: "Please select a API server to use",
      default: "crawl",
      options: [
        {
          label: "Crawl",
          value: "crawl",
        },
        {
          label: "Scrape",
          value: "scrape",
        },
      ],
      reloadProps: true,
    },
  },
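  // Calls the selected Scrapeless endpoint (crawl or scrape) and returns its data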
  async run({ $ }) {
    const {
      scrapeless, apiServer, ...inputProps
    } = this;

    const browserOptions = {
      "proxy_country": "ANY",
      "session_name": "Crawl",
      "session_recording": true,
      "session_ttl": 900,
    };

    let response;

    if (apiServer === "crawl") {
      response =
        await scrapeless._scrapelessClient().scrapingCrawl.crawl.crawlUrl(inputProps.url, {
          limit: inputProps.limitCrawlPages,
          browserOptions,
        });
    }

    if (apiServer === "scrape") {
      response =
        await scrapeless._scrapelessClient().scrapingCrawl.scrape.scrapeUrl(inputProps.url, {
          browserOptions,
        });
    }

    if (response?.status === "completed" && response?.data) {
      $.export("$summary", `Successfully retrieved crawling results for ${inputProps.url}`);
      return response.data;
    } else {
      throw new Error(response?.error || "Failed to retrieve crawling results");
    }
  },
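  // Expose extra props dynamically based on the selected API server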
  additionalProps() {
    const { apiServer } = this;

    const props = {};

    if (apiServer === "crawl" || apiServer === "scrape") {
      props.url = {
        type: "string",
        label: "URL to Crawl",
        description: "If you want to crawl in batches, please refer to the SDK of the document",
      };
    }

    if (apiServer === "crawl") {
      props.limitCrawlPages = {
        type: "integer",
        label: "Number Of Subpages",
        default: 5,
        description: "Max number of results to return",
      };
    }

    return props;
  },
};
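
For reference, the calls inside run() can also be exercised directly. The sketch below is illustrative only: it mirrors the method names used by the component above, assumes an async context where the scrapeless app object (and therefore its _scrapelessClient() helper) is available exactly as in the action's run() method, and uses placeholder URLs.

// Sketch only: mirrors the Crawler action above; `scrapeless` is the app object
// imported from ../../scrapeless.app.mjs.
const client = scrapeless._scrapelessClient();

// Shared browser session settings, as used by the action.
const browserOptions = {
  proxy_country: "ANY",
  session_name: "Crawl",
  session_recording: true,
  session_ttl: 900,
};

// Crawl mode: start from a URL and follow up to `limit` subpages.
const crawlResult = await client.scrapingCrawl.crawl.crawlUrl("https://example.com", {
  limit: 5,
  browserOptions,
});

// Scrape mode: fetch a single page.
const scrapeResult = await client.scrapingCrawl.scrape.scrapeUrl("https://example.com", {
  browserOptions,
});

// Both calls resolve to an object with status, data, and (on failure) error,
// which is what the action checks before returning response.data.
if (crawlResult?.status === "completed") {
  console.log(crawlResult.data);
}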

Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
| Label | Prop | Type | Description |
| --- | --- | --- | --- |
| Scrapeless | scrapeless | app | This component uses the Scrapeless app. |
| Please select an API server | apiServer | string | Select a value from the drop-down menu: Crawl (`crawl`) or Scrape (`scrape`). |

Authentication

Scrapeless uses API keys for authentication. When you connect your Scrapeless account, Pipedream securely stores the keys so you can easily authenticate to Scrapeless APIs in both code and no-code steps.
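
In a custom Node.js code step, the same stored credentials are available on the connected app prop. Below is a minimal sketch; the auth field name (api_key) is an assumption, so inspect this.scrapeless.$auth in your own workflow to confirm what Pipedream actually exposes for Scrapeless.

// Pipedream Node.js code step with the Scrapeless app connected as a prop.
export default defineComponent({
  props: {
    scrapeless: {
      type: "app",
      app: "scrapeless",
    },
  },
  async run({ $ }) {
    // Assumed field name; check this.scrapeless.$auth for the actual key.
    const apiKey = this.scrapeless.$auth.api_key;
    $.export("$summary", "Scrapeless credentials loaded");
    return Boolean(apiKey);
  },
});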

About Scrapeless

Effortless Web Scraping Toolkit

More Ways to Use Scrapeless

Actions

Get Scrape Result with the Scrapeless API
Retrieve the result of a completed scraping job. See the documentation

Scraping API with the Scrapeless API
Endpoints for fresh, structured data from 100+ popular sites. See the documentation

Submit Scrape Job with the Scrapeless API
Submit a new web scraping job with specified target URL and extraction rules. See the documentation

Universal Scraping API with the Scrapeless API
Access any website at scale and say goodbye to blocks. See the documentation

Explore Other Apps

1-24 of 2,700+ apps by most popular

HTTP / Webhook: Get a unique URL where you can send HTTP or webhook requests.
Node: Anything you can do with Node.js, you can do in a Pipedream workflow. This includes using most of npm's 400,000+ packages.
Python: Anything you can do in Python can be done in a Pipedream workflow. This includes using any of the 350,000+ PyPI packages available in your Python-powered workflows.
Pipedream Utils: Utility functions to use within your Pipedream workflows.
Notion: Notion is a new tool that blends your everyday work apps into one. It's the all-in-one workspace for you and your team.
OpenAI (ChatGPT): OpenAI is an AI research and deployment company with the mission to ensure that artificial general intelligence benefits all of humanity. They are the makers of popular models like ChatGPT, DALL-E, and Whisper.
Anthropic (Claude): AI research and products that put safety at the frontier. Introducing Claude, a next-generation AI assistant for your tasks, no matter the scale.
Google Sheets: Use Google Sheets to create and edit online spreadsheets. Get insights together with secure sharing in real-time and from any device.
Telegram: Telegram is a cloud-based, cross-platform, encrypted instant messaging (IM) service.
Google Drive: Google Drive is a file storage and synchronization service which allows you to create and share your work online, and access your documents from anywhere.
Pinterest: Pinterest is a visual discovery engine for finding ideas like recipes, home and style inspiration, and more.
Google Calendar: With Google Calendar, you can quickly schedule meetings and events and get reminders about upcoming activities, so you always know what's next.
Shopify: Shopify is a complete commerce platform that lets anyone start, manage, and grow a business. You can use Shopify to build an online store, manage sales, market to customers, and accept payments in digital and physical locations.
Supabase: Supabase is an open source Firebase alternative.
MySQL: MySQL is an open-source relational database management system.
PostgreSQL: PostgreSQL is a free and open-source relational database management system emphasizing extensibility and SQL compliance.
AWS (Premium): Amazon Web Services (AWS) offers reliable, scalable, and inexpensive cloud computing services.
Twilio SendGrid (Premium): Send marketing and transactional email through the Twilio SendGrid platform with the Email API, proprietary mail transfer agent, and infrastructure for scalable delivery.
Amazon SES: Amazon SES is a cloud-based email service provider that can integrate into any application for high volume email automation.
Klaviyo (Premium): Email Marketing and SMS Marketing Platform.
Zendesk (Premium): Zendesk is award-winning customer service software trusted by 200K+ customers. Make customers happy via text, mobile, phone, email, live chat, social media.
ServiceNow (Premium): The smarter way to workflow.
Slack: Slack is a channel-based messaging platform. With Slack, people can work together more effectively, connect all their software tools and services, and find the information they need to do their best work, all within a secure, enterprise-grade environment.
Microsoft Teams: Microsoft Teams has communities, events, chats, channels, meetings, storage, tasks, and calendars in one place.