← RSS + Apify integrations

Scrape Single URL with Apify API on New Item From Multiple RSS Feeds from RSS API

Pipedream makes it easy to connect APIs for Apify, RSS and 2,400+ other apps remarkably fast.

Trigger workflow on
New Item From Multiple RSS Feeds from the RSS API
Next, do this
Scrape Single URL with the Apify API
No credit card required

Trusted by 1,000,000+ developers from startups to Fortune 500 companies



Getting Started

This integration creates a workflow with an RSS trigger and an Apify action. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free. An example of what the configured values might look like follows the steps below.

  1. Select this integration
  2. Configure the New Item From Multiple RSS Feeds trigger
    1. Connect your RSS account
    2. Configure timer
    3. Configure Feed URLs
    4. (Optional) Configure Max per Feed
  3. Configure the Scrape Single URL action
    1. Connect your Apify account
    2. Configure URL
    3. Select a Crawler Type
  4. Deploy the workflow
  5. Send a test event to validate your setup
  6. Turn on the trigger
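
For reference, here is a minimal sketch of what the configured values might look like, expressed as plain JavaScript objects. The feed URLs, polling interval, and step reference are placeholders chosen for illustration, not defaults supplied by the integration.

// Hypothetical configuration values for the two steps in this workflow.
const triggerConfig = {
  timer: {
    intervalSeconds: 15 * 60, // poll the feeds every 15 minutes (example value)
  },
  urls: [
    // "Feed URLs" prop; 5 or fewer feeds are recommended to avoid timeouts
    "https://hnrss.org/frontpage",
    "https://example.com/blog/rss.xml",
  ],
  max: 20, // "Max per Feed" (optional, defaults to 20)
};

const actionConfig = {
  // Reference the link of each new feed item emitted by the trigger;
  // the exact field name depends on the feed, so treat this as an example.
  url: "{{steps.trigger.event.link}}",
  crawlerType: "cheerio", // or "playwright:firefox" / "playwright:chrome"
};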

Details

This integration uses pre-built, source-available components from Pipedream's GitHub repo. These components are developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.
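
For orientation, the overall shape of an action component is small. The skeleton below is a bare, hypothetical example (not one of the components on this page) showing the pieces the quickstarts walk through:

// Hypothetical minimal action component, for orientation only.
export default {
  key: "my-app-hello-world",
  name: "Hello World",
  description: "Minimal example action",
  version: "0.0.1",
  type: "action",
  props: {
    name: {
      type: "string",
      label: "Name",
      description: "Who to greet",
    },
  },
  async run({ $ }) {
    // $.export surfaces a human-readable summary in the workflow UI.
    $.export("$summary", `Said hello to ${this.name}`);
    return `Hello, ${this.name}!`;
  },
};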

Trigger

Description: Emit new items from multiple RSS feeds
Version: 1.2.7
Key: rss-new-item-from-multiple-feeds

RSS Overview

The RSS app allows users to automatically fetch and parse updates from web feeds. This functionality is pivotal for staying abreast of content changes or updates from websites, blogs, and news outlets that offer RSS feeds. With Pipedream, you can harness the RSS API to trigger workflows that enable a broad range of automations, like content aggregation, monitoring for specific keywords, notifications, and data synchronization across platforms.
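
As one concrete example of the keyword-monitoring use case above, a downstream Node code step could filter the items this trigger emits before any further action runs. This is a sketch that assumes the feed items expose the usual title and link fields; exact fields vary by feed.

// Hypothetical Node code step placed after the RSS trigger.
export default defineComponent({
  async run({ steps, $ }) {
    const item = steps.trigger.event; // the feed item emitted by the trigger
    const keywords = ["pipedream", "rss"]; // example watch list
    const title = (item.title || "").toLowerCase();
    if (!keywords.some((k) => title.includes(k))) {
      // End the workflow early for items that don't match any keyword.
      return $.flow.exit("No keyword match");
    }
    return {
      title: item.title,
      link: item.link,
    };
  },
});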

Trigger Code

import rss from "../../app/rss.app.mjs";
import { defineSource } from "@pipedream/types";
import rssCommon from "../common/common.mjs";
export default defineSource({
    ...rssCommon,
    key: "rss-new-item-from-multiple-feeds",
    name: "New Item From Multiple RSS Feeds",
    type: "source",
    description: "Emit new items from multiple RSS feeds",
    version: "1.2.7",
    props: {
        ...rssCommon.props,
        urls: {
            propDefinition: [
                rss,
                "urls",
            ],
            description: "Enter one or multiple URLs from any public RSS feed. To avoid timeouts, 5 or less URLs is recommended.",
        },
        max: {
            type: "integer",
            label: "Max per Feed",
            description: "Maximum number of posts per feed to retrieve at one time. Defaults to 20.",
            optional: true,
            default: 20,
        },
    },
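    // "unique" dedupes emitted events on the id in the meta object returned by
    // this.generateMeta (defined in the shared common.mjs imported above), so
    // items already seen in earlier polls are not re-emitted.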
    dedupe: "unique",
    hooks: {
        async activate() {
            // Try to parse the feed one time to confirm we can fetch and parse.
            // The code will throw any errors to the user.
            for (const url of this.urls) {
                await this.rss.fetchAndParseFeed(url);
            }
        },
    },
    async run() {
        const items = [];
        for (const url of this.urls) {
            const feedItems = (await this.rss.fetchAndParseFeed(url))?.slice(0, this.max);
            console.log(`Retrieved items from ${url}`);
            items.push(...feedItems);
        }
        this.rss.sortItems(items).forEach((item) => {
            const meta = this.generateMeta(item);
            this.$emit(item, meta);
        });
    },
});

Trigger Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.

Label | Prop | Type | Description
RSS | rss | app | This component uses the RSS app.
timer | timer | $.interface.timer | How often you want to poll the feed for new items
Feed URLs | urls | string[] | Enter one or multiple URLs from any public RSS feed. To avoid timeouts, using 5 or fewer URLs is recommended.
Max per Feed | max | integer | Maximum number of posts per feed to retrieve at one time. Defaults to 20.

Trigger Authentication

The RSS API does not require authentication.

About RSS

Really Simple Syndication

Action

Description: Executes a scraper on a specific website and returns its content as text. This action is perfect for extracting content from a single page.
Version: 0.0.1
Key: apify-scrape-single-url

Apify Overview

The Apify API unleashes the power to automate web scraping, process data, and orchestrate web automation workflows. By utilizing Apify on Pipedream, you can create dynamic serverless workflows to manage tasks like extracting data from websites, running browser automation, and scheduling these jobs to run autonomously. It integrates smoothly with Pipedream's capabilities to trigger actions on various other apps, store the results, and manage complex data flow with minimal setup.
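
To ground the overview, here is a hedged sketch of calling the Apify API directly from a Pipedream Node code step, separate from the pre-built action documented below. The actor ID, input fields, and the $auth property name are assumptions for illustration; consult the Apify API reference for the actor you actually want to run.

import { axios } from "@pipedream/platform";

// Hypothetical code step that runs an actor synchronously and returns its dataset items.
export default defineComponent({
  props: {
    apify: {
      type: "app",
      app: "apify",
    },
  },
  async run({ $ }) {
    return await axios($, {
      method: "POST",
      // "apify~website-content-crawler" is only an example actor ID.
      url: "https://api.apify.com/v2/acts/apify~website-content-crawler/run-sync-get-dataset-items",
      params: {
        token: this.apify.$auth.api_token, // $auth field name is an assumption
      },
      data: {
        startUrls: [
          {
            url: "https://example.com",
          },
        ],
        maxCrawlPages: 1,
      },
    });
  },
});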

Action Code

import apify from "../../apify.app.mjs";
import { ACTOR_ID } from "../../common/constants.mjs";

export default {
  key: "apify-scrape-single-url",
  name: "Scrape Single URL",
  description: "Executes a scraper on a specific website and returns its content as text. This action is perfect for extracting content from a single page.",
  version: "0.0.1",
  type: "action",
  props: {
    apify,
    url: {
      type: "string",
      label: "URL",
      description: "The URL of the web page to scrape.",
      optional: false,
    },
    crawlerType: {
      type: "string",
      label: "Crawler Type",
      description: "Select the crawling engine:\n- **Headless web browser** - Useful for modern websites with anti-scraping protections and JavaScript rendering. It recognizes common blocking patterns like CAPTCHAs and automatically retries blocked requests through new sessions. However, running web browsers is more expensive as it requires more computing resources and is slower. It is recommended to use at least 8 GB of RAM.\n- **Stealthy web browser** (default) - Another headless web browser with anti-blocking measures enabled. Try this if you encounter bot protection while scraping. For best performance, use with Apify Proxy residential IPs. \n- **Raw HTTP client** - High-performance crawling mode that uses raw HTTP requests to fetch the pages. It is faster and cheaper, but it might not work on all websites.",
      options: [
        {
          label: "Headless browser (stealthy Firefox+Playwright) - Very reliable, best in avoiding blocking, but might be slow",
          value: "playwright:firefox",
        },
        {
          label: "Headless browser (Chrome+Playwright) - Reliable, but might be slow",
          value: "playwright:chrome",
        },
        {
          label: "Raw HTTP client (Cheerio) - Extremely fast, but cannot handle dynamic content",
          value: "cheerio",
        },
      ],
    },
  },
  async run({ $ }) {
    const response = await this.apify.runActor({
      $,
      actorId: ACTOR_ID,
      data: {
        crawlerType: this.crawlerType,
        maxCrawlDepth: 0,
        maxCrawlPages: 1,
        maxResults: 1,
        startUrls: [
          {
            url: this.url,
          },
        ],
      },
    });
    $.export("$summary", `Successfully scraped content from ${this.url}`);
    return response;
  },
};
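
The runActor helper is defined in apify.app.mjs and ACTOR_ID is imported from common/constants.mjs; neither file is reproduced on this page. As a rough, unverified sketch of what such a helper could look like (the endpoint choice and the $auth field name are assumptions, not the actual implementation):

import { axios } from "@pipedream/platform";

// Hypothetical shape of the app file's runActor method.
export default {
  type: "app",
  app: "apify",
  methods: {
    runActor({
      $ = this, actorId, data,
    }) {
      // Run the actor and wait for its dataset items, so the calling action
      // can return the scraped content directly.
      return axios($, {
        method: "POST",
        url: `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items`,
        headers: {
          Authorization: `Bearer ${this.$auth.api_token}`, // assumed auth field
        },
        data,
      });
    },
  },
};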

Action Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI.

Label | Prop | Type | Description
Apify | apify | app | This component uses the Apify app.
URL | url | string | The URL of the web page to scrape.
Crawler Type | crawlerType | string | Select a value from the drop-down menu: "playwright:firefox" (Headless browser, stealthy Firefox+Playwright. Very reliable, best at avoiding blocking, but might be slow), "playwright:chrome" (Headless browser, Chrome+Playwright. Reliable, but might be slow), or "cheerio" (Raw HTTP client, Cheerio. Extremely fast, but cannot handle dynamic content).

Action Authentication

Apify uses API keys for authentication. When you connect your Apify account, Pipedream securely stores the keys so you can easily authenticate to Apify APIs in both code and no-code steps.

You can find your API token on the Integrations page in the Apify console.
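
In code steps, the stored credentials are exposed on the app prop's $auth object. A quick way to see which field names Pipedream stored for your connected account, without printing their values, is a short sketch like this:

// Hypothetical code step that lists the credential field names on the connected Apify account.
export default defineComponent({
  props: {
    apify: {
      type: "app",
      app: "apify",
    },
  },
  async run() {
    // Might return ["api_token"], for example, depending on how the Apify app is configured.
    return Object.keys(this.apify.$auth);
  },
});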

About Apify

Apify is the place to find, develop, buy and run cloud programs called actors. Use actors to turn any website into an API.

More Ways to Connect Apify + RSS

Scrape Single URL with Apify API on New Item in Feed from RSS API
RSS + Apify
 
Try it
Scrape Single URL with Apify API on Random item from multiple RSS feeds from RSS API
RSS + Apify
 
Try it
Run Actor with Apify API on New Item From Multiple RSS Feeds from RSS API
RSS + Apify
 
Try it
Run Actor with Apify API on New Item in Feed from RSS API
RSS + Apify
 
Try it
Run Actor with Apify API on Random item from multiple RSS feeds from RSS API
RSS + Apify
 
Try it
New Item in Feed from the RSS API

Emit new items from an RSS feed

 
Try it
New Item From Multiple RSS Feeds from the RSS API

Emit new items from multiple RSS feeds

 
Try it
Random item from multiple RSS feeds from the RSS API

Emit a random item from multiple RSS feeds

 
Try it
New Finished Actor Run (Instant) from the Apify API

Emit new event when a selected actor is run and finishes.

 
Try it
New Finished Task Run (Instant) from the Apify API

Emit new event when a selected task is run and finishes.

 
Try it
Merge RSS Feeds with the RSS API

Retrieve multiple RSS feeds and return a merged array of items sorted by date. See the documentation

 
Try it
Run Actor with the Apify API

Performs an execution of a selected actor in Apify. See the documentation

 
Try it
Scrape Single URL with the Apify API

Executes a scraper on a specific website and returns its content as text. This action is perfect for extracting content from a single page.

 
Try it
Set Key-Value Store Record with the Apify API

Create or update a record in the key-value store of Apify. See the documentation

 
Try it

Explore Other Apps

1-24 of 2,400+ apps by most popular

HTTP / Webhook
Get a unique URL where you can send HTTP or webhook requests
Node
Anything you can do with Node.js, you can do in a Pipedream workflow. This includes using most of npm's 400,000+ packages.
Python
Anything you can do in Python can be done in a Pipedream Workflow. This includes using any of the 350,000+ PyPi packages available in your Python powered workflows.
OpenAI (ChatGPT)
OpenAI is an AI research and deployment company with the mission to ensure that artificial general intelligence benefits all of humanity. They are the makers of popular models like ChatGPT, DALL-E, and Whisper.
Premium
Salesforce
Web services API for interacting with Salesforce
Premium
HubSpot
HubSpot's CRM platform contains the marketing, sales, service, operations, and website-building software you need to grow your business.
Premium
Zoho CRM
Zoho CRM is an online Sales CRM software that manages your sales, marketing, and support in one CRM platform.
Premium
Stripe
Stripe powers online and in-person payment processing and financial solutions for businesses of all sizes.
Shopify
Shopify is a complete commerce platform that lets anyone start, manage, and grow a business. You can use Shopify to build an online store, manage sales, market to customers, and accept payments in digital and physical locations.
Premium
WooCommerce
WooCommerce is the open-source ecommerce platform for WordPress.
Premium
Snowflake
A data warehouse built for the cloud
Premium
MongoDB
MongoDB is an open source NoSQL database management program.
Supabase
Supabase is an open source Firebase alternative.
MySQL
MySQL is an open-source relational database management system.
PostgreSQL
PostgreSQL is a free and open-source relational database management system emphasizing extensibility and SQL compliance.
Premium
AWS
Amazon Web Services (AWS) offers reliable, scalable, and inexpensive cloud computing services.
Premium
Twilio SendGrid
Send marketing and transactional email through the Twilio SendGrid platform with the Email API, proprietary mail transfer agent, and infrastructure for scalable delivery.
Amazon SES
Amazon SES is a cloud-based email service provider that can integrate into any application for high volume email automation
Premium
Klaviyo
Email Marketing and SMS Marketing Platform
Premium
Zendesk
Zendesk is award-winning customer service software trusted by 200K+ customers. Make customers happy via text, mobile, phone, email, live chat, social media.
Notion
Notion is a new tool that blends your everyday work apps into one. It's the all-in-one workspace for you and your team.
Slack
Slack is a channel-based messaging platform. With Slack, people can work together more effectively, connect all their software tools and services, and find the information they need to do their best work — all within a secure, enterprise-grade environment.
Microsoft Teams
Microsoft Teams has communities, events, chats, channels, meetings, storage, tasks, and calendars in one place.
Schedule
Trigger workflows on an interval or cron schedule.