AI Data Extraction with Scrapfly API on New Row (Custom Query) from MySQL API

Pipedream makes it easy to connect APIs for Scrapfly, MySQL and 2,400+ other apps remarkably fast.

Trigger workflow on: New Row (Custom Query) from the MySQL API
Next, do this: AI Data Extraction with the Scrapfly API

Getting Started

This integration creates a workflow with a MySQL trigger and Scrapfly action. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Select this integration
  2. Configure the New Row (Custom Query) trigger
    1. Connect your MySQL account
    2. Configure timer
    3. Select a Table
    4. (Optional) Select a De-duplication Key
    5. Configure Where condition
    6. Configure Values
  3. Configure the AI Data Extraction action
    1. Connect your Scrapfly account
    2. Configure Body
    3. Select a Content Type
    4. Configure URL
    5. (Optional) Configure Charset
    6. (Optional) Configure Extraction Template
    7. (Optional) Configure Extraction Prompt
    8. (Optional) Configure Extraction Model
    9. (Optional) Configure Webhook Name
  4. Deploy the workflow
  5. Send a test event to validate your setup
  6. Turn on the trigger

Details

This integration uses pre-built, source-available components from Pipedream's GitHub repo. These components are developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with quickstarts for trigger and action development, and then review the component API reference.
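For orientation, a Pipedream component is a single ESM module with a default export. The sketch below is a minimal, hypothetical action skeleton; the key, props, and logic are placeholders rather than a real published component, so see the component API reference for the full set of fields:

export default {
  // Unique key, conventionally "<app-slug>-<action-slug>"
  key: "my-app-my-action",
  name: "My Action",
  description: "Short description of what this action does",
  version: "0.0.1",
  type: "action", // use "source" for a trigger
  props: {
    // App props, string/integer/boolean props, etc. go here
    message: {
      type: "string",
      label: "Message",
      description: "A value collected from the user at configure time",
    },
  },
  async run({ $ }) {
    // Do the work, export a human-readable summary, and return the result
    $.export("$summary", `Processed "${this.message}"`);
    return { message: this.message };
  },
};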

Trigger

Description: Emit new event when new rows are returned from a custom query. [See the docs here](https://dev.mysql.com/doc/refman/8.0/en/select.html)
Version: 2.0.5
Key: mysql-new-row-custom-query

MySQL Overview

The MySQL application on Pipedream enables direct interaction with your MySQL databases, allowing you to perform CRUD operations—create, read, update, delete—on your data with ease. You can leverage these capabilities to automate data synchronization, report generation, and event-based triggers that kick off workflows in other apps. With Pipedream's serverless platform, you can connect MySQL to hundreds of other services without managing infrastructure, crafting complex code, or handling authentication.

Trigger Code

import common from "../common/table.mjs";
import { v4 as uuidv4 } from "uuid";

const { mysql } = common.props;

export default {
  ...common,
  key: "mysql-new-row-custom-query",
  name: "New Row (Custom Query)",
  description: "Emit new event when new rows are returned from a custom query. [See the docs here](https://dev.mysql.com/doc/refman/8.0/en/select.html)",
  version: "2.0.5",
  type: "source",
  dedupe: "unique",
  props: {
    ...common.props,
    column: {
      propDefinition: [
        mysql,
        "column",
        (c) => ({
          table: c.table,
        }),
      ],
      label: "De-duplication Key",
      description:
        "The name of a column in the table to use for de-duplication",
      optional: true,
    },
    condition: {
      propDefinition: [
        mysql,
        "whereCondition",
      ],
    },
    values: {
      propDefinition: [
        mysql,
        "whereValues",
      ],
    },
  },
  methods: {
    ...common.methods,
    async listResults() {
      const {
        table,
        condition,
        values,
      } = this;

      const numberOfQuestionMarks = condition?.match(/\?/g)?.length;

      if (!numberOfQuestionMarks) {
        throw new Error("No valid condition provided. At least one question mark character ? must be provided.");
      }

      if (!Array.isArray(values)) {
        throw new Error("No valid values provided. The values property must be an array.");
      }

      if (values.length !== numberOfQuestionMarks) {
        throw new Error("The number of values provided does not match the number of question marks ? in the condition.");
      }

      const rows = await this.mysql.findRows({
        table,
        condition,
        values,
      });
      this.iterateAndEmitEvents(rows);
    },
    generateMeta(row) {
      const id = this.column
        ? row[this.column]
        : uuidv4();
      return {
        id,
        summary: `New Row ${id}`,
        ts: Date.now(),
      };
    },
  },
};

Trigger Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label | Prop | Type | Description
MySQL | mysql | app | This component uses the MySQL app.
 | timer | $.interface.timer |
Table | table | string | Select a value from the drop down menu.
De-duplication Key | column | string | Select a value from the drop down menu.
Where condition | condition | string | An expression in which you write your own conditions (e.g. column1 = ? OR column2 = ?). Provide one value for each ? placeholder; see the example below.
Values | values | string[] | The list of values bound, in order, to each ? placeholder in the where expression. You can also build the array yourself with an expression (e.g. {{["string1", "string2"]}}).
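For example, suppose the trigger should emit rows from a hypothetical orders table whose total exceeds a threshold and that were created after a given date. The Where condition would be total > ? AND created_at > ? and the Values array would be {{["100", "2024-01-01"]}}. The sketch below shows the equivalent parameterized query the source effectively runs on each timer tick; the table, columns, and the standalone mysql2 client are illustrative assumptions, not part of the component:

import mysql from "mysql2/promise";

// Values you would enter in the trigger's props (all hypothetical)
const table = "orders";                           // Table
const condition = "total > ? AND created_at > ?"; // Where condition
const values = [ "100", "2024-01-01" ];           // Values: one entry per ? placeholder

const connection = await mysql.createConnection({
  host: "localhost",
  user: "app",
  password: process.env.MYSQL_PASSWORD,
  database: "shop",
});

// Each ? in the condition is bound, in order, to the matching entry in `values`
const [ rows ] = await connection.execute(
  `SELECT * FROM \`${table}\` WHERE ${condition}`,
  values,
);
console.log(rows); // the rows the trigger would emit as events

await connection.end();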

Trigger Authentication

MySQL authenticates with your database connection credentials rather than API keys. When you connect your MySQL account, Pipedream securely stores these credentials so you can easily authenticate to your MySQL database in both code and no-code steps.

Connecting to Restricted Databases

If your database restricts inbound connections, either enable the shared static IP for this account or configure a VPC to deploy any workflow in your workspace to a private network with a dedicated static IP. Learn more in the Pipedream docs.

SSL Setup

Configure SSL on your MySQL database by providing the CA (Certificate Authority), and choosing between Full Verification, Verify Certificate Authority (CA), or Skip Verification. Skipping verification is not recommended as this has serious security implications.

About MySQL

MySQL is an open-source relational database management system.

Action

Description: Automate content extraction from any text-based source using AI, LLM, and custom parsing. [See the documentation](https://scrapfly.io/docs/extraction-api/getting-started)
Version: 0.0.1
Key: scrapfly-ai-data-extraction

Action Code

import { ConfigurationError } from "@pipedream/platform";
import fs from "fs";
import { checkTmp } from "../../common/utils.mjs";
import scrapfly from "../../scrapfly.app.mjs";

export default {
  key: "scrapfly-ai-data-extraction",
  name: "AI Data Extraction",
  description: "Automate content extraction from any text-based source using AI, LLM, and custom parsing. [See the documentation](https://scrapfly.io/docs/extraction-api/getting-started)",
  version: "0.0.1",
  type: "action",
  props: {
    scrapfly,
    body: {
      propDefinition: [
        scrapfly,
        "body",
      ],
    },
    contentType: {
      propDefinition: [
        scrapfly,
        "contentType",
      ],
    },
    url: {
      propDefinition: [
        scrapfly,
        "url",
      ],
    },
    charset: {
      type: "string",
      label: "Charset",
      description: "Charset of the document pass in the body. If you are not sure, you can use the `auto` value and we will try to detect it. Bad charset can lead to bad extraction, so it's important to set it correctly. **The most common charset is `utf-8` for text document and `ascii` for binary**. The symptom of a bad charset is that the text is not correctly displayed (accent, special characters, etc).",
      default: "auto",
      optional: true,
    },
    extractionTemplate: {
      type: "string",
      label: "Extraction Template",
      description: "Define an extraction template to get structured data. Use an ephemeral template (declared on the fly on the API call) or a stored template (declared in the dashboard) by using the template name.",
      optional: true,
    },
    extractionPrompt: {
      type: "string",
      label: "Extraction Prompt",
      description: "Instruction to extract data or ask a question on the scraped content with an LLM (Large Language Model). [Must be url encoded](https://scrapfly.io/web-scraping-tools/urlencode).",
      optional: true,
    },
    extractionModel: {
      type: "string",
      label: "Extraction Model",
      description: "AI Extraction to auto parse document to get structured data. E.g., `product`, `review`, `real-estate`, `article`.",
      optional: true,
    },
    webhookName: {
      type: "string",
      label: "Webhook Name",
      description: "Queue you scrape request and redirect API response to a provided webhook endpoint. You can create a webhook endpoint from your `dashboard`, it takes the name of the webhook. Webhooks are scoped to the given project/env.",
      optional: true,
    },
  },
  async run({ $ }) {
    if (!this.extractionTemplate && !this.extractionPrompt && !this.extractionModel) {
      throw new ConfigurationError("You must provide at least **Extraction Template**, **Extraction Prompt** or **Extraction Model**");
    }
    const response = await this.scrapfly.automateContentExtraction({
      $,
      headers: {
        "content-type": this.contentType,
      },
      maxBodyLength: Infinity,
      params: {
        url: this.url,
        charset: this.charset,
        extraction_template: this.extractionTemplate,
        extraction_prompt: this.extractionPrompt,
        extraction_model: this.extractionModel,
        webhook_name: this.webhookName,
      },
      data: fs.readFileSync(checkTmp(this.body)).toString(),
    });

    $.export("$summary", "Successfully extracted content");
    return response;
  },
};

Action Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI.

Label | Prop | Type | Description
Scrapfly | scrapfly | app | This component uses the Scrapfly app.
Body | body | string | The request body must contain the content of the page you want to extract data from, in the format specified by the content-type header or the content_type HTTP parameter. Provide a path to a file in /tmp (see the sketch after this table for one way to create it). To upload a file to the /tmp folder, follow the Pipedream docs.
Content Type | contentType | string | Select a value from the drop down menu: application/json, application/jsonld, application/xml, text/plain, text/html, text/markdown, text/csv, application/xhtml+xml.
URL | url | string | This URL is used to transform any relative URLs in the document into absolute URLs automatically. It can be either the base URL or the exact URL of the document. Must be URL encoded.
Charset | charset | string | Charset of the document passed in the body. If you are not sure, use the auto value and Scrapfly will try to detect it. A bad charset can lead to bad extraction, so it is important to set it correctly. The most common charset is utf-8 for text documents and ascii for binary. The symptom of a bad charset is text that is not displayed correctly (accents, special characters, etc.).
Extraction Template | extractionTemplate | string | Define an extraction template to get structured data. Use an ephemeral template (declared on the fly in the API call) or a stored template (declared in the dashboard) by using the template name.
Extraction Prompt | extractionPrompt | string | An instruction to extract data or ask a question about the scraped content with an LLM (Large Language Model). Must be URL encoded.
Extraction Model | extractionModel | string | AI extraction model that auto-parses the document into structured data, e.g. product, review, real-estate, article.
Webhook Name | webhookName | string | Queue your scrape request and redirect the API response to a provided webhook endpoint. Create a webhook endpoint from your dashboard and pass its name here. Webhooks are scoped to the given project/env.
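Because the Body prop expects a path to a file in /tmp rather than raw text, a common pattern is to fetch or assemble the content in an earlier Node.js code step, write it to /tmp, and reference that path here. The following is a minimal sketch, assuming the triggering row contains a hypothetical page_url column and using the standard Pipedream Node.js step API:

import fs from "fs";
import { axios } from "@pipedream/platform";

export default defineComponent({
  async run({ steps, $ }) {
    // Hypothetical: the new row from the MySQL trigger holds the URL to extract from
    const url = steps.trigger.event.page_url;

    // Fetch the raw HTML and write it to /tmp so the Scrapfly action can read it
    const html = await axios($, { url });
    const path = "/tmp/page.html";
    fs.writeFileSync(path, typeof html === "string" ? html : JSON.stringify(html));

    // The Extraction Prompt must be URL encoded, e.g.:
    const prompt = encodeURIComponent("Extract the product name and price as JSON");

    // Use `path` for the Body prop, "text/html" for the Content Type, and `prompt`
    // for the Extraction Prompt in the Scrapfly step that follows.
    return { path, prompt };
  },
});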

Action Authentication

Scrapfly uses API keys for authentication. When you connect your Scrapfly account, Pipedream securely stores the keys so you can easily authenticate to Scrapfly APIs in both code and no-code steps.

About Scrapfly

Scrapfly is a web scraping API for developers.

More Ways to Connect Scrapfly + MySQL

AI Data Extraction with Scrapfly API on New Column from MySQL API
AI Data Extraction with Scrapfly API on New or Updated Row from MySQL API
AI Data Extraction with Scrapfly API on New Row from MySQL API
AI Data Extraction with Scrapfly API on New Table from MySQL API
Retrieve Scrapfly Account Info with Scrapfly API on New Column from MySQL API

New Column from the MySQL API: Emit new event when you add a new column to a table.
New or Updated Row from the MySQL API: Emit new event when you add or modify a row in a table.
New Row from the MySQL API: Emit new event when you add a new row to a table.
New Row (Custom Query) from the MySQL API: Emit new event when new rows are returned from a custom query.
New Table from the MySQL API: Emit new event when a new table is added to a database.

Execute SQL Query with the MySQL API: Execute a custom MySQL query.
Create Row with the MySQL API: Add a new row to a table.
Delete Row with the MySQL API: Delete an existing row.
Execute Query with the MySQL API: Find row(s) via a custom query.
Execute Stored Procedure with the MySQL API: Execute a stored procedure.
