MySQL is an open-source relational database management system.
Emit new event when a row is added or modified in a table.
Emit new event when new rows are returned from a custom query.
Emit new event when a new table is added to a database.
Retrieve the output and metadata of a single Databricks task run.
The MySQL application on Pipedream enables direct interaction with your MySQL databases, allowing you to perform CRUD operations—create, read, update, delete—on your data with ease. You can leverage these capabilities to automate data synchronization, report generation, and event-based triggers that kick off workflows in other apps. With Pipedream's serverless platform, you can connect MySQL to hundreds of other services without managing infrastructure, crafting complex code, or handling authentication.
import mysql from '@pipedream/mysql';

export default defineComponent({
  props: {
    mysql,
  },
  async run({steps, $}) {
    // Component source code:
    // https://github.com/PipedreamHQ/pipedream/tree/master/components/mysql
    const queryObj = {
      sql: "SELECT NOW()",
      values: [], // Ignored since query does not contain placeholders
    };
    return await this.mysql.executeQuery(queryObj);
  },
});
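The same executeQuery method accepts placeholder values, which keeps user-supplied input out of the SQL string. The following is a minimal sketch of a parameterized read: the users table and status column are hypothetical names used only for illustration, not part of your schema.

import mysql from '@pipedream/mysql';

export default defineComponent({
  props: {
    mysql,
  },
  async run({steps, $}) {
    // Hypothetical parameterized query: "users" and "status" are placeholder
    // names; substitute a table and column that exist in your database.
    const queryObj = {
      sql: "SELECT id, email FROM users WHERE status = ?",
      values: ["active"], // Bound to the ? placeholder by the MySQL driver
    };
    return await this.mysql.executeQuery(queryObj);
  },
});

The same shape works for inserts and updates; only the SQL text and the bound values change.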
The Databricks API allows you to interact programmatically with Databricks services, enabling you to manage clusters, jobs, notebooks, and other resources within Databricks environments. Through Pipedream, you can leverage these APIs to create powerful automations and integrate with other apps for enhanced data processing, transformation, and analytics workflows. This unlocks possibilities like automating cluster management, dynamically running jobs based on external triggers, and orchestrating complex data pipelines with ease.
import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
  },
  async run({steps, $}) {
    return await axios($, {
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.0/clusters/list`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
    })
  },
})
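The same request pattern extends to other Databricks REST endpoints. As a sketch of the "run jobs based on external triggers" idea above, the component below POSTs to the Jobs API run-now endpoint; the jobId prop is a hypothetical placeholder for a real job in your workspace, and the 2.1 Jobs API path is assumed here.

import { axios } from "@pipedream/platform"

export default defineComponent({
  props: {
    databricks: {
      type: "app",
      app: "databricks",
    },
    // Hypothetical prop: the ID of an existing Databricks job to trigger
    jobId: {
      type: "integer",
      label: "Job ID",
    },
  },
  async run({steps, $}) {
    // Assumes the Jobs API 2.1 run-now endpoint; adjust the path if your
    // workspace exposes a different API version.
    return await axios($, {
      method: "POST",
      url: `https://${this.databricks.$auth.domain}.cloud.databricks.com/api/2.1/jobs/run-now`,
      headers: {
        Authorization: `Bearer ${this.databricks.$auth.access_token}`,
      },
      data: {
        job_id: this.jobId,
      },
    })
  },
})

A similar GET request against the Jobs API runs/get-output endpoint would retrieve the output and metadata of a single task run, matching the action listed at the top of this page.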