DigitalOcean Spaces

Highly scalable and affordable object storage.

Integrate the DigitalOcean Spaces API with the Snowflake API

Set up the DigitalOcean Spaces API trigger to run a workflow that integrates with the Snowflake API. Pipedream's integration platform allows you to integrate DigitalOcean Spaces and Snowflake remarkably fast. Free for developers.

Delete Files with DigitalOcean Spaces API on New Row from Snowflake API
List Files with DigitalOcean Spaces API on New Row from Snowflake API
Upload File /tmp with DigitalOcean Spaces API on New Row from Snowflake API
Upload File Base64 with DigitalOcean Spaces API on New Row from Snowflake API
Upload File URL with DigitalOcean Spaces API on New Row from Snowflake API
New Row from the Snowflake API
Emit new event when a row is added to a table

File Deleted from the DigitalOcean Spaces API
Emit new event when a file is deleted from a DigitalOcean Spaces bucket

New Query Results from the Snowflake API
Run a SQL query on a schedule, triggering a workflow for each row of results

Failed Task in Schema from the Snowflake API
Emit new events when a task fails in a database schema

New File Uploaded from the DigitalOcean Spaces API
Emit new event when a file is uploaded to a DigitalOcean Spaces bucket
Delete Files with the DigitalOcean Spaces API
Delete files in a bucket. See the docs.

List Files with the DigitalOcean Spaces API
List files in a bucket. See the docs.

Insert Multiple Rows with the Snowflake API
Insert multiple rows into a table

Upload File /tmp with the DigitalOcean Spaces API
Accepts a file path starting from /tmp, then uploads it as a file to DigitalOcean Spaces. See the docs.

Insert Single Row with the Snowflake API
Insert a row into a table

Overview of DigitalOcean Spaces

The DigitalOcean Spaces API lets you manage object storage for storing and serving large amounts of data. It is well suited to backups, archiving, and providing public access to data or assets. On Pipedream, you can use this API to automate file operations like uploads, downloads, and deletions, as well as manage permissions and metadata, and you can integrate it with other services for end-to-end workflow automation.
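
For illustration, here is a minimal sketch of one of those file operations: an upload followed by a delete. It assumes the same digitalocean_spaces app prop and auth fields used in the Connect snippet below; the bucket name and object key are placeholders.

import { S3, PutObjectCommand, DeleteObjectCommand } from "@aws-sdk/client-s3";

export default defineComponent({
  props: {
    digitalocean_spaces: {
      type: "app",
      app: "digitalocean_spaces"
    }
  },
  async run({ steps, $ }) {
    const s3Client = new S3({
      forcePathStyle: false,
      endpoint: `https://${this.digitalocean_spaces.$auth.region}.digitaloceanspaces.com`,
      region: "us-east-1", // Required by the AWS SDK; the endpoint above routes the request
      credentials: {
        accessKeyId: this.digitalocean_spaces.$auth.key,
        secretAccessKey: this.digitalocean_spaces.$auth.secret
      }
    });

    // Upload a small text object to a placeholder bucket ("my-space"), then delete it
    const bucket = "my-space";
    const key = "examples/hello.txt";
    await s3Client.send(new PutObjectCommand({
      Bucket: bucket,
      Key: key,
      Body: "Hello from Pipedream",
      ACL: "private"
    }));
    return s3Client.send(new DeleteObjectCommand({ Bucket: bucket, Key: key }));
  },
})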

Connect DigitalOcean Spaces

import { S3, ListBucketsCommand } from "@aws-sdk/client-s3";

export default defineComponent({
  props: {
    digitalocean_spaces: {
      type: "app",
      app: "digitalocean_spaces"
    }
  },
  async run({ steps, $ }) {
    const s3Client = new S3({
      forcePathStyle: false, // Use the subdomain/virtual-hosted calling format
      endpoint: `https://${this.digitalocean_spaces.$auth.region}.digitaloceanspaces.com`,
      region: "us-east-1", // Required by the AWS SDK; the endpoint above routes the request
      credentials: {
        accessKeyId: this.digitalocean_spaces.$auth.key,
        secretAccessKey: this.digitalocean_spaces.$auth.secret
      }
    });

    // List the Spaces (buckets) this key can access
    const data = await s3Client.send(new ListBucketsCommand({}));
    return data.Buckets;
  },
})

Overview of Snowflake

Snowflake offers a cloud database and related tools to help developers create robust, secure, and scalable data warehouses. See Snowflake's Key Concepts & Architecture.

Getting Started

1. Create a user, role and warehouse in Snowflake

Snowflake recommends that you create a new user, role, and warehouse when you integrate a third-party tool like Pipedream. This way, you can control permissions via the user and role, and keep Pipedream's compute and costs separate with a dedicated warehouse. You can do this directly in the Snowflake UI.

We recommend you create a read-only account if you only need to query Snowflake. If you need to insert data into Snowflake, add permissions on the appropriate objects after you create your user.
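
If you prefer to script that setup rather than click through the UI, the sketch below runs the equivalent SQL through the Snowflake Node.js driver. It is only an illustration: it assumes an administrative role (ACCOUNTADMIN here), and PIPEDREAM_ROLE, PIPEDREAM_WH, PIPEDREAM_USER, MY_DB, the password, and the SNOWFLAKE_* environment variables are all placeholders to replace. You can paste the same statements into a Snowflake worksheet instead.

// setup.mjs -- run with Node 16+ as an ES module (node setup.mjs)
import { promisify } from "util";
import snowflake from "snowflake-sdk";

const connection = snowflake.createConnection({
  account: process.env.SNOWFLAKE_ACCOUNT,
  username: process.env.SNOWFLAKE_ADMIN_USER,
  password: process.env.SNOWFLAKE_ADMIN_PASSWORD,
  role: "ACCOUNTADMIN", // or any role allowed to create users, roles, and warehouses
});
await promisify(connection.connect).call(connection);

// Promise wrapper around the driver's callback-style execute()
const execute = (sqlText) =>
  new Promise((resolve, reject) => {
    connection.execute({
      sqlText,
      complete: (err, _stmt, rows) => (err ? reject(err) : resolve(rows)),
    });
  });

// Dedicated role, warehouse, and user for Pipedream, plus read-only grants
// on one example database (MY_DB). Add write grants only if you need them.
const statements = [
  `CREATE ROLE IF NOT EXISTS PIPEDREAM_ROLE`,
  `CREATE WAREHOUSE IF NOT EXISTS PIPEDREAM_WH WAREHOUSE_SIZE = XSMALL AUTO_SUSPEND = 60`,
  `CREATE USER IF NOT EXISTS PIPEDREAM_USER
     PASSWORD = 'replace-with-a-strong-password'
     DEFAULT_ROLE = PIPEDREAM_ROLE
     DEFAULT_WAREHOUSE = PIPEDREAM_WH`,
  `GRANT ROLE PIPEDREAM_ROLE TO USER PIPEDREAM_USER`,
  `GRANT USAGE ON WAREHOUSE PIPEDREAM_WH TO ROLE PIPEDREAM_ROLE`,
  `GRANT USAGE ON DATABASE MY_DB TO ROLE PIPEDREAM_ROLE`,
  `GRANT USAGE ON ALL SCHEMAS IN DATABASE MY_DB TO ROLE PIPEDREAM_ROLE`,
  `GRANT SELECT ON ALL TABLES IN DATABASE MY_DB TO ROLE PIPEDREAM_ROLE`,
];
for (const sqlText of statements) {
  await execute(sqlText);
}
await promisify(connection.destroy).call(connection);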

2. Enter those details in Pipedream

Visit https://pipedream.com/accounts. Click the button to Connect an App. Enter the required Snowflake account data.

You'll only need to connect your account once in Pipedream. You can connect this account to multiple workflows to run queries against Snowflake, insert data, and more.

3. Build your first workflow

Visit https://pipedream.com/new to build your first workflow. Pipedream workflows let you connect Snowflake with 1,000+ other apps. You can trigger workflows on Snowflake queries, sending results to Slack, Google Sheets, or any app that exposes an API. Or you can accept data from another app, transform it with Python, Node.js, Go or Bash code, and insert it into Snowflake.

Learn more at Pipedream University.

Connect Snowflake

import { promisify } from 'util'
import snowflake from 'snowflake-sdk'

export default defineComponent({
  props: {
    snowflake: {
      type: "app",
      app: "snowflake",
    }
  },
  async run({ steps, $ }) {
    const connection = snowflake.createConnection({
      ...this.snowflake.$auth,
      application: "PIPEDREAM_PIPEDREAM",
    })
    // Bind so `this` still refers to the connection once connect() is promisified
    const connectAsync = promisify(connection.connect).bind(connection)
    await connectAsync()

    // Wrap the callback-style connection.execute() in a Promise so it can be awaited
    async function connExecuteAsync(options) {
      return new Promise((resolve, reject) => {
        connection.execute({
          ...options,
          complete: function(err, stmt, rows) {
            if (err) {
              reject(err)
            } else {
              resolve({stmt, rows})
            }
          }
        })
      })
    }

    // See https://docs.snowflake.com/en/user-guide/nodejs-driver-use.html#executing-statements
    const { rows } = await connExecuteAsync({
      sqlText: `SELECT CURRENT_TIMESTAMP()`,
    })
    return rows
  },
})
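
As a follow-up usage sketch, the same promise wrapper can run a parameterized statement with bind variables. The table MY_DB.PUBLIC.EVENTS, its columns, and the bind values below are placeholders; adjust them to match your schema.

import { promisify } from "util";
import snowflake from "snowflake-sdk";

export default defineComponent({
  props: {
    snowflake: {
      type: "app",
      app: "snowflake",
    },
  },
  async run({ steps, $ }) {
    const connection = snowflake.createConnection({
      ...this.snowflake.$auth,
      application: "PIPEDREAM_PIPEDREAM",
    });
    await promisify(connection.connect).call(connection);

    // Same promise wrapper as in the snippet above
    const execute = (options) =>
      new Promise((resolve, reject) => {
        connection.execute({
          ...options,
          complete: (err, stmt, rows) => (err ? reject(err) : resolve({ stmt, rows })),
        });
      });

    // Parameterized INSERT; the table, columns, and bind values are placeholders
    const { rows } = await execute({
      sqlText: `INSERT INTO MY_DB.PUBLIC.EVENTS (ID, MESSAGE) VALUES (?, ?)`,
      binds: [1, "hello from Pipedream"],
    });
    return rows; // e.g. [{ "number of rows inserted": 1 }]
  },
});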