Is it Possible to Set a Workflow Trigger for New Files in a Watch Folder on a Local Server?

This topic was automatically generated from Slack. You can find the original thread here.

Hi, do you know if it is possible to set a workflow trigger when new files are added to a watch folder on my local server?

Yes, it is possible to trigger a workflow when new files are added to a watch folder on your local server. However, Pipedream does not natively support local file system triggers, so you can achieve this by combining a local script with an HTTP trigger in Pipedream. Here's a step-by-step guide on how to set this up:

1. **Create an HTTP trigger in Pipedream:**
   - Go to the Pipedream UI and create a new workflow.
   - Add an HTTP trigger to your workflow. This will give you a unique URL where you can send HTTP requests (a sketch of reading the posted payload in the workflow follows these steps).
2. **Set up a local script to watch your folder:**
   You can use a Node.js script with the `chokidar` library to watch for new files in a specific folder. When a new file is detected, the script will send an HTTP request to the Pipedream trigger URL. Here's an example Node.js script:

   ```javascript
   const chokidar = require('chokidar');
   const axios = require('axios');

   // Replace with your Pipedream HTTP trigger URL
   const PIPEDREAM_TRIGGER_URL = 'https://your-pipedream-trigger-url';

   // Watch the folder for new files
   const watcher = chokidar.watch('/path/to/your/folder', {
     ignored: /(^|[\/\\])\../, // ignore dotfiles
     persistent: true
   });

   watcher.on('add', async (path) => {
     console.log(`File ${path} has been added`);
     try {
       await axios.post(PIPEDREAM_TRIGGER_URL, { filePath: path });
       console.log('Notification sent to Pipedream');
     } catch (error) {
       console.error('Error sending notification to Pipedream:', error);
     }
   });

   console.log('Watching for new files...');
   ```

3. **Run the script on your local server:**
   - Make sure you have Node.js installed on your server.
   - Install the `chokidar` and `axios` libraries using npm:

     ```bash
     npm install chokidar axios
     ```

   - Save the script to a file (e.g., `watch-folder.js`) and run it:

     ```bash
     node watch-folder.js
     ```
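On the Pipedream side, a later code step can then read the payload the watcher posts. A minimal Python sketch, assuming the default HTTP trigger and that the body key is the `filePath` sent by `watch-folder.js`:

```python
def handler(pd: "pipedream"):
    # The watcher posts {"filePath": "..."} as JSON; the HTTP trigger
    # exposes the parsed request body on the trigger event
    file_path = pd.steps["trigger"]["event"]["body"]["filePath"]
    print(f"New file on the local server: {file_path}")
    return file_path
```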

Would I also need to define a service to start this Node script on boot? And something that supports `service status file-watcher`, for example?

To help you better, I need a bit more information about your setup and requirements. Specifically:

1. Are you looking to run a Node.js script on a server or a local machine?
2. Which operating system are you using (e.g., Linux, Windows, macOS)?
3. Are you asking about integrating this with Pipedream, or is this a general question about running a Node.js script as a service?

Once I have more details, I can provide a more accurate and helpful response.
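In the meantime, assuming the answer is a Linux server with systemd (one of the possibilities above), a minimal unit file sketch could look like the following — the unit name, paths, and user are placeholders:

```ini
# /etc/systemd/system/file-watcher.service (hypothetical name and paths)
[Unit]
Description=Watch local folder for new files and notify Pipedream
After=network-online.target

[Service]
ExecStart=/usr/bin/node /opt/file-watcher/watch-folder.js
WorkingDirectory=/opt/file-watcher
Restart=on-failure
User=youruser

[Install]
WantedBy=multi-user.target
```

You could then enable it at boot with `sudo systemctl enable --now file-watcher` and check it with `systemctl status file-watcher`.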

Hi thanks (and Pi :wink:) → @U05A7KLNALC in my case the files being added to the watch folder are invoices, and these files (PDFs) need to be processed.
→ Right now I run a manual CLI command to copy these PDFs over to a Google Drive folder, which then triggers my Pipedream workflow: each file is downloaded from the Google Drive folder, parsed using Python code in Pipedream, and then uploaded to Airtable through its API, and I receive a notification in Slack when the job's done.
→ I would like to skip the manual command and trigger automatically… but keep the copying over from the local server to the Google Drive folder, because I found that if I copy over existing files (duplicates) they won't trigger the rest of the workflow… so thinking about it, I just need to adapt the local script to copy the files with scp rather than send an HTTP request, right?

To automate your workflow and skip the manual command, you can indeed adapt your local script to copy the files with `scp` rather than sending an HTTP request. Here's a step-by-step approach to achieve this:

1. **Automate the file copy with SCP:**
   - Modify your local script to use `scp` to copy the files from your local server to the Google Drive folder.
   - Ensure that the files are copied with unique names to avoid duplicates, which will help in triggering the rest of the workflow.
2. **Trigger the Pipedream workflow:**
   - Your Pipedream workflow will be triggered when new files are added to the Google Drive folder.
   - The workflow will download the files, parse them using Python code, upload the data to Airtable, and send a notification to Slack.

### Example SCP Command in Local Script

Here's an example of how you can use `scp` in your local script to copy files to the Google Drive folder:

```bash
#!/bin/bash

# Define variables
LOCAL_FOLDER="/path/to/local/folder"
REMOTE_FOLDER="user@remote:/path/to/google/drive/folder"
TIMESTAMP=$(date +%s)

# Copy files with unique names to avoid duplicates
for file in "$LOCAL_FOLDER"/*.pdf; do
  scp "$file" "$REMOTE_FOLDER/$(basename "$file" .pdf)_$TIMESTAMP.pdf"
done
```

### Pipedream Workflow

1. **Google Drive Trigger:** Use the Google Drive trigger to watch for new files in the specified folder.
2. **Python Code Step:** Parse the downloaded PDF files using Python code.
3. **Airtable Action:** Upload the parsed data to Airtable using its API (a sketch follows at the end of this post).
4. **Slack Notification:** Send a notification to Slack when the job is done.

### Example Pipedream Workflow Steps

#### Google Drive Trigger

1. Search for the Google Drive app from the trigger menu.
2. Select the "New File in Folder" trigger.
3. Connect your Google Drive account.
4. Follow the instructions on the trigger configuration screen to specify the folder to watch.

#### Python Code Step

```python
def handler(pd: "pipedream"):
    import PyPDF2

    # Reference the file from the trigger step
    file_path = pd.steps["trigger"]["event"]["path"]

    # Parse the PDF file and collect its text
    with open(file_path, "rb") as f:
        reader = PyPDF2.PdfReader(f)
        text = "".join(page.extract_text() or "" for page in reader.pages)

    # Return the extracted text for downstream steps
    return text
```
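For the Airtable action, you can use Pipedream's prebuilt Airtable actions, or call the Airtable REST API directly from a code step. A minimal Python sketch of the direct approach, assuming the parse step is named `parse_pdf` and returns a dict of field values, and that your API key is stored in a Pipedream environment variable named `AIRTABLE_API_KEY` (all of these names are placeholders):

```python
import os
import requests

def handler(pd: "pipedream"):
    # Hypothetical: the parsed invoice fields returned by the previous step;
    # keys must match your Airtable column names
    fields = pd.steps["parse_pdf"]["$return_value"]

    base_id = "appXXXXXXXXXXXXXX"  # your Airtable base ID (placeholder)
    table_name = "Invoices"        # hypothetical table name

    # Create a record via the Airtable REST API
    resp = requests.post(
        f"https://api.airtable.com/v0/{base_id}/{table_name}",
        headers={
            "Authorization": f"Bearer {os.environ['AIRTABLE_API_KEY']}",
            "Content-Type": "application/json",
        },
        json={"fields": fields},
    )
    resp.raise_for_status()
    return resp.json()
```

For the Slack notification, the prebuilt Slack "Send Message to a Public Channel" action is usually the simplest option — no code required.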