
New Deleted S3 File from AWS API

Pipedream makes it easy to connect the AWS API with 1,000+ other apps, remarkably fast.

Trusted by 250,000+ developers from startups to Fortune 500 companies:


Getting Started

Trigger a workflow on New Deleted S3 File with the AWS API. When you configure and deploy the workflow, it will run on Pipedream's servers 24x7 for free.

  1. Configure the New Deleted S3 File trigger
    1. Connect your AWS account
    2. Select an AWS Region
    3. Select an S3 Bucket Name
    4. Configure Ignore Delete Markers
  2. Add steps to connect to 1000+ APIs using code and no-code building blocks
  3. Deploy the workflow
  4. Send a test event to validate your setup
  5. Turn on the trigger
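Once deployed, the trigger emits one event per S3 ObjectRemoved notification. A downstream code step can pull the bucket and key out of a record; the helper below is a sketch (the sample record is illustrative, but the field names follow AWS's S3 event notification format, where object keys arrive URL-encoded with "+" for spaces):

```javascript
// Sketch: extract details from one S3 "ObjectRemoved" notification record.
// The sample record below is illustrative, not captured from a live bucket.
function summarizeDeletion(record) {
  const bucket = record.s3.bucket.name;
  // Keys in S3 event notifications are URL-encoded; decode before use.
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
  return {
    bucket,
    key,
    eventName: record.eventName,
    ts: Date.parse(record.eventTime),
  };
}

const sample = {
  eventName: "ObjectRemoved:Delete",
  eventTime: "2023-01-15T12:00:00.000Z",
  s3: {
    bucket: { name: "my-bucket" },
    object: { key: "reports/2023/q1+summary.csv" },
  },
};

console.log(summarizeDeletion(sample).key); // "reports/2023/q1 summary.csv"
```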

Integrations

Send Message with Discord Webhook API on New Deleted S3 File from AWS API
AWS + Discord Webhook
Add Multiple Rows with Google Sheets API on New Deleted S3 File from AWS API
AWS + Google Sheets
Get Film with SWAPI - Star Wars API on New Deleted S3 File from AWS API
AWS + SWAPI - Star Wars
Create Multiple Records with Airtable API on New Deleted S3 File from AWS API
AWS + Airtable
Send any HTTP Request with HTTP / Webhook API on New Deleted S3 File from AWS API
AWS + HTTP / Webhook

Details

This is a pre-built, open source component from Pipedream's GitHub repo. The component is developed by Pipedream and the community, and verified and maintained by Pipedream.

To contribute an update to an existing component or create a new component, create a PR on GitHub. If you're new to Pipedream component development, you can start with the quickstarts for trigger and action development, and then review the component API reference.

New Deleted S3 File on AWS
Description: Emit new event when a file is deleted from an S3 bucket
Version: 0.0.1
Key: aws-s3-deleted-file

Code

import base from "../common/s3.mjs";

export default {
  ...base,
  type: "source",
  key: "aws-s3-deleted-file",
  name: "New Deleted S3 File",
  description: "Emit new event when a file is deleted from a S3 bucket",
  version: "0.0.1",
  dedupe: "unique",
  props: {
    ...base.props,
    ignoreDeleteMarkers: {
      type: "boolean",
      label: "Ignore Delete Markers",
      description: "When ignoring delete markers this will only emit events for permanently deleted files",
      default: false,
    },
  },
  methods: {
    ...base.methods,
    getEvents() {
      // Subscribe only to permanent deletions when delete markers are
      // ignored; otherwise subscribe to all ObjectRemoved events, which
      // on versioned buckets include delete-marker creation.
      return [
        this.ignoreDeleteMarkers
          ? "s3:ObjectRemoved:Delete"
          : "s3:ObjectRemoved:*",
      ];
    },
    generateMeta(data) {
      // Use the S3 request ID as the dedupe ID and the event time as the
      // emitted timestamp.
      const { "x-amz-request-id": id } = data.responseElements;
      const { key } = data.s3.object;
      const { eventTime: isoTimestamp } = data;
      return {
        id,
        summary: `Deleted file: '${key}'`,
        ts: Date.parse(isoTimestamp),
      };
    },
  },
};

Configuration

This component may be configured based on the props defined in the component code. Pipedream automatically prompts for input values in the UI and CLI.
Label | Prop | Type | Description
AWS | aws | app | This component uses the AWS app.
N/A | db | $.service.db | This component uses $.service.db to maintain state between component invocations.
N/A | http | $.interface.http | This component uses $.interface.http to generate a unique URL when the component is first instantiated. Each request to the URL will trigger the run() method of the component.
AWS Region | region | string | Select a value from the drop down menu.
S3 Bucket Name | bucket | string | Select a value from the drop down menu.
Ignore Delete Markers | ignoreDeleteMarkers | boolean | When ignoring delete markers this will only emit events for permanently deleted files
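The Ignore Delete Markers prop maps directly to which S3 notification types the component subscribes to. A minimal sketch of that mapping, mirroring the component's getEvents() above:

```javascript
// Sketch mirroring the component's getEvents(): with Ignore Delete Markers
// enabled, only permanent deletions are subscribed to; otherwise all
// ObjectRemoved events are, including delete-marker creation on
// versioned buckets.
function eventTypesFor(ignoreDeleteMarkers) {
  return [
    ignoreDeleteMarkers
      ? "s3:ObjectRemoved:Delete"
      : "s3:ObjectRemoved:*",
  ];
}

const permanentOnly = eventTypesFor(true);  // ["s3:ObjectRemoved:Delete"]
const allRemovals = eventTypesFor(false);   // ["s3:ObjectRemoved:*"]
console.log(permanentOnly, allRemovals);
```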

Authentication

AWS uses API keys for authentication. When you connect your AWS account, Pipedream securely stores the keys so you can easily authenticate to AWS APIs in both code and no-code steps.

Follow the AWS Instructions for creating an IAM user with an associated access and secret key.

As a best practice, attach the minimum set of IAM permissions necessary to perform the specific task in Pipedream. If your workflow only needs to perform a single API call, you should create a user and associate an IAM group / policy with permission to do only that task. You can create as many linked AWS accounts in Pipedream as you'd like.

Enter your access and secret key below.
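For code steps that call AWS directly, the stored key pair is what an AWS SDK v3 client expects in its credentials option. The helper below sketches assembling that client config; the auth object shape ({ accessKeyId, secretAccessKey }) and the sample values are illustrative placeholders, and the exact accessor for the connected account in a Pipedream step (e.g. this.aws.$auth) should be checked against Pipedream's docs:

```javascript
// Sketch: assemble the config object an AWS SDK v3 client constructor
// (e.g. new S3Client(config)) accepts, from a stored access/secret key
// pair. The values used below are illustrative placeholders, not real
// credentials.
function clientConfigFor(auth, region) {
  return {
    region,
    credentials: {
      accessKeyId: auth.accessKeyId,
      secretAccessKey: auth.secretAccessKey,
    },
  };
}

const config = clientConfigFor(
  { accessKeyId: "AKIAEXAMPLE", secretAccessKey: "exampleSecret" },
  "us-east-1",
);
console.log(config.region); // "us-east-1"
```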

About AWS

On-demand cloud computing platforms and APIs

More Ways to Use AWS

Triggers

New Scheduled Tasks from the AWS API

Creates a Step Function State Machine to publish a message to an SNS topic at a specific timestamp. The SNS topic delivers the message to this Pipedream source, and the source emits it as a new event.

 
New SNS Messages from the AWS API

Creates an SNS topic in your AWS account. Messages published to this topic are emitted from the Pipedream source.

 
New Inbound SES Emails from the AWS API

The source subscribes to all emails delivered to a specific domain configured in AWS SES. When an email is sent to any address at the domain, this event source emits that email as a formatted event. These events can trigger a Pipedream workflow and can be consumed via SSE or REST API.

 
New Records Returned by CloudWatch Logs Insights Query from the AWS API

Executes a CloudWatch Logs Insights query on a schedule, and emits the records as individual events (default) or in batches

 
New Restored S3 File from the AWS API

Emit new event when a file is restored into an S3 bucket

 

Actions

CloudWatch Logs - Put Log Event with the AWS API

Uploads a log event to the specified log stream. See docs

 
DynamoDB - Create Table with the AWS API

Creates a new table in your account. See docs

 
DynamoDB - Execute Statement with the AWS API

This operation allows you to perform transactional reads or writes on data stored in DynamoDB, using PartiQL. See docs

 
DynamoDB - Get Item with the AWS API

The Get Item operation returns a set of attributes for the item with the given primary key. If there is no matching item, Get Item does not return any data and there will be no Item element in the response. See docs

 
DynamoDB - Put Item with the AWS API

Creates a new item, or replaces an old item with a new item. If an item that has the same primary key as the new item already exists in the specified table, the new item completely replaces the existing item. See docs

 