Using Pipedream to go from Postmark to MongoDB

Hello!
I'm currently using Pipedream to move data from a Postmark webhook POST to an S3 bucket. This works well, but parsing the .gz data from the S3 bucket is becoming more of a pain than expected. Is it possible to take the data from Postmark's webhooks and put it directly into a MongoDB collection? Or would this data still end up as .gz data in Mongo?
Would appreciate any advice here, thanks!

Hi @kunalrye

First off, welcome to the Pipedream community. Happy to have you!

What a coincidence, I too use Postmark and MongoDB, big fans of both services.

I’m not sure which Postmark webhook you’re subscribing to, but I assume it’s providing JSON data to your workflow to start, then you’re creating a file on S3.

As long as the initial data from Postmark is in JSON format, you can use our pre-built MongoDB action to insert that data into a collection.

Here’s an example of how to use this action: https://pipedream.com/apps/mongodb
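To make the shape of this concrete, here's a minimal sketch of how you might normalize an incoming Postmark webhook body into documents ready for an `insertMany`. This is a hypothetical helper, not a Pipedream built-in; Postmark delivers one JSON object per event (or an array when batched), and the field names follow Postmark's payloads.

```javascript
// Hypothetical helper: turn a Postmark webhook body (single event or batch)
// into an array of documents suitable for MongoDB's insertMany.
function toDocuments(body) {
  const events = Array.isArray(body) ? body : [body];
  return events.map((event) => ({
    ...event,
    receivedAt: new Date().toISOString(), // our own ingestion timestamp
  }));
}

// In a Pipedream Node.js step you would then do something along the lines of:
//   await db.collection("postmark_events").insertMany(toDocuments(steps.trigger.event.body));
const docs = toDocuments({ RecordType: "Open", MessageID: "abc-123" });
console.log(docs.length); // 1
```

The spread keeps the original Postmark fields intact, so each webhook event becomes one queryable document rather than a line buried in a compressed file.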

Thanks!

I am subscribing to the delivered, opened, and clicked webhooks. Currently these are being sent to an S3 bucket, where they end up as JSON data inside .gz files since they are batched. Would I end up with the same .gz files posting them to Mongo? Or, since I'm inserting into a collection, would they go from JSON to documents via something like the insertMany functionality?

Nope, MongoDB is a database whereas S3 is for file storage.

You’ll be able to run MongoDB queries against your stored webhook data instead of loading the files into memory and running JS to filter them.
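For illustration, here's the kind of query shape you'd get. The collection name is hypothetical, but `RecordType` and `Recipient` are real fields in Postmark's webhook payloads; the in-memory filter below just mimics what `collection.find(query)` would match server-side.

```javascript
// A MongoDB filter for all "Open" events for one recipient.
const query = { RecordType: "Open", Recipient: "user@example.com" };

// In a real workflow step, against a live collection, this would be:
//   const opens = await db.collection("postmark_events").find(query).toArray();
// Here we mimic the same match against a small in-memory sample:
const sample = [
  { RecordType: "Delivery", Recipient: "user@example.com" },
  { RecordType: "Open", Recipient: "user@example.com" },
  { RecordType: "Open", Recipient: "other@example.com" },
];
const matches = sample.filter((doc) =>
  Object.entries(query).every(([key, value]) => doc[key] === value)
);
console.log(matches.length); // 1
```

The point is that the database does the filtering for you; with the S3 approach, every one of those .gz files would have to be downloaded, decompressed, and scanned in your own code first.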


Great, this makes more sense than posting to an S3 bucket for my use case. Appreciate the insight!