Data stores are Pipedream’s built-in key-value store. Data stores are useful for:
- Storing and retrieving data at a specific key
- Setting automatic expiration times for temporary data (TTL)
- Counting or summing values over time
- Retrieving JSON-serializable data across workflow executions
- Caching and rate limiting
- Any other case where you’d use a key-value store
You can connect to the same data store across workflows, so they’re also great for sharing state across different services. You can use pre-built, no-code actions to store, update, and clear data, or interact with data stores programmatically in Node.js or Python.
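As an illustration of the caching use case, here is a minimal cache-aside sketch in a Node.js code step. It assumes the `data_store` prop API, where `get` returns `undefined` for missing keys and `set` accepts an optional `ttl` (in seconds); the cached payload is an illustrative stand-in for your own expensive computation:

```javascript
export default defineComponent({
  props: {
    dataStore: { type: "data_store" },
  },
  async run({ steps, $ }) {
    const cacheKey = "expensive:result";
    // Return the cached value if one exists
    const cached = await this.dataStore.get(cacheKey);
    if (cached !== undefined) {
      return cached;
    }
    // Otherwise compute the value and cache it for five minutes
    const fresh = { computedAt: new Date().toISOString() };
    await this.dataStore.set(cacheKey, fresh, { ttl: 300 });
    return fresh;
  },
});
```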
Select the Add or update a single record pre-built action.
Configure the action:
- Select or create a Data Store - create a new data store or choose an existing one
- Key - the unique ID for this data that you’ll use for lookup later
- Value - the data to store at the specified key
- Time to Live (TTL) - (optional) the number of seconds until this record expires and is automatically deleted. Leave blank for records that should not expire.
For example, to store the timestamp when the workflow was initially triggered, set the Key to Triggered At and the Value to {{steps.trigger.context.ts}}. The Key must evaluate to a string. You can pass a static string, reference exports from a previous step, or use any valid expression.
Need to store multiple records in one action? Use the Add or update multiple records action instead.
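The same write can be made from a Node.js code step. A minimal sketch, assuming the `data_store` prop API from the Node.js docs, where the optional `ttl` option mirrors the action’s TTL field (in seconds):

```javascript
export default defineComponent({
  props: {
    dataStore: { type: "data_store" },
  },
  async run({ steps, $ }) {
    // Equivalent of the pre-built action above: store the trigger
    // timestamp under the key "Triggered At"
    await this.dataStore.set("Triggered At", steps.trigger.context.ts, {
      ttl: 86400, // optional: expire the record after 24 hours
    });
  },
});
```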
You can view the contents of your data stores at any time in the Pipedream Data Stores dashboard. You can also add, edit, or delete data store records manually from this view.
Click the pencil icon on the far right of the record you want to edit. This opens a text box where you can edit the record’s value. When you’re finished, save by clicking the checkmark icon.
You can delete a data store from this dashboard as well. On the far right in the data store row, click the trash can icon.
Deleting a data store is irreversible.
If the Delete option is greyed out and unclickable, you have workflows using the data store in a step. Click the > to the left of the data store’s name to expand the linked workflows.
Refer to the Node.js and Python data store docs to learn how to use data stores in code steps. You can get, set, delete and perform any other data store operations in code. You cannot use data stores in Bash or Go code steps.
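For reference, a minimal Node.js sketch that exercises the basic operations, assuming the `data_store` prop API (the `keys()` and `delete()` calls also appear in the export example below):

```javascript
export default defineComponent({
  props: {
    dataStore: { type: "data_store" },
  },
  async run({ steps, $ }) {
    // Write, read, list, and delete records
    await this.dataStore.set("lastRunAt", steps.trigger.context.ts);
    const lastRunAt = await this.dataStore.get("lastRunAt");
    const keys = await this.dataStore.keys();
    await this.dataStore.delete("lastRunAt");
    return { lastRunAt, keys };
  },
});
```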
Data saved in data stores is Brotli-compressed, minimizing storage. The total compression ratio depends on the data being compressed. To test this on your own data, run it through a package that supports Brotli compression and measure the size of the data before and after.
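For instance, a quick measurement in Node.js using the built-in zlib Brotli bindings; the payload here is an illustrative stand-in for your own data:

```javascript
import { brotliCompressSync } from "node:zlib";

// Compress a sample payload and compare sizes. The ratio you see will
// vary with how repetitive your own data is.
const sample = JSON.stringify(
  Array.from({ length: 100 }, (_, i) => ({ id: i, name: `user-${i}` }))
);

const compressed = brotliCompressSync(Buffer.from(sample));
console.log(`original:   ${Buffer.byteLength(sample)} bytes`);
console.log(`compressed: ${compressed.length} bytes`);
console.log(`ratio:      ${(Buffer.byteLength(sample) / compressed.length).toFixed(2)}x`);
```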
Data store operations are not atomic or transactional, which can lead to race conditions. To ensure atomic operations, be sure to limit access to a data store key to a single workflow with a single worker, or use one of our integrated apps that supports atomic operations.
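For example, a read-modify-write increment like the following (inside a Node.js code step’s run method) is not atomic: two concurrent executions can both read the same value, and one increment is lost.

```javascript
// NOT atomic: between get() and set(), another worker may update the key
const current = (await this.dataStore.get("counter")) ?? 0;
await this.dataStore.set("counter", current + 1);
```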
In order to stay within the data store limits, you may need to export the data in your data store to an external service. The following Node.js example action exports the data in chunks via an HTTP POST request. You may need to adapt the code to your needs. Click on this link to create a copy of the workflow in your workspace.
If the data contained in each key is large, consider lowering the chunkSize value.
Adjust your workflow memory and timeout settings according to the size of the data in your data store. Start with 512 MB of memory and a 60-second timeout, and increase if needed.
Monitor the exports of this step after each execution for any potential errors preventing a full export. Run the step as many times as needed until all your data is exported.
This action deletes the keys that were successfully exported. It is advisable to first run a test without deleting the keys, so your data remains safe if anything unforeseen goes wrong.
```javascript
import { axios } from "@pipedream/platform";

export default defineComponent({
  props: {
    dataStore: {
      type: "data_store",
    },
    chunkSize: {
      type: "integer",
      label: "Chunk Size",
      description: "The number of items to export in one request",
      default: 100,
    },
    shouldDeleteKeys: {
      type: "boolean",
      label: "Delete keys after export",
      description: "Whether the data store keys will be deleted after export",
      default: true,
    },
  },
  methods: {
    // Group the data store's [key, value] pairs into arrays of up to chunkSize items
    async *chunkAsyncIterator(asyncIterator, chunkSize) {
      let chunk = [];
      for await (const item of asyncIterator) {
        chunk.push(item);
        if (chunk.length === chunkSize) {
          yield chunk;
          chunk = [];
        }
      }
      if (chunk.length > 0) {
        yield chunk;
      }
    },
  },
  async run({ steps, $ }) {
    const iterator = this.chunkAsyncIterator(this.dataStore, this.chunkSize);
    for await (const chunk of iterator) {
      try {
        // export data to external service
        await axios($, {
          url: "https://external_service.com",
          method: "POST",
          data: chunk, // may need to add authentication
        });
        // delete exported keys and values
        if (this.shouldDeleteKeys) {
          await Promise.all(chunk.map(([key]) => this.dataStore.delete(key)));
        }
        console.log(
          `number of remaining keys: ${(await this.dataStore.keys()).length}`
        );
      } catch (e) {
        // an error occurred, don't delete keys
        console.log(`error exporting data: ${e}`);
      }
    }
  },
});
```