The Data Store was working fine while I was coding my workflow.
As soon as I switched to “production” mode (meaning 500 rows fetched and then updated per workflow execution), I started seeing errors.
Error
Error communicating with data store
DETAILS
at null.__node_internal_captureLargerStackTrace (internal/errors.js:412:5)
at null.__node_internal_exceptionWithHostPort (internal/errors.js:590:12)
at null.internalConnect (net.js:934:16)
at null.defaultTriggerAsyncIdScope (internal/async_hooks.js:452:18)
at GetAddrInfoReqWrap.emitLookup (net.js:1077:9)
at GetAddrInfoReqWrap.onlookup (dns.js:73:8)
Well… I fixed the initial problem, but now I have another use case that brought me back to this issue.
I would like to reset a field on all the objects I store (e.g. setting outdated: true on every one of them), but I could not find an efficient way to do so.
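For illustration, here is a minimal sketch of that “reset a field on every record” pattern. A plain dict stands in for the data store (the keys, field names, and helper function are made up for the example); the point is that a key/value store is read and written one key at a time, so resetting N records costs N reads plus N writes.

```python
# A plain dict stands in for the data store; a real key/value store
# is accessed key by key, so each record costs a separate read + write.
store = {
    "user:1": {"name": "Ada", "outdated": False},
    "user:2": {"name": "Grace", "outdated": False},
}

def mark_all_outdated(store):
    for key in list(store.keys()):                 # one listing of keys
        record = store[key]                        # one read per key
        store[key] = {**record, "outdated": True}  # one write per key

mark_all_outdated(store)
print(all(r["outdated"] for r in store.values()))  # True
```

With hundreds of records per execution, those per-key round trips add up, which is what makes this pattern feel inefficient on a key/value store.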
Data Stores are still a Preview feature, and we’re working to improve the guard rails so you won’t hit these limitations.
It’s intended to be a convenient key => value data store, not a full-fledged database. If you find yourself needing to iterate over a large number of records to perform updates, you may be exceeding the use case for Data Stores.
We have pre-built actions for popular databases such as Postgres, MySQL, MongoDB, and DynamoDB. Any of these can be connected to your Pipedream workflows, and each offers conditional update actions that might better fit your use case.
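As a sketch of why a relational store fits the bulk-reset use case better: the whole update is a single statement instead of one read and one write per key. This uses Python’s built-in sqlite3 as a stand-in for Postgres or MySQL, and the table and column names are hypothetical.

```python
# sqlite3 stands in for a hosted database here; the `records` table
# and `outdated` column are made-up names for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, outdated INTEGER)")
conn.executemany("INSERT INTO records (outdated) VALUES (?)", [(0,)] * 500)

# One statement flips the flag on all 500 rows in a single round trip.
conn.execute("UPDATE records SET outdated = 1")
conn.commit()

count = conn.execute(
    "SELECT COUNT(*) FROM records WHERE outdated = 1"
).fetchone()[0]
print(count)  # 500
```

The same single-statement shape (`UPDATE … SET …`, optionally with a `WHERE` clause for conditional updates) applies to any of the databases mentioned above.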