Is 1.7 Megs of Data Too Large for a Data Store if It Takes 20 Seconds to Execute?

This topic was automatically generated from Slack. You can find the original thread here.

Would 1.7 megs of data be a bit much for a data store? To be clear, it's working: I can save it and get it back. But I'm noticing that when I execute my workflow, which only returns the value from the cache, it takes 20 seconds, which seems like a long time to execute.

For one key or across the entire data store?

It is a Redis key-value store, so it is oriented around fast, in-memory retrieval of small values.

It was one key. I ended up translating the data into something a lot smaller and the issue went away, but it was definitely slowing things down for me.

Makes sense. That's not the intended use case for data stores.

File storage would allow persisting larger files across runs.

/tmp would also work for storage within a single run.
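As a minimal sketch of the /tmp approach, assuming a Python workflow step: one step serializes the large payload to a file under /tmp, and a later step in the same run reads it back. The filename and payload here are hypothetical stand-ins.

```python
import json

# Stand-in for a large intermediate result you don't want in the data store.
payload = {"rows": list(range(10_000))}

# Write it to /tmp in one step...
with open("/tmp/cache.json", "w") as f:
    json.dump(payload, f)

# ...and read it back in a later step of the SAME run.
with open("/tmp/cache.json") as f:
    cached = json.load(f)
```

Note that /tmp is ephemeral: it is not guaranteed to survive between runs, so it only covers the within-run case, as mentioned above.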

Using an actual database would be the approach if you need to store larger amounts of data per entry.

Self-service users are capped at 1 MB of data store capacity on the Advanced plan. Data stores are intended for small key-value pairs.
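Given that cap, one way to avoid the slowdown described above is to estimate a value's serialized size before writing it to the data store. This is an illustrative sketch, not a platform API; the 1 MB limit constant and helper name are assumptions for the example.

```python
import json

LIMIT_BYTES = 1 * 1024 * 1024  # illustrative 1 MB cap from the plan limit above


def fits_in_data_store(value) -> bool:
    """Estimate a value's JSON-serialized size and compare it to the cap."""
    size = len(json.dumps(value).encode("utf-8"))
    return size <= LIMIT_BYTES


small = {"user": "alice", "count": 3}          # well under the cap
large = {"blob": "x" * (2 * 1024 * 1024)}      # ~2 MB string, over the cap
```

If the check fails, fall back to file storage or a real database rather than cramming the value into a single key.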

I didn't even think about file storage. Yeah, I'll consider that next time.