Hi, I have a scenario that updates and deletes records in Airtable on a daily basis, for a chatbot I built on Airtable (so as not to exceed the 50,000-record limit per table).
The scenario works, but I get an error saying the workflow has run out of its default memory.
I understand that I need to increase the default memory.
I can limit the number of records updated each day, but I don't know how much memory each operation requires.
In a test run of the scenario, I got an error while updating 144 records, with the memory setting at 256MB.
Is there a way to track how much memory the scenario requires?
Is there a way to reduce the amount of memory the scenario requires?
I think this is more of an engineering problem than a Pipedream problem. Your memory usage increasing with your record count means that at some point you load all the records into your workflow (which stores those records in memory).
Is there a way to track how much memory the scenario requires?
This depends entirely on your workflow logic, so you can try running with 1,000, 10,000, or 50,000 records and measure how your workflow's memory usage increases.
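Pipedream doesn't expose a per-execution memory graph, but you can log Node's own memory counters from inside a code step while you process records. A minimal sketch, assuming the standard Pipedream Node.js code-step shape (`process.memoryUsage()` is a built-in Node API):

```typescript
// Minimal sketch: log Node's memory counters from inside a code step.
// process.memoryUsage() is a built-in Node API; rss roughly corresponds
// to what counts against the workflow's memory limit.
export default defineComponent({
  async run({ steps, $ }) {
    const mb = (n: number) => (n / 1024 / 1024).toFixed(1);
    const { rss, heapUsed } = process.memoryUsage();
    console.log(`rss: ${mb(rss)} MB, heapUsed: ${mb(heapUsed)} MB`);
  },
});
```

Calling this before and after the step that loads your records gives a rough per-batch cost you can extrapolate from.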
Is there a way to reduce the amount of memory the scenario requires?
If the Airtable API supports a batch update operation, you can change your code to use that; otherwise you might need to migrate your Airtable data to a platform that supports batch updates (a SQL database on Supabase, for example). This requires more development effort and time, but it will scale.
I think your code is not wrong, and it will work well while you have a small number of records, even though it loads all the records into memory (and also passes them between steps).
The problem your code will face is that its memory usage grows as your record count grows. So I would suggest not loading all of the records into your workflow, but using a batch update operation instead.
An example of a batch update operation in SQL looks like this:

```sql
UPDATE Messages SET BackupMessageGroup = 'Some value' WHERE createdAt > '2023-02-08';
```
With a batch update operation, instead of loading all the records into your workflow, you send an update command to the platform and it updates the records for you.
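For Airtable specifically, the Web API does support a small form of batching: a single PATCH request can update up to 10 records at once, so you can work through record IDs in chunks instead of holding everything in memory. A sketch, where the base ID, table name, and field name are placeholders:

```typescript
// Sketch: update Airtable records in chunks of 10 via the REST API.
// The Airtable Web API accepts at most 10 records per PATCH request.
// BASE_ID, the "Messages" table, and the field name are placeholders.
const BASE_ID = "YOUR_BASE_ID";
const TABLE = "Messages";
const API_KEY = process.env.AIRTABLE_API_KEY; // personal access token

async function batchUpdate(recordIds: string[]) {
  for (let i = 0; i < recordIds.length; i += 10) {
    const chunk = recordIds.slice(i, i + 10);
    const res = await fetch(`https://api.airtable.com/v0/${BASE_ID}/${TABLE}`, {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        records: chunk.map((id) => ({
          id,
          fields: { BackupMessageGroup: "Some value" },
        })),
      }),
    });
    if (!res.ok) throw new Error(`Airtable update failed: ${res.status}`);
  }
}
```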
Ok thank you.
I will try to change to “batch update operation”.
But I would still like to know if there is a way to track the amount of memory my scenario requires?
I saw the documentation you sent me, but it doesn’t give an answer.
But I would still like to know if there is a way to track the amount of memory my scenario requires?
This depends entirely on your workflow logic, so you can try running with 1,000, 10,000, or 50,000 records with increasing memory limits. In every case, your workflow's memory usage is converted into invocations, so you can check Billing & Usage to track how many invocations your workflow costs.
Unfortunately this is not possible for now, due to the nature of Pipedream running on AWS Lambda. I think what we can do for now is to manually increase the memory to a number that is enough to handle all the records in your Airtable.
And really, the problem was not in the existing code, but in the built-in record search, which fetched all the information I have in the table, and that is very heavy.
That's why the programmer wrote code for me that searches for records without loading irrelevant data.
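For anyone finding this later: one way to avoid pulling the whole table is to pass `filterByFormula` and `fields[]` when listing records, so Airtable returns only matching records and only the columns you need. A sketch with placeholder names (the `createdAt` field and `Messages` table are assumptions):

```typescript
// Sketch: list only matching records and only the fields you need.
// filterByFormula, fields[], pageSize, and offset are documented
// Airtable list-records parameters; the names below are placeholders.
const BASE_ID = "YOUR_BASE_ID";
const TABLE = "Messages";
const API_KEY = process.env.AIRTABLE_API_KEY;

const url = new URL(`https://api.airtable.com/v0/${BASE_ID}/${TABLE}`);
url.searchParams.set("filterByFormula", "IS_AFTER({createdAt}, '2023-02-08')");
url.searchParams.append("fields[]", "createdAt"); // return only this column
url.searchParams.set("pageSize", "100"); // maximum page size

const res = await fetch(url, {
  headers: { Authorization: `Bearer ${API_KEY}` },
});
const { records, offset } = await res.json();
// When `offset` is present, request the next page with ?offset=<value>
// and process each page as it arrives, rather than accumulating
// everything into one array.
```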