What is the Most Effective Way to Upload 100k Records into a Datastore?

This is our code:
```python
def handler(pd: "pipedream"):
    # Access the data store under pd.inputs
    data_store = pd.inputs["data_store"]

    for item in pd.steps["trigger"]["event"]["body"]:
        # Store a value under a key
        data_store[item["key"]] = item["value"]

        # Retrieve the value and print it to the step's Logs
        print(data_store[item["key"]])
```

This is the error:

```
KeyError

Traceback (most recent call last):

  File "/nano-py/pipedream/worker.py", line 118, in execute
    user_retval = handler(pd)

  File "/pipedream/dist/code/58ecef3d76943ec5c6bca4ce78a6d08903b900488911d6945eebe52b7d03784b/code.py", line 10, in handler
    print(data_store[item["key"]])

  File "/nano-py/pipedream/pipedream.py", line 237, in __getitem__
    raise KeyError

KeyError
```

We could, of course, wrap this in a try/except, but the real question is why that's necessary for something that seems so simple.
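For reference, here is a minimal sketch of what that try/except workaround would look like. It uses a plain dict to stand in for `pd.inputs["data_store"]` (an assumption so the snippet is runnable outside Pipedream; the traceback shows the real object also raises `KeyError` from `__getitem__`, so a dict behaves the same way here), and a hard-coded `items` list in place of `pd.steps["trigger"]["event"]["body"]`:

```python
# Stand-in for pd.inputs["data_store"] -- a plain dict, since the data
# store raises KeyError from __getitem__ just like a dict does.
data_store = {}

# Stand-in for pd.steps["trigger"]["event"]["body"] (hypothetical payload).
items = [{"key": "a", "value": 1}, {"key": "b", "value": 2}]

for item in items:
    # Store a value under a key
    data_store[item["key"]] = item["value"]

    # Retrieve the value, tolerating a missing key instead of crashing
    try:
        print(data_store[item["key"]])
    except KeyError:
        print(f"key {item['key']!r} missing immediately after write")
```

This silences the crash, but it doesn't explain why a key we just wrote can be missing on the very next read, which is the behavior we'd like to understand.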