This topic was automatically generated from Slack. You can find the original thread here.
tried and tested Python packages no longer work on new workflows
hi, i have existing workflows that use `from google.oauth2 import service_account` in a Python step without any issues. however, when creating a new workflow, this package no longer works. i've made various attempts at pinning to an older package version, and that didn't work either. is there an easy way to fix this?
on a related note, i also tried this example from the documentation, and that didn't work either:
```python
# pipedream add-package google-cloud-bigquery
from google.cloud import bigquery

def handler(pd: "pipedream"):
    # Reference data from previous steps
    print(pd.steps["trigger"]["context"]["id"])
    # Return data for use in future steps
    return {"foo": {"test": True}}
```
i can probably work with just the google-cloud-bigquery package to get the result. it's just strange that older workflows let me use `from google.oauth2 import service_account`
I think this issue might be due to how Pipedream maps import names to PyPI package names. The latest version of the protobuf package is likely being added as a dependency because of the google.oauth2 import, and that latest protobuf version conflicts with the version required by pandas_gbq.
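To make the mapping issue concrete: `google.oauth2` is a module shipped by the `google-auth` distribution on PyPI, so an import name does not always match the installable package name. The sketch below is purely illustrative (the table and `resolve` function are hypothetical, not Pipedream's actual resolver), but it shows how a prefix-based lookup can fall back to a wrong guess:

```python
# Illustrative import-name -> PyPI-name table (NOT Pipedream's real resolver).
IMPORT_TO_PYPI = {
    "google.cloud.bigquery": "google-cloud-bigquery",
    "google.oauth2": "google-auth",  # this module actually ships in google-auth
    "pandas_gbq": "pandas-gbq",
}

def resolve(import_name: str) -> str:
    """Try the most specific dotted prefix first, then shorter prefixes."""
    parts = import_name.split(".")
    while parts:
        candidate = ".".join(parts)
        if candidate in IMPORT_TO_PYPI:
            return IMPORT_TO_PYPI[candidate]
        parts.pop()
    # Fallback: guess the top-level name -- this is where a resolver
    # can pick the wrong distribution (and the wrong dependency set).
    return import_name.split(".")[0]

print(resolve("google.oauth2"))         # -> google-auth
print(resolve("google.cloud.bigquery"))  # -> google-cloud-bigquery
```

When the resolver guesses a distribution whose dependency tree pulls in the newest protobuf, that version can clash with what pandas_gbq pins.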
As a workaround, I think you could pin protobuf to a version compatible with pandas_gbq. For example:
```python
# pipedream add-package protobuf==4.25.3
import pandas as pd
import pandas_gbq
from google.oauth2 import service_account
```
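To confirm the pin actually took effect, you could print the installed protobuf version from inside the step. This is a generic debugging sketch using only the standard library (`importlib.metadata`, Python 3.8+), not a Pipedream-specific API:

```python
# Check which version of a distribution is actually installed at runtime.
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist_name: str) -> str:
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return "not installed"

# Should report 4.25.3 if the pin above was honored.
print("protobuf:", installed_version("protobuf"))
```

If the printed version is newer than what pandas_gbq supports, the pin was overridden by another dependency and you may need to pin the conflicting package as well.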