Why are previously working Python packages not functional in new workflows?

This topic was automatically generated from Slack. You can find the original thread here.

tried and tested Python packages no longer work on new workflows

hi, I have existing workflows that use `from google.oauth2 import service_account` in a Python step without any issues. However, when I create a new workflow, this package no longer works. I've made various attempts at pinning an older package version, and that didn't work either. Is there an easy way to fix this?

On a related note, I also tried this example from the documentation, and that didn't work either…

```python
# pipedream add-package google-cloud-bigquery
from google.cloud import bigquery
```

Hi Tom - I'm not seeing the same issue when importing the bigquery package with the magic comment system.

But I did notice you're using our older `pipedream.script_helpers` syntax. I wonder if that's affecting the execution environment somehow.

Have you tried the new handler format instead?

```python
# pipedream add-package google-cloud-bigquery
from google.cloud import bigquery

def handler(pd: "pipedream"):
    # Reference data from previous steps
    print(pd.steps["trigger"]["context"]["id"])
    # Return data for use in future steps
    return {"foo": {"test": True}}
```

that's odd - the exact same code doesn't work for me

Could you try a new workflow? I wonder if the current environment is just in a bad state

the above was a new workflow - should I try a new project?

Hmm, that's strange - that shouldn't matter. Are you using a GitHub-synced project, by chance?

yeah - a new project didn't work either

no, not using GitHub sync

Could you share a link to the project? The p_********* ID within the URL will help us find your project and see what might be the cause

here’s the test example

this is the workflow I've been struggling with:

Thanks, I’ve notified our team. Not quite sure of the cause yet but I’ll be in touch when it’s been tracked down

great, thanks

Hi Tom, super nit-picky ask, but could you try moving your import and magic comment statement up one line?

The problem might be that the magic comment system only scans for imports on line 1.
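To make the line-1 behavior concrete, here's a toy sketch of a scanner that only inspects the first line of a step's code for the magic comment. This is purely illustrative (it is not Pipedream's actual implementation), but it shows why a blank line or anything else above the comment would cause the package to be missed:

```python
def find_magic_package(source):
    """Return the package named in a '# pipedream add-package' comment,
    but only if it appears on the very first line (toy sketch only --
    not Pipedream's actual implementation)."""
    first_line = source.splitlines()[0] if source else ""
    prefix = "# pipedream add-package "
    if first_line.startswith(prefix):
        return first_line[len(prefix):].strip()
    return None

# Magic comment on line 1: detected.
ok = "# pipedream add-package google-cloud-bigquery\nfrom google.cloud import bigquery\n"
print(find_magic_package(ok))      # google-cloud-bigquery

# Magic comment pushed down to line 2 (e.g. by a leading blank line): missed.
missed = "\n# pipedream add-package google-cloud-bigquery\nfrom google.cloud import bigquery\n"
print(find_magic_package(missed))  # None
```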

yeah, that works for `google-cloud-bigquery`

I still can't get this to work:

```python
import pandas as pd
import pandas_gbq
from google.oauth2 import service_account
```

I can probably work with just the `google-cloud-bigquery` package to get the result. It's just strange that older workflows let me use `from google.oauth2 import service_account`.

I think this issue might be due to how Pipedream maps import names to PyPI package names. The latest version of the protobuf package is likely being added as a dependency because of the `google.oauth2` import, and that latest protobuf version conflicts with the version required by `pandas_gbq`.
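For context on why that mapping is tricky: import names and PyPI distribution names often differ. For example, `from google.oauth2 import service_account` is provided by the `google-auth` distribution on PyPI, not by a package literally named `google.oauth2`. The lookup table below lists a few real pairs; the `resolve()` helper is a hypothetical sketch of the kind of prefix-matching a dependency scanner has to do, not Pipedream's actual code:

```python
# Real examples of import name vs. PyPI distribution name.
IMPORT_TO_PYPI = {
    "google.oauth2": "google-auth",               # auth helpers ship in google-auth
    "google.cloud.bigquery": "google-cloud-bigquery",
    "pandas_gbq": "pandas-gbq",
    "google.protobuf": "protobuf",
}

def resolve(import_name):
    """Map a dotted import path to a PyPI distribution by longest
    matching prefix, falling back to the name itself (hypothetical
    sketch of a dependency scanner's mapping step)."""
    parts = import_name.split(".")
    while parts:
        candidate = ".".join(parts)
        if candidate in IMPORT_TO_PYPI:
            return IMPORT_TO_PYPI[candidate]
        parts.pop()
    return import_name

print(resolve("google.oauth2.service_account"))  # google-auth
print(resolve("pandas_gbq"))                     # pandas-gbq
```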

As a workaround, I think you could pin protobuf to a version compatible with `pandas_gbq`. For example:

```python
# pipedream add-package protobuf==4.25.3
import pandas as pd
import pandas_gbq
from google.oauth2 import service_account
```
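If you want to confirm the pin actually took effect at runtime, one option is to print the installed protobuf version from inside the step using the standard library's `importlib.metadata` (available in Python 3.8+). This is just a sanity-check suggestion, not something from the thread:

```python
# Sanity check: print the installed protobuf version at runtime
# to confirm the '# pipedream add-package protobuf==4.25.3' pin
# resolved as expected.
from importlib.metadata import version, PackageNotFoundError

try:
    print("protobuf:", version("protobuf"))
except PackageNotFoundError:
    print("protobuf is not installed")
```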