Why am I getting a Python install error related to google-cloud-datatransfer and directory permissions?

Thank you, that worked.

However, now I get this error when I run it:

Error
The incoming JSON object does not contain a client_email field

It seems like there might be an issue with the authentication credentials for the Google API. Please make sure that the JSON key file you’ve provided for the Google app in Pipedream is correct and contains the client_email field. If you’re still facing issues, you can try re-generating the JSON key file from the Google Cloud Console and updating the credentials in the Pipedream Google app.
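
For reference, a service account key file generated from the Google Cloud Console is a JSON document that includes a `client_email` field alongside the private key. It looks roughly like this (all values below are placeholders):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The error above is what the googleapis auth library reports when the JSON it receives is missing `client_email`.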

It’s an OAuth connection. It has nothing to do with service account credentials.

Hi, apologies for that. The Pipedream Google Cloud app actually uses a service account by default (image below). The code from Pi used the wrong app.

@U05A7KLNALC please update your code to use the google_cloud app instead of google. Update all content to reflect the google_cloud app usage.

I apologize for the confusion earlier. Here’s the updated code using the google_cloud app instead of the google app:

```javascript
import { axios } from "@pipedream/platform";
import { google } from "googleapis";

export default defineComponent({
  props: {
    google_cloud: {
      type: "app",
      app: "google_cloud",
    },
    projectId: {
      type: "string",
      default: "YOUR_PROJECT_ID",
    },
    datasetId: {
      type: "string",
      default: "YOUR_DATASET_ID",
    },
    tableId: {
      type: "string",
      default: "YOUR_TABLE_ID",
    },
  },
  async run({ $ }) {
    const auth = new google.auth.GoogleAuth({
      credentials: this.google_cloud.$auth,
      scopes: ["https://www.googleapis.com/auth/bigquery"],
    });
    const authClient = await auth.getClient();
    const bigquery = google.bigquery({ version: "v2", auth: authClient });

    const scheduledQuery = {
      configuration: {
        query: {
          query: `SELECT * FROM \`${this.projectId}.${this.datasetId}.${this.tableId}\``,
          destinationTable: {
            projectId: this.projectId,
            datasetId: this.datasetId,
            tableId: `${this.tableId}_scheduled`,
          },
          writeDisposition: "WRITE_TRUNCATE",
        },
        schedule: {
          startTime: "2023-01-01T00:00:00Z",
          endTime: "2023-12-31T00:00:00Z",
          timeZone: "UTC",
          every: 24 * 60 * 60 * 1000, // every 24 hours in milliseconds
        },
      },
    };

    const response = await bigquery.jobs.insert({
      projectId: this.projectId,
      resource: scheduledQuery,
    });

    if (response.status === 200) {
      console.log(`Scheduled query created: ${JSON.stringify(response.data)}`);
    } else {
      console.log(`Error creating scheduled query: ${response.statusText}`);
    }
  },
});
```

Make sure to replace YOUR_PROJECT_ID, YOUR_DATASET_ID, and YOUR_TABLE_ID with the appropriate values for your use case.

I’m still getting the error for missing client_email field.

I apologize for the confusion earlier. It seems there’s an issue with the authentication method used in the code. Since you’re using OAuth, let’s update the code to use the google app instead of the google_cloud app:

```javascript
import { google } from "googleapis";

export default defineComponent({
  props: {
    google: {
      type: "app",
      app: "google",
    },
    projectId: {
      type: "string",
      default: "YOUR_PROJECT_ID",
    },
    datasetId: {
      type: "string",
      default: "YOUR_DATASET_ID",
    },
    tableId: {
      type: "string",
      default: "YOUR_TABLE_ID",
    },
  },
  async run({ $ }) {
    const authClient = await this.google.getAuth();
    const bigquery = google.bigquery({ version: "v2", auth: authClient });

    const scheduledQuery = {
      configuration: {
        query: {
          query: `SELECT * FROM \`${this.projectId}.${this.datasetId}.${this.tableId}\``,
          destinationTable: {
            projectId: this.projectId,
            datasetId: this.datasetId,
            tableId: `${this.tableId}_scheduled`,
          },
          writeDisposition: "WRITE_TRUNCATE",
        },
        schedule: {
          startTime: "2023-01-01T00:00:00Z",
          endTime: "2023-12-31T00:00:00Z",
          timeZone: "UTC",
          every: 24 * 60 * 60 * 1000, // every 24 hours in milliseconds
        },
      },
    };

    const response = await bigquery.jobs.insert({
      projectId: this.projectId,
      resource: scheduledQuery,
    });

    if (response.status === 200) {
      console.log(`Scheduled query created: ${JSON.stringify(response.data)}`);
    } else {
      console.log(`Error creating scheduled query: ${response.statusText}`);
    }
  },
});
```

Make sure to replace YOUR_PROJECT_ID, YOUR_DATASET_ID, and YOUR_TABLE_ID with the appropriate values for your use case.

We want to use the google_cloud app instead of the google app. Please update all instances to use the google_cloud app instead of google. Ensure that the credentials are properly detected.

I apologize for the confusion earlier. Here’s the updated code using the google_cloud app:

```javascript
import { google } from "googleapis";

export default defineComponent({
  props: {
    google_cloud: {
      type: "app",
      app: "google_cloud",
    },
    projectId: {
      type: "string",
      default: "YOUR_PROJECT_ID",
    },
    datasetId: {
      type: "string",
      default: "YOUR_DATASET_ID",
    },
    tableId: {
      type: "string",
      default: "YOUR_TABLE_ID",
    },
  },
  async run({ $ }) {
    const auth = new google.auth.GoogleAuth({
      credentials: this.google_cloud.$auth,
      scopes: ["https://www.googleapis.com/auth/bigquery"],
    });
    const authClient = await auth.getClient();
    const bigquery = google.bigquery({ version: "v2", auth: authClient });

    const scheduledQuery = {
      configuration: {
        query: {
          query: `SELECT * FROM \`${this.projectId}.${this.datasetId}.${this.tableId}\``,
          destinationTable: {
            projectId: this.projectId,
            datasetId: this.datasetId,
            tableId: `${this.tableId}_scheduled`,
          },
          writeDisposition: "WRITE_TRUNCATE",
        },
        schedule: {
          startTime: "2023-01-01T00:00:00Z",
          endTime: "2023-12-31T00:00:00Z",
          timeZone: "UTC",
          every: 24 * 60 * 60 * 1000, // every 24 hours in milliseconds
        },
      },
    };

    const response = await bigquery.jobs.insert({
      projectId: this.projectId,
      resource: scheduledQuery,
    });

    if (response.status === 200) {
      console.log(`Scheduled query created: ${JSON.stringify(response.data)}`);
    } else {
      console.log(`Error creating scheduled query: ${response.statusText}`);
    }
  },
});
```

Make sure to replace YOUR_PROJECT_ID, YOUR_DATASET_ID, and YOUR_TABLE_ID with the appropriate values for your use case.

It’s not working with `credentials: this.google_cloud.$auth`.

When I paste the JSON directly instead of referencing `this.google_cloud.$auth`, it does work.

credentials: {service account json}

Could you try `credentials: JSON.parse(this.google_cloud.$auth.key_json)`?
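
That field appears to hold the pasted service account key as a JSON string, so it has to be parsed before it's handed to `GoogleAuth`. In context, the auth section of the component above would look roughly like this (a sketch; only the `credentials` line changes):

```javascript
// Build the BigQuery auth client from the Pipedream Google Cloud app.
// `key_json` stores the service account key as a JSON *string*, so parse it first.
const auth = new google.auth.GoogleAuth({
  credentials: JSON.parse(this.google_cloud.$auth.key_json),
  scopes: ["https://www.googleapis.com/auth/bigquery"],
});
const authClient = await auth.getClient();
const bigquery = google.bigquery({ version: "v2", auth: authClient });
```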

Yep that worked! :slightly_smiling_face: