Repeatable dependent process

So, I have several processes that depend on pulling a bearer token. We recently had to change the password, and then I realized I had several processes where I fetch this token. How can I make a single callable task so I don't have to go into several tasks and change the password in each one? Do I create a process that every 15 minutes creates a new entry in a data store that I can read? Or is there a more straightforward way?

Thanks!

Yeah, just create one workflow that refreshes your token as often as necessary and updates it in the data store (no need to create a new entry each time). Then all the other workflows can read a fresh token from that data store entry.
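Something along these lines as a scheduled code step, for example. This is a minimal sketch: the auth endpoint, env var names, and the `pbi_token` key are placeholders for your setup, not from this thread:

```typescript
// Scheduled step: fetch a fresh bearer token and overwrite the single
// data store entry that all other workflows read from.
import { axios } from "@pipedream/platform";

export default defineComponent({
  props: {
    data: { type: "data_store" }, // Pipedream data store
  },
  async run({ $ }) {
    // Placeholder token endpoint and credentials: swap in your provider's
    const res = await axios($, {
      method: "POST",
      url: "https://login.example.com/oauth2/token",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      data: new URLSearchParams({
        grant_type: "client_credentials",
        client_id: process.env.CLIENT_ID ?? "",
        client_secret: process.env.CLIENT_SECRET ?? "", // lives in one place
      }).toString(),
    });
    // Overwrite the same key each run, so entries don't pile up
    await this.data.set("pbi_token", res.access_token);
  },
});
```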

What I did was create a single function to fetch the bearer token and put the password there. Then, other processes just call that function when they need the token. No need to mess with multiple tasks. Just update the password in one place.
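A minimal sketch of that approach; `getBearerToken`, the endpoint, and the env var names are made up for illustration:

```typescript
// One function owns the credentials; every process calls it instead of
// embedding the password itself. Update the password in one place only.
export async function getBearerToken(): Promise<string> {
  // Hypothetical token endpoint: adapt to your auth provider
  const res = await fetch("https://login.example.com/oauth2/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "password",
      username: process.env.API_USER ?? "",
      password: process.env.API_PASSWORD ?? "", // the single place to change it
    }),
  });
  const { access_token } = await res.json();
  return access_token;
}
```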

I’d like to be able to store this as an action, but I’m not quite there yet! 🙂

@mike.melanson are you doing this for an app that has a public API? If so, we can look into getting them integrated on Pipedream so we can abstract that logic for you entirely.

It’s Power BI’s REST API I’m accessing. I have several integrations to update tables, cancel refreshes, check on refresh statuses, etc.

Thanks – I’ll check in with our integrations team to see if we can get Power BI integrated! Here’s a ticket you can subscribe to if you’d like to track the progress: Power BI · Issue #9247 · PipedreamHQ/pipedream · GitHub

Hi @mike.melanson, I set up the Microsoft Power BI application (https://pipedream.com/apps/microsoft-power-bi) and it should be ready for use.

Are there any prebuilt triggers or actions that you’d like to see built? I’m in the process of scoping no-code actions for Power BI and was going to start with:

  • Create Dataset
  • Add/Delete Rows on Table

Happy to add support for other components that you would find useful.

That would be great, but most of our PBI data is pre-modelled in Snowflake and then just transferred or direct-queried. The main functions I have built out with Pipedream are:

  1. Trigger a full refresh of a dataset (currently triggered via a DBT webhook).
  2. Monitor refreshes to alert when they complete/fail. Currently I have a trigger every 5 minutes that checks whether a refresh has happened (I push a value into a data store when we trigger a refresh), and it scans until the refresh is complete, then deletes the data store key. A rough sketch of these calls follows the list.
  3. Trigger partial refreshes (single or multiple tables in a dataset). Currently I have a Slack monitor set up for this to yell at Pipedream from Slack! Lol
  4. Cancel refreshes (for accidental refresh kick-offs), with the ability to kill them via Slack.
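For reference, calls 1 and 2 map onto Power BI REST endpoints. Here's a rough sketch, assuming the token comes from the shared data store entry discussed above, with placeholder group/dataset IDs:

```typescript
// Rough sketch: trigger a full dataset refresh, then poll its status.
// groupId/datasetId are placeholders; the token is assumed to come
// from the shared data store entry described earlier in this thread.
const base = "https://api.powerbi.com/v1.0/myorg";

export async function triggerRefresh(
  token: string,
  groupId: string,
  datasetId: string,
): Promise<void> {
  await fetch(`${base}/groups/${groupId}/datasets/${datasetId}/refreshes`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ notifyOption: "NoNotification" }),
  });
}

export async function latestRefreshStatus(
  token: string,
  groupId: string,
  datasetId: string,
): Promise<string | undefined> {
  const res = await fetch(
    `${base}/groups/${groupId}/datasets/${datasetId}/refreshes?$top=1`,
    { headers: { Authorization: `Bearer ${token}` } },
  );
  const { value } = await res.json();
  // "Unknown" means in progress; "Completed" / "Failed" are terminal
  return value?.[0]?.status;
}
```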

Most of these processes are triggered from a DBT webhook, or I have a Slack monitor trigger set up. But I have to have a process scan every 15 minutes to update the bearer token, then fetch that token and pass it through to the processes, etc. There seem to be a lot of steps for fairly simple calls. Happy to walk through my use cases, but I can see value in having insert-into-table functions for some of our more manual processes (our “google sheet” processes, as I call them)!

Mike

I’ve added the actions that you described into the component development ticket here: [Components] microsoft-power-bi · Issue #9267 · PipedreamHQ/pipedream · GitHub

Let me know if you are able to connect using the new Microsoft Power BI application that we’ve configured - that should handle the process of authentication and keeping your token refreshed!

Hi @mike.melanson,
I’ve tested the end-to-end setup with authentication, and the team is working on the components that cover the functionality of the functions you mentioned.

Whenever you get a chance, could you try connecting the Microsoft Power BI application? This should abstract away the authentication / token refresh process that you described.

What I’d suggest is creating a centralized function to handle token retrieval. Maybe set up a service that updates the token in shared storage every 15 minutes. Then all your processes just call that function.

That’s what I ended up doing! Every 15 minutes it just updates the token in a data store, so I can retrieve the value from any process. Although I’m trying to streamline that to run only on demand: we just moved to the business plan and went from 50k credits to 5k, so I need to start being mindful of my credit consumption!

You could create a component which takes care of “lazy loading/refreshing” the token from the data store. Just store it as `{ token: "...", timestamp: 1234567890 }`, and then if the timestamp is older than 15 minutes, refresh the token and store the new value (and the new timestamp).

With a high level of concurrency you may have multiple workflows refreshing the token at the same time, but that should be okay as long as the service supports more than one active token at a time (which should be the case).
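A rough sketch of that pattern as a Pipedream code step; the `pbi_token` key, the 15-minute TTL, and the auth call are assumptions based on this thread:

```typescript
// Lazily refresh: only hit the auth endpoint when the stored token is
// older than 15 minutes, so idle periods consume no credits.
import { axios } from "@pipedream/platform";

const TTL_MS = 15 * 60 * 1000; // assumed 15-minute lifetime from this thread

export default defineComponent({
  props: {
    data: { type: "data_store" },
  },
  async run({ $ }) {
    const cached = (await this.data.get("pbi_token")) as
      | { token: string; timestamp: number }
      | undefined;
    if (cached && Date.now() - cached.timestamp < TTL_MS) {
      return cached.token; // still fresh: no auth call needed
    }
    // Stale or missing: fetch a new token (placeholder auth call)
    const res = await axios($, {
      method: "POST",
      url: "https://login.example.com/oauth2/token",
      data: "...", // your form-encoded credentials here
    });
    await this.data.set("pbi_token", {
      token: res.access_token,
      timestamp: Date.now(),
    });
    return res.access_token;
  },
});
```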