Flow Studio features in April that will help us mitigate a disaster


In the best scenario, nothing ever fails. Your code and logic don’t fail, Flows don’t fail, SharePoint doesn’t fail, Azure doesn’t fail.

Some days, like yesterday, aren’t as wonderful. I’m sure everyone “went home” on Thursday, but there’s a daunting task waiting on Friday - how do we know business continuity has been maintained? Where do we even start?

I hope you are reading this while coming into work on a Friday morning, because I’m offering a great solution with Flow Studio, and I think you will have a wonderful Friday.

We can’t perform miracles every day, but today, the Flow Studio team thinks we have a good shot at giving you back a “great Friday”.

Flow Studio free feature update for March ~ April

This blog post is about several new features that were added to Flow Studio during March and April. Flow Studio is a power tool we’ve built for Power Users to work with their Microsoft Flows. We believe in Flow automations, and we want your Flows to be successful. It is no coincidence that many of the features we paid attention to during March and April form a complete set of tools for disaster mitigation. Let us explain how these work together.

Sparklines

We introduced Flow Run Sparklines in March, and improved their performance over April. In Flow Studio, your runs are not second-tier citizens - they are top tier. We want you to see immediately on Friday morning which of your Flows have failed.

This is important, because this was my task this morning. I reviewed all the Flows in my environment, and at a glance I could identify which ones had failed and needed immediate mitigation.

Flow Runs, with Context

Clicking on a Sparkline takes me right into the latest set of Flow runs - many of them have failed. Flow Studio understands the trigger data (SharePoint, Dynamics or CDS) and brings the context of each trigger to the front. We don’t see just random Run IDs - instead we see the trigger item ID, title or filename. These help makers select the correct runs to re-submit.


Bulk re-submit

When we are sure the Flow is ready to be re-submitted, Flow Studio gives us the ability to bulk select our failed runs and re-submit them.

Flow Run deletion (future)

Deleting Runs is a feature we are expecting to appear (since it has been announced for Logic Apps) - so we have already added support for this feature. As far as we are aware - this API has not been available on any environment.

In the future, when Flows have been successfully re-submitted, we expect the Maker may choose to delete the old failed run and keep the successful, resubmitted run.

Deleting failed runs so we only have successful runs is sort of cheating, but we LOVE SUCCESS! 100% allowed!

Flow Studio subscription feature update

Sparklines for admins

A Flow Studio subscription allows a maker (with a Flow P2 license) to see all the Flows made within that environment.

The new Sparklines are also available here - this lets us observe whether all the Flows in the entire environment are successful.

There are Flows that may belong to a solution, belong to a team (that excludes you), or even belong to a Resource owner like SharePoint (so they don’t have an actual Flow user owner). To see those Flows - you’ll need to get Flow Studio subscription, because we can only see them via the P2 admin API.

Approvals (v2) cancellation

In the latest Approvals V2 update - Approvals can now be cancelled. Flow Studio provides bulk Approval cancellation.

We feel this is a situational feature - it may be useful if a lot of Flows have accidentally run and created a lot of duplicate Approvals that we want to cancel in bulk.

But it is very possible that during the disaster of May 2019, a lot of approvals were created before the Flow failed. In re-submitting these Flows, we wouldn’t want a lot of duplicated approvals. Bulk approval cancellation would be very useful in this scenario.


Thank You

Thank you for your support of Flow Studio App - we hope your business processes have not been impacted, and we hope you were able to use Flow Studio to restart your Flows.

We think the combination of Sparklines, Runs with Context and Bulk Resubmit is a timely reminder of the Flow Studio mission.

Flow to Success!

https://flowstudio.app

A Power User friendly method of connecting hundreds of sites, list and libraries to a single Flow

Photo by Sebastian Boring on Unsplash


We need to get this disclaimer out of the way first - this approach uses a SharePoint Designer workflow as glue. The main reason is that SharePoint Designer workflows (being old-generation technology) can be deployed across site collections via PowerShell scripting without too much trouble, as long as they remain small and don’t need regular maintenance.

This approach isn’t necessary for deploying a Flow to a dozen sites. But if we are talking about potentially hundreds of project sites, those become scenarios that aren’t currently covered well, and would need workarounds such as this one.

Let me know what you think in the comments.

The Problem

There are many ways we can view this problem of “deploying Flows across many lists, libraries, sites and site collections”.

This problem has several issues we need to consider:

  • First, we have Export and Import, as well as Send Flow as a Copy. These are simple, manual steps that can’t be automated.

  • But at some point - we will hit the 600 Flow limit (this isn’t per environment, this is per account)

  • We will need to reconfigure URLs specific to each Flow.

  • These copied Flows are customized from the master version, which makes updating and maintaining them very tricky.

  • When redeploying a Flow - existing connections must be maintained.

  • When deploying Flow to a new site - new connections must be configured (or existing connections re-used).

  • If we are ever in a situation where we need to deploy an update to an existing Flow, but now with additional new Connections - God have mercy.

  • If we think we could just delete existing Flows and re-deploy them as new, we’d kill any existing runs.

So there are alternate approaches as well… what if we just don’t deploy hundreds of copies of the same Flow?


A new solution?

The method in this blog post is a different idea - we will borrow SharePoint Designer workflow’s various tools for deploying across site collections, and have it invoke the same Flow to perform the main task.

This was posted as #FlowNinja hack 91

https://twitter.com/johnnliu/status/1121055884749053953


Let’s begin

Warning: this hack involves the use of SharePoint Designer. Now before you all run off screaming, there's a good reason why we want to try this method: mass deployments. So here we go

1. Make a Request trigger Flow, with a simple Compose action.
2. Make a SPd 2013 reusable workflow.

3. First action - add a SPd HTTP web request to the Flow Request trigger.
4. We need to configure both the request body and the request header.

5. If we don't blank out the Authorization header, this happens: “DirectApiAuthorizationRequired: The request must be authenticated only by Shared Access scheme”.
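As a sketch, the SPd HTTP web request essentially posts a small JSON dictionary to the Flow Request trigger’s URL. The field names and values below are hypothetical - pick whatever context values your Flow needs:

```python
import json

# Hypothetical context values the SPd workflow sends to the Flow
# Request trigger (built with Build Dictionary actions in SPd).
body = {
    "siteUrl": "https://contoso.sharepoint.com/sites/projectA",
    "listId": "8fd1e2a0-1111-2222-3333-444455556666",
    "itemId": 42,
}

# Blank out Authorization, otherwise the trigger rejects the call with
# "DirectApiAuthorizationRequired" - the Request trigger authenticates
# only via the SAS signature embedded in its own URL.
headers = {
    "Content-Type": "application/json",
    "Authorization": "",
}

payload = json.dumps(body)
```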

Add SPd Workflow to a Library and a List

6. Now publish and add this to a List and a Document library.
7. Upload a file, and see the SPd workflow trigger.
8. See the HTTP Request Flow trigger.

The test is successful. Next, copy the JSON we received in Flow.

9. Add the Request schema from the previous successful run.
10. Change the Compose to match the request body.

11. Add a new item to a SharePoint list (that has the same SPd workflow attached).
12. The same Flow re-triggers.

Take a breather, have some tea

13. At this point, there are several more things to do - consider whether we want excessive logging. The SPd workflow is simple - once it is working there's not a lot more to tweak, except perhaps more context values. While still within this site collection, republishing will update it.

Package the SPd Workflow for cross site collection

Let's package this workflow for cross site collection warp jump.

14. Remove the workflow association to the list Achievements - because we don't have that list everywhere, only Documents.
15. Export the WSP to desktop.

16. Go to a new site collection - this can be SharePoint sites, group sites or project sites.
17. Site settings - upload the solution to the gallery - activate the solution.
18. Site settings - site features - activate the feature (this associates the workflow with Documents).

19. Test by uploading a new file in the new site collection.
20. The Flow triggers, and the context of the Flow is the new site URL.

21. Note there was an error with the SPd workflow after it called HTTP - because I didn’t activate the SharePoint Workflow feature, the Workflow History list isn’t available in the new site collection.

Consider Step #13 and whether you’d want to delete all the logging steps before exporting.


22. Now we are in Flow. Re-fetch the item and go crazy.

Yes, in Step #4 we deliberately chose the context values that will allow us to re-configure Flow to pick up the item
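For example - a sketch with hypothetical values - the context values let the Flow rebuild the SharePoint REST endpoint and re-fetch the triggering item with a Send an HTTP request to SharePoint action:

```python
# Hypothetical trigger values - in the real Flow these come from the
# Request trigger body that the SPd workflow posted.
site_url = "https://contoso.sharepoint.com/sites/projectA"
list_id = "8fd1e2a0-1111-2222-3333-444455556666"
item_id = 42

# REST endpoint a "Send an HTTP request to SharePoint" action would GET
endpoint = f"{site_url}/_api/web/lists(guid'{list_id}')/items({item_id})"
```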

Conclusion

Cons:

  • The pain of writing a SharePoint Designer workflow again

  • Difficult to update deployed SPd workflows - but the one we have here is very simple (two Build Dictionary actions and an HTTP request)

Pros:

  • A simple lightweight SharePoint Designer workflow can be deployed across many site collections via PnP-PowerShell, or as part of PnP-Provisioning.

    Activate the solution, and then activate the feature (this will create the workflow associations)

  • All the events from associated SharePoint libraries will call a single Microsoft Flow - this is where we can customize to our heart’s content.

I don’t think this approach should be written off - it should be evaluated and may suit your situation well. I also think, in a possible future when cross-site Flow deployments can be done more easily, we may have different Flow triggers invoking our one Flow. So there exists an upgrade path forward.

Azure Global Bootcamp Sydney - this Saturday!

Photo by James Ree on Unsplash


I’m presenting a talk on Azure Logic Apps (and Microsoft Flow) this Saturday at the Azure Global Bootcamp Sydney.

https://www.meetup.com/en-AU/Azure-Sydney-User-Group/events/256253065/



[Update 2019-04-28 Video livestream]

I start at 1:15
https://www.youtube.com/watch?v=_kZZbKsHrGI&feature=youtu.be&t=4543

Our day of developer talks streamed live on Saturday 27 April 2019.


LogicApps & Flow for Developers - insane low-Code Serverless Automation

We make the case that every developer must understand what LogicApps and Microsoft Flow are, because they will make us rethink how we really write code.

There is always more code to write, but what if we can

  • Use out of the box actions when we don't need to write code

  • Connect to new systems painlessly

  • Connect to APIs that we have never used before

  • Not worry about how much it all costs

  • Build microservice architecture solutions

  • Fall into the pit of success


Join this session to learn how to write code, fast, without writing code.


This is a similar session (with more focus on Azure Logic Apps - there’s always room for tweaking) to a talk I’ve done previously, most notably during the MS Ignite Roadshow.

The talk is aimed at developers. There’s no assumption that you would know much about Logic Apps or Microsoft Flow, but think of this as a baptism by fire - a mid-level introduction that goes right into parallelism and HTTP requests.

These are what one would often consider to be difficult developer concepts - and as we will all see, these are extremely easy to achieve in Logic Apps / Microsoft Flow.

Join us this Saturday - how to write code fast without writing code.



Upload Image from PowerApps to Flow to SharePoint via an Unused Outlook connector

This is the simplest no-code approach to the PowerApps image upload problem so far. Far simpler than using an Azure Function, a custom connector, or a hacked Flow button via Flow Studio - even simpler than Azure Blob Storage. All standard connectors, so no premium licence required, and no risk of the PowerApps trigger resetting and breaking the connection.

This is my simplest method to upload any image from PowerApps to SharePoint

  • No Swagger

  • No Edit JSON

  • No Azure Blob Storage

  • All Standard Connectors

  • No HTTP

  • Can easily add more arguments


Original

This blog post is a cleaned up version of the #Flow Ninja hack 87 thread which happened on Sunday night. https://twitter.com/johnnliu/status/1114863521525669888

Follow me on Twitter and catch the next live hack.


[Updated: 2019-04-27] Video version, PnP SharePoint Community Call from Chaks

From @chakkradeep

This community call demo is taken from the SharePoint General Development Special Interest Group recording on the 18th of April 2019. In this video Chaks (Microsoft) shows how you can upload files to SharePoint from PowerApps using Microsoft Flow. Presenter - Chakkaradeep (Chaks) Chinnakonda Chandran (Microsoft) - @chakkaradeep. Full details on the community call at https://developer.microsoft.com/en-us/sharepoint/blogs/sharepoint-dev-community-pnp-general-sp-dev-sig-recording-18th-of-april-2019/ and more details on the SharePoint dev community calls at http://aka.ms/sppnp.


Steps - first a bit of study and exploration

I have a @MicrosoftFlow hack this evening to send files from @PowerApps to @SharePoint. I have been thinking about this one for a while. So if you are still awake, follow along.

First - I check the Flow button trigger.
Then create a PowerApps trigger, use peek code to study

Double check SharePoint connector - I read this with FlowStudioApp - there's no method that takes format: byte. Everything wants format: binary.

I spent a while looking through various standard connectors looking for something that does format: byte - I found one. In the Outlook connector.
In send email with attachment. **cackle** **evil grin**

Evil twinkle in the eye acquired - we now execute the plan

So we hack the PowerApps trigger by using a totally unrelated connector.
I can't hold back my dislike of the PowerApps trigger. Why can't it behave more like the Flow button trigger...

The argument sendanemail_attachmentscontent is ugly. Try using Flow Studio to rename it before you go too far. This will also make the connection tidier when you take it over to PowerApps.

Finally

PowerApps time - this is probably my simplest method.
Don't need Azure blob storage
Don't need edit json
Don't need swagger
Can have multiple arguments


We just need to build a strange Flow that never actually runs the Outlook connector, but uses it to lock the PowerApps trigger:

  • Note the condition is always false - so the Send an email action never runs

  • Note also that the Size of the created file is much larger than a broken blob string would be

  • We need to keep the unused Send an email action even though we don’t use it - because it locks the PowerApps trigger in place so the trigger doesn’t reset.


And there we have it - the absolute simplest no-code solution to send a file from PowerApps to SharePoint with ease.

We lock the PowerApps trigger to format: byte by using an otherwise unused Outlook send mail connector.
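A sketch of what format: byte means in practice (fake bytes, for illustration) - the file content travels as a base64 string, which survives the trip through the PowerApps trigger intact, unlike raw binary:

```python
import base64

# Fake image bytes standing in for a real photo from PowerApps
image_bytes = b"\x89PNG\r\n fake image bytes"

# format: byte - the content is carried as a base64-encoded string,
# so no bytes are lost when it is passed around as text
content_b64 = base64.b64encode(image_bytes).decode("ascii")

# The Create file action can decode it back to the original binary
restored = base64.b64decode(content_b64)
```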

Future

There are a few things Microsoft could do that will make this even easier. If they ever get around to it:

  • Allow us to define the PowerApps trigger directly, either by using the Flow Button UI or a Request schema

  • Allow SharePoint connector to accept format: byte

  • Allow PowerApps to send format: binary - right now PowerApps converts it to a string, dropping the non-character bytes from the data it sends to Flow






One Flow to handle them all - how to subscribe to multiple SharePoint lists with one Flow


At some point - we think hmm how do we save a Flow as a template and deploy it with all our SharePoint site collections…

and did we just create a massive problem down the road where we have all these duplicate Flows…

and how do we manage versioning and changes to them? Hmm, let’s have a think and talk about this.

So instead of all that copying. Let’s try a different idea - what if we have One Flow that Rules them all. It handles events from All the Sites and Libraries, all within one Flow.

Depending on what our Flow actually does - this might be a better approach.


Live Tweet

This is a long overdue write-up of an idea involving two Flows, done as a hack over Twitter.  https://twitter.com/johnnliu/status/1002747531506245632

Multi-Part Series: Deploy My Flow

This is a multi-part series called Deploy My Flow, discussing the various ways to deploy Flows across SharePoint and Office 365, between dev, test and production environments - across lists, sites, and tenants.

  • Export and Import

  • Automate - PowerShell, Flow Management and Flow Studio

  • Power Platform Solutions

  • Subscribe to multiple SharePoint Lists with One Flow

Part I - SharePoint Webhooks

A Flow can only listen/trigger from one list, so in SharePoint, as we provision sites, we consider having a way to provision templated Flows to handle each list.

But let's turn the question around.  Can we have just one Flow to handle multiple lists?
Is it possible to have Flow connect directly to SharePoint's Webhooks and manage events directly with Flow?

 

Why this approach?

  • The Flow triggers listen to and raise events for each individual list or document library. So, out of the box, we can't create one Flow that listens to every list or library.

  • Cloning Flows multiple times has its own challenges:

    • 250 Flow limit per account (this limit is now 500, and you can raise a service ticket to increase it).

    • How do we manage multiple Flows, do we have one master? Does it get exported and imported over existing Flows when we upgrade? What happens with the connections?

    • Can we do these automatically?

  • There’s a lot to discuss, but for this blog post we will focus on a simple pattern

  1. Create a Handler Flow that handles list change events.

  2. Create a second Scheduler Flow that subscribes webhooks from multiple SharePoint lists onto the first Flow.

Reference

We don't start from scratch, we have the excellent materials on docs.microsoft.com from the SharePoint PnP team describing how to attach to SharePoint webhooks via code.

https://docs.microsoft.com/en-us/sharepoint/dev/apis/webhooks/overview-sharepoint-webhooks

We only need to translate this to Flow.

First Create the Scheduler Flow

First - here I’m reading all the lists from SharePoint that are document libraries, then reading the subscriptions on each document library. This endpoint shows me SharePoint’s webhooks.

TIP: Because I use odata=nometadata a LOT - I end up putting that header into a header_nometa JSON variable and insert it in every Send HTTP Request action.
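A sketch (assumed shape) of that header_nometa variable - a small JSON object reused in every Send HTTP Request action:

```python
# Reusable headers for "Send an HTTP Request to SharePoint" actions,
# so responses come back without verbose OData metadata.
header_nometa = {
    "Accept": "application/json;odata=nometadata",
    "Content-Type": "application/json;odata=nometadata",
}
```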


Create Subscription and handle Resubscription

Next, we check whether the webhooks contain a clientState with the current Flow’s name (Flow names are unique GUIDs).

If not, we create one. The check and filter for ClientState makes sure we don’t subscribe the same event handler to the same document library multiple times. It also makes this “scheduler” flow re-run safe. We can pretty much change the trigger to a recurrence trigger that runs as often as you’d like - say daily.

workflow()?['name']
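Putting it together, the subscription the Scheduler Flow creates (via a POST to the list’s /subscriptions endpoint, per the PnP webhook docs) looks roughly like this - the GUIDs and URLs are placeholders:

```python
import json

# workflow()?['name'] resolves to the Flow's unique GUID - we stamp it
# into clientState so re-runs can recognize their own subscriptions.
flow_name = "0b1f3c5d-aaaa-bbbb-cccc-ddddeeeeffff"

subscription = {
    "resource": "https://contoso.sharepoint.com/sites/team/_api/web/"
                "lists(guid'8fd1e2a0-1111-2222-3333-444455556666')",
    # The Handler Flow's HTTP Request URL
    "notificationUrl": "https://prod-00.westus.logic.azure.com/workflows/abc/triggers/manual/paths/invoke",
    "expirationDateTime": "2019-10-01T00:00:00Z",
    "clientState": flow_name,
}

payload = json.dumps(subscription)
```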

The Handler

We need a new Flow to handle the document updating. This is the One Flow that does all the work.

Save the Handler Flow and take the HTTP Request URL back to our Scheduler Flow.

And this is the final result - when we run the Scheduler Flow, it iterates through all my document libraries and connects them all to the Handler Flow.

Now when documents are updated in these 12 document libraries, my “Handler Flow” will be triggered by SharePoint.

SharePoint webhook subscriptions are valid for 180 days. So at some point our Scheduler Flow will re-run on recurrence and re-subscribe the document libraries. (We can also handle the True condition in the Scheduler Flow to renew webhooks).
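A sketch of the renewal math the Scheduler Flow performs - request an expiry just inside the 180-day cap each time it runs, so the recurrence always renews before the subscription lapses:

```python
from datetime import datetime, timedelta, timezone

# SharePoint caps webhook subscriptions at 180 days; ask for slightly
# less so a recurring re-run always renews before expiry.
expiry = datetime.now(timezone.utc) + timedelta(days=179)
expiration_date_time = expiry.strftime("%Y-%m-%dT%H:%M:%SZ")
```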

Part 2 - Get Changes

We need to go deeper into Get Changes. Your webhook fired, now you need to figure out what changed.

Follow Part 2 here

http://johnliu.net/blog/2019/5/one-flow-to-handle-them-all-part-2-figuring-out-the-changes

@ISSPDEV couldn’t wait for the next part and went ahead with this. And because he wrote it up in so much detail - I will also link to his work for Part 2.

Apologies

This was a live hack in 06/2018 - I actually had three draft versions of this blog post, never finished and so never previously published. So I’m moving it forward with this first post. Nine months late is still better than never.