Design a "Delay until SharePoint File Changed" HTTP+Webhook for Microsoft Flow

I love challenges - I love hearing people say "We need a 'Delay until File Changed' action in Microsoft Flow."  I agree.  And it's exactly that kind of question that sets an inquisitive mind wandering.

Plan

  • The Special Puzzle Piece
  • The generic nested Flow
  • The parent Flow
  • Results
  • Extend the technique

The Special Puzzle Piece "HTTP + Webhook"

Flows are triggered based on events - we can't easily have "a second trigger" in the middle of a Flow.  But there's a magical action in Flow that does do a 'wait' - and that's the HTTP + Webhook action.

"HTTP + Webhook" sounds like we plan to call someone else' webhook.  But we can use that to call our own webmethods.  So the idea then is that in the parent Flow, where we are building an approval, we'd delegate a 'wait' to a child nested flow and then have a generic Nested Flow that would perform any sort of 'wait'.  When the wait is over, the child Flow calls the callback URL and returns data back to the parent.

I first read that HTTP Webhook could be used this way in @sp_muser's blog post on Actionable Messages with Azure Functions.
https://spwondering.wordpress.com/2018/01/17/actionable-messages-part-1-add-the-card-to-microsoft-flow/

HTTP Webhook left a deep impression on me after reading that blog post - like a lightbulb that won't switch off.  Sometimes we adopt a new technique and it opens doors to new designs.  I hope this helps more people be creative.

The nested child Flow

This is the child Flow - it uses an HTTP Request trigger, and we need two arguments: a callback URL, and a file path to watch in SharePoint.

We immediately check the file in SharePoint and remember the modified datetime (store this in a variable - modified).

Next, enter a do-until loop: delay for 5 minutes, then grab the file's metadata again and check whether the modified time has changed.  When it has changed, we exit the do-until loop and call the original callback URL with a plain HTTP action.

In your real scenario, think about whether a 5-minute polling interval is necessary - it may be better to check once an hour, or only a few times a day, if you don't want the loop to run too often.
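The child Flow's loop can be sketched in plain code.  This is a hypothetical Python sketch of the logic, not an actual Flow definition: get_modified stands in for SharePoint's "Get file metadata" action, and notify stands in for the plain HTTP call back to the parent's callback URL.

```python
import time

# Hypothetical sketch of the child Flow's logic, not a real Flow definition.
# get_modified() stands in for the "Get file metadata" SharePoint action;
# notify() stands in for the plain HTTP POST to the parent's callback URL.
def wait_for_change(get_modified, notify, poll_seconds=300):
    baseline = get_modified()           # remember the initial Modified time
    while True:                         # the do-until loop
        time.sleep(poll_seconds)        # the Delay action (5 minutes)
        if get_modified() != baseline:  # has the modified time changed?
            break                       # yes - exit the do-until loop
    notify()                            # call the original callback URL
```

In the real Flow, the do-until loop's own limits plus the 30-day Flow timeout bound how long this can run.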

The Parent Flow

In the parent workflow, where we need to delay until a file has changed - use the HTTP Webhook action, call the URL of the child nested Flow, and pass in the URL of the SharePoint file as well as the listCallbackUrl() from the HTTP Webhook action.

Notice that when the parent workflow is running and it enters the HTTP Webhook action, the parent Flow magically enters a long-running asynchronous wait.

Results

I go and modify the file that we are watching.

Notice the child nested Flow wakes up after the next delay and detects that the file's modified time has changed.

This exits the do-until loop and calls the callback URL.

Calling the callback URL wakes up the parent Flow and collapses the wait.

See how the parent "waiting" action completes at some time in the future.  In my example it was 10 minutes, but in the real world this can be days or weeks.  Flow's timeout is 30 days, so we can build quite complex, long-waiting, asynchronous processes.

Extend this technique

We can use multiple parallel HTTP Webhooks if we are waiting for multiple files or process updates to finish.

The nested Flow may not just check a file - it can be any long running process.

Note also that a long-running HTTP Webhook works very much like an Approval process - the Flow will happily wait for all of these steps to return before collapsing the fan-in.

 

How to automatically enter MVP timesheets with Microsoft Flow

The Microsoft MVP Summit is next week.  I'm up at 6AM cracking away at this Swagger API file so that we can all have the awesomeness of automatically submitting MVP timesheets with Microsoft Flow.

But really, who doesn't want automatic?! 

Flow, Make It So


Plan

  • The MVP Contributions "timesheet"
  • MVP Production API
  • Custom Connection via Swagger
  • Set up the Flow
  • Future ideas

The MVP Contributions "timesheet"

The Microsoft MVP award is a recognition of our various activities throughout the previous year, and it is measured with both "reach" and "impact".  So, at a minimum, we have to do timesheets.  I really really don't like doing timesheets.

We really should be able to do this automatically.

Write a blog?  Made a podcast?  RSS -> Automatic.
Wrote a tweet?  Automatic.  (your MVP lead will probably have a chat with you about this)
IoT senses temperature change?  What better time to log an entry!
 

MVP Production API and the MVP PowerShell module

The MVP program, with contributions from several MVPs, created a set of APIs for querying and posting our profiles, contributions, and details.  The starting point is here:

https://mvp.microsoft.com/en-us/Opportunities/my-opportunities-api-getting-started

To call this API, we need two pieces of authentication - OAuth against a Windows Live account, and an API key from the MVP API.  Follow the steps in the post and you'll end up with:

  • an MVP Production API subscription, which gives you a primary and secondary Api-Key
  • an MSA application with a ClientID and ClientSecret

I want to note that there is existing work in an MVP PowerShell module:

https://github.com/lazywinadmin/MVP

It works the same way, but because it doesn't remember your tokens, every time you run the PowerShell you need to log in via Live, get a token, and make your submissions - when the PowerShell session finishes you lose everything.  So while this helps with entering the details, it doesn't manage your OAuth token for you, and it certainly isn't "hands free / automatic".

So we will do this, automatically, with Flow.

Custom Connection via Swagger

I frequently sing the praises of Jan Vidar Elven's blog post on custom connectors.

https://gotoguy.blog/2017/12/17/access-microsoft-graph-api-using-custom-connector-in-powerapps-and-flows/

I'm connecting a Custom Connection to Flow to help me manage the MSA account.  This is an extension of his detailed blog post.

To call the MVP API, we need a Swagger (OpenAPI) file to create a custom connection.  After some struggling, I've got a working version of the swagger file here:

https://github.com/johnnliu/flow/blob/master/MVP%20Production.swagger.json

You can read this, but to use it you need to replace line 35:

"default": "ae2edf7-YOURKEYHERE",

with your real subscription key from the API.  You can use either the primary or the secondary key.
Save the swagger file.  Then we go into Flow.
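If you'd rather script that edit than do it by hand, here's a minimal Python sketch.  It simply hunts for the placeholder value shown above wherever it appears, since the swagger file's exact layout may change between versions:

```python
import json

# Minimal sketch: replace the placeholder subscription key anywhere it
# appears in the swagger document.  The placeholder string comes from the
# post above; the real file's structure may differ, so we search generically.
def patch_key(node, placeholder, real_key):
    if isinstance(node, dict):
        for key, value in node.items():
            if key == "default" and value == placeholder:
                node[key] = real_key
            else:
                patch_key(value, placeholder, real_key)
    elif isinstance(node, list):
        for value in node:
            patch_key(value, placeholder, real_key)

# A tiny stand-in for the downloaded "MVP Production.swagger.json"
swagger = json.loads('{"parameters": [{"name": "subscription-key", '
                     '"in": "header", "default": "ae2edf7-YOURKEYHERE"}]}')
patch_key(swagger, "ae2edf7-YOURKEYHERE", "my-real-primary-key")
print(swagger["parameters"][0]["default"])  # my-real-primary-key
```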

Set up the Flow

Start in Flow - create a custom connector by Importing an OpenAPI file


For OAuth to work - the redirect URL from Flow must be allowed by this App

Return back to Flow Custom Connection

Create a connection

Set up the Flow to make your MVP Lead happy, because now all your contributions are going to be entered automatically.

Results

I'm triggering this by clicking a button - but you can hook it up to an HTTP Request, a schedule timer, an RSS feed, and so on.

The entry in the MVP tool.

Disclaimer

Running this Flow does not guarantee an MVP award.  But it will keep your lead happy.

 

Future Ideas

1. The Swagger File is generated from the MVP API tool, but underwent heavy modification.  For the curious you can compare the original vs my modified version.

2. As far as I can tell, the Swagger file defines two security definitions (for MSA and ApiKey), but Flow's Custom Connection UI can only handle one security setup.

Which is why I moved the ApiKey into an internal parameter within the Swagger File.

If Flow Custom Connection can handle multiple Authentication settings, then we can improve this part of the Swagger.

3. In posting new Contributions - there are several settings that are ref objects: ContributionType, ContributionTechnology (ContributionArea), and Visibility.  These should be connected to a dynamic lookup value, so that within the Flow UI we see a friendly dropdown menu that lets us select one of the friendly names.

There's always more to do, but there's also a time to stop, and publish this blog post.

 

Bulk-copy files across site collections with MicrosoftGraph and MicrosoftFlow, in parallel and in batch

MicrosoftGraph represents SharePoint document libraries as "Drives", and folders and files as "DriveItems".  It has the ability to copy these objects.

I did several tests with MicrosoftGraph and MicrosoftFlow, here are the notes.

  • Copying Files with MicrosoftFlow parallel for-each
  • Copying Files with MicrosoftGraph's $batch
  • Copying Folders when copying with MicrosoftGraph

 

Copying Files with Flow parallel for-each

  • ForEach is running in parallel mode 
  • There are 23 items in the library
  • Because Flow's HTTP action automatically follows the 202 redirect Location header, it will check when the MSGraph copy has completed (this is out-of-the-box default behaviour).
  • 23 files are copied in 12 seconds
  • No bytes were downloaded to Flow during the copy
  • HTTP Action's retry mechanism would deal with errors or request throttling

 

Copying Files with MicrosoftGraph's $batch

  • MicrosoftGraph's $batch handles 20 requests per call, so to handle 23 items we still have to use a for-each and execute two batch requests.
  • Building the $batch request needs a Flow for-each loop with an incrementing index (alas, in Flow expressions we have item() for the current item but no index() for the current index).
  • $batch executes quickly and returns individual responses (including any success, redirect or error).  BUT it doesn't follow the 202 Location header, so there's no guarantee the copy has finished.  If we want a proper fan-in, we need to track it manually with a second for-each loop querying the response headers.
  • The files are still copied successfully.  So if you don't need a fan-in scenario this may not be a problem.
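To make the chunking concrete, here's a hypothetical Python sketch of building the $batch payloads.  The /copy request shape follows the Microsoft Graph DriveItem copy API, but the item ids and destination folder here are invented:

```python
# Hypothetical sketch: chunk 23 DriveItem copies into $batch payloads.
# $batch accepts at most 20 requests per call, hence the chunking.
items = ["item-%d" % i for i in range(23)]   # 23 invented DriveItem ids

def build_batches(item_ids, dest_folder_id, batch_size=20):
    batches = []
    for start in range(0, len(item_ids), batch_size):
        chunk = item_ids[start:start + batch_size]
        requests = [{
            "id": str(i + 1),                 # request ids restart per batch
            "method": "POST",
            "url": "/me/drive/items/%s/copy" % item_id,
            "headers": {"Content-Type": "application/json"},
            "body": {"parentReference": {"id": dest_folder_id}},
        } for i, item_id in enumerate(chunk)]
        batches.append({"requests": requests})
    return batches

batches = build_batches(items, "dest-folder-id")
print(len(batches), len(batches[0]["requests"]), len(batches[1]["requests"]))
# 2 20 3 - two POSTs to /$batch instead of 23 individual calls
```

Each {"requests": [...]} payload is then POSTed to https://graph.microsoft.com/v1.0/$batch.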

 

Copying Folders via MicrosoftGraph

  • In MicrosoftGraph, a folder is a DriveItem
  • So copying folder structures is native, and nested
  • We NEVER had this in SharePoint
  • The 202 redirect took longer as it has to copy child items

 

 


Send mail as anyone - #MicrosoftGraph and #MicrosoftFlow (bonus: inline image attachments)

Sometimes while browsing MS Graph permissions, you come across something like this:

This is an Application Permission; it says "Send mail as any user".
The correct next step, of course, is to drop whatever we've been working on and immediately play with this.

Plan

  • Explore what is Send mail as any user
  • Do attachments
  • Do inline attachments

First, while still in the Azure Portal - grant this permission to the current service app.

Next, we head over to Microsoft Graph Explorer and grab a sample of the Exchange SendMail 

sendMail is usually on /v1.0/me/sendMail for delegated permissions, but since we are testing sending email as another user, I replaced the URL with:

/v1.0/users/{user-id-guid}/sendMail

This is really as simple as it looks.

Gandalf is now asking me to try the new cafeteria.

What can you do with this?  Lots of people ask the Flow team how they can send email on behalf of the current user to their managers.  This will do it easily, without having to worry about delegate permissions.

Extra notes: the service app does not have permission to read the user's inbox or log in as that user.

And if it's so easy to send emails via MS Graph, let's try attachments.

Do Attachments

We add the JSON for attachments.  For this we need the $content-type and $content.
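As a rough sketch of that split (with an invented stand-in byte string for the real file): Flow represents binary as a JSON wrapper holding "$content-type" and base64 "$content", which map onto the Graph attachment's contentType and contentBytes.

```python
import base64

# Stand-in bytes; in the real Flow this comes from the file content.
file_bytes = b"hello attachment"

# Flow's JSON representation of binary data
flow_binary = {
    "$content-type": "text/plain",
    "$content": base64.b64encode(file_bytes).decode("ascii"),
}

# Map it onto a Graph fileAttachment
attachment = {
    "@odata.type": "#microsoft.graph.fileAttachment",
    "name": "note.txt",
    "contentType": flow_binary["$content-type"],
    "contentBytes": flow_binary["$content"],
}
print(attachment["contentBytes"])  # aGVsbG8gYXR0YWNobWVudA==
```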


Result:

 

Now, do inline attachments

Why do we do inline attachments?  There are a few options for embedding images in a rich HTML email.  You can use a background image via CSS - this is ignored by almost every mail client.  You can use an inline img with a dataUri - this pretty much only works in iOS.

The oldest way, and still the most supported way, is the inline attachment.  The trick is that your attachment must have an additional content-id (cid:) header with a unique name.  Then the HTML mail body can refer to that image with <img src="cid:xxxyyy" />

I left the original attachment in as a comparison.  See that the second attachment is marked inline and has a contentId - boromir12345.  The contentId is used in the HTML content as an embed reference: cid:boromir12345
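Putting it together, a hypothetical sendMail body with one inline attachment might look like this.  The recipient and image bytes are invented; the attachment fields (isInline, contentId, contentBytes) follow the Graph fileAttachment resource:

```python
import base64

image_bytes = b"\x89PNG\r\n..."  # stand-in for a real image

message = {
    "message": {
        "subject": "One does not simply inline an image",
        "body": {
            "contentType": "HTML",
            # the img src refers to the attachment's contentId via cid:
            "content": '<p>Behold:</p><img src="cid:boromir12345" />',
        },
        "toRecipients": [{"emailAddress": {"address": "frodo@example.com"}}],
        "attachments": [{
            "@odata.type": "#microsoft.graph.fileAttachment",
            "name": "boromir.png",
            "contentType": "image/png",
            "contentBytes": base64.b64encode(image_bytes).decode("ascii"),
            "isInline": True,
            "contentId": "boromir12345",  # must match the cid: in the HTML
        }],
    }
}
```

This body is POSTed to /v1.0/users/{user-id-guid}/sendMail with the application token from earlier.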

Result:

Summary

MSGraph lets you send email as anyone.  
It also gives you much better controls over attachments and inline images.  

Please watch the recent MS-Flow webinar I did with the MS-Flow team focused on working with binary data.  It explains how I take binary values and split them into $content and $content-type.

 

Serverless Parallelism in Microsoft Flow and SharePoint

This is a short post about running things in parallel.  There are two angles to this:

"Sometimes you fan-out to 50 parallels and want to run them as quickly as possible, sometimes, you want them in a single file and no one skips a queue".
 

Plan

  • Parallelism Settings in For Each (AKA: when to fan-out to 50)
  • Parallelism Settings in SharePoint Trigger (AKA: and when to queue up in single file)

For Each 

In 2017, we saw Microsoft demo parallel settings in Logic Apps - support for this dropped into Flow's UI recently.

This allows everything in the loop to run at the same time, in batches of up to 50.  A great example is copying a lot of files from one SharePoint library to another.

This Flow, by default, runs one element at a time.  It takes 45 seconds for 23 files.

Running For Each in parallel, the copying takes 6 seconds.

Advanced use cases of Parallel For Each:

In SharePoint site provisioning - split a large PnP template into many small ones, and run them in parallel.  Since PnP Provisioning is additive, most of the actions can finish on their own.

The HTTP action to an AzureFunction has an automatic retry policy by default.  So if an AzureFunction fails, it will retry (the default is 4 times with a 20-second delay).

See also:

http://johnliu.net/blog/2016/11/build-your-pnp-site-provisioning-with-powershell-in-azure-functions-and-run-it-from-flow

http://www.vrdmn.com/2018/01/site-designs-flow-azure-functions-and.html

 

Sometimes, instead of running so many items at once in parallel (fan-out), we want to make sure only one item runs at a time.  This brings us to the second part of this post.

Parallelism Setting in SharePoint Trigger

This one is trickier, and it's not always clear _why_ you need to do this.  But sometimes, you need to stop parallelism, and handle things one at a time.

Setting parallelism to 1

There's a problem: Split On and Concurrency Control are mutually exclusive.

So we need to turn off Split On.  This means the trigger will now return an array of SharePoint list items (because the trigger works like a delta query).

I quickly enter about 10 list items in a SharePoint list - triggering a few Flow runs.

The result of the concurrency/parallelism control here is that only one Flow runs at a time.  Each run gets a batch of items that we'll need to handle individually, but the runs do not overlap.

Advanced use case of the parallel setting on the SharePoint trigger:

If you are generating sequential incremental numbers on your SharePoint list, this is very useful to prevent two Flows from running at the same time.
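A tiny Python sketch of why the concurrency setting matters here.  With Split On off, the trigger hands each run an array of new items; because only one run executes at a time, reading and advancing the counter is safe.  The one-element list is an invented stand-in for wherever you persist the last number used:

```python
# Invented sketch: assign sequential numbers to a batch of new list items.
# With concurrency set to 1, no two runs execute this at the same time,
# so the counter can never hand out the same number twice.
def assign_numbers(items, counter):
    for item in items:
        counter[0] += 1                    # advance the persisted counter
        item["TicketNumber"] = counter[0]  # stamp the item
    return items

counter = [100]  # stand-in for a persisted "last number used"
batch = assign_numbers([{"Title": "a"}, {"Title": "b"}, {"Title": "c"}], counter)
print([item["TicketNumber"] for item in batch])  # [101, 102, 103]
```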

Summary

  • Parallel Settings on For Each 
  • Parallel Settings on SharePoint Trigger
  • And a small note about Split On and handling an array of items on a Created/Updated event