Office 365 Saturday Canberra 2018 #O365CBR #SPSCBR

This Saturday - 14 July, we'll be converging on SharePoint and Office 365 Saturday Canberra.  I'm planning to take my time and drive down for the weekend.


https://www.meetup.com/en-AU/O365-Saturday/events/252110689/ 

The event is held at the Microsoft office this time (not at the previous Cliftons venue).  We will be covering a mix of Office 365, modern SharePoint, and security topics.

I'll be around all day as well - ask me anything about SharePoint, Flow, PowerApps, Office Development, Todo, Xbox and everything!

I believe we have an Xbox One and a bunch of swag to give away too!  But you are definitely coming for the wide range of content and topics, and definitely not for the swag.

We hope to see you there!

 

10 Things in Microsoft Flow that aren't in Azure Logic Apps

Sorry about the catchy headline.  I will start by saying I am perfectly ready to see a response post containing 20 things in Azure Logic Apps that we wish were in Microsoft Flow.  The point of this post isn't to argue whether one product is better than the other; it is simply to highlight the very intentional design differences, and how, as users, we have access to both and should make our choice accordingly.

If a comparison must be made, then I think in reality they are better seen as two siblings - Logic Apps is the big sibling with more features; Flow piggybacks on Logic Apps but has several unique tricks of its own, and sometimes features move between them.  I love both teams & products.

To best describe Flow to an Azure / Logic Apps person: Flow is Logic Apps plus power-user and human-workflow-focused workloads, combined with a mobile experience and better in-product integration.  As a result, it caters to a whole different set of scenarios that Logic Apps isn't focused on.

 

1. Resource-Owned Flows

Flows can have multiple owners, but a Flow can also have a resource as an owner.  The best example of this is in the SharePoint connector.  A Flow can be 'owned' by a list or library resource - so if we grant a user library-owner permissions, that user can automatically see and modify the Flows owned by the resource.  This is awesome because we don't need to manage two separate sets of ownership.

 

https://flow.microsoft.com/en-us/blog/share-with-sharepoint-office-365/


2. Run-as user

Several Flow connectors have a concept of the run-as user: the user can select a resource like a document or a library, and run the Flow as the current user.

Logic Apps connectors can only run as the maker.

3. Approvals

Flow implements a simple Approvals API with both one-must-approve and everyone-must-approve modes.  This is set up with Office 365's Actionable Messages, so tasks can be completed directly from email.  They are also available within the Flow mobile app.  Approval tasks can be reassigned, and Flow keeps a history trail of them.

In Logic Apps, human approvals can be built with Outlook's Send approval email action.

https://flow.microsoft.com/en-us/blog/introducing-modern-approvals/


4. Notifications

Flow can send Mobile Notifications to the accompanying Flow mobile app.  

https://docs.microsoft.com/en-us/flow/guided-learning/build-flows?tutorial-step=2#step-2

5. Flow Buttons

Flow has digital buttons called "Flow buttons".  These appear as quick triggers in the Flow mobile app, but they are also a really easy way to set up a run-now test trigger.

https://flow.microsoft.com/en-us/blog/button-file-inputs/

6. PowerApps Trigger

Flow has a PowerApps trigger (and response) that can send structured JSON data to and from PowerApps.  This makes it much easier to use Flow as server-side middleware to extend PowerApps (which is client-side).

Logic Apps has to publish a custom connector instead (which can then be used in Flow and PowerApps).

 


7. Environments

Flows are created and grouped within environments - an environment groups its own assets, shared custom connectors, and data loss prevention policies.

Logic Apps are grouped within subscriptions - you either have access to the subscription or you don't.

8. Selected-Row Trigger

Selected row is part of Flow's product-integration story - Flows can be created and run from within other products as part of an integrated experience.

Two examples we have right now are in SharePoint and (soon) in Excel.
On the roadmap there is also Outlook integration.

This trigger is special in that, in each of these integrated experiences, you can select an existing item (a SharePoint list item, an Excel row, or an email) and start a Flow with that item as the trigger source.  Additionally, Flow can run as the current user (see #2) as part of this integration.

https://flow.microsoft.com/en-us/blog/spring-2018-update/


9. Analytics

Flow has several built-in analytics charts out of the box, powered by Power BI.

Logic Apps has Log Analytics integration, and users can build their own analytics via Insights.

10. Flow Management Connector

In Flow, the Flow Management Connector is a meta-level connector that lets you perform reflection-like actions on the Flows within your current environment.  You can even use Flow to make other Flows.

Somewhere, there is an insane Flow engineer who says: wouldn't it be ultra-meta to deploy Flows with Flow?

Logic Apps' own Logic Apps connector only lets you list other Logic Apps in the current subscription and run them.  To deploy Logic Apps, talk to the Azure Management API and deploy ARM templates.

I use this connector quite often to move Flows around:

http://johnliu.net/blog/2017/12/you-must-copy-all-your-flows-to-sharepoint-simple-ideas-are-the-most-brilliant

http://johnliu.net/blog/2018/5/save-all-your-flows-to-vsts-via-http-rest-in-8-actions
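As a rough outline (the action names come from the Flow management connector; parameters are simplified, and the file destination is up to you), the "save all my Flows" pattern in those posts looks like this:

// Outline (simplified sketch) - back up every Flow in the current environment
1. List My Flows                  // Flow management connector
2. Apply to each                  // over body('List_My_Flows')?['value']
3.    Get Flow                    // fetch the full definition of the current Flow
4.    Create file                 // save string(body('Get_Flow')?['properties']?['definition'])
                                  // to SharePoint (or push to VSTS over HTTP)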

Think about this.  Flow can read itself.  Flow can call Azure Machine Learning.  Flow can update itself.

Insane.


11. Business Process Flows

Business Process Flows replace the old business process workflows in Dynamics 365, and are a way to build a "state machine" that triggers and transitions between different business process stages in Dynamics.

https://flow.microsoft.com/en-us/blog/spring-2018-update/

This integration with the Dynamics platform isn't available in Logic Apps.

(I included an 11th because some may be picky and say well #9 analytics isn't a special power...)


Example of a powerful Flow feature that made its way back to Logic Apps

12. Data Gateway

Flow, as part of the Business Application Platform, had on-premises integration through the Data Gateway first - including calling on-premises SharePoint, SQL, file system, and REST endpoints.

This feature was later integrated back into Logic Apps as well.

https://flow.microsoft.com/en-us/blog/on-premise-apis/

I connect SharePoint to my Minecraft via the Data Gateway with a custom REST API.

http://johnliu.net/blog/2017/10/from-office-365-to-minecraft-connected-with-flow


Summary

Flow, Logic Apps, Azure Integration - these are a multi-headed effort to expand in multiple directions, each under a specific product offering.

Specifically, Flow's special powers fall under these categories; some are easy, and others not so easy, to replicate back in Logic Apps:

  • Flow Mobile app
  • Approvals
  • App-Integrated Flow (SharePoint, Dynamics, Teams, Excel, Outlook)
  • BAP-Integration  (Environments, Data Gateway, PowerApps, PowerBI) 

In Dynamics and Office 365, because we get a generous pool of free Flow runs as part of the license, Microsoft Flow can be cheaper.  Flow costs per run, so it encourages building long-running, multi-step Flows suitable for human workflows.

But in my own experience, for some of the Flows that I call a lot (but that don't have many actions), it's actually cheaper to switch to Logic Apps, which bills per action instead.  I consider these Flows more like middleware calls: HTTP request, do a few actions, finish.

Also, in building Flow Studio, which works across tenants, I've opted to use Logic Apps rather than consume the Flow runs of a single Office 365 tenant.

Different scenarios, for different needs.

How to get live FIFA World Cup results via Microsoft Flow into your SharePoint Intranet WebPart

GitHub tweeted a link to an npm Node.js CLI project that uses data from http://football-data.org.  Seeing that, I decided we need to build a modern SharePoint SPFx web part so we can load it into all our intranets.

Plan

  • Use Microsoft Flow to call the football-data API
  • A Thesis in Microsoft Flow's Parse JSON action
  • Format the data and write it into an HTML segment in SharePoint
  • Build a simple SPFx webpart that will load the HTML segment into the DOM
  • Future: Using Events API
  • Future: Using API Key
  • Downloads

Use Microsoft Flow to call the football-data API

We want to build a web part that shows a live table of the World Cup fixtures and results.

First let's look at the data source API.

That's nice - we don't need to figure out which competition is the 2018 World Cup.  It's 467.

http://api.football-data.org/v1/competitions/467

We want the game fixtures though, so we will call

http://api.football-data.org/v1/competitions/467/fixtures

(from http://api.football-data.org/documentation )
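For reference, each entry in the returned fixtures array looks roughly like this (abridged, values made up; these are the fields we pick out below):

{
  "date": "2018-06-16T10:00:00Z",
  "status": "FINISHED",
  "matchday": 1,
  "homeTeamName": "France",
  "awayTeamName": "Australia",
  "result": {
    "goalsHomeTeam": 2,
    "goalsAwayTeam": 1
  }
}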

 

A Thesis in Microsoft Flow's Parse JSON action

To make the Select operation easier, we have two choices:

  • "type the expressions manually" or
  • "configure Parse JSON to make our lives easier"

Parse JSON is a detailed exercise that requires a full post by itself, since it requires describing the various problems we'll hit and how to get around them.  The Parse JSON step is covered in the next blog post:

http://johnliu.net/blog/2018/6/a-thesis-on-the-parse-json-action-in-microsoft-flow

 

Format the data and write it into an HTML segment in SharePoint

If we skip Parse JSON, then we'll need to type out the expressions manually.

We are selecting From:

body('HTTP_world-cup-fixtures')?['fixtures']

And we need the properties:

date: convertFromUtc(item()?['date'], 'AUS Eastern Standard Time')
Day: item()?['matchday']
Team 1: item()?['homeTeamName']
Team 2: item()?['awayTeamName']
Team 1 Goals: item()?['result']?['goalsHomeTeam']
Team 2 Goals: item()?['result']?['goalsAwayTeam']
Status: item()?['status']
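
Put together in a Select action, the code view looks roughly like this (a sketch - I'm assuming the HTTP action is named HTTP_world-cup-fixtures):

// Select-Fixtures (sketch)
{
  "from": "body('HTTP_world-cup-fixtures')?['fixtures']",
  "select": {
    "date": "convertFromUtc(item()?['date'], 'AUS Eastern Standard Time')",
    "Day": "item()?['matchday']",
    "Team 1": "item()?['homeTeamName']",
    "Team 2": "item()?['awayTeamName']",
    "Team 1 Goals": "item()?['result']?['goalsHomeTeam']",
    "Team 2 Goals": "item()?['result']?['goalsAwayTeam']",
    "Status": "item()?['status']"
  }
}

Feed the Select output into a Create HTML table action to get the table we write to the HTML file.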
 

If we used Parse JSON, then the step to select the properties is easier.

 

We do the work in Parse JSON to describe what the types of the properties are - this allows the dynamic content pane to be more intelligent in showing us the appropriate choices.

We still have to type out the date conversion to local time, unless you want UTC.

We have our nice HTML table now.

[Screenshot: flow-fifa-request-call.jpg]

 

Build a simple SPFx webpart that will load the HTML segment into the DOM

For this step, I cheated and just used Mikael Svenson's wonderful Modern Script Editor web part.

Then I pasted in this code: 

<div id="fifa-games"></div>
<script>
window.fetch('/Shared%20Documents/fifa-2018-fixtures.html', {
  credentials: "same-origin"
}).then(function(response){
  return response.text();
}).then(function(text) {
  document.getElementById('fifa-games').innerHTML = text;
});
</script>

Basically, fetch the contents of the HTML from fifa-2018-fixtures.html and write it into the DIV#fifa-games element.

You can also just use the old Content Editor web part with its Content Link property pointed at the file, but that doesn't work on the new modern pages.

Results

Works really well in Microsoft Teams too.

 

Future: Using Events API

football-data.org has an Events API where they will call our callback URL when a team scores in a game fixture.

http://api.football-data.org/event_api

This requires a first Flow to loop through all the fixtures in the FIFA 2018 competition and set up an event callback for each fixture.  Then, when those games are played, if any goals are scored, the callback can trigger the same parent Flow to immediately update the results HTML table.

This implementation won't require a scheduled recurrence to refresh the data daily (or hourly).  It can be refreshed on-demand!

Future: Using API Key

When calling the football-data.org API without an API key, there is a maximum rate limit of 50 calls per day.  We can sort of see this in the response headers (the remaining count goes down by 2 per execution).

Now, because this is running from a Microsoft Flow server, I have no idea whether your calls are rate-limited by the same IP address as someone else running this in your Azure datacentre region.  So if you do hit the rate limit, it is better to register an API key with an email address, and then add it to the HTTP request headers.
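In the HTTP action, that means adding football-data.org's documented X-Auth-Token header (a sketch - substitute your own key):

// HTTP action - headers (sketch)
{
  "X-Auth-Token": "<your-api-key>"
}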

Do Group By in Microsoft Flow with two simple elegant loops

This is a problem I worked on with Fausto and is a common problem I see when working with rows of data in Microsoft Flow.

Plan

  • Scenario: Group By
  • Doing Group By Elegantly
  • Build it in two parts
  • Run it quickly

Scenario: Group By

Group-by is a common problem - usually we see it in the form of batching rows of data to some sort of output.  One frequent example is a schedule that looks up a bunch of tasks or messages for the day, and sends each person only ONE email with their group of tasks.

Doing Group By Elegantly

How do we do group-by elegantly, and quickly?  Yes, group-by can be done with for-each and append-to-array, but that approach leads to fairly complex-looking inner loops and wide-layout if-conditions.

The solution we ended up with is elegant: two for-each loops.  It is easy to follow and easy to test - which means you can easily reconfigure it for your own scenario.

Let's take an example

[Screenshot: pivot-this-list.jpg]

I have a list here with 6 rows from SharePoint and 3 names.  In the first loop, we want to reduce the array of 6 rows into unique names:

[ "Gandalf", "Boromir", "John Liu" ]

Then in the second loop, we can loop over each of these unique names and build the corresponding list of rows for each name:

[{
  "name": "Gandalf",
  "items": [ "Wizard", "Remember Password"]
},
{
  "name": "Boromir",
  "items": [ "One Does not Simply", "**SPOILER**"]
},
{
  "name": "John Liu",
  "items": [ "Making up lists", "Ninja"]
}]

First Loop

In each iteration of the loop, we use the union() expression to combine two arrays.  union() has the special property that if an element already exists in the array, the duplicate is omitted.

union( ["Gandalf", "Boromir"], ["Gandalf"] ) => ["Gandalf", "Boromir"]

We do the union in a compose action, and then put that result back into the array variable.

In my example, I'm using SharePoint's people fields - so I'm stepping into item()?['Person']?['Email']

That's the first loop.  Easy to test - one expression.
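
As a minimal sketch (the action and variable names here are mine, and I'm assuming the rows come from a SharePoint Get items action), each iteration does:

// Apply to each : over the source rows, e.g. body('Get_items')?['value']
//   Compose-Union :
union(variables('unique_names'), createArray(item()?['Person']?['Email']))
//   Set variable : unique_names = outputs('Compose-Union')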

Second Loop

The second loop iterates through the array of unique_names, and its first action stores the current value in a Compose step.

Then we use Filter array to select only the rows with that unique name from the original table.
Then we use Select to pick only the columns from the original table that we want to see.  (The second loop looks busier than it is, because I'm using techniques from my other blog post "using Select to simplify your Create HTML Table".)

I append the results as an object to a running array of results.  But it is also very easy to just send an email at this point.

For each unique name (or email), this sends one email whose body is a summary HTML table of that person's items in the list.
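
A sketch of the second loop (again, the action names are mine):

// Apply to each : over variables('unique_names')
//   Compose-Name : stores the current unique name
item()

//   Filter array : keep only this person's rows
From:  body('Get_items')?['value']
Where: equals(item()?['Person']?['Email'], outputs('Compose-Name'))

//   Select : keep only the columns we want in the summary
From: body('Filter_array')
Map:  { "Title": item()?['Title'] }

//   Append to array variable (or build the HTML table and send the email here)
{ "name": outputs('Compose-Name'), "items": body('Select') }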

Run it quickly

Expression operations like Compose, Filter, Select, or union() are fast - most of the time they run in 0 seconds.

For each steps and Set variable actions are slower, because global locks are being applied.  By reducing the use of variables, we can make the loop go much faster.

Elegant and fast.

 

Summary

Two elegant loops.  No crazy if-conditions checking whether a value already exists in an array during append.

This is a very useful pattern if you are sending daily emails or summary notifications to a user and you want to batch the results.

I wanted to end here on more loops.  Loopity loop.

Microsoft Flow: SharePoint Trigger on specific fields changed via SP HTTP Request

[Screenshot: result-nochanged.png]

A very common request in Microsoft Flow for SharePoint is a trigger that only runs when a certain field has changed, or when it has changed to a certain value.

With the new SharePoint "Send an HTTP request to SharePoint" action, we can now do this in relatively few steps with the help of List Versioning.

Plan

  • Enable List Versioning
  • Obtain item versions results on item change
  • A conditional fast exit
  • Understand the data formats and how to debug
  • Summary

Enable List Versioning

[Screenshot: List-Versioning.png - turn on versioning in the list settings]

Obtain item versions results on item change 

// SPREST-Versions : Send an HTTP request to SharePoint
// gets the two most recent versions of the item that triggered the Flow
"method": "post",
"body": {
  "method": "GET",
  "uri": "_api/web/lists/getbytitle('ListWithHistory')/items(@{triggerBody()?['ID']})/versions?$top=2",
  "headers": {
    "accept": "application/json; odata=nometadata"
  }
}

// SelectVersions : keep only the fields we care about
{
  "from": "body('SPREST-Versions')?['value']",
  "select": {
    "ID": "item()?['ID']",
    "VersionLabel": "item()?['VersionLabel']",
    "Title": "item()?['Title']",
    "Demo": "item()?['Demo']",
    "Completed": "item()?['Completed']"
  }
}

Versions returns the latest version first.
But there are a few tweaks we can do here:

  1. If we ONLY ever care about the latest two versions - we use $top=2
    this covers the new version and the previous version.

  2. If we care about the latest versions within the last few minutes - we can filter with $filter=Modified gt 'last-5-minutes-date-expression'

    The expression would be addMinutes(utcNow(), -5), but it should be formed in a separate action above - see the sketch after this list.

  3. I have seen really complex, mixed-up Flow triggers with a lot of overlapping list item updates - I highly advise designing the workflow to not do that.  This pattern can help decouple those situations.
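
Here is the sketch for point 2 (the Compose action name is mine; the datetime'' literal is SharePoint REST's OData filter syntax):

// Compose-5MinutesAgo :
addMinutes(utcNow(), -5)

// SPREST-Versions uri :
_api/web/lists/getbytitle('ListWithHistory')/items(@{triggerBody()?['ID']})/versions?$filter=Modified gt datetime'@{outputs('Compose-5MinutesAgo')}'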

The versions endpoint returns many fields - the Select above cleans the result up to only the specific fields we care about.

A conditional fast exit

// Condition-SameTitle : true when the Title did not change between the two versions
equals(first(body('SelectVersions'))?['Title'], last(body('SelectVersions'))?['Title'])

Results

Results for success and failure. 
This check can be performed with ~4 actions, as part of the pre-condition check for whether to continue the Flow.
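
Roughly, the whole pre-condition check looks like this:

// ~4 actions
1. Trigger            : when an item is created or modified
2. SPREST-Versions    : send an HTTP request to SharePoint (GET .../versions?$top=2)
3. SelectVersions     : keep only the fields we want to compare
4. Condition-SameTitle: if the last two Titles match, stop; otherwise continue the Flow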

 

Understand the data formats and how to debug

Understanding this step is interesting for debugging purposes.  Since we are calling REST and getting back JSON, we need to know how to read it and navigate it with expressions.

Run this without any extra headers, and we'll see the default output, which is the verbose OData format - the array of versions is wrapped in d/results.

But for our needs, it gets a lot easier if we apply an additional accept header: application/json; odata=nometadata.

See also: https://www.microsoft.com/en-us/microsoft-365/blog/2014/08/13/json-light-support-rest-sharepoint-api-released/

With either method, we end up with one of these expressions to get to the result array of versions:

body('SPREST-Versions')?['d']?['results']
body('SPREST-Versions')?['value']

Summary

  • Tip 1 - use the accept header to get a simpler result

  • Read the output to understand the returned JSON format

  • Tip 2 - using $top=2 is simpler than Tip 3.

  • Tip 3 (advanced) - use $filter to compare all the changes in a time window as a batch, not just the last two versions

  • Tip 4 (advanced) - on the created event there will be only one version, so
    first(body('SelectVersions'))?['Title']
    last(body('SelectVersions'))?['Title']
    refer to the same row.

    We can force the comparison with:
    body('SelectVersions')?[0]?['Title']
    body('SelectVersions')?[1]?['Title']

    The second row [1] doesn't exist, but the ? operator makes the query return null instead of failing.

  • Use expressions to navigate the JSON structure

  • Use Select to simplify the result array

  • Remove complexity from overlapping Flows hitting update on the same list item by making sure Flows exit early when not required - this will simplify a lot of our Flow designs

  • Tip 5 (advanced) - in the SelectVersions step, do not include the VersionLabel; then in the comparison step, compare the two JSON objects string(first(body('SelectVersions'))) and string(last(body('SelectVersions'))) directly.  This allows us to check all the fields together.