MS Ignite the Tour 2019 Sydney - MS Flow x2

I'll be presenting two Microsoft Flow sessions at Microsoft Ignite the Tour 2019, Sydney.


Advancing the Flow - understand expressions in Microsoft Flow

This is a short 15-minute theatre session on expressions. In that time, I'd like to cover why you might want to use expressions, and how they open the entire Flow engine to your command.

Flow for Developers - insane low-code Serverless automation

This is a full-hour breakout session where I talk about what we can do with Flow at the intermediate to advanced end, and the power it opens up on every platform it touches - whether that's SharePoint Online, Dynamics CRM, PowerApps, or Power BI, from Microsoft 365 to Dynamics 365 to Azure. With 250+ connectors, it is completely bananas for developer productivity.

Chat!

Those who have met me will know I'm a very chatty person!

I’ll be hanging out at the community booths or at the SharePoint Gurus + Valo booth, so come find me, or tweet me and ask your Flow (or SharePoint) questions!

The simplest No-Code Solution to Save Picture Files from PowerApps to Flow


Today, we are going to talk about a new technique (hack) to send any pictures from PowerApps to Flow.

We will do this with a modified Flow Button trigger.


Update 2019-04 Save Picture Files from PowerApps to Flow via unused Outlook Connector

I've come up with an EVEN EASIER hack that you should check out. The 2019 April method is the best one so far.

http://johnliu.net/blog/2019/4/flowninja-hack-87-lock-microsoft-flow-powerapps-trigger-to-upload-images-to-sharepoint-with-ease

Update 2019-03: there are some JSON definition changes to the PowerApps trigger. See the note below.

Update 2019-01 - I did a YouTube recording of this blog post

This is quite possibly the easiest way to upload an image from PowerApps into SharePoint - from the Add Picture control (not the low-res camera control), without having to write a custom connection - by hacking a Flow Button into a PowerApps trigger.

Why John, what is the problem? 

Why do you repeatedly write the same post?

The problem is that we want to use PowerApps to collect binary files (mainly images, but also documents, video - anything) and send them to Flow - Flow can then decide where to send them: to SharePoint, to email, to Cognitive Services, etc.

This isn't solved out of the box, and there are several good workarounds - some choices more difficult (or limited) than others.

 

What have we got so far?

This is the first method - we can send PowerApps' camera output to Flow via the built-in dataUriToBinary() expression.  But it doesn't work for Add Picture from the device gallery.  We also can't use this method for sending files - documents, zip, sound, movie clips.
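For reference, the camera-control method boils down to a single expression in Flow - something like this, where the trigger parameter name (Photo_dataUri here is hypothetical) is whatever "Ask in PowerApps" generated for you:

```
dataUriToBinary(triggerBody()['Photo_dataUri'])
```

The camera control emits a data URI string ("data:image/jpeg;base64,..."), and dataUriToBinary() decodes it back into binary content.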

The second method lets us use Add Picture from the gallery, and we can send any binary file format.  But the input parameter is specialized multipart form-data bytes, so this method needs a Custom Connection (and a Swagger definition file) to work with PowerApps.
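For comparison, the custom-connection method ends up reading the uploaded file out of the multipart form data with an expression like the one used later in this post:

```
triggerMultipartBody(0)?['$content']
```

This is exactly why it needs a Swagger definition - PowerApps has to be told the endpoint consumes multipart/form-data.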

 

What is this new method?

We are going to take a Flow Button and hack it into a PowerApps trigger to send Files from PowerApps to Flow.

It will be magical.  Because there will be no Swagger.

 

1. Build a Button-triggered Flow

We start with a Flow Button trigger, which lets us specify which parameters we want as part of the button click.  The Flow Button lets us select File.

Create FileName and FileContent compose actions to extract the values from the File trigger parameter.  Then use SharePoint's Create file action to create the file.
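As a sketch, the two compose actions reach into the trigger body for the file parameter's two properties (assuming the parameter was named file):

```
triggerBody()?['file']?['name']
triggerBody()?['file']?['contentBytes']
```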

Test this Flow - upload a picture of LEGO 10234 - success.

 

2. Examine this Flow Button trigger and switch it to a PowerApps trigger.

Next, we need to check the Flow button trigger and convert it to a PowerApps trigger.

This can be done by exporting the Flow, opening up the exported ZIP file, and changing the JSON definition of trigger.manual.kind from "Button" to "PowerApp".  But here I'm using a simpler way: opening the Flow with Flow Studio's Edit Flow JSON capability.
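Trimmed down to the part that matters, the change in the trigger definition looks something like this (a sketch - the real definition also carries the inputs schema, which stays untouched):

```json
"triggers": {
    "manual": {
        "type": "Request",
        "kind": "PowerApp"
    }
}
```

The only edit is the kind property - it was "Button" before.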

Try it: https://flow-studio.azurewebsites.net/welcome

Save the changed definition and this is now a PowerApp trigger.

Note: Have a look at the JSON schema generated for the Flow Button's file trigger parameter.  We'll need to know this later.
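A sketch of what that generated schema looks like for a File parameter named file - treat this as an illustration of the shape rather than the exact generated JSON:

```json
"file": {
    "type": "object",
    "properties": {
        "name": { "type": "string" },
        "contentBytes": { "type": "string", "format": "byte" }
    }
}
```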

 

3. Build our PowerApp

Note - this is the Add Picture control.  Not the camera control.

Because our Flow now has a PowerApps trigger, it appears in the available list without us having to build a Swagger file.

The parameter shape needed is { file: { name: "x", contentBytes: ... } }.
This matches what the Flow Button wants for a file upload.
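The call from PowerApps then looks something like this - a sketch, assuming the Flow is named SendFileToFlow and an Add picture control named AddMediaButton1 (both names hypothetical):

```
SendFileToFlow.Run(
    {
        file: {
            name: "lego.jpg",
            contentBytes: AddMediaButton1.Media
        }
    }
)
```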

 

4. That's it.  Done.

Run the PowerApp to test.

And the result - LEGO 851400 cup is uploaded to SharePoint via Binary.  No Swagger.

 

Future thoughts

Eventually, I hope we'll have a way to officially define the input parameters we want to use in a PowerApps trigger.  The way it currently works is awkward - it should just work the same way the Flow Button trigger does.

Until that day - this method is possibly the simplest way to send any binary file from PowerApps to Flow.  Because it doesn't need custom connections, you may find it useful for staying under your Flow plan's limit of one custom connection.

Underneath the hood though, a Flow Button trigger is the same as a PowerApps trigger (they are both HTTP Request Triggers).

 

Download File:

https://github.com/johnnliu/flow/blob/master/FlowNinja-SendFileToFlow_20180728000358.zip

 

Bonus Exercises:

  • Create multiple files first in Flow Button before converting to PowerApps trigger.

  • In the definition, make some (but not all) of the file parameters required

 

Update: adding additional arguments

The converted PowerApps trigger works, but you may find it troublesome to edit the Flow after it has been changed.  Attempts to use an Ask in PowerApps parameter within the Flow designer will cause the PowerApps trigger to be regenerated, potentially breaking the file argument.

Instead, do this.  First make sure it is a Button trigger again.  

Then, create multiple arguments with the Flow Button.

Note the two new arguments have very boring names - text and text_1.  These are two new string arguments in addition to the file JSON record object.

 

2019-03 update

  1. When editing the Button trigger to a PowerApps trigger:

    Change "Button" to "PowerApp"
    Rename the variables to nicer names - file1 and file2, rather than file and file_1
    Fix up references within the Flow to point to triggerBody()?['file1']?['name'] and triggerBody()?['file1']?['contentBytes']

  2. In PowerApps - call Flow method with two files

  3. See result

Speaking and Hackathon at Digital Workplace Conference Australia - Melbourne


In a little less than a month, on August 15-16, I'll be presenting in Melbourne at the Digital Workplace Conference Australia 2018.

www.dwcau.com.au

(I still fondly remember this conference as the annual Australian SharePoint Conference - but all things evolve: SharePoint to SharePoint Online to Office 365, and the workplace evolved from portals to intranets to social platforms to conversational platforms.)

Flow and Functions: level up our Serverless Toolkit in Office 365

I will be presenting an amped-up talk covering implementation and automation with Microsoft Flow and Azure Functions.  This is one of several sessions on Microsoft Flow at this conference, so I will be covering implementation and mastery at level 200+.  (But because it's Flow, it'll still look deceptively simple!)


John joins forces with Paul Culmsee and Ashlee at the #DWCAU pre-conf hackathon

I also want to mention that I'm helping out with Paul Culmsee and Ashlee's PowerApps and Flow Workshop/Hackathon on 14 August 2018.

http://www.dwcau.com.au/workshops/powerapps-hackathon/

I personally don't know how much raw PowerApps and Flow potential will be in that room at the hackathon.  I really want to find out, and I hope you do too.  Space is limited - please consider this one-day pre-conference workshop.

"Learn PowerApps and Flow from experts and build an app that you need." says John.

"Two MVPs for the price of one" says Paul.  I'll just leave this here.

Whether at the conference, the hackathon, or both - I hope to see you there in Melbourne.

Betting on 2018 - level up our Serverless in Azure

A recent conversation got me thinking about making some predictions for 2018.  This isn't so much a "ha, look, I was right in 2019" post.  It's more about internalizing and verbalizing my choices, and I think there's value in sharing all this thinking.

So here it is, all of it: notes, wishlist, observations, what other people are doing, what we should be doing.  All in one overview blog post.  Happy 2018.


 

 

Bet on Serverless

You can't look sideways without seeing "Serverless".  It's a silly term, but I need to start with a definition of "Serverless" by Serverless experts:

  • Use a compute service to execute code on demand
  • Write single-purpose stateless functions
  • Design push-based event-driven pipelines
  • Create thicker front-ends
  • Embrace third-party services

I started on this path in 2016 and I can't look back.  Being able to run your code anytime, in the cloud, is a life-changing experience for many of us - it abstracts away the operations part of hosting code in the cloud, and lets us get quickly back to code.

Applications for this technique are far and wide: from simple services to augment the endless front-end applications we were building in 2016, to finally having a great way to handle remote events or permission escalation - and look beyond to the Bot Framework.  A little blog post I wrote in 2016 about Serverless site provisioning is now officially best practice in SharePoint's Site Design - I'm a little glad it was useful :-)  At times it feels like I just hack and cobble things together and behold, wow, people do like this.

https://docs.microsoft.com/en-us/sharepoint/dev/declarative-customization/site-design-overview#pnp-provisioning-and-customization-using-microsoft-flow

So, what's next?

 

Serverless Orchestration

Invest in Serverless orchestration.  Azure Functions are not the right place to do our orchestration.  Yes, Durable Functions will help a lot.  But the products we should be looking at are Azure Logic Apps / Microsoft Flow.

As far as I'm concerned - these are the same product; the differences boil down to:

Logic Apps

  • UX, with a JSON editor, is targeted at developers
  • Consumption-based pricing - per action used, perfect for multiple small requests
    • So we end up compressing multiple actions into an unreadable mess to save costs
  • Integration Services (biztalk scenarios)
  • Better for multi-tenant solutions.

Microsoft Flow

  • UX tries really hard to remain power-user friendly and hide JSON complexity
  • Per-Flow-execution pricing, with free buckets per tier
    • So we end up putting way too many steps inside a single Flow to save costs
  • Premium connectors as part of higher tier plans
  • Free licenses as part of Office 365 / Dynamics 365 plans, making this cheaper for single-tenant solutions.

What can you do with Logic Apps/Flow?

  • Leverage connectors (remember, "Embrace third-party services" is a Serverless principle) - these are hundreds of connectors implemented directly by the various product teams themselves.  So they know what they are doing (most of the time)
  • You can do delay and wait easily in Flow
  • You can do loops easily in Flow (in Functions it's tricky without potentially hitting timeout).
  • You can do for-each loops in Flow and easily turn it into parallel execution (fan-out) with fan-in just part of the package
  • You can define repeat/retry policies with gradual fall back in Flow
  • You can define follow next token in Flow HTTP Request for REST paging
  • You can handle fallback behaviour as a scoped set, so if any actions fail you can orchestrate that
  • You can include human workflows with human approvals and send nice templated emails with attachments from Flow
  • A Function shouldn't do more than one thing.  Use Flow to chain them.
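As one concrete example, the retry policy mentioned above is just a JSON fragment on the action in the underlying definition - a minimal sketch of the shape Logic Apps accepts (the specific count and interval here are illustrative):

```json
"retryPolicy": {
    "type": "exponential",
    "count": 4,
    "interval": "PT7S"
}
```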

 

Serverless API end points

As we build out a constellation of functions (I stole this word from https://www.slideshare.net/HeitorLessa1/serverless-best-practices-plus-design-principles-20m-version), we need to clean up all the microservice APIs with a unified API front.  There are two products for this:

Azure Functions Proxy

  • Simpler - can transform query/post messages

Azure API Management Service

  • More extensive - can transform REST to XML
  • Better Open API definitions

 

Serverless Websites / thicker Front-Ends

A serverless website is basically a CDN plus FaaS.  You don't scale an Azure VM or even Azure WebJobs.  Build your entire website with your favourite JavaScript library (I like and recommend Angular - but you should use what your team uses), then bundle with Webpack into a couple of minified JS files for the CDN.

Do your compute in the client.  And do your server compute with Azure Functions.

I'll even add here that a low-code solution such as PowerApps is extremely good at getting a proof of concept up and running quickly.  PowerApps supports offline capabilities and will happily call your Serverless APIs via a Swagger/OpenAPI file and treats them all as first class functions.

Wish

As part of the Azure services upgrade email (what a peculiar way to announce new features), the upgrade to the latest Windows Server means that Azure Functions, as part of Azure App Services, will gain the ability to work with HTTP/2.

It means we can get our entire website in one HTTP GET request, with our Function (or possibly our Logic App/Flow) sending multiple resources in one response.

 

Serverless Database

Let me first define what a Serverless Database is.  Essentially, you have a database in the cloud.

  • You pay for storage.
  • You pay for compute, on consumption-based plans.
  • You pay nothing if it's not doing anything, and it automatically scales as necessary.

The problem we are trying to fix is simple.  We want to start an application, pick a database, and have it scale with us.  We don't want to put SQL Azure on the cheapest free VM and have it run like crap.

This is an area where Azure is somewhat lacking.  My choices are:

  • Azure Storage Table
  • SharePoint Lists (only because I'm an Office 365 person and I've got Office 365 tenants everywhere I look)

Wish

I predict boldly that Azure will bring out a Serverless CosmosDB solution in 2018 and it will be what everyone in the Microsoft ecosystem uses from there onwards.

Otherwise, look towards the competition:

  • Google Cloud Platform has Firebase - event driven, consumption based database, linked to Google Cloud Functions
  • Amazon Web Services has Aurora Serverless - in late 2017, AWS announced they've separated Aurora's cost model into Compute and Storage.

 

Serverless Event Aggregator

The Azure Event Grid is a very interesting service.  I see the possibility of a unified way to manage all events in a system.

This is best explained with a parallel analogy.  In browser applications, we catch and handle events in the DOM all the time.

In the beginning, we do:

$(element).click(func)

This has all sorts of problems - how do you route?  How do you de-allocate?  How do you attach new events as new resources come online?  A few years later, we end up with this:

$(document).on("click", ".filter", func)

We attach events via one top-level resource - ALL our event handlers are attached there.  Then we let events bubble to the root, apply the filter, and call the handler.

The Azure Event Grid has the potential to be this solution.  In 2019, if we are attaching event handling directly to a resource or a container, then we have stuffed up.  We should attach all our events to Event Grid, filter within the Event Grid, and only then invoke the functions that fit the filter.
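In Event Grid terms, that filter lives on the event subscription - a minimal sketch of the shape (the event type and subject prefix here are illustrative):

```json
"filter": {
    "includedEventTypes": [ "Microsoft.Storage.BlobCreated" ],
    "subjectBeginsWith": "/blobServices/default/containers/images/"
}
```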

Wish

I'm hopeful that we can map Microsoft Graph events into the Azure Event Grid - then we'd have something super magical.

 

Serverless Visualization

I want to end on this one because I don't have a great solution, but I think we need a great solution.

Wish

As we build out our constellation of functions and orchestration, there's a need to visualize that design so we can both review the designs and see specifically where the bottlenecks are.

If a set of microservices is buggy, this would be the place to pinpoint that and switch the Functions back to their previous deployment slots.

With Application Insights, we can get detailed logging for Functions and Flow/Logic Apps, so perhaps this is something that needs to be layered on top of the logging.

 


Serverless connect-the-dots: MP3 to WAV via ffmpeg.exe in AzureFunctions, for PowerApps and Flow

There's no good title for this post - there are just too many pieces we are connecting.

So, a problem was in my todo list for a while - I'll try to describe the problem quickly, and get into the solution.

  • PowerApps' Microphone control records MP3 files
  • Cognitive Speech to Text wants to turn WAV files into JSON
  • Even with Flow, we can't convert between the audio file formats
  • We need an Azure Function to glue this one step
  • After a bit of research, it looks like FFMPEG, a popular free utility, can be used to do the conversion

Azure Functions and FFMPEG

So my initial thought was: well, I'll just run this utility exe through PowerShell.  But then I remembered that PowerShell doesn't handle binary input and output that well.  A quick search nets me several implementations in C#; one of them catches my eye:

Jordan Knight is one of our Australian DX Microsofties - so of course I start with his code.

It was actually really quick to get going.  But because Jordan's code is triggered from blob storage - and the Azure Functions blob storage binding has a waiting time I want to shrink - I rewrote the input and output bindings to turn the whole conversion function into an input/output HTTP request.

https://github.com/johnnliu/function-ffmpeg-mp3-to-wav/blob/master/run.csx

#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage.Blob;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Net.Http.Headers;

public static HttpResponseMessage Run(Stream req, TraceWriter log)
{
    // Write the raw HTTP body (the MP3) to a temp file for ffmpeg to read.
    var temp = Path.GetTempFileName() + ".mp3";
    var tempOut = Path.GetTempFileName() + ".wav";
    var tempPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());

    Directory.CreateDirectory(tempPath);

    using (var ms = new MemoryStream())
    {
        req.CopyTo(ms);
        File.WriteAllBytes(temp, ms.ToArray());
    }

    var bs = File.ReadAllBytes(temp);
    log.Info($"Input length: {bs.Length}");

    // ffmpeg.exe is deployed alongside the function - see the trick below.
    var psi = new ProcessStartInfo();
    psi.FileName = @"D:\home\site\wwwroot\mp3-to-wave\ffmpeg.exe";
    psi.Arguments = $"-i \"{temp}\" \"{tempOut}\"";
    psi.RedirectStandardOutput = true;
    psi.RedirectStandardError = true;
    psi.UseShellExecute = false;

    log.Info($"Args: {psi.Arguments}");
    var process = Process.Start(psi);
    // Drain stderr (ffmpeg logs there) so the process can't deadlock on a full buffer.
    process.StandardError.ReadToEnd();
    process.WaitForExit((int)TimeSpan.FromSeconds(60).TotalMilliseconds);

    // Return the converted WAV as the HTTP response body.
    var bytes = File.ReadAllBytes(tempOut);
    log.Info($"Output length: {bytes.Length}");

    var response = new HttpResponseMessage(HttpStatusCode.OK);
    response.Content = new StreamContent(new MemoryStream(bytes));
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("audio/wav");

    File.Delete(tempOut);
    File.Delete(temp);
    Directory.Delete(tempPath, true);

    return response;
}
Trick: you can upload ffmpeg.exe and run it inside an Azure Function

https://github.com/johnnliu/function-ffmpeg-mp3-to-wav/blob/master/function.json

{
  "bindings": [
    {
      "type": "httpTrigger",
      "name": "req",
      "authLevel": "function",
      "methods": [
        "post"
      ],
      "direction": "in"
    },
    {
      "type": "http",
      "name": "$return",
      "direction": "out"
    }
  ],
  "disabled": false
}

Azure Functions Custom Binding

Ling Toh (of Azure Functions) reached out and told me I can try the new Azure Functions custom bindings for Cognitive Services directly.  But I wanted to try this with Flow, so I'll need to come back to custom bindings in the future.

https://twitter.com/ling_toh/status/919891283400724482

Set up Cognitive Services - Speech

In the Azure Portal, create a Cognitive Services - Speech resource.

Copy one of the keys for later.

Flow

Take the binary multipart body sent to this Flow and pass it to the Azure Function:

base64ToBinary(triggerMultipartBody(0)?['$content'])

Take the binary returned from the Function and send it to the Bing Speech API.

Flow returns the result from Speech to Text, which I force into JSON:

json(body('HTTP_Cognitive_Speech'))

Try it:

Swagger

We need this for PowerApps to call Flow.
I despise Swagger so much I don't even want to talk about it (the Swagger file took 4 hours - the most problematic part of the whole exercise).

{
  "swagger": "2.0",
  "info": {
    "description": "speech to text",
    "version": "1.0.0",
    "title": "speech-api"
  },
  "host": "prod-03.australiasoutheast.logic.azure.com",
  "basePath": "/workflows",
  "schemes": [
    "https"
  ],
  "paths": {
    "/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx/triggers/manual/paths/invoke": {
      "post": {
        "summary": "speech to text",
        "description": "speech to text",
        "operationId": "Speech-to-text",
        "consumes": [
          "multipart/form-data"
        ],
        "parameters": [
          {
            "name": "api-version",
            "in": "query",
            "default": "2016-06-01",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sp",
            "in": "query",
            "default": "/triggers/manual/run",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sv",
            "in": "query",
            "default": "1.0",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "sig",
            "in": "query",
            "default": "4h5rHrIm1VyQhwFYtbTDSM_EtcHLyWC2OMLqPkZ31tc",
            "required": true,
            "type": "string",
            "x-ms-visibility": "internal"
          },
          {
            "name": "file",
            "in": "formData",
            "description": "file to upload",
            "required": true,
            "type": "file"
          }
        ],
        "produces": [
          "application/json; charset=utf-8"
        ],
        "responses": {
          "200": {
            "description": "OK",
            "schema": {
              "description": "",
              "type": "object",
              "properties": {
                "RecognitionStatus": {
                  "type": "string"
                },
                "DisplayText": {
                  "type": "string"
                },
                "Offset": {
                  "type": "number"
                },
                "Duration": {
                  "type": "number"
                }
              },
              "required": [
                "RecognitionStatus",
                "DisplayText",
                "Offset",
                "Duration"
              ]            
            }
          }
        }
      }
    }
  }
}

Power Apps

Result

Summary

I expect a few outcomes from this blog post.

  1. ffmpeg.exe is very powerful and can convert multiple audio and video datatypes.  I'm pretty certain we will be using it a lot more for many purposes.
  2. Cognitive Speech API doesn't have a Flow action yet.  I'm sure we will see it soon.
  3. PowerApps or Flow may need a native way of converting audio file formats.  Until such an action is available, we will need to rely on ffmpeg within an Azure Function.
  4. The problem of converting MP3 to WAV was raised by Paul Culmsee - the rest of the blog post just connects the dots and makes sure it all works.  I was also blocked on an error in the output of my original Swagger file, which I fixed only after Paul sent me a working Swagger file he used for another service - thank you!