Building Binary output service with Cognitive Services and Microsoft Flow

We covered how to build binary webservices with Microsoft Flow.  A question then lingered in my mind: if you can push binary data into a Flow, and pass it around within the Flow...  can you output a binary file from Flow?

This question bothered me so much in my sleep that I decided to test it and write this blog post.  And thus, we have probably the simplest example of the series.

  1. First, we will build a service endpoint that can return binary data.
  2. Then we will send the image through Cognitive Services and tag some data as we go.

This is a post in a series on Microsoft Flow.

  1. JSON cheatsheet for Microsoft Flow
  2. Nested-Flow / Reusable-Function cheatsheet for Microsoft Flow
  3. Building non-JSON webservices with Flow 
  4. One Connection to Proxy Them All - Microsoft Flow with Azure Functions Proxies
  5. Building Binary output service with Cognitive Services and Microsoft Flow

Build a Flow to output non-text content

The method needs to be set to GET.  Take an image that sits behind authentication in SharePoint, and set that as the response output.

Test this with Postman

A few things to note:

  1. The request is a GET request.
  2. It replies with image/png (the content type was worked out automatically)
  3. ... and that's it, there's not a lot to say
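
If you'd rather test from PowerShell than Postman, a minimal sketch - $flowUrl here is a placeholder for the HTTP GET URL copied from the Flow's Request trigger, and the output path is just an example:

# quick check of the binary endpoint from PowerShell
# $flowUrl is the Flow's HTTP GET URL (placeholder)
Invoke-WebRequest -Uri $flowUrl -Method Get -OutFile "C:\temp\flow-output.png"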

Add Cognitive Services - Computer Vision

You'll need to create a Cognitive Services resource in your Azure subscription.  The free tier offers 5,000 images per month, at 20 per minute.

We take the output of the Tag Image action and add it to a tags header in the service response.
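
For context, the Tag Image action roughly corresponds to a call to the Computer Vision tag endpoint.  A rough PowerShell sketch of that call - the region, API version, key and file path here are all placeholders, not from the original Flow:

# rough sketch of the kind of call the Tag Image action makes;
# region, API version, key and file path are placeholders
$key = "<your-computer-vision-key>"
$endpoint = "https://westus.api.cognitive.microsoft.com/vision/v1.0/tag"

$result = Invoke-RestMethod -Uri $endpoint -Method Post `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
    -ContentType "application/octet-stream" `
    -InFile "C:\temp\flow-output.png"

# the response contains a tags array of name/confidence pairs
$result.tags | % { $_.name }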

And here we have the same image, but now with tags in the output.

Smart dogs.

 

Why do we need this?

  1. This means we can post an image in, and get an image out.
  2. Maybe you need to proxy a resource within SharePoint that sits behind authentication, but you want to use it directly as a file.  A SharePoint sharing link would take you to a page instead.
  3. With a direct link to the file, you can use it as an anchor within HTML, or use it to upload the file to an external system (via URL).
  4. Maybe it isn't an image, but a generated ZIP file that you want to copy somewhere else.  Or a docx file.
  5. Or perhaps you want to send a picture to a Flow, then resize it or run it through Cognitive Services before getting back the result.
  6. Maybe you are just mad and want to auto-tag every image in your SharePoint?
    That actually sounds amazing.

Because Microsoft Flow lets us push binary through actions, I think there are a bunch of interesting scenarios for this.

Also, I think assistant branch manager and branch manager are awesome.


Speaking at Digital Workplace Conference Australia 2017

I'll be speaking at the Digital Workplace Conference Australia!  23-24 August in Sydney.

 

This is a conference that's near and dear to me - I've had several opportunities in the past to present at it, covering Silverlight, JavaScript, TypeScript and modern Office Add-ins.  And this year, I plan to present a supercharged talk on running Serverless with Office 365.

Parts of the talk - especially how to get started - may seem familiar to many of you who have started down this journey. 

I wanted to focus a bit less on the technical, and more on how this has changed people. 

Azure Functions democratized 'I need to run a bit of code' for everyone.  Suddenly, the cloud is not this scary place where there are a hundred things we don't know and we don't know where to start.  Suddenly, the toys that seemed far out of reach are ours.  Suddenly, a cloud subscription that costs less than a coffee per month is something I don't even think about.

To me, that is the power of AzureFunctions and why Serverless is a game changer. 

Did you know there are now brand new categories of design patterns written specifically for the Serverless world?

I will of course still cover the technical bits - but to see all 20+ demos I have with me, you'll have to come find me in the speaker area for a personal demo :-)

At the Digital Workplace Conference 2017, I want to talk about Serverless.

And I want to talk about humans.  Us.

I think the future will be amazing.  I hope to see you at the DWC Australia.  Come and grab me and say hello!

Reusing functions in PowerShell AzureFunctions

This is a pretty simple blog post.  Take one of the examples I've been using often: 

https://github.com/johnnliu/azure-functions-o365/blob/master/sharepoint-list-email.ps1

This PowerShell uses PnP-PowerShell to:

  • Connect to SharePoint Online
  • Pull a list from a document library
  • Format it as an HTML table
  • Send it in an email via the SPO SendEmail utility endpoint

One of the most common, repeated steps I do in almost every Function is to get a credential and authenticate.  So I decided to put it into a shared function.

Refactor this into a separate function

Use the file navigator in the right-hand pane to add a new file, and call it "shared.psm1" - this is a PowerShell module file.

Create a get-cred() function, move the four lines that read $username and build the PSCredential into it, and return $creds.
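
A minimal sketch of what shared.psm1 might end up looking like - the app setting names ($env:user and $env:password) are placeholders, not taken from the original script:

# shared.psm1 - sketch only; the app setting names are placeholders
function get-cred() {
    $username = $env:user
    $password = ConvertTo-SecureString $env:password -AsPlainText -Force
    $creds = New-Object System.Management.Automation.PSCredential ($username, $password)
    return $creds
}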

Finally, reference the module via:

Import-Module "D:\home\site\wwwroot\get-list-and-email\shared.psm1"

# call get-cred inline here
Connect-PnPOnline -url $siteUrl -Credentials (get-cred)

The import path includes the function's name - it points at the current function's own directory under wwwroot.
From now on, in every other function, you can just import that module to share the get-cred function.

If you prefer, a separate shared directory at the wwwroot\shared level is also a good place to put these shared modules.  But note that you can't access that area via the Files right-hand pane - you'll need to go there via the Kudu interface.

 

I consider moving get-cred out as a preparation step for the day I replace that function with a call to Azure Key Vault to obtain the PSCredential object. When that refactoring happens, I will only need to update one place.

 

AzureFunctions Work Fan-out with Azure Queue in PowerShell

So I really like using PnP-PowerShell to chain together complex operations in Office 365, and linking them up with AzureFunctions and Flow.

Scenario - scan all the tenant's site collections

I bumped into another problem today - I needed to scan all the site collections within the tenancy and start a Flow that notifies, applies a site closure policy and locks each site.

We can list all the site collections in a tenant with one request, Get-PnPTenantSite - but we want to make sure the job doesn't time out.

So we need to fan out the workload to a queue, and trigger multiple AzureFunctions to scan each site collection in parallel.

Problem - the PoSH Queue binding allows only 1 output

As soon as I started writing the PoSH, I remembered: with the default PoSH Queue binding, you can only write 1 message to the queue.

Unlike C#, where you can write multiple messages:

// outQueue here is a queue output binding, e.g. an IAsyncCollector<string>
foreach(var message in messages) {
    await outQueue.AddAsync(message);
}

In PoSH, if you use the default Integrate tab to set up an output binding to Azure Queue, you can only write one message to the queue.
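
With the default binding, writing that one message looks something like this sketch - in the v1 PowerShell functions, bindings are passed as file paths, and $outputQueueItem here stands for whatever binding name you configured in the Integrate tab:

# default queue output binding - a single message only
# $outputQueueItem is the output binding name from the Integrate tab (assumption)
Out-File -Encoding Ascii -FilePath $outputQueueItem -inputObject (ConvertTo-Json $item)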

 

How to write multiple messages to Queue in PoSH?

It turns out I had already solved this once before, in April, but I had completely forgotten, because I DIDN'T BLOG IT.
Let that be a lesson to all developers - always blog something cool that you did, because you will need it in two months when your memory fails you.

# if you have been using the storage in other functions 
# you will already have the connection string in your 
# function's app settings - reuse it

$storeAuthContext = New-AzureStorageContext -ConnectionString $env:azurefunctions3a585851_STORAGE 

$outQueue = Get-AzureStorageQueue -Name 'my-queue-name' -Context $storeAuthContext
if ($null -eq $outQueue) {
    $outQueue = New-AzureStorageQueue -Name 'my-queue-name' -Context $storeAuthContext
}

# this example isn't scanning sites - just going through files in a library
$items | % {
    
    $item = @{
        source = $_.FieldValues.FileRef;
        target = ($destination + "/" + $_.FieldValues.FileLeafRef)
    }

    # Create a new message using a constructor of the CloudQueueMessage class.
    $queueMessage = New-Object `
        -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
        -ArgumentList (ConvertTo-Json $item)

    # Add a new message to the queue.
    $outQueue.CloudQueue.AddMessage($queueMessage)
}
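
For the actual site collection scenario, the fan-out would look something like this sketch - it assumes $storeAuthContext and $outQueue are set up as above, reuses the get-cred helper from earlier, and $adminUrl is a placeholder for your tenant admin site URL:

# enumerate every site collection and drop one queue message per site,
# so a separate queue-triggered Function can process each one in parallel
Connect-PnPOnline -Url $adminUrl -Credentials (get-cred)

Get-PnPTenantSite | % {

    $item = @{ siteUrl = $_.Url }

    $queueMessage = New-Object `
        -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
        -ArgumentList (ConvertTo-Json $item)

    $outQueue.CloudQueue.AddMessage($queueMessage)
}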


Taking a picture with PowerApps and sending to SharePoint with just Flow

Less than one day after I wrote about Taking a picture with PowerApps and sending to SharePoint with the help of Azure Functions, I was looking at Flow to do another thing with recurring calendar events, and reading about how Logic Apps' Workflow Definition Language can be used in Flow.  Then as I scrolled down, I saw this: dataUriToBinary

This was the heart of the problem in converting the PowerApps camera image (a data URI) for a SharePoint file upload (binary) - the part I had solved with an Azure Function.
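
For context, the camera control hands the image over as a data URI, which looks roughly like this (the content type and base64 payload here are illustrative only, and truncated); dataUriToBinary() strips the prefix and decodes the rest back into binary file content:

data:image/jpeg;base64,/9j/4AAQSkZJRg...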

And here it is, again, staring at me: dataUriToBinary()
And I knew I'd have to write this new post.  

Create the Flow from Template

Using Advanced Formulas from Logic Apps Functions in Flow

https://docs.microsoft.com/en-au/azure/logic-apps/logic-apps-workflow-definition-language#functions lists the Logic Apps functions available to Flow.  There are some tricks to make the syntax work - but they are all the same, so practice makes perfect.  Also, there are a LOT of functions.  So it should be fun.

 

Add Compose Action

Add "@dataUriToBinary(  ...  )" drag in Createfile_FileContent.  It'll look OK at first, but if you try to Update flow, you'll get an error.

The template validation failed: 'The template action 'Compose' at line '1' and column '1947' is not valid: "The template language expression 'dataUriToBinary(@{triggerBody()['Createfile_FileContent']})' is not valid: the string character '@' at position '16' is not expected.".'.

Note 2018: the Flow designer has been changed since 2017, and the way to write this expression has changed.

  • Create a Compose action

  • In the dynamic content panel that pops up on the right, select the expression editor

  • Type in dataUriToBinary(triggerBody()['Createfile_FileContent'])

  • Note: without the prefix @

  • Hit OK to write the expression into the Compose action

Note: once you save and come back, it won't show the " quotes anymore, and the expression isn't updateable in place.


 

Result

So that's all - data URI to binary conversion, for the PowerApps camera image to go to a SharePoint file.

 

In a way, I'm glad - even in my previous post I argued that data conversion should be native and shouldn't require a developer.  So this is kind of my wish come true.