AzureFunctions Work Fan-out with Azure Queue in PowerShell

So I really like using PnP-PowerShell to chain together complex operations in Office 365, and linking them up with AzureFunctions and Flow.

Scenario - scan all of the tenant's site collections

I bumped into another problem today - I needed to scan all my site collections within the tenancy and start a Flow that will notify, apply the site closure policy, and lock the site.

We can list all the site collections in a tenant with one request - Get-PnPTenantSite - but we want to make sure the job doesn't time out.

So we need to fan-out the workload to a queue, and trigger multiple AzureFunctions to scan each site collection in parallel.

Problem - the PoSH Queue binding only allows 1 output message

As soon as I started writing the PoSH, I remembered: with the default PoSH Queue binding, you can only write 1 message to the Queue.

Unlike C# where you could do multiple:

// outQueue here would be e.g. an IAsyncCollector<string> Queue output binding
foreach(var message in messages) {
    await outQueue.AddAsync(message);
}

In PoSH, if you are using the default Integrate tab to set up an Output Binding to Azure Queue, then you can only write one message to the Queue.
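For context, this is roughly what the default binding gives you.  In the (experimental) PoSH runtime, output bindings are handed to you as file paths, so you write your one message with Out-File.  A minimal sketch, assuming the Output Binding in function.json is named outputQueueItem:

# build the single message
$message = @{ siteUrl = "https://tenant.sharepoint.com/sites/site1" } | ConvertTo-Json

# write it to the output binding's file path - one message, and that's all you get
$message | Out-File -Encoding Ascii -FilePath $outputQueueItem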

 

How to write multiple messages to Queue in PoSH?

It turns out I've already solved this once before in April, but I had completely forgotten this, because I DIDN'T BLOG IT.
Let that be a lesson to all developers - always blog something cool that you did, because you will need it in two months when your memory fails you.

# if you have been using the storage in other functions 
# you will already have the connection string in your 
# function's app settings - reuse it

$storeAuthContext = New-AzureStorageContext -ConnectionString $env:azurefunctions3a585851_STORAGE 

$outQueue = Get-AzureStorageQueue -Name 'my-queue-name' -Context $storeAuthContext
if ($outQueue -eq $null) {
    $outQueue = New-AzureStorageQueue -Name 'my-queue-name' -Context $storeAuthContext
}

# this example isn't scanning sites - just going through files in a library
# ($items would come from something like Get-PnPListItem; $destination is the target folder URL)
$items | % {
    
    $item = @{
        source = $_.FieldValues.FileRef;
        target = ($destination + "/" + $_.FieldValues.FileLeafRef)
    }

    # Create a new message using a constructor of the CloudQueueMessage class.
    $queueMessage = New-Object `
        -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
        -ArgumentList (ConvertTo-Json $item)

    # Add a new message to the queue.
    $outQueue.CloudQueue.AddMessage($queueMessage)
}
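For the actual scenario at the top of this post, the loop is exactly the same - only the items change.  A sketch, assuming Connect-PnPOnline has already been run against the tenant admin site, and reusing $outQueue from above (the message shape is just whatever your worker function expects):

# fan out one queue message per site collection in the tenant
Get-PnPTenantSite | % {

    $item = @{ siteUrl = $_.Url }

    $queueMessage = New-Object `
        -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
        -ArgumentList (ConvertTo-Json $item)

    $outQueue.CloudQueue.AddMessage($queueMessage)
}

Each message then triggers its own Function instance, so one long tenant scan becomes many small, parallel site scans.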

 

 

 

Are you Cloud-Curious or Cloud-Serious? Azure Functions in DWCNZ 2017

I had a fantastic time at the Digital Workplace Conference in NZ.

Highlight Sessions

There were many other great sessions - I just wasn't able to be in multiple places at once!

My Own Session

I presented Azure Functions in Office 365 - Building Serverless Solutions

There were a few things that I didn't manage to get through.  I wanted to list them here, and I hope you will accept my apologies.  I had several conversations with you all over the two days of the conference; many of you wanted deeper details on certain aspects of using Azure Functions.

 

Demo: Timer Based Alert with Email

https://github.com/johnnliu/azure-functions-o365/blob/master/sharepoint-list-email.ps1

This demo outlines a very simple script that will connect to a SharePoint list (or document library), query and fetch list items, format them into HTML, and email them to a user from the System Account.

Combined with a schedule, this is an extremely common scenario in SharePoint Online: you want to schedule a smart alert email once a week, based on a filter to a list.
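The full script is in the link above.  Very roughly, the shape of it is something like this - a simplified sketch rather than the actual demo script (the site URL, list name, field names and recipient here are made up):

# connect to the site - credentials come from the Function App settings
$securePassword = ConvertTo-SecureString $env:SPO_Password -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential -ArgumentList $env:SPO_User, $securePassword
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/team" -Credentials $credentials

# query and fetch the list items we care about
$items = Get-PnPListItem -List "Tasks" | Where-Object { $_.FieldValues.Status -ne "Completed" }

# format them into an HTML table
$html = $items | ForEach-Object {
    [pscustomobject]@{ Title = $_.FieldValues.Title; DueDate = $_.FieldValues.DueDate }
} | ConvertTo-Html -Fragment | Out-String

# email the result to the user, sent from the System Account via SharePoint
Send-PnPMail -To "someone@tenant.com" -Subject "Weekly alert" -Body $html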

 

Using Recurring event in Flow instead of Azure Functions Timer-Trigger

While you can schedule tasks in Azure Functions via a Timer Trigger, Microsoft Flow's Recurrence trigger has several benefits:

  • You can create a Team Flow - so multiple users can be owners and configure the recurrence trigger.
  • The UI for setting up a time for the trigger is more obvious for power users.
  • You can easily see past runs from within Flow
  • You can easily re-run a Flow

 

The Serverless "Spectrum"

From my own experience, and from reading about the greater scope of Serverless solutions being designed in the world, I wanted to present the spectrum of Serverless solutions.  We start on one side with the Cloud-Curious, and move across to the experts - the Cloud-Serious.

 

Cloud Curious

The majority of the presentation was pitched at the cloud-curious.  You have heard of Azure Functions and Serverless.  The demos showed how to get going really quickly.

Functions, in a nutshell:

  1. Micro (web) services for everyone.  So many people I talked to have given up on programming, thinking writing microservices or complex architecture isn't for them.  It's for the young'un dev teams now.
    AzureFunctions, especially with PowerShell, flipped the whole thing upside down.  Now, many 'ex-developers' suddenly find themselves building amazing service endpoints, connecting them to webhooks and Azure Blobs and Queues.  It is an amazing resurgence and move to microservices.  And everyone's having fun playing with really cool new toys.
     
  2. Use your favourite language!
    C#?  JS?  PoSH?  F#?  You can even use TS or VB.NET compiled.  Nobody can tell you what language you can and can't use.
     
  3. Perfect solution for many problems in SharePoint customizations
    Elevate permissions
    Webhook and event receivers
    Timer Jobs
    Extending Flow (as custom workflow action)

    If you are bringing customizations from SharePoint On-Premises to SharePoint Online, Azure Functions is a solution that must be evaluated.  It fits so many of the scenarios where you need to bring your On-Premises customizations forward, without breaking the bank or needing complex re-development.

 

Cloud Serious

For the cloud serious - you are already using simple functions.  You want to know what's next.

  1. "Idempotent" - this is the keyword that will define the entire Serverless Framework.  You want to design functions that has no side-effects if you rerun.  A function can fail, it will automatically retry until success.  Your function must be built to be retry-safe.
     
  2. Use message queues and service bus to scale your Function.
    In Serverless, you are bound by duration.  You are not bound by parallel compute.
    To scale your long-running process, split the work into a Queue and spawn (effectively) infinite parallel compute.
     
  3. In serial code, we wanted to catch all our exceptions so that one failure wouldn't stop a long-running task.  When we convert to parallel compute, we no longer really care about catching exceptions.  If you fail, you want to fail fast.  Throw exceptions freely and as early as possible.  Terminate the function.  Let the Queue retry automatically.
     
  4. With the new Azure Functions Proxies, we can create Serverless Web Applications - which essentially combines a CDN to host static resources with Functions to run server-side code.

    The future of Serverless web apps is basically: CDN + Functions
    Both scale in parallel, effectively infinitely, by default and by design - and the concept is easy to understand and accept.

    You do not worry about scaling VMs, AppPools, IIS, WebJobs, WebSites... 

Your solution sits on top of all of those things - but there is no fear.  A fast messaging queue with built-in retries and a thousand atomic hammers will carry your workload from now to infinity.  And it'll cost less than your coffee.
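To make the "idempotent" and "fail fast" points concrete, here is a minimal sketch of a retry-safe queue worker.  The trigger binding name (triggerInput) and the two helper functions are stand-ins for your own code - this is the shape, not a drop-in implementation:

# the queue trigger binding variable holds a path to a file containing the message body
$job = Get-Content $triggerInput -Raw | ConvertFrom-Json

# idempotency check: if a previous (failed and retried) run already completed this job,
# skip it instead of doing the work twice.  Test-JobAlreadyDone is a stand-in for your
# own check - a marker file, a list item, a status column, whatever fits.
if (Test-JobAlreadyDone $job) {
    Write-Output "Job $($job.id) already processed - nothing to do"
    return
}

# fail fast: no try/catch around the work.  If it throws, the function fails,
# and the Queue retries the message automatically.
Invoke-JobWork $job    # stand-in for the real per-item work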

 

Slide downloads

https://github.com/johnnliu/pptx

 

Taking a picture with PowerApps and sending to SharePoint with just Flow

Less than one day after I wrote about Taking a picture with PowerApps and sending to SharePoint with help of Azure Functions, I was looking at Flow to do another thing with recurring calendar events, and reading about how Logic Apps' Workflow Definition Language can be used in Flow.  Then, as I scrolled down, I saw this: dataUriToBinary

This was the heart of the problem in converting the PowerApps camera image (a Data URI) for a SharePoint file upload (binary) - the very problem I had solved with an Azure Function.

And here it is, again, staring at me: dataUriToBinary()
And I knew I'd have to write this new post.
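For the curious, the conversion in question is tiny: strip the data-URI header, then Base64-decode the rest.  That is roughly what my Azure Function boiled down to, and what dataUriToBinary() now does for free.  A sketch, not my exact function code - $dataUri stands in for whatever the PowerApp posted:

# a data URI from the PowerApps camera looks like: data:image/jpeg;base64,/9j/4AAQ...
$base64 = $dataUri.Substring($dataUri.IndexOf(',') + 1)    # drop the "data:image/jpeg;base64," header
$bytes  = [System.Convert]::FromBase64String($base64)      # $bytes is the binary content for the SharePoint file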

Create the Flow from Template

Using Advanced Formula from Logic Apps Functions in Flow

https://docs.microsoft.com/en-au/azure/logic-apps/logic-apps-workflow-definition-language#functions lists the Logic Apps functions available to Flow.  There are some tricks to make the syntax work - but they all follow the same pattern, so practice makes perfect.  Also, there are a LOT of functions.  So it should be fun.

 

Add Compose Action

Add "@dataUriToBinary(  ...  )" drag in Createfile_FileContent.  It'll look OK at first, but if you try to Update flow, you'll get an error.

The template validation failed: 'The template action 'Compose' at line '1' and column '1947' is not valid: "The template language expression 'dataUriToBinary(@{triggerBody()['Createfile_FileContent']})' is not valid: the string character '@' at position '16' is not expected.".'.

Note 2018: the Flow designer has been changed since 2017, and the way to write this expression has changed.

  • Create a Compose action

  • In the dynamic content panel that pops up on the right, select the expression editor

  • Type in dataUriToBinary(triggerBody()['Createfile_FileContent'])

  • Note: this time without the @ prefix

  • Hit OK to write the expression into the Compose

Note: once you save and come back, it won't show the surrounding quotes anymore, and it isn't updateable.


 

Result

So that's all - Data URI to binary conversion, taking the PowerApps camera picture to a SharePoint file.

 

In a way, I'm glad - even in my previous post I argued that data conversion should be native, and shouldn't require a developer.  So this is kind of my wish come true.

 

Taking a picture with PowerApps and sending to SharePoint with help of Azure Functions

Sometimes, after having written a selfie app in Silverlight, JavaScript, even an Add-in (SharePoint Online), you want to do it again with PowerApps.  This is that article.  I think it's really fun.  And I think it's funny I'm solving the world's problems one AzureFunction at a time.  And I think I need help.


April PnP JavaScript special interest group call and Azure Functions demos

Shortly after the March Azure Functions demo, I reached out and asked Patrick about coming back to do a follow-up focused on JavaScript - specifically PnP-JS-Core - as I had completely skipped it in the March call/demo, which was focused on PnP PowerShell (and C#).  When I first started playing with Azure Functions I was doing everything in JavaScript, so it was nice to come back and do this demo.

(Video uploaded by SharePoint / Office 365 Dev Patterns & Practices on 2017-04-13.)

I was a bit more mindful of the time this round, and this whole demo is on PnP-JS-Core.

We focused on a few things that people asked in the PnP-PowerShell call in March:

  • What about JavaScript - can you show JavaScript in Azure Functions?
  • Isomorphic PnP-JS-Core - running on NodeJS - if you are going to use JavaScript on the client, might as well use the same code on the server.
  • Authentication using Sergei's node-sp-auth (congrats on MVP award!)
  • How to test your Azure Functions locally via azure-functions-cli
  • Live debugging with VSCode (locally)
  • How to pack your JavaScript AzureFunctions so that you don't need to deploy the massive node_modules folder (which is both costly for storage and adds to startup time).  We use azure-functions-pack

SharePoint's Future is full of JavaScript

Lots of quick little demos that make a nice introduction scenario - but if you have not seen Azure Functions before, this is best viewed as a supplementary follow-up to the first PnP call in March.

Related Links

http://johnliu.net/blog/2017/4/march-pnp-special-interest-group-call-and-azure-functions-demos