Two ways to convert SharePoint files to PDF via Flow

This blog post is divided into three parts: The Easy, The Auth, and The Complete.

Microsoft Flow released a new power to Convert Files to PDF.  This made my October.  So of course we have to play with this.

Part 1. The Easy

Now this works well, but it raises a few questions: 

  1. Why do I have to copy to OneDrive for Business?
    Because the Convert File action is also available for OneDrive (consumer), but not for SharePoint.
     
  2. Can I do this without copying to OneDrive for Business?
    Not with the default actions, for now.  There's no Convert File action in the SharePoint connector, and the SharePoint connector's Get File Content action doesn't accept a format parameter.
convert-file-actions.png

And this is the simplest solution.

Warning: Next be dragons (Auth and API)

We are going to dive in to see what API this uses, and whether we can call the same API on a SharePoint library document directly, without copying the file to OneDrive first.

This next part is good for you.  But it is heavy and will look complicated.  Brace yourselves.

...So what API does this use?

https://docs.microsoft.com/en-us/onedrive/developer/rest-api/api/driveitem_get_content_format

GET /drive/items/{item-id}/content?format={format}
GET /drive/root:/{path and filename}:/content?format={format}

Specifically, this uses the Microsoft Graph.
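For example, requesting a PDF rendition of a document by path (a hypothetical file name, with format=pdf as the query parameter):

GET /drive/root:/Documents/Quote1.docx:/content?format=pdf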

Part 2. The Auth

Disclaimer: OAuth looks familiar, but the steps are always tricky and easy to mess up.  So if you are following along, walk carefully.

For the next part, we need to connect to Microsoft Graph with app-only permissions

In the Azure Portal, under Azure AD, create an App Registration (I'm reusing a powershell-group-app I had previously baked)

client-id.png

We will be accessing files, so make sure the Application Permission for reading files is granted.  This requires admin consent.

client-perms.png

Via the Azure AD portal - hit Grant Permissions to perform admin consent directly.

client-grant.png

Now we are going to write the Flow with HTTP request actions

First, hit the token endpoint for our tenant with a POST message.  The body must use grant_type=client_credentials, with the client_id and client_secret, and resource set to https://graph.microsoft.com
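Outside Flow, the equivalent token request is a short PowerShell sketch (the tenant, client id, and secret below are placeholders):

# POST to the Azure AD v1 token endpoint using the client credentials grant
$tenantId = "yourtenant.onmicrosoft.com"
$body = @{
    grant_type    = "client_credentials"
    client_id     = "<client-id>"
    client_secret = "<client-secret>"
    resource      = "https://graph.microsoft.com"
}
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body
# $token.access_token is what the Graph calls below will use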

If successful, this request gives us back JSON.  Parse JSON with this schema:

{
    "type": "object",
    "properties": {
        "token_type": {
            "type": "string"
        },
        "expires_in": {
            "type": "string"
        },
        "ext_expires_in": {
            "type": "string"
        },
        "expires_on": {
            "type": "string"
        },
        "not_before": {
            "type": "string"
        },
        "resource": {
            "type": "string"
        },
        "access_token": {
            "type": "string"
        }
    }
}

This gives Flow an access_token variable for the remaining steps to use when calling Microsoft Graph.

Test this by calling the Microsoft Graph endpoint for the SharePoint site

token-test.png

This HTTP request with the Bearer access_token successfully returns SharePoint site data from Microsoft Graph.
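The same test as a PowerShell sketch (/sites/root returns the tenant's root SharePoint site):

# confirm the token works by asking Graph for the root SharePoint site
$headers = @{ Authorization = "Bearer $($token.access_token)" }
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/sites/root" -Headers $headers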

 

Part 3.  The Complete Solution to fetch a SharePoint document as PDF

Call /content?format=PDF

get-content-format-redirect.png

A few things are going on in this result:

  1. Flow thinks this request has failed - because it doesn't return a 2xx status.  It returns a 302 redirect.
  2. The response headers contain the Location of the redirect, which is where the PDF file is.

Parse JSON again, this time on the response headers.  

{
    "type": "object",
    "properties": {
        "Transfer-Encoding": {
            "type": "string"
        },
        "request-id": {
            "type": "string"
        },
        "client-request-id": {
            "type": "string"
        },
        "x-ms-ags-diagnostic": {
            "type": "string"
        },
        "Duration": {
            "type": "string"
        },
        "Cache-Control": {
            "type": "string"
        },
        "Date": {
            "type": "string"
        },
        "Location": {
            "type": "string"
        },
        "Content-Type": {
            "type": "string"
        },
        "Content-Length": {
            "type": "string"
        }
    }
}

We just want Location.  We also need to configure Continue on previous HTTP error.

redirect-continue.png

And finally, retrieve the file via GET again

fetch-return.png
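For reference, here is the whole Part 3 sequence as a PowerShell sketch ($siteId and $itemId are placeholders you would resolve via Graph first):

# ask for the PDF rendition - Graph answers with a 302 rather than the file itself
$headers = @{ Authorization = "Bearer $($token.access_token)" }
$url = "https://graph.microsoft.com/v1.0/sites/$siteId/drive/items/$itemId/content?format=pdf"
try {
    Invoke-WebRequest -Uri $url -Headers $headers -MaximumRedirection 0 -ErrorAction Stop
} catch {
    # the redirect target is a pre-authenticated, short-lived download URL
    $location = $_.Exception.Response.Headers["Location"]
}

# the Location URL needs no Authorization header
Invoke-WebRequest -Uri $location -OutFile "converted.pdf"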

 

When run, the Flow looks like this:

run.png

 

Summary

The complete solution uses HTTP to call MS Graph directly and pulls back the PDF file after a 302 Response.  This is a fairly complex example so please evaluate whether you want the Correct Way or the Easy Way.

Note also that Microsoft Flow has a Premium connector for Azure AD Requests - which will negate the middle part of this blog post re: Auth and let you dive right into MS Graph REST endpoints without worrying about access_tokens.  

Call this Flow request and it downloads the PDF file, converted from a DOCX document in a SharePoint team site.

 

Review Special Techniques Invoked:

  • MS Graph Auth
  • The Continue on Error configuration
  • Parse JSON on Response Header

 

Where is SharePoint Customization going in 2017

I'm actually pretty terrible at gambling on the future of technology.  But I like writing these posts because it makes me sit down and think.

User Experience: SPFx is king

 

The user experience story of 2016 was dominated by the news of the SharePoint Framework (SPFx), which shifts development and the toolchain to a more modern JavaScript stack: NodeJS, Yeoman, webpack.  There are component libraries like Office UI Fabric, but we explored options such as Kendo UI and UI Bootstrap, and there are many benefits there too.  In 2017, SPFx should come down to On-Premises as well; we'll have to wait and see what that looks like.

Sure - there are still plenty of gaps SPFx doesn't cover; many of us are looking for a solution that allows site-level script injection, or something that will let us do a Single Page Application override.  But I'm very bullish on SPFx.  I think 2017 will rock.

https://github.com/SharePoint/sp-dev-docs

 

Frameworks: Angular or React

React continues to better suit component development, and Angular might catch up - we'll see.  At SPG, we are divided into both camps, and for people who aren't familiar with Angular, I am not opposed to them learning React.

I am, however, dead serious that no one should try to learn both Angular and React at the same time.  Pick one and master it; then the concepts of the other framework will map onto it in your mind and come naturally.  Learning both at the same time without mastering either will wreck your learning path.  Don't risk it.

Have an ASP.NET MVC background?  Angular will make more sense.  Want a more code/component-based approach?  Then React will make more sense.  Pick one and run with it.

I have picked up Angular now and am quite happy with it.  Feel free to reach out with Angular+SharePoint questions.  I can't help with React questions.

 

SP Helper: PnP-JS-Core

I have high hopes that PnP-JS-Core will continue to gain popularity and wrap the SharePoint REST services well.  I can totally see a few future blog posts using NodeJS and PnP-JS-Core inside an Azure Function to manage SharePoint Online.

As a bonus, it works really well with On-Premises too.  Congrats on hitting 2.0.0, Patrick!

 

Build tool: Webpack

Pick up webpack - do it now.  gulp and grunt are no longer the right tools to manage the build process; use them to automate the tasks around the build.

The rise of command-line interface (CLI) tools will be the theme of 2017.  We have angular-cli and create-react-app, and the SharePoint Framework has its own Yeoman template.

A CLI is just the command line's version of a "new file" wizard.

 

Dashboards: Power BI

We did a bit of work with embedding Power BI dashboards into SharePoint, and with the rapid pace of releases - the upcoming Power BI SPFx, pushing data to Power BI with Flow, and Power BI streaming datasets - it will increasingly be a no-brainer to use Power BI to build your reporting dashboards and embed them into your sites.  With a local gateway, Power BI works for on-premises data too.

 

Forms: ???

I would like to say Power Apps is the thing.  But it's not - not yet.  The reason I say this is that the business wants forms.  They want forms to replace paper processes.  They want forms to replace older, aging digital forms (InfoPath).

Power Apps isn't trying to be a form.  It is trying to be a mobile app builder.  It might get there one day, but I'm not sure that's the course they have set.

I was thinking Angular-Forms for heavy customizations and Power Apps for simple ones.

I'm open to suggestions. 

 

Automation: Flow / Logic Apps + Azure Functions

Several products hit the scene - Flow and Logic Apps are fairly basic on their own.  But once you pair them up with an atomic hammer (Azure Functions), everything will look like a nail and everything becomes easy.

The way I see it, Flow is better for single events and Logic Apps is better for sets of data.

And don't get me started on the dirt cheap price point too. 

 

Server Code as a Service: Azure Functions

You've probably seen my recent posts where I rant on and on about Azure Functions.  I truly see it as the ultimate power to do anything we need a server to do.

  • Any elevated-permissions task - in Workflow, Flow, Logic Apps, or a JavaScript front-end application
  • Any long-running task
  • Any scheduled task
  • Any event-response task - especially combined with the newly released SharePoint webhooks
  • Any task that requires you to build a set of microservices for your power users to build complex Flows or Power Apps
  • Choose any language you like: C#, NodeJS, or PowerShell - in fact, mix them across your microservices.  Nobody cares how each one runs, and it's great.

 

Auth Helper: ADAL, MSAL

We can't escape this.  As soon as we begin to connect to various APIs we'll need authentication bearer tokens.

We seem to get better and better helper libraries as time goes on.  Regardless, in 2017 we will need to be able to authenticate to the Microsoft Graph in whatever language we use.

Julie Turner has an excellent series:

 

Summary

And these are my picks for 2017.  Do let me know what you think - I'm really interested in points where you disagree with my choices, because I want to learn.

 

 

 

 

Build your PnP Site Provisioning with PowerShell in Azure Functions and run it from Flow

What if I tell you - you can build your own Azure Function and use PnP to provision SharePoint Online sites without firing up Visual Studio?

What if I tell you - we will follow the best practices from PnP, and you can do this all within 10 minutes (OK, I timed it - if you don't take any breaks it'll be 10 minutes), and we'll do it with PnP PowerShell.

You will think - this is magic.  You will probably say, this is all crazy.

What if I tell you - you can connect this to the newly released Microsoft Flow.  So you don't even need to learn WebHooks (which is awesome and you should learn it).

What if I tell you - the whole thing will cost you a little bit.  It's not totally free.  It will cost you maybe two cents ($0.02).

You will say, get out.  I need to lie down.

Learning Software pieces like LEGO blocks

I see software pieces as I see LEGO blocks.  PnP Provisioning is a block (a .NET library written in C#).  PowerShell cmdlets are a block.  And Azure Functions is a block.  Flow is a block.

What would happen if you connect them together?

 

Setting up PowerShell from PnP

First, this is how we set up on our local machine:

Install-Module SharePointPnPPowerShellOnline

Why PowerShell? 

Why PowerShell, you ask - John, you are a developer.  You were a .NET developer before you became a JavaScript developer.  Why PowerShell all of a sudden?

The answer is simple.  It is because everybody understands this:

# connect to the SPWeb context
Connect-SPOnline -url "https://sharepointgurus365.sharepoint.com/sites/projects/template"
# export (copy) a template
Get-SPOProvisioningTemplate -force -out project-template.xml

# connect to new blank SPWeb
Connect-SPOnline -url "https://sharepointgurus365.sharepoint.com/sites/projects/tea"
# paste the template
Apply-SPOProvisioningTemplate -path project-template.xml

Four lines - you have just copied a template from /projects/template and stamped it over to /projects/tea.  This is the power and the hard work of the PnP provisioning team.  They make this magic look simple.

PnP was updated about a week after this blog post, and several PnP cmdlets have been renamed to avoid confusion with the SharePoint Online admin cmdlets.  See the PnP-PowerShell documentation.

Connect-SPOnline is now Connect-PnPOnline
Get-SPOProvisioningTemplate is now Get-PnPProvisioningTemplate
etc

A good way to verify the PowerShell is to run it on your local machine and make sure it works first, before uploading the same DLL modules and script to the Azure Function.

Special thanks to @pskelly

 

But you can do this in C#

Of course, you can also do this in C#.

The C# version shouldn't be too scary to a developer.  But everyone sees the bubbles and the flow chart and just stops trying.  It "looks" hard.

Additionally, in CSOM everything is asynchronous.  C# makes this simpler with the await syntax, while in the PowerShell cmdlets every request is essentially serialized - so they are easy to understand, even if they aren't the fastest way. 

If you want to see what this looks like in NodeJS - I have several blogs.  It's great to learn.  If you don't like JavaScript to begin with - then you'll probably run to PowerShell with open arms.

OK enough chatter - let's go back to the demo.

 

Setting up Azure Functions

Create a new HTTP Request function - select PowerShell as the language.

It says Experimental - what they really should say is "The World's Not Ready For This".

A PowerShell Azure Function looks like this.  The request is passed in as raw JSON content, which is parsed into an object.

Output from the function is written to the $res file.
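A minimal sketch of that convention (the name property here is just an example; the full provisioning script below uses the same pattern):

# parse the incoming request body from the $req file into an object
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# write the response body out via the $res file
Out-File -Encoding Ascii -FilePath $res -inputObject "Hello $($requestBody.name)"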

 

Install-Module

Install-Module doesn't work inside Azure Functions.  But we can still have modules.

http://stackoverflow.com/questions/37724769/how-to-install-a-powershell-module-in-an-azure-function/39985646#39985646 - the steps are explained by @tohling from the Azure Functions team.

Essentially, we can take a PowerShell module, find all its component scripts and DLLs, and copy them to a modules/ subfolder within the Azure Function.  It will be loaded when the function runs.

In Functions - go to Kudu

Navigate to your new function's directory, and add a modules subfolder

On your PC:

Navigate to

C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline\2.8.1610.1\

(if you don't have this folder, you missed Install-Module SharePointPnPPowerShellOnline from earlier)

You will also need Microsoft.IdentityModel

C:\Windows\assembly\GAC_MSIL\Microsoft.IdentityModel
C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.IdentityModel.Extensions

Grab all these DLLs and PowerShell scripts and throw them all inside the modules folder, as sketched below.
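A sketch of gathering those files into one staging folder before dragging them into Kudu (the module version folder name comes from the install above and will differ over time):

# stage every DLL and script in one local folder, ready to drag into Kudu's modules folder
$staging = "C:\temp\modules"
New-Item -ItemType Directory -Path $staging -Force | Out-Null

Get-ChildItem "C:\Program Files\WindowsPowerShell\Modules\SharePointPnPPowerShellOnline\2.8.1610.1" -Recurse -Include *.dll, *.psd1, *.psm1 | Copy-Item -Destination $staging
Get-ChildItem "C:\Windows\assembly\GAC_MSIL\Microsoft.IdentityModel" -Recurse -Include *.dll | Copy-Item -Destination $staging
Get-ChildItem "C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.IdentityModel.Extensions" -Recurse -Include *.dll | Copy-Item -Destination $staging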

 

Change Azure Function to x64

One more thing.  The PowerShell cmdlets prefer the x64 architecture, but Azure Functions provides 32-bit by default.  Switch this in Function settings > Application settings > Platform

 

Setup Done!

Now we can write fun functions!  Go back to our site provisioning function.

$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# {
#     "source": "https://sharepointgurus365.sharepoint.com/sites/demo02/projects/pst",
#     "destination": "https://sharepointgurus365.sharepoint.com/sites/demo02/projects/tea"
# }

$source = $requestBody.source
$destination = $requestBody.destination

# build a credential from the username/password stored in Function settings
$secpasswd = ConvertTo-SecureString $env:SPO_P -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($env:SPO_U, $secpasswd)

# connect to the source web and export (copy) its template
Connect-SPOnline -url $source -Credentials $mycreds
Get-SPOProvisioningTemplate -force -out "D:\home\site\wwwroot\project-template.xml"

# connect to the destination web and apply (paste) the template
Connect-SPOnline -url $destination -Credentials $mycreds
Apply-SPOProvisioningTemplate -path "D:\home\site\wwwroot\project-template.xml"

# write the response body to the $res file
Out-File -Encoding Ascii -FilePath $res -inputObject "Done $destination"


The script will run unattended, so you will need to store your username/password in Function settings and read them from $env variables.

Run success!

What do we have here?!  PowerShell cmdlets that wrap the PnP C# library and talk to SharePoint Online - all running inside an Azure Function.

You have a service.

If you are counting lines - there are 10 lines of PowerShell, including the function wrapper.

 

Flow

Let's top off this demo with Microsoft Flow - it was GA'ed just this week.

Create a list "Copy-Site" with two fields, "ToSite" and "FromSite".

Have the Flow trigger off ItemAdded, and call our Azure Function via the HTTP request action.

Pass in source and destination; the HTTP body is sketched below.
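The HTTP action body just maps the two list fields onto the JSON the function expects - something like this sketch:

{
    "source": "<FromSite value from the triggering item>",
    "destination": "<ToSite value from the triggering item>"
}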

 

Test Flow... workflow

Flow tells you what was sent to the HTTP Request, and what Response body it got back.  Use this for debugging your functions.

 

We always talk costs, because Azure Functions is cheap

The 11 cents total is over several days, and the breakdown shows it's almost entirely bandwidth costs - which don't have a free bucket.  My Azure Functions don't exceed the free tier allowance, so I'm not paying for any compute time.

 

Where do we go from here?

What can you do with PowerShell?  Desired State Configuration?  Download lists to CSV?  Synchronize and fix metadata?  Locate orphaned or misplaced documents and fix them?  Improve and merge user profile properties?  Provision users and sites, and whatever else you fancy?

Does this give you wild ideas?

 

Azure Functions is Serverless

When we say Azure Functions is serverless, it always sounds like we're saying we don't need servers (and thus don't need IT Pros).  That isn't strictly true - the servers are maintained by professionals, so developers can focus on writing code.

But after working on these PowerShell functions for the last two weeks - actually, I'm beginning to wonder whether there will be a day we don't need developers.

An analyst says to a bot: I want System A to connect to System B and put the result in System C.  The bot arranges the three connections, suggests how the parameters will be mapped, and offers to perform a trial run.  The analyst checks it and approves.  The code goes into production.

Think about this.  Everyone needs to evolve.  IT Pros - you have a huge opportunity here to use your PowerShell skills to provide huge business value.  Developers - you should use the best tools to enable your users and do more; this is really easy to get into.

 

Tell me what you think

I really want to know what you think - if you can spend a bit of time and leave a comment below.  Where are we going as developers, where are we going as IT Pros?  What kind of PowerShell-AzureFunction are you going to build?  Does PnP need a repository for PowerShell recipes?

It's so nice to see the pieces fit together well.

Presenting "PhantomJS" at the SPBizConf free online conference on June 18.

I will be presenting "Introducing PhantomJS: Headless Browser for your SharePoint/Online" at SPBizConf on 18 June.

http://www.spbizconf.com/events/headless-in-sharepoint-phantomjs-will-blow-your-mind/

Despite the -JS in the name, PhantomJS is not another JavaScript library.  Instead, it is a headless browser that lets you automate many tasks involving a browser, and there are some very nifty tricks you can use it for - such as building site directories or creating PDF snapshots of documents and forms.

Timing

Technically, because of timezone differences, it will actually be 7 AM on the 19th of June for me.  The session is pre-recorded, and I will basically be watching it along with the attendees and replying to questions in the chat room.

More Online.  More content.

Online conferences such as SPBizConf and Unity Online are a new thing for me, and they have prompted me to start looking into setting up a recording room at home, so that more, higher-quality recorded content from my local presentations can be made available.  I'm very excited to be able to participate in global conferences like this without the expense of crazy travel - travelling from Australia to "anywhere" tends to get expensive very quickly.

Will it blow my mind?

The curious will notice the old name of the session.  I've since calmed down and renamed the title, but the old URL has to stick for link reasons.  Do let me know if this session gives you great ideas about what sort of shenanigans you'll get up to afterwards.

Nintex Work Inspired Breakfast Seminar - Sydney

I'm now a Nintex vTE (Virtual Technical Evangelist), continuing my company SharePoint Gurus' strong partnership with Nintex.

I had the pleasure of presenting Nintex Tips at the Work Inspired Breakfast Seminars - Sydney event on May 15.

http://www.nintex.com/company/events-webinars/2015/5-15-2015-work-inspired-breakfast-seminars-sydney

As I wasn't sure of the audience, I decided to cover the simple Nintex workflow scenarios first, then dive deeper into the complex, developer-minded (heavy API) material at the end.

  • Site Workflows
  • Scheduled Workflow
  • Iterating through Query SQL
  • Recursive CAML Query
  • Regex Options in the Regular Expression Action
  • XSLT to clean up complex XML namespaces
  • And finally, pre-caching JSON queries in JavaScript-based applications (great for JavaScript charts that need lots of data)

It was quite interesting that the Nintex Breakfast is targeted at existing Nintex customers, really just showcasing what they can do with the tools they already have.  So there was no hard sell - just "you can already do this".  There was a bit of discussion from the tables about various techniques and tricks.

Timing-wise, some (the developers) were able to stay to the end for the last few demos, while many had to leave after 9:30am.  Still, it was a great experience and I look forward to more of these events in the future.

I've previously worked with Dan Stoll, and he always puts on a great show.