Flow Studio Price Update 2025 & Introducing Flow Studio for Teams

Since July 2019, Flow Studio has kept the same pricing: $20 per month per person, or $200 per year. Over the years, we’ve introduced numerous enhancements, including:

  • Flow Snapshots – Easily track and restore previous versions of your flows.

  • Export to Zip – Package and share flows efficiently.

  • Export Runs to Excel – Take your flow run history into spreadsheets for analysis.

  • API Enhancements – Faster, more reliable flow management.

At the same time, Azure costs have steadily increased, impacting our infrastructure expenses. To continue delivering a high-quality experience, we are updating our pricing.

Flow Studio Pro

Plan      Current Price              New Price                  Effective Date
Monthly   $20 per month per person   $25 per month per person   End of March 2025
Yearly    $200 per year              $250 per year              End of April 2025

🔹 Existing customers will continue at their current pricing—this update applies only to new sign-ups.

Introducing Flow Studio for Teams

We’re excited to introduce Flow Studio for Teams, designed for businesses, projects, and teams that need better visibility and management of their flows. With this new offering, teams can:

  • Set up automatic monitoring of critical flows

  • Receive alerts when flows fail or behave unexpectedly

  • Store flow run details beyond 30 days

✅ We host, or bring your own (BYO) storage

Flow Studio for Teams Pricing

Plan      Price                                Effective Date
Monthly   $100 per month (includes 3 seats)    Now
Yearly    $1,000 per year (includes 3 seats)   Now

We appreciate your support and look forward to continuing to improve Flow Studio. If you have any questions or feedback, feel free to reach out!

A Mathematically Elegant Way to Flatten an Array of Arrays in Power Automate

When working with data in Power Automate, you may encounter nested arrays (arrays within arrays)—especially when dealing with JSON responses, SharePoint lists, or API results. However, many Power Automate actions require a flat array to work properly.

In this post, I'll show you a mathematically elegant way to flatten an array of arrays into a single-level array using Power Automate expressions.

Understanding the Problem

Let's say you have the following array:

[
    ["Ford", "Toyota"],
    ["BMW", "Audi"],
    ["Honda", "Nissan"]
]

Instead of dealing with nested arrays, you want to flatten it into a single array:

["Ford", "Toyota", "BMW", "Audi", "Honda", "Nissan"]

The slow way - Array variable

The slow way is to use an array variable: while looping through the top-level array, append or union each result into the array variable.

Variables are so slow I don’t even want to build an example and add a picture.

The faster way - String manipulation

Convert the array into a JSON string, remove the extra array [ and ] characters, reconstruct the array as a string, and convert it back to JSON.

This is the method I was using. It’s much quicker, but you need to be careful which bracket characters you remove; if your data contains nested JSON objects, this needs extra care.
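For intuition, here is a minimal Python analogue of the string-manipulation approach (this replaces only the "],[" seams rather than every bracket, which is one way to reduce the risk mentioned above):

```python
import json

nested = [["Ford", "Toyota"], ["BMW", "Audi"], ["Honda", "Nissan"]]

# Serialize to a compact JSON string, then splice out the inner "],[" seams.
# This mirrors chaining string(), replace(), and json() in Power Automate.
text = json.dumps(nested, separators=(",", ":"))
# text is '[["Ford","Toyota"],["BMW","Audi"],["Honda","Nissan"]]'

# After the replace we have '[["Ford",...,"Nissan"]]'; strip one bracket
# from each end and parse the result back into a flat array.
flat = json.loads(text.replace("],[", ",")[1:-1])
print(flat)  # ['Ford', 'Toyota', 'BMW', 'Audi', 'Honda', 'Nissan']
```

It works, but any child value that itself contains brackets will break the string surgery, which is exactly why the next method is nicer.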

The new fastest way - div and mod flattening

To understand this, you need two numbers: m and n.
m = number of elements in the top-level array
n = number of elements in each child array

Create a range of numbers from 0 up to the total result size, m * n.

Use Select - loop through this range of numbers, then for each number, select the nested result using:
outputs('nested-items')?[div(item(), outputs('n'))]?[mod(item(), outputs('n'))]

Do you see how elegant this is? 🤔

What’s going on here?
Let’s picture it for each of the 6 elements above.

0 = array[0][0]   div(0, 2) = 0, mod(0, 2) = 0
1 = array[0][1]   div(1, 2) = 0, mod(1, 2) = 1
2 = array[1][0]   div(2, 2) = 1, mod(2, 2) = 0
3 = array[1][1]   div(3, 2) = 1, mod(3, 2) = 1
4 = array[2][0]   div(4, 2) = 2, mod(4, 2) = 0
5 = array[2][1]   div(5, 2) = 2, mod(5, 2) = 1

So in one Select, we can flatten an (m * n)-sized array.
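The arithmetic is easy to verify outside Power Automate. Here is the technique as a quick Python sketch, where // plays the role of div and % the role of mod:

```python
nested = [["Ford", "Toyota"], ["BMW", "Audi"], ["Honda", "Nissan"]]
m = len(nested)     # number of elements in the top-level array
n = len(nested[0])  # number of elements in each child array

# One pass over range(m * n), like a single Select action:
# i // n picks the child array, i % n picks the element within it.
flat = [nested[i // n][i % n] for i in range(m * n)]
print(flat)  # ['Ford', 'Toyota', 'BMW', 'Audi', 'Honda', 'Nissan']
```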

What if my child arrays are irregularly sized?
That’s OK. Because we use ?[n], Select returns null for any missing element, so we can follow the Select with a Filter array action to remove the nulls.
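Here is the irregular case sketched in Python, with None standing in for the nulls that ?[...] produces, and a list filter standing in for the Filter array action:

```python
nested = [["Ford", "Toyota"], ["BMW"], ["Honda", "Nissan", "Mazda"]]
n = max(len(child) for child in nested)  # width of the widest child array

def safe_get(child, i):
    # Mimics ?[i]: return None instead of raising an IndexError.
    return child[i] if i < len(child) else None

# The Select pass produces nulls where a child array is short...
selected = [safe_get(nested[i // n], i % n) for i in range(len(nested) * n)]

# ...and the Filter array pass removes them.
flat = [x for x in selected if x is not None]
print(flat)  # ['Ford', 'Toyota', 'BMW', 'Honda', 'Nissan', 'Mazda']
```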

Bonus

This works wonderfully with Pieter’s Method, which returns an array of body JSONs.

Bonus

This works well for a cross-join of two arrays, flattening the pairs into one array.
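As a teaser, the div/mod trick produces a cross-join directly, no nested array required; a Python sketch of the idea:

```python
colors = ["red", "blue"]
sizes = ["S", "M", "L"]
n = len(sizes)

# Each i in range(len(colors) * n) maps to exactly one (color, size) pair:
# i // n indexes the first array, i % n indexes the second.
pairs = [(colors[i // n], sizes[i % n]) for i in range(len(colors) * n)]
print(pairs)
# [('red', 'S'), ('red', 'M'), ('red', 'L'), ('blue', 'S'), ('blue', 'M'), ('blue', 'L')]
```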
(Each of these bonus ideas could be a massive additional blog post; let me know if you want to read them…)

Upgrading SharePointSSO Copilot SPFx to Botframework-WebChat 4.18

This is a quick post about how to make a few packages work together.

Firstly, we have this SharePoint SSO Copilot Studio Sample.
https://github.com/microsoft/CopilotStudioSamples/tree/master/SharePointSSOComponent

It uses SPFx 1.18, which uses TypeScript 4.7.
It also uses botframework-webchat 4.15.9.

There are a few really nice upgrades in botframework-webchat since 4.15, but it also picked up a dependency on TypeScript 5.0.

Lots of people have reported issues with this…
https://github.com/microsoft/BotFramework-WebChat/issues/5345
https://github.com/microsoft/CopilotStudioSamples/issues/260

Now if only we could easily upgrade SPFx to TypeScript 5.0…
Which is exactly what MVP Andrew Connell has written here:
https://www.voitanos.io/blog/sharepoint-framework-typescript-v5/

So the final steps are:

  1. Commit everything first

  2. npm uninstall @microsoft/rush-stack-compiler-4.7 -DE

  3. npm install [email protected] -DE
    npm install [email protected] -DE

  4. npm install @microsoft/[email protected] -DE

  5. Update ./tsconfig.json (see AC’s blog!)

  6. npm install [email protected]

  7. My build failed because a task (lint) wrote output to stderr,
    so I had to switch off a few rules and suppress the error warning.

Surprisingly, everything works and deploys fine in SharePoint Online.


A debug tip for complex conditions in Power Automate #FlowNinjaHack 126

This is #FlowNinjaHack 126

Sometimes, we have complex Condition blocks in Power Automate.

And when we run the flow, it just shows “false”, which is a bit hard to debug.

One thing I’ve started doing is writing a Compose debug statement that shows the output of each component of my condition.
I show the Code View here as well so you get the idea.
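For the condition shown below, such a debug Compose might look like this (a sketch: the label names are mine, but the expressions come straight from the condition's code):

```json
{
    "notDraft": "@not(equals(outputs('Get_item_-_Bulletins_list')?['body/Status/Value'], 'Draft'))",
    "titleNotInFilenames": "@not(contains(body('Select_page_filename'), variables('varTitle')))",
    "oldTitleInFilenames": "@contains(body('Select_page_filename'), outputs('Compose_-_Old_Title'))"
}
```

When the flow runs, the Compose output shows true or false for each component, so you can see exactly which part made the condition evaluate to false.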

Converting from the Condition block to these expressions can be a bit tricky, since you can’t easily use Code View for a Condition. So here’s a second hack.

Paste the Condition’s code into a text editor, something that understands JSON. You’ll get this part.

        "type": "If",
        "expression": {
            "and": [
                {
                    "not": {
                        "equals": [
                            "@outputs('Get_item_-_Bulletins_list')?['body/Status/Value']",
                            "Draft"
                        ]
                    }
                },
                {
                    "not": {
                        "contains": [
                            "@body('Select_page_filename')",
                            "@variables('varTitle')"
                        ]
                    }
                },
                {
                    "contains": [
                        "@body('Select_page_filename')",
                        "@outputs('Compose_-_Old_Title')"
                    ]
                }
            ]
        }

Now this is still not the right format, but you can then go from this to:

not(contains(body('Select_page_filename'), outputs('Compose_-_Old_Title')))

With a lot less pain.
This is the debug message you’ll see when you run now.

Updating AzCopy in Azure Pipeline

You know how the saying goes - if it ain’t broke don’t fix it. Well, something broke in my Azure Pipeline for Flow Studio App a few days ago, and it took a bit of time to figure it out, so it makes sense to write it down. I’m pretty sure I’ll forget again.

The error is related to AzureFileCopy:

  • AADSTS7000222: The provided client secret keys for app '***' are expired. Visit the Azure portal to create new keys for your app: https://aka.ms/NewClientSecret, or consider using certificate credentials for added security: https://aka.ms/certCreds.

  • There was an error with the service principal used for the deployment.

I set up my pipelines years ago and don’t remember what was in them. But there were a few issues:

  • I wanted to switch to the Workload Identity Federation option in Azure Pipelines; it looks like that means I won’t have to keep renewing my keys.

  • I was using AzureFileCopy@3, which is not the latest version (the latest is v6). It also looks like v3 didn’t support the new credentials.

Steps

  • Click the convert button in Azure Pipelines

  • Fix AzCopy arguments

  • Fix a permission issue

Click the convert button

It created this identity. Hmm, no secrets.

Fixing AzCopy arguments

I was using these AzCopy arguments: /S /Y /SetContentType

  • /S is now --recursive=true

  • /Y is now --overwrite=true

  • /SetContentType is apparently default behaviour now, so I didn’t have to set it

  • --as-subdir=false is a new one I needed, because otherwise AzCopy was creating the “Drop” folder from Azure Pipelines as the root folder in Azure Blob storage
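Put together, the updated task in the pipeline YAML looks roughly like this (a sketch, not my exact pipeline; the connection, path, account, and container names are placeholders):

```yaml
- task: AzureFileCopy@6
  inputs:
    SourcePath: '$(Pipeline.Workspace)/Drop/*'    # placeholder path
    azureSubscription: 'wif-service-connection'   # the new Workload Identity Federation connection (placeholder)
    Destination: AzureBlob
    storage: 'mystorageaccount'                   # placeholder
    ContainerName: '$web'                         # placeholder
    AdditionalArgumentsForBlobCopy: '--recursive=true --overwrite=true --as-subdir=false'
```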

Fixing Permissions

For a few hours I struggled with AzCopy not working with the new credentials, and I couldn’t understand why. I think it’s because the old credentials impersonated a person, whereas now I need to grant a specific role to the new identity.

  • Go to the IAM settings of the Subscription (or Resource Group, or Storage account)

  • Add Role Assignment

  • Find Storage Blob Data Contributor

  • Add the service accounts created by Azure Pipelines

It should look something like this at the end.

And success!