Tips on fixing Power BI dynamic value

In Power BI we have the ability to use dynamic values within a text box. This lets us create dynamic paragraphs with data that update automatically.


The problem is that we specify the dynamic value via Power BI Q&A. This can be a giant hit or miss, and when it misses, it's a frustrating experience.

Tip 1 - just use a measure (with a unique name)

The first tip, then, is to create our own measure with a unique name so that it can't be confused with anything else. Within the measure, we can write DAX that returns exactly what we want the dynamic value to be.

We can add a Card visual to inspect the value of our measure, which makes debugging the measure quite simple.

Tip 2 - use Q&A setup, particularly to set up synonyms

I encountered a new problem today: no matter what I tried, Power BI did not recognize the measure (or even the new table I was looking for). I went into Q&A setup and added simpler, unique names for both the measure and the table via the Synonyms dialog, but Power BI was still unable to recognize them.

Tip 3 - too many entities

I tried to export the linguistic schema (beware: this is a YAML file) to see what was wrong with the model, and that's when I got a new error message.

So I went back to the Synonyms dialog and unchecked “Include in Q&A” on several entity tables that did not need to be included. As soon as I did this, dynamic value could suddenly find my measure by name and link up what I wanted to render correctly.

I wanted to write this down as it may help others. This ate two hours for two people, and we created a bunch of rubbish entity names and measure names that I now need to go and clean up…



Power Automate API changes - v2 Admin scope now needs user_impersonation

This blog post is about two major updates to Flow Studio App and Flow Studio for Enterprise.

MSAL v2 Update

First, we finally updated Flow Studio to MSAL v2. There are a few reasons for this, but primarily we want to support modern browsers that now disable third-party cookies by default, which broke the hidden-iframe authentication method used in earlier versions of MSAL and ADAL.js.

Incidentally, this also means Flow Studio App now works on iPad and in Safari, and should work better for many enterprise customers who have third-party cookies disabled.

MSAL v2 also supports multiple accounts, so multi-user or multi-tenancy is an interesting scenario to support in the future. We'll see.


Power Automate Admin API Scope

Secondly, a note on Power Automate API changes and how they affect us.

Power Automate's /scopes/admin/v2/ endpoint supports fetching up to 250 flows per request before paging; by comparison, v1 only returns 50 flows per page. This means reading flows as an admin is once again much quicker.
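
To make the paging difference concrete, here is a minimal Python sketch of OData-style `nextLink` paging, the convention the Power Automate APIs generally follow. The page shape (`value`, `nextLink`) and the fake page data are assumptions for illustration; a real `get_page` would issue an authenticated HTTP GET against the admin endpoint.

```python
def fetch_all_flows(get_page, first_url):
    """Follow OData-style nextLink paging until every page is consumed.

    get_page is any callable url -> dict with a "value" list of flows
    and an optional "nextLink" to the next page. With v2 a page can
    carry up to 250 flows instead of v1's 50, so far fewer round trips
    are needed for large tenants.
    """
    flows, url = [], first_url
    while url:
        page = get_page(url)
        flows.extend(page.get("value", []))
        url = page.get("nextLink")
    return flows

# Offline demo with two fake pages standing in for real HTTP responses:
pages = {
    "page1": {"value": [{"name": "flow-a"}, {"name": "flow-b"}],
              "nextLink": "page2"},
    "page2": {"value": [{"name": "flow-c"}]},
}
print(len(fetch_all_flows(pages.get, "page1")))  # 3
```

The pager is deliberately decoupled from the HTTP layer so the same loop works for v1 and v2; only the page size changes.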


But we’ve also noticed that admin flow requests now need an additional user_impersonation scope.

“Access Microsoft Flow as signed in user” (nice name!)




When customers log in to Flow Studio App v1.1.45 or later, they will be asked to re-consent due to this additional scope.


Field notes using Power Automate with Power BI

I have been working at a client for the last 8 months, learning, testing and building Power BI reports, datasets, dataflows, datamarts and dashboards, with a fairly healthy dose of Power Automate thrown in. I wanted to take a short breather and write down some of the learnings so far. I would also really like to hear your feedback on what your Power BI best practices are.

I will be presenting my experiences and examples at the Australian Digital Workplace Conference next week in Melbourne. I hope I'll be seeing you there.

The system components

  • Call the API of a custom business application to retrieve project & product data. The system uses a NoSQL database, so we are pulling out pages of JSON via Power Automate.

  • At the moment, we store these JSON files into a document library in SharePoint.

  • Use a Power BI dataflow to process and merge this JSON data into a staging table.

  • Use a second Power BI dataflow to transform the data from the staging dataflow and do the actual work.

  • Reports use the staging dataflow, and reference it if additional local transformations are needed.

  • Fancy visuals with Deneb and HTML content visuals.

Calling APIs - we use an HTTP request and calculate the nonce and api-key within Power Automate; this wasn't something Power BI could call directly without some middleware. In particular, we are interested in a subset of projects from our dataset, so every evening Power Automate calls Power BI Dataset Query, fetches a list of project codes and makes the API calls. It also checks whether the response differs from the version stored in SharePoint and skips writing if the JSON has not been modified. (Unfortunately, we don't have last-modified metadata from the source.)
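
In the flow itself this skip-if-unchanged check is an expression over the old and new file contents; here is a minimal Python sketch of the same idea (the function and variable names are mine, not from the flow). Hashing a canonically serialised payload keeps the comparison stable even if the API shuffles key order between calls.

```python
import hashlib
import json

def needs_write(new_payload, stored_payload):
    """Skip the SharePoint write when nothing actually changed.

    The source gives us no last-modified metadata, so we compare the
    fresh API response against the stored copy. Serialising with
    sort_keys=True makes the comparison stable under key reordering.
    """
    def canon(payload):
        return hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
    return canon(new_payload) != canon(stored_payload)

old = {"project": "P-100", "status": "active"}
same = {"status": "active", "project": "P-100"}   # same data, new key order
changed = {"project": "P-100", "status": "closed"}

print(needs_write(same, old))     # False - skip the write
print(needs_write(changed, old))  # True  - store the new version
```

Skipping unchanged writes matters downstream too: files that aren't rewritten don't need to be re-ingested by an incremental-refresh dataflow.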

We put the JSON data into a SharePoint library and store the files by month - the business has a monthly reporting cycle, and we wanted data captured for last month vs this month. An alternative would be to use Azure Blob Storage for this staging area, because once we reached several hundred JSON files, SharePoint often threw a too-busy error to Power BI during data refresh.

On re-use, we tried a myriad of methods, and we found using two dataflows works best. The first dataflow provides the raw JSON content per file from SharePoint. This dataflow is configured for incremental refresh, so if a JSON file isn't updated, it doesn't need to refresh in the dataflow (this solves our file-too-busy problem). A datamart can't easily be used within the ETL of another dataflow (it's more suitable for DirectQuery).

Our second dataflow is where we transform the JSON data: we navigate the JSON structure and pull out records and lists accordingly.
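
The real navigation happens in Power Query M inside the dataflow; this Python sketch only illustrates the shape of the transformation - turning one nested JSON document into flat rows - and every field name in it is invented for the example.

```python
def flatten_project(doc):
    """Sketch of the second dataflow's job: walk a nested JSON
    document and emit one flat row per nested list item.
    All field names here (code, name, items, sku, qty) are
    hypothetical stand-ins for the real schema."""
    base = {"code": doc["code"], "name": doc["name"]}
    # One output row per line item inside the nested list.
    return [dict(base, item=i["sku"], qty=i["qty"])
            for i in doc.get("items", [])]

doc = {"code": "P-100", "name": "Widget revamp",
       "items": [{"sku": "A1", "qty": 2}, {"sku": "B7", "qty": 5}]}
print(flatten_project(doc))
```

In M, the equivalent steps are the usual Record.Field / Table.ExpandListColumn style navigation; the point is the same - repeat the parent fields onto each expanded child row.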

We try to keep local transformations or modifications within the Power BI report very light. Instead, as much as possible, we want to run the transforms within the dataflow.

This is a high-level overview blog post and there’s quite a bit more (smaller, bite sized) notes I want to write down in time, but this will do for now, and we’ll leave the rest for a future discussion.

Other design decisions, notes and thoughts

  • What Power Automate brings to the table.

  • What Power BI dataflows bring to the table.

  • Notes on using DAX and Power Query M

  • Using Dataverse

  • Considerations with using SharePoint or Azure Blob Storage

  • Licensing costs

  • “Soon”: Azure Data Factory and Microsoft Fabric offerings

Turning a new chapter - the MVP Alumni post

A few months ago I wrote that I'm turning a page: I've found motivation to move forward again, and have begun to actively change my destiny (instead of passively waiting for it to happen to me). I wanted to give an update, as I feel I'm moving onwards into a new chapter.

  • Some things changed a lot

  • Some things didn’t change much

  • Some things I must give my full attention to

  • Some things I must let go

Some things changed a lot

At the end of last year, I met someone through church meetings who was a friend, who became a close friend, then a fiancée and now my wife. Sometimes the right person appears suddenly in your life when you are at your lowest, and she didn't ask questions, she just picked me up. I'm grateful to recognize that God is giving me a second chance, and I wanted to take that path and move forward with purpose.

  • Moved out from previous family home with a big renovation project completed.

  • Resumed part time work that quickly became full time.

    • Exercising a lot of my Power Automate and Power BI muscles

  • Lost weight, became healthier

    • and working on gaining more energy via fixing my sleep

  • Married, with two more children, so now a total of four.

    • also added two cats.

  • Moved into a bigger house, with its various organization and fixing challenges

  • I have not been very active in the communities (COVID, and later getting swamped with full-time work), so alas, I can't maintain my Microsoft MVP status. After seven years, from 2017 to 2023, I must pass the spot on to someone else.

  • I realized I can do lots of DIY projects and I love doing them. It’s refreshing to realize that I can build physical things and not just digital things.

  • I also realized I really quite like the tranquility of going camping.

Some things didn’t change much

  • I still love to chat and tell people how to build things

  • I am speaking at the Digital Workplace Conference on August 2-3, presenting topics related to Power Automate and Power BI dataflows.

  • I will still find time to blog and make YouTube content

  • I still love working on Flow Studio and helping customers.

Some things I must give my full attention to

  • Flow Studio, my startup, must continue to grow and I’m working on multiple aspects of our offerings relating to Power Platform monitoring and governance.

  • Grow my YouTube channel; I think my YouTube channel and my blog are where I'll continue to reach more of you.

    • Twitter where we met before is becoming such a strange place

    • Reddit is weird too

    • Some of the community events are no longer around after Covid

  • My health, which is improving, so I'll still be here for a long time to write things.

  • My family, which requires me to be around for a long time

Some things I must let go

A younger me thought: hey, I'll build cool things and tell people how to build them. You'll find that person on this very blog, over the span of twenty years.
Over two decades, I had fun with you all. I built a career, which let me build what I wanted and stay on the cutting edge (yes, I bleed a lot so you don't have to bleed as much).
I wrote, spoke, presented and taught on these topics. I received and maintained the MVP award status.

But I must tell you, when everything comes crashing down, none of this mattered. Career, wealth, fame, travels, future.

Consider what is closest and important to you. Do more of that. Spend time with people that you should have - don’t regret it afterwards when it is too late to do anything different.

I am thankful for the MVP program for recognizing my contributions to the community, and for the many beautiful friendships I’ve made along the way. I’m grateful that I have a clear path to move on, and I hope you all are still here to cheer me on.

It feels so good to put these things into words. I'm forty-four. I feel like I've climbed out of a valley. There's a new horizon in sight; I have a second chance.

I’m always here, online and physically in person around Australia.
I’m sure we’ll meet again soon.

Using Power Automate to provide Timezone and Local Time to Power BI Dataset and Data Flow

Problem:

In Power BI data transformation, we have useful methods such as DateTime.UtcNow() and DateTime.LocalNow(). They work great when running from the desktop, but when running as a dataset or dataflow refresh within the Power BI service, they fall apart, because the Power BI service has no concept of a local timezone. Everything is UTC. The regional locale settings only control the formatting of dates and currency; they do not set the timezone.

Some ideas on how to fix this:

  • There are useful methods such as DateTimeZone.RemoveZone and DateTimeZone.ZoneHours, but again they don't work within the Power BI service.

  • Manually add a time duration offset everywhere - but then you can't handle daylight savings.

  • Call an external website (Display Last Refreshed Date in Power BI - Excelguru) to calculate and get the correct local date time. Except the suggested website, timeapi, has gone offline.

  • This blog: write a Power Automate flow to help us, because Power Automate has a fairly robust set of Date Time actions that handle timezones based on the .NET timezone library.
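
The reason a named-timezone conversion beats a fixed offset is daylight savings. Power Automate's Convert time zone action gets this from the .NET timezone database; the same DST-aware behaviour can be sketched in Python with the standard-library zoneinfo module (this snippet is only an illustration of the principle, not part of the flow):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # Python 3.9+ standard library

def to_local(utc_dt, tz_name):
    """Convert an aware UTC datetime to a named timezone, honouring DST."""
    return utc_dt.astimezone(ZoneInfo(tz_name))

# Sydney observes DST, so a fixed "+10" offset would be wrong half the year:
summer = datetime(2023, 1, 15, 0, 0, tzinfo=timezone.utc)  # AEDT period
winter = datetime(2023, 6, 15, 0, 0, tzinfo=timezone.utc)  # AEST period

print(to_local(summer, "Australia/Sydney").hour)  # 11 (UTC+11 in January)
print(to_local(winter, "Australia/Sydney").hour)  # 10 (UTC+10 in June)
```

A hand-maintained duration offset in M would silently drift by an hour twice a year, which is exactly the failure mode the flow-based approach avoids.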

Make our flow

  1. Create an HTTP Request trigger and change the method to GET

  2. Add Current Time

  3. Save the flow once - this will create the GET URL in the HTTP Request trigger

  4. Add two Convert time zone actions - take the current time (in UTC) and convert it to the timezones you want.

  5. Add a Compose action and create a JSON object with three attributes: UTC, Local and SeattleLocal. Since I'm based in Sydney, my "Local" is the Australian East timezone.

  6. Add an HTTP Response action and put the Compose output in the Body
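
The steps above boil down to: one current-UTC timestamp, rendered three ways, returned as JSON. A hedged Python sketch of what the Compose/Response pair produces (the attribute names come from the post; the exact datetime format string is my assumption, as Power Automate's default formatting may differ):

```python
import json
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+ standard library

def build_response(now_utc=None):
    """Mimic the flow's Compose action: one UTC timestamp rendered
    in three timezones and returned as the HTTP Response body."""
    now_utc = now_utc or datetime.now(timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%S"  # assumed format, not the flow's exact one
    return json.dumps({
        "UTC": now_utc.strftime(fmt),
        "Local": now_utc.astimezone(ZoneInfo("Australia/Sydney")).strftime(fmt),
        "SeattleLocal": now_utc.astimezone(
            ZoneInfo("America/Los_Angeles")).strftime(fmt),
    })

print(build_response(datetime(2023, 1, 15, 0, 0, tzinfo=timezone.utc)))
```

Power BI then only has to call the flow's GET URL and parse three ready-made fields, instead of doing any timezone math of its own.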


If you copy the URL and paste it into a browser window, you'll see the JSON response.

Wonderful, the flow is done. Now we go to the dataflow, or to data transformation in the dataset.

Data Set / Data Flow

Conclusion

Lovely, pretty local time - even when refreshing a dataset or dataflow from the Power BI service.