Jon Russell

Azure Blob storage and SAS tokens


JonDoesFlow introducing the blog topic


Introduction


Hey, how are you doing?


I hope you had a good Christmas, and a happy New Year to those of you who celebrate it.


I wanted to get back into blog post writing more this year, after being inspired by Lewis Baybutt's 365 Post Challenge.


I looked back at some of the more successful posts I have written over the years of doing this and found a trend, and it's not one I had really noticed before.


Last year I was diagnosed with ADHD, and it really is my superpower when it comes to Power Platform consultancy. My psychiatrist believes I have had it since I was young, and I have employed lots of coping mechanisms that have become the norm.


For example:

  1. I have a diary for everything

  2. I am surrounded by notebooks

  3. I am very forgetful, which is why I have a diary and notebooks.


This made me think, and I looked back at the blog posts and realized that the writing of these blog posts is a coping mechanism too. Because I find it hard to retain information without writing it down, I used the blog to write my thoughts and steps to achieve a goal as a way of remembering and being able to refer back to a particular problem I faced and how I solved it.


So without further ado, let's get squirrelling on the next blog post:


Azure Blob storage and SAS tokens


I use Azure Blob storage in a lot of the solutions I build, and it is a great, cheap way to store documents. One thing we need to pay special attention to, though, is access. We don't want someone to be able to get a link to a file and access it without us securing and governing the access point to that file, and this is where Azure Blob storage SAS (shared access signature) tokens come in to help secure it.
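
To give a flavour of what we will end up with, a SAS URL is just the blob's URL with a signed query string on the end. Purely as an illustration (the account, container and file names here are made up), it looks something like this:


https://mystorageaccount.blob.core.windows.net/mycontainer/invoice.pdf?sv=2022-11-02&st=2024-01-05T09%3A00%3A00Z&se=2024-01-05T09%3A10%3A00Z&sr=b&sp=r&sig=<signature>


The se parameter is the expiry time, sp is the permissions (r for read only) and sig is the signature, so the link only works for the window and permissions we grant.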


For this blog post, I have created a solution in https://make.powerapps.com called Blob Storage.

I have also created a table called File Storage. It has a primary name column of File Storage ref, which is an autonumber, and a column called File upload, which is a file data type:


File Storage columns

I am also going to create a SAS Blob URL column in the File Storage table; we will come to that later. Set the Maximum character count to 500 for this column.


I am then going to create a Model-Driven app, and edit the form of the File Storage table. I will add the File Upload column and leave the primary name field as is.


If we save the record, the Choose File button becomes active. The record needs to be saved first so that the relationship with the attachments table can be made, which lets us store a file.


We can now upload a file:


Let's head over to https://portal.azure.com and create a storage account and container:


Azure blob storage container in Azure Portal

We can now create a Power Automate cloud flow which will do the following:


  • When a row is modified on the File Storage table

  • Download the File upload

  • Parse JSON to find out the file name

  • Create a file in a SharePoint folder

  • Get the File Content

  • Create the file as a blob in the container

  • Create a SAS token for the blob storage

  • Update the File Storage record with the SAS blob storage URL



Head back to your Blob Storage solution and add a new automated Power Automate cloud flow. Let's call it When a file is uploaded, and choose the When a row is added, modified or deleted trigger from the Microsoft Dataverse connector:


Creating the cloud flow


For the first step, we need to configure the trigger like below:



Step 1 of the cloud flow


We need to trigger on Modified because, as stated above, the record has to be created before we can upload a file to it.
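
As a rough sketch of the trigger settings (treat the scope as an assumption, as it will depend on your environment and security model):


Change type: Modified

Table name: File Storage

Scope: Organization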


Next, we are going to add the Download a file or an image action and set it up as below:





Step 2 of the cloud flow
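
For reference, this action only needs to know which table, which row, and which file column to read from. Assuming the names from my environment (yours may differ):


Table name: File Storage

Row ID: the File Storage unique identifier from the trigger

Column name: File upload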

Next, add a Get a row by ID Dataverse step, as we need to get the details of the file that has been uploaded, including its file name:


Step 3 of the cloud flow

As we need a sample of the response body from the Get a row by ID step, let's save the flow and test it. We can then grab the body from that step's output.


Click Save, then Test, and choose the Manual option. Head back to the Model-Driven app, create a new File Storage record and upload a file, then head back to your flow; it will have run. Expand the Get a row by ID action, copy all of the contents from the body output and paste them into a Notepad file:


Step 4 of the cloud flow

Click Edit to edit the flow, and add a new step. Search for Parse JSON. Choose the body from the Get a row by ID step as the Content, then click Generate from sample and paste in the body you copied in the previous step.


The Parse JSON Step will look like this:



Step 5 of the cloud flow

Next, add a Compose step. What we are going to do now is use the split expression to get the file name suffix (the extension).


We are going to split the file name by the period, and then choose the second item in the resulting array:


split(body('Parse_JSON')?['jr_fileupload_name'],'.')[1]
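
As a quick worked example (the file name here is just an illustration): if the uploaded file is called report.pdf, the split gives us ["report", "pdf"], and index [1] returns pdf. Note this assumes a single period in the file name; if your file names can contain extra dots, wrapping the split in last() instead of taking [1] is a safer way to grab the extension.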


You can find the column name of the uploaded file by having a look at the body content you pasted into Notepad and searching for the file name of the file you uploaded. The column name is in the speech marks to the left of the file name; in our case that is jr_fileupload_name.

For more on the use of split, check this blog post here.


Your Compose step should look like this:


Step 6 of the cloud flow

Let's save the flow and test it with a previous run and see what happens.


We can see from the run that we have successfully captured the file extension:



Step 7 of the cloud flow


Edit the flow again and let's continue. Next, we need to create a document folder in SharePoint, as the file needs to be created there before we can then upload it to Blob storage.


Add a Create file SharePoint action, point it to the folder you have created, and then for the file name add the File Storage ref followed by a hyphen, the formatdatetime expression (to ensure the file name is unique), followed by the File Upload Name.


The file content will come from the body of the Download a file or an image step. This step should look like this:


Step 8 of the cloud flow

The formatdatetime expression and the body expression for the file content are below:


formatdatetime(utcnow(),'yyyyMMddhhmmss')


body('Download_a_file_or_an_image')
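
If you would rather build the whole file name in a single expression instead of mixing dynamic content, something like the following would do it. Note that jr_filestorageref is only an assumption for the logical name of the File Storage ref column; check the actual logical name in your own solution:


concat(triggerOutputs()?['body/jr_filestorageref'], '-', formatdatetime(utcnow(),'yyyyMMddhhmmss'), '-', body('Parse_JSON')?['jr_fileupload_name'])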


If we test and run the flow now from a previous successful run, we can see that the file does get created in the SharePoint folder we specified:


Document creation in SharePoint

Next we add a Get File content SharePoint step:



Step 9 of the cloud flow


This allows us to send the content to Blob Storage. So create another new step, this time choosing the Create blob (V2) action. If this is the first time you have set this up, you will need to change the Authentication type to Access Key, and pop in the Azure storage account name and the access key, which you can get from the Storage Account page in the Azure Portal.


Click Create to create the connection reference:


Creation of the Blob Storage connection reference

You'll then be able to add the elements to the Create blob (V2) step. We can use the connection settings we have created for the Storage Account Name, which will allow us to choose the folder path. For the blob name, we can copy over the file name from the Create file step above, and the blob content comes from the body of the Get file content step, that is:


body('Get_file_content')


Step 10 of the cloud flow

Let's save and test, and see what happens.


Wahoooo, it worked, and there we can see the file inside the Blob Storage container:


Creation of file in blob storage container

SAS Token creation


Next we need to add a Create SAS URI by path (V2) step. We can set it up as below. This step generates a signed, time-limited (SAS) URL for the blob, and we have set the start time to now and the expiry time to 10 minutes later using expressions:
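
The exact values are in the screenshot, but as a sketch, a start time of now and an expiry 10 minutes later can be written with these expressions:


utcnow()


addminutes(utcnow(), 10)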


Step 11 of the cloud flow

Now that we have the URL, we can write it back to the File Storage table, into the SAS Blob URL column:


Step 12 of the cloud flow

When we save the flow, we can see this in the Flow checker:


Flow checker warning

We need to fix this. What is happening is that the flow triggers on a modification of a File Storage row, and the flow itself then updates a File Storage row, so it could keep triggering itself in a loop. To sort this, let's go back to the solution and into the File Storage table, and add a Yes/No column called Blob Created. Now go back to our flow, update the Update a row step at the bottom, and change the Blob created value to Yes.


We also need to create a trigger condition so that the flow only triggers when the Blob created value is no.


We can use the Filter array action to build the trigger condition; you can find out how to do this in my blog post.


Add the following to the trigger, replacing the column name with the logical name from your own solution:


@equals(triggerOutputs()?['body/jr_blobcreated'], false)
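
If you find the flow does not fire for rows that existed before the Blob Created column was added (where the value can come through as null rather than No), a slightly more defensive version of the same condition, purely as a suggestion, is:


@or(equals(triggerOutputs()?['body/jr_blobcreated'], false), equals(triggerOutputs()?['body/jr_blobcreated'], null))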


We also need to add jr_blobcreated to the Select Columns part of the trigger step.


The flow checker will still complain with the warning, but that's OK; we have sorted it now.


Finally, we need to delete the file that we created in SharePoint:



Test


Let's test it again by running the flow using a previous successful run.


When we run the test, we can now see that the SAS URL has been generated:


Step 13 of the cloud flow

And that is a wrap. Wahoo. Thanks for staying with me!


