
Friday, September 23, 2022

Power Automate: Export To Excel With Dynamic Table & Columns Creation

Hello Friends,
Welcome back with another post on Power Automate. Today, we will learn how to create a table in Excel dynamically, along with its columns, when the number of columns is not predefined, and then export data to that Excel file. The outcome of this post will be-
Let's start-
  1. First of all, we need a SharePoint list. Below is the list I have created. It has 8 columns.
  2. Now, the objective is that I wish to export any combination of columns-
    1. ID, Title, First Name, Last Name 
    2. ID, First Name, Date Of Joining, Salary, Designation
    3. ID, Full Name (First Name + Last Name), Designation
  3. As we can see in the above examples, neither the number of columns nor the columns themselves are fixed. Till now, we were creating a predefined template containing a table with fixed columns and using it to export the data. That approach will not work here.
  4. Microsoft has given a solution for this problem as well. Let's begin-
  5. Create a folder in Site Assets or the Documents library. Let's say "ExcelFiles".
  6. Now create a blank Excel file and save it with the name "EmployeesData.xlsx".
  7. Upload this file to the "ExcelFiles" folder.
  8. Now, open the Power Automate maker portal and create a new Instant cloud flow with the flow name "POC-ExportExcelDynamic" and the trigger "PowerApps".
  9. Add a variable that will interact with PowerApps to get the column names.
  10. Save the flow, and keep saving it at regular intervals while building it.
  11. Now, to create the table in Excel dynamically, we need to know the number of columns that are going to be created and, based upon that count, the Excel cell address. As we know, Excel works upon cell addresses: columns are labelled A, B, C, ... while rows are numbered 1, 2, 3, ...
  12. So, for the first row the first cell address is A1, second cell address is B1 and so on.
  13. Similarly, for the second row the first cell address is A2, second cell address is B2 and so on.
  14. Similarly, a table is also dependent upon cells. For a 3-column table, the header range will be A1 -> C1. For a 4-column table, it will be A1 -> D1.
  15. However, there is no direct method to get the respective cell address from a count, so we have to apply a small trick. We will use an Excel file to generate the series of addresses based upon the count, with the help of Excel's "ADDRESS" function; for example, ADDRESS(1, 4, 4) returns "D1". Here is the Excel file.
  16. We have to create a JSON from this mapping and use it in the flow. Here is the JSON.
    1. [
        {
          "ColumnNo": 1,
          "ColumnAddress": "A1"
        },
        {
          "ColumnNo": 2,
          "ColumnAddress": "B1"
        },
        {
          "ColumnNo": 3,
          "ColumnAddress": "C1"
        },
        {
          "ColumnNo": 4,
          "ColumnAddress": "D1"
        },
        {
          "ColumnNo": 5,
          "ColumnAddress": "E1"
        },
        {
          "ColumnNo": 6,
          "ColumnAddress": "F1"
        },
        {
          "ColumnNo": 7,
          "ColumnAddress": "G1"
        },
        {
          "ColumnNo": 8,
          "ColumnAddress": "H1"
        },
        {
          "ColumnNo": 9,
          "ColumnAddress": "I1"
        },
        {
          "ColumnNo": 10,
          "ColumnAddress": "J1"
        },
        {
          "ColumnNo": 11,
          "ColumnAddress": "K1"
        },
        {
          "ColumnNo": 12,
          "ColumnAddress": "L1"
        },
        {
          "ColumnNo": 13,
          "ColumnAddress": "M1"
        },
        {
          "ColumnNo": 14,
          "ColumnAddress": "N1"
        },
        {
          "ColumnNo": 15,
          "ColumnAddress": "O1"
        },
        {
          "ColumnNo": 16,
          "ColumnAddress": "P1"
        },
        {
          "ColumnNo": 17,
          "ColumnAddress": "Q1"
        },
        {
          "ColumnNo": 18,
          "ColumnAddress": "R1"
        },
        {
          "ColumnNo": 19,
          "ColumnAddress": "S1"
        },
        {
          "ColumnNo": 20,
          "ColumnAddress": "T1"
        },
        {
          "ColumnNo": 21,
          "ColumnAddress": "U1"
        },
        {
          "ColumnNo": 22,
          "ColumnAddress": "V1"
        },
        {
          "ColumnNo": 23,
          "ColumnAddress": "W1"
        },
        {
          "ColumnNo": 24,
          "ColumnAddress": "X1"
        },
        {
          "ColumnNo": 25,
          "ColumnAddress": "Y1"
        },
        {
          "ColumnNo": 26,
          "ColumnAddress": "Z1"
        },
        {
          "ColumnNo": 27,
          "ColumnAddress": "AA1"
        },
        {
          "ColumnNo": 28,
          "ColumnAddress": "AB1"
        }
      ]
      
  17. You may create as many column numbers as you want. Ideally, keep it at "total number of columns in the list + 10" so that if one or two columns get added to the list later on, it will not impact your flow.
  18. Now, add an action called "Parse JSON" to the flow, paste this JSON into Content, and also use this JSON to generate the schema via "Generate from sample".
  19. Now, we have the column list that was passed through PowerApps and we have the mapping of Column Number vs Cell Address.
  20. Add another action called "Filter array".
  21. Here, we will filter the JSON to find the corresponding cell address entry (the ColumnAddress property in the JSON) using the count of columns passed through PowerApps.
  22. It will give us the item whose ColumnNo property equals the total number of columns received from PowerApps.
  23. Now, we will fetch the ColumnAddress property from this output. So, add another action called "Compose".
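     As a rough sketch (the action and variable names here are illustrative, and we assume the column names arrive from PowerApps as a comma-separated string in a variable named ColumnNames), the Filter array and Compose could use expressions like these:

        Filter array - From:       body('Parse_JSON')
        Filter array - Condition:  item()?['ColumnNo']  is equal to  length(split(variables('ColumnNames'), ','))
        Compose - Inputs:          first(body('Filter_array'))?['ColumnAddress']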
  24. The first part is completed. Now, add an action called "Copy file". We will create a copy of the Excel template.
  25. Remember to select the "Copy with a new name" option in the dropdown that asks for the further course of action if another file is already there.
  26. Now comes the climax of the flow. Microsoft has given a unique feature to create a table on the fly. Add an action "Create table".
  27. Provide the input as shown below. The starting range is fixed to A1, followed by a colon ":". The table range can be "Absolute", "Partial Relative" or "Relative". Make sure that, whichever you choose, both cell addresses are in the same format. Here, I have chosen "Relative".
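     For reference, a minimal sketch of the table range expression (assuming the earlier Compose action is literally named "Compose"):

        Table range:  concat('A1:', outputs('Compose'))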
  28. You may give the table name as per your wish; I have given "SearchResults". You may also define a variable for the table name at the beginning, because this table name is going to be used in two places, so using a variable avoids any misspelling.
  29. This was the second part.
  30. The last part is adding items to table.
  31. Add an action called "Get Items".
  32. Next, we will use a "Select" action to transform the data as per our requirement. For example, if we wish to add another column "Full Name", we can do it here by using the "concat" function. Remember, the column name on the left side of the mapping must be exactly the same as the name we are expecting to receive from PowerApps.
  33. So, add "Select" action.
  34. I have used the "concat" function to join First Name & Last Name. Also, I used the "formatDateTime" function to apply formatting to Date Of Joining.
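     As an illustration (the internal field names below are assumptions, so adjust them to match your list), the Select mappings could look like this:

        Full Name:        concat(item()?['FirstName'], ' ', item()?['LastName'])
        Date Of Joining:  formatDateTime(item()?['DateOfJoining'], 'MM/dd/yyyy')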
  35. Now, use an "Apply to each" action to add this data to the Excel table.
  36. Add an action "Apply to each".
  37. Now, add an "Add a row into a table" action inside this "Apply to each" action.
  38. It's all done.
  39. Now, if you wish to send this Excel file as an attachment or send a link to the file, you can add further actions to fulfil those requirements. We have already discussed all of these in our previous posts.
  40. Save the flow and test it.
  41. Let's check with our first set of columns mentioned at the beginning of the post. Remember, column names should be separated either by commas or semicolons and should not have leading or trailing spaces (see the sketch after the examples below for how the flow can split this input).
    1. ID,Title,First Name,Last Name [Correct Format]
    2. ID;Title;First Name;Last Name [Correct Format]
    3. ID, Title,First Name , Last Name [Incorrect Format]
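     Inside the flow, one possible way to turn this string into an array of column names (a sketch, assuming the value sits in a string variable named ColumnNames) is to normalise the separator first:

        split(replace(variables('ColumnNames'), ';', ','), ',')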
  42. Click on "Test" >> "Manually" >> "Test". It will show the Run window and ask for the column names.
  43. Give the input as "ID,Title,First Name,Last Name" and click on Run.
  44. Wait for the flow to complete its execution. It succeeded.
  45. Check the SharePoint library. The file got created with the name "EmployeesData1.xlsx".
  46. Here we go. The data is there.
  47. Now let's try with custom columns-
    1. ID,Full Name,Designation,Date Of Joining,Salary,Experience
  48. Here we go. This also tested successfully.
  49. This is how you can create Excel tables dynamically with any number of columns. You could also create the file at runtime, but for that you would need to include the OneDrive connection, and ultimately you would be creating a file and then copying it, which means one more step. Therefore, we avoided that. Keeping a blank template file is far better than adding one more connection and one more step.
With this, I am concluding this post.
Happy Coding !!!
Will see you again with some new topics.

Stay Safe !
Stay Healthy !

Wednesday, September 21, 2022

Power Automate: Export To Excel For Dynamic View/Filter Selection In PowerApps

Hello Friends,

Welcome back with another post on Power Automate. We have already discussed "Export To Excel" in our earlier posts. However, we are going to discuss it again. This time, we are getting the data based upon the selection of a View as well as the filters applied in PowerApps. This post is also related to my previous two posts-
Let's start-
  1. Open the Power Automate maker portal and create an Instant cloud flow with the trigger type "PowerApps" and the flow name "POC-DynamicExportToExcel".
  2. Now, we have 3 parameters to apply filter upon data-
    1. View Name
    2. Experience
    3. Free Text Search
  3. We will initialize 3 variables (using "Initialize variable" with the value set to "Ask in PowerApps") that will communicate with PowerApps to provide these values to the flow-
  4. Variable1: ViewName
  5. Variable2: Experience
  6. Variable3: FreeTextSearch
  7. Now, we will define another variable named "strFilterCriteriaByViewName". This will be the first filter criteria.
  8. Keep saving your flow frequently.
  9. Here, we will use the same filter conditions as we applied in PowerApps. For that, we will apply a Switch-Case.
  10. Now, refer back to the previous post, where we defined the Views. Check the "Items" property of the DataTable / Gallery.
  11. Please find below the function from PowerApps for ready reference.
    1. Switch(
          Dropdown_SelectView.SelectedText.ViewName,
          "All Employees",
          Filter(
              EmployeeInfo,
              Experience.Value in If(
                  Radio_SelectExperience.SelectedText.Experience = "All",
                  Experience.Value,
                  Radio_SelectExperience.SelectedText.Experience
              ) && (Trim(TextInput_Search.Text) in 'First Name' || Trim(TextInput_Search.Text) in 'Last Name' || 
      Trim(TextInput_Search.Text) in Designation.Value || Trim(TextInput_Search.Text) in Experience.Value || 
      Upper(Title) = Upper(Trim(TextInput_Search.Text))) || Text(
                  'Date Of Joining',
                  "m/d/yyyy",
                  "en-US"
              ) = Text(
                  Trim(TextInput_Search.Text),
                  "m/d/yyyy",
                  "en-US"
              ) || Text(Trim(Salary)) = Trim(TextInput_Search.Text)
          ),
          "Software Engineer",
          Filter(
              EmployeeInfo,
              "Software Engineer" in Designation.Value,
              Experience.Value in If(
                  Radio_SelectExperience.SelectedText.Experience = "All",
                  Experience.Value,
                  Radio_SelectExperience.SelectedText.Experience
              ) && (Trim(TextInput_Search.Text) in 'First Name' || Trim(TextInput_Search.Text) in 'Last Name' || 
      Trim(TextInput_Search.Text) in Designation.Value || Trim(TextInput_Search.Text) in Experience.Value || 
      Upper(Title) = Upper(Trim(TextInput_Search.Text))) || Text(
                  'Date Of Joining',
                  "m/d/yyyy",
                  "en-US"
              ) = Text(
                  Trim(TextInput_Search.Text),
                  "m/d/yyyy",
                  "en-US"
              ) || Text(Trim(Salary)) = Trim(TextInput_Search.Text)
          ),
          "Module Lead",
          Filter(
              EmployeeInfo,
              "Module Lead" in Designation.Value,
              Experience.Value in If(
                  Radio_SelectExperience.SelectedText.Experience = "All",
                  Experience.Value,
                  Radio_SelectExperience.SelectedText.Experience
              ) && (Trim(TextInput_Search.Text) in 'First Name' || Trim(TextInput_Search.Text) in 'Last Name' || 
      Trim(TextInput_Search.Text) in Designation.Value || Trim(TextInput_Search.Text) in Experience.Value || 
      Upper(Title) = Upper(Trim(TextInput_Search.Text))) || Text(
                  'Date Of Joining',
                  "m/d/yyyy",
                  "en-US"
              ) = Text(
                  Trim(TextInput_Search.Text),
                  "m/d/yyyy",
                  "en-US"
              ) || Text(Trim(Salary)) = Trim(TextInput_Search.Text)
          ),
          "Project Manager",
          Filter(
              EmployeeInfo,
              "Project Manager" in Designation.Value,
              Experience.Value in If(
                  Radio_SelectExperience.SelectedText.Experience = "All",
                  Experience.Value,
                  Radio_SelectExperience.SelectedText.Experience
              ) && (Trim(TextInput_Search.Text) in 'First Name' || Trim(TextInput_Search.Text) in 'Last Name' || 
      Trim(TextInput_Search.Text) in Designation.Value || Trim(TextInput_Search.Text) in Experience.Value || 
      Upper(Title) = Upper(Trim(TextInput_Search.Text))) || Text(
                  'Date Of Joining',
                  "m/d/yyyy",
                  "en-US"
              ) = Text(
                  Trim(TextInput_Search.Text),
                  "m/d/yyyy",
                  "en-US"
              ) || Text(Trim(Salary)) = Trim(TextInput_Search.Text)
          ),
          "Program Manager",
          Filter(
              EmployeeInfo,
              "Program Manager" in Designation.Value,
              Experience.Value in If(
                  Radio_SelectExperience.SelectedText.Experience = "All",
                  Experience.Value,
                  Radio_SelectExperience.SelectedText.Experience
              ) && (Trim(TextInput_Search.Text) in 'First Name' || Trim(TextInput_Search.Text) in 'Last Name' || 
      Trim(TextInput_Search.Text) in Designation.Value || Trim(TextInput_Search.Text) in Experience.Value || 
      Upper(Title) = Upper(Trim(TextInput_Search.Text))) || Text(
                  'Date Of Joining',
                  "m/d/yyyy",
                  "en-US"
              ) = Text(
                  Trim(TextInput_Search.Text),
                  "m/d/yyyy",
                  "en-US"
              ) || Text(Trim(Salary)) = Trim(TextInput_Search.Text)
          )
      )
  12. We are going to implement the same in the flow. The primary filter is upon Designation (a sketch of the corresponding filter strings follows this list)-
    1. All Employees - No Filter Criteria
    2. Software Engineer - Filter Designation by "Software Engineer"
    3. Module Lead - Filter Designation by "Module Lead"
    4. Project Manager - Filter Designation by "Project Manager"
    5. Program Manager - Filter Designation by "Program Manager"
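     As an illustration only (not taken from the original flow), the Switch could set strFilterCriteriaByViewName to an OData filter string per view, assuming Designation is a single-value choice column with the internal name "Designation":

        All Employees      ->  (empty string, i.e. no filter)
        Software Engineer  ->  Designation eq 'Software Engineer'
        Module Lead        ->  Designation eq 'Module Lead'
        Project Manager    ->  Designation eq 'Project Manager'
        Program Manager    ->  Designation eq 'Program Manager'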
  13. We have completed the first part.
  14. Now, we will fetch all the items from the SharePoint list that match this filter criteria. For that, we have to define three variables and then use a "Do until" action to perform the activity.
  15. Now, we will use Do-Until action to fetch the items.
  16. Get Items Using ItemID variable & FilterCriteria
  17. Select the required fields
  18. Join the output with arrAllItems array
  19. Update the variable arrAllItems with the previous action result
  20. Update intItemID with the last ItemID from the output of the Get items action. If the Get items output is NULL, then assign it 0.
  21. Now, check whether the output of Get items is empty or not. If empty, set blnIsEmpty to true, otherwise false.
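     To make the loop concrete, here is a rough sketch of the key expressions (the action and variable names are illustrative, and the Get items action is assumed to be ordered by ID ascending):

        Get items - Filter Query:  concat('ID gt ', string(variables('intItemID')), if(empty(variables('strFilterCriteriaByViewName')), '', concat(' and ', variables('strFilterCriteriaByViewName'))))
        Compose (join) - Inputs:   union(variables('arrAllItems'), body('Select'))
        Set arrAllItems - Value:   outputs('Compose_join')
        Set intItemID - Value:     if(empty(body('Get_items')?['value']), 0, last(body('Get_items')?['value'])?['ID'])
        Set blnIsEmpty - Value:    empty(body('Get_items')?['value'])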
  22. Now, we will set the "Configure run after" condition for this action. For this, click on the ellipsis (3 dots) next to the Set blnIsEmpty action and choose "Configure run after".
  23. You will find that "is successful" is selected by default. Select "has failed" as well and click on Done.
  25. This completes the second part.
  26. Now, the third part is to filter the data by the Experience choice. If the Experience selected by the user is not "All", then we have to apply an additional Experience filter upon the outcome of the Do-Until action.
  27. Before that, we will execute this flow once and grab the output schema so that we can parse the output with Parse JSON, as we cannot apply the filter upon the plain array directly.
  28. Then use Parse JSON action.
  29. Now add a "Condition" action and check if Experience is not selected as "All". If true, apply a "Filter array" action on the output of the Parse JSON action and set the output back to the arrAllItems variable.
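     A minimal sketch of that condition and filter (the field name is an assumption, and depending on how the Select action shaped the data, Experience may be a plain string rather than a choice object):

        Condition:              variables('Experience')  is not equal to  "All"
        Filter array - From:    body('Parse_JSON')
        Filter array - Filter:  item()?['Experience']  is equal to  variables('Experience')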
  30. This completes the third part.
  31. Now, the fourth part is to apply filter using free text search.
  32. This is the main section. Here, we will apply the filter on each field separately and capture the outcome in an individual array.
  33. After that, we will combine all these outcomes into a single array, so that only unique results are obtained.
  34. As we can see, we have applied the free text search upon all 7 fields. Thus, we have to create 7 array variables and then apply the filters one by one.
  35. First define the arrays.
  36. Now, check whether the FreeTextSearch variable has any value or not. If yes, then start filtering field by field.
  37. Apply Parse JSON upon arrAllItems so that the filter can be applied.
  38. We will check whether the FreeTextSearch value is contained (unless specified otherwise) in the respective field of the item.
  39. Filter for Title field.
  40. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
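     For example, mirroring the PowerApps formula, the Title filter could be an exact, case-insensitive match written in the Filter array advanced mode (a sketch; the Parse JSON action name is illustrative):

        @equals(toLower(item()?['Title']), toLower(trim(variables('FreeTextSearch'))))

     For the other text fields, a contains() check would play the role of the Power Fx "in" operator.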
  41. Filter for First Name field.
  42. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
  43. Filter for Last Name field.
  44. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
  45. Filter for the Date Of Joining field. Here, we will first format both values to the "M/d/yyyy" format and then check whether they are equal. Additionally, here we will use "Configure run after" as well, because if FreeTextSearch is not a date then formatDateTime will throw an error and our further processing would fail. So, we have to handle that case.
  46. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
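     A sketch of that comparison in the Filter array advanced mode (the field name is an assumption); this is the expression that fails when FreeTextSearch is not a date, which is why the next action needs "Configure run after":

        @equals(formatDateTime(item()?['DateOfJoining'], 'M/d/yyyy'), formatDateTime(trim(variables('FreeTextSearch')), 'M/d/yyyy'))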
  47. Filter for Salary field.
  48. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
  49. Filter for Designation field.
  50. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
  51. Filter for the Experience field. However, we could skip the filter condition upon Experience, as it is already present on the screen in the form of a Radio choice.
  52. Remember, when you are applying a filter, you have to use the output (body) of Parse JSON action. When you are updating the respective array with the output of respective filter, then you have to use the output (body) of Filter action.
  53. Now, we have individual filter results for each free-text filter criterion. We have to combine these outcomes, so we will use a Compose action with the "union" function. The benefit of union is that it keeps only unique items; if an item appears in more than one array, only one instance of it is kept.
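     The Compose input could look like this (the variable names are illustrative):

        union(variables('arrTitle'), variables('arrFirstName'), variables('arrLastName'), variables('arrDateOfJoining'), variables('arrSalary'), variables('arrDesignation'), variables('arrExperience'))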
  54. Now, set the output of this Compose action to arrAllItems.
  55. This completes our fourth part.
  56. Now, the fifth & final part. Here, we will export this output to Excel. For this, you may refer to my earlier blogs using OneDrive as well as a SharePoint library.
    1. Using One Drive-
      1. Export To Excel Using Power Automate
    2. Using SharePoint Library-
      1. Power Automate: Export To Excel (SharePoint)
  57. This is how we can export data to Excel.
With this, I am concluding this post.
Happy Coding !!!
Will see you again with some new topics.

Stay Safe !
Stay Healthy !