How to use galleries to create dynamic entries in a data source in PowerApps

In this article, we will see how to use galleries in PowerApps to create multiple rows for adding records to a data source. We will build dynamic entries in a gallery that looks like a form and adds or deletes a line/row at the press of a button.

Scenario: XYZ Inc. is a sales company that deals in the sale of hardware components from manufacturers to retailers. User A is an on-field sales agent of XYZ Inc. and uses a static application to enter order details from a customer. This application is connected to a SharePoint list and creates a new item on the list whenever User A enters the details and hits the submit button. The application only allows entering one order detail at a time, so User A ends up spending extra effort and time entering those details.

We designed a customized PowerApp for XYZ Inc. where User A authenticates and lands on the Order Details page. User A can view all their previous entries and search for an order by entering the name of the customer, vendor, invoice number, etc. Functionality to add details is provided within the app: User A clicks the add new orders button and a form gallery is displayed. User A can add multiple records by creating new lines with the press of a button in the form gallery. A local collection with all the entries on the form is created in PowerApps. Once User A hits the “Finish & Save” button, an item for each entry is created on the SharePoint List and the Order Details gallery is updated with these newly added records.

Let’s look at the component-wise description of the controls in the app. The schema for data on the SharePoint List is:

S.No | Column Name | Column Data Type
1 | Title (Order Number) | Single Line of Text (255 Chars)
2 | Customer | Single Line of Text (255 Chars)
3 | Shipping Address | Single Line of Text (255 Chars)
4 | Billing Address | Single Line of Text (255 Chars)

In the App's OnStart property, the expression used is:

ClearCollect(DynamicGallery, {Value: 1});
Clear(OrderCollection);
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order1")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order2")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order3")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order4")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order5")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order6")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order7")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order8")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order9")))

Explanation: Here, I am creating a collection “DynamicGallery”; the number of records in it drives the number of rows shown in the gallery control used for creating new orders. I am also creating another collection, “OrderCollection”, which contains all the Order Details from the SharePoint List named “OrderDets”.

Note: The “StartsWith” function is not delegable if a variable is passed as the second argument, which is why I am using multiple “Collect” statements with literal prefixes to iterate over all possible values.
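For comparison, here is a minimal sketch (not used in the app) of the single-statement alternative that this pattern avoids. Collecting the whole list in one non-delegable call would silently stop at the data row limit (500 by default, 2,000 at most), whereas each delegable “StartsWith” filter above returns its full subset:

// Naive alternative (sketch only): capped at the non-delegable data row limit
ClearCollect(OrderCollection, OrderDets)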

Galleries to create dynamic entries in a Data Source in PowerApps 1

  1. This icon is the Home Page icon and clicking on this navigates the user to the home screen
  2. This icon is the Order Details Screen icon and clicking on this navigates the user to the Order Details Screen
  3. This icon is the Edit an Item icon and clicking on this allows the user to edit a particular item
  4. This icon is the Refresh Icon and clicking on this refreshes the data source, the expression used here is:

Refresh(OrderDets);
Clear(OrderCollection);
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order1")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order2")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order3")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order4")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order5")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order6")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order7")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order8")));
Collect(OrderCollection, Filter(OrderDets, StartsWith(Title, "Order9")))

Explanation: This refreshes the data source (“OrderDets” SharePoint List). It also clears the existing data from the “OrderCollection” collection and refills it with the new data.

  5. This is a gallery control that is populated with all the items of the SharePoint list
    • The expression used in the “Text” property of the “Order Details” label is:

"Order Details Total Orders Count:"&CountRows(OrderCollection)

Explanation: This expression concatenates the static text (wrapped in quotes) with the integer returned by the “CountRows” function applied to the “OrderCollection” collection.

    • The expression used in the “Items” property of the Gallery is:

Sort(
    Filter(
        OrderCollection,
        If(
            !IsBlank(TextInput3.Text),
            StartsWith(Title, TextInput3.Text) || StartsWith(Customer, TextInput3.Text),
            true
        )
    ),
    Value(Last(Split(Title, "r")).Result),
    Descending
)

Explanation: If the search box (TextInput3) is not blank, the collection is filtered to orders whose Title or Customer starts with the search text; otherwise all records are shown. The result is sorted in descending order by the numeric portion of the order number, extracted by splitting the Title on “r” and taking the last segment.

  6. This is a stack of Text labels used to show information when an item is selected in the “Order Details” gallery. Expressions used on the labels:

Customer: Gallery5.Selected.Customer
Shipping Address: Gallery5.Selected.'Shipping Address'
Billing Address: Gallery5.Selected.'Billing Address'

Explanation: Each line is an individual expression that fetches the corresponding attribute of the item selected in the “Order Details” gallery (Gallery5).

  7. This is a button control that enables the gallery control for the user to create dynamic lines and enter the order details. Expression used on this button:

ClearCollect(DynamicGallery, {Value: 1});
Set(NewOrder, true);
Set(ResetGallery, false);
Set(ResetGallery, true)

Explanation: Here I am recreating the “DynamicGallery” collection to hold just one value, which corresponds to one row of the newly visible dynamic gallery control. I am also setting two variables, “NewOrder” and “ResetGallery”, that control the visibility/reset of this dynamic gallery control, the “Total Number of New Orders” label, and the “Finish & Save” button.
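As a minimal sketch (these property assignments are assumptions based on the explanation above, not taken verbatim from the app), the two variables would typically be wired up like this. Toggling “ResetGallery” to false and back to true makes the Reset property of the inputs fire, clearing any values left over from a previous entry:

// On each text input inside the dynamic form gallery (assumed):
Reset: ResetGallery

// On the dynamic gallery, the "Total Number of New Orders" label, and the "Finish & Save" button (assumed):
Visible: NewOrder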

Use Galleries to create dynamic entries in a Data Source in PowerApps 2

  8. This is the dynamic gallery control that I customized for user inputs. This gallery control has four text input controls to get the values for each of the attributes of the SharePoint List. The user can create multiple lines (one at a time) to add multiple records in one go. Data from each line is directly patched to the data source to create a new item. The user can remove the line by clicking the “X” icon. Configuration of the elements of the gallery control:

Gallery Properties:
Items: DynamicGallery
Visible: NewOrder

Explanation: “DynamicGallery” is the collection that holds the count of the orders to be added. “NewOrder” is a variable used to set the visibility of the controls.

  9. This icon is to remove the current row from the dynamic gallery. The expression used on this is:

Icon Properties:
OnSelect: Remove(DynamicGallery, ThisItem)
Visible: If(ThisItem.Value <> Last(Sort(DynamicGallery, Value, Ascending)).Value, true, false)

Explanation: We remove the current item from the gallery when this icon is pressed (its “OnSelect” property). The icon's visibility is set so that it shows up only if the current row is not the last item of the dynamic gallery.
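As a side note, since the comparison already returns a Boolean, the If wrapper is optional; an equivalent, slightly leaner form of the same Visible rule would be the following sketch (the same simplification applies to the add icon below, using = instead of <>):

Visible: ThisItem.Value <> Last(Sort(DynamicGallery, Value, Ascending)).Value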

  10. This icon is to add a new row/line to the dynamic gallery. The expression used on this is:

Icon Properties:
OnSelect: Collect(DynamicGallery, {Value: ThisItem.Value + 1})
Visible: If(ThisItem.Value = Last(Sort(DynamicGallery, Value, Ascending)).Value, true, false)

Explanation: We add a row/line by adding an item to the dynamic gallery collection when this icon is pressed (its “OnSelect” property). The icon's visibility is set so that it shows up only on the last item of the dynamic gallery.

  11. This button performs the patch action on the “OrderDets” SharePoint List and patches all the entries made by User A in the dynamic gallery. The expression used in this control is:

ForAll(
    Gallery3_1.AllItems,
    Concurrent(
        Patch(
            OrderDets,
            Defaults(OrderDets),
            {
                Title: TextInput2_6.Text,
                Customer: TextInput2_7.Text,
                'Shipping Address': TextInput2_4.Text,
                'Billing Address': TextInput2_5.Text
            }
        ),
        Patch(
            OrderCollection,
            Defaults(OrderCollection),
            {
                Title: TextInput2_6.Text,
                Customer: TextInput2_7.Text,
                'Shipping Address': TextInput2_4.Text,
                'Billing Address': TextInput2_5.Text
            }
        )
    )
);
ClearCollect(DynamicGallery, {Value: 1});
Set(NewOrder, false);
Refresh(OrderDets)

Explanation: In this control, the Concurrent function executes two Patch commands, one against the data source (the “OrderDets” SharePoint List) and the other against the local collection (“OrderCollection”), based on User A's inputs in each line/row of the dynamic gallery. The “DynamicGallery” collection is then reset to hold a single value, the variable “NewOrder” is set to “false” to toggle the visibility of the dynamic gallery, and finally the data source is refreshed.

Note: We are doing a concurrent patch instead of refreshing and recollecting the data in the collection “OrderCollection” from the data source to optimize the operations in the app.

  12. This is the text label control that displays the total number of current lines/rows User A has created. The expression used here is:

"Total Number of New Orders: "& CountRows(Gallery3_1.AllItems)

Explanation: Here the text “Total number of New Orders” is being concatenated with the number of rows of the dynamic gallery.

Use Galleries to create dynamic entries in a Data Source in PowerApps 3

  13. This is the text input control of the dynamic gallery. Here I am validating the text input and checking against the “OrderCollection” whether the entered Order Number already exists. If it exists, the user gets an error notification. The expression used in the “OnChange” property of this control is:

If(TextInput2_6.Text in OrderCollection.Title,Notify("Order Number already exists!",NotificationType.Error))

Explanation: Here the If condition checks whether the text of the text input control already exists in the “Title” column of the “OrderCollection” collection and pops an error message if the condition is met.
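A small extension for illustration (an assumption, not part of the original app): the same check could also reset the offending input so a duplicate order number is not left sitting in the row:

If(
    TextInput2_6.Text in OrderCollection.Title,
    // Warn the user, then clear the duplicate value (assumed to be acceptable UX here)
    Notify("Order Number already exists!", NotificationType.Error);
    Reset(TextInput2_6)
)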

  14. This is the error notification generated when the user enters an existing order number in the text input.

Use Galleries to create dynamic entries in a Data Source in PowerApps 4

  15. This is the App settings page of the app, where we increase the soft data row limit for non-delegable queries from 500 to 2000.

In this article, I have shown a basic implementation of the dynamic galleries concept and of patching multiple items/records to a SharePoint data source. This can be replicated, with minor changes in the expressions, for other data sources such as CDS, Excel, SQL, etc.

I hope you found this interesting and it helped you. Thank you for reading!

AIS is the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales. This is our vision for the Power Platform era.

I am incredibly excited to share that AIS has been announced as the 2019 MSUS Partner Award Winner – Business Applications – Dynamics 365 for Sales at #MSInspire!

Some background on how we won:

When the National Football League Players Association (NFLPA) needed to score a big win for its members, they brought in the AIS team to build a single, shared player management system, called PA.NET. AIS extensively customized Dynamics 365 for Sales to meet the unique needs of NFLPA, integrated it with Office 365… and then took it all to the cloud with Microsoft Azure.

Using Dynamics 365 for Sales, PA.NET provides one master set of player data and powerful reporting tools. Now employees across the organization can turn to the same system to answer questions, uncover marketing and licensing opportunities, and identify other ways to help members. When a specific licensing request comes in, they can find the right person, or people, in minutes.

So where do we go from here? From Dynamics to Power Platform.

Our Business Applications & Automation Practice is investing heavily in Dynamics and the Power Platform. We recognize that an organization’s adoption of the Power Platform should be thought of as a journey, not a one-off “app of the moment” solution. By focusing on enterprise management and leveraging the Common Data Service (CDS) as much as possible, we help clients like NFLPA scale their adoption as they migrate workloads and make use of PowerApps, Power BI, Flow, and Dynamics 365.

Power Platform Technologies

Earlier this year, we worked with friends in the business applications community around the world to launch our Power Platform Adoption Framework. Mature organizations realize that rigor, discipline, and best practices are needed to adopt the platform at scale.

The Power Platform Adoption Framework is the start-to-finish approach for adopting the platform at scale.

It helps enterprise organizations:

  • Get to value quickly
  • Educate, train, and grow their community of developers and power users
  • Create durable partnerships between business, IT, and the user community
  • Continuously improve ROI on the platform by identifying and migrating new workloads
  • Blend agile, rapid app development with rigorous, disciplined enterprise management

I hope that the framework will continue to become a worldwide standard for enterprise-grade adoption of the Power Platform. I’ve been lucky to collaborate with Power Platform experts and users around the world to create the Power Platform Adoption Framework. I’m proud to say that AIS is fully behind the framework, sharing it with the community, and committed to its future development as best practices for scaled adoption evolve. We’re sharing it so that everyone can use it because we believe that a vibrant and thriving community around this technology is good for everyone who uses it.

Please join me in congratulating the AIS team, and please join us on this journey to scale the Power Platform to meet the challenges of the years to come.

A Single Place to Manage, Create, and Consume: Azure Monitor and OMS

The integration of the Operations Management Suite (OMS) into Azure Monitor is completed for both Azure Commercial and Azure Government. This change by Microsoft has given Azure Monitor/OMS users a single place to manage, create, and consume Azure Monitoring solutions. No functionality has been removed and documentation has been consolidated under the Azure Monitor documentation. With this consolidation of services, there have been some terminology changes that will impact the way one talks about Azure Monitor components. The consolidation of OMS and other Azure services into Azure Monitor is simplifying the way you manage the monitoring of your Azure services.

Updated Terminology

Microsoft has updated some of the terminology for the Azure Monitor components to reflect the transition from OMS. I have highlighted some examples:

  • The log data for Azure Monitor is still stored in a Log Analytics Workspace, but the term Log Analytics in the Microsoft documentation is now Azure Monitor Logs.
  • The term log analytics now applies to the page in the Azure portal used to write and run queries and analyze log data.
  • What was once known as OMS Management solutions has been renamed Monitoring solutions (items like Security & Compliance and Automation & Control).

Azure Monitor — Your 1 Stop “Monitoring & Alerting” Shop

Azure Monitor is now pretty much the one-stop shop for your monitoring and alerting needs (the exception being Azure Security Center, which is still the place to go for most of your security and compliance needs).

Azure Monitor is broken out into four main categories in the Azure Portal:

  1. The main components of Azure monitor
  2. Insights
  3. Settings
  4. Support + Troubleshooting.

The main components include the Activity log, Alerts, Metrics, Logs, Service Health, and Workbooks.

Under Insights, there are Applications, Virtual Machines, Containers, Network, and “…More”.

The Settings category includes Diagnostics settings and Autoscale.

And finally, under Support + Troubleshooting, there is Usage & estimated costs, Advisor recommendations, and New support request.

Check out the below table that provides an overview of the Azure Monitor Components and Descriptions:

Azure Monitor Component | Description
Overview | Overview of Azure Monitor
Activity Log | Log data about the operations performed in Azure
Alerts | Notifications based on conditions that are found in monitoring data, both metrics and logs
Metrics (Metrics Explorer) | Plotting charts, visually correlating trends, and investigating spikes and dips in metrics’ values
Logs (Azure Monitor Logs) | Useful for performing complex analysis across data from a variety of sources
Service Health | Provides a personalized view of the health of the Azure services and regions you’re using
Workbooks | Combine text, Analytics queries, Azure Metrics, and parameters into rich interactive reports
Applications | Application Performance Management service for web developers
Virtual Machines | Analyzes the performance and health of your Windows and Linux VMs and monitors their processes and dependencies on other resources and external processes
Containers | Monitor the performance of container workloads deployed to either Azure Container Instances or managed Kubernetes clusters hosted on Azure Kubernetes Service (AKS)
Network | Tools to monitor, diagnose, view metrics, and enable or disable logs for resources in an Azure virtual network
More | Replacement for the OMS Portal Dashboard
Diagnostic Settings | Configure the diagnostic settings for Azure resources (formerly known as Diagnostic Logs)
Autoscale | Consolidated view of Azure resources that have Autoscale enabled
Usage and estimated costs | Consumption and cost estimates of Azure Monitor
Advisor Recommendations | Link to Azure Advisor
New support requests | Create a support request

Passing just about anything from PowerApps to Flow with the newly released JSON function

In this article, I will show you how we can send data from a Canvas App using the freshly released JSON function. I will pass data from a data table (of a SharePoint List), the microphone (an audio recording), and the camera control (a photo) to an MS Flow. Condition logic is set up in the Flow to check the file type and create the files accordingly in a dedicated SharePoint Library.

This article focuses on a canvas app and a flow. We will look at the component-wise structuring of both the app and the flow to achieve the objective.

Canvas App

Let’s look at the control-wise screens and functions used in the Canvas App.

  1. Data from a SharePoint list is displayed on a Gallery control in the app. A user can export this data to a PDF file and save it to SharePoint Document Library, and download it in the browser window.

Gallery Control

Here, we have a Gallery (‘Gallery2’) control that is populated with the data from a SharePoint List. The data is filtered to show only the first 10 records. The expression used on the ‘Items’ property of the Gallery control is:

FirstN(ShowColumns(OrderDets,"Title","Customer","ShippingAddress","BillingAddress"),10)

Explanation: Get the first 10 items from the ‘OrderDets’ SharePoint list and get the columns as specified.

The ‘Create PDF’ button creates a local collection and then triggers an MS Flow and passes the collection as an argument along with the desired file name using the JSON function. Finally, once the PDF is created and the Flow is executed successfully, the PDF file is opened in a new tab of the browser. The expression used on this button is:

ClearCollect(PDFCollection,{Name:Concatenate("Test123",Text(Today()),".pdf"),Url:JSON(ShowColumns(Gallery2.AllItems,"Title","Customer","ShippingAddress","BillingAddress"))});Launch(CreateFilesSharePoint.Run(JSON(PDFCollection,JSONFormat.IncludeBinaryData)).responsereturned)

Explanation: The ‘ClearCollect’ function creates a collection named ‘PDFCollection’ and this stores the data in the gallery control and the name of the PDF file. The name of the PDF file is a concatenated string with the naming convention of ‘Test123-today’s date.pdf’. The ‘URL’ key inside the ‘PDFCollection’ stores string type value for the table formatted Gallery items, using the JSON function. This value is later parsed as JSON while sending as an argument to the Flow. The ‘Launch’ function opens a new browser window to launch the newly created PDF file’s URL received as a response from the ‘CreateFilesSharePoint’ flow.

  2. The Microphone control on the app is used to record audio. Multiple recordings can be created and played/viewed on the gallery control.

Microphone Gallery Control

Here, we have a Microphone control ‘Microphone1’ to record the audio inputs and store that into a local collection ‘AudioCollection’. The Expression used on the ‘OnStop’ property of the Microphone control is:

Collect(AudioCollection,{Name:Concatenate("Audio",Text(Today()),Text(CountRows(AudioCollection)),".mp3"),Url:Microphone1.Audio})

Explanation: The ‘Collect’ function updates a collection ‘AudioCollection’ to store the audio recordings with the unique file name. The filename is a concatenated string of ‘Audio-Today’s date-index of the audio file.mp3’.

The ‘Submit’ button triggers the Flow and creates all the audio recordings as separate files on the SharePoint document library. The Expression used on this button is:

CreateFilesSharePoint.Run(JSON(AudioCollection,JSONFormat.IncludeBinaryData))

Explanation: Here the JSON function converts the audio file URL to binary data and sends the ‘AudioCollection’ data to the ‘CreateFilesSharePoint’ flow.

The ‘Clear’ button clears data from the ‘AudioCollection’.

  3. The camera control is used to click photos in the canvas app. Multiple pictures can be captured and viewed on the Gallery control.

Camera Gallery Control

Here, we have a camera control ‘Camera1’ to capture a picture and store it into a local collection ‘ImageCollection’. The Expression used on the ‘OnSelect’ property of the Camera control is:

Collect(ImageCollection,{Name:Concatenate("Image",Text(Today()),"-",Text(CountRows(ImageCollection)),".jpg"),Url:Camera1.Photo})

Explanation: Collect function updates the ‘ImageCollection’ collection with the unique file name and the URL of the photo taken from the camera control. The name of the file is a concatenated string of ‘Image-Today’s Date-Index of the photo in the gallery control.jpg’.

The ‘Submit’ button triggers the Flow and creates all the images as separate files on the SharePoint document library. The Expression used on this button is:

CreateFilesSharePoint.Run(JSON(ImageCollection,JSONFormat.IncludeBinaryData))

Explanation: Here, the JSON function converts the image file URL to binary data and sends the ‘ImageCollection’ data to the ‘CreateFilesSharePoint’ flow.

The ‘Clear’ button clears data from the ‘ImageCollection’.

MS Flow

Coming to the ‘CreateFilesSharePoint’ flow: This flow is triggered by the button controls on the different screens in the Canvas App.

Action 1: Initialise a variable -> accommodates the input coming from the canvas app.

Action 2: Initialise a variable (2) -> To get the string to send a response back to the canvas app.

Action 3: Parse JSON: Get the dynamic data by parsing the data received from the canvas app according to the schema, where we have an array that contains objects with the attributes ‘Name’ (the file name) and ‘Url’ (the file content).
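For reference, a minimal sketch of what that Parse JSON schema could look like, based on the Name/Url attributes described above (this is an assumption; in practice the schema is generated in the flow from a sample payload):

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Name": { "type": "string" },
            "Url": { "type": "string" }
        },
        "required": [ "Name", "Url" ]
    }
}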

Flow 1

Action 4: Apply to Each control: Iterate over each file item from the body output of the Parse JSON function.

Action 5: Condition control within the Apply to each Control: Split the file name and check if the extension is a PDF file.

If No:

Action 6: Create File 2 in SharePoint: Create a file for the image/audio type in the defined library.

If Yes:

Action 7: Parse JSON 2: The data content passed from the PowerApps as the URL key is now being parsed as individual elements to create an HTML table and then finally create a PDF file out of it.

Action 8: Create HTML Table: Creates an HTML table with the column names as headers and gets the data from the Parse JSON 2 action.

HTML Table from Parse JSON

Action 9: Create File in OneDrive: To create a temporary HTML file from the HTML table generated in the previous step and store it in the ‘Hello’ folder on the OneDrive.

Action 10: Convert File in OneDrive: To convert the previously created HTML file to a PDF document.

Action 11: Create File 2 in SharePoint: To create the PDF file from the converted file from the previous action. The file is stored in the specified document library on SharePoint.

Action 12: Delete File from OneDrive: To delete the temporary HTML file that was created in Action 9.

Action 13: Get file Properties SharePoint: To get the URL of the PDF file created in SharePoint.

Action 14: Set Variable: Set the URL to the file as a string value.

Create and Transform Files

Action 15: Respond to PowerApps: Send the URL of the file created on SharePoint to PowerApps. (Outside of the apply to each control)

Respond to PowerApps

In this blog, we have seen how we can use the JSON function to pass data from PowerApps to Flow. We were able to successfully send binary data (image files, audio recordings) as well as a gallery data table. We can also send collections, data directly from data sources with appropriate filters, etc. Note that the JSON function does not support sending attachments or nested arrays/objects.

I hope you found this interesting and this helped you. Thank you for reading!

Most of the time, deploying database scripts is tricky, time-consuming, and error-prone — specifically when a script fails due to mismatched schema, missing prerequisite data, dependencies, or any other factor.

Thankfully, different tools can automate and simplify the process…one of which is SQL Change Automation from Red Gate.

What is SQL Change Automation?

Put simply, SQL Change Automation (SCA) allows you to develop and deploy changes to a SQL Server database. It automates validation and testing, which can be performed on build and release management systems such as Azure DevOps, TeamCity, Octopus Deploy, Bamboo, and Jenkins.

Installation & Required Tools

  • Download SQL Toolbelt
  • Run the .exe and select only “SQL Change Automation 3.0” and “SQL Change Automation PowerShell 3.1”
  • Visual Studio 2015/2017
  • Azure DevOps

Automated Deployment with SCA & ADO CI/CD

Create an SCA Project

  1. First, create a new SQL Change Automation project by clicking the Create Project button from the SQL Change Automation menu under Tools.
  2. Select the Development (source) and Deployment Target Databases. SCA will detect the differences and create a baseline script.
    Note: This baseline script is created from the selected Target database
  3. The next step is to identify the source database changes that need to be scripted and deployed to the target database. For this, click on the Refresh button in the SQL Change Automation tab. It will list the database objects which are different from the target database.
  4. After selecting the required objects, click the Import and Generate Scripts button. It will automatically generate all the required scripts.
  5. Now go ahead and build the solution!

Setup GIT Repository in DevOps

  1. Create a new repository and upload the SCA project.
  2. Create a new Feature branch and commit the changes to this repo. Raise a pull request and assign the reviewer.
  3. Once the pull request is approved and marked complete, the changes will automatically merge to the master branch.

Setup CI/CD Pipeline in DevOps

  1. Create a new Build pipeline, select the repo.
  2. Add a new build task Redgate SQL Change Automation: Build and configure it.
    Note: This extension must be first installed into your Azure DevOps organization before using it as a task in the build flow
  3. Save the pipeline and queue a new build.
  4. Next, set up a Release pipeline, create a new Release, and select the Build artifact as the input.
  5. Add a new Release task Redgate SQL Change Automation: Release and specify the configuration details like operation type, build package path, target SQL instance, database name, and credentials.
    Note: This extension must be first installed into your Azure DevOps organization before using it as a task in the release flow
  6. Save the Release pipeline and trigger a new Release.
  7. Once the Release is successful, connect to the target database and verify if the new database objects are deployed.
    Note: The target database server can be in Azure or on-premises.

Here is a short video on how to configure SCA and integrate it with an Azure DevOps CI/CD pipeline.

Automated Rollback Using SCA & ADO CI/CD

Rolling back a database deployment is a complicated task. Code on other fronts is rather easy to roll back — just deploy the previous version of the code package and you're done. But databases are not as flexible. Imagine there's an error in a script and all usernames get deleted. There isn't a good way to roll that back! Sure, a backup could be restored. But when was that backup taken? Have any new users been added to the system since that backup? What data will be lost if the backup is restored?

The rollback process needs to be thought through before the deployment to ensure it is effective. The steps below walk through a simple example of how a rollback can be applied in an automated manner using SCA with CI/CD.

  1. First, create a new folder in your SCA solution and name it Rollback. Add your rollback scripts to this folder.
  2. While creating migration scripts (i.e., UP script), also create Down scripts. To create rollback scripts, right-click the database object and select View Revert Script option.
  3. Save the script in a new file and save it under the Rollback folder.
    Note: This rollback script will not be executed as part of the deployment.
  4. If there are any issues post-deployment, copy this rollback script to the Migration folder. Insert the Metadata and save the script.
  5. Commit the script to GIT and complete the pull request.
  6. Raise a new Build and let the Release complete.
  7. Once successful, verify the changes.

Here is a short video on how to perform rollback with SCA generated scripts.

Key Terms:

  • Baseline: The schema of the Deployment Target will be read to create a baseline schema.
  • Shadow Database: SCA keeps the shadow database consistent with all the migration scripts currently in the project as needed, and uses it to verify scripts to detect problems in your code.
  • (Table) [__MigrationLog] keeps track of the migrations and Programmable Objects/additional scripts that have been executed against your database. (Additional executions of Programmable Objects/additional scripts will result in new rows being inserted.)
  • (View) [__MigrationLogCurrent] lists the latest version of each migration/Programmable Object/additional script to have been executed against the database.

Following up my last post on Azure Web App for Containers, in part two we’ll go through the various types of storage options available with Azure Web App for Containers, along with the scenarios where they fit best.

As of writing this post, there are 3 storage options:

  1. Stateless
  2. Storage on App Service Plan
  3. Storage using a Storage Account File Share

Stateless

As you all know, containers without any volume mounts are completely stateless — i.e., the container will not persist data once it is shut down. When you create an instance of Azure Web App for Containers, this is the default option. Reboots on Azure App Service platform can happen from time to time for maintenance. The only files which are persisted across reboots in this mode are the logs which are located under the /home/LogFiles folder.

This scenario is best applicable for APIs where you don’t have to store any data on the server itself. A typical use case would be a 3-tier architecture application where each layer resides as a separate resource in Azure.

Storage on App Service Plan

Using this option allows you to store data on the App Service Plan. To enable this, you have to create an App Setting (WEBSITES_ENABLE_APP_SERVICE_STORAGE=true). In this mode, the /home directory is persisted across reboots. Azure does this by mounting storage behind the scenes at this path and persisting it. As this storage is maintained by Azure, it ensures that it is performant. From what I have seen, if there is an issue (performance or availability) with the storage, Azure will try to switch to the secondary copy, and while this is being done, the storage becomes read-only.

When site level backups are enabled on such an instance, the contents of the /home directory are also backed up. The downside to this is that this storage is only visible to your Web app and is not accessible to the outside world.

This option is best for scenarios where storage is required on the server with a minimal overhead of maintenance like when hosting Drupal & WordPress.

Storage using a Storage Account File Share

As of writing this, this option is still in Preview but allows you to connect external services to the Storage Account. This can be set up by going to the “Path Mappings” section of the Azure App Configuration –

Path Mappings Storage Account

This type of mapping supports both Blob containers and Azure Storage File Shares. Although the functionality provided by both is pretty much the same, the technologies behind the scenes which support these are different. For Blob Containers, Blobfuse is used to handle the translation and mapping of file paths to remote blob paths whereas mounting Azure Storage File Shares uses SMB Protocol and uses the CIFS mounts on Linux.

I would suggest you use the Azure Storage File Shares and not the Blob Containers for these storage mounts as Blobfuse is not POSIX (Portable Operating System Interface) compliant. For best performance and stability, use the Azure Storage File Share mount.

While using this option, do keep in mind that the site backups do not back up the mounts. You would have to manage the backups of the mounts on your own.

This option is best suited for scenarios where you need more control over the storage and the ability to connect other devices to the same storage as the Web App for Container.

Stay tuned for the next part!

As we think about services that Azure can offer, we often think about apps (e.g., App Services, AKS, and Service Fabric) and data (e.g., Azure Storage, Data Bricks, and Azure Data Lake). It turns out that you can also leverage Azure purely as a Network-as-a-Service (NaaS). The network is a basic building block for all the app and data services in Azure, but by NaaS, I am explicitly talking about leveraging Azure networking in a *standalone* manner. Allow me to explain what I mean: in the picture below, you will see a representation of the Azure global footprint. It is so vast that it includes 100K+ miles of fiber and subsea cables, and 130 edge locations connecting over 50 regions worldwide. Think of NaaS as a way to tap into Azure’s global infrastructure to improve the network performance and resilience of your applications, regardless of whether the apps are hosted in Azure or not.

Azure's Global Footprint

Let’s discuss two specific Azure Services that offer NaaS capabilities. Note, there are other services like Azure Firewall – think firewall as a service – that can fall in the NaaS category. However, I am limiting my discussion to two services – Azure Front Door and Azure Virtual WAN. In my opinion, these services closely align with a focus on leveraging Azure network infrastructure and services in a standalone manner.

Azure Front Door Service Icon

Azure Front Door Service

Azure Front Door service allows you to define global routing for your applications that optimizes performance and resilience. Front Door is a layer 7 (HTTP/HTTPS) service. Please refer to the diagram below for a high-level view of how Front Door works – you can advertise your application’s URL using the anycast protocol. This way, traffic directed towards your application will get picked up by the “closest” Azure Front Door and routed to your application hosted in Azure or on-premises – for applications hosted outside of Azure, the traffic will traverse the Azure network to the point of exit closest to the location of the app.

The primary benefit of using Azure Front Door is to improve the network performance by routing over the Azure backbone (instead of the long-haul public internet). It turns out there are several secondary benefits to highlight: You can increase the reliability of your application by having Front Door provide instant failover to a backup location. Azure Front Door uses smart health probes to check for the health of your application. Additionally, Front Door offers SSL termination and certificate management, application security via Web Application Firewall, and URL based routing.

Routing Azure Front Door

Azure Virtual WAN


Azure Virtual WAN offers branch connectivity to, and through, Azure. In essence, think of Azure regions as hubs, that along with the Azure network backbone can help you establish branch-to-VNet and branch-to-branch connectivity.

You are probably wondering how Virtual WAN relates to existing cloud connectivity options like point-to-site, site-to-site, and express route. Azure WAN brings together the above connectivity options into a single operational interface.

Azure Virtual Wan Branch Connectivity

The following diagram illustrates a client’s virtual network overlay over the Azure backbone. The Azure WAN virtual hub is located in the Western Europe region. The virtual hub is a managed virtual network, and in turn, enables connectivity to VNets in Western Europe (VNetA and VNetB) and an on-premises branch office (testsite1) connected via site-to-site VPN tunnel over IPSec. An important thing to note is that the site-to-site connection is hooked to the virtual hub and *not* directly to the VNet (as is the case with a virtual network gateway).

Virtual Network Overlay Azure Backbone

Finally, you can work with one of many Azure WAN partners to automate the site-to-site connection including setting up the branch device (VPN/SD-WAN – software defined wide area network) that automates the connectivity setup with Azure.

In this episode of the Azure Government video series, Steve Michelotti sits down with AIS’ very own Vishwas Lele to discuss migrating and modernizing with Kubernetes on Azure Government. You’ll learn about the traditional approaches for migrating workloads to the cloud, including:

1. Rehost
2. Refactor
3. Reimagine

You will also learn how Kubernetes provides an opportunity to fundamentally rethink these traditional approaches to cloud migration by leveraging Kubernetes in order to get the “best of all worlds” in the migration journey. If you’re looking to migrate your existing legacy workloads to the cloud, while minimizing code changes and taking advantage of innovative cloud-native technologies, this is the video you should watch!


I just returned from Microsoft BUILD 2019 where I presented a session on Azure Kubernetes Services (AKS) and Cosmos. Thanks to everyone who attended. We had excellent attendance – the room was full! I like to think that the audience was there for the speaker 😊 but I’m sure the audience interest is a clear reflection of how popular AKS and Cosmos DB are becoming.

For those looking for a 2-minute overview, here it is:

In a nutshell, the focus was to discuss combining a cloud-native service (like AKS) with a managed database (like Cosmos DB).

Slide deck: Architecting Cloud-Native Apps with AKS and Cosmos DB

We started with a discussion of Cloud-Native Apps, along with a quick introduction to AKS and Cosmos. We quickly transitioned into stateful app considerations and talked about new stateful capabilities in Kubernetes, including Persistent Volumes (PV), Persistent Volume Claims (PVC), StatefulSets, the Container Storage Interface (CSI), and Operators. While these capabilities represent significant progress, they don’t match up with external services like Cosmos DB.

Slide deck: Architecting Cloud-Native Apps with AKS and Cosmos DB

Slide: Cloud-Native Tooling

One option is to use Open Service Broker – it allows Kubernetes-hosted services to talk to external services using cloud-native tooling like svcat (Service Catalog).

Slide: svcat

Slide: SRE

External services like Cosmos DB can go beyond cluster SRE and offer “turn-key” SRE, in essence – specifically geo-replication, API-based scaling, and even multi-master writes (eliminating the need to fail over).

Slide: Multi-Master Support

Slide: Configure Regions

Slide: Portability

Since the Open Service Broker is an open specification, your app remains mostly portable even when you move from one cloud provider to another. Open Service Broker does not deal with syntactic differences, say, connection string prefix differences between cloud providers. One way to handle these differences is to use Helm.

Learn more about my BUILD session:

Here you can find the complete recording of the session and slide deck: https://mybuild.techcommunity.microsoft.com/sessions/77138?source=sessions#top-anchor

Additionally, you can find the code for the sample I used here: https://github.com/vlele/build2019 


As developers, we spend a lot of time developing APIs. Sometimes it’s to expose data that we’ve transformed or to ingest data from other sources. Coincidentally, more and more companies are jumping into the realm of API Management—Microsoft, Google, MuleSoft and Kong all have products now that provide this functionality. With this much investment from the big players in the tech industry, API management is obviously a priority. Now, why would anyone want to use an API Management tool?

The answer is simple: It allows you to create an API Gateway that you can load all your APIs into, providing a single source to query and curate. API Management makes life as an admin, a developer, and a consumer easier by providing everything for you in one package.

Azure API Management

Azure API Management logo

What does Azure API Management provide? Azure API Management (APIM) is a cloud-based PaaS offering available in both commercial Azure and Azure Government. APIM provides a one-stop-shop for API authority, with the ability to create products, enforce policies, and utilize a robust developer portal.

Not only can API Management integrate seamlessly with your existing Azure infrastructure, but it can also manage APIs that exist on-prem and in other clouds. APIM is also available in both the IL4 and IL5 environments in Azure Government, which allows for extensibility and management for those working in the public sector.

APIM leverages a few key concepts to provide its functionality to you as a developer, including:

  • Products
  • Policies
  • Developer Portal

From providing security to leveraging rate-limiting and abstraction, Azure API Management does it all for API consolidation and governance in Azure. Any API can be ingested, and it gets even easier when APIs follow the OpenAPI Format.

What Are Products?

Products are a layer of abstraction provided inside APIM. Products allow you to create subsets of APIs that are already ingested into the solution—allowing you to overlap the use of APIs while restricting the use of individual collections of APIs. This level of compartmentalization allows you to not only separate your APIs into logical buckets but also enforce rules on these products separately, providing one more layer of control.

Product access is very similar to Azure RBAC—with different groups created inside of the APIM instance. These groups are yet another way for APIM admins to encapsulate and protect their APIs, allowing them to add users already associated to the APIM instance into separate subsets. Users can also be members of multiple groups, so admins can make sure the right people have access to the right APIs stored in their APIM instance.

What Are Policies?

Policies are APIM’s way of enforcing certain restrictions and providing a more granular level of control. There is an entire breadth of policies available in APIM, ranging from simply disallowing usage of an API after it has been called five times, to authentication, logging, caching, and transformation of requests or responses from JSON to XML and vice versa. Policies are perhaps the most powerful function of APIM and provide the control that everyone wants and needs. Policies are written in XML and can be easily edited within the APIM XML editor. Policies can also leverage C# 7 syntax, which brings the power of the .NET Framework to your APIM governance.
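As a rough sketch (the limit values and the transform are illustrative, not a recommendation), a policy document that combines two of the capabilities mentioned above, throttling after five calls and a JSON-to-XML response transform, could look like this:

<policies>
    <inbound>
        <base />
        <!-- Throttle: allow 5 calls per 60-second window per subscription -->
        <rate-limit calls="5" renewal-period="60" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- Transform the backend's JSON response to XML -->
        <json-to-xml apply="always" consider-accept-header="false" />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>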

What Is the Developer Portal?

The Azure API Management Developer Portal is an improved version of the Swagger documentation that’s generated when you use the OpenAPI spec. The Developer Portal provides an area for developers to readily see APIs, products, and associated applications. The Portal also provides sample request bodies (no more guessing API request structures!) and responses, along with code samples in many different languages.

Finally, the portal also allows you to try API calls with customized request bodies and headers, so you have the ability to see exactly what kind of call you want to make. Along with all that functionality, you can also download your own copy of the OpenAPI Spec for your API after it’s been ingested into your instance.

Why Should I Use APIM?

Every business should be using some form of API Management. You’ll be providing yourself a level of control previously not available. By deploying an API Gateway, that extra layer of abstraction allows for much tighter control of your APIs. Once an API has been ingested, APIM provides many additional functionalities.

First, you can match APIs to products, providing a greater level of compartmentalization. Second, you can add different groups to each product, with groups being subsets of users (e.g., back-end devs, billing devs, etc.). Third, you automatically get a robust developer portal, which provides all of the functionality of the Swagger portal but with added features, such as code snippets. Finally, APIM also has complete integration with Application Insights in commercial Azure, providing access to a world-class logging and visualization tool.

Azure API Management brings power to the user, and no API should be left out.