Monday 1 April 2019

Using Azure DevOps & GitHub to Automate SPFx Solution Packaging Processes for MS Teams and SharePoint

In this article, we will understand how SPFx solutions built for MS Teams and SharePoint portals can be automatically packaged and uploaded to the app catalog. This automation is achieved with Azure DevOps pipelines and a GitHub repository for version control. Using DevOps ensures continuous delivery of changes to the target platforms.
  
Remember, the web parts in this solution will be targeted at SharePoint pages and surfaced as tabs in Microsoft Teams. We will use a GitHub repository as the version control tool for storing the solution. In this article, you will see how GitHub can be easily integrated with Azure DevOps to automate the packaging process.
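To give an idea of how a single web part can adapt to both hosts, here is a minimal TypeScript sketch (the class name is hypothetical, not necessarily the one used in the sample repository) that checks the Teams SDK exposed by SPFx 1.8+ to detect whether it is rendered on a SharePoint page or inside a Teams tab:

import { BaseClientSideWebPart } from '@microsoft/sp-webpart-base';

export default class GlobalWebPart extends BaseClientSideWebPart<{}> {
  public render(): void {
    // When the web part is hosted as a Microsoft Teams tab (SPFx 1.8+),
    // the Teams SDK is exposed through this.context.sdks.microsoftTeams.
    const teams = this.context.sdks.microsoftTeams;
    const host: string = teams
      ? `Microsoft Teams (team: ${teams.context.teamName})`
      : 'SharePoint page';

    this.domElement.innerHTML = `<div>This web part is running in: ${host}</div>`;
  }
}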

Create an SPFx solution that is compatible with SharePoint Online. The solution (available on GitHub at https://github.com/nakkeerann/globalspfxsoln) contains an azure-pipelines configuration file template for setting up the build pipeline. The following snapshot shows the configuration used when scaffolding the SPFx solution.
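To give a rough idea of what that pipeline definition can look like, here is a minimal azure-pipelines.yml sketch (not the exact template shipped in the repository; the Node version, paths and task versions may differ). It installs Node.js, bundles and packages the solution, and publishes the generated .sppkg as a build artifact:

# azure-pipelines.yml - minimal SPFx packaging sketch
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  displayName: 'Install Node.js'
  inputs:
    versionSpec: '10.x'

- script: npm install
  displayName: 'Restore dependencies'

- script: |
    npx gulp bundle --ship
    npx gulp package-solution --ship
  displayName: 'Bundle and package the SPFx solution'

- task: PublishBuildArtifacts@1
  displayName: 'Publish the .sppkg package'
  inputs:
    PathtoPublish: 'sharepoint/solution'
    ArtifactName: 'drop'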

Push the code to GitHub using git commands. For example, the following commands were used for my repository.
git init
git add .
git commit -m "first commit"
git remote add origin https://github.com/nakkeerann/globalspfxsoln.git
git push -f origin master


Install Azure Pipelines into your GitHub account from https://github.com/marketplace/azure-pipelines. Select the repositories on which the pipelines should be installed, and authorize Azure Pipelines.

Friday 15 March 2019

Office 365 SharePoint: Validate & Transform XML Data from Enterprise Systems into SharePoint using Azure Logic Apps

In this article, you will understand how XML content from enterprise systems is first validated, then transformed to JSON, and finally pushed into Office 365 SharePoint Online lists. We will be leveraging Azure services to achieve all of this with configuration alone.

The Azure services used will be:
  • Logic Apps
    • The entire flow will be set up here.
  • Integration account
    • Within the integration account, we will store the schema and map files used to validate and transform the data.

In my previous article, you saw how JSON content from enterprise systems flows into SharePoint Online. There, we did not focus on validating the JSON data, since the HTTP trigger validates it by default.

The previous article introduced the features being leveraged here, so let us get directly into the business scenario.


Business Scenario


Assume an enterprise system exposes XML data to multiple systems. We will consider the source system to be an HTTP service (for our POC) and the target system to be SharePoint. The XML data from the source system cannot be fed into SharePoint directly, so we need to transform the XML content into the required format. The following picture depicts the transformation. The data being transformed is a SharePoint list item entry, along with the metadata properties required for the list item.
Data representation between the two systems: XML data from the HTTP service (left) and JSON data compatible with a SharePoint list (right)
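For illustration only (the element and field names here are hypothetical, not taken from the original picture), the shapes on either side of the transformation might look roughly like this:

XML received from the HTTP service:

<Item>
  <Title>Quarterly Report</Title>
  <Department>Finance</Department>
  <Status>Approved</Status>
</Item>

JSON sent to the SharePoint REST API (the __metadata type depends on the name of the target list):

{
  "__metadata": { "type": "SP.Data.ReportsListItem" },
  "Title": "Quarterly Report",
  "Department": "Finance",
  "Status": "Approved"
}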

Wednesday 27 February 2019

Office 365 SharePoint: Converting Enterprise System Data using Azure Integration Account Map Component and Logic Apps

This article explains the advantages of using an Azure Integration account along with Azure Logic Apps for integrating enterprise systems with Office 365 SharePoint.


Azure Integration Account & Logic Apps:


An Azure Integration account provides a way to store and manage integration components/artifacts such as agreements, maps, and schemas. In this article, let us look at leveraging the map component and the advantages of using it.

The map component is used for mapping data from one system to another, where the two systems use different forms of data. These scenarios are handled using Liquid template mapping or XSLT mapping. In this article, Liquid templates are used for mapping the data (a sample template is sketched under the business scenario below).

Azure Logic Apps is leveraged to explain the enterprise integration scenario, along with the usage of the Integration account's map. In the scenario explained below, I have shown the data flow between an HTTP service and SharePoint for easy understanding.



Business Scenario & Mapping:


We are considering two services: an HTTP service that uses JSON to represent the data, and the SharePoint REST API, which also uses JSON, but in a different format. For example, the services are used to move book details from an HTTP POST request into SharePoint. The following picture depicts the data transformation.
Data representation between the two services/systems considered
Note: Assume the HTTP service is another system that posts data. For easier understanding, I have simplified the logic with simple data representations.
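To give an idea of how such a map can look, here is a minimal Liquid template sketch for the book scenario (the field and list names are hypothetical; in a Logic Apps "Transform JSON to JSON" action, the incoming JSON payload is exposed to the template under the content object):

{
  "__metadata": { "type": "SP.Data.BooksListItem" },
  "Title": "{{ content.bookName }}",
  "Author": "{{ content.authorName }}",
  "Category": "{{ content.category }}"
}

The transformed output can then be posted to the SharePoint list through the REST API to create the list item.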

Saturday 9 February 2019

Analyze Office 365 SharePoint online Data using Azure Data Lake Storage and Analytics Service – Part II

In this article, we will understand how Microsoft Flow can be configured to push data from an Office 365 SharePoint list into the Azure Data Lake Storage service. We will also see how the data can be analyzed using the Azure Data Lake Analytics service.

The previous article covered the benefits of using the Azure Data Lake Storage & Analytics services, and walked through configuring these two Azure services.


Setting up Microsoft Flow 


  • Log in to the Microsoft Flow portal. Go to My flows, and select the create-from-blank option. The following snapshot shows the flow being configured.
MS Flow steps to push data from SharePoint to Azure Data Lake Storage
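Once the list data has landed in the Data Lake store, an Azure Data Lake Analytics job can process it with U-SQL. The sketch below is only an illustration, under the assumption that the flow writes the list items out as a CSV file; the file path and column names are hypothetical:

// U-SQL: aggregate the SharePoint list data stored in the Data Lake
@items =
    EXTRACT Title string,
            Department string
    FROM "/sharepoint/listitems.csv"
    USING Extractors.Csv(skipFirstNRows: 1);

@summary =
    SELECT Department,
           COUNT(*) AS ItemCount
    FROM @items
    GROUP BY Department;

OUTPUT @summary
TO "/output/items-per-department.csv"
USING Outputters.Csv(outputHeader: true);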

Friday 1 February 2019

Analyze Office 365 SharePoint online Data using Azure Data Lake Storage and Analytics Service – Part I

This article series helps you understand how to push data from SharePoint Online into Azure Data Lake Storage and then make that data available to the analytics service. Along the way, we will look at the benefits of using the Azure Data Lake Storage and Data Lake Analytics services.


The following steps make up the overall flow:
  • Create Azure Data Lake Storage 
  • Create Azure Data Lake Analytics 
  • Configure Microsoft Flow to push data into Azure Data Lake Storage 
  • Configure Azure Data Lake Analytics service to process the storage data. 

Note: There are plenty of ways to ingest data into Azure Data Lake storage, but here let us leverage Microsoft Flow to easily push data from one system to another. It is just a two-step process.


Why Azure Data Lake Storage & Analytics? 


Before building the solution, let us look at the benefits of using these services. Azure Data Lake Storage is primarily used for big data analytics. Services and solutions that work with big data can be easily integrated with the Azure Data Lake Storage service, and the storage is optimized for big data analytics workloads. The data stored in the Azure Data Lake store is organized as a hierarchical file system.