
Monday, 19 August 2019

Deploying a WebApi to Azure App Service using Azure DevOps

In this article, we will see how to deploy an Azure WebApi using an Azure DevOps pipeline. Refer to the earlier articles for creating an Azure App Service for the WebApi application.

Let's create a new Azure pipeline with the classic editor, choose the repository of the WebApi solution, and then continue to select the Azure Web App for ASP.NET template as shown below. This automatically adds the relevant tasks required for building the application.

As we don't have any test project, we have removed the unnecessary task; refer to the build pipeline configuration below.

Refer to the screenshot below for the CD pipeline configuration; the easiest way to create the release pipeline is by using the Azure Web App deployment template.

Here, we authenticate to the Azure portal with the appropriate subscription and then configure the relevant resource to be deployed using the task.
Let's trigger the build and release pipelines for the AzureWebApi; the deployment was successful and the application was deployed to the Azure App Service.
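For reference, the CI/CD steps described above could also be sketched as a YAML pipeline. The task versions, service connection name, and app name below are illustrative assumptions, not taken from the classic-editor setup in this post:

```yaml
# Hypothetical azure-pipelines.yml for building and deploying an ASP.NET WebApi.
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    restoreSolution: '**/*.sln'
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    configuration: '$(buildConfiguration)'
- task: AzureWebApp@1
  inputs:
    azureSubscription: '<service-connection-name>'  # placeholder
    appName: '<your-app-service-name>'              # placeholder
    package: '$(build.artifactStagingDirectory)/**/*.zip'
```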


Wednesday, 2 January 2019

Liquid Transformation as an Azure Function using DotLiquid

In this article, I will briefly cover achieving Liquid template transformation using DotLiquid in an Azure Function.

Using the Azure SDK, I have created a simple HTTP trigger function. The solution has two projects: LiquidTransformationApp (LiquidTransformation) and AzureFunctions.Helper.

In LiquidTransformation, we extract the request body, map name, and map type from the query string, and invoke the utility helper for the transformation, as shown.

Here is the utility helper class which does the Parse and Render of the Liquid template. I have embedded the Liquid templates as resource files; based on the MapName and MapType, we dynamically execute the transformation and render the transformed result as an HTTP response to the client.

Here, we receive a JSON request which may contain nested JObjects/arrays. To parse the JSON request as a dictionary object, we use ParseJSONHelper; GenericInterface&lt;T&gt; and Generic&lt;T&gt; are used in the Parse and Render stages of the Liquid templates to support dynamic objects.

Here are the JSON-to-JSON and JSON-to-XML transformation Liquid templates.
Let's run and verify the Liquid template transformation.
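The helper itself is C# with DotLiquid; as a rough illustration of the same idea (parse the JSON body into plain dictionaries, then resolve template placeholders against them), here is a minimal Python sketch. The function names and simplified placeholder handling are illustrative, not the post's actual code:

```python
import json
import re

def parse_json_to_dict(raw_body: str):
    """Parse a JSON request body into plain dicts/lists so the
    template can resolve nested fields like customer.name."""
    return json.loads(raw_body)

def render(template: str, model) -> str:
    """Minimal stand-in for a Liquid render step: substitute
    {{path.to.value}} placeholders by walking the parsed model."""
    def resolve(match):
        value = model
        for part in match.group(1).strip().split("."):
            value = value[part]
        return str(value)
    return re.sub(r"\{\{([^}]+)\}\}", resolve, template)

body = '{"customer": {"name": "Ada"}, "total": 42}'
print(render("Hello {{customer.name}}, total: {{total}}",
             parse_json_to_dict(body)))
# -> Hello Ada, total: 42
```

DotLiquid additionally supports filters, loops, and conditionals; this sketch only shows why the request must first be parsed into dictionary-like objects before rendering.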

Friday, 16 November 2018

Migrating BizTalk Transformation To Azure using Azure Function as an API

In this article, I will briefly cover migrating BizTalk transformations as-is to an Azure Function (API), which can then be invoked from a Logic App. Thus the transformation can be executed across different resource groups, regions, and integration accounts.

Using the Azure SDK, I have created a simple HTTP trigger function. The solution has two projects: TrasformationHelper (TransformAPI) and AzureFunctionHelper.

In TransformAPI, we extract the request body and the map name from the query string and invoke the utility helper for the transformation.
In AzureFunctionHelper, I have embedded the XSLT and extension objects as resource files; based on the MapName, we dynamically execute the transformation and render the transformed result as an HTTP response to the client.
Here is the XSLT for one of the transformations, which invokes external assembly methods, as shown. The external assembly method does a DB lookup. Let's add a reference to the external assembly in the Azure Function project.
Let's configure a hybrid connection in the Azure Function: select Networking, click on "Configure your hybrid connection endpoints", then click "Add hybrid connection".
Create a new Hybrid Connection or bind to an existing one. Refer to my earlier article for configuring an Azure hybrid connection to access an on-premise SQL Server.
Let's deploy and test the Azure Function API. It can be invoked from a Logic App to achieve dynamic transformation.
Note: the external assembly is packaged as part of the Azure Function package. This allows us to completely reuse BizTalk transformations as-is.
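The actual helper resolves an embedded XSLT resource by MapName and executes it; the dispatch-by-map-name routing can be sketched as follows (the map names and transform bodies are made up for illustration):

```python
# Sketch of "pick a transform by MapName" dispatch. In the post this
# resolves an embedded XSLT resource; here the transforms are plain
# functions so the routing logic stands on its own.
import json

def product_to_order(body: str) -> str:
    data = json.loads(body)
    return json.dumps({"orderFor": data["product"]})

def product_to_invoice(body: str) -> str:
    data = json.loads(body)
    return json.dumps({"invoiceFor": data["product"]})

# MapName -> transformation, analogous to resolving an embedded resource.
TRANSFORM_MAP = {
    "ProductToOrder": product_to_order,
    "ProductToInvoice": product_to_invoice,
}

def transform(map_name: str, body: str) -> str:
    """Look up the transformation by map name and apply it to the body."""
    try:
        return TRANSFORM_MAP[map_name](body)
    except KeyError:
        raise ValueError(f"Unknown map name: {map_name}")

print(transform("ProductToOrder", '{"product": "Widget"}'))
# -> {"orderFor": "Widget"}
```

The benefit of this shape is that adding a new map is a data change (one more entry), not a code-path change, which mirrors how new XSLT resources can be added to the helper.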

Monday, 12 November 2018

Connecting to On-Prem SQL Server using Azure Hybrid Connection

In this article, we will explore connecting to an on-premise SQL Server from Azure services using Hybrid Connections.

Hybrid Connections allow Azure websites and services to securely connect to on-premise resources hosted within the corporate network, without requiring any change to the firewall or network.
Here is a simple console application, which looks up records from a Product table in an on-premise SQL Server DB. Let's publish and host it in Azure App Service as a WebJob.
Note: here we have used SQL authentication and provided the SQL Server instance along with the default TCP port in the SQL Server connection string.
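As a rough sketch of the connection-string shape this relies on (SQL authentication, explicit host, and the default TCP port 1433; the server, database, and credential values below are placeholders):

```python
def build_sql_connection_string(server: str, port: int, database: str,
                                user: str, password: str) -> str:
    """Build a SQL Server connection string using SQL authentication.
    The hybrid connection endpoint must match the host:port used here."""
    return (
        f"Server={server},{port};"
        f"Database={database};"
        f"User Id={user};Password={password};"
    )

# Placeholder values; the hybrid connection endpoint would then be onpremsql:1433.
print(build_sql_connection_string("onpremsql", 1433, "ProductsDb",
                                  "appuser", "secret"))
```

The important point is that the hostname and port in the connection string are exactly what the Hybrid Connection endpoint must be configured with.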
Now let us configure a Hybrid Connection to get the WebJob up and running properly. Inside the App Service, select Networking and click on "Configure your hybrid connection endpoints", then click "Add hybrid connection".
Create and configure the Hybrid Connection.
Before installing and configuring the Hybrid Connection Manager on the on-premise server, make sure TCP dynamic ports are enabled and the Active and Enabled properties are set to Yes.
Let's install and launch the Hybrid Connection Manager on the server, click on "Add a new Hybrid Connection", log in with the Azure subscription credentials, and select the Hybrid Connection created earlier. Restart the Azure Hybrid Connection Manager service after adding the connection.
Let us run the WebJob and verify the status from the Logs section. As you can see, we have done a lookup of the corresponding Product Id.

Sunday, 12 August 2018

Exposing On-Premise BizTalk Service to Internet with Service Bus & WCF-Relay Adapter

In this article, I will walk through how to leverage the WCF-Relay adapter to expose an on-premise BizTalk service for public access.

Refer to my earlier article for the Boomerang pipeline component, as I will use the same component here.

Let's create a two-way service which receives a request and renders a transformed message. As shown below, create a receive port and a receive location with the WCF-BasicHttpRelay binding, then provide the relay URL and configure the SAS policy under the Bindings tab.

Here is a simple token-provider console application, which generates the access token. This token has to be passed as an authentication header in order to access the service.
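The token provider in the post is a C# console app; the token it produces follows the standard Azure Service Bus SAS scheme (HMAC-SHA256 over the URL-encoded resource URI and an expiry timestamp). A Python sketch of that algorithm, with placeholder namespace, policy, and key values:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri: str, key_name: str, key: str,
                       ttl_seconds: int = 3600) -> str:
    """Generate an Azure Service Bus SAS token of the form
    SharedAccessSignature sr=...&sig=...&se=...&skn=..."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # Sign "<encoded-uri>\n<expiry>" with the SAS policy key.
    string_to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = hmac.new(key.encode("utf-8"), string_to_sign,
                         hashlib.sha256).digest()
    encoded_sig = urllib.parse.quote_plus(base64.b64encode(signature))
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={encoded_sig}&se={expiry}&skn={key_name}")

# Placeholder namespace/policy/key for illustration only.
token = generate_sas_token(
    "https://mynamespace.servicebus.windows.net/myrelay",
    "RootManageSharedAccessKey", "base64-policy-key-here")
print(token)
```

The resulting string goes into the `Authorization`/`ServiceBusAuthorization` header of the request to the relay endpoint.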

Let's generate the token and, using Postman, test the on-premise BizTalk service by posting an HTTP request with a valid request body.
As shown, the transformed response is rendered back to the client. This way, we can access our on-premise service over the internet. The key takeaway is that we can leverage BizTalk transformations from Azure Functions/Logic Apps instead of writing a custom API to achieve complex transformations involving functoids.

Monday, 9 July 2018

Hybrid Adapters BizTalk Server 2016 | Event Hub & Service Bus Messaging Adapters

In this article, I will briefly cover configuring BizTalk Server 2016 hybrid adapters such as SB-Messaging (Queues) and Event Hub. These adapters can be configured on receive and send ports.
Here, I have an Azure Function which acts as a publisher to the Event Hub and the Service Bus queue; you can refer to my earlier Azure Function blog post.
Let's create a simple messaging solution for demo purposes, then add and configure the Event Hub adapter as shown below. By signing into the Azure account, we can configure the appropriate Event Hub instance and its related SAS policy.
Now let's configure the SB-Messaging adapter: provide the SB namespace and the queue name, and in the Authentication tab provide the relevant SAS policy.
Let's create a subscription inside BizTalk at the send ports, with ReceivePortName filters for the Event Hub and SB messages; let's test and verify the end-to-end flow.

Tuesday, 3 April 2018

Connecting On-Premise Service Using On-Premise Data Gateway


In this article, I will briefly cover connecting to an on-premise service (API/WCF service) using the On-Premise Data Gateway.

Let's install and configure the on-premise data gateway.
Configure the data gateway with your account ID (must be a domain user).
As shown below, configure the gateway in the appropriate location and provide the recovery key.
After configuring the gateway on the on-premise system, let's connect to the Azure portal and map the on-premise gateway with the Azure gateway, as shown below. Make sure to choose the same region/location as the on-premise gateway.
Now let's create a custom Logic App connector.
Let's edit the custom connector and configure the on-premise service (for our example, we choose a WebAPI) using Swagger for APIs or WSDL for SOAP services.
Here we connect to the on-premise API: fill in the general information and enable connectivity using the on-premise data gateway as shown below.
On the Security tab, choose No Authentication to keep it simple; on the Definition tab, configure each operation available on the API, which appear as Actions [action methods]; finally, update the connector. Now, using this connector, we are able to connect to on-premise services.
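For reference, a minimal Swagger 2.0 definition for such a products API (the host, path, and operation details below are placeholders, not from this post) might look like:

```json
{
  "swagger": "2.0",
  "info": { "title": "OnPremProductsApi", "version": "1.0" },
  "host": "onprem-server:8080",
  "basePath": "/api",
  "schemes": [ "http" ],
  "paths": {
    "/products": {
      "get": {
        "operationId": "GetProducts",
        "summary": "Returns all products",
        "responses": {
          "200": { "description": "List of products" }
        }
      }
    }
  }
}
```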
Let's create a Logic App and connect to the on-premise service using the custom connector. Here, we will develop a simple request-response flow which pulls all the product information through the on-premise API service.

Let's add an action, choose the custom connector, and select the relevant operation as shown below.
Now let's add a Response action and configure it with the output body of the custom connector.
Test and verify the Logic App.

Monday, 19 February 2018

Hybrid Integration Solution Using BizTalk and Azure Logic App/Azure Func

In this article, I will show how to trigger a Logic App from BizTalk using the Logic App adapter.

Let's create a BizTalk solution and generate a JSON schema using the Product JSON file; it's quite similar to flat-file schema generation, refer below.



In order to send the request to and receive the response from the Logic App, create receive and send pipelines with the JSON Decoder and Encoder components; we pass a JSON request and convert the JSON response to an XML response.
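Conceptually, the JSON Encoder and Decoder components convert between the JSON the Logic App speaks and the XML BizTalk works with. A minimal sketch of that round trip for a flat message (element names are illustrative, and the real components handle namespaces, attributes, and nesting):

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_text: str) -> str:
    """Flatten a one-level XML message into JSON (Encoder direction)."""
    root = ET.fromstring(xml_text)
    return json.dumps({root.tag: {child.tag: child.text for child in root}})

def json_to_xml(json_text: str) -> str:
    """Rebuild the XML message from JSON (Decoder direction)."""
    data = json.loads(json_text)
    (root_name, fields), = data.items()
    root = ET.Element(root_name)
    for name, value in fields.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

msg = "<Product><Id>1</Id><Name>Widget</Name></Product>"
as_json = xml_to_json(msg)
print(as_json)
print(json_to_xml(as_json))
```

The round trip returns the original message, which is the property the receive/send pipeline pair relies on.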


Here is the simple orchestration which receives the product message, invokes the Logic App, and places the response from the Logic App into the out folder. Let's deploy and configure the application as shown below.


Test and verify the process; refer below for the execution of the BizTalk orchestration, Logic App, and Azure Function.

Sunday, 18 February 2018

Logic App | Integrating with Azure Function


In this article, I will integrate an Azure generic webhook function with a Logic App.

Let's create a generic webhook function, generate code from the JSON request as shown, and use it in the function.

This function validates the JSON input and renders a response based on the unit provided.
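The function body isn't reproduced here; a minimal sketch of that validate-then-respond shape (the field names, unit, and conversion below are assumptions for illustration, not the post's actual function):

```python
import json

def handle_webhook(raw_body: str) -> dict:
    """Validate the JSON input and render a response based on the unit,
    mirroring the webhook function's validate-then-respond pattern."""
    try:
        data = json.loads(raw_body)
    except json.JSONDecodeError:
        return {"status": 400, "body": "Invalid JSON payload"}
    if "value" not in data or "unit" not in data:
        return {"status": 400, "body": "Missing 'value' or 'unit' field"}
    if data["unit"] == "celsius":
        converted = data["value"] * 9 / 5 + 32
        return {"status": 200, "body": f"{converted} fahrenheit"}
    return {"status": 400, "body": f"Unsupported unit: {data['unit']}"}

print(handle_webhook('{"value": 100, "unit": "celsius"}'))
# -> {'status': 200, 'body': '212.0 fahrenheit'}
```

Returning an explicit status alongside the body is what lets the Logic App condition step branch on the true/false path later in this post.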

Let's create a Logic App and choose a blank template, as shown below.
Bring in an HTTP Request trigger.
Using a sample JSON payload, we can generate the JSON schema, configure the HTTP verb as POST, and provide the relative path.
Once the HTTP request is configured, we can add a new step, choose an action, and configure the Azure Function. Then pass the HTTP body to the function as shown.
Now, we can validate the response using a condition and control the flow for the true and false paths.
Save the Logic App, push a message using the Postman tool, and validate the behavior.

Logic Apps | Azure Integration | Serverless Compute


In this article, I will explain how to develop a simple Logic App. In Azure, click on New --> Enterprise Integration --> Logic App, as shown below.
Provide the Logic App Name and choose the subscription and resource group.

Once it's created, we can develop the flow either from a blank template or using a common template. Here, we would develop a simple app that polls Twitter for a specific keyword and processes the tweets to an Azure SQL Server DB.

Let's sign in and configure/authorise the Logic App to use the Twitter account for polling; now configure the polling details and the text keyword to poll for.
Let's now add a new step to process the tweet to the SQL DB: bring in a SQL Server "Insert row" action and provide the Azure SQL Server connection details as shown. Here we capture the user name and tweet in the Azure SQL DB table.
Let's run and verify the behaviour: every 3 minutes the Logic App polls, and if any tweets are available, they are processed to the Azure SQL DB.