Sunday, June 25, 2017

Inserting data in On Premises SQL Database using Logic Apps

If you are a BizTalk developer and get to know about any other integration tool/offering that you have to evaluate, the first couple of things that come to mind are (at least in my mind ;))
1. How to route a file from one location to another
2. How to perform CRUD operations on a database

Thus, I thought of creating a simple Logic App which receives Product information and inserts it into a table - but the table is part of a database which is on premises.


The following are needed to do this:

1. Azure Subscription
2. On premises machine with database to be used
3. On Premises Data Gateway installed on the on premises machine and registered on Azure


Now let's create the solution


1. Create a database and a table 



  • To keep it simple, I have created a DB called DemoDb with a table having three columns

Demo Db on premises





2. Install, Configure and start OPDGW


  • If the installation and configuration of the OPDGW is done, then make sure the OPDGW service is started

on premises data gateway service







3. Create Logic App

  • Open the Azure Portal, sign in with your account and on the left side click on New -> Web + Mobile -> Logic App
  • Provide a Name, create/use an existing Resource Group, select a location and click on Create, then on the designer blade select the Blank Logic App template
create logic app
create logic app blank template
  • Now add a Request/Response trigger - a Logic App always starts with a trigger; just as in BizTalk a message creates an instance of an orchestration, a trigger creates an instance of the Logic App
request/response trigger


  • Type Request and select Request/Response from the connectors list; as you see below, this connector has 1 trigger and 1 action associated with it. A rough sketch of the underlying workflow definition follows the screenshot.

request/response trigger actions
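For context, behind the designer every Logic App is just a JSON workflow definition, which we will peek at in code view later. Roughly, the definition of this app will end up with the following shape (the trigger and action names here are only illustrative):

    {
      "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "triggers": {
          "manual": { "type": "Request", "kind": "Http", "inputs": { "schema": {} } }
        },
        "actions": {
          "Insert_row": { "type": "ApiConnection", "runAfter": {}, "inputs": {} }
        },
        "outputs": {}
      }
    }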


  • The Request trigger expects us to define the JSON schema of the request message intended to be received, and the method. Just as BizTalk works internally on XML, Logic Apps work on JSON.
configure request/response trigger actions






  • I haven't provided any schema (thus accepting all valid JSON), but ideally it should be provided - a sample schema is shown after the screenshot below. Once you save the configuration, the URL of the Logic App is generated automatically

save after configure request/response trigger actions
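Had a schema been supplied, a minimal one for a Product message with ID, Description and Price (the fields used later; the types are my assumptions) could look like this:

    {
      "type": "object",
      "properties": {
        "ID": { "type": "integer" },
        "Description": { "type": "string" },
        "Price": { "type": "number" }
      },
      "required": [ "ID", "Description", "Price" ]
    }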

  • Copy the URL; we will need it to invoke this Logic App later. So the first step is done - now click on New step and select Add an action
new step - new action




  • After receiving the request message, the next action is to insert it into SQL, so look for the SQL connector
SQL Connector
SQL Connector Actions


  • Unlike the Request/Response connector, which had 1 trigger associated with it, the SQL connector has no triggers and only actions - seven actions are available as of now. As we intend to insert a record into a SQL table, select the Insert row action

SQL Connector Actions Insert row


  • As this is the first time the SQL connector is used, we need to create a connection to SQL, and for that the Logic App will use the OPDGW. Select Connect via on-premises data gateway, provide a Connection Name, the SQL Server, the DB and the authentication details. (This connection can be reused if an operation on the same table is to be performed anytime later.)










SQL Connector  Insert row
  • Once the connection is established, all the tables under DemoDb are auto-populated. For now only one table exists - the Product table - so select it.

SQL Connector Insert row table

  • All the columns from the table become available, each with a blank textbox against it to hold the value to be inserted

SQL Connector Insert row table - body



  • Just as in a BizTalk orchestration the received message is available to all the following shapes, in Logic Apps the Body of the request trigger is available to the following actions. But we want dedicated values for the columns, so we need to explicitly tell the Logic App where to pick each value from within the body - for that we switch to code view
logic app code view




  • Use @{triggerBody()?} to get access to the JSON received by the trigger, then navigate through your structure to pick out the values (Description, ID and Price), e.g. @{triggerBody()?['Description']}. A sketch of the resulting action is shown below.
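After the edit, the Insert row action in code view ends up looking roughly like this (a sketch only - the action name, connection reference and table path depend on your setup):

    "Insert_row": {
      "type": "ApiConnection",
      "runAfter": {},
      "inputs": {
        "host": {
          "connection": { "name": "@parameters('$connections')['sql']['connectionId']" }
        },
        "method": "post",
        "path": "/datasets/default/tables/@{encodeURIComponent(encodeURIComponent('[dbo].[Product]'))}/items",
        "body": {
          "ID": "@{triggerBody()?['ID']}",
          "Description": "@{triggerBody()?['Description']}",
          "Price": "@{triggerBody()?['Price']}"
        }
      }
    }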


logic app designer view



  • Going back to the designer view, click on Designer and you will see that the workflow has identified these values as coming from the request


That's it - the Logic App is ready to accept a request and insert a row into the table.


4. Test Logic App


  • To test the Logic App we can use Postman, ARC or any other HTTP client. I have used ARC; a sample request body is shown below the screenshot.

ARC tool
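The request itself is a plain HTTP POST to the URL generated earlier, with Content-Type set to application/json and a body along these lines (sample values only):

    {
      "ID": 1,
      "Description": "Demo product inserted via Logic App",
      "Price": 49.99
    }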


  • We can check the status of the trigger and of the Logic App run on the portal

logic app diagnostics


  • And finally, check the DB to confirm that the entry has been made in the table after the insert


5. Error you might encounter



  • You might get the below error while configuring the SQL connector if your On Premises Data Gateway service is stopped, or the machine hosting it is shut down or isn't reachable over the network

opdgw error



Related Post

Thursday, June 22, 2017

Installing and Configuring On Premises Data Gateway - By adding user to Active Directory

This post is intended to walk through the process of installing and configuring the On Premises Data Gateway when you don't have a work account (but you do have a Microsoft account).

It is stated that a Microsoft account can't be used to configure/register the OPDGW; instead you need to use a school or work account. So does that mean a person without a work/school account cannot explore/use the OPDGW? Well, there is a way - you add a user to Active Directory.


Let's see how we do it


1. How to add user to Active Directory

2. Install OPDGW on local machine
3. Configure OPDGW on machine
4. Register OPDGW on Azure Portal


1. How to add a user to Active Directory

  • To add a user to AD, log in to the portal and search for Active Directory
Select Active directory
  • Click on Active Directory and under Quick Links select Add a user




  • Before adding the user, remember you have to use the following pattern:
Say your Microsoft account is xyz@gmail.com, then the new user should be something@xyzgmail.onmicrosoft.com - the user name must end with the onmicrosoft.com domain
On Premises Data Gateway create user
Out of curiosity, I tried to test the validation - it is in place; an error is reported if the pattern is not followed

On Premises Data Gateway create user error



  • Now we need to make the user a Global Administrator - go to All users and select the user added in the above step


On Premises Data Gateway select user


Click on Directory role and select Global administrator in the adjacent blade
On Premises Data Gateway adding user as global admin
  • Next, reset the password of the newly created user. You can use this password to log in to the portal, but you will be prompted to change it at first login (copy this temporary password as it will be needed later)
new user reset password
  • Next is to add the above user as Owner so that the user shares the same subscription - thus everything done by/under this user will use your subscription.
Go to your subscription --> Access control (IAM), add the Role as Owner and select the user created above
On Premises Data Gateway adding user as owner

  • In the notifications, you should see the following
                    notification in Azure portal
  • Now the final step with the user - setting the password. Log in to the Azure portal with the newly created user and you will be prompted to change the password
update new user password

2. Install OPDGW on local machine

On Premises Data Gateway download
  • Run the installation 
On Premises Data Gateway installation program on local machine

On Premises Data Gateway installation start on local machine

On Premises Data Gateway installation path on local machine

On Premises Data Gateway installation done on local machine

3. Configure OPDGW on local machine

  • Here we need to enter the User we created in step 1




On Premises Data Gateway installation done  on local machine

provide work account




If cookies are not enabled then you might see the next screen


cookies error





  • After you enable cookies, you should be able to continue. Provide a name for the OPDGW to be set up and a recovery key


On Premises Data Gateway configuration  on local machine
Here, if you notice, the setup wizard chose the nearest Azure region, West India, based on the proximity of the location (it can be changed, but it is recommended to choose the closest one)


On Premises Data Gateway configuration done on local machine


Now the OPDGW is ready to use; it can be used by Logic Apps, Power BI, PowerApps and Flow.

On Premises Data Gateway Service on local machine

You can also verify on the local machine - go to Services and you can see the gateway service, as shown above



4. Register OPDGW on Azure Portal (associate to the new user created)

  • Log in to the Azure portal with the newly created user and search for On Premises Data Gateway
  • Click on Add to register the gateway you installed on the local machine
  • Provide a Name, Subscription and Resource Group. You will see that the installed OPDGW is already available in the drop-down list (all the OPDGWs configured with the current user are populated)

Register On Premises Data Gateway




  • Click on Create - registration is done and the gateway should now be available to use in the supporting Azure services
  • Just a small pointer (below): while giving the name, a few characters are not supported as of now


Characters not allowed in Name of On Prem Data Gateway














Saturday, June 10, 2017

How to check time taken by each shape in Orchestration

The performance testing team knows how to find performance stats related to a BizTalk application - like how much time it takes to process n messages, how much time an orchestration takes, or the average time taken by an orchestration, etc.

There was quite an interesting ask by the performance testing team - how to check the time taken by each shape in an orchestration.

This question forced me to scratch my head for a while :).

But yes, there is a way to find the time taken by each step - you just need to enable tracking on the orchestration and use the Orchestration Debugger.

To demonstrate how, below is the Orchestration Debugger view of a simple orchestration I have, which upon receiving a message calls a web API, collects the response and archives it locally.

How to check time taken by each shape in Orchestration

As can be seen, the green arrow denotes the start of a shape and the blue one denotes its end, and the start and end times are tracked down to milliseconds.

So the time taken by SndApiRequest would be 11.007-10.633 = 0.374



Monday, June 5, 2017

There were not enough free threads in the ThreadPool to complete the operation

During a stress test, with over 1000 files dropped at the receive location at once, I found 73 suspended instances; upon checking, I found the thread starvation error below


error-There were not enough free threads in the ThreadPool to complete the operation



There were not enough free threads in the ThreadPool to complete the operation

Why it happened


It is clear from the error that the number of threads available was not sufficient to cater for the load of messages pumped in.

What to do


1. The first thought would be to create separate hosts for Receive/Orchestration/Send so that each process has sufficient threads/resources for processing - but here that doesn't seem significant, as this server has only one application deployed, with a simple orchestration calling a web service and collecting the response.


Note: It is a best practice to have dedicated hosts for Receiving, Sending, Orchestration and Tracking.


2. So the next step was to increase threads; you will find the following in the host settings


Maximum engine threads: it denotes the maximum number of messaging engine threads per CPU.



By default it is set to 20; I set it to 30, restarted the host instance and tested again with over 1000 messages. No errors now.

You can change the below value from the Host settings in the Admin Console (Platform Settings -> Hosts -> select the Host -> Settings). You have to restart the host instances afterwards.
host maximum engine thread setting


This option specifies the maximum number of threads that can be used by the End Point Manager (EPM) - the Messaging Engine. The EPM doesn't use all the threads at once; instead it starts with about 10% of this value and, as load increases, adds threads up to the specified value. As load reduces, the number of threads allocated is reduced.

Out of curiosity I tried setting the number of threads to 100 - but it didn't allow it; the limit is 50.



maximum engine thread limit


3. Each host instance has its own set of resources such as memory, handles, and threads in the .NET thread pool. Thus it is important to allocate enough threads to the .NET thread pool associated with an instance of a BizTalk host to prevent thread starvation.

.NET CLR: Use this to update the number of Windows threads available in the .NET thread pool associated with an instance of a BizTalk host. You can change the below values from the Host Instance settings in the Admin Console (Platform Settings -> Host Instances -> select the Host Instance -> Settings). You have to restart the host instance afterwards -

Net clr host settings

Worker threads min/max values:
By default: 5/25
Recommended: 25/100


The values specified for the thread pools are per CPU. For example, setting MaxWorkerThreads to 100 has an effective value of 200 on a 2 CPU computer


Out of curiosity I tried setting the Max Worker/IO Threads to 5000 - but it didn't allow it; the limits are 500/1000 respectively.


maximum limit of IO thread


maximum limit of worker thread


Worker threads are used to handle queued work items and I/O threads are dedicated callback threads associated with an I/O completion port to handle a completed asynchronous I/O request.

So if you see (points 2 and 3), the EPM (Messaging Engine) has its own thread pool, which we can configure using the Maximum engine threads attribute (maximum number of messaging engine threads per CPU), whereas orchestrations and send adapters share the worker threads - XLANG (the Orchestration Engine) does most of its work, including accessing the BizTalk databases to send and receive messages, on .NET thread-pool worker threads, and so do the SOAP/HTTP-based send adapters.

4. The fourth thing to look at would be maxconnection - it controls the maximum number of outgoing HTTP connections that you can initiate from a client to a specific IP address. This can be a bottleneck if BizTalk Server makes a large number of HTTP, SOAP, or WSE requests.

Increasing the maxconnection attribute results in increased thread pool and processor utilization. With the increase in the maxconnection value, a higher number of I/O threads will be available to make outbound concurrent calls to the Web service.

By default: 2
Recommended: 12 * number of CPUs


To increase the number of concurrent connections, you can add/modify the entry in the BizTalk Server configuration file, BTSNTSvc.exe.config (or BTSNTSvc64.exe.config for 64-bit hosts), on each BizTalk server. You can increase/limit this for specific servers being called by adding an additional entry for each, for example:
       <system.net>
          <connectionManagement>
              <!-- limit/raise connections for a specific endpoint -->
              <add address="http://www.xyz.com" maxconnection="2" />
              <!-- default for all other endpoints -->
              <add address="*" maxconnection="20" />
          </connectionManagement>
       </system.net>

Or you can have one entry set for all, as below. Maximum value is 20.


maxconnection in BizTalk config


Note:  Although increasing maxconnection would help BizTalk to push more messages out, the capacity of the downstream system should be taken into account.


Every system has a breaking point; the above steps will help you understand the limits of your server by playing around with various combinations of load and configuration.

Final step - if the above steps are not sufficient, then it's time to scale up/out your server.