Hi everyone, we are back again to train you on Microsoft Azure SQL databases, and we will also share cost-effective tips and tricks for dealing with multiple databases in Azure.
Microsoft Azure offers multiple data storage options, among which SQL databases are the most suitable for web apps. Azure SQL databases are similar to what we know from SQL Server, but Microsoft has turned the engine into a managed cloud service running within Azure.
Creating and configuring a database
Let’s get started by creating a database, much like we used to do in SQL Server Management Studio. Here, just click SQL databases in the Azure options pane and hit the Add button at the top as shown below:
After hitting the Add button, a familiar Azure blade will appear as shown below:
Provide a unique database name, choose your subscription and an existing resource group from the dropdowns, and select the source for your database, which can be a blank database, the Microsoft sample database, or a backup.
Next, create a database server by selecting the Server option in the Azure blade, choose Create a new server, and fill in the required fields as shown below, and your Azure SQL database server is ready.
The next question it asks is whether we want to use a SQL elastic pool. We choose the Not now option for the moment, but we will explore pools in the Scaling Azure SQL databases section.
Next is the pricing tier. Microsoft provides a wizard with plans ranging from Basic to Premium, differentiated by Database Transaction Units (DTUs), the database performance metric: a blended measure of the CPU, memory, and read/write rates assigned to the database. Choose the pricing tier best suited to your needs; in my case I chose the Standard tier with 10 DTUs and 250 GB of storage, costing 10 USD/month as shown below:
The last option is Collation, which sets the predefined rules used to sort and compare data in the database. A default is pre-selected, and the collation cannot be changed after database creation. Now hit the Create button, and after some processing you will have a database to work with on the newly created database server as shown below:
As you can see, this virtual database server currently has only one database, which was created as a blank database, though I added some schema and data to it for demonstration purposes. Drilling into it, we find a number of interesting properties, such as the DTU utilization graph, which tells us whether the database is under heavy use. If it is, we will know that the DTUs offered by our tier are not sufficient and we probably have to think about growing them. Right now it is not using any DTUs, as it is a brand-new database, as shown below:
You can also restore your database by selecting the Restore option at the top, using one of its coolest features, Point-in-time recovery. Microsoft keeps a running log of each transaction against the database, so if some damage is done to the database you need not worry: just pick a date and time prior to the damage and hit OK, and the database will roll back to that point in time.
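The bookkeeping behind a point-in-time restore is essentially a window check: the requested restore point must fall inside the retention window Azure keeps for the database. A minimal sketch in Python, where the 7-day window is an illustrative assumption (the actual retention depends on your tier):

```python
from datetime import datetime, timedelta

def validate_restore_point(requested: datetime,
                           earliest_restore: datetime,
                           now: datetime) -> bool:
    """Return True if the requested point-in-time falls inside the
    retention window kept for the database."""
    return earliest_restore <= requested <= now

now = datetime(2023, 5, 10, 12, 0)
# Assumed 7-day retention window for illustration only.
earliest = now - timedelta(days=7)

# Damage happened at 11:30; restore to a minute before it.
restore_to = datetime(2023, 5, 10, 11, 29)
print(validate_restore_point(restore_to, earliest, now))  # True
```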
Moving to the Long-term option: since Azure is taking backups of our database, we can choose a specific backup if we need to return the database to a particular state after some disaster.
Azure SQL database features
Let’s explore one of the cool features, Data explorer, which lets us explore the data within a SQL database. Click Data explorer in the database options blade, and we can log in to the database server with a username and password, just like in SQL Server Management Studio, as shown below:
On successful login we can see all the tables, views, and stored procedures within our database, and we can also query tables using Transact-SQL in the query editor, similar to SQL Server Management Studio, as shown below:
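If you would rather query from code than from the portal, a connection string for an Azure SQL database can be assembled as below. This is a sketch with placeholder server name and credentials; the pyodbc usage in the comment assumes that package is installed:

```python
def azure_sql_connection_string(server: str, database: str,
                                user: str, password: str) -> str:
    """Build an ODBC connection string for an Azure SQL database.
    Server name and credentials here are placeholders."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};Uid={user};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;"
    )

conn_str = azure_sql_connection_string("myserver", "mydb",
                                       "admin_user", "P@ssw0rd")
print("database.windows.net" in conn_str)  # True
# With pyodbc installed you could then run the same Transact-SQL you
# would type into the Data explorer:
#   import pyodbc
#   with pyodbc.connect(conn_str) as cn:
#       rows = cn.execute("SELECT TOP 10 * FROM dbo.MyTable").fetchall()
```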
We can also perform CRUD operations on a table by right-clicking it and selecting the Edit option; a window like the one shown below will appear for performing these operations.
This virtual database server runs within the Azure environment, but it feels like your own server, since you can perform all the operations you would on an on-premises SQL Server.
The other cool thing is Geo-Replication. Opening it, we see a cool little map at the top showing Microsoft’s worldwide locations, with the blue checked octagon marking our database server’s region.
Right now we don’t have any geo-replication configured for our database, but Microsoft recommends the East US region as the place to replicate it, because Azure has a concept of paired regions: although connections among all regions are fast, Microsoft gives paired regions extra priority, so data synchronization between them is especially fast.
Let’s say you want to keep a readable copy of your database in East US: simply click it and configure it just as you did when creating a new Azure SQL database. After setting it up you will have two database servers in two different regions, and the two databases will stay synchronized on each transaction.
You can also set the East US region in the failover policy, so that if for any reason your West US database server goes down, the East US database server will serve in its place.
If your data is really important and you need it available at all times, the Geo-Replication feature is the right choice.
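The failover behaviour described above amounts to a simple selection rule: use the primary while it is healthy, otherwise fall back to the readable secondary. A minimal sketch, with made-up region names and health flags:

```python
def pick_server(primary: dict, secondary: dict) -> str:
    """Return the region a client should use under a simple failover
    policy: prefer the primary; fall back to the secondary replica."""
    if primary["healthy"]:
        return primary["region"]
    if secondary["healthy"]:
        return secondary["region"]
    raise RuntimeError("no healthy replica available")

primary = {"region": "West US", "healthy": False}
secondary = {"region": "East US", "healthy": True}
print(pick_server(primary, secondary))  # East US
```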
Azure database import and export
Now let’s look at taking a real database from your local machine and bringing it into Azure SQL databases.
One way of doing this is to use the Import database feature on your SQL database server, as shown below:
On selecting the Import database button, the window below will appear:
Choose your subscription from the dropdown, pick the database file from a container in your storage account, select a pricing tier as you did when creating a new database, and set the collation accordingly. With SQL Server authentication set by default, provide your database server credentials and hit OK; your database will be imported and created as an additional database on the Azure database server.
Next comes the Azure SQL database Export feature. Click the database you want to export, then click the Export button as shown below:
The Export button opens the window shown below. After filling in all the fields and hitting OK, it creates a file of the selected database and stores it in your Azure storage account, from where it can later be downloaded.
Scaling Azure SQL databases
One of the great things about Microsoft Azure services is the ability to scale, and Azure SQL databases can be scaled in two ways: manually and dynamically.
For manual scaling you can simply go to your database and choose the pricing tier from the options blade. The selected database was created at a particular size on the Standard plan, but to increase performance we can simply drag the slider to 150 DTUs, and the price adjusts itself.
Click the Apply button; there will be a brief shutdown and restart, and within a couple of minutes the database is scaled.
Dynamic scaling is useful when there are many databases on your Azure SQL database server and it becomes difficult to monitor each one’s usage, decide which to scale manually, and pay for each individually. This is where the concept of an Elastic pool meets your requirements.
Microsoft introduced the SQL Elastic pool to share resources among all the databases in it, while still letting each database grow in terms of DTUs as its requirements demand.
To create an elastic pool, just click the New pool button at the top of your SQL database server blade as shown below:
A window like the one below will appear, in which you provide a pool name, select an appropriate pricing tier, configure the DTUs per database as required, and add the databases you want in your elastic pool. Hit OK and you are good to go.
Consider an Elastic pool whenever you have multiple databases on your Azure SQL database server: it lets each database burst beyond what you would provision for it individually, and it can really save you money.
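To see why a pool can be cheaper, consider a back-of-the-envelope comparison. The prices and DTU figures below are made-up illustrations, not Azure’s published rates: ten databases that each peak at 50 DTU, but rarely at the same time, can share a 200 eDTU pool instead of each being provisioned for its own peak.

```python
def individual_cost(db_peaks_dtu, price_per_dtu):
    """Provision each database separately for its own peak."""
    return sum(db_peaks_dtu) * price_per_dtu

def pool_cost(pool_edtu, price_per_edtu):
    """Provision one shared pool sized for the combined peak.
    Pool eDTUs cost a bit more per unit, but you need far fewer."""
    return pool_edtu * price_per_edtu

peaks = [50] * 10          # ten databases, each peaking at 50 DTU
print(individual_cost(peaks, 1.0))  # 500.0
print(pool_cost(200, 1.5))          # 300.0
```

Even with a higher per-unit price for pool eDTUs, the shared pool comes out cheaper because the databases don’t peak simultaneously.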
I hope you learned something informative from this blog post and found Microsoft Azure a suitable cloud platform for managing your SQL databases.
In this tutorial, we will cover application settings in Microsoft Azure.
Before jumping to the Application settings menu item, I am assuming you have successfully created an app service and it’s running.
Microsoft supports different programming languages in different versions. The PHP and Java runtimes are not enabled for configuration by default, so if we intend to use them we need to turn them on manually; conversely, if we don’t need PHP (for instance), we can turn it off, which is one tactic to reduce the chance of getting hacked.
We can also choose the platform, either 32-bit or 64-bit. For 64-bit, your application needs to be in the Basic or Standard tier.
We can also turn on WebSockets if our application uses them, for example with Node.js, Firebase, or SignalR.
We can also set the application to Always On, which keeps it loaded and avoids the occasional timeout when the application has been idle.
Managed pipeline version
The IIS managed pipeline version should be left at Integrated, the default, unless we have a legacy app that requires an older IIS pipeline mode.
The concept of affinity means that when visitors arrive for the first time they get tagged with a cookie, and every subsequent time they come in the same session they are directed back to the exact same web instance. So instead of bouncing among, say, 10 running instances, a given user always goes to the same instance every time.
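The affinity behaviour can be simulated in a few lines. The cookie name ARRAffinity matches the one App Service uses; the instance names are made up:

```python
import random

INSTANCES = ["web-0", "web-1", "web-2"]

def route(cookies: dict) -> tuple[str, dict]:
    """Route a request the way ARR affinity does: the first visit picks
    an instance and tags the client with a cookie; later visits in the
    same session go back to that exact instance."""
    if "ARRAffinity" in cookies:
        return cookies["ARRAffinity"], cookies
    instance = random.choice(INSTANCES)
    return instance, {**cookies, "ARRAffinity": instance}

first, cookies = route({})      # first visit: instance chosen, cookie set
second, _ = route(cookies)      # same session: same instance again
print(first == second)  # True
```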
Auto swap means that when we deploy to the staging slot, once the deployment is complete and the application has started, Microsoft Azure automatically swaps the staging environment with the production environment.
If developers want to determine why an application is not working as expected, we can turn remote debugging on so that we can connect Visual Studio and debug the code line by line.
Microsoft Azure automatically turns it off after some time so we don’t leave it on accidentally.
Application settings and connection strings
We can set up our Azure-specific app settings here, and they will override whatever is in the web.config file.
For instance, if we have key a with value 2 in the portal’s app settings and key a with value 1 in the web.config file, then when we deploy the application Microsoft Azure will replace the value of key ‘a’ with 2. We can also mark a setting as a slot setting if we don’t want the key to be overridden from the staging to the production slot.
Connection strings are much the same as app settings, except we need to choose the type, such as SQL Database. The slot setting checkbox works the same as for app settings if we don’t want a connection string to be overridden from the staging to the production slot.
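The override rule above is essentially a dictionary merge in which the portal wins on key collisions. A minimal sketch, with made-up keys and values:

```python
def effective_settings(web_config: dict, portal_settings: dict) -> dict:
    """Portal app settings win over web.config on key collisions,
    which is the behaviour described above."""
    return {**web_config, **portal_settings}

web_config = {"a": "1", "logLevel": "Debug"}
portal = {"a": "2"}
print(effective_settings(web_config, portal))
# {'a': '2', 'logLevel': 'Debug'}
```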
If we have a particular file extension that we want a particular program to handle for us, we can add a handler mapping here. So if we want the PHP engine to interpret any file that ends in .html, we can map that extension to the PHP engine’s executable, all configured within the portal.
Virtual applications and directories
As in IIS, virtual applications and directories are used to define subdirectories that are actually mapped to other physical directories. For example, …/contactus may live at another physical path within the website, so whenever that URL is hit it will be mapped to the physical directory provided.
When we set up a web app for the first time, Microsoft automatically assigns it a default domain name like http://example.azurewebsites.net, which you can see on the Overview section of the web app. But almost always we need a custom domain name, or a subdomain of our main domain, when this URL is not sufficient. For that we go to the Custom domains section, where we can purchase a domain through Azure: select the Buy domain button, go through the process of picking an available domain name and purchasing it, and it will show up within the custom domain names tab of the application. Azure is basically functioning as a registrar here.
Under SSL certificates, we can install a certificate so that our application can run over a secure channel. For that we need to generate a certificate and upload it in order for Microsoft Azure to associate it with the web app. Microsoft provides a complete guide for importing and installing certificates.
That’s all for this post. See you in the next tutorial.
This blog will let you explore how to use the XDS framework in Microsoft Dynamics AX 2012 to restrict user access to data. It enables you to control the data a user can access based on a desired dimension or attribute.
You can use multiple dimensions to restrict user access to data in multiple areas of the application. We will take an example in which we use business unit as the dimension, or attribute, on which we restrict user access to purchase orders.
This will clarify how to use the XDS framework in AX 2012 so that you can then implement it according to your own requirements.
- Create a new query.
- Add the PurchLine data source.
- Under PurchLine, add the DimensionAttributeValueSetItemView data source.
- DimensionAttributeValueSetItemView provides a simplified query element when querying for dimension segment values.
- Expand the newly created DimensionAttributeValueSetItemView data source, add a new range on the DimensionAttribute field, and set (BusinessUnitDimensionAttribute()) in the Value property.
- BusinessUnitDimensionAttribute() is a function in the SysQueryRangeUtil class that returns the business units assigned to the user.
- You can also hardcode a business unit for testing.
- Under DimensionAttributeValueSetItemView, add the MyBusinessUnits data source. Set Fetch Mode to 1:1 and Join Mode to ExistsJoin.
- MyBusinessUnits is a table containing the business units assigned by role.
- Go to Security and add a new role, XDSSecurity.
- Add roles for all the business units you want to assign to users.
- Create a new security policy and set its properties as follows.
- Now go to System administration > Common > Users.
- Select the user and assign the XDSSecurity role and the specific business unit role.
- Assign the Purchasing manager role so the user can see purchase orders.
- Test the scenario: the user should have access only to those purchase orders that belong to the business units whose roles you assigned to him.
- Log in as the user to whom you assigned the role.
- Go to Procurement and sourcing > Purchase orders > All purchase orders.
- Double-click any purchase order.
- On the Financial dimensions tab you can verify that the user has access only to purchase orders belonging to the business units assigned to him.
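The effect of the policy above can be simulated outside AX: a user sees only rows whose business unit is in their assigned set, which is exactly what the exists-join enforces. The user names, business units, and orders below are illustrative:

```python
def visible_purchase_orders(user, user_business_units, purchase_orders):
    """Simulate the exists-join the XDS policy adds: a user only sees
    purchase orders whose business unit is assigned to them."""
    allowed = user_business_units.get(user, set())
    return [po for po in purchase_orders if po["business_unit"] in allowed]

assignments = {"alice": {"BU-001"}, "bob": {"BU-002", "BU-003"}}
orders = [
    {"id": "PO-1", "business_unit": "BU-001"},
    {"id": "PO-2", "business_unit": "BU-002"},
    {"id": "PO-3", "business_unit": "BU-003"},
]
print([po["id"] for po in visible_purchase_orders("bob", assignments, orders)])
# ['PO-2', 'PO-3']
```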
This blog series walks you through the installation and deployment of D365 (AX) on-premises. Once you have provisioned the on-premises project in LCS, you are good to go with the rest. Microsoft has provided a comprehensive document on the setup and deployment of on-premises environments here.
Before you start the setup, the following prerequisites must be in place. Setting them up is out of scope here.
- Active Directory Domain Services (AD DS) must be installed and configured in your network.
- AD FS must be deployed.
- SQL Server 2016 SP1 must be installed on the Power BI machines.
- SQL Server Reporting Services 2016 must be installed in Native mode on the BI machines.
- SQL Server Integration Services must be installed on the AOS machines.
To help with the on-premises deployment, Microsoft has provided a number of PowerShell scripts. Go through the prerequisites section of the document mentioned above and you will find the list of software the scripts install on the machines; in addition, the scripts add firewall rules, SSL certificates, and other necessary items according to the machine types.
- Sign in to LCS.
- On the dashboard, select the Shared asset library tile.
- On the Model tab, in the grid, select the Dynamics 365 for Operations on-premises, Enterprise edition – Deployment scripts – Latest row.
- Select Versions, and then download the latest version of the zip file for the scripts.
- Right-click the zip file, and then select Properties. In the dialog box, select the Unblock check box.
- Copy the zip file to the machine that will be used to execute the scripts.
- Unzip the files into a folder that is named infrastructure.
Describe configurations
The infrastructure setup scripts use the following configuration files to drive the setup.
- Service Accounts that are needed for the application to operate
- Certificates necessary for securing communications
- Database configuration
- Service Fabric cluster configuration
Define your machine configurations in ConfigTemplate.xml.
At least three primary Orchestrator nodes are a must; the rest of the node types can be adjusted as needed.
Define gMSA accounts.
Create A records in DNS for all AOS and Orchestrator nodes, as prescribed in the setup and deployment page mentioned above.
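The A records for the nodes can be sketched in zone-file form. The host names, zone, and IP addresses below are made-up examples, not values from any real deployment:

```python
def build_a_records(hosts: dict, zone: str) -> list[str]:
    """Render DNS A records for the AOS and Orchestrator nodes in
    zone-file syntax: '<host>.<zone>. IN A <ip>'."""
    return [f"{name}.{zone}. IN A {ip}" for name, ip in hosts.items()]

hosts = {"ax-aos-01": "10.0.0.11", "ax-orch-01": "10.0.0.21"}
print(build_a_records(hosts, "contoso.local"))
# ['ax-aos-01.contoso.local. IN A 10.0.0.11',
#  'ax-orch-01.contoso.local. IN A 10.0.0.21']
```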
Scripts for Domain Controller
Create gMSA accounts:
New-D365FOGMSAAccounts -ConfigurationFilePath .\ConfigTemplate.xml
Create the AXServiceUser user in AD DS.
Generate self-signed certificates, or copy the thumbprints of CA-generated certificates.
# Create self-signed certs
.\New-SelfSignedCertificates.ps1 -ConfigurationFilePath .\ConfigTemplate.xml
Use these PowerShell commands for copying thumbprints; don’t use the GUI:
- dir cert:\CurrentUser\My
- dir cert:\LocalMachine\My
- dir cert:\LocalMachine\Root
Set the generateSelfSignedCert tag to false if you are not using self-signed certificates.
# Exports the script files to be executed on each VM into a directory VMs\<VMName>.
.\Export-Scripts.ps1 -ConfigurationFilePath .\ConfigTemplate.xml
At this point you will have a VMs folder in your infrastructure directory, with all the necessary scripts and SSL certificates organized into folders according to the configuration described in ConfigTemplate.xml.
Download the following Microsoft Windows Installers (MSIs) into a file share that is accessible by all VMs.
- SNAC – ODBC driver
- SQL Server Management Studio
- Visual C++ Redistributable Packages for Microsoft Visual Studio 2013
- Access Database Engine 2010 Redistributable
Copy the contents of each infrastructure\VMs\<VMName> folder into the corresponding VM, and then run the following scripts.
- .\Configure-PreReqs.ps1 -MSIFilePath <path of the MSIs>
If Test-D365FOConfiguration.ps1 gives no errors, you are set to install the Service Fabric cluster on your machines. We will discuss the Service Fabric cluster installation in our next post.
Dynamics 365 CRM doesn’t have built-in support for scheduling recurring workflows on a weekly, monthly, or daily basis. However, there is a workaround that uses the CRM workflow “wait” step and a child workflow to accomplish this.
Following are the steps to create a recurring workflow in Microsoft Dynamics 365 CRM:
- First, identify the logic and design of how and when your workflow will be triggered on its first execution.
- Now go to Processes under the Settings tab and create a new process.
- A new dialog will appear; fill in the process name, set the category to Workflow, select the entity, and mark the type as a new blank process.
- The detailed process steps will then appear on the screen. Fill in the required details and add one step with a wait condition, setting the process timeout based on your conditions.
- To make the workflow call itself again, add a Start Child Workflow step and select the same workflow. It will call itself and thus recur.
This procedure requires that the workflow is activated initially: select the trigger point (creation of the record, or update of a field/status). Note that because wait steps are only supported in background (asynchronous) workflows, the workflow cannot be a real-time workflow. If you need to stop the workflow after specific conditions are met, you can check those conditions in the workflow before looping back to the first step.
Note: wait conditions in workflows consume a lot of resources and processing, and may cause performance issues if too many are running at the same time. Consider making them on-demand, or include check conditions in the child workflow to end the recurrence when it no longer needs to run constantly.
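The wait/child-workflow loop, with a stop condition checked each cycle, can be sketched as a simulation (plain Python, not CRM API code; the cap and stop rule are illustrative):

```python
def run_recurring_workflow(action, should_stop, max_cycles=100):
    """Simulate the wait/child-workflow pattern: do the work, 'wait'
    for the timeout, then start the same workflow again as a child,
    checking a stop condition each cycle (as recommended above)."""
    cycles = 0
    while cycles < max_cycles and not should_stop(cycles):
        action(cycles)          # the real workflow steps
        # ... a real workflow would now wait for its timeout ...
        cycles += 1             # child workflow re-starts the process
    return cycles

runs = []
total = run_recurring_workflow(runs.append, lambda n: n >= 5)
print(total, runs)  # 5 [0, 1, 2, 3, 4]
```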
Hope this blog helps you in setting up a recurring workflow in Microsoft Dynamics 365 CRM. Keep visiting http://www.axpulse.com/blogs/ for more exciting news and blogs related to the Dynamics world.