
INET Dynamics

Dynamics 365 Finance and Operations Tips & Tricks: Database Logging – Part 3

 

Categories: Dynamics 365
Tags:

 

This is the third part of our Bring Your Own Database (BYOD) into Dynamics 365 series (read Part 1 and Part 2 here!). In the second part, we introduced an overview of BYOD, its functionality, and how to set it up.

In this third part, I will demonstrate some scenarios of using BYOD.

1.   Export Data to BYOD

In order to use a data entity with BYOD, it needs to be published in the Data Management workspace > Data Entities.

Publishing a data entity

After publishing successfully, the Published field is ticked. If you have several BYODs, you can select them and publish to all of them.

The publishing step is essential because it defines the entity schema on the destination database. Before inserting any records into the database, the structure of the entity must first be created.

Then we can create a new export project to test. Depending on your business requirements, you can export the data with either a full push or an incremental push.

2.    Full push

This, as expected, exports all records of the data entity to BYOD. Here, for demonstration purposes, we will work with only one data entity.

For the export action, you can see that the export job can either be triggered right away with Export now or configured to run in the background as a batch job with Export in batch.

We will go with Export now first; an example of setting up the batch job follows later.

The export job runs and you can see the record counts. All customers are exported successfully.

You can now see the data loaded into the table in the external database. The table was created in the entity publishing step.

3.    Incremental push:

A full push deletes all existing records from an entity and then inserts the current set of records from the selected entity. This ensures you have the latest data; however, for large data sets it incurs significant overhead and takes more time.

 

Sometimes your solutions might need near real-time data or need to track data regularly. Incremental push supports this requirement by tracking the changes of an entity and exporting only the changed records.

 

First, by default, the system does not track changes to entities. We have to enable change tracking for our desired entities by going to the Data Management workspace > Data Entities again. Here, for each entity, we can enable Change Tracking. There are three different options for change tracking, which decide when a change qualifies for an incremental push.

  • Enable primary table: an incremental push is triggered only when changes are made to the primary table of the entity. If only non-primary tables have changed, running an incremental push export job will not update or insert any records in BYOD. Once a change to the primary table does trigger the insert or update, however, the action writes all pending changes, for both non-primary and primary tables.
  • Enable entire entity: tracks every change
  • Enable custom query: this option is for a developer to track only a specific set of fields for their custom solution.
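As a rough illustration, the trigger rules of the three options can be sketched as follows. This is a simplified local model, not the actual D365FO change-tracking implementation, and the table and field names used are hypothetical:

```python
# Simplified model of when a change qualifies for an incremental push
# under each change-tracking option. Not the real D365FO implementation.

def qualifies(option, changed_tables, primary_table,
              tracked_fields=None, changed_fields=None):
    if option == "primary_table":
        # Only a change to the entity's primary table triggers the push.
        return primary_table in changed_tables
    if option == "entire_entity":
        # Any change to any table backing the entity triggers the push.
        return bool(changed_tables)
    if option == "custom_query":
        # Developer-defined: only the selected fields are tracked.
        return bool(set(changed_fields or ()) & set(tracked_fields or ()))
    raise ValueError(f"unknown option: {option}")

# A change to a non-primary table alone does not qualify under
# "Enable primary table", but does under "Enable entire entity":
print(qualifies("primary_table", {"LogisticsPostalAddress"}, "CustTable"))  # False
print(qualifies("entire_entity", {"LogisticsPostalAddress"}, "CustTable"))  # True
```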

For incremental push, there are some important points to note below.

  • The first run of an incremental push is always a full push. This is logical: since the destination database has no records, all data is considered new changes.
    After that, SQL knows which records exist and what to track for changes.

  • Subsequent runs: all subsequent runs will track changes of the entity and export the changed records.
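The full-push and incremental-push semantics described above can be sketched with a small local simulation. The customer IDs are hypothetical, and this only models the delete-all-then-insert versus changed-records-only behavior, not the actual export engine:

```python
# Local simulation of the full-push vs incremental-push semantics.

def full_push(destination, source):
    """Full push: delete all existing rows, then insert the current set."""
    destination.clear()
    destination.update(source)

def incremental_push(destination, source, changed_keys):
    """Incremental push: export only the records flagged as changed."""
    for key in changed_keys:
        destination[key] = source[key]

byod = {}
fo_customers = {"C001": "Contoso", "C002": "Fabrikam"}

# The first run of an incremental push behaves like a full push:
full_push(byod, fo_customers)

# A new customer is created; only that change is exported on the next run.
fo_customers["C003"] = "Adventure Works"
incremental_push(byod, fo_customers, changed_keys={"C003"})
print(sorted(byod))  # ['C001', 'C002', 'C003']
```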

First, I run the export job a second time. Without any changes, no records are exported, as shown below.

Then I test by creating a new customer and running the export job again. Following the new record creation, the export job performs an insert into the BYOD.

To confirm, you can also find the new record in BYOD now.

4.    Export in batch:

Due to data size, the export job might need to run in the background and/or be scheduled during a low-activity period.

 

Another benefit is combining export in batch with incremental push to keep the latest data of an entity in your BYOD, which meets several business requirements for integration, reporting, and analytics.

You can set the batch job recurrence as you prefer. One tip is to have different batch groups for different sets of actions. For a live environment, you should configure your batch servers and allocate batch groups to servers to balance and optimize the workload for better performance.

After setting up the batch job, it will be added to the batch queue and you can see it in System Administration > Batch jobs.

Here, for a quick demonstration, the job recurrence is every 2 minutes. You can check the batch job history to see all records of batch runs.

In Data Management workspace, corresponding export jobs are also created whenever the batch job runs.

 

If you have not had the chance to follow our previous tips and tricks, read Part 1 here and Part 2 here! I-Net Dynamics is a Microsoft Gold Partner with 25 years of experience in the implementation, training, and support of Microsoft ERP systems. If you are currently using Dynamics 365 and feel a little lost, you can contact us here for assistance and support packages!

 


Dynamics 365 Finance and Operations Tips & Tricks: Database Logging – Part 2

 


 

 

Microsoft Dynamics 365 for Finance and Operations (D365FO) provides the Bring Your Own Database (BYOD) feature, which helps you export data entities to your own external databases to serve different business needs.

Overall, with BYOD, you can complete the following tasks:
● Configure Microsoft Azure SQL databases so that you can export entity data from D365FO into them.
● Export either all the records (full push) or only the records that have changed or been deleted (incremental push).
● Export on a regular schedule by using the recurrence option of the Finance and Operations batch framework.
● Access the entity database by using Transact-SQL (T-SQL), and even extend the database by adding more tables.
● Export entities into multiple databases.

Some common scenarios that BYOD is utilized are:

● Export data from Finance and Operations into your own data warehouse.
● Get data for other analytical purposes that use other tools requiring T-SQL access to data.
● Perform batch integration with other systems.
● Any other purposes of testing, validating, or integrating data.

If you have integrated solutions that require direct T-SQL access to the database, BYOD is the recommended upgrade path.

To use BYOD, you will need to complete several steps as below:
1. Create a SQL database to connect with your D365FO environment
2. Configure the BYOD connection in D365FO client.
3. Identify the data entities that you want to use and publish them (publishing a data entity makes it available for export to BYOD)
4. After the above setup steps, you can create an export in the Data Management workspace with the target format set to your configured external database.

In this article, we will go over how to configure BYOD. In the next articles, I’ll provide more examples of exporting to BYOD and some common errors.

1. Create a SQL database to connect with your D365FO environment

You can create an Azure SQL Database easily on Azure portal.

Dynamics 365 Database

● Go to Azure portal -> SQL databases -> Add

You should also create a SQL user account for signing in to the database. Please remember to note and save the server name, database name, and the SQL user account and password. These details are required when configuring the database in the D365FO client.

● After selecting the new server as above, choose the appropriate Resource group and pricing tier as you want

Here, for example, I'm using a minimal Basic SQL database. However, the tier affects the available functionality: the Basic tier does not support clustered columnstore indexes.

For integration with a business intelligence (BI) tool, Premium databases with the clustered columnstore indexes (CCIs) option can meet your performance needs, since CCIs are in-memory indexes that enhance the performance of read queries, which make up the majority of analytical and reporting workloads.
For general storage or integration purposes, you may find a Standard database sufficient; however, depending on your data size and performance requirements, you can always go for Premium.
● The Azure DB connection string format we use here is ADO.NET
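For illustration, an ADO.NET-style connection string can be assembled like this. The server, database, user, and password values below are placeholders, not real credentials; in practice you copy the exact string from the Azure portal:

```python
# Assemble an ADO.NET-style Azure SQL connection string from placeholder
# values. Copy the real string from the Azure portal and substitute your
# own authentication details.
template = (
    "Server=tcp:{server}.database.windows.net,1433;"
    "Initial Catalog={database};"
    "User ID={user};Password={password};"
    "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
)

conn_str = template.format(
    server="myserver",      # placeholder Azure SQL server name
    database="byoddb",      # placeholder database name
    user="sqladmin",        # placeholder SQL user account
    password="<password>",  # placeholder password
)
print("Initial Catalog=byoddb" in conn_str)  # True
```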

2. Configure the BYOD connection in D365FO client.

Go to System administration > Workspaces > Data management > Configure entity export to database.

To configure a new database, select New, and then enter a unique name and a description for the new database. Then you can paste in the connection string from the Azure portal, remembering to replace the appropriate authentication details (highlighted in the screenshots).
Also, as explained above, I use the Basic database tier here, so I have to turn off the clustered columnstore indexes option. Otherwise, you will get an error.

Select Validate, and make sure that the connection is successful. One common error in this step is not being able to access the Azure server.

The solution for the above error is to enable access in the Azure portal: go to the Azure DB > Firewall settings > Add client IP to allow access.

3. Publish data entities

To export to the Azure DB, data entities must be published to it. Publishing a data entity to the Azure DB creates the tables and necessary structures of the data entity in the Azure DB.
If data entities are not published, an error will show up when you try to export them to the Azure DB.
● Go to System administration > Workspaces > Data management > Data entities
By default, you can see that the value of the Published field is No. To publish a data entity, you can simply select it and click Publish.

4. After successfully configuring the BYOD connection, we can now export data entities to it.

After entities are published to the destination database, you can use the Export function in the Data management workspace. You can create an export project with the target format set to your newly added BYOD.

In the next article, we will go through examples of exporting with full push and incremental push to BYOD and some important behaviors of these options. If you face any issues bringing your own database and require assistance, I-Net Dynamics is a certified Microsoft Dynamics reseller with an experienced team of local support professionals who can assist you!


Dynamics 365 Finance and Operations Tips & Tricks: Database Logging

 


 

Written by Xuan Luan for I-Net Dynamics

What is Database Logging?

Database logging is a feature that allows you to track specific changes to tables and fields. The feature was also available in AX 2012, and there is not much of a difference in setting it up in Microsoft Dynamics 365 for Finance and Operations.

Some example cases of utilizing database logging that we have come across are
● Businesses that want to keep track of changes to specific tables. These tables might hold sensitive information or serve specific business requirements for which they want auditable records of changes.
● Some companies want to monitor user changes in their systems. For example, they want to record employees who have been added to or removed from the system.

Database logging and performance:

The feature is useful in some scenarios; however, as a quick reminder, logging requires resources and careful management, and overusing it can impact performance. Although this document about the performance implications of database logging is for AX 2012, the same applies to Microsoft Dynamics 365 for Finance and Operations.

How to set up database logging

In this blog, we will guide you step by step through setting up database logging to track the roles assigned to users in the system.

1. This should be set up by the Admin user. Only users with the System Administrator role can access the System administration module.

2. Navigate to the feature: Click System administration > Setup > Database log setup.

The database log setup form will open. You might find enabled logs here already. If what you want is not listed, you can add it by clicking New.

3. Click New to start the Logging database changes wizard

4. In the next window, you should be able to find the tables that you want to track. Please note that not all tables are displayed. You can see more by ticking the two checkboxes "Show all tables" and "Show table names" at the bottom, which helps you find the required tables faster.

5. After selecting the table Security User Role, you can set which fields and changes you need. Logging everything is time-consuming and redundant, so you should filter the fields depending on your needs. Here, for demonstration purposes, I'll select all the fields.

6. Clicking Next, in this window you can select which types of changes to track. The changes that can be tracked are insert, update, delete, and rename key. Again, I select everything here.

7. Click Next and then Finish to save. You might see some popups reminding you about performance impacts; choose Yes. After that, you should see an info log confirming that the operation completed.
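Conceptually, each database log record captures which table changed, the type of change, who made it, when, and the per-field old/new values. A rough sketch of that shape (the field names here are illustrative, not the actual log table schema):

```python
# Illustrative model of what a database log record captures.
# Field names are hypothetical, not the real D365FO schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DatabaseLogEntry:
    table: str          # e.g. Security User Role
    change_type: str    # insert, update, delete, or rename key
    user: str
    created: datetime
    changes: dict       # field -> (old value, new value)

entry = DatabaseLogEntry(
    table="SecurityUserRole",
    change_type="delete",
    user="Admin",
    created=datetime(2020, 1, 1, 9, 30),
    changes={"SecurityRole": ("System tracing user", None)},
)
print(entry.change_type)  # delete
```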

How to see logs

You can navigate to System administration > Inquiries > Database > Database log.

1. Let’s test our new database log by changing the roles of a user. Here, I’m the Admin user with the roles below.

2. Let’s remove the role System tracing user

3. Go to the database log form to check. You can click Refresh to get the latest data and also, sort the table to see the results better.

4. You should see the record as below

5. You might also want to check more details of this action by selecting the row and choosing the History tab. To see the full details, click Show cumulative changes, and the full details will be shown correctly as below.

We hope that this article has been helpful in understanding how you might utilize the Database Logging feature in Dynamics 365 for Finance and Operations. Please look forward to our future articles with more tips and tricks for our products. If you are interested in exploring any of our products and services, please contact us here.


3 Tips to Boost Dynamics 365 for Finance and Operations Performance

 


 

Dynamics 365 for Finance and Operations (D365FO) is ERP software targeting large enterprises. The product offers both cloud and on-premises options. Such a complex system serves a huge number of users performing hundreds to thousands of actions each day. Without proper monitoring and tuning, organizations might face slowness and performance issues.

 

The need for performance testing and tuning differs for each company depending on its business and, as a result, its areas of heavy work volume. In this article, we would like to share some common tips, all out-of-the-box features of D365FO, to boost performance.

Batch jobs to boost SQL performance:

In D365FO, SQL performance can affect the system in various scenarios. The system provides two notable system batch jobs that help compress the database and rebuild fragmented indexes.

 

First, in a large enterprise, database volume can grow very quickly, which leads to the need for data volume management. Database compression offers key benefits: reduced storage requirements, better query performance, and faster recovery if needed. The feature is now part of the application, so you can configure and run it from the client.

 

Second, index fragmentation is another issue that grows over time and decreases performance. You can check your environment's index fragmentation from the LCS portal by navigating to the Environment Details page > Environment monitoring > SQL Insights > Queries > Get fragmentation details > Execute.

The system batch job for SQL defragmentation was introduced in Platform Update 22, so you can configure a regular job to rebuild indexes at your preferred time.

 

Overall, the two system batch jobs apply automatically to the whole system, not to a specific legal entity. Our recommendation is to keep the defaults and not be tempted to change parameters such as company, created by, and run by, except for the recurrence of the job. Microsoft's official documentation recommends running the job at least once a week.

 

Some other parameters that can be configured are how long the job will run, how many indexes are rebuilt, and the DTU threshold. The threshold ensures the job does not run when the system is too busy. For example, with the parameters in the screenshot below, the rebuild index batch job will run for at most 120 minutes and target 500 indexes; however, if the DTU level is higher than 50% at the scheduled time, the job will quit early without running, to avoid hogging resources.
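The interaction of these parameters can be sketched as follows. This is a simplified local model with illustrative parameter names, not the actual batch job code:

```python
# Simplified model of the rebuild-index batch job parameters: skip the
# run when DTU utilization exceeds the threshold; otherwise work within
# a time budget and an index-count target. Parameter names are
# illustrative, not the real D365FO field names.

def should_run(current_dtu_pct, dtu_threshold_pct):
    """Run only when the system is not too busy at the scheduled time."""
    return current_dtu_pct <= dtu_threshold_pct

def plan_run(fragmented_indexes, max_indexes, max_minutes, minutes_per_index):
    """Pick as many indexes as fit both the count and the time budget."""
    budget = min(max_indexes, max_minutes // minutes_per_index)
    return fragmented_indexes[:budget]

# At 62% DTU with a 50% threshold, the job quits without running:
print(should_run(current_dtu_pct=62, dtu_threshold_pct=50))  # False

# With a 120-minute window, a 500-index target, and (assumed) one minute
# per index, only 120 of 800 fragmented indexes fit this run:
plan = plan_run(list(range(800)), max_indexes=500,
                max_minutes=120, minutes_per_index=1)
print(len(plan))  # 120
```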

You can refer to this official document for more details of this batch job.

Utilizing clean up jobs:

The amount of data can grow quickly over time, and some of it might no longer be needed. Some intermediate data is used during business transactions, for example, sales or purchase orders. Once the transactions are posted, the update history might no longer be needed and is ready to be cleared. Each module in D365FO usually has cleanup jobs that can be run manually or via batch to clear redundant data and free database space.

 

However, the cleanup actions of business modules affect business data directly, so your team should analyze carefully which data is truly no longer needed before running the jobs. Usually, the jobs are found in each module's Periodic or Periodic tasks tab. A quick search for the keywords "clean up" or "cleanup" will show you a few examples.

Apart from the business modules' cleanup jobs, one very frequent action is running batch jobs. The batch job history can therefore pile up greatly and affect the performance of future batch jobs. It is recommended to run the batch job history cleanup regularly during off-business hours. You can check out this official document about setting up this cleanup job.

Tuning for Data Import Export jobs:

Data management tasks are common to all businesses. Usually, businesses run frequent import and export jobs. As a result, these jobs can generate a huge amount of staging data. You should keep this in mind and regularly delete staging data to reduce database storage as well as enhance DIXF performance. You can find the Staging cleanup feature in the Data management workspace.

 

In conclusion, I hope this article has introduced a few powerful D365FO features that you can utilize to boost your environment's performance. These points are a general approach to keeping your overall system health under control. Depending on your business demands, you might face different scenarios, for example, specific integration requirements, which call for more detailed planning and approaches to performance testing and tuning.