Dynamics CRM front-end server deployment to replace corrupted server

This blog post is about remotely configuring settings in a Windows Server environment as part of a Dynamics CRM front-end server installation.

Scenario

I recently ran into a situation where an on-premises Dynamics CRM front-end server was corrupted and none of the Windows management tools were accessible on that machine. For example, Event Viewer, the Windows Services console, MMC, the IIS management console, CRM Deployment Manager and so on did not start at all. However, the Dynamics CRM services were still running properly on the server. The deployment model in this environment was such that all the front-end server roles were installed on the corrupted server and the CRM databases were on a separate server. The SQL Server itself and the CRM databases were fine. The CRM environment was configured for claims-based authentication in IFD mode.

So, the task here was to install all the CRM front-end services on a fresh Windows server machine.

SSL certificate

I started the Dynamics CRM installation wizard by pointing it to an existing CRM deployment. When that option is used, the installation wizard reads the existing CRM deployment configuration data from the CRM configuration DB and assumes certain settings and configuration options to be the same in the new front-end server installation as in the old one. One of these options is the SSL certificate. The Environment Diagnostics Wizard (EDW) threw an error stating that the existing claims-based authentication is configured to use a certain SSL certificate and that the same certificate must be deployed to the new front-end server. Before running the EDW, I had deployed another, more recent SSL certificate on the new server. Luckily, we were able to retrieve the older SSL certificate from another server where it was in use, so that issue got resolved.

Claims-based authentication and IFD

The next challenge was related to claims-based authentication and IFD. As mentioned earlier, this is a rather simple Dynamics CRM server deployment in terms of server topology. The AD FS service was also deployed on the same corrupted front-end server, which meant that none of the AD FS management tools were accessible either. After the initial SSL certificate issue was resolved, the next error the EDW threw was related to claims-based authentication: “The encryption certificate cannot be accessed by the CRM service account”.

My first instinct was that the CRM service account most likely did not have read privileges to the private key of the SSL certificate. But it turned out that was not the issue here. Rather, this error is caused by a quirk in the CRM server installation: when installing a new front-end server into an existing CRM deployment, IFD and claims-based authentication need to be disabled first. Then the CRM server installation can be done, and afterwards claims-based authentication and IFD can be activated again.

Disabling IFD and claims-based authentication in Dynamics CRM would take two mouse clicks if the CRM Deployment Manager tool were available. But as I mentioned in the beginning, that was not the case here; none of these tools were available on the corrupted server.

PowerShell to the rescue

After a bit of head scratching, I realized that I could use PowerShell to disable IFD and claims-based authentication. But PowerShell did not start on the corrupted server either. However, good old PowerShell can also be used remotely; it just requires enabling PowerShell remoting. This can be done with various tools: locally on the server (for obvious reasons not an option in this case), by using Group Policy, or directly with PowerShell Direct if your server platform is Windows Server 2016 or Windows Server 2019. In my case, however, the server platform was Windows Server 2012. For that, there is PsExec, Microsoft’s free remote-control tool: https://docs.microsoft.com/en-us/sysinternals/downloads/psexec

So I downloaded PsExec, and within seconds I had PowerShell remoting enabled on the corrupted server by executing the following command:

psexec.exe \\RemoteComputerName -s powershell Enable-PSRemoting -Force

You need to have a certain firewall port open for this to work (by default WinRM listens on TCP 5985). I will not get into opening firewall ports remotely here, but depending on the firewall provider, that can naturally be done.
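On the Windows firewall side, a minimal sketch like the following, pushed through PsExec in the same way as above, should enable the built-in WinRM rules (the rule group name and the availability of the NetSecurity module on your platform are assumptions to verify):

# Sketch: enable the built-in Windows firewall rules for WinRM
# (assumes the NetSecurity module, which ships with Windows Server 2012 and later)
Enable-NetFirewallRule -DisplayGroup "Windows Remote Management"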

Disable claims-based authentication remotely

So how do you start a remote PowerShell session? Quite easily; just execute the following and you have a remote session started:

$s = New-PSSession -ComputerName <the remote server name>

Enter-PSSession -Session $s

And now you have a remote session in which you can, for example, browse directories on the remote server and execute scripts there.
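If you only need one-off commands rather than an interactive session, Invoke-Command works against the same session object. For instance, a quick sanity check that the CRM services are still up (the MSCRM* service name pattern is my assumption, not something from the original deployment):

# Run a single command in the existing remote session
Invoke-Command -Session $s -ScriptBlock {
    Get-Service | Where-Object { $_.Name -like "MSCRM*" }
}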

The rest is just like sliding in the water park during the hot summer months: easy and fun. You need to be in the Deployment Administrator role in Dynamics CRM, and next you register the Dynamics CRM PowerShell snap-in:

Add-PSSnapin Microsoft.Crm.PowerShell

There is one more thing you need to do if it turns out that on the old CRM server, the Windows registry value for the Dynamics Deployment Web Service URI, “DeploymentWSUri”, is not set (as was the case here). As the regedit tool did not work on the corrupted server, I needed to connect to the server’s Windows registry remotely. Luckily, regedit gives you this possibility. To configure this missing piece, connect to your old server’s Windows registry, open the registry hive \HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM and add a new value of type String with the following data:

http://yourserver/xrmdeployment/2011/deployment.svc
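Alternatively, since PowerShell remoting is already enabled at this point, the same value can be created without regedit. A minimal sketch, assuming the hive path above and the placeholder URL from the example:

# Create the missing DeploymentWSUri string value in the remote registry
Invoke-Command -Session $s -ScriptBlock {
    New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\MSCRM" `
        -Name "DeploymentWSUri" `
        -Value "http://yourserver/xrmdeployment/2011/deployment.svc" `
        -PropertyType String
}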

Now you are ready to rock with PowerShell and the Dynamics CRM cmdlets. To check the current claims-based authentication settings, the following command can be used:

Get-CrmSetting -SettingType "ClaimsSettings"

That will show you a list of settings related to claims-based authentication.

Next, execute the following commands:

$claims = Get-CrmSetting -SettingType "ClaimsSettings"

$claims.Enabled = 0

Set-CrmSetting $claims

Now the claims-based authentication should be disabled.
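The same pattern should also work for switching IFD off, which, as noted earlier, has to be disabled as well before the installation. This is a sketch on my part, assuming the IfdSettings type exposes the same Enabled property as ClaimsSettings:

# Assumed counterpart for the IFD settings (same Enabled pattern as above)
$ifd = Get-CrmSetting -SettingType "IfdSettings"
$ifd.Enabled = 0
Set-CrmSetting $ifd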

Finally, the installation of CRM

Now you are good to go, and the EDW should pass all the tests without errors. But you do need to restart the Dynamics CRM installation wizard from the beginning if you had it up and running while doing the steps above. Just clicking back and forward to launch the EDW step of the wizard again does not do the trick.

Once the installation is complete and the new CRM front-end server has been patched to the latest update level, your CRM adventures can continue with the brand-new server up and running.

I hope this blog post helps someone in a similar situation, struggling with non-functioning Windows tools and trying to complete things remotely.

How to run an SSIS package with an Excel data source or destination in a 64-bit environment?

So, I had the following scenario for one of our customers:

  • Need to execute an SSIS package with Excel and Dynamics 365 data sources and push the data over to an Azure SQL DB
  • In the dev environment, BIDS is 32-bit

I actually ran into a few different kinds of challenges in deploying the package from the development environment to the production server. It took me a while to find solutions to them, so I thought it might be helpful for others struggling with the same issues to write a small blog post.

How does the data source and destination sensitive information get deployed with the SSIS package?

This is configured in the dev environment in BIDS. It is basically a project option, the protection level, that needs to be set to EncryptSensitiveWithPassword.

You also need to make sure that the SSIS package level option is set to the same value. What this does is include the sensitive information (for example, the data source and destination connection string passwords) in the SSIS package, protected with a password. Then on the execution server, where you execute this package for example with a SQL Agent job, you need to provide this password to be able to see or modify the connection options.
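For reference, the command-line equivalent is dtexec’s /Decrypt option, which supplies the same password at execution time. A sketch with a hypothetical package path and password:

# Hypothetical path and password: /Decrypt provides the password set by
# EncryptSensitiveWithPassword so the protected connection strings can be read
dtexec /File "C:\SSIS\MyPackage.dtsx" /Decrypt "YourPackagePassword"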

What does a project level connection manager mean in SSIS?

The next challenge I had was a project level Dynamics 365 connection manager specified in the SSIS project. This means that data connections using this type of connector do not get included in the SQL Agent job when you specify the SSIS package to be executed. What you need to do is make the connection manager package level instead of project level. This is done in BIDS by right-clicking the connection manager and selecting the “Convert to Package Connection” option. By doing this, all the connections using this connection manager are also available on the execution server side.

How to manage Excel data connections in a 64-bit server environment?

When I deployed the SSIS package to our production server and created a SQL Agent job to execute the package on a schedule, it started to throw errors about the Excel data sources. In detail, the error was “The requested OLE DB provider Microsoft.Jet.OLEDB.4.0 is not registered”.

The resolution is to install the Microsoft Access Database Engine on the server and then set the SQL Agent job to run in 32-bit mode. You can find the Access DB engine download package here: https://www.microsoft.com/en-us/download/details.aspx?id=13255

At least in our case, we needed to install the 32-bit version of the Access DB engine to make this work. I believe this is because BIDS is 32-bit, so it builds the SSIS package as 32-bit as well. Another step to success was to set the “Use 32-bit runtime” option on the SQL Agent job step.
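A quick way to verify the 32-bit side outside of SQL Agent is to run the package with the dtexec that lives under Program Files (x86), which executes the package in a 32-bit process, so the 32-bit Access DB engine provider should load if everything is installed correctly. The paths below are hypothetical; adjust them to your SQL Server version:

# Hypothetical paths: the x86 dtexec forces 32-bit execution of the package
& "C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\dtexec.exe" `
    /File "C:\SSIS\MyPackage.dtsx" /Decrypt "YourPackagePassword"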

With these options set, the package executed successfully and data flows from the Excel files to Dynamics 365.

By the way, absolutely the easiest way to implement these types of scenarios against Dynamics 365 is to use the KingswaySoft Dynamics 365 SSIS Integration Toolkit. I have used it in several projects, and it is by far the best Dynamics migration/integration tool I have come across if you want to develop a no-code migration against Dynamics 365. So I strongly recommend it.

Platform Economy is not only the Game of Big Players

Once again, the room was packed as #digitalist gathered in downtown Helsinki to learn about the revolution of digitalization. For the first time, the session’s primary language was English, although much of the attention was still concentrated on the home grounds of the movement. As Finns are known to be engineer-minded, gadget-loving people, it is only natural that the IoT seminar received more attention than the earlier sessions this year.

However, perhaps shockingly to some, the message was quite the opposite. Kemira Vice President Charlotte Nyström closed the day by concluding it well: it is not about the technology, it is not about the IoT – it is about the culture. A statement that was strongly backed by her presentation explaining how IoT has enabled Kemira’s shift from chemical provider to plant process operator as a service, taking over a large part of the customer’s value chain.

“It is not about the technology, it is not about the IoT – it is about the culture”

But let’s go back to the beginning of such stories. The morning’s opening presentation was given by Telia’s Brendan Ives. His claim was that although much of the IoT’s enabling technology is built by the global giants, implementations are still local. He made good points about using the tech-savvy Nordic countries as a sandbox for the larger implementations. As markets mature, the solutions need to get smarter. All layers of the stack must have open service interfaces, reminded Matti Seppä from Landis+Gyr, a company that was one of the IoT pioneers, providing smart boxes for electricity companies already 20 years ago. The simple fact is that today no single player can fulfill the needs of all individuals. Therefore, even the pioneers must open their platforms to other players in the ecosystem.

One great example of such thinking is the story of Fredman Group – thanks to whom even kitchens have a story to tell. This is made possible with IoT, but getting there is not about technology; it is about a new way of thinking and of measuring value.

The simple fact is that today no player can fulfill the needs of all individuals

Once a company famous for its quality plastic and paper kitchen accessories, Fredman impressed many believers in modern management culture back in 2015. Company CEO Peter Fredman stated at the same #digitalist arena that their organization hierarchy had been turned upside down: at the top of the heap sat the customer, and the CEO acted as a janitor supporting the organization in providing the best possible value for the user. Although they were not quite sure of all the steps, instead of concentrating on just a tiny piece of the food creation process, they set the goal of fighting for the best flavors.

Many in the audience were confused by such a story in a very technology-oriented seminar. However, the point was that the company set out to design the value chain of how food is created. Similarly to customer experience, they wanted to optimize the ingredient experience in order to create a perfect kitchen. Naturally, that is not a task for a local plastic-wrapping player in Finland, but it is certainly doable by combining technology-enabled insights, know-how and professional networks.

After proving their point by bringing more intelligence into kitchen management through IoT, it is only natural to step up a gear. Having a perfect meal on a plate is a much broader problem than just cooking the dish. There is a whole ecosystem of equipment providers, logistics managers, storage regulations, quality requirements and more involved. For all of that to function with minimum friction, there is a need for an ecosystem platform: a place where new insights, know-how and professional networks can be brought together in an economically feasible way. A marketplace for critical vendors along the path of the ingredient experience. A service for individual people to learn new skills. And a single point of information for monitoring operational excellence.

Such a platform must be built on globally dominant technologies. However, the innovation, culture shift and specific value promise are created and sandboxed locally… until the platform is given a chance to expand into global markets, creating a new layer of value for all the players in a very traditional industry. And perhaps even disturbing the ecosystem for good.

The Cornerstones of Competitiveness

The productivity and the number of working hours decide the game. Finland’s Competitiveness Pact encouraged companies to increase the number of working hours. I believe, however, that a rather large share of so-called high-value-added companies chose to use the pact above all in ways that improve the productivity of each working hour. This was also Cloudriven’s choice.

We believe the secret of more productive working hours comes down to the following elements:

  1. People’s work is directed as directly and as fully as possible at the customers who pay for the company’s products or services.
  2. People know how to use information and technology to produce greater customer value, both alone and together.
  3. People are doing well.

The share of work directed at customers must naturally be quite high regardless of role. If product development does not understand how the customer uses the application, the application will not amount to much. Nor have I heard of billable project work, customer service work, or marketing and sales work that would become very productive in isolation from the customer. Not all customer work is directly billable, of course, but done well, all customer work can generate billing in the foreseeable future. Cloudriven’s way of taking care of work that produces value for customers comes down to trust, weekly leadership routines and making the key information flows part of every Cloudriven employee’s week. Thanks to Eero Markelin, who wrote about his experiences of a one-day job swap at Cloudriven.

You don’t have to be much of a university researcher to understand that the ability to use information and technology improves an individual’s ability to produce value for the customer. Helsingin Sanomat published an excellent article on productivity in the construction industry. The industry’s technology has advanced and continues to advance, but according to the article, no great leaps in productivity have been taken in the last 40 years. Nor, according to the article, were services doing much better. To me, the most essential message of the article is that to improve productivity, you have to address 1) collaboration and information flows, 2) the ability to use technology, and 3) leadership. All three areas play a central role in how we operate, as Eero’s blog post already showed. For technology adoption, we use several mutually supporting methods aimed at improving our ability to work more productively across the organization. One of these is our TrainEngage service, which has been well received in the market and guides people to use browser-based applications sensibly.

Last but not least is people’s well-being. Our view of the human being is holistic, and many of our practices aim to make Cloudriven a good place to work. In addition, we apply the Competitiveness Pact mentioned at the beginning, according to each individual’s choice, to either exercise or learning something new.

With these priorities, Cloudriven’s start to the year has, in our view, been at a satisfactory level. Our sales have grown by a good 70% compared to the reference year, and revenue development also looks positive. Naturally, we invest significantly in our growth, but not beyond what our cash position can carry. For the financial results we have our choices to thank, but above all our employees and customers. Satisfied people produce value for customers.

How to set Dynamics 365 related entity fields using quickview forms

Challenge

You have a form in Dynamics 365 on which you need to show field values from a related entity. It is naturally possible to do this by making a request to the Dynamics server-side APIs from client-side code. There are a few ways to do that, but I will not focus on those in this blog post.

Instead of server-side API requests, there is a simpler way that utilizes Dynamics quickview forms and JavaScript. This way you don’t need to make any API requests in code.

The scenario could be one where you have, for example, a Dynamics 365 Case entity form on which you need to show the related Product and Account entity values in lookup fields. These Product and Account lookup fields would get loaded when a Customer Asset entity lookup field value changes. This means that when the user selects a customer asset for a case, the product and customer information gets loaded onto the form automatically, without the user having to select them manually in individual lookup fields. When using just a quickview form, the fields in it are read-only. That is why you need separate Account and Product lookup fields on the Case entity form, into which you set the values from the quickview form through JavaScript.

Solution

Below are the steps to do this:

  1. Create a quickview form for the Customer Asset entity. Add two Customer Asset entity fields to this quickview form:
    1. Account (lookup)
    2. Product (lookup)
  2. Insert that quickview form into the Case entity form, set the name of the quickview form to “CustomerAssetProductCustomer”, and set the data source properties of the quickview form according to your field names in the CRM form. You can set the quickview form not to be visible by default.
  3. Add a small piece of JavaScript code (shown below) to the OnChange event of the Customer Asset lookup field on the Case entity form. This JavaScript fetches the Account and Product values from the quickview form and sets them to the corresponding lookup fields on the Case form. While testing this scenario, I noticed that without a small delay in the JavaScript processing, the quickview form was not yet loaded when the JavaScript ran, so it always set the Account and Product lookup values from the previously selected Customer Asset. Adding a half-second delay made the solution work properly.

function populateFieldsFromCustomerAssetRecord()
{
    // Only proceed if a customer asset has been selected
    if (Xrm.Page.getAttribute("yourcustomerassetlookupfield").getValue())
    {
        // Delay slightly so the quickview form has time to load its data
        setTimeout(populateFields, 500);
    }
}

function populateFields()
{
    var quickViewControl = Xrm.Page.ui.quickForms.get("CustomerAssetProductCustomer");
    if (quickViewControl)
    {
        if (quickViewControl.isLoaded())
        {
            // Read the lookup values from the quickview form controls...
            var product = quickViewControl.getControl("msdyn_product").getAttribute().getValue();
            var account = quickViewControl.getControl("msdyn_account").getAttribute().getValue();

            // ...and copy them to the corresponding fields on the Case form
            Xrm.Page.getAttribute("productid").setValue(product);
            Xrm.Page.getAttribute("customerid").setValue(account);
        }
    }
}


At Cloudriven, we help organizations in every step of their Dynamics 365 projects. If you need any help, just contact us. We are here for you!
