Currently, when copying PROD data down to a lower environment such as a Tier 2 sandbox or a cloud-hosted dev box, there is no standard way to cleanse the copied data or to repoint PROD-specific endpoints to their TEST equivalents. Some customers do it manually (yup!), some use a SQL script (which requires JIT SQL access on a Tier 2 environment), and some use a custom service.
In this post, I will focus on how I accomplished this with a custom service that can be called from an Azure DevOps pipeline via a PowerShell script and the d365fo.integrations tools. This is a two-part process:
Develop a Custom Service in D365
Note that I will not cover how to create a custom service in Dynamics 365 Finance here. Peter has done a splendid job explaining that in his blog post: Create a Custom Service in D365
When creating this service, and before the code acts on the request object, a validation should be added so that the code in this service is never accidentally run in a PROD environment.
Below is an example of the service class showing how I did it.
using Microsoft.Dynamics.ApplicationPlatform.Environment;

public class ABCDataCleanupServiceClass
{
    public ABCDataCleanupResponse runCleanUpTasks(ABCDataCleanupRequest _request)
    {
        CustParameters custParametersTable;
        var response = new ABCDataCleanupResponse();
        const str tier2sandbox = "sandbox.operations.dynamics.com";
        const str axcloudmachine = "axcloud.dynamics.com";

        // Start by validating that the code only runs in a non-PROD environment
        IApplicationEnvironment env = EnvironmentFactory::GetApplicationEnvironment();
        str currentUrl = env.Infrastructure.HostUrl;

        if (strContains(currentUrl, tier2sandbox) || strContains(currentUrl, axcloudmachine))
        {
            // Cleanup code goes here
            changecompany(_request.parmDataAreaId())
            {
                try
                {
                    // The forupdate select must happen inside the transaction scope
                    ttsbegin;
                    select forupdate custParametersTable;
                    custParametersTable.MandatoryTaxGroup = _request.parmMandatoryTaxGroup();
                    custParametersTable.update();
                    ttscommit;

                    response.parmDebugMessage("Param updated");
                    response.parmSuccess(true);
                }
                catch (Exception::CLRError)
                {
                    System.Exception interopException = CLRInterop::getLastException();
                    response.parmSuccess(false);
                    response.parmErrorMessage(interopException.ToString());
                }
            }
        }
        else
        {
            throw error("This is a production environment; the cleanup service will not run.");
        }

        return response;
    }
}
Create an Azure DevOps Pipeline
In an Azure DevOps pipeline, create a new task of type PowerShell to install the required tools. See the script below:
Install-PackageProvider nuget -Scope CurrentUser -Force -Confirm:$false
Write-Host "NUGET INSTALLED"
Install-Module -Name Az -AllowClobber -Scope CurrentUser -Force -Confirm:$false -SkipPublisherCheck
Write-Host "AZ INSTALLED"
Install-Module -Name d365fo.integrations -AllowClobber -Scope CurrentUser -Force -Confirm:$false
Write-Host "D365FO.INTEGRATIONS INSTALLED"
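If you define the pipeline in YAML rather than the classic editor, the install step can be expressed as an inline PowerShell@2 task. Below is a sketch; the display name is illustrative and the inline script mirrors the commands above:

```yaml
steps:
  - task: PowerShell@2
    displayName: 'Install d365fo.integrations tools'
    inputs:
      targetType: inline
      script: |
        Install-PackageProvider nuget -Scope CurrentUser -Force -Confirm:$false
        Install-Module -Name Az -AllowClobber -Scope CurrentUser -Force -Confirm:$false -SkipPublisherCheck
        Install-Module -Name d365fo.integrations -AllowClobber -Scope CurrentUser -Force -Confirm:$false
```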
Create a separate PowerShell task (or reuse the same one, as per your need) to call the custom service using Invoke-D365RestEndpoint. Note that before the script can access this custom service, you must enable the Azure app's Client ID in D365 by going to the Azure Active Directory applications form and assigning it to a service account with the System administrator role. For registering the app in Azure itself, this blog has step-by-step instructions: App Registration in Azure
Finally, below is an example of the script, with the request parameters in the form of JSON passed as the Payload to Invoke-D365RestEndpoint:
$Payload = '{
"_request" :
{
"DataAreaId": "USRT",
"MandatoryTaxGroup": "true"
}
}'
$oDataToken = Get-D365ODataToken -Tenant $Tenant -Url $D365EnvironmentURL -ClientId $ClientId -ClientSecret $ClientSecret #-RawOutput -EnableException
Invoke-D365RestEndpoint -ServiceName "ABCDataCleanupServiceGroup/ABCDataCleanupService/runCleanUpTasks" -Payload $Payload -Tenant $Tenant -Url $D365EnvironmentURL -ClientId $ClientId -ClientSecret $ClientSecret -Token $oDataToken
The Payload variable contains the parameters that are passed to the custom service in D365 as JSON.
D365EnvironmentURL is the D365 environment where these tasks are to be performed.
oDataToken is generated for the D365 URL using the Get-D365ODataToken function included in the d365fo.integrations tools.
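For completeness, the $Tenant, $D365EnvironmentURL, $ClientId, and $ClientSecret variables referenced above must be defined earlier in the task, typically fed in from pipeline variables or a secret store. The values below are placeholders only:

```powershell
# Placeholder values -- supply your own via pipeline variables or Azure Key Vault
$Tenant             = "contoso.onmicrosoft.com"                               # Azure AD tenant
$D365EnvironmentURL = "https://contoso-test.sandbox.operations.dynamics.com"  # target D365 environment
$ClientId           = "00000000-0000-0000-0000-000000000000"                  # registered app's Application (client) ID
$ClientSecret       = "<client-secret-from-key-vault>"                        # never hard-code this in the pipeline definition
```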
Once the script is run in this example, the MandatoryTaxGroup parameter will have been updated in the target environment.

Are you doing something similar, or have any suggestions to improve this further? Let me know in the comments.