After extensive testing, I often found myself in a situation where I had to get rid of some of my test objects in order to get back to a “clean” state of my test environment. In cases like this, I have typically just:
However, cleaning up a connector space this way involves many manual steps, and I wondered whether I could find a method to automate this process and use ILM to delete the affected objects in my target data source.
The article “Understanding Deletions in ILM” provides all the conceptual information you need to understand how deletions are processed in ILM. If you haven’t read it yet, I highly recommend doing so before you proceed with this section. In essence, if you need to delete an object in a data source from ILM, you need to:
Staging a deletion on an object requires a processing chain in ILM that “removes a link relationship between a metaverse object and a connector space object during the outbound synchronization phase”. You can implement such a processing chain by using an operational management agent. The following picture outlines this scenario:
Using an operational management agent to stage a deletion on an object requires two processing cycles:
The design of the operational management agent is extremely simple. No attribute flow is required since the objects of this management agent are only used to trigger some object-level operations. At a minimum, each object requires an anchor attribute. As you have already learned, we need to link our operational object with the managed metaverse object, as outlined in the following picture:
This means that the only synchronization rule you need to configure for the operational MA is the join synchronization rule. The most robust join implementation is based on a unique identifier. In our scenario, the best attribute value for this purpose is the GUID of the related metaverse object. This means that the anchor attribute, which in our scenario is called “ID”, should have the same value as the metaverse GUID of the managed object. This configuration enables you to define a very simple join rule that is based on a direct mapping between the ID anchor attribute and the metaverse object GUID.
With our current design, you can establish robust joins to our managed object as outlined in the following picture:
However, in addition to joining to managed objects, we also need a way to delete the metaverse object. The simplest method to do this is to import deletions for the operational objects. When a deletion for an object is imported and synchronized, the following happens:
The following picture outlines this process:
For a data structure as simple as that of our operational objects, it is sufficient to use a text file as the data source. However, there are some considerations in conjunction with file management agents and the processing of deletions. To import deletions for all objects in a file-based management agent, you can’t just delete all objects from the source data file and run a full import on the file MA. ILM interprets an empty data file as a broken data source and generates an import error. To get around this, you can add one placebo object to your source data file and delete all objects besides the placebo object to import deletions for all other objects.
A better method to track deletions is to add a delta column to the source file and to populate this attribute with a value that indicates a deletion to ILM. As a positive side effect of this implementation, you can use the same data file for the delta and the full import. The delta column is ignored during a full import, which is why the system interprets each record at least as an add.
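For illustration, a source data file for this design might look like the following. The “ID” and “ChangeType” column names and the “Delete” change type value are the ones the script later in this article generates; the GUID values shown here are just placeholders:

"ID","ChangeType"
"{00000000-0000-0000-0000-000000000001}","Delete"
"{00000000-0000-0000-0000-000000000002}","Delete"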
This method enables you to configure the following run profile sequence on your operational management agent to get the required deletions staged:
For a complete cleanup sequence, you can configure one run profile that contains all four run steps. However, for the initial testing, it is probably better to configure four separate run profiles and to verify that the solution works as expected.
To summarize, as operational management agent, you need:
You can configure a management agent like this in 10 minutes; it is really easy.
Designing the operational data source was relatively simple; however, getting the required source data in a supported manner seemed to be a challenge. I was looking for a method I could use for all my scenarios without the need to modify the provisioning code.
When you look at the developer reference, you will find an interesting class called MIIS_CSObject in the WMI Provider Reference. As indicated by the name, this class enables you to extract object information from a connector space. One attribute exposed by this class is “MvGuid”, the GUID of the related metaverse object, which is exactly what I was looking for.
However, how do you get to an MIIS_CSObject? The answer is by querying. The MIIS_CSObject class supports the following restricted set of queries:
Of all these options, only the search with a specified connector space GUID seemed to be applicable to this scenario. However, this query introduces another problem: the need to provide the connector space GUID of the managed object as a search parameter.
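To make this concrete, assuming you already have a connector space GUID, a query of the following shape returns the MvGuid value. The namespace and query text are the same ones used by the complete script later in this article; the GUID is a placeholder:

# Placeholder connector space GUID; the script reads this value from the id
# attribute of each cs-object in the CSExport drop file.
$csGuid = "{00000000-0000-0000-0000-000000000000}"
$mvGuid = (get-wmiobject -namespace "root\MicrosoftIdentityIntegrationServer" -computer "." `
    -query "Select * from MIIS_CSObject where Guid='$csGuid'").MvGuid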
With CSExport, ILM provides a command-line tool to export objects from the connector space of a management agent to a file in .xml format. This tool has some options to filter the exported data. For example, you can configure CSExport to export only disconnectors or to include only the synchronization hologram in the drop file. For more details on CSExport, see the ILM Help.
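For example, a call of the following shape exports the connector space of a management agent into an XML drop file. The management agent name and output path are placeholders; /o:b is the option the script later in this article passes:

csexport.exe "My Target MA" "C:\Temp\drop.xml" /o:b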
To summarize, one option to retrieve the metaverse GUIDs of connectors in a connector space is to:
At this point, you are aware of a possible implementation strategy to extract metaverse GUIDs in a supported manner. Now, we need to look into an option to automate the process of generating the source data file. With PowerShell, Microsoft has introduced a command-line shell that includes a powerful scripting language. It is very simple to run a command-line tool such as CSExport from a PowerShell script.
To run CSExport, you need:
If the location of ILM is not included in your local path variable, you need to provide the complete path to CSExport. You can calculate the path by reading a registry key. For ILM, this key is:
hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters
This key has a string value called “Path” that stores the location of your ILM installation. CSExport is stored in the Bin subfolder. To calculate the CSExport path, you can use the following script code:
set-variable -name RegKey -value "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters" -option constant
$csExportPath = ((Get-ItemProperty "$RegKey").Path) + "Bin\CsExport.exe"
At a minimum, you need to provide the name of the affected management agent as an argument for CSExport. To keep things simple, you should also provide the path to the XML drop file, which should:
PowerShell has a special variable called $MyInvocation that stores some information about the script. For example, to retrieve the path of your script, you can read the $MyInvocation.MyCommand.Path property. Using this property, you can calculate the path of the drop file by replacing the “.ps1” suffix with “.xml”. CSExport generates an error if the target XML file already exists. This is why the script code should delete the file if it exists prior to running CSExport:
$appName = ($MyInvocation.MyCommand.Path).substring(0,($MyInvocation.MyCommand.Path).length - 4)
if(test-path "$appName.xml") {remove-item "$appName.xml"}
When PowerShell has finished running CSExport, you should examine the ExitCode property to see whether an error has occurred. All return values other than 0 indicate an error. In the script, I have added a special handler for an incorrect management agent name since this was the most common error. In terms of parameters that define how PowerShell should run CSExport, I’ve configured the script to hide the command-line window while running the tool. You can remove the related settings if you don't need them.
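For reference, this is the relevant fragment of the complete script shown below; UseShellExecute and CreateNoWindow are the settings that keep the command-line window hidden, and the two ExitCode checks implement the error handling described above:

$startinfo = new-object diagnostics.processstartinfo
$startinfo.filename = $csExportPath
$startinfo.arguments = """$MaName"" ""$appName.xml"" /o:b"
$startinfo.UseShellExecute = $false
$startinfo.CreateNoWindow = $true
$process = [Diagnostics.Process]::Start($startinfo)
$process.WaitForExit()
if($process.ExitCode -Eq -2146232832) {throw (new-object Exception "Management agent not found")}
if($process.ExitCode -Ne 0) {throw (new-object Exception "CsExport error: $($process.ExitCode)")}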
The next step is to process the XML drop file. In the script, I’m only processing objects of a specific type.
$obList = $xmlDoc."cs-objects"."cs-object" | where-object { $_."object-type" -eq $ObjectType}
For each matching object, the script issues a WMI query to ILM to retrieve the MvGuid attribute of the object. If the attribute has a value, the script writes it into a data file:
$mvguid = (get-wmiobject -namespace "$NameSpace" -computer "." -query "$WmiQuery='$($_.id)'").MvGuid
if($mvguid.length -gt 0) {..}
The following script code block shows the complete script to generate the source data file for the operational management agent:
#---------------------------------------------------------------------------
set-variable -name MaName -value "<MA NAME>" -option constant
set-variable -name ObjectType -value "Person" -option constant
set-variable -name RegKey -value "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters" -option constant
set-variable -name NameSpace -value "root\MicrosoftIdentityIntegrationServer" -option constant
set-variable -name WmiQuery -value "Select * from MIIS_CSObject where Guid" -option constant
#---------------------------------------------------------------------------
# Calculate the base name of the drop file and the data file from the script path
$appName = ($MyInvocation.MyCommand.Path).substring(0,($MyInvocation.MyCommand.Path).length - 4)
if(test-path "$appName.txt") {remove-item "$appName.txt"}
if(test-path "$appName.xml") {remove-item "$appName.xml"}
#---------------------------------------------------------------------------
write-host "Starting process:"
write-host "- Exporting data, please wait"
# Run CSExport in a hidden window to generate the XML drop file
$csExportPath = ((Get-ItemProperty "$RegKey").Path) + "Bin\CsExport.exe"
$startinfo = new-object diagnostics.processstartinfo
$startinfo.filename = $csExportPath
$startinfo.arguments = """$MaName"" ""$appName.xml"" /o:b"
$startinfo.UseShellExecute = $false
$startinfo.CreateNoWindow = $true
$process = [Diagnostics.Process]::Start($startinfo)
$process.WaitForExit()
if($process.ExitCode -Eq -2146232832) {throw (new-object Exception "Management agent not found")}
if($process.ExitCode -Ne 0) {throw (new-object Exception "CsExport error: $($process.ExitCode)")}
#---------------------------------------------------------------------------
write-host "- Retrieving data, please wait"
# Write the header of the source data file
"""ID"",""ChangeType""" | out-file -filepath "$appName.txt" -encoding "ASCII"
#---------------------------------------------------------------------------
# Process all objects of the configured object type in the drop file
[Xml]$xmlDoc = get-content "$appName.xml"
$obList = $xmlDoc."cs-objects"."cs-object" | where-object { $_."object-type" -eq $ObjectType}
$i = 1
$matches = 0
$obList | foreach-object -process {
    $x = ([int]($i/$obList.length * 100))
    write-progress -activity "Retrieving data" -status "Please wait" -percentcomplete $x `
        -currentoperation "$i of $($obList.length) objects processed"
    # Retrieve the metaverse GUID of the connector and add it to the data file
    $mvguid = (get-wmiobject -namespace "$NameSpace" -computer "." -query "$WmiQuery='$($_.id)'").MvGuid
    if($mvguid.length -gt 0) {
        """$mvguid"",""Delete""" | out-file -filepath "$appName.txt" -append -encoding "ASCII"
        $matches++
    }
    $i++
}
write-host "- Number of objects added to the data file: $($matches)"
#---------------------------------------------------------------------------
write-host "Command completed successfully"
#---------------------------------------------------------------------------
trap [Exception] {
    Write-Host "`nError: $($_.Exception.Message)`n" -foregroundcolor white -backgroundcolor darkred
    Exit
}
#---------------------------------------------------------------------------
To implement the solution outlined in this article, you need to perform the following steps:
The following sections provide more details about these steps.
The heart of this solution is the source data for the operational management agent. In this article, you will find the script code to generate this data. You should run this script first and verify that it creates the desired results.
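For example, assuming you have saved the script under an arbitrary name such as Generate-CleanupData.ps1 (the file and folder names here are placeholders), you can run it from a PowerShell prompt as follows; the XML drop file and the text data file are created next to the script and share its base name:

cd C:\Scripts\CleanupLab
.\Generate-CleanupData.ps1
# Creates Generate-CleanupData.xml (CSExport drop file) and Generate-CleanupData.txt (source data file)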
All you need to do on the target management agent is to configure the deprovisioning synchronization rule to “Stage a delete on the object for the next export”.
In the section called "Designing the operational management agent", I have already outlined all you need to know about the configuration of the operational management agent. To recap, you need:
The object deletion rule needs to be configured to delete a metaverse object when a connector from the operational management agent is disconnected:
As mentioned earlier, this is necessary to get the deprovisioning synchronization rule triggered on your target object.
To test a cleanup sequence, you need to have the following run profiles configured:
The configuration of run profiles is a very easy task. For the full import and the delta import on the operational management agent, you use the same source data file. Before configuring the run profiles, you should copy the data file to the MaData folder of your operational management agent.
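As a sketch, the copy step could look like this; the source path, the data file name, and the management agent name are placeholders that you need to adjust to your environment, and the install location is read from the same registry value the script uses:

$ilmPath = (Get-ItemProperty "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters").Path
copy-item "C:\Scripts\CleanupLab\Generate-CleanupData.txt" ($ilmPath + "MaData\Operational MA\")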
To perform a cleanup sequence, you need to perform the following steps:
With the solution outlined in this article, you have a basic framework to clean up a connector space in a lab environment. When you have tested a cleanup sequence successfully, you should look into options to fine-tune the implementation. For example, once you have verified that the script to generate the data file works as expected, you could move the script to the MaData folder of your operational management agent and extend it with script code to start the run profiles, as shown in the sketch below. That way, you can fully automate the process of cleaning up your connector space. It is also advisable to filter the processed objects more granularly.
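A minimal sketch of such an extension, using the MIIS_ManagementAgent class of the same WMI provider, could look like this; the management agent name and the run profile names are placeholders, and error handling is omitted:

# Run the cleanup run profiles on the operational MA in sequence (names are assumptions)
$ma = get-wmiobject -namespace "root\MicrosoftIdentityIntegrationServer" -computer "." `
    -class "MIIS_ManagementAgent" -filter "Name='Operational MA'"
"Full Import", "Full Synchronization", "Delta Import", "Delta Synchronization" | foreach-object {
    write-host "Running $_ : $(($ma.Execute($_)).ReturnValue)"
}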
The solution to clean up a connector space in a lab environment that I have outlined in this article is definitely not meant to be a silver bullet. The suggested process has some limitations and can take a while to complete. This is why you should first run the provided script to determine whether the implementation works for you. However, since it is possible to fully automate the cleanup process, this implementation can free up some time and might also motivate you to do more testing.
For more information, please see: