A method to clean up a connector space in a lab environment




Testing a configuration in a lab environment is very important.
Prior to bringing a solution into production, you should make sure that it works by testing it with a representative set of objects.
Next to ideal test data, you should also include misconfigured objects in your test procedure to make sure that you don't run into unpleasant surprises later in your production environment.
After extensive testing, you may have objects in a connector space you need to get rid of.
There are several options to accomplish this.
In this article, I’m discussing an implementation I have developed to automate the cleanup process in a lab environment.

Conceptual background information

After extensive testing, I often found myself in a situation where I had to get rid of some of my test objects in order to get back to a “clean” state of my test environment.
In cases like this, I have often just:

  • Deleted the content of the target connector space
  • Deleted the objects in the target data source
  • Redeployed my test objects

However, cleaning up a connector space this way involves many manual steps, and I was wondering whether I could find a method to automate this process and use ILM to delete the affected objects in my target data source.

In the article called “Understanding Deletions in ILM”, you can find all conceptual information you need to understand how deletions are processed in ILM.
I highly recommend reading that article, if you haven't done so already, before you proceed with this section.
In essence, if you need to delete an object in a data source from ILM, you need to:

  1. Stage a deletion on the affected object
  2. Export the deletion request to your data source
  3. Import the deletion from the target data source

Staging a deletion on an object requires a processing chain in ILM that “removes a link relationship between a metaverse object and a connector space object during the outbound synchronization phase”.
You can implement such a processing chain by using an operational management agent.
The following picture outlines this scenario:

Using an operational management agent to stage a deletion on an object requires two processing cycles:

  1. In the first cycle, the object from the operational management agent needs to join to the related metaverse object.
  2. In the second cycle, the deletion of the operational object needs to trigger the deletion of the metaverse object, which gets the deletion staged on the target object.

 

Designing the operational management agent

The design of the operational management agent is extremely simple.
There is no attribute flow required since the objects of this management agent are only used to trigger some object level operations.
At a minimum, each object requires an anchor attribute.
As you have already learned, we need to link our operational object with the managed metaverse object as outlined in the following picture:

This means that the only synchronization rule you need to configure for the operational MA is the join synchronization rule.
The most robust join implementation is based on a unique identifier.
In our scenario, the best attribute value for this purpose is the GUID of the related metaverse object.
This means that the anchor attribute, which in our scenario is called “ID”, should have the same value as the metaverse GUID of the managed object.
This configuration enables you to configure a very simple join rule that is based on:

  • ID as Data source attribute
  • Direct as Mapping type
  • Any as Metaverse object type

With our current design, you can establish robust joins to our managed object as outlined in the following picture:

However, next to joining to managed objects, we also need to delete the metaverse object.
The simplest method to do this is to import deletions for the operational objects.
When a deletion for an object is imported and synchronized, the following happens:

  1. The operational object is deleted in the connector space, which removes the link to the metaverse object
  2. The link removal during the inbound synchronization phase triggers the object deletion rule
  3. The object deletion rule deletes the metaverse object, which removes the link to the managed object
  4. The link removal during the outbound synchronization phase triggers the deprovisioning synchronization rule
  5. The deprovisioning synchronization rule stages a deletion on the managed object

The following picture outlines this process:

For a data structure that is as simple as the structure of our operational objects, it is sufficient to use a text file as data source.
However, there are some considerations in conjunction with file management agents and processing deletions.
To import deletions for all objects in a file based management agent, you can’t just delete all objects from the source data file and run a full import on a file MA.
ILM interprets an empty data file as a broken data source and generates an import error.
To get around this, you can add one placebo object to your source data file and delete all objects besides the placebo object to import deletions for all other objects.

Important
In general, it is a bad practice to treat a "not existing" object as an indicator of a deletion.
While connector space objects that have not been reported by the data source are deleted during a full import, this feature was implemented to ensure data consistency - not to track deletions.

A better method to track deletions is to add a delta column to the source file and to populate this attribute with a value that indicates a deletion to ILM.
As a positive side effect of this implementation, you can use the same data file for the delta and the full import.
The delta column is ignored during a full import, so the system interprets each record at least as an add.
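
For illustration, a source data file that follows this approach could look like the following sample; it uses the column layout (ID and ChangeType) described later in this article, and the GUID values are placeholders:

"ID","ChangeType"
"2c1905d8-4b6a-4d07-a09b-2e4a28bba6b8","Delete"
"7f3c2a10-9d55-4f2b-8c1e-0b6a51c33d21","Delete"

Assuming the ChangeType column is configured as the delta column of the management agent and the value "Delete" is mapped to the delete change type, a delta import interprets these records as deletions, while a full import ignores the column and treats each record as a regular add.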

This method enables you to configure the following run profile sequence on your operational management agent to get the required deletions staged:

  1. Full Import
  2. Full Synchronization
  3. Delta Import
  4. Delta Synchronization

For a complete cleanup sequence, you can configure one run profile that contains all four run steps.
However, for the initial testing, it is probably better to configure four separate run profiles and to verify that the solution works as expected.

To summarize, as the operational management agent, you need:

  • A Delimited text file management agent.
  • A source data file with two columns:
    • ID – this is the anchor attribute that has the same value as the GUID of the managed metaverse object.
    • ChangeType – this is the change type attribute used to indicate deletions for the processed object.
  • A join synchronization rule that links an operational object with a metaverse object that has the anchor value as object-id.

You can configure a management agent like this in 10 minutes; it is really easy.

Defining the implementation strategy to generate the source data file

Designing the operational data source was relatively simple; however, getting the required source data in a supported manner seemed to be a challenge.
I was looking for a method I could use for all my scenarios without the need to modify the provisioning code.

When you look at the developer reference, you will find an interesting class called MIIS_CSObject in the WMI Provider Reference.
As indicated by the name, this class enables you to extract object information from a connector space.
One attribute that is exposed by this class is “MvGuid”, which contains the GUID of the joined metaverse object - exactly what I was looking for.

However, how do you get to an MIIS_CSObject?
The answer is by querying.
The MIIS_CSObject class supports the following, restricted set of queries:

  • Search for a connector space object with a specified GUID.
  • Search for a connector space object joined to a metaverse object with a specified metaverse GUID.
  • Search for a connector space object with a specified domain and account name in Active Directory, a global address list, or a Windows NT domain.
  • Search for a connector space object with a specified domain and user principal name in Active Directory, a global address list, or a Windows NT domain.

Of all these options, only the search for a connector space object with a specified GUID seemed to be applicable to this scenario.
However, this query introduces another problem - the need to provide the connector space GUID of the managed object as the search parameter.

With CSExport, ILM provides a command-line tool to export the objects in the connector space of a management agent to a file in XML format.
This tool has some options to filter the exported data.
For example, you can configure CSExport to only export disconnectors or to only include the synchronization hologram in the drop file.
For more details on CSExport, see the ILM Help.
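
For example, a call like the following, in which the management agent name and the drop file path are placeholders, generates the drop file with the same arguments that the script presented later in this article uses:

csexport.exe "My Target MA" "C:\Scripts\CleanupCs.xml" /o:b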

To summarize, one option to retrieve the metaverse GUIDs of connectors in a connector space is to:

  • Use CSExport to generate an XML drop file, which contains the connector space GUIDs of the objects of interest.
  • Use a WMI-based query to extract the metaverse GUID of the managed objects.
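
As a minimal sketch of the second step, assuming the connector space GUID of an object has already been taken from the drop file, the query against the ILM WMI provider could look like this:

$csGuid   = "2c1905d8-4b6a-4d07-a09b-2e4a28bba6b8"   # placeholder: use the id attribute of a cs-object from the drop file
$query    = "Select * from MIIS_CSObject where Guid='$csGuid'"
$csObject = get-wmiobject -namespace "root\MicrosoftIdentityIntegrationServer" -computer "." -query $query
$csObject.MvGuid   # empty for disconnectors, otherwise the GUID of the joined metaverse object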

 

Implementing the strategy to generate the source data

At this point, you are aware of a possible implementation strategy to extract metaverse GUIDs in a supported manner.
Now, we need to look into an option to automate the process of generating the source data file.
With PowerShell, Microsoft has introduced a command-line shell that includes a powerful scripting language.
It is very simple to run a command-line tool such as CSExport from a PowerShell script.

To run CSExport, you need:

  • The path to the tool
  • The arguments to run this tool
  • Some parameters to define how PowerShell should run this tool

If the location of ILM is not included in your local path variable, you need to provide the complete path to CSExport.
You can calculate the path by reading a registry key.
For ILM, this key is:

hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters

This key has a string value called “Path” to store the location of your ILM installation.
CSExport is stored in the Bin subfolder.
To calculate the CSExport path, you can use the following script code:

set-variable -name RegKey -value "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters" -option constant 

$csExportPath = ((Get-ItemProperty "$RegKey").Path) + "Bin\CsExport.exe" 

At a minimum, you need to provide the name of the affected management agent as an argument for CSExport.
To keep things simple, you should also provide the path to the XML drop file, which should:

  • have the same name as the script
  • be stored in the folder of the script

PowerShell has a special variable called $MyInvocation that stores some information about the script.
For example, to retrieve the path of your script, you can read the $MyInvocation.MyCommand.Path property.
Using this property, you can calculate the path of the drop file by replacing the “.ps1” suffix with “.xml”.
CSExport generates an error if the target XML file already exists.
This is why the script code should delete the file, if it exists, prior to running CSExport:

$appName = ($MyInvocation.MyCommand.Path).substring(0,($MyInvocation.MyCommand.Path).length - 4)

if(test-path "$appName.xml") {remove-item "$appName.xml"}

When PowerShell has finished running CSExport, you should examine the ExitCode property to see whether an error has occurred.
All returned values other than 0 indicate an error.
In the script, I have added a special handler for an incorrect management agent name since this was the most common error.
In terms of parameters that define how PowerShell should run CSExport, I've configured the script to hide the command-line window when running the tool.
You can remove the related settings if you don't need them.

The next step is to process the XML drop file.
In the script, I’m only processing objects of a specific type.

$obList = $xmlDoc."cs-objects"."cs-object" | where-object { $_."object-type" -eq $ObjectType} 

For each matching object, the script issues a WMI query to ILM to retrieve the MvGuid attribute of the object.
If the attribute has a value, the script writes it into a data file:

$mvguid = (get-wmiobject -namespace "$NameSpace" -computer "." -query "$WmiQuery='$($_.id)'").MvGuid
if($mvguid.length -gt 0) {..}

The following script code block shows the complete script to generate the source data file for the operational management agent:

#--------------------------------------------------------------------------------------------------------------------
 set-variable -name MaName     -value "<MA NAME>"  -option constant 
 set-variable -name ObjectType -value "Person" -option constant
 set-variable -name RegKey     -value "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters" -option constant 
 set-variable -name NameSpace  -value "root\MicrosoftIdentityIntegrationServer" -option constant
 set-variable -name WmiQuery   -value "Select * from MIIS_CSObject where Guid" -option constant
#--------------------------------------------------------------------------------------------------------------------
 $appName = ($MyInvocation.MyCommand.Path).substring(0,($MyInvocation.MyCommand.Path).length - 4)
 if(test-path "$appName.txt") {remove-item "$appName.txt"}
 if(test-path "$appName.xml") {remove-item "$appName.xml"}
#--------------------------------------------------------------------------------------------------------------------
 write-host "Starting process:" 
 write-host "- Exporting data, please wait" 
 $csExportPath = ((Get-ItemProperty "$RegKey").Path) + "Bin\CsExport.exe"  
 $startinfo = new-object diagnostics.processstartinfo 
 $startinfo.filename        = $csExportPath
 $startinfo.arguments       = """$MaName"" ""$appName.xml"" /o:b" 
 $startinfo.UseShellExecute = $false 
 $startinfo.CreateNoWindow  = $true 
 $process=[Diagnostics.Process]::Start($startinfo) 
 $process.WaitForExit()
 if($process.ExitCode -Eq -2146232832) {throw (new-object Exception "Management agent not found")} 
 if($process.ExitCode -Ne 0) {throw (new-object Exception "CSExport error: $($process.ExitCode)")} 
#--------------------------------------------------------------------------------------------------------------------
 write-host "- Retrieving data, please wait" 
 """ID"",""ChangeType""" | out-file -filepath "$appName.txt" -encoding "ASCII"
#--------------------------------------------------------------------------------------------------------------------
 [Xml]$xmlDoc = get-content "$appName.xml"
 $obList = $xmlDoc."cs-objects"."cs-object" | where-object { $_."object-type" -eq $ObjectType} 
 # Query the ILM WMI provider for the metaverse GUID of each connector and write it to the data file
 $i = 1
 $matchCount = 0
 $obList | foreach-object -process {
    $x = ([int]($i/$obList.length * 100))
    write-progress -activity "Retrieving data" -status "Please wait" -percentcomplete $x `
                   -currentoperation "$i of $($obList.length) objects processed"
    $mvguid = (get-wmiobject -namespace "$NameSpace" -computer "." -query "$WmiQuery='$($_.id)'").MvGuid
    if($mvguid.length -gt 0) {
       """$mvguid"",""Delete""" | out-file -filepath "$appName.txt" -append -encoding "ASCII"
       $matchCount++
    }
    $i++
 }
 write-host "- Number of objects added to the data file: $($matchCount)"
#--------------------------------------------------------------------------------------------------------------------
 write-host "Command completed successfully"
#--------------------------------------------------------------------------------------------------------------------
 trap [Exception] 
 { 
    Write-Host "`nError: $($_.Exception.Message)`n" -foregroundcolor white -backgroundcolor darkred
    Exit
 }
#--------------------------------------------------------------------------------------------------------------------
 

Implementing the solution

To implement the solution outlined in this article, you need to perform the following steps:

  1. Generate the source data
  2. Modify the target MA
  3. Configure the operational MA
  4. Modify the object deletion rule
  5. Configure Run Profiles
  6. Perform a cleanup sequence

The following sections provide more details about these steps.

Generating the source data

The heart of this solution is the source data for the operational management agent.
In this article, you can find the script code to generate this data.
You should run this script first and determine whether it creates the desired results.

Modifying the target management agent

All you need to do on the target management agent is to configure the deprovisioning synchronization rule to “Stage a delete on the object for the next export”.

 

Configuring the operational management agent

In the section called "Designing the operational management agent", I have already outlined all you need to know about the configuration of the operational management agent.
To recap, you need:

  • A Delimited text file management agent.
  • A source data file with two columns:
    • ID – this is the anchor attribute that has the same value as the GUID of the managed metaverse object.
    • ChangeType – this is the change type attribute used to indicate deletions for the processed object.
  • A join synchronization rule that links an operational object with a metaverse object that has the anchor value as object-id.

 

Modifying the object deletion rule

The object deletion rule needs to be configured to delete a metaverse object when a connector from the operational management agent is disconnected:

As mentioned earlier, this is necessary to get the deprovisioning synchronization rule triggered on your target object.

Configuring Run Profiles

To test a cleanup sequence, you need to have the following run profiles configured:

Management Agent       Run Profile
My Operational MA      Full Import
My Operational MA      Delta Import
My Operational MA      Full Synchronization
My Operational MA      Delta Synchronization
Target MA              Export
Target MA              Delta Import

The configuration of run profiles is a very easy task.
For the full import and the delta import on the operational management agent, you use the same source data file.
Before configuring the run profiles, you should copy the data file to the MaData folder of your operational management agent. 
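
As a minimal sketch, assuming the default folder layout below the ILM installation path and an operational management agent called "My Operational MA" (adjust the file path and the MA name to your setup), the copy step could look like this:

# Read the ILM installation path from the registry (same key as the script uses)
$ilmPath = (Get-ItemProperty "hklm:\SYSTEM\CurrentControlSet\Services\miiserver\Parameters").Path
# Copy the generated data file to the MaData folder of the operational MA (paths are placeholders)
Copy-Item "C:\Scripts\CleanupCs.txt" ($ilmPath + "MaData\My Operational MA\")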

Performing a cleanup sequence

To perform a cleanup sequence, you need to perform the following steps:

  1. Full import on the operational management agent
  2. Full synchronization on the operational management agent
  3. Delta import on the operational management agent
  4. Delta synchronization on the operational management agent
  5. Export on the target management agent
  6. Delta import on the target management agent

 

Fine-tuning the implementation

With the solution outlined in this article, you have a basic framework to clean up a connector space in a lab environment.
When you have tested a cleanup sequence successfully, you should look into options to fine-tune the implementation.
For example, when you have verified that the script to generate the data file works as expected, you could move the script to the MaData folder of your operational management agent and extend it with script code to start the run profiles, as sketched below.
That way, you can fully automate the process of cleaning up your connector space.
It is also advisable to filter the processed objects more granularly.
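
The following sketch shows one possible way to do this by starting the run profiles through the Execute method of the MIIS_ManagementAgent WMI class; the management agent names and run profile names are assumptions that have to match your configuration:

$namespace = "root\MicrosoftIdentityIntegrationServer"
# Bind to the management agents via the ILM WMI provider (names are placeholders)
$opMa     = get-wmiobject -namespace $namespace -class "MIIS_ManagementAgent" -filter "Name='My Operational MA'"
$targetMa = get-wmiobject -namespace $namespace -class "MIIS_ManagementAgent" -filter "Name='My Target MA'"
# Run the cleanup sequence on the operational MA ...
"Full Import","Full Synchronization","Delta Import","Delta Synchronization" | foreach-object {
   write-host "$($opMa.Name) - $($_): $($opMa.Execute($_).ReturnValue)"
}
# ... and export/confirm the staged deletions on the target MA
"Export","Delta Import" | foreach-object {
   write-host "$($targetMa.Name) - $($_): $($targetMa.Execute($_).ReturnValue)"
}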

Summary

The solution to clean up a connector space in a lab environment that I have outlined in this article is definitely not meant to be a silver bullet.
The suggested process has some limitations and can take a while to complete.
This is why you should first run the provided script to determine whether the implementation works for you.
However, since it is possible to fully automate the cleanup process, this implementation can free up some time and might also help to motivate you to do more testing.

Additional Information

For more information, please see:

  • Understanding Deletions in ILM
