Imports Archives - ServiceNow Guru (https://servicenowguru.com/category/imports/)

Simplifying Data Imports from Third Parties
https://servicenowguru.com/imports/simplifying-data-imports-parties/
Wed, 01 Mar 2017

Recently, along with my Crossfuze colleagues Josh McMullen and Scott Cornthwaite, I performed work for a client that needed to import asset data from several third-party vendors. The company, a global manufacturer of household appliances, has dozens of vendors around the world that supply data (CSV and Excel files) using proprietary formats and column names. The client’s desired future state is to enforce a single format for use by all vendors. But to get control of their hardware and software assets today, they needed a solution that works with multiple vendors and data formats.

We faced a few challenges. Our solution needed to:

  • Allow data imports without elevated roles or system privileges
  • Handle the same kind of data from multiple vendors (e.g. hardware asset imports)
  • Handle data in a variety of file formats including text (.csv) and MS Excel
  • Provide feedback to the client’s IT asset management (ITAM) team
  • Check data quality and handle exception conditions defined by the client
  • Run with minimal input or intervention

We started by creating custom inbound email actions to process emails with attached data files as they were sent from each vendor. But we discovered some big downsides to that approach. The most serious was that there’s no way to validate the accuracy or completeness of data being sent by the vendor. Inbound email processing also leaves the client at the mercy of the vendor for the timing of the imports. Finally, it isn’t very scalable, since a new inbound email action must be created for every additional vendor and/or data source.

What the client needed was a simple way to perform a data import via a Data Source with a file attachment. They were already aware of the Load Data wizard that ServiceNow provides on the System Import Sets application menu, but that solution isn’t very user-friendly, and it requires a lot of manual input each time new data are imported.

Asset Import Wizard

To make it easy for the client’s ITAM team to import their data into ServiceNow, we leveraged the power of the Service Catalog. Specifically, we created a Record Producer to provide a simple front end for importing vendors’ data files. Record producers can be used by anyone, and their visibility can be limited to only interested parties. Files attached to the record producer are automatically attached to the record that is created. And the record producer’s Script field enables powerful data processing.

We configured the record producer as follows:

  • Name: Asset Import Wizard
  • Table: Data Source (sys_data_source)

We added a single variable to the record producer, a lookup select box to allow the ITAM team member to specify the type of import they were performing.

Import Wizard Configurations

The variable gets its list of options from a custom table called Import Wizard Configurations. This table allowed us to build a flexible framework for defining different types of file imports from any vendor.
This table has many fields that are similar to those on a Data Source record. That’s because the record producer’s script queries the Wizard Configuration table for values to use when it inserts a new record in the Data Source table. Here’s how this form breaks down.

  • Vendor references records in the Company (core_company) table where the Vendor field is set to true.
  • Import Type is a choice list of options that describe the kind of asset data being imported (e.g. Hardware, Software, HW end of life disposal, or Lease Contract).
  • Expected Data Format is a choice list that allows either CSV or Excel (.xls) formats to be specified.
  • Data Source Name Prefix is a text field for naming the data source. The record producer’s script automatically appends the current date/time stamp to the prefix when each Data Source record is created.
  • Import Set Table and Transform Map are fields that reference records in those tables. The import set table and transform map to be used for these data imports must be created in advance.
  • Header Row and Sheet Number specify values used when an MS Excel file is the data source. For CSV files, the client just sets these to 1.
  • Active allows the client to deactivate a Wizard Configuration record that is no longer needed. The Lookup Select Box variable on the record producer displays only active records in this table.

Record Producer Script

The record producer’s script is where the wizard’s “magic” happens. The script does several things:

  1. It validates that the wizard is submitted with an attachment of the correct format.
  2. It queries the Wizard Configuration table for the record that is selected in the Import Type variable, and inserts a new record into the Data Source table using values from the Wizard Configuration record.
  3. It loads the data into the import set table and executes the specified transform map, using the ServiceNow helper classes GlideImportSetLoader and GlideImportSetTransformerWorker.
  4. It provides feedback to the user upon successful execution of the script, or displays appropriate error messages if the script encountered errors. Error conditions cause the record producer to abort without creating the Data Source record.

Here is the code we used in the record producer script:


// Verify that an attachment is included
var gr2 = new GlideRecord("sys_attachment");
gr2.addQuery("table_sys_id", current.sys_id);
var oC = gr2.addQuery("table_name", "sys_data_source");
oC.addOrCondition("table_name", "sc_cart_item");
gr2.query();
if (!gr2.next()) {
    gs.addErrorMessage("You must attach a file to submit. Your import submission has been aborted.");
    current.setAbortAction(true);
    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
} else {
    // Get the Wizard Configuration record for the selected import type
    var gr = new GlideRecord('u_pmy_imp_wiz_cfg');
    gr.addQuery('sys_id', producer.import_type);
    gr.query();
    if (gr.next()) {
        if (gr2.getRowCount() > 1) {
            gs.addErrorMessage("You may only attach one file at a time for this import wizard. Your import submission has been aborted.");
            current.setAbortAction(true);
            producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
        } else {
            // Check that the attached file matches the expected format
            var passedFormatCheck = false;
            if (gr.u_format == 'CSV') {
                if (gr2.file_name.endsWith('.csv')) {
                    passedFormatCheck = true;
                } else {
                    gs.addErrorMessage("This import type is expecting submission of a CSV file (.csv), but a different file format was attached. Your import submission has been aborted.");
                    current.setAbortAction(true);
                    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
                }
            } else if (gr.u_format == 'Excel') {
                if (gr2.file_name.endsWith('.xls')) {
                    passedFormatCheck = true;
                } else {
                    gs.addErrorMessage("This import type is expecting submission of an Excel file (.xls), but a different file format was attached. Your import submission has been aborted.");
                    current.setAbortAction(true);
                    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
                }
            }

            if (passedFormatCheck) {
                // Create the data source record based on the selected Wizard Configuration record
                current.name = gr.u_ds_naming + '_' + gs.nowDateTime();
                current.format = gr.u_format;
                current.import_set_table_name = gr.u_import_set.name;
                current.header_row = gr.u_header_row;
                current.sheet_number = gr.u_sheet_number;
                current.file_retrieval_method = "Attachment";
                current.type = "File";

                // The data source must exist before we can run the commands below,
                // so we insert the record here instead of letting the record producer do it
                current.insert();

                // Process the attached file into the data source record
                var loader = new GlideImportSetLoader();
                var importSetRec = loader.getImportSetGr(current);

                // Load data from the data source into the import set table
                loader.loadImportSetTable(importSetRec, current);
                importSetRec.state = "loaded";
                importSetRec.update();

                // Start the configured transform map in the background. The transform map
                // scripts handle exception logging and trigger an email to the import
                // submitter outlining the logged errors and warnings once complete.
                var transformMapID = gr.u_transform;
                var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapID);
                transformWorker.setBackground(true);
                transformWorker.start();

                // Inform the user that an email outlining the status of the import will be sent
                gs.addInfoMessage("Your import file has been submitted. An email will be sent to you once the import is completed to outline any errors or warnings encountered while importing.");
                producer.redirect = "home.do";
            }
        }
    } else {
        gs.addErrorMessage('Something went wrong with the import. Please contact a system admin to investigate.');
    }

    // Since we inserted the data source already, abort the record producer's own insert
    current.setAbortAction(true);
}

The Miracle of Transform Map Scripts

If the wizard works magic on the front end, then Transform Maps do the same on the back end. Even when the client’s ITAM team has an opportunity to review the vendors’ data before importing it, there can still be errors. We identified all of the potential failure points in each vendor’s data.

We used onBefore transform scripts to check each source row for exceptions before the source fields are mapped into the target table. We grouped these exceptions into two categories: Errors and Warnings. The transform scripts use log.error() and log.warn() to write exceptions to the import set log for each import run. Error exceptions cause the source row to be skipped by setting the ignore variable to true. Warning exceptions are logged, but the source row is still transformed.

We also determined whether other tables required records to be updated or inserted as a result of the data import. For example, lease contract imports contain information about the hardware assets that are under lease. As each hardware record is updated, a related record has to be created in the Assets Covered table, in order to associate that asset with its lease contract.

We used onAfter transform scripts to handle these secondary table updates. OnAfter scripts run after the source record has been transformed into the target table. These scripts also logged exceptions if any were encountered during the update.

After all of the source rows have been evaluated and/or transformed, an onComplete script compiles the exceptions from the import set log in a block of text, then queues a system event. The user who initiated the import receives a notification containing that block of text. The notification provides feedback in near-real time, and lists exceptions from the import set log that would normally only be available to administrators.
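The compilation step in that onComplete script can be sketched as plain logic. The entry shape, helper name, and event name below are assumptions for illustration, not the client's actual code; in ServiceNow the entries would come from the import set log and the resulting block of text would be handed to gs.eventQueue() so a notification can pick it up:

```javascript
// Build the text block for the notification from import set log entries.
// Each entry here is a plain object: { level: 'error'|'warn'|'info', message: '...' }.
function compileExceptionSummary(logEntries) {
    var lines = [];
    for (var i = 0; i < logEntries.length; i++) {
        var entry = logEntries[i];
        // Only errors and warnings are surfaced to the submitter
        if (entry.level === 'error' || entry.level === 'warn') {
            lines.push(entry.level.toUpperCase() + ': ' + entry.message);
        }
    }
    return lines.length ? lines.join('\n') : 'No errors or warnings were encountered.';
}

// In the real onComplete script the summary would then be queued as an event,
// along the lines of (event name is hypothetical):
//   gs.eventQueue('u_asset_import.complete', importSetGr, summary, submitterSysId);
```

A notification record subscribed to that event then delivers the summary to the user who initiated the import.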

The Universal Translator

The final cog in this machine is the Vendor Model Translation table. We created this custom table because the client found that their vendors used model identifiers that did not match the names in the client’s Model (cmdb_model) table. The lack of a common name, model number or some other identifier in the vendor’s data makes it impossible to match asset models.

The vendor translation table is a simple cross-reference table that associates a vendor’s name for a given model with the model record in ServiceNow. It contains just four fields:

  • Vendor is a reference to a vendor record in the Company (core_company) table
  • Active is a true/false value that can be used to filter records during queries
  • Vendor Model is a text field that stores the identifier used by the vendor to reference the model
  • ServiceNow Model is a reference to a record in the Model (cmdb_model) table

With this simple table it is possible to establish aliases for any number of models and/or vendors. It can also be used as a kind of normalization table, listing several vendor models that all refer to the same model in the client’s Model table. For example, three mobile phone models provided by the vendor might all be associated with a single model in ServiceNow.
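As a sketch, the normalization amounts to a many-to-one lookup. The vendor model strings and the sys_id value below are made up for illustration:

```javascript
// Hypothetical translation entries: three vendor model identifiers for the
// same phone all resolve to one ServiceNow model record.
var vendorModelTranslation = {
    'GT-I9505': 'sys_id_of_phone_model',
    'SGH-I337': 'sys_id_of_phone_model',
    'SGH-M919': 'sys_id_of_phone_model'
};

// Return the ServiceNow model sys_id for a vendor's model string, or null
// when no translation exists (the case that raises an Error exception).
function lookupModel(vendorModel) {
    return vendorModelTranslation[vendorModel] || null;
}
```

In the real solution this lookup is a GlideRecord query against the translation table, filtered by vendor and active flag, as shown in the transform script below.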

We put this table to use in a couple of ways. The transform map for hardware asset imports used an onBefore script to set values in the target Hardware (alm_hardware) table based on a match in the vendor translation table. This script also illustrates some of the exception logging we performed for the client:


(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    var errorCondition = false;
    var itemModel = '';
    var modelDisp = '';
    var vendorName = 'Name of the vendor company';
    // Calculate the row number in the source file so log statements can
    // point at the exact row that failed
    var sourceRow = source.sys_import_row + 2;
    var excPrefix = 'Exception: Asset Import HW ASN ' + vendorName + ': Source Data Row ' + sourceRow + ': ';

    // Only transform the source record if the u_inventory_category field contains
    // '.COMPUTER SYSTEMS.'. This filters out non-computer hardware that may be
    // included in the ASN.
    if (source.u_inventory_category.indexOf('.COMPUTER SYSTEMS.') == -1) {
        //log.error(excPrefix + 'Item not a hardware asset (inventory category does not contain ".COMPUTER SYSTEMS.")');
        errorCondition = true;
    } else {
        // Check for an empty vendor model number in the source; log an ERROR if missing
        if (JSUtil.nil(source.u_mfg_part_num)) {
            log.error(excPrefix + 'Manufacturer part number (mfg part num) missing in source data');
            errorCondition = true;
        } else {
            // Look up the custom translation table, using vendor plus vendor model
            // as the unique key
            var gr2 = new GlideRecord('u_vendor_translation');
            gr2.addQuery('u_vendor', '<SysID of the vendor record in the core_company table>');
            gr2.addQuery('u_vendor_model', source.u_mfg_part_num);
            gr2.addActiveQuery();
            gr2.query();
            // Confirm a match was found; if not, raise an ERROR exception in the import log
            if (gr2.next()) {
                // Set the model on both source and target so we don't need to
                // query the table again in a field map script
                source.model = gr2.u_sn_model.sys_id;
                target.model = gr2.u_sn_model;
                // Remember the model for use in queries later in this script
                itemModel = gr2.u_sn_model;
                modelDisp = gr2.u_sn_model.getDisplayValue();
                target.model_category = gr2.u_sn_model.cmdb_model_category;
            } else {
                log.error(excPrefix + 'Source model ' + source.u_mfg_part_num + ' does not match a model in ServiceNow. Check the vendor translation table to ensure a translation is set up.');
                errorCondition = true;
            }
        }

        // Check for an empty serial number; log an ERROR if missing
        if (JSUtil.nil(source.u_serial_num)) {
            log.error(excPrefix + 'Serial number missing in source data');
            errorCondition = true;
        } else {
            // Confirm the serial number does not already exist in alm_asset as the
            // same model; if so, raise an ERROR exception
            var modelDesc = '';
            var gr1 = new GlideRecord('alm_asset');
            gr1.addQuery('serial_number', source.u_serial_num);
            // If a model was found through the translation table, also filter on model
            // so we only flag a duplicate model + serial number combination
            if (itemModel != '') {
                gr1.addQuery('model', itemModel);
                // Include the model description in the error message when applicable
                modelDesc = 'with model number ' + modelDisp + ' ';
            }
            gr1.query();
            if (gr1.next()) {
                log.error(excPrefix + 'Serial number ' + source.u_serial_num + ' ' + modelDesc + 'matches existing hardware asset record ' + gr1.getDisplayValue());
                errorCondition = true;
            }
        }

        if (JSUtil.nil(source.u_customer_po_num)) {
            log.error(excPrefix + 'Customer PO Num field is empty on this record.');
            errorCondition = true;
        } else {
            // Look up the PO line item by PO number and hardware model
            // (assumes no more than one line item per model)
            var enc = 'purchase_order.u_po_number=' + source.u_customer_po_num + '^purchase_order.status!=canceled^model=' + itemModel;
            var pol = new GlideRecord('proc_po_item');
            pol.addEncodedQuery(enc);
            pol.orderByDesc('sys_created_on');
            pol.query();
            if (pol.next()) {
                // Point the source record's Purchase Line at the PO line item
                source.purchase_line = pol.sys_id;
            } else {
                log.error(excPrefix + 'Unable to find a PO Line item under PO number ' + source.u_customer_po_num + ' for product model ' + modelDisp + '. This could be an issue with the vendor translation table record, or the wrong model was selected on a line item.');
                errorCondition = true;
            }
        }

        // If the vendor is not found in ServiceNow's core_company table, log a WARNING
        var gr3 = new GlideRecord('core_company');
        gr3.addQuery('sys_id', 'bbb81b896f8641009e4decd0be3ee4b1'); // sys_id of the vendor in core_company
        gr3.query();
        if (!gr3.next()) {
            log.warn(excPrefix + 'Vendor ABCDEFG (bbb81b896f8641009e4decd0be3ee4b1) does not match a record in ServiceNow.');
        }

        // Log an ERROR if the source record's quantity shipped is greater than 1
        if (source.u_qty_shipped > 1) {
            log.error(excPrefix + 'QTY shipped is greater than 1.');
            errorCondition = true;
        }
    }

    // Skip the source record if the transform map is attempting to update an existing record
    if (action == "update" && errorCondition != true) {
        log.error(excPrefix + 'Record is attempting to update an existing record.');
        ignore = true;
    }
    // Skip the source record if any error exceptions were found
    if (errorCondition == true) {
        ignore = true;
    }

})(source, map, log, target);

For mobile device imports we wrote an onStart transform script that uses the Vendor Model Translation table to update source rows before any source rows are transformed. That reduces the ITAM team’s administrative task load by leveraging information already in ServiceNow’s Model table.


(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    /***
     * Before transforming any source rows, try to match them to models using
     * the Vendor Model Translation table or the ServiceNow Model table
     ***/
    log.info('import set ' + import_set.sys_id.toString());
    // Query the import set table for rows in the import set we're transforming
    var row = new GlideRecord('u_mobile_device_import');
    row.addQuery('sys_import_set', import_set.sys_id.toString());
    row.query();
    while (row.next()) {
        if (!row.u_device_model_.nil()) {
            // Check the Vendor Model Translation table for a record that matches
            // the source's Device Model
            var gr2 = new GlideRecord('u_vendor_translation');
            gr2.addQuery('u_vendor_model', row.u_device_model_);
            gr2.addActiveQuery();
            gr2.query();
            if (gr2.next()) {
                // Set the model field on the row so that we can coalesce on it
                row.u_device_model_ = gr2.u_sn_model.sys_id.toString();
                row.update();
            } else {
                // No match in the Vendor Translation table, so check ServiceNow's
                // Model table for a match
                var gr3 = new GlideRecord('cmdb_model');
                gr3.addQuery('display_name', row.u_device_model_);
                gr3.addQuery('status', 'In Production');
                gr3.query();
                if (gr3.next()) {
                    // Set the model field on the row so that we can coalesce on it
                    row.u_device_model_ = gr3.sys_id.toString();
                    row.update();
                }
            }
        }
    }
})(source, map, log, target);

Putting It All Together

The Import Wizard gave the client a simple way to initiate data imports from any number of vendors at a convenient time, after they verified the quality of the data. The Wizard Configuration table provided a means to extend the wizard’s functionality for multiple vendors and data imports. And the Vendor Model Translation table allowed the ITAM team to associate a vendor’s model information with Model records in the client’s ServiceNow instance.

The client’s ITAM team reports that the wizard has already paid off.  In a few short days it has simplified their work, reduced errors, and made processing asset data much more efficient. And project managers are making plans to bring more vendors on board as they work toward enforcing common data formats across all of their vendors.

Standing on the Shoulders of Giants

This article wouldn’t be complete without acknowledging the assistance of others. We based our design for the Import Wizard record producer on this ServiceNow Community post by Michael Ritchie and endorsed by everyone’s favorite bow-tie wearing technical genius, Chuck Tomasi. The post is well worth reading, since Michael leads you through the steps you’ll need to follow before using the record producer to perform a data import.

Execute Scheduled Jobs From Script
https://servicenowguru.com/scripting/execute-scheduled-jobs-script/
Mon, 29 Nov 2010

Scheduled jobs are an extremely useful way to automate processes in Service-now.com and lift some of the administrative burden of the tool off of the shoulders of your Service-now administrators. It’s very easy to create a scheduled job or a scheduled import. Typically there’s no scripting or advanced configuration involved at all. Just set it up with a schedule and let it run! There are some situations where you might need to execute a scheduled job outside of its normal schedule. You can always open up the scheduled job entry and click the ‘Execute Now’ button, but there are also probably times where you’d really like to automate the process based on some trigger in a workflow or state change in some ticket or CI.

This article shows you how you can use script to execute an already-scheduled job on-demand. To do this, you simply mimic the behavior of the ‘Execute Now’ button by querying for the scheduled job you want to execute and firing off the job. Here’s the script…

You can modify the script below by adding your scheduled job name and querying the appropriate table for your scheduled job as shown here. This will not work if you simply query the ‘sysauto’ table. You must query one of the extensions below directly.

  • ‘scheduled_import_set’ (Scheduled Import Sets)
  • ‘sysauto_script’ (Scheduled Script Execution)
  • ‘sysauto_template’ (Scheduled Template Generation)
  • ‘sysauto_report’ (Scheduled Report)
// Execute a scheduled script job on demand
var rec = new GlideRecord('sysauto_script');
if (rec.get('name', 'YOUR_JOB_NAME_HERE')) {
    // Newer instances expose SncTriggerSynchronizer directly;
    // fall back to the Packages call otherwise
    if (typeof SncTriggerSynchronizer != 'undefined')
        SncTriggerSynchronizer.executeNow(rec);
    else
        Packages.com.snc.automation.TriggerSynchronizer.executeNow(rec);
}

The Simple Power of XML Export/Import
https://servicenowguru.com/imports/simple-power-xml-export-import/
Thu, 04 Nov 2010

Service-now.com makes extensive use of the XML format for storing and transporting system updates and changes. Update sets, System Import Sets, and certain export methods all make use of XML files in one way or another. One feature that a lot of people don’t know about until somebody shows them is the ability to export and import XML files in a Service-now instance. A manual XML import/export can be extremely helpful in the right circumstances and is something that every Service-now consultant or administrator should have in their tool belt. In this post, I’ll explain how simple it is to transport system changes from one Service-now instance to another with a few clicks.

First of all, I don’t recommend using this method as a full-time replacement for update sets or import sets between instances. Those methods are much better for bulk changes and much less prone to user error when configured and used correctly. This method is completely manual, but it is really nice to know about if you’ve got something that you need to do one time – maybe something got left out of an update set or you need to pull a record from the Service-now demo instance.

It’s also critical when using this method that the sys_ids of any record you import match between the source and target if the target record needs to be updated. The sys_id is the coalesce value used to match records and indicates that an update should occur rather than an insert. This article shows how you can view the record sys_id values to make sure they match.

XML Export (Single Record from a form)

If you need to export a single record the easiest way to do it is simply to open up the record and right-click the form header as shown below. The export will ask you to save an XML file which can be imported in another Service-now instance.

XML Export (Multiple Records or Single Record from a list)

You can also export to XML from any full table list (it doesn’t work from related lists on a form). Exporting from a list allows you to filter down to multiple records (or even a single record) that you want to export. Anything that shows in the filtered list will be part of the export. The export option is accessed by right-clicking the list header as shown below.

XML Import

Once you have an XML export using one of the methods described above, you can import that XML file into any Service-now instance by right-clicking any full list header (it doesn’t work from related lists on a form) and selecting the ‘Import XML’ option. Since your XML file contains all of the details about the record(s) included you don’t have to worry about where you initiate the import from. For example, I could import a business rule export using the context menu from a list of users. The system knows from the details of the XML file where the records really need to go. It also doesn’t care if the import contains one or multiple records. The process is the same for both. Any record(s) that don’t exist in the destination instance will be created and any that already exist will be updated – AS LONG AS THE SYS_ID VALUES MATCH.
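The reason the import point doesn't matter is that the XML itself records the destination table of each record. As a rough illustration (the table, field names, and values here are invented, and a real export carries every field on the record), an exported record looks something like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<unload unload_date="2010-11-04 16:57:04">
  <!-- The element name identifies the table the record belongs to -->
  <sys_script action="INSERT_OR_UPDATE">
    <collection>incident</collection>
    <name>Example Business Rule</name>
    <!-- The sys_id is what the import coalesces on -->
    <sys_id>00000000000000000000000000000000</sys_id>
  </sys_script>
</unload>
```

So when you import this file from, say, a list of users, the `sys_script` element (not the list you right-clicked) determines where the record goes.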

There are some cases when the functionality you are exporting is actually contained in several different tables. If you don’t have to export from these places frequently, it may be good enough simply to export from all of the individual tables as shown above. If it’s something you do frequently, you may want to consider creating a custom UI action to accomplish the task. Instructions for that can be found here.

Deactivating Users From LDAP
https://servicenowguru.com/imports/deactivating-users-ldap/
Wed, 01 Sep 2010

One of the most common LDAP integration requirements is to disable users in ServiceNow when they become disabled in the LDAP source.

It is probably worth mentioning that you always want to deactivate user records (and most other records too) in ServiceNow instead of deleting them. Once a user record has been created in ServiceNow it should always remain in ServiceNow because that record could be linked to hundreds of other records (tasks, CIs, etc.). Deleting the record kills the relationship to those other records. Deactivating the record keeps that relationship in place.

Because the exact steps to set up this behavior vary depending on your LDAP setup and processes, this configuration isn’t something that can be predefined in ServiceNow. Typically a ServiceNow consultant assists with this setup and specific requirements are determined on a client-by-client basis. It has been my experience that there are two common approaches that can be used to disable ServiceNow users from LDAP. This article explains these approaches and how you can implement the needed functionality.

The basic idea to keep in mind when you set up either of these configurations is that the system needs to see some consistent data indicator per user object in LDAP that the user has been disabled. Generally, this indicator is membership in a specific OU (which could be identified by parsing the ‘dn’ attribute) or through the use of the ‘useraccountcontrol’ attribute…or both. The bottom line is that ServiceNow cannot disable users from an external source unless it receives consistent data about those users indicating which ones should be disabled and which ones should not. With either of the approaches below, this data MUST come into ServiceNow by way of an import set table where it can be evaluated and transformed. If these terms seem foreign to you, you’ll want to familiarize yourself with ServiceNow Import Sets before you continue.

Disabling users in Service-now from an LDAP source typically involves one of these two methods…

1. Separate import process from LDAP just for disabled users: In this approach you don’t attempt to handle disabled users through the standard LDAP import at all. Instead, you (or your LDAP Admin) produce a separate extract from your LDAP source that contains the users that need to be disabled. This extract can then be used in a separate import to simply set the ‘active’ flag on every record in the import to ‘false’. The entire script (‘target.active=false’) can be placed in the ‘Script’ field directly on the ‘Table Transform Map’ record.

The benefits of this solution are that the scripting is extremely simple and you don’t have to worry about identifying which users are active or inactive in ServiceNow. You simply deactivate everything in the feed! An additional benefit is that you don’t have to bring a bunch of users into a temporary import table just to see whether they should be deactivated. Depending on the number of user records that have to be evaluated in your transform (and the number of attributes brought in per record), this may be the only way to disable users without impacting the performance of the LDAP load. The obvious drawback is that you need a separate process to create the extract of disabled users and drop it in a location where your ServiceNow data source can pick it up.

2. Open up the LDAP OU filter to bring everything into your import set table (including disabled users), then ignore inserts of disabled users based on script conditions: The sample ‘Users’ OU definition that ServiceNow provides in its out-of-box LDAP sample contains a filter that looks like this…
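The original screenshot of the filter isn’t available here, but the out-of-box Active Directory sample typically resembles the following. Treat this as an approximation and check the ‘LDAP OU Definition’ record in your own instance for the exact value:

```text
(&(objectClass=person)(!(userAccountControl:1.2.840.113556.1.4.803:=2)))
```

The ‘:1.2.840.113556.1.4.803:’ matching rule performs a bitwise AND, so the negated clause excludes any account whose ‘userAccountControl’ value has the disabled bit (0x2) set.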

This filter is important because it defines which user records will be brought into the ServiceNow import set table to be evaluated. The clause that excludes disabled accounts represents the common problem faced when you try to disable users from an LDAP import: it says that all disabled users in your LDAP source should be filtered out. That is fine for a normal import, but you can’t very well disable a user record that you cannot see because it is disabled! The solution is to remove that portion of the filter so that you can see all of those user records. “Seeing” those user records means they are all brought into your temporary import set table, where they can be evaluated by an import set transform and disabled if necessary.

Once those records hit your import set table, your transform takes over and can manipulate the target records in the ServiceNow user table as required. The exact script you write depends on the LDAP attributes you bring into ServiceNow that indicate a disabled user. Here are a couple of common examples I’ve used before: one filters based on OU (from the ‘dn’ attribute, which comes into your import table as ‘u_dn’), the other on the ‘userAccountControl’ attribute (usually 514 or 546 for a disabled user). For more information on the ‘userAccountControl’ attribute see this page.

//Deactivate LDAP-disabled users during transform based on the 'userAccountControl' attribute
if (source.u_useraccountcontrol == '514' || source.u_useraccountcontrol == '546') {
    target.active = false;
    target.locked_out = true;
}

//Deactivate LDAP-disabled users during transform based on OU membership in 'dn'
if (source.u_dn.indexOf('OU=Disabled Accounts') > -1) {
    target.active = false;
    target.locked_out = true;
}

The last piece of this solution is VERY IMPORTANT. Because you’ve opened your OU definition filter to see both enabled and disabled accounts from your LDAP source, ServiceNow will evaluate all of those records. That part is desirable. What isn’t desirable is the insertion of 50,000 inactive LDAP accounts from 5 years ago that you never want created in ServiceNow. You should NEVER create a user record in ServiceNow for a user that is inactive. The only thing you want to do is disable existing ServiceNow users as they are disabled in your LDAP source. To prevent unwanted inserts, set up a script that runs before the transform of each record and checks whether the record is disabled AND is being inserted. If an insert of a disabled user is happening, the operation should be ignored by the transform. Here’s what that ‘onBefore’ transform map script looks like…

Please note that the exact content of this script depends on the way disabled users are defined in your LDAP source. You may need to change this script to meet the needs of your environment.

//Ignore any insert of a disabled record as defined by the 'userAccountControl' attribute
var uc = source.u_useraccountcontrol;
if ((uc == '514' || uc == '546') && action == 'insert') {
    ignore = true;
}

The script examples above show the different parts of solution #2. Put together, they produce a very elegant solution that requires only a single script. Here’s a great script that I got from Valor Poland that combines all of this and avoids relying on hard-coded ‘userAccountControl’ values. It also gives you the option of reactivating LDAP user accounts. The script can be placed in the ‘Script’ field of the ‘Table Transform Map’ record or in an ‘onBefore’ transform map script. Very, very cool.

Note that this script makes use of the ‘userAccountControl’ attribute. If you are using another method to indicate disabled users in your LDAP source then this script wouldn’t apply.
//Deactivate LDAP-disabled users during transform based on the 'userAccountControl' attribute
//Convert the userAccountControl attribute to a hex string
var ctrl = parseInt(source.u_useraccountcontrol, 10);
ctrl = ctrl.toString(16);

//The only digit we care about is the final one
//A final hex digit of '2' in 'ctrl' means disabled
if (ctrl.substr(-1) == '2') {
    //Deactivate and lock the user account
    target.active = false;
    target.locked_out = true;
    //Ignore any insert of a disabled record
    if (action == 'insert') {
        ignore = true;
    }
}
else {
    //Optional: Reactivate and unlock the user account
    //target.active = true;
    //target.locked_out = ctrl.substr(-2, 1) == '1';
}
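As a quick sanity check outside ServiceNow, the same test can be expressed in plain JavaScript with no Glide dependencies. This is just an illustrative sketch, not part of the article’s transform scripts: the ‘isDisabled’ helper below implements the same ACCOUNTDISABLE (0x2) check, using a bitwise AND instead of hex string slicing, so you can verify candidate ‘userAccountControl’ values before wiring them into an import:

```javascript
// Standalone check of the 'userAccountControl' disabled bit (0x2).
// Mirrors the transform logic, but runs anywhere JavaScript does.
function isDisabled(userAccountControl) {
    var ctrl = parseInt(userAccountControl, 10);
    return (ctrl & 0x2) === 0x2; // ACCOUNTDISABLE flag
}

console.log(isDisabled('514')); // true  (512 + 2: normal account, disabled)
console.log(isDisabled('546')); // true  (512 + 32 + 2: disabled, password not required)
console.log(isDisabled('512')); // false (normal enabled account)
```

Inside a transform map you would replace the hard-coded comparisons with something like ‘if (isDisabled(source.u_useraccountcontrol))’.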

The post Deactivating Users From LDAP appeared first on ServiceNow Guru.

]]>
https://servicenowguru.com/imports/deactivating-users-ldap/feed/ 4
The Best LDAP Integration Tip You’ve Never Heard Of https://servicenowguru.com/service-now-general-knowledge/ldap-integration-attributes-tip/ https://servicenowguru.com/service-now-general-knowledge/ldap-integration-attributes-tip/#comments Wed, 30 Jun 2010 16:49:56 +0000 https://servicenowguru.wpengine.com/?p=1747 One of the basic (but often forgotten) guidelines that should be followed whenever you consider importing any data into your ServiceNow instance is to only import the information that is actually necessary within the tool. Just because you CAN import the data into ServiceNow doesn’t necessarily mean that the data SHOULD be imported or that

The post The Best LDAP Integration Tip You’ve Never Heard Of appeared first on ServiceNow Guru.

]]>
One of the basic (but often forgotten) guidelines to follow whenever you consider importing any data into your ServiceNow instance is to import only the information that is actually necessary within the tool. Just because you CAN import the data into ServiceNow doesn’t necessarily mean that the data SHOULD be imported or that it has any value. Even if somebody thinks the data has value within ServiceNow, consider whether that value outweighs the work and trouble of importing and maintaining the data going forward. This is particularly true for CMDB and old ticket data, but it also applies to user data imported from LDAP. Many people don’t realize that an LDAP import can fill your system with ‘garbage’ data, or that it is very simple to configure your system to prevent this from happening.

LDAP configuration is typically one of those “set it and forget it” tasks. You connect to the LDAP server, specify the OUs and transform maps, and run the scheduled import. Even though your LDAP integration may be working just fine, chances are you are bringing far more user information into your system than is necessary or useful. By default, an LDAP map brings in ALL available attributes from the LDAP server for each object. Most of these attributes aren’t needed, but they get imported and stored on every run in the temporary import set table before potentially being transformed into your user table. The real tragedy is that because all of that ‘garbage’ data still has to come into the system before the transform, it can slow the import considerably if you have a large data set coming over (probably anything above the ‘hundreds’ range of records).

It’s actually very simple to keep this unnecessary data from entering your system, cluttering up your import table, and slowing down your LDAP import. Just add a comma-separated list of the attributes to bring over to the LDAP Server record in your instance. To minimize the amount of data brought over, this list should contain only the fields used in your transform map. This method is documented on the ServiceNow LDAP Configuration wiki page.

department,employeeid,givenname,l,mail,manager,sn,source,telephoneNumber,title,uid,dn,cn,o,street,postalCode,mobile,samaccountname

Cleaning up existing clutter

If you’ve already had an LDAP integration running without this setup, you’ve not only got a bunch of garbage data but a bunch of garbage columns in your import set table as well. Once you modify the attributes for your LDAP server, clean up the table and data using the ‘Cleanup’ module under the ‘System Import Sets’ application. Leave BOTH checkboxes unchecked so that you remove both the data and the table structure; the table structure will be re-created the next time you run your import! Since this is an LDAP import, be sure to go back to the newly-created import table and adjust the column lengths of the ‘DN’, ‘Source’, and ‘Manager’ fields to 255 so that the manager mapping and login information doesn’t get truncated during the import.
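If you’d rather script the column-length adjustment than edit each dictionary entry by hand, a one-time background script along these lines can do it. This is a hedged sketch, not code from the article: ‘u_ldap_import’ is a placeholder for your actual import set table name, and the column names assume the standard LDAP attribute-to-column naming.

```javascript
// One-time fix: widen the DN, Source, and Manager columns on the LDAP
// import set table to 255 characters. Run in Scripts - Background as admin.
// 'u_ldap_import' and the column names below are assumptions; adjust them
// to match your instance before running.
var cols = ['u_dn', 'u_source', 'u_manager'];
var dict = new GlideRecord('sys_dictionary');
dict.addQuery('name', 'u_ldap_import');
dict.addQuery('element', 'IN', cols.join(','));
dict.query();
while (dict.next()) {
    dict.max_length = 255;
    dict.update();
}
```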

The post The Best LDAP Integration Tip You’ve Never Heard Of appeared first on ServiceNow Guru.

]]>
https://servicenowguru.com/service-now-general-knowledge/ldap-integration-attributes-tip/feed/ 2
Defining the ‘Run as’ User for a Scheduled Import https://servicenowguru.com/imports/defining-run-as-user-scheduled-import/ https://servicenowguru.com/imports/defining-run-as-user-scheduled-import/#comments Fri, 05 Feb 2010 14:43:08 +0000 https://servicenowguru.wpengine.com/?p=896 Recently I had a client ask me why all of their imported records showed up as being updated and created by the person who initially created the import. The problem, they said, was that the imported records were being created and updated on a scheduled basis by the system so it really wasn’t a particular

The post Defining the ‘Run as’ User for a Scheduled Import appeared first on ServiceNow Guru.

]]>
Recently I had a client ask me why all of their imported records showed up as being updated and created by the person who initially created the import. The problem, they said, was that the imported records were being created and updated on a scheduled basis by the system so it really wasn’t a particular person who was creating the records. While I agreed that it was a problem, I wasn’t sure at the time why it was happening or what the solution was. Now I know!


It should be no surprise to anybody that each record created or updated in ServiceNow carries information about who created and last updated it (along with when those actions occurred). When you’re doing an initial import of data directly through import sets, it is accurate for the system to attribute the creation and/or update of each record to the user performing the import.

With a scheduled import however, that same assumption isn’t correct. Generally you would like to have those actions attributed to system behavior rather than the person who set up the scheduled import. Enter the ‘Run as’ field on the ‘Scheduled Import’ table.

By default, this field isn’t part of the Scheduled Import form so you’ll need to personalize your form layout to see it and use it. You’ll notice that this field is automatically populated with the name or ID of the user that is creating the scheduled import record. The value in this field determines which user will run the import and, in turn, what user will be recorded in the ‘Updated by’ and ‘Created by’ fields for each record imported by that schedule.

Since you’ll probably create some sort of dummy or system user for this function, it is important to note that the account used in the ‘Run as’ field does not need any roles or permissions at all.

The post Defining the ‘Run as’ User for a Scheduled Import appeared first on ServiceNow Guru.

]]>
https://servicenowguru.com/imports/defining-run-as-user-scheduled-import/feed/ 1