Using Import Sets for REST Integration in ServiceNow

Integrating data from external sources into ServiceNow is a common requirement for organizations aiming to maintain a unified system of record. One effective method for achieving this is through the use of Import Sets, which can be enhanced using REST APIs for seamless data transfer. In this blog post, we will explore how to use Import Sets for REST integration in ServiceNow, focusing on key components such as Transform Maps, Transform Scripts, and the Import Set API.

What are Import Sets?

Import Sets in ServiceNow are used to import data from various data sources and map that data into ServiceNow tables. They act as staging tables where raw data is initially stored before being processed and transformed into the target tables in ServiceNow.

Transform Maps

Definition

A Transform Map is a set of field mappings that determines the relationships between fields in an import set and fields in the target table. Transform Maps ensure that the data imported from the staging table is correctly mapped and transferred to the desired table in ServiceNow.

Creating Transform Maps

  1. Navigate to: System Import Sets > Administration > Transform Maps.
  2. Click on New to create a new transform map.
  3. Provide a name for the Transform Map and select the source Import Set Table and the Target Table.
  4. Define field mappings by adding field map records, specifying the source field from the Import Set Table and the target field in the ServiceNow table (a scripted field map example follows).
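
Most field maps are direct one-to-one mappings, but an individual field map can also run a script when the value needs light manipulation, using the Use source script option on the field map record. A minimal sketch, assuming a hypothetical u_email source column:

answer = (function transformEntry(source) {
    // Hypothetical example: normalize an email address before it reaches the target field.
    // The returned value is what gets written to the mapped target field.
    return source.u_email.toString().toLowerCase();
})(source);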

Transform Maps Validation

  • Ensure the Transform Map is reusable by defining source and target tables that can be applied to multiple data imports.
  • Use the Auto Map Matching Fields feature to automatically map fields with the same name.
  • Validate the field mappings by testing with sample data to ensure accuracy.

Transform Scripts

Definition

Transform Scripts are server-side JavaScript scripts that manipulate data during the transformation process. They provide additional flexibility to handle complex data transformations and custom logic.

Types of Transform Scripts

  1. onStart: Executed once at the start of the transformation, before any source rows are processed.
  2. onBefore: Executed before each source row is transformed.
  3. onAfter: Executed after each source row is transformed.
  4. onComplete: Executed once after all source rows have been processed.

Example Usage

(function transformRow(source, target, map, log, isUpdate) {
    // Example: Set a default value if a field is empty
    if (!source.field_name) {
        target.field_name = 'Default Value';
    }
})(source, target, map, log, isUpdate);

Import Set API

The Import Set API allows external sources to load data into ServiceNow using RESTful web services. It is insert-oriented: records are posted into an Import Set Table (the staging table), transformed by the associated Transform Maps, and the transformation results can then be retrieved.

Key Endpoints

  1. POST /api/now/import/{table_name}: Insert a record into the specified Import Set Table and run the associated transform map(s).
  2. GET /api/now/import/{table_name}/{sys_id}: Retrieve a previously imported record, including its transformation result.
  3. POST /api/now/import/{table_name}/insertMultiple: Insert multiple records in a single request (available on recent releases).

Note that the API does not expose PUT or DELETE operations against staging tables; updates to existing target records are handled by coalescing in the Transform Map.

Example Usage

To import data into an Import Set Table using REST, you can use the following example:

Request:

POST /api/now/import/u_my_import_table
Content-Type: application/json
Authorization: Bearer {your_token}

{
  "field1": "value1",
  "field2": "value2"
}

Response:

A successful request returns a body similar to the following (abbreviated):

{
  "import_set": "ISET0010001",
  "staging_table": "u_my_import_table",
  "result": [
    {
      "status": "inserted",
      "table": "incident",
      "sys_id": "1234567890abcdef"
    }
  ]
}
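
From the integrating system's side, any HTTP client can make this call. Here is a minimal Node.js (18+) sketch; the instance URL, credentials, table, and field names are all placeholders, not values from this article:

// Node.js 18+ sketch: push one record into a ServiceNow import set table.
// Instance name, credentials, table, and fields below are placeholders.
const instance = 'https://your-instance.service-now.com';
const auth = Buffer.from('api.user:secret').toString('base64');

async function pushRow(row) {
    const res = await fetch(instance + '/api/now/import/u_my_import_table', {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'Accept': 'application/json',
            'Authorization': 'Basic ' + auth
        },
        body: JSON.stringify(row)
    });
    if (!res.ok) throw new Error('Import failed: ' + res.status);
    return res.json(); // includes the staging row sys_id and transform results
}

pushRow({ field1: 'value1', field2: 'value2' })
    .then(function (result) { console.log(result); })
    .catch(function (err) { console.error(err); });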


Integrating It All Together

  1. Setup Import Set Table: Create or identify the Import Set Table where data will be initially stored.
  2. Define Transform Map: Create and configure a Transform Map to map fields from the Import Set Table to the target table.
  3. Write Transform Scripts: Add any necessary Transform Scripts to handle custom logic and data manipulation.
  4. Use Import Set API: Utilize the Import Set API to import data from external sources into the Import Set Table.
  5. Run Transform: Execute the Transform Map to move data from the Import Set Table to the target table, applying any Transform Scripts in the process (see the server-side sketch below).
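
The last two steps can also be driven from inside the instance. Here is a rough background-script sketch, assuming an existing attachment-based Data Source and using the same GlideImportSetLoader and GlideImportSetTransformerWorker helpers that appear in the record producer example further down this page; the two sys_id values are placeholders:

// Hedged sketch: load an attachment-based data source and transform it in the background.
// Replace the placeholder sys_ids with real Data Source and Transform Map records.
var ds = new GlideRecord('sys_data_source');
if (ds.get('<data_source_sys_id>')) {
    var loader = new GlideImportSetLoader();
    var importSetRec = loader.getImportSetGr(ds);
    loader.loadImportSetTable(importSetRec, ds); // raw rows land in the staging table
    importSetRec.state = 'loaded';
    importSetRec.update();

    // Run the transform map asynchronously so the session is not blocked
    var worker = new GlideImportSetTransformerWorker(importSetRec.sys_id, '<transform_map_sys_id>');
    worker.setBackground(true);
    worker.start();
}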

Conclusion

Using Import Sets for REST integration in ServiceNow provides a robust solution for importing and transforming data from various external sources. By leveraging Transform Maps and Transform Scripts, you can ensure that data is accurately and efficiently mapped to your ServiceNow tables. The Import Set API further enhances this process by allowing seamless integration through RESTful web services. By following these best practices and utilizing the tools provided by ServiceNow, you can achieve a streamlined and effective data import process.

Simplifying Data Imports from Third Parties

Recently, along with my Crossfuze colleagues Josh McMullen and Scott Cornthwaite, I performed work for a client that needed to import asset data from several third-party vendors. The company, a global manufacturer of household appliances, has dozens of vendors around the world, which supply data (CSV and Excel files) using proprietary formats and column names. The client's desired future state is to enforce a single format for use by all vendors. But to control their hardware and software assets today, they needed a solution that works with multiple vendors and data formats.

We faced a few challenges. Our solution needed to:

  • Allow data imports without elevated roles or system privileges
  • Handle the same kind of data from multiple vendors (e.g. hardware asset imports)
  • Handle data in a variety of file formats including text (.csv) and MS Excel
  • Provide feedback to the client’s IT asset management (ITAM) team
  • Check data quality and handle exception conditions defined by the client
  • Run with minimal input or intervention

We started to create custom inbound email actions to process emails with attached data files as they were sent from each vendor. But we discovered some big downsides with that approach. The most serious was that there’s no way to validate the accuracy or completeness of data being sent by the vendor. Inbound email processing also leaves the client at the mercy of the vendor for the timing of the imports. Finally, it isn’t very scalable, since a new inbound email action must be created for every additional vendor and/or data source.

What the client needed was a simple way to perform a data import via a Data Source with a file attachment. They were already aware of the Load Data wizard that ServiceNow provides on the System Import Sets application menu, but that solution isn’t very user-friendly, and it requires a lot of manual input each time new data are imported.

Asset Import Wizard

To make it easy for the client's ITAM team to import their data into ServiceNow, we leveraged the power of the Service Catalog. Specifically, we created a Record Producer to provide a simple front-end for importing vendors' data files. Record producers can be used by anyone, and their visibility can be limited to only interested parties. Files attached to the record producer are automatically attached to the record that is created. And the record producer's Script field enables powerful data processing.

We configured the record producer as follows:

  • Name: Asset Import Wizard
  • Table: Data Source (sys_data_source)

We added a single variable to the record producer, a lookup select box to allow the ITAM team member to specify the type of import they were performing.

Import Wizard Configurations

The variable gets its list of options from a custom table called Import Wizard Configurations. This table allowed us to build a flexible framework for defining different types of file imports from any vendor.
This table has many fields that are similar to those on a Data Source record. That’s because the record producer’s script queries the Wizard Configuration table for values to use when it inserts a new record in the Data Source table. Here’s how this form breaks down.

  • Vendor references records in the Company (core_company) table where the Vendor field is set to true.
  • Import Type is a choice list of options that describe the kind of asset data being imported (e.g. Hardware, Software, HW end of life disposal, or Lease Contract).
  • Expected Data Format is a choice list that allows either CSV or Excel (.xls) formats to be specified.
  • Data Source Name Prefix is a text field for naming the data source. The record producer's script automatically appends the current date/time stamp to the prefix when each Data Source record is created.
  • Import Set Table and Transform Map are fields that reference records in those tables. The import set table and transform map to be used for these data imports must be created in advance.
  • Header Row and Sheet Number are used to specify values when an MS Excel file is the data source. For CSV files, the client just sets these to 1.
  • Active allows the client to deactivate the Wizard Configuration record if it is no longer needed. The Lookup Select Box variable on the record producer displays only active records in this table.

Record Producer Script

The record producer’s script is where the wizard’s “magic” happens. The script does several things:

  1. It validates that the wizard is submitted with an attachment of the correct format.
  2. It queries the Wizard Configuration table for the record that is selected in the Import Type variable, and inserts a new record into the Data Source table using values from the Wizard Configuration record.
  3. It loads the data into the import set table and executes the specified transform map, using the ServiceNow helper classes GlideImportSetLoader and GlideImportSetTransformerWorker.
  4. It provides feedback to the user upon successful execution of the script, or displays appropriate error messages if the script encountered errors. Error conditions cause the record producer to abort without creating the Data Source record.

Here is the code we used in the record producer script:


// Verify an attachment is included and in the correct format
var gr2 = new GlideRecord("sys_attachment");
gr2.addQuery("table_sys_id", current.sys_id);
var oC = gr2.addQuery("table_name", "sys_data_source");
oC.addOrCondition("table_name", "sc_cart_item");
gr2.query();
if (!gr2.next()) {
    gs.addErrorMessage("You must attach a file to submit. Your import submission has been aborted.");
    current.setAbortAction(true);
    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
}
else {
    // Get the glide record for the selected import type
    var gr = new GlideRecord('u_pmy_imp_wiz_cfg');
    gr.addQuery('sys_id', producer.import_type);
    gr.query();
    if (gr.next()) {
        if (gr2.getRowCount() > 1) {
            gs.addErrorMessage("You may only attach one file at a time for this import wizard. Your import submission has been aborted.");
            current.setAbortAction(true);
            producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
        }
        else {
            // Check to make sure the file format of the attachment is correct
            var passedFormatCheck = false;
            if (gr.u_format == 'CSV') {
                if (gr2.file_name.endsWith('.csv')) {
                    passedFormatCheck = true;
                }
                else {
                    gs.addErrorMessage("This import type is expecting submission of a CSV file (.csv), but a different file format was attached. Your import submission has been aborted.");
                    current.setAbortAction(true);
                    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
                }
            }
            else if (gr.u_format == 'Excel') {
                if (gr2.file_name.endsWith('.xls')) {
                    passedFormatCheck = true;
                }
                else {
                    gs.addErrorMessage("This import type is expecting submission of an Excel file (.xls), but a different file format was attached. Your import submission has been aborted.");
                    current.setAbortAction(true);
                    producer.redirect = "com.glideapp.servicecatalog_cat_item_view.do?v=1&sysparm_id=<SysID of the Record Producer>";
                }
            }

            if (passedFormatCheck) {
                // Create the data source record (based on the selected import type record)
                current.name = gr.u_ds_naming + '_' + gs.nowDateTime();
                current.format = gr.u_format;
                current.import_set_table_name = gr.u_import_set.name;
                current.header_row = gr.u_header_row;
                current.sheet_number = gr.u_sheet_number;
                current.file_retrieval_method = "Attachment";
                current.type = "File";

                // The data source needs to exist before we can trigger the commands below,
                // so we create the record outside of the normal record producer flow
                current.insert();

                // Process the attached file into the data source record
                var loader = new GlideImportSetLoader();
                var importSetRec = loader.getImportSetGr(current);

                // Load data from the data source into the import set table
                var ranload = loader.loadImportSetTable(importSetRec, current);
                importSetRec.state = "loaded";
                importSetRec.update();

                // Start the appropriate transform map in the background. The transform map scripts
                // log exceptions and trigger an email to the import submitter once complete,
                // outlining the logged errors and warnings.
                var transformMapID = gr.u_transform;
                var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapID);
                transformWorker.setBackground(true);
                transformWorker.start();

                // Inform the user that an email outlining the status of the import will be sent
                gs.addInfoMessage("Your import file has been submitted. An email will be sent to you once the import is completed to outline any errors or warnings encountered while importing.");
                producer.redirect = "home.do";
            }
        }
    }
    else {
        gs.addErrorMessage('Something went wrong with the import. Please contact a system admin to investigate.');
    }

    // Since we inserted the data source already, abort the additional insert by the record producer
    current.setAbortAction(true);
}

The Miracle of Transform Map Scripts

If the wizard works magic on the front end, then Transform Maps do the same on the back end. Even when the client’s ITAM team have an opportunity to review the vendors’ data before importing it, there can still be errors. We identified all of the potential failure points in each vendor’s data.

We used onBefore transform scripts to check each source row for exceptions before the source fields are mapped into the target table. We grouped these exceptions into two categories: Errors and Warnings. The transform scripts use log.error() and log.warn() to write exceptions to the import set log for each import run. Error exceptions cause the source row to be skipped by setting the ignore variable to true. Warning exceptions are logged, but the source row is still transformed.

We also determined whether other tables required records to be updated or inserted as a result of the data import. For example, lease contract imports contain information about the hardware assets that are under lease. As each hardware record is updated, a related record has to be created in the Assets Covered table, in order to associate that asset with its lease contract.

We used onAfter transform scripts to handle these secondary table updates. onAfter scripts run after the source record has been transformed into the target table. These scripts also logged exceptions if any were encountered during the update.

After all of the source rows have been evaluated and/or transformed, an onComplete script compiles the exceptions from the import set log in a block of text, then queues a system event. The user who initiated the import receives a notification containing that block of text. The notification provides feedback in near-real time, and lists exceptions from the import set log that would normally only be available to administrators.
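
The onComplete script itself wasn't shown in the original write-up, but a rough sketch of the pattern looks like the following. Treat the table and field names (import_log, import_set, level, message) and the event name as assumptions to verify on your instance, not confirmed API:

(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    // Hedged sketch: gather this run's logged exceptions into a block of text
    var body = '';
    var logGr = new GlideRecord('import_log'); // assumed log table name
    logGr.addQuery('import_set', import_set.sys_id); // assumed reference field
    logGr.addQuery('level', 'IN', 'error,warn');
    logGr.query();
    while (logGr.next()) {
        body += logGr.getValue('level') + ': ' + logGr.getValue('message') + '\n';
    }
    // Queue a custom event; a notification subscribed to it emails the text
    // block to the user who submitted the import. The event name is a placeholder.
    gs.eventQueue('u_asset_import.complete', import_set, gs.getUserID(), body);
})(source, map, log, target);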

The Universal Translator

The final cog in this machine is the Vendor Model Translation table. We created this custom table because the client found that their vendors used model identifiers that did not match the names in the client’s Model (cmdb_model) table. The lack of a common name, model number or some other identifier in the vendor’s data makes it impossible to match asset models.

The vendor translation table is a simple cross-reference table that associates a vendor’s name for a given model with the model record in ServiceNow. It contains just four fields:

  • Vendor is a reference to a vendor record in the Company (core_company) table
  • Active is a true/false value that can be used to filter records during queries
  • Vendor Model is a text field that stores the identifier used by the vendor to reference the model
  • ServiceNow Model is a reference to a record in the Model (cmdb_model) table

With this simple table it is possible to establish aliases for any number of models and/or vendors. It can also be used as a kind of normalization table, listing several vendor models that all refer to the same model in the client's Model table. For example, three mobile phone model strings supplied by a vendor can all be associated with a single model record in ServiceNow.

We put this table to use in a couple of ways. The transform map for hardware asset imports used an onBefore script to set values in the target Hardware (alm_hardware) table based on a match in the vendor translation table. This script also illustrates some of the exception logging we performed for the client:


(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    var errorCondition = false;
    var itemModel = '';
    var modelDisp = '';
    var vendorName = 'Name of the vendor company';
    // Identify the source row that failed so the log statements can reference it
    var sourceRow = source.sys_import_row + 2;
    var excPrefix = 'Exception: Asset Import HW ASN ' + vendorName + ': Source Data Row ' + sourceRow + ': ';

    // Do not transform the source record unless the source u_inventory_category field
    // contains ".COMPUTER SYSTEMS.". This filters out non-computer hardware that may
    // be included in the ASN.
    if (source.u_inventory_category.indexOf('.COMPUTER SYSTEMS.') == -1) {
        //log.error(excPrefix + 'Item not a hardware asset (inventory category field does not contain ".COMPUTER SYSTEMS.")');
        errorCondition = true;
    }
    else {
        // Check for an empty vendor model number in the source; log an ERROR if missing
        if (JSUtil.nil(source.u_mfg_part_num)) {
            log.error(excPrefix + 'Manufacturer part number (mfg part num) missing in source data');
            errorCondition = true;
        }
        else {
            // Look up the custom translation table, using vendor and vendor model as the unique key
            var gr2 = new GlideRecord('u_vendor_translation');
            gr2.addQuery('u_vendor', '<sys_id of the vendor record in core_company>');
            gr2.addQuery('u_vendor_model', source.u_mfg_part_num);
            gr2.addActiveQuery();
            gr2.query();
            // Confirm a match was found; if not, raise an ERROR exception in the import log
            if (gr2.next()) {
                // Set the model field on the record so we don't need to query the table again in a field map script
                source.model = gr2.u_sn_model.sys_id;
                target.model = gr2.u_sn_model;
                // Save the model for use in another query later
                itemModel = gr2.u_sn_model;
                modelDisp = gr2.u_sn_model.getDisplayValue();
                target.model_category = gr2.u_sn_model.cmdb_model_category;
            }
            else {
                log.error(excPrefix + 'Source model ' + source.u_mfg_part_num + ' does not match a model in Service-now. Check vendor translation table to ensure a translation is set up.');
                errorCondition = true;
            }
        }

        // Check for an empty serial number; if empty, log an ERROR
        if (JSUtil.nil(source.u_serial_num)) {
            log.error(excPrefix + 'Serial number missing in source data');
            errorCondition = true;
        }
        else {
            // Confirm that the serial number does not already exist in alm_asset as that same model
            var modelDesc = '';
            var gr1 = new GlideRecord('alm_asset');
            gr1.addQuery('serial_number', source.u_serial_num);
            // If we found a model through the translation table, also filter on the model
            // to make sure we don't have a duplicate model + serial number combo
            if (itemModel != '') {
                gr1.addQuery('model', itemModel);
                // Include the model in the error message when applicable
                modelDesc = 'with model number ' + modelDisp + ' ';
            }
            gr1.query();
            if (gr1.next()) {
                log.error(excPrefix + 'Serial number ' + source.u_serial_num + ' ' + modelDesc + 'matches existing hardware asset record ' + gr1.getDisplayValue());
                errorCondition = true;
            }
        }

        if (JSUtil.nil(source.u_customer_po_num)) {
            log.error(excPrefix + 'Customer PO Num field is empty on this record.');
            errorCondition = true;
        }
        else {
            // Look up the PO Line item table by PO number and hardware model
            // (assumes no more than one line item per model)
            var enc = 'purchase_order.u_po_number=' + source.u_customer_po_num + '^purchase_order.status!=canceled^model=' + itemModel;
            var pol = new GlideRecord('proc_po_item');
            pol.addEncodedQuery(enc);
            pol.orderByDesc('sys_created_on');
            pol.query();
            if (pol.next()) {
                // Set the source record's Purchase Line to the sys_id of the PO Line Item
                source.purchase_line = pol.sys_id;
            }
            else {
                log.error(excPrefix + 'Unable to find a PO Line item under PO number ' + source.u_customer_po_num + ' for product model ' + modelDisp + '. This could be an issue with the vendor translation table record, or the wrong model was selected on a line item.');
                errorCondition = true;
            }
        }

        // If the vendor is not found in the core_company table, log a Warning
        var gr3 = new GlideRecord('core_company');
        gr3.addQuery('sys_id', 'bbb81b896f8641009e4decd0be3ee4b1'); // sys_id of the vendor in core_company
        gr3.query();
        if (!gr3.next()) {
            log.warn(excPrefix + 'Vendor ABCDEFG (bbb81b896f8641009e4decd0be3ee4b1) does not match a record in ServiceNow.');
        }

        // If the source record's quantity is greater than 1, log an ERROR
        if (source.u_qty_shipped > 1) {
            log.error(excPrefix + 'QTY shipped is greater than 1.');
            errorCondition = true;
        }
    }

    // Skip the source record if the transform map is attempting to update an existing record
    if (action == "update" && errorCondition != true) {
        log.error(excPrefix + 'Record is attempting to update an existing record.');
        ignore = true;
    }
    // Skip the source record if any error exceptions were found
    if (errorCondition == true) {
        ignore = true;
    }

})(source, map, log, target);

For mobile device imports we wrote an onStart transform script that uses the Vendor Model Translation table to update source rows before any source rows are transformed. That reduces the ITAM team’s administrative task load by leveraging information already in ServiceNow’s Model table.


(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {
    /***
     * Before transforming any source rows we'll try to match them to models using the
     * Vendor Model Translation table or the ServiceNow Model table
     ***/
    log.info('import set ' + import_set.sys_id.toString());
    // Query the import set table for the rows in the import set we're transforming
    var row = new GlideRecord('u_mobile_device_import');
    row.addQuery('sys_import_set', import_set.sys_id.toString());
    row.query();
    while (row.next()) {
        if (!row.u_device_model_.nil()) {
            // Check the Vendor Model Translation table for a record that matches the source's Device Model
            var gr2 = new GlideRecord('u_vendor_translation');
            gr2.addQuery('u_vendor_model', row.u_device_model_);
            gr2.addActiveQuery();
            gr2.query();
            if (gr2.next()) {
                // Set the model field on the row so that we can coalesce
                row.u_device_model_ = gr2.u_sn_model.sys_id.toString();
                row.update();
            }
            else {
                // No match in the Vendor Translation table, so check ServiceNow's Model table
                var gr3 = new GlideRecord('cmdb_model');
                gr3.addQuery('display_name', row.u_device_model_);
                gr3.addQuery('status', 'In Production');
                gr3.query();
                if (gr3.next()) {
                    // Set the model field on the row so that we can coalesce
                    row.u_device_model_ = gr3.sys_id.toString();
                    row.update();
                }
            }
        }
    }
})(source, map, log, target);

Putting It All Together

The Import Wizard gave the client a simple way to initiate data imports from any number of vendors at a convenient time, after they verified the quality of the data. The Wizard Configuration table provided a means to extend the wizard’s functionality for multiple vendors and data imports. And the Vendor Model Translation table allowed the ITAM team to associate a vendor’s model information with Model records in the client’s ServiceNow instance.

The client’s ITAM team reports that the wizard has already paid off.  In a few short days it has simplified their work, reduced errors, and made processing asset data much more efficient. And project managers are making plans to bring more vendors on board as they work toward enforcing common data formats across all of their vendors.

Standing on the Shoulders of Giants

This article wouldn't be complete without acknowledging the assistance of others. We based our design for the Import Wizard record producer on a ServiceNow Community post by Michael Ritchie, endorsed by everyone's favorite bow-tie wearing technical genius, Chuck Tomasi. The post is well worth reading, since Michael leads you through the steps you'll need to follow before using the record producer to perform a data import.

Sending attachments to a 3rd-party service desk

I often hear requests for attachment integrations to be bidirectional. Sending attachments from Service-now to a third-party system isn't something that we've actually implemented in the past. However, with the number of people asking for it lately, I decided to write up a solution.

Fetch Attachment as Base64 Encoded Data

The following code is assumed to be within a business rule on the task table (or a table that inherits from task, such as incident, problem, change, etc).

var StringUtil = Packages.com.glide.util.StringUtil;

var gr = new GlideRecord('sys_attachment');
gr.addQuery('table_sys_id', current.sys_id);
gr.addQuery('table_name', current.getTableName());
gr.addQuery('file_name', 'truck7.jpg');
gr.query();

if (gr.next()) {
    var sa = new Packages.com.glide.ui.SysAttachment();
    var binData = sa.getBytes(gr);
    var encData = StringUtil.base64Encode(binData);
}
else {
    gs.log('record not found');
}

The above code is where the magic happens. The attachment to the incident is retrieved from the Service-now database and is now base64-encoded, ready to be sent to the integrating party via web services.

Send Attachment Data via SOAP

The code for sending the encoded data to a 3rd-party system might look something like this:

var envelope = new SOAPEnvelope();
envelope.createNameSpace('xmlns:imp', 'http://www.abcCompany.com/soap');
envelope.setFunctionName('imp:insert');
envelope.addFunctionParameter('uid', current.correlation_id);
envelope.addFunctionParameter('content', encData);
envelope.addFunctionParameter('content_type', gr.content_type);
var request = new SOAPRequest('http://www.abcCompany.com/addAttachment', 'user','pass');
var res = request.post(envelope);

Fetch Attachment Data – Alternate Method

You very well may want to have the business rule run on the sys_attachment table instead of the TASK table. If this is the case, your business rule will look like the following:

var StringUtil = Packages.com.glide.util.StringUtil;

var sa = new Packages.com.glide.ui.SysAttachment();
var binData = sa.getBytes(current);
var encData = StringUtil.base64Encode(binData);
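
On current releases the Packages.* calls above are blocked by the script sandbox; the supported equivalent is the GlideSysAttachment API, which can hand back the base64 content directly. A minimal sketch of the same alternate method:

// Modern equivalent using the supported GlideSysAttachment API.
// Runs as a business rule on sys_attachment, mirroring the alternate method above.
var sa = new GlideSysAttachment();
var encData = sa.getContentBase64(current); // base64-encoded file content
gs.info('Encoded ' + current.file_name + ' (' + encData.length + ' chars)');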

Extending the CMDB

One of the most basic needs that a customer has when building out their CMDB is extending it to match the types of CIs that they're currently using in their company. This is especially true when bringing data into Service-now from a 3rd-party CMDB (such as IBM's CCMDB or HP's uCMDB). Some of these CMDBs have hundreds of class types with scores of fields for each class. How can you get the 3rd-party data into Service-now when the schema is so different?

There are essentially four main steps to accomplish this: decide what classes and fields need to be brought across, create a mapping document, extend the Service-now CMDB to accept the new classes, and send the data from the 3rd party CMDB to Service-now.

1) Decide what classes and fields should be included in Service-now

Every integration discussion starts out the same, “Can Service-now receive my 200+ different CI class types and how long will it take for us to do this?”  While it is good to know the capabilities of Service-now before embarking on such a task, perhaps the more important question that product managers should be asking is, “out of these 200 class types, what are the +/- 20 classes that make sense to integrate with our service desk?”  The first answer is usually, “All of them!”  However, when looked at more carefully, it usually makes sense to only bring over a small subset of the entire collection.

2) Create a mapping document

The difficulty of this step largely depends on the number of classes and fields that were decided in step one. Again, you'll want to pare down the number of fields in each class to be small and relevant. The more common items such as IP address and MAC address will be similarly named in each system. Determine what fields are missing in SN in this step so that you know which fields need to be added in step 3.

3) Extend the Service-now CMDB

If steps 1 and 2 were done correctly, this step should be straightforward.  The attached video is a demonstration that I put together to show how simple it is to extend the CMDB and how to go about doing it.  You should use your mapping  document to show you which classes need to be added and what fields need to be in those classes.  For the classes that already exist, you will simply be adding the new fields to the already existing class.

4) Send the data to Service-now

This is the step that scares some people and is mistakenly seen as the most difficult and time-consuming step. However, when the previous steps are completed beforehand, this step is actually the simplest of them all.

The most common method of bringing CMDB CI data into Service-now is by using a CSV/XML file. This is a perfect choice when you want to do a one-time load of CI data into your system. This can be done by creating a new data source and attaching your CSV/XML file to the record. More details on this can be found here: http://wiki.service-now.com/index.php?title=Importing_Data_Using_Import_Sets

In a CMDB integration, the most common way of sending data to Service-now from a 3rd-party is by using a JDBC connection to the source of record. A simple (or not so simple) SQL statement retrieves the classes that you're interested in and sends them to Service-now to be mapped.

Another common way of sending data to SN is by using web services. If your CMDB or discovery solution supports sending SOAP, then you might consider this option as well. I also cover this in the attached video.

A CMDB integration can be daunting. However, if you follow this guide, you should have a good idea how to approach it in a way that encourages a successful and timely outcome! The video demonstration is available at http://www.youtube.com/watch?v=1LeIKFt5ZG8

The Correlation ID / Display fields

If you have looked at the schema for many of the tables within Service-now, you've probably noticed the Correlation ID and Correlation Display fields and may have wondered what they are for. Typically, these fields are used for integrating a third-party system with Service-now. Let's assume that I am doing an integration with an alerting system from HP called OpenView. In this integration, alerts are sent from HP to Service-now. When the ticket is updated and/or closed, Service-now updates/closes the alert in OpenView.

Correlation ID

When an alert is received by Service-now, the relevant fields from the alert are mapped to a Service-now incident. Since it is so common to want to know what the actual alert number is in OpenView, we have a place to store that on the incident table. The field is called correlation_id.

The correlation_id field stores the unique identifier (in the other system) for the incoming task, or for the sake of our example, the incoming alert.  This field is important for a couple of reasons.  First, many customers want to display the OpenView ticket number on the incident form itself in case the operator wants to go to OpenView and look at the alert himself.  Second, storing this information allows Service-now to update the alert in OpenView when the incident is updated and/or closed.  In this integration, we use our mid server to call an OpenView shell command that accepts the unique id for the ticket in addition to other parameters to update that ticket.

Correlation Display

It is common for administrators to want to know the origin of the tasks, CIs, users, etc., that exist in their tables. If they are looking at a list of CIs, it might be important to know whether the Service-now discovery tool, a LANDesk integration, or a consumed CSV file was the source of each of the CIs now existing in the CMDB. This is where the correlation display comes in handy. In our OpenView integration example, whenever we receive an incident from an alert sent from OpenView, we hardcode the correlation_display to be "HPOVO". This allows administrators to know the source of their incident. Using this field they will know that it became an incident because of the OpenView system, as opposed to their MS SCOM alerting tool or even a service desk operator.

Another use for the correlation display is to assist with the integration itself. When a business rule is written to send data back to OpenView, it is good practice to include current.correlation_display == "HPOVO" in the condition field in addition to your other conditions. The reason should be obvious: if you want to update an alert in the 3rd-party system, you want to make sure that you're sending the updated alert information to the correct system! While many people often leave this out of the condition statement, it is considered good practice to put it in. If you do, your integration will not break if a new alerting integration is added in the future.
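
As a hedged illustration of both halves of this pattern, an inbound transform script can stamp the fields, and the outbound business rule can then filter on them. The source field u_alert_id is a hypothetical staging-table column, not part of the integration described above:

// onBefore transform script (inbound): stamp the origin on each incident.
// source.u_alert_id is a hypothetical staging-table field holding the OpenView alert id.
target.correlation_id = source.u_alert_id; // unique id of the alert in the other system
target.correlation_display = 'HPOVO';      // origin marker checked by outbound rules

// Business rule condition (outbound updates back to OpenView):
// current.correlation_display == 'HPOVO'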

Removing or modifying the default ServiceNow login page

For some ServiceNow implementations, there may be a need to modify the default login page. In other implementations, it may be necessary or desirable to remove the login page altogether (this scenario would only apply if you have set up Single Sign-on for your instance and you didn't want users to authenticate directly against Service-now). Even if you thought you wanted to disable the login page entirely, I wouldn't recommend it unless you had some very stringent security requirements that you couldn't make an exception to. The problem with removing the page entirely is that you cut off access to your ServiceNow instance entirely if your SSO portal goes down. In an event like this, you probably want your ServiceNow administrator – and potentially process users – to be able to access the instance through a local login account (which requires a login page!).

Whatever the reason is, you may find it necessary to modify the behavior of the ServiceNow login page.  If you do, there are a couple of options I would recommend.

If you want to disable local login entirely (only if you are implementing SSO), the first option you should consider is to set the ‘glide.authentication.external.disable_local_login’ redirection property. This property was introduced in the Fall 2009 Stable 2 build so you may have to upgrade to get it. It allows you to disable login on the standard welcome page unless SSO credentials were present. It needs to be used along with a second redirect property that will redirect non-sso users to the page of your choice. These properties are documented here…
http://wiki.service-now.com/index.php?title=External_Authentication_%28S…

Another option you might try is to install the content management plugin and configure a custom welcome page.  This option could be used to customize the login page and present whatever information you wanted to show.  You could use this with SSO or if you just wanted to customize the look and feel of your Service-now login page.
When you install the content management plugin, you'll have a module in your instance called 'Configuration Page'. You can specify the login page for your instance in the 'Login page' field on the configuration page. Then when anyone comes to your instance, they will see the page specified (which could be any HTML page you want to create).
