Integration Archives – ServiceNow Guru
https://servicenowguru.com/category/integration/

Custom queue event handling in ServiceNow – Implementation steps
https://servicenowguru.com/integration/custom-queue-event-handling-servicenow/
Tue, 29 Oct 2024

Background

Looking at ServiceNow's native processes, one can easily see that a big portion of them are event-based rather than synchronous. This is especially true for processes that are not critical to the user experience, or that are not dependencies of other business logic.

In a nutshell, an event is logged in a queue and when system resources are available, the event is picked up and processed by the associated Script Action.

 

Below is a simple visual representation of the process along with an explanation (source: Steven Bell):

0. I register my new event in the Registry, create my Script Action associated to that event, and if needed my Script Include which could be called by the Script Action. Registering my event tells the Worker to listen for that event, and that it will be expected to do something with it.

1. Something executes a gs.eventQueue statement which writes an event record on the queue. BTW, this is not an exhaustive list.

2,3,4. The event worker(s), whose job it is to listen for events listed in the Registry, pick up the event and check whether there are any Script Actions associated with the registered event.

5,6. If a Script Action is found, it is executed, which in turn may execute my Script Include if I choose.

 

Remember the info message when adding a role to a user or a group:

What's happening behind the scenes: an event is logged to the queue, and the roles are added to the group at the first possible moment, when the system has resources for it. Usually this is near real-time, but if higher-priority operations are already queued, this will wait until they free up some processing power.

Now, if one implements an application based on synchronous logic that occupies almost all of the system resources, this may lead to performance implications, slowing down the instance tremendously.

One possible approach in such cases is to shift from synchronous processing to event-based processing, which will lead to better performance.

But since events are logged to the default queue (unless another queue is explicitly specified), we might run into performance issues again.

Here comes the custom queue implementation. It is nothing more than a separate queue to which events can be queued explicitly, leveraging the fifth parameter of gs.eventQueue() API (more on that later).

 

Implementation

The implementation process is similar to a normal event-based logic implementation. We need to have:

  • An event registered in the event registry
  • A Business rule or any other logic to fire the event
  • A Script action to process the event
  • A custom queue processor

I will not discuss the first three, because these are pretty straightforward, and docs are easily available.

Custom queue processor implementation

The easiest way to create a processor for a custom queue is to:

  • go to System Scheduler -> Scheduled Jobs -> Scheduled Jobs
  • find a job with Trigger type = interval (e.g. 'text index events process')

  • change the name (it can be anything) and replace 'text_index' with the name of your custom queue inside the fcScriptName=javascript:GlideEventManager(<HERE>).process(); line
  • set Next action to be in the near future, e.g. 30 seconds from the current moment (this is very important in order to get the job running)

  • (optional) edit the Repeat interval (a short repeat interval may have some negative impact on system performance, but at the same time, the lower the repeat interval, the sooner your event will be picked up and processed)
  • Right click -> Insert and stay! Do not Save/Update!
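Once inserted, the only functional difference from the original job is the queue name in its script. Assuming a custom queue named 'custom_queue_one' (an illustrative name, not from the article), the relevant Job context line would read:

```
fcScriptName=javascript:GlideEventManager('custom_queue_one').process();
```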

You can have one or more custom queues, depending on the purpose. These must be aligned with the system resources – nodes, semaphores, workers. I will not go deeper into these; more information can be found in the Resources chapter below.

Logging an event to a specific (custom) queue

The gs.eventQueue() API accepts five parameters:

  • Event name
  • GlideRecord object
  • Param1
  • Param2
  • (optional) queue

This fifth optional parameter ‘queue’ is the one that tells the system to which event queue an event should be logged.

To log an event to the custom queue we created above ('custom_queue_one'), we can use the following line of code:

gs.eventQueue('event.name', grSomeObject, null, null, 'custom_queue_one');

 

NB: the queue name (fifth parameter) must be exactly the same as the one we passed to GlideEventManager during the processor creation above.

Everything else (Script Actions, etc.) is the same as in normal event logging.

 

Good practices

  • Just because you can, doesn't mean you should – this implementation is applicable only to cases where huge amounts of records must be processed (see the Performance chapter)
  • Naming is important – give your queues and processors readable names
  • For optimal performance, multiple custom queues can be created to handle a particular event. In this case, the event logging must be done in a way that ensures even distribution between the queues. To better organize these, one possible approach can be to:
    • Create a script include, holding an array with the names of your event queues
    • Use the following line of code to randomly distribute events to the queues:

gs.eventQueue('event.name', grSomeObject, null, null, event_queues[Math.floor(Math.random() * event_queues.length)]);

where event_queues is an array containing the names of your queues
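As a minimal sketch of that approach (the queue names and the helper function here are illustrative stand-ins for the Script Include, not from the article):

```javascript
// Illustrative sketch: a plain function standing in for a Script Include
// that holds the custom queue names and picks one at random per event.
var EVENT_QUEUES = ['custom_queue_one', 'custom_queue_two', 'custom_queue_three'];

function pickEventQueue(queues) {
    // Math.random() is in [0, 1), so the computed index is always valid
    return queues[Math.floor(Math.random() * queues.length)];
}

// In a business rule you would then log the event like this:
// gs.eventQueue('event.name', grSomeObject, null, null, pickEventQueue(EVENT_QUEUES));
```

Over many events this yields a roughly even distribution across the queues, keeping any single processor from becoming the bottleneck.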

 

Performance

  • Even though we implement this approach to achieve performance, for a low number of transactions it does not yield any performance gain because of the Repeat interval – the longer it is, the longer the overall wait time
  • For a large number of transactions (thousands of records), the achieved performance gain can be really significant. In one of my implementations, I was able to achieve 30x faster execution.

 

More information

Using Import Sets for REST Integration in ServiceNow
https://servicenowguru.com/integration/using-import-sets-rest-integration/
Mon, 26 Aug 2024


Integrating data from external sources into ServiceNow is a common requirement for organizations aiming to maintain a unified system of record. One effective method for achieving this is through the use of Import Sets, which can be enhanced using REST APIs for seamless data transfer. In this blog post, we will explore how to use Import Sets for REST integration in ServiceNow, focusing on key components such as Transform Maps, Transform Scripts, and the Import Set API.

What are Import Sets?

Import Sets in ServiceNow are used to import data from various data sources and map that data into ServiceNow tables. They act as staging tables where raw data is initially stored before being processed and transformed into the target tables in ServiceNow.

Transform Maps

Definition

A Transform Map is a set of field mappings that determines the relationships between fields in an import set and fields in the target table. Transform Maps ensure that the data imported from the staging table is correctly mapped and transferred to the desired table in ServiceNow.

Creating Transform Maps

  1. Navigate to: System Import Sets > Administration > Transform Maps.
  2. Click on New to create a new transform map.
  3. Provide a name for the Transform Map and select the source Import Set Table and the Target Table.
  4. Define field mappings by adding field map records, specifying the source field from the Import Set Table and the target field in the ServiceNow table.

Transform Maps Validation

  • Ensure the Transform Map is reusable by defining source and target tables that can be applied to multiple data imports.
  • Use the Auto Map Matching Fields feature to automatically map fields with the same name.
  • Validate the field mappings by testing with sample data to ensure accuracy.

Transform Scripts

Definition

Transform Scripts are server-side JavaScript that can be used to manipulate data during the transformation process. These scripts provide additional flexibility to handle complex data transformations and custom logic.

Types of Transform Scripts

  1. onStart: Executed at the start of the transformation, before any records are processed.
  2. onBefore: Executed before each record is processed.
  3. onAfter: Executed after each record is processed.
  4. onComplete: Executed after all records have been processed.

Example Usage

(function transformRow(source, target, map, log, isUpdate) {
    // Example: Set a default value if a field is empty
    if (!source.field_name) {
        target.field_name = 'Default Value';
    }
})(source, target, map, log, isUpdate);
Import Set API

The Import Set API allows for the import of data into ServiceNow from external sources using RESTful web services. This API provides endpoints to create, update, and delete records in Import Set Tables.

Key Endpoints

  1. POST /api/now/import/{table_name}: Import data into a specified Import Set Table.
  2. GET /api/now/import/{table_name}: Retrieve data from a specified Import Set Table.
  3. PUT /api/now/import/{table_name}/{sys_id}: Update a record in a specified Import Set Table.
  4. DELETE /api/now/import/{table_name}/{sys_id}: Delete a record from a specified Import Set Table.

Example Usage

To import data into an Import Set Table using REST, you can use the following example:

Request:

POST /api/now/import/u_my_import_table
Content-Type: application/json
Authorization: Bearer {your_token}

{
  "field1": "value1",
  "field2": "value2"
}

Response:

{
  "status": "success",
  "sys_id": "1234567890abcdef"
}

Sample REST API Call Using Postman

 

Integrating It All Together

  1. Setup Import Set Table: Create or identify the Import Set Table where data will be initially stored.
  2. Define Transform Map: Create and configure a Transform Map to map fields from the Import Set Table to the target table.
  3. Write Transform Scripts: Add any necessary Transform Scripts to handle custom logic and data manipulation.
  4. Use Import Set API: Utilize the Import Set API to import data from external sources into the Import Set Table.
  5. Run Transform: Execute the Transform Map to move data from the Import Set Table to the target table, applying any Transform Scripts in the process.
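Step 4 can be sketched in code as follows (a minimal sketch for Node 18+; the instance hostname and staging table name are placeholders, and authentication is omitted):

```javascript
// Illustrative sketch: build a request against the Import Set API.
// The instance URL and table name below are assumptions, not real values.
const INSTANCE = 'https://example.service-now.com';

function buildImportRequest(tableName, row) {
    return {
        url: INSTANCE + '/api/now/import/' + tableName,
        options: {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(row)
        }
    };
}

const req = buildImportRequest('u_my_import_table', { field1: 'value1', field2: 'value2' });
// fetch(req.url, req.options) would stage the row in the Import Set Table;
// in practice you would also add an Authorization header.
```

Each POST stages one row, after which the Transform Map moves it to the target table.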

Conclusion

Using Import Sets for REST integration in ServiceNow provides a robust solution for importing and transforming data from various external sources. By leveraging Transform Maps and Transform Scripts, you can ensure that data is accurately and efficiently mapped to your ServiceNow tables. The Import Set API further enhances this process by allowing seamless integration through RESTful web services. By following these best practices and utilizing the tools provided by ServiceNow, you can achieve a streamlined and effective data import process.

Scripted Web Services
https://servicenowguru.com/integration/scripted-web-services/
Fri, 20 Aug 2010

One of ServiceNow's principal strengths comes from its extensible framework. Built into the framework is the ability to retrieve information using one of a myriad of methods. If you want to get data out of any table, you can get it via direct web services, using basic auth data retrieval, having it pushed to a client, FTP server, or Linux server by using the scheduled data extract, and more. However, there are times when none of these solutions give you a logical way to achieve what you want in a simple manner. I will give two examples: 1) adding an item to a cart, 2) fetching an attachment from a record. For tasks such as these, the flexible Scripted Web Services feature is the answer.

Scripted web services are used in many integrations where a product wants to accomplish something that doesn’t have a simple solution already built-in.

How to use Scripted Web Services

This functionality requires the Web Service Provider – Scripted plugin.

There are four parts to scripted web services: the WSDL (generated for you), the script, input variables, and output variables. The WSDL is created for you and is what you'll use on the client side to access your new web service. The script is simple JavaScript and performs the required action. Within the script you retrieve the input variables that you expect the client to send to your web service. Those variables are referenced using request. notation. Once the script nears an end, you can set the output variables – the data you want to send back to the client – by using response. notation.

This is the most flexible way to receive SOAP data and to return whatever you wish to return. Let me explain by solving the two problems mentioned above.

Ordering a Blackberry

var cart = new Cart();
var item = cart.addItem('e2132865c0a801650010e411699');
cart.setVariable(item, 'original', request.phone_number);

// set the requested for
var gr = new GlideRecord("sys_user");
gr.addQuery("user_name", request.requested_for);
gr.query();
if (gr.next()) {
  var cartGR = cart.getCart();
  cartGR.requested_for = gr.sys_id;
  cartGR.update();
}

var rc = cart.placeOrder();
response.request_number = rc.number;

The script is a simple example of how to order a BlackBerry for a specific user. The phone_number and requested_for values are sent via the SOAP client to the script, and the scripted web service returns the request_number back to the client. Something that would be very difficult to figure out using traditional web service methodologies is quite simple to do using this technique.

Fetching an attachment from a task record

if (typeof GlideStringUtil != 'undefined')
   var StringUtil = GlideStringUtil;
else
   var StringUtil = Packages.com.glide.util.StringUtil;

var gr = new GlideRecord("sys_attachment");
gr.addQuery("table_sys_id", request.sys_id);
gr.query();
if (gr.next()) {
   if (typeof GlideSysAttachment != 'undefined')
      var sa = new GlideSysAttachment();
   else
      var sa = new Packages.com.glide.ui.SysAttachment();
   var binData = sa.getBytes(gr);
   var encData = StringUtil.base64Encode(binData);
   response.file_name = gr.file_name;
   response.table_name = gr.table_name;
   response.encodedAttachment = encData;
}
else {
   gs.log("Record not found");
}

Again, this is pretty straightforward. What would be the alternative if direct web services were used? Well, there would need to be a call to the sys_attachment record to find out the file_name, table_name, and associated task record. Then there would need to be several more calls to fetch every attachment chunk (attachments are stored as chunks of data internally), only to reassemble the data client-side… not very friendly.

I created an update set for the fetch attachment record that you may download and apply to your instance: Fetch Attachment scripted web service.  Remember to enable the Web Service Provider – Scripted plugin before applying the update set.

The Best LDAP Integration Tip You've Never Heard Of
https://servicenowguru.com/service-now-general-knowledge/ldap-integration-attributes-tip/
Wed, 30 Jun 2010

One of the basic (but often forgotten) guidelines that should be followed whenever you consider importing any data into your ServiceNow instance is to only import the information that is actually necessary within the tool. Just because you CAN import the data into ServiceNow doesn't necessarily mean that the data SHOULD be imported or that it has any value. Even if somebody thinks the data has value within ServiceNow, you should also consider whether that value outweighs the work and trouble of importing and maintaining that data going forward. This is particularly true for CMDB and old ticket data, but it is also true of user data imported from LDAP. One thing that a lot of people don't realize is that you can end up with 'garbage' data from LDAP, and that it is also very simple to configure your system to prevent this from happening.

LDAP configuration is typically one of those “Set it and forget it” type of tasks. You connect to the LDAP server, specify the OUs and transform maps, and run the scheduled import. Even though your LDAP integration may be working just fine, chances are that you are actually bringing way more user information into your system than is necessary or useful. This is because by default, an LDAP map brings in ALL available attributes from the LDAP server for each object. Usually, the majority of these attributes aren’t necessary but they end up getting imported and stored for each import in the temporary import set table before potentially being transformed into your user table. The real tragedy with this is that because you still have to bring all of the ‘garbage’ data into the system before the transform, it can actually slow the import time considerably if you have a very large set of data coming over (probably anything above the ‘hundreds’ range of records).

It’s actually very simple to prevent this unnecessary data from coming into your system at all and cluttering up your import table and slowing down your LDAP import. This can be done by adding a comma-separated list of attributes to be brought over from your LDAP server to the LDAP Server record in your instance. To completely minimize the amount of data brought over, this list should contain only those fields used in your transform map. This method has been documented on the ServiceNow LDAP Configuration wiki page.

department,employeeid,givenname,l,mail,manager,sn,source,telephoneNumber,title,uid,dn,cn,o,street,postalCode,mobile,samaccountname

Cleaning up existing clutter

If you’ve already had an LDAP integration running without this setup, you’ve not only got a bunch of garbage data, but a bunch of garbage columns in your import set table as well. Once you modify the attributes for your LDAP server you should go back and clean up the table and data by using the ‘Cleanup’ module under the ‘System Import Sets’ application. You’ll want to have BOTH checkboxes un-checked so that you remove both the data and the table structure. This table structure will be re-created the next time you run your import! Since this is an LDAP import, you’ll want to make sure you go back to the newly-created import table and adjust the column lengths of the ‘DN, Source, and Manager’ fields on your import table to 255 so that the manager mapping and login information doesn’t get truncated during the import.

Sending attachments to a 3rd-party service desk
https://servicenowguru.com/integration/sending-attachments-3rdparty-service-desk/
Wed, 14 Apr 2010

I often hear requests for this to be bidirectional. Sending attachments from Service-now to a third-party system isn’t something that we’ve actually implemented in the past. However, with the number of people asking for it lately, I decided to write up a solution.

Fetch Attachment as Base64 Encoded Data

The following code is assumed to be within a business rule on the task table (or a table that inherits from task, such as incident, problem, change, etc).

var StringUtil = Packages.com.glide.util.StringUtil;

var gr = new GlideRecord('sys_attachment');
gr.addQuery('table_sys_id', current.sys_id);
gr.addQuery('table_name', current.getTableName());
gr.addQuery('file_name', 'truck7.jpg');
gr.query();

if (gr.next()) {
   var sa = new Packages.com.glide.ui.SysAttachment();
   var binData = sa.getBytes(gr);
   var encData = StringUtil.base64Encode(binData);
}
else {
   gs.log('record not found');
}

The above code is where the magic happens. The attachment to the incident is retrieved from the Service-now database and is now in base64 encoded data, ready to be sent to the integrating party via web services.

Send Attachment Data via SOAP

The code for sending the encoded data to a 3rd-party system might look something like this:

var envelope = new SOAPEnvelope();
envelope.createNameSpace('xmlns:imp', 'http://www.abcCompany.com/soap');
envelope.setFunctionName('imp:insert');
envelope.addFunctionParameter('uid', current.correlation_id);
envelope.addFunctionParameter('content', encData);
envelope.addFunctionParameter('content_type', gr.content_type);
var request = new SOAPRequest('http://www.abcCompany.com/addAttachment', 'user','pass');
var res = request.post(envelope);

Fetch Attachment Data – Alternate Method

You very well may want to have the business rule run on the sys_attachment table instead of the TASK table. If this is the case, your business rule will look like the following:

var StringUtil = Packages.com.glide.util.StringUtil;

var sa = new Packages.com.glide.ui.SysAttachment();
var binData = sa.getBytes(current);
var encData = StringUtil.base64Encode(binData);

The Correlation ID / Display fields
https://servicenowguru.com/integration/correlation-id-display-fields/
Mon, 08 Feb 2010


If you have looked at the schema for many of the tables within Service-now, you've probably noticed the Correlation ID and Correlation Display fields and may have wondered what they are for. Typically, these fields are used for integrating a third-party system with Service-now. Let's assume that I am doing an integration with an alerting system from HP called OpenView. In this integration, alerts are sent from HP to Service-now. When the ticket is updated and/or closed, Service-now updates/closes the alert in OpenView.

Correlation ID

When an alert is received by Service-now, the relevant fields from the alert are mapped to a Service-now incident. Since it is so common to want to know what the actual alert number is in OpenView, we have a place to store that on the incident table. The field is called correlation_id.

The correlation_id field stores the unique identifier (in the other system) for the incoming task, or, for the sake of our example, the incoming alert. This field is important for a couple of reasons. First, many customers want to display the OpenView ticket number on the incident form itself in case the operator wants to go to OpenView and look at the alert himself. Second, storing this information allows Service-now to update the alert in OpenView when the incident is updated and/or closed. In this integration, we use our MID server to call an OpenView shell command that accepts the unique ID for the ticket in addition to other parameters to update that ticket.

Correlation Display

It is common for administrators to want to know the origin of the tasks, CIs, users, etc., that exist in their tables. If they are looking at a list of CIs, it might be important to know whether the Service-now discovery tool, a LANDesk integration, or a consumed CSV file was the source of each of the CIs now existing in our CMDB. This is where the correlation display comes in handy. In our OpenView integration example, whenever we receive an incident from an alert sent from OpenView, we hardcode the correlation_display to be "HPOVO". This allows administrators to know the source of their incident. Using this field they will know that it became an incident because of the OpenView system, as opposed to their MS SCOM alerting tool or even a service desk operator.

Another use for the correlation display is to assist with the integration itself. When a business rule is written to send data back to OpenView, it is good practice to put current.correlation_display == "HPOVO" in the condition field in addition to your other conditions. The reason should be obvious: if you want to update an alert in the 3rd-party system, you want to make sure that you're sending the updated alert information to the correct system! While many people often leave this out of the condition statement, it is considered good practice to put it in. If you do, your integration will not break if a new alerting integration is added in the future.
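As a minimal sketch of that practice (the helper name is illustrative; in a real business rule the check would simply live in the Condition field):

```javascript
// Illustrative guard: only sync updates back to OpenView for records
// that actually originated there, as identified by correlation_display.
function shouldSyncToOpenView(record) {
    return record.correlation_display == 'HPOVO';
}

// Condition field equivalent: current.correlation_display == 'HPOVO'
```

With this guard in place, an incident created by a different alerting integration (say, MS SCOM) never triggers the OpenView outbound call.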
