How to do LDAP query for multiple records

I have a database table with ~3000 employee records. There’s a column that I don’t get from the system of origin that I need to extract from our LDAP server. Basically, I need to get the SAM account for each employee using their employee ID as the lookup. I haven’t seen much documentation on how to use the LDAP query service but I wanted to see what the best practice would be in terms of getting the SAM account.

First of all, how do I actually structure the query in a flow? Second, is it better for me to insert all the records into the table and then loop over each record in the table, doing an LDAP query (and an update) for each one? Or is it possible to do the LDAP query inline as I’m doing the insert with the JDBC Adapter in that flow?

I hope I’m being clear and any help or guidance would be greatly appreciated. Thanks in advance.


Hi Damon,

Did you check the IS Built-In Services Reference for the chapter about the pub.client.ldap folder?

If possible, I would prefer to retrieve the data from the source system first, then loop over the result set and enrich each record with the LDAP data.

After the loop is done, perform a BatchInsert into the database.

If the data already exists in the database, you can retrieve the entries where the LDAP data is missing with a JDBC SELECT call, loop over the result set to enrich each entry with the LDAP data, and then perform a BatchUpdate to the database.

This will reduce the load on the communication with the database.
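To make the second variant concrete, here is a rough Python sketch of the select-enrich-batch-update pattern, using sqlite3 purely as a stand-in for the JDBC adapter. The `ldap_lookup` function and the `FAKE_DIRECTORY` dict are stubs standing in for a real call such as `pub.client.ldap:search` against AD (where the filter would be something like `(&(objectClass=user)(employeeID=1001))` requesting the `sAMAccountName` attribute); all names here are illustrative, not the actual service signatures.

```python
import sqlite3

# Stand-in for the real LDAP call (e.g. pub.client.ldap:search against AD).
# A real lookup would send a filter like "(&(objectClass=user)(employeeID=1001))"
# and request the sAMAccountName attribute; here a dict simulates the directory.
FAKE_DIRECTORY = {"1001": "jdoe", "1002": "asmith"}

def ldap_lookup(employee_id):
    """Return the SAM account for an employee ID, or None if not found."""
    return FAKE_DIRECTORY.get(employee_id)

def fill_missing_sam_accounts(conn):
    """SELECT rows missing LDAP data, enrich them, then batch-update."""
    rows = conn.execute(
        "SELECT emp_id FROM employees WHERE sam_account IS NULL"
    ).fetchall()
    updates = []
    for (emp_id,) in rows:
        sam = ldap_lookup(emp_id)
        if sam is not None:                      # skip unresolvable IDs
            updates.append((sam, emp_id))
    # One batched round trip instead of one UPDATE per record.
    conn.executemany(
        "UPDATE employees SET sam_account = ? WHERE emp_id = ?", updates
    )
    conn.commit()
    return len(updates)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id TEXT PRIMARY KEY, sam_account TEXT)")
conn.executemany(
    "INSERT INTO employees (emp_id) VALUES (?)", [("1001",), ("1002",), ("1003",)]
)
updated = fill_missing_sam_accounts(conn)
```

The point is the shape: one SELECT, one loop doing the per-employee lookup, one batched write at the end.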


I understand what you’re saying, Holger, and I’m trying to do that. I actually have the LDAP query working in terms of returning the one piece of data I need from the LDAP server (the SAM account). The issue is that I get a document called “Results” back; under that is another record with the CN data (info on the person I’m searching against) of the record in AD, and under that a string field called SAMAccount that holds the value.

What I’m wondering is how do I get that data to coincide with the correct flat file record? For every record I’m doing a lookup on in my flat file, I’m getting a Results document back as I described above (200 records in file = 200 Results documents in pipeline). I hope I’m explaining myself well. I would really like to understand this process because I have multiple other flows I can apply all this to. Any more help would be GREATLY appreciated.
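To make the correlation problem concrete, here is a rough Python sketch (the key names `results`, `attributes`, and `sAMAccountName` are placeholder assumptions standing in for the actual Results/CN/SAMAccount structure, and the dicts are fake data). Doing the lookup inside the per-record loop is what keeps each Results document tied to the record that produced it, instead of leaving 200 unrelated Results documents in the pipeline.

```python
# Hypothetical shape of one LDAP "Results" document, as described above:
# Results -> CN entry (or entries) per match -> attributes, incl. the SAM account.
def extract_sam(results):
    """Pull the SAM account string out of one nested Results document."""
    entries = results["results"]          # list of matching CN entries
    if len(entries) != 1:                 # not a unique match: reject it
        return None
    return entries[0]["attributes"]["sAMAccountName"]

# Lookup inside the loop: the current record and the current Results
# document are always the matching pair.
records = [{"emp_id": "1001"}, {"emp_id": "1002"}]
fake_results = {
    "1001": {"results": [{"attributes": {"sAMAccountName": "jdoe"}}]},
    "1002": {"results": [{"attributes": {"sAMAccountName": "asmith"}}]},
}
for rec in records:
    rec["sam_account"] = extract_sam(fake_results[rec["emp_id"]])
```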

And just to confirm, I want to build the document out (again 200 records in file = 200 array members, right?) and then just do an insert into the table after building the entire array. Is that correct?

Thanks again for all the help.


Hi Damon,

just some more questions:
What is the primary source of your data?
A flat file, an XML document, …?

When reading these in, most likely you will have a list which contains each employee’s entry with all the fields you need (except the one you want to read from the LDAP).

Create a document type with all the fields you need (including the LDAP field).
Field names should match the BatchInsert signature; field types should be String in most cases, though some might be Date.
Check this against your table structure.

Initialize a document list based on the doc type.
Loop over the input list:

  • Query LDAP on the current employee’s ID to retrieve the LDAP field.
  • Check the LDAP result for uniqueness (length of result = 1).
  • Map the record’s data plus the unique LDAP result to a temp single instance of your document type.
  • pub.list:appendToDocumentList (in: source list and temp single instance, out: source list).
  • Drop the temp single instance to avoid mixing up different items.

After the loop, perform the BatchInsert with the resulting document list as input.
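The steps above can be sketched as plain Python, with a dict simulating the LDAP search and sqlite3’s `executemany` standing in for the JDBC adapter’s BatchInsert; everything here (names, fake data) is illustrative, not the actual Flow service signatures.

```python
import sqlite3

# Stand-ins for the Flow services: the dict simulates the LDAP search,
# executemany simulates the JDBC adapter's BatchInsert.
FAKE_LDAP = {"1001": ["jdoe"], "1002": ["asmith"], "1003": []}

input_list = [
    {"emp_id": "1001", "name": "John Doe"},
    {"emp_id": "1002", "name": "Ann Smith"},
    {"emp_id": "1003", "name": "No Match"},
]

doc_list = []                                 # document list built up in the loop
for rec in input_list:
    result = FAKE_LDAP.get(rec["emp_id"], [])
    if len(result) != 1:                      # uniqueness check: exactly one hit
        continue
    temp = dict(rec, sam_account=result[0])   # map record + LDAP field to temp doc
    doc_list.append(temp)                     # the appendToDocumentList step
    # 'temp' is rebuilt each iteration, which is the "drop" step

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id TEXT, name TEXT, sam_account TEXT)")
conn.executemany(                             # one BatchInsert with the whole list
    "INSERT INTO employees VALUES (:emp_id, :name, :sam_account)", doc_list
)
conn.commit()
```

Note how the record with no unique LDAP hit is skipped by the uniqueness check rather than inserted with a bad value.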

See the IS Built-In Services Reference and the JDBC Adapter User’s Guide for further information.


Sorry for the delayed response, Holger. I found the issue. The result coming back from the LDAP query (the text field in each Results document) wasn’t on the source side of the pipeline. Once I copied that structure into the input side, I had access to it, and it maps perfectly, just the way I want.

Thanks for all your help.