The DataIterator

I like the DataIterator, but it isn't always the best tool for iterating over data. To explain, I'll walk through a recent flow I needed to develop and the solution I came up with, part of which used the DI.

My client receives an item master file from GHX and uses it to update Item Master (IC11) and Vendor Item (PO13) records, to update unreleased Requisition line items (RQ10), and to update pending PO Interface line items (PO23).

We update the description, manufacturer, and vendor item number values for the items in the file. From a second file we also update the GL account on IC11 and the Purchasing Major and Minor Class values.

The value joining the two files is the UNSPSC code.

I used a File Access node to load each file. The first file was a combination of line- and tab-separated values: each item was on its own line, and each data element within the item was separated by tab stops. It was recommended that I use a DataIterator to first parse the lines and then parse the tab-separated data elements (basically a nested pair of DIs).

If you're familiar with this pattern, it works pretty much like a query, cycling through the data elements one at a time, and I thought that was a bit clumsy.

My solution? I used an Assign node - JavaScript Expression and treated the file's data like an array. It performed the exact same function, but was much faster. I first 'extracted' the data I needed and assigned it to a new variable, with the lines split out and the delimited data elements separated.
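The idea can be sketched in a few lines. This is only an illustration, not the actual flow: fileData stands in for the variable your File Access node populates, and the three-field layout is invented for the example.

```javascript
// Assume the File Access node left the whole file in one string variable.
var fileData = "10001\tWidget A\tACME\n10002\tWidget B\tGLOBO";

// Split the file into lines, then split each line on tabs --
// one pass of plain JavaScript instead of a nested pair of DataIterators.
var items = [];
var lines = fileData.split("\n");
for (var i = 0; i < lines.length; i++) {
  var fields = lines[i].split("\t");
  items.push({
    itemNumber: fields[0],     // hypothetical field positions
    description: fields[1],
    manufacturer: fields[2]
  });
}
```

The result (items) is a normal array you can assign to a flow variable and reuse downstream.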

Remember, I had two files that I needed to merge via UNSPSC codes before I could begin my updates in Lawson. One file stored this value as its four separate segments, while the other lumped them together into a single code.
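Before the merge, both files need the same key. A minimal sketch, assuming the split file carries the four two-character segments as separate fields (the segment values and field layout are made up here):

```javascript
// Concatenate the four UNSPSC segments into the single 8-character
// code used by the other file.
function unspscKey(parts) {
  return parts.join("");
}

var fromSplitFile = unspscKey(["43", "21", "15", "03"]);  // four segments
var fromLumpedFile = "43211503";                          // already combined
```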

The second file contained 49,023 lines of data, and the thought of using the DataIterator to read through all of those records concerned me. Given the amount of time that would require, I again decided to use the Assign - JavaScript Expression.

This process of reading through 49k lines, parsing them, and rewriting the output to a new variable took less than four minutes. That comes out to better than 200 lines of data per second. Remember, it had to iterate each record by line and then each line by its data elements.

Now that I had two sets of records ready to be merged by UNSPSC value, I used another JavaScript Expression to compare, match, and write the merged data into another new variable.
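A common way to do this kind of match in JavaScript is to index one set by its key and then walk the other, which avoids a nested scan. This is a sketch under assumed data: every field name here (glAccount, majorClass, and so on) is hypothetical.

```javascript
// Sample stand-ins for the two parsed variables.
var classData = [
  { unspsc: "43211503", glAccount: "6100", majorClass: "IT", minorClass: "HW" }
];
var itemData = [
  { item: "10001", unspsc: "43211503", description: "Widget A" },
  { item: "10002", unspsc: "99999999", description: "Widget B" }
];

// Build a lookup keyed by UNSPSC from the class/GL file.
var classByUnspsc = {};
for (var i = 0; i < classData.length; i++) {
  classByUnspsc[classData[i].unspsc] = classData[i];
}

// Walk the item file and write matched, merged records to a new variable.
var merged = [];
for (var j = 0; j < itemData.length; j++) {
  var match = classByUnspsc[itemData[j].unspsc];
  if (match) {
    merged.push({
      item: itemData[j].item,
      description: itemData[j].description,
      glAccount: match.glAccount,
      majorClass: match.majorClass,
      minorClass: match.minorClass
    });
  }
}
```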

Then, for each set of Lawson records to be updated, I used a SQL query to retrieve the existing item records and wrote the results to a new variable, which I then compared to the merged data to determine which records had to be updated.

You guessed it: with an Assign - JavaScript Expression. If the item description, vendor item, manufacturer info, etc. differed, I wrote the records that required updating to a variable.
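The compare step can be sketched the same way. The field names are assumptions, and for simplicity this sketch assumes both lists are aligned by position; the real flow could just as easily match by item number with a lookup like the merge above.

```javascript
// Stand-ins for the SQL query results and the merged file data.
var lawsonRecords = [
  { item: "10001", description: "Widget A", vendorItem: "VA-1" },
  { item: "10002", description: "Old name", vendorItem: "VB-2" }
];
var fileRecords = [
  { item: "10001", description: "Widget A", vendorItem: "VA-1" },
  { item: "10002", description: "Widget B", vendorItem: "VB-2" }
];

// Keep only the records whose Lawson values differ from the file values.
var needsUpdate = [];
for (var i = 0; i < lawsonRecords.length; i++) {
  var l = lawsonRecords[i];
  var f = fileRecords[i];
  if (l.description !== f.description || l.vendorItem !== f.vendorItem) {
    needsUpdate.push(f);
  }
}
```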

Now I had, for example, a list of items that needed updates done on IC11 in Lawson. That's where the DataIterator finally came into play: I could iterate this final output by line and then create a transaction expression to use in an AGS update.

Using the DI at this point made sense since I needed to perform an update after each item record was parsed out. Within the DI loop were an Assign node and a Lawson Transaction node.
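Inside that loop, the Assign builds the transaction expression the Lawson Transaction node sends. The sketch below only shows the flavor of an AGS-style update string; the field names (ICC-COMPANY and so on), the event code, and the sample record are all illustrative, not the real IC11 field list.

```javascript
// One record handed off by the DataIterator (hypothetical fields).
var rec = { company: "100", item: "10002", description: "Widget B" };

// Build an AGS-style change transaction for the record.
var ags = "_TKN=IC11.1&_EVT=CHG&FC=C" +
          "&ICC-COMPANY=" + rec.company +
          "&ICC-ITEM=" + rec.item +
          "&ICC-DESCRIPTION=" + encodeURIComponent(rec.description);
```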

My conclusion? If you need to perform an action on each record while looping through the data, use a DataIterator. If not, a JavaScript Expression is much more efficient.