Listing of Load IDs with Index Values


rellerbrock

Is there a way to get a listing of index values with load ids for a given day and application group?   We had an issue where duplicate files were generated and then sent to OnDemand.   I am trying to get a list so we can identify load ids of duplicate index values.  I can then script an unload of those documents.   Thanks.

jsquizz

Here is what I would do. Of course, I would fine-tune this depending on your needs. Make sure you test all of this.

1) Get the exact table name for the specific app group:

db2 => select arsag.name, arsseg.table_name from arsag, arsseg where arsag.agid=arsseg.agid and arsag.name='AGNAME'

NAME                                                         TABLE_NAME
------------------------------------------------------------ ------------------
AGNAME                                                      TBL1


2) Get a listing of rdate, plan, and doc_name from that table, using a WHERE clause with the date you choose (in the internal OnDemand date format; the arsdate utility converts a calendar date to that value).

db2 => select rdate, plan, doc_name from TBL1 where rdate = '17840'

3) Export to Excel, apply some filters to identify the duplicates, then toss them into an arsadmin unload.
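
If you want to skip the Excel step, something like the query below surfaces the duplicates directly in DB2. It is only a sketch -- it assumes rdate and plan are the index fields that make two rows "the same document" in your application group, so adjust the column list to whatever actually defines a duplicate, and test it first:

db2 => select rdate, plan, count(*) from TBL1 where rdate = '17840' group by rdate, plan having count(*) > 1

Each combination that comes back with a count above 1 was loaded more than once; selecting doc_name for those same rdate/plan values then shows which storage objects (and therefore which loads) they sit in.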

Stephen McNulty

If you are looking for duplicate 'load' files, an option could be to use the System Load folder. It provides the load ID, input file name, and input size.

rjrussel

While I don't have time at the moment to write up full instructions, here is, at a high level, what I would do.

1. Find the load ID of the load you're interested in looking at the index data for.

2. Use arsadmin retrieve to pull the index object (ex: 123FAA1)

3. Use arsadmin decompress to uncompress the object.

--> All metadata that was loaded into the database will be contained in the uncompressed data.

By following this method you can simply use supported, native OnDemand commands to perform the task before you.
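
If you want to script it, the commands look roughly like the sketch below. Treat it strictly as a sketch: 123FAA1 is the example object name from step 2, and AGNAME, the user ID, and the work directory are placeholders. The exact arsadmin flags (instance, node IDs, output options) differ between releases, so confirm them against arsadmin's usage output or the Administration Guide for your version before running anything.

# pull the storage object that holds the index data for the load (flags are illustrative)
arsadmin retrieve -u admin -p <password> -g AGNAME -d /tmp/idxwork 123FAA1

# uncompress it; the output is the index/metadata that was loaded to the database
arsadmin decompress -o /tmp/idxwork/123FAA1.out /tmp/idxwork/123FAA1

Once it is decompressed, you can grep or parse the output for the duplicate index values.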

Thanks,
RR

rellerbrock

Thanks to everyone for their input. This worked to get the duplicates removed.