Arsdoc get: best way to extract and reload


jsquizz

So I am working on a migration and I'm just looking at some possible ways to extract and reload my data.

My thought: arsdoc get with -L, reading from a parameter file with a listing of Load IDs. Do this on an app-group-by-app-group basis. We only have about 10, but a good amount of data.
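A minimal sketch of that retrieval loop, assuming a plain-text parameter file with one Load ID per line. The file name, the sample Load IDs, and the `...` connection/output flags are all placeholders, and the commands are only echoed here rather than executed; check the `arsdoc get` syntax for your CMOD release before running anything for real.

```shell
#!/bin/sh
# Print one "arsdoc get" command per Load ID read from a parameter file.
# Everything CMOD-specific is a placeholder -- verify flags against your
# release's arsdoc documentation.

gen_arsdoc_cmds() {
    parmfile=$1
    while IFS= read -r loadid; do
        [ -n "$loadid" ] || continue                  # skip blank lines
        # '...' stands in for your host/user/output flags:
        echo "arsdoc get -L \"$loadid\" ..."
    done < "$parmfile"
}

# Example parameter file with two made-up Load IDs:
printf '1234-1-0-3FAA-0-0\n5678-1-0-3FAA-0-0\n' > /tmp/loadids.txt
gen_arsdoc_cmds /tmp/loadids.txt
```

Echoing the commands first makes it easy to eyeball a dry run per app group before swapping `echo` for the real invocation.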

We have a process, a weird one, that extracts from a "short term" application group and loads into a "long term" application group. This was handled by a third party years ago, and since I joined I redid it with a few basic shell scripts. My script generates a list of dates for the prior month, loops through that plus an app group name, and does all the magic. This process works very well. I can't wrap my head around why they are going from one app group to another - I think it's for performance at the db level or something.
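The "list of dates for the prior month" step might look something like the sketch below. This assumes GNU date (as on RHEL); AIX's date lacks `-d`, so different date arithmetic would be needed there. The loop body is left as a placeholder comment since the actual arsdoc/arsload flags depend on your environment.

```shell
#!/bin/sh
# Emit every date of the previous month as YYYYMMDD (GNU date assumed).

prev_month_dates() {
    # Anchor on the 1st of the current month to avoid end-of-month skew:
    first=$(date -d "$(date +%Y-%m-01) -1 month" +%Y-%m-%d)  # 1st of last month
    days=$(date -d "$first +1 month -1 day" +%d)             # days in that month
    i=1
    while [ "$i" -le "$days" ]; do
        date -d "$first +$((i-1)) day" +%Y%m%d
        i=$((i+1))
    done
}

# One pass per date, per app group:
for d in $(prev_month_dates); do
    : # extract for "$d" from the short-term group, arsload into the long-term one
done
```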

I figure there are other ways too - for example, extracting a month of data per file. So I guess these are the three options I'm trying to weigh:

1) retrieve an entire load, reload with arsload
2) retrieve an entire day, reload with arsload
3) retrieve an entire month, reload with arsload.

I'm leaning towards option 1. I also think that would be easier to recover from if there's some kind of issue.

What are everyone's thoughts?
#CMOD #DB2 #AFP2PDF #TSM #AIX #RHEL #AWS #AZURE #GCP #EVERYTHING

J9CMOD

For us (and our high retentions) we would do #3 or even by year, but that's a matter of opinion.
I'd love to see the code you use, as I am currently setting up a similar process and you could save me some time.

Justin Derrick

Just to give you some insight into the short-term-to-long-term process...  Some systems can only send one (or very few) documents to CMOD at a time.  This is good for those apps, but bad for CMOD.  Content Manager OnDemand's compression works best with big batches of data that's very similar. 

There's also a lot of overhead in CMOD (and TSM if you're using it) for each load into CMOD -- so having a 'temporary' AG where you load lots of small individual files, then exporting them in a batch once a week/month/quarter is a good strategy for optimizing particular AGs. 

-JD.
Call:  +1-866-533-7742  or  eMail:  jd@justinderrick.com
IBM CMOD Wiki:  https://CMOD.wiki/
FREE IBM CMOD Webinars:  https://CMOD.Training/
IBM CMOD Professional Services: https://CMOD.cloud

Interests: #AIX #Linux #Multiplatforms #DB2 #TSM #SP #Performance #Security #Audits #Customizing #Availability #HA #DR

Lars Bencze

There's also a #4 possibility, depending on the format of your data (see especially JD's comment above).
If you keep the original input data files you receive, for example in a (compressed) folder on disk for a month, you can then re-load them into the new long-term app group when the time comes, and simply unload those Load IDs from the first/short-term application group.

This method does take up extra disk space, but disk is really cheap nowadays. It would save you the hassle of exporting, reloading and unloading in a batch.
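A rough sketch of that keep-the-originals approach, with entirely made-up paths and file names: stage each day's input files as they arrive, roll them into one compressed per-month archive, and later extract and re-load from it. Only the archiving part is shown runnable; the arsload and unload steps are placeholder comments since their flags are release- and environment-specific.

```shell
#!/bin/sh
# Stash original input files in a compressed per-month bundle for later re-load.
# All paths/names are illustrative.

STAGE=/tmp/cmod_stage            # where daily input files land (placeholder)
ARCHIVE=/tmp/cmod_archive        # long-term holding area (placeholder)

mkdir -p "$STAGE" "$ARCHIVE"
printf 'doc one\n' > "$STAGE/input_20240101.dat"     # fake input files
printf 'doc two\n' > "$STAGE/input_20240102.dat"

month=202401
tar -czf "$ARCHIVE/$month.tar.gz" -C "$STAGE" .      # one compressed bundle per month

# Later, when the time comes:
#   tar -xzf "$ARCHIVE/$month.tar.gz" -C /some/workdir
#   arsload each file into the long-term app group,
#   then unload the corresponding Load IDs from the short-term group.
```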
OnDemand for MP expert. #Multiplatforms #Admin #Scripts #Performance #Support #Architecture #PDFIndexing #TSM/SP #DB2 #CustomSolutions #Integration #UserExits #Migrations #Workflow #ECM #Cloud #ODApi

Justin Derrick

Lars has a good idea, but you'd need to do some sort of consolidation step in between - the point is to load a single, very large file to get the benefits of better compression and lower overhead.
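For line-data-style input that can simply be concatenated, that consolidation step could be as plain as the sketch below (indexed formats like AFP or PDF would need format-aware merging instead). Paths, file names, and the commented arsload flags are placeholders.

```shell
#!/bin/sh
# Merge many small input files into one large batch file so a single load
# gets the compression benefit. Names are illustrative.

WORK=/tmp/consolidate
mkdir -p "$WORK"
printf 'stmt A\n' > "$WORK/small_001.txt"            # fake small input files
printf 'stmt B\n' > "$WORK/small_002.txt"
printf 'stmt C\n' > "$WORK/small_003.txt"

cat "$WORK"/small_*.txt > "$WORK/batch.txt"          # one big batch file
# arsload ... "$WORK/batch.txt"                      # placeholder flags
```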

-JD.