Hi:
I have a large number of individual files, each with its associated index file -- i.e. one PDF file plus one txt file (the index for that PDF), times 1000. Same for TIFF images.
What I need is to pack all these files together: instead of trying to load 1000 single TIFF files, I want to pack them into one large file with an index file containing the classic offset values, so OnDemand processes one large file instead of 1000 little ones.
Does anyone have a solution or suggestion I can use to achieve this? The bulk of the files are TIFF.
Thank you.
Roberto.
Hi Roberto. :)
There are two ways to do this...
1) Write a program that appends the index files together. You don't need to worry about the offsets and lengths -- arsload will concatenate the files for you. You just need a long index file that contains pointers to all 1000 files.
2) Load all your little files into a temporary application group, then use arsdoc get to extract ALL the documents into a single index file and output file, load that resulting file into your permanent application group, and then delete the contents of the temporary one. If you're doing a historical conversion, I'd load up a month's worth of data at a time.
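For option 2, the round trip might look something like the command below. This is a sketch only: the exact arsdoc flags vary between CMOD versions, and the instance, user, folder, and query values are placeholders, so check the arsdoc reference for your release before running anything.

```shell
# Placeholder values throughout -- adjust for your environment.
# -i : SQL-style query selecting the documents to pull
# -f : folder to search
# -o : base name for the output file(s)
# -a : retrieve the document data, -g : write a generic index file
arsdoc get -h MYINSTANCE -u admin -p password \
    -i "WHERE load_date BETWEEN '2010-04-01' AND '2010-04-30'" \
    -f "Temp Folder" -o april_batch -a -g
```

The resulting data file plus its generic index file can then be loaded into the permanent application group with arsload.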
Hope this helps!
-JD.
Quote from: Justin Derrick on May 18, 2010, 06:17:43 AM
1) Write a program that appends the index files together. You don't need to worry about the offsets and lengths -- arsload will concatenate the files for you. You just need a long index file that contains pointers to all 1000 files.
Hi Roberto,
Just a tip for that approach:
Take the CODEPAGE:XXX line from your first index file and save it at the top of the new index file, then append all the indexes together after that CODEPAGE:XXX line -- but remove the CODEPAGE line from every index before appending.
You must have ONLY ONE CODEPAGE line in an index file, and it must be at the beginning of the file; if you blindly concatenate all the files together, it won't work.
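The steps above can be sketched in a few lines of Python. The file pattern and output name are illustrative, not from the thread; the script just keeps the first CODEPAGE: line it sees and drops the rest:

```python
import glob

def merge_index_files(pattern, out_path):
    """Merge generic index files matching `pattern` into `out_path`,
    keeping a single CODEPAGE: line (from the first file) at the top."""
    codepage = None
    body = []
    for path in sorted(glob.glob(pattern)):
        with open(path, "r") as f:
            for line in f:
                if line.startswith("CODEPAGE:"):
                    if codepage is None:
                        codepage = line  # keep only the first occurrence
                else:
                    body.append(line)
    with open(out_path, "w") as out:
        if codepage:
            out.write(codepage)
        out.writelines(body)

# Example: merge_index_files("/data/batch/*.ind", "/data/batch/merged.ind")
```

Sorting the paths keeps the merged index in a predictable order, which makes it easier to verify against the matching data file.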
Hope it was clear!
Cheers,
Alessandro