Unable to load big files. File=arsadmp.c


Gobi21

Hi All,

My OnDemand load processing is failing. Here is the error message. It looks like the issue happens only for big files. We are using version 8.4.1.

Strangely, it is not able to write anything to the system log either.

arsload: 09/12/14 15:30:10 -- Loading started, 159359663 bytes to process
OnDemand Load Id = >xxx-xx-xxxx-xx<
An error occurred.  Contact your system administrator and/or consult the System Log.  File=arsadmp.c, Line=1574
Failed while attempting to load the database
Loaded 0 rows into the database
arsload: 09/12/14 15:30:10 Loading failed
arsload: Processing failed for file >xxx.20140912_1530<
arsload: 09/12/14 15:30:10 -- Unloading started
OnDemand UnLoad Successful - LoadId(xxx-xx-xxxx-xx) Rows Deleted(0)

09/12/14 15:30:11 -- Unloading of data was successful
An error occurred.  Contact your system administrator and/or consult the System Log.  File=arsadmin.c, Line=2293

arsload: Unable to log load information
arsload: Processing has stopped.  The remaining files will NOT be processed.

Can someone give me a clue to resolve this issue? On the same day, I could see that some small loads with fewer than 100 records succeeded.

rick

Could be that the index contains characters the database does not support. Did you try with other large samples?

Ed_Arnold

What's the operating system for the server?  AIX?  Windows?  z/OS?  Something else?

Ed
#zOS #ODF

Gobi21

Yes, I tried with other large feeds. It is inconsistent: some of the files load fine and some throw this error.

This is Linux x86_64.

Thanks for having a look at this.

Justin Derrick

You can check the DB2 diagnostic log for more information about the failure. You should find it inside your database instance owner's home directory (usually archive or odadmin) under sqllib/db2dump. Review those logs around the time of the failure to get a better understanding of why the database choked on the load.
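For anyone who lands here later, a quick sketch of that check. The default log path below assumes the instance owner is `archive` and is just an example; adjust it for your system.

```shell
# Sketch: scan db2diag.log for Severe/Error entries around a failed load.
# The default path assumes instance owner "archive" -- substitute your own.
scan_db2diag() {
    log=${1:-/home/archive/sqllib/db2dump/db2diag.log}
    # Show the last 20 serious entries; the db2diag tool that ships with
    # DB2 can filter by level too (db2diag -level "Severe,Error").
    grep -n -E 'LEVEL: (Severe|Error)' "$log" | tail -n 20
}
# Usage: scan_db2diag          (default path)
#        scan_db2diag /path/to/db2diag.log
```

Entries around the load timestamp usually name the failing SQL code, which narrows things down quickly.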

-JD.
Call:  +1-866-533-7742  or  eMail:  jd@justinderrick.com
IBM CMOD Wiki:  https://CMOD.wiki/
FREE IBM CMOD Webinars:  https://CMOD.Training/
IBM CMOD Professional Services: https://CMOD.cloud

Interests: #AIX #Linux #Multiplatforms #DB2 #TSM #SP #Performance #Security #Audits #Customizing #Availability #HA #DR

Gobi21

Thanks JD.

We are using Oracle. Where can I find the related logs?


-Gobi

jeffs42885

Have your Oracle DBAs look at the transaction (redo) logs. We came across this a few weeks ago when trying to load large files. CMOD should be able to handle a 159 MB file, but in our case the files had 200-300k rows and the transaction logs were filling extremely fast, causing all kinds of issues with CMOD and P8. This is why I don't like Oracle and prefer DB2 with CMOD.
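To see whether redo logs really are cycling that fast during a load, a sketch of the kind of query a DBA could run against V$LOG_HISTORY (the sqlplus invocation and connection are placeholders, not a recommendation for your site):

```shell
# Sketch: count redo log switches per hour. A spike during the load window
# matches the "transaction logs filling extremely fast" symptom above.
LOG_SWITCH_SQL="
SELECT TO_CHAR(first_time, 'YYYY-MM-DD HH24') AS hour,
       COUNT(*)                               AS switches
FROM   v\$log_history
GROUP  BY TO_CHAR(first_time, 'YYYY-MM-DD HH24')
ORDER  BY 1"

# Run it as a DBA, e.g.:
#   echo "$LOG_SWITCH_SQL;" | sqlplus -s / as sysdba
echo "$LOG_SWITCH_SQL"
```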

Gobi21

Thanks everyone for your suggestions.

We had a space issue in a tablespace, which we found from the DB activity log. Once the space was increased, we were fine.
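For anyone hitting the same error later, a sketch of a free-space check that shows this condition up front (assumes access to the DBA views; the sqlplus line is illustrative only):

```shell
# Sketch: free megabytes per tablespace, smallest first, so the one about
# to fill up during a large load shows at the top.
FREE_SPACE_SQL="
SELECT tablespace_name,
       ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
FROM   dba_free_space
GROUP  BY tablespace_name
ORDER  BY free_mb"

# Run it as a DBA, e.g.:
#   echo "$FREE_SPACE_SQL;" | sqlplus -s / as sysdba
echo "$FREE_SPACE_SQL"
```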