Spring Batch: Step Partitioning OutOfMemory error


I have a step-partitioned Spring Batch job that uses a JdbcPagingItemReader to read data in chunks and process it.

The batch runs fine with 3000 records. If the load is increased to 6000 records, it fails with an OutOfMemoryError:

exit failureExceptions: [java.lang.OutOfMemoryError, org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step, java.lang.OutOfMemoryError, java.lang.OutOfMemoryError, java.lang.OutOfMemoryError]

The average load of the batch is 3000 records, with a maximum of 6000 in exceptional cases, so we are testing the batch with 6000 records.

Current activity:

1) Taking a heap dump to analyze the issue further.

2) Looking at other options, such as increasing the heap size. The existing setup is 128 MB min and 512 MB max.
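Both activities can be driven from the launch command; a minimal sketch, assuming the job is started from a shell script (the script variable and jar name are illustrative, not from our actual setup):

```shell
# Hypothetical launch fragment: raise the heap from the current 128m/512m
# and have the JVM write a heap dump automatically when an OOM occurs.
JAVA_OPTS="-Xms256m -Xmx1024m \
  -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/tmp/batch-oom.hprof"

java $JAVA_OPTS -jar partition-job.jar
```

The `-XX:+HeapDumpOnOutOfMemoryError` flag avoids having to trigger a dump manually while the failure is reproduced.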

The job XML file:

<import resource="../config/batch-context.xml" />
<import resource="../config/database.xml" />

<job id="partitionJob" xmlns="http://www.springframework.org/schema/batch">
    <step id="masterStep" parent="abstractPartitionerStagedStep">
        <partition step="slave" partitioner="rangePartitioner">
            <handler grid-size="5" task-executor="taskExecutor" />
        </partition>
    </step>
</job>

<bean id="abstractPartitionerStagedStep" abstract="true">
    <property name="listeners">
        <list>
            <ref bean="updateListener" />
        </list>
    </property>
</bean>

<bean id="updateListener" class="com.test.springbatch.model.UpdateFileCopyStatus" />

<step id="slave" xmlns="http://www.springframework.org/schema/batch">
    <tasklet>
        <chunk reader="pagingItemReader" writer="flatFileItemWriter"
            processor="itemProcessor" commit-interval="1" retry-limit="0"
            skip-limit="100">
            <skippable-exception-classes>
                <include class="java.lang.Exception" />
            </skippable-exception-classes>
        </chunk>
    </tasklet>
</step>

<bean id="rangePartitioner" class="com.test.springbatch.partition.RangePartitioner">
    <property name="dataSource" ref="dataSource" />
</bean>

<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="5" />
    <property name="maxPoolSize" value="5" />
    <property name="queueCapacity" value="100" />
    <property name="allowCoreThreadTimeOut" value="true" />
    <property name="keepAliveSeconds" value="60" />
</bean>

<bean id="itemProcessor" class="com.test.springbatch.processor.CaseProcessor" scope="step">
    <property name="threadName" value="#{stepExecutionContext[name]}" />
</bean>

<bean id="pagingItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
    <property name="dataSource" ref="dataSource" />
    <property name="saveState" value="false" />
    <property name="queryProvider">
        <bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
            <property name="dataSource" ref="dataSource" />
            <property name="selectClause" value="SELECT *" />
            <property name="fromClause" value="FROM ( SELECT case_num, case_stts_cd, updt_ts, sbmt_ofc_cd, sbmt_ofc_num, dstr_chnl_cd, aprv_ofc_cd, aprv_ofc_num, sbmt_typ_cd, ROW_NUMBER() OVER(ORDER BY case_num) AS rownumber FROM tsmcase WHERE proc_ind = 'N' ) data" />
            <property name="whereClause" value="WHERE rownumber BETWEEN :fromRow AND :toRow" />
            <property name="sortKey" value="case_num" />
        </bean>
    </property>
    <property name="parameterValues">
        <map>
            <entry key="fromRow" value="#{stepExecutionContext[fromRow]}" />
            <entry key="toRow" value="#{stepExecutionContext[toRow]}" />
        </map>
    </property>
    <property name="pageSize" value="100" />
    <property name="rowMapper">
        <bean class="com.test.springbatch.model.CaseRowMapper" />
    </property>
</bean>

<bean id="flatFileItemWriter" class="com.test.springbatch.writer.FnWriter" scope="step" />
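For context on why the total record count should not matter to the readers: each partition's JdbcPagingItemReader buffers at most one page at a time, so the rows held by the readers are bounded by the configuration above rather than by the 3000 vs. 6000 load. A back-of-envelope sketch (the class name is illustrative):

```java
// Rough upper bound on rows buffered by the readers at any moment,
// using the values from the XML configuration above.
public class PartitionMemoryEstimate {
    public static void main(String[] args) {
        int gridSize = 5;    // <handler grid-size="5" ... />
        int pageSize = 100;  // <property name="pageSize" value="100" />

        // Each of the 5 partition readers holds at most one 100-row page,
        // so the readers keep at most gridSize * pageSize mapped rows alive.
        int rowsInFlight = gridSize * pageSize;
        System.out.println("max rows buffered by readers: " + rowsInFlight);
    }
}
```

This bound (500 rows with a commit-interval of 1) is independent of the total load, which is part of why the 6000-record failure is surprising to us.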

My questions:

  1. The job runs fine with 3000 records, so why does it throw an OOM error with 6000 records even though it uses chunk-oriented processing? Does JdbcPagingItemReader cache records internally?
  2. Does the job configuration look fine? Is there scope for improvement in it?