Spring Batch with task executor does not run in parallel
I have a Spring Batch config as follows:
<beans>
    <bean id="taskExecutor"
          class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
        <property name="corePoolSize" value="25"/>
    </bean>

    <batch:job id="springJobBatch1">
        <batch:step id="step1">
            <batch:tasklet task-executor="taskExecutor">
                <batch:chunk reader="reader1" writer="writer1" commit-interval="1000"/>
            </batch:tasklet>
        </batch:step>
        <batch:listeners>
            <batch:listener ref="listener1"/>
        </batch:listeners>
    </batch:job>

    <bean id="reader1"
          class="org.springframework.batch.item.database.JdbcPagingItemReader"
          scope="step">
        <property name="dataSource" ref="testDsRef1"/>
        <property name="queryProvider">
            <bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
                <property name="dataSource" ref="testDsRef1"/>
                <property name="selectClause" value="select..."/>
                <property name="fromClause" value="from..."/>
                <property name="whereClause" value="where..."/>
                <property name="sortKey" value="order..."/>
            </bean>
        </property>
        <property name="pageSize" value="1000"/>
        <property name="saveState" value="false"/>
        <property name="rowMapper" ref="testMapper"/>
    </bean>

    <bean id="testRunJob1" class="package1.anyclass.thread1">
        <property name="cache" ref="testDsRef2"/>
        <property name="jobLauncher" ref="jobLauncher"/>
        <property name="reader" ref="springJobBatch1"/>
    </bean>

    <bean id="jobLauncher"
          class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
    </bean>

    <bean id="jobRepository"
          class="org.springframework.batch.core.repository.support.SimpleJobRepository">
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapJobInstanceDao"/>
        </constructor-arg>
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapJobExecutionDao"/>
        </constructor-arg>
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapStepExecutionDao"/>
        </constructor-arg>
        <constructor-arg>
            <bean class="org.springframework.batch.core.repository.dao.MapExecutionContextDao"/>
        </constructor-arg>
    </bean>

    <bean id="thread1" class="java.lang.Thread">
        <constructor-arg index="0" type="java.lang.Runnable" ref="testRunJob1"/>
    </bean>

    <bean id="thread2" class="java.lang.Thread">
        <constructor-arg index="0" type="java.lang.Runnable" ref="testRunJob2"/>
    </bean>

    <bean id="theChosenOneThread" init-method="initMethod">
        <property name="thread1" ref="thread1"/>
        <property name="thread2" ref="thread2"/>
    </bean>
</beans>

public class TheChosenOneThread {
    Thread t1;
    Thread t2;

    public void initMethod() {
        ExecutorService executor = Executors.newFixedThreadPool(numberOfThreads);
        executor.execute(t1);
        executor.execute(t2);
        executor.shutdown();
        try {
            executor.awaitTermination(Long.MAX_VALUE, TimeUnit.NANOSECONDS);
        } catch (Exception e) {
            System.out.println("cache restart timed-out.");
        }
    }
}
Expected:
The requirement is to make sure thread1 and thread2 execute in parallel, and the server should start only after both complete.

Actual:
When the server starts, the beans get initialized. Along with that, theChosenOneThread's initMethod gets executed. Threads t1 and t2 get stuck along with the main server thread, and the server never starts. Please suggest a simpler way if possible.
You shouldn't be manually creating threads and calling an executor. Set the taskExecutor of the SimpleJobLauncher, and call run() for the 2 jobs. They'll run in parallel because the task executor has more than one thread. (And you can remove the thread1 and thread2 beans.)
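A minimal sketch of that wiring, in the same XML style as the question (SimpleJobLauncher and its taskExecutor property come from Spring Batch; the bean id and pool size here are illustrative):

```xml
<bean id="launcherTaskExecutor"
      class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="2"/>
</bean>

<bean id="jobLauncher"
      class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
    <!-- With an async task executor, run() returns immediately, so
         two back-to-back launches execute in parallel. -->
    <property name="taskExecutor" ref="launcherTaskExecutor"/>
</bean>
```

With this in place, launching both jobs is just two run() calls on the launcher; each returns a JobExecution handle.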
To check for completion, either periodically check the JobExecutions returned by run(), or use a JobExecutionListener.
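The "more than one thread" point, and waiting on each task's result rather than blocking in awaitTermination, can be demonstrated with plain java.util.concurrent. This is a self-contained sketch, independent of Spring Batch; the class and method names are made up for illustration, and a ~500 ms sleep stands in for a batch job:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelDemo {

    // Simulates a ~500 ms unit of work (stand-in for a batch job).
    static void work() {
        try {
            Thread.sleep(500);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Submits two tasks to a pool with the given number of threads and
    // returns the elapsed wall-clock time in milliseconds.
    static long runTwoTasks(int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        long start = System.nanoTime();
        Future<?> f1 = pool.submit(ParallelDemo::work);
        Future<?> f2 = pool.submit(ParallelDemo::work);
        f1.get(); // wait on each task's result instead of
        f2.get(); // blocking the caller in awaitTermination(...)
        pool.shutdown();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws Exception {
        // With 2 threads the tasks overlap, so elapsed is ~500 ms;
        // with 1 thread they serialize to ~1000 ms.
        System.out.println("2 threads: " + runTwoTasks(2) + " ms");
        System.out.println("1 thread:  " + runTwoTasks(1) + " ms");
    }
}
```

The same idea carries over to the launcher: give it a task executor with more than one thread and the two job launches overlap, while the returned JobExecution handles let you check completion without blocking server startup.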