Multiple Spring Batch jobs in one Grails application
I have a Grails app that uses the spring-batch plugin to control batch loading into the database. The app has two batch configs; let's call them OneBatchConfig.groovy and TwoBatchConfig.groovy. Both read records from a file, perform some processing, and save the results in the database. They do very similar things, just with slightly different processing. For example:
    import org.springframework.batch.item.support.CompositeItemProcessor

    beans {
        xmlns batch: "http://www.springframework.org/schema/batch"

        batch.job(id: 'batchJob1') {
            batch.step(id: 'loadFile') {
                batch.tasklet {
                    batch.chunk(reader: 'fileReader', processor: 'compositeItemProcessor', writer: 'dbWriter')
                }
            }
        }

        // fileReader and dbWriter omitted for now
        compositeItemProcessor(CompositeItemProcessor) {
            delegates = [ref('filterInvalidNames'), ref('filterInvalidDepartments')]
        }
        // filter* processors omitted
    }
    import org.springframework.batch.item.support.CompositeItemProcessor

    beans {
        xmlns batch: "http://www.springframework.org/schema/batch"

        batch.job(id: 'batchJob2') {
            batch.step(id: 'loadFile') {
                batch.tasklet {
                    batch.chunk(reader: 'fileReader', processor: 'compositeItemProcessor', writer: 'dbWriter')
                }
            }
        }

        // fileReader and dbWriter omitted for now
        compositeItemProcessor(CompositeItemProcessor) {
            delegates = [ref('filterInvalidNames'), ref('filterInvalidCities')]
        }
        // filter* processors omitted
    }
The fileReader for each batch job is different (they read separate files, for example), but the most obvious difference is the processing each job applies. The first job excludes some records based on names and departments, while the other excludes some records based on names and cities. These filtering steps are also configurable: the first job might filter records with name = 'john' while the other filters records with name = 'steve'. That configuration lives in each batch job's definition, not in code.
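For illustration only, here is a guess at what one of those configurable filter processors might look like; the NameFilterProcessor class and its nameToFilter property are not from the original post. In Spring Batch, an ItemProcessor that returns null causes the record to be dropped:

    import org.springframework.batch.item.ItemProcessor

    // Hypothetical configurable filter: drops any record whose name matches the
    // value wired into the bean definition ('john' for one job, 'steve' for the other).
    class NameFilterProcessor implements ItemProcessor<Map, Map> {
        String nameToFilter

        Map process(Map record) {
            record.name == nameToFilter ? null : record   // null means "filter this record out"
        }
    }

Each job would then wire its own value, e.g. filterInvalidNames(NameFilterProcessor) { nameToFilter = 'john' }.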
In my testing it appears that all the beans defined in these *BatchConfig.groovy files are part of the global namespace, and that includes the step names. It means that having a step named loadFile in both OneBatchConfig.groovy and TwoBatchConfig.groovy is a bad idea, and only one of them will actually have that step once the application starts. It also means that the filterInvalidNames bean can exist only once, and therefore the same filtering will be applied to both jobs (one of them silently).
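To make the collision concrete, here is a sketch (reusing the hypothetical NameFilterProcessor from above) of what happens when both config files define a bean with the same name; all the definitions end up in one application context, so whichever is processed last silently wins:

    // OneBatchConfig.groovy
    filterInvalidNames(NameFilterProcessor) { nameToFilter = 'john' }

    // TwoBatchConfig.groovy -- the same bean name in the same global context,
    // so this definition replaces the one above and both jobs end up filtering 'steve'
    filterInvalidNames(NameFilterProcessor) { nameToFilter = 'steve' }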
So my questions: First of all, is this understanding correct? It makes some sense, given how other beans, such as dataSource, are available without doing anything. But if so, then with a large number of *BatchConfig.groovy files inside the same Grails app, is there any solution other than using unique names for every step and bean across all the *BatchConfig.groovy files? For example, is there a way to apply some kind of namespace to the bean names inside each batch.job block? Or am I just missing something in how I configure each job that would isolate it from the other jobs?

Answer: A job and its components are simply Spring beans which are loaded into the global Grails application context, so you have to give your components globally unique names. I prefix every step with the name of its job, although that can make for some long bean names. If you were using spring-batch directly you could use an AutomaticJobRegistrar to get around this; the Grails plugin does not currently do that. To track this request, I have commented on your GitHub issue.
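A minimal sketch of the prefixing approach described in the answer, assuming the same bean-builder DSL as in the question; the batchJob1-prefixed names are only examples:

    beans {
        xmlns batch: "http://www.springframework.org/schema/batch"

        batch.job(id: 'batchJob1') {
            batch.step(id: 'batchJob1LoadFile') {      // the step name carries the job prefix
                batch.tasklet {
                    batch.chunk(reader: 'batchJob1FileReader',
                                processor: 'batchJob1CompositeItemProcessor',
                                writer: 'batchJob1DbWriter')
                }
            }
        }

        // every supporting bean is prefixed too, so TwoBatchConfig.groovy can
        // define its own batchJob2* versions without clashing
        batchJob1CompositeItemProcessor(CompositeItemProcessor) {
            delegates = [ref('batchJob1FilterInvalidNames'), ref('batchJob1FilterInvalidDepartments')]
        }
    }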