The JBatch example in my last post was fine for a prototype, but when you have to write many batch jobs you would rather code less and have more ready-to-use components. This can be done very easily in only a few steps by putting these reusable components into a batch.core project which we can later declare as a dependency in our batch projects.

Dependencies

As this is about using JBatch as our batch framework of choice for every batch project, we can put all those Maven dependencies into the batch.core project. Now the only direct dependency we need to specify in our batch project is batch.core, which leaves us with a very clean pom.xml file.

Note: You still need to specify how the application will be built in the build section. If this becomes more complex we could create our own custom packaging type batch in a separate Maven plugin project and just specify that plugin in our build section. But that would be overkill at the moment.

Database connection

As we are primarily working with our beloved DB2 on IBM i, why not have a database connection readily available for usage?! We just have to create a Jdbi instance provider which takes our database connection configuration via MicroProfile Config, creates a DataSource instance and feeds that to the Jdbi class, which returns a fresh Jdbi instance ready to be used.

Note: Keep in mind that the Jdbi instance itself is not a database connection. We get a database connection by retrieving a Handle instance from the Jdbi instance.

The code may look like this.

@ApplicationScoped
public class IbmiDataSourceProvider {

    @Produces
    @Named("ibmiDataSource")
    public DataSource provideDataSource() {
        Config config = ConfigProvider.getConfig();
        String host = config.getValue("ibmi.host", String.class);
        String user = config.getValue("ibmi.user", String.class);
        String password = config.getValue("ibmi.password", String.class);
        String libraries = config.getValue("ibmi.libraries", String.class);

        AS400JDBCDataSource ds = new AS400JDBCDataSource(host, user, password);
        ds.setLibraries(libraries);
        return ds;
    }
}

This code only supports a single database connection configuration (which is my use case) but can be extended to support multiple database connections to different machines.
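
For instance, a second machine could get its own producer with a different qualifier and its own configuration prefix. The following sketch only illustrates the pattern; the ibmi2. prefix and the qualifier name are made up:

// hypothetical second provider for another machine; the "ibmi2." prefix and
// the qualifier name are assumptions, not part of batch.core
@ApplicationScoped
public class SecondIbmiDataSourceProvider {

    @Produces
    @Named("secondIbmiDataSource")
    public DataSource provideDataSource() {
        Config config = ConfigProvider.getConfig();

        AS400JDBCDataSource ds = new AS400JDBCDataSource(
                config.getValue("ibmi2.host", String.class),
                config.getValue("ibmi2.user", String.class),
                config.getValue("ibmi2.password", String.class));
        ds.setLibraries(config.getValue("ibmi2.libraries", String.class));
        return ds;
    }
}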

Note: This will be even more compact when SmallRye Config switches to the jakarta namespace and we can use CDI to access the configuration.
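
Once that is the case, the producer could be written roughly like this, with the configuration values injected via MicroProfile Config's @ConfigProperty instead of being looked up programmatically. This is just a sketch of that future variant, not the current code:

// sketch: configuration injected via CDI and @ConfigProperty
@ApplicationScoped
public class IbmiDataSourceProvider {

    @Inject
    @ConfigProperty(name = "ibmi.host")
    String host;

    @Inject
    @ConfigProperty(name = "ibmi.user")
    String user;

    @Inject
    @ConfigProperty(name = "ibmi.password")
    String password;

    @Inject
    @ConfigProperty(name = "ibmi.libraries")
    String libraries;

    @Produces
    @Named("ibmiDataSource")
    public DataSource provideDataSource() {
        AS400JDBCDataSource ds = new AS400JDBCDataSource(host, user, password);
        ds.setLibraries(libraries);
        return ds;
    }
}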

Note: Helidon has an extension which builds an injectable DataSource instance purely from configuration. Very slick and developer friendly! :)

@ApplicationScoped
public class IbmiDatabaseProvider {

    @Inject
    @Named("ibmiDataSource")
    DataSource dataSource;

    private Jdbi jdbi;

    @PostConstruct
    public void postConstruct() {
        jdbi = Jdbi.create(dataSource);
        jdbi.installPlugin(new SqlObjectPlugin());
    }

    @Produces
    public Jdbi provideJdbi() {
        return jdbi;
    }
}
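
With these two producers in place, any bean can inject the Jdbi instance and borrow a Handle, and with it a real database connection, just for the duration of a unit of work. A minimal, hypothetical consumer (class, table and column names are made up) might look like this:

// hypothetical consumer: injects the produced Jdbi instance and borrows a
// Handle (i.e. a database connection) only for the duration of the query
@Dependent
public class CustomerCounter {

    @Inject
    Jdbi jdbi;

    public int countCustomers() {
        return jdbi.withHandle(handle ->
                handle.createQuery("SELECT COUNT(*) FROM CUSTOMER")
                        .mapTo(Integer.class)
                        .one());
    }
}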

Utility Classes

The following utility classes will also make it into our new batch.core project.

  • BatchExecutorServiceProvider
  • ConsoleWriter
  • EndJobListener
  • PassThroughItemProcessor (see the sketch after this list)
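
To give an impression of how small these helpers are: a pass-through item processor is essentially just a JSR-352 ItemProcessor that returns its input unchanged. A sketch, not necessarily identical to the class in batch.core, could look like this:

// sketch of a pass-through processor: hands every item on unchanged so a
// chunk step can be wired up without any real processing logic
@Dependent
@Named("passThroughItemProcessor")
public class PassThroughItemProcessor implements ItemProcessor {

    @Override
    public Object processItem(Object item) throws Exception {
        return item;
    }
}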

Application Starter

The classes Main and BatchStarter can also be moved to the batch.core project as they have no project-specific code. Main is our entry point for the application and needs to be specified on application start like this:

java -cp batch.project.jar:libs/* batch.core.Main start batchJobId
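
For reference, the heart of such an entry point is just the standard JSR-352 JobOperator. A stripped-down sketch, which ignores CDI bootstrapping, argument validation and waiting for the job to finish, and which is not necessarily identical to the real Main, might look like this:

// stripped-down sketch of a batch entry point: "start <jobId>" submits the
// job XML with the given id through the standard JobOperator
public class Main {

    public static void main(String[] args) {
        if (args.length == 2 && "start".equals(args[0])) {
            JobOperator jobOperator = BatchRuntime.getJobOperator();
            long executionId = jobOperator.start(args[1], new Properties());
            System.out.println("Started job execution " + executionId);
        }
    }
}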

The Rest

This just leaves us with the batch-project-specific classes like the implementations of ItemReader, ItemProcessor and ItemWriter, and database access classes like Jdbi DAO interfaces, mappers and data models.
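
As a rough idea of the database access part: a Jdbi SqlObject DAO is little more than annotated SQL. A hypothetical example, with made-up table and column names, could look like this:

// hypothetical SqlObject DAO: Jdbi generates the implementation at runtime,
// e.g. via jdbi.onDemand(CustomerNameDao.class)
public interface CustomerNameDao {

    @SqlQuery("SELECT NAME FROM CUSTOMER ORDER BY NAME")
    List<String> findAllNames();
}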

For the example project that is just 4 classes with a total of 152 lines of code. We just cut our lines of code down to a third of the original code base. I think that is not a bad result.

jbatch-example/src/main/java/rpgnextgen/batch/data$ find . -name '*.java' | xargs wc -l
36 ./Customer.java
25 ./CustomerMapper.java
17 ./CustomerDao.java
74 ./CustomerReader.java
152 total

And when it comes to configuration … as we are using MicroProfile Config we don’t need to configure anything in the batch project itself, as most configuration is done in the batch.core project. We just need to override the ibmi.user and ibmi.password properties via environment variables on the target system (MicroProfile Config maps ibmi.user to the environment variable IBMI_USER). Very easy to do.

Note: We probably could even do without any configuration at all because the IBM Toolbox for Java supports using the user of the current job. AFAIK you would need to specify localhost for the server and *CURRENT for the user. But for some reason that didn’t work. If anybody has come up with a solution for this I would be very happy to get a message on how to do it.

Happy simplifying!

Mihael