When the IBM i community talks about programming languages on the IBM i, it is mostly about RPG and CL. Sometimes C also trickles into the conversation, but mostly that’s about it. Most people would not give Java a second thought on the IBM i, even though many IBM products are based on Java and run very well on the i.

Java frameworks and libraries are very mature. And with organizations like Eclipse and Apache you get really high quality software.

What may hinder the usage of Java on IBM i is the release cycle of the Java VM and runtime from IBM. IBM releases “only” Long Term Support (LTS) releases of Java for IBM i. We had Java 11 on IBM i, and it took a long time until IBM released Java 17. This really hinders adoption, as the major players are dashing forward with ever shorter release cycles and ever more current versions of dependencies in their requirements.

The Helidon microservice framework is a good match when it comes to releases and requirements: you can use Helidon version 2, which is supported by Oracle, and run it with Java 11 on older IBM i versions. And since IBM released Java 17 for IBM i some time ago, Helidon version 3 can now also run on IBM i, which brings some improvements concerning messaging: with Helidon 3, Oracle jumped from MicroProfile Messaging 1.0 to MicroProfile Messaging 3.0.

And MicroProfile Messaging is the key component of this post as it makes it very easy to receive data from message queuing systems like ActiveMQ (Classic) or Artemis.

Could this have been possible earlier with Java on IBM i?

Yes, definitely! But with the advent of microservices and the corresponding specs and their Java implementations, it is now much easier to do.

For the examples in this post I will use the Helidon framework version 3.

Receiving Data

Originally the RPG program sent the data as one big blob of bytes. As long as we stay in our RPG bubble this would have worked. Would it also work for Java? Yes, but it would take some extra steps. With the JTOpen library we could define a record format (see RFML, the Record class, the RecordFormat class, and the RecordFormatDocument class) and feed the data to a Record or RecordFormatDocument instance. Then we could extract the individual pieces of data we need from that instance.
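A minimal sketch of that byte-blob approach with JTOpen could look like the following. The record layout (an item name and a packed decimal price), the field names and rawBytes are made up for illustration, and exception handling is omitted; the classes come from com.ibm.as400.access.

// describe the fixed-format record the RPG program sends (layout is an assumption)
RecordFormat format = new RecordFormat("AUCTION");
format.addFieldDescription(new CharacterFieldDescription(new AS400Text(10), "ITEM"));
format.addFieldDescription(new PackedDecimalFieldDescription(new AS400PackedDecimal(9, 2), "PRICE"));

// map the received bytes onto the record format and extract single fields
Record record = format.getNewRecord(rawBytes);
String item = (String) record.getField("ITEM");
BigDecimal price = (BigDecimal) record.getField("PRICE");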

Doesn’t sound very straightforward, does it?!

But as we also have the choice to send the data in JSON format, we can simplify things a lot on the Java side, because then the frameworks do all the heavy lifting for us. To get the data for an auction we can just code the following.

@Inject
ObjectMapper mapper;

@Incoming("auction-events")
@Acknowledgment(Acknowledgment.Strategy.MANUAL)
public CompletionStage<Void> receiveMessage(JmsMessage<String> message) {
    try {
        // deserialize the JSON payload with Jackson and process the event
        AuctionEvent event = mapper.readValue(message.getPayload(), AuctionEvent.class);
        handleEvent(event);
        return message.ack();
    } catch (Exception e) {
        logger.log(Level.SEVERE, "Error on processing auction event", e);
        return message.nack(e);
    }
}

That is pretty straightforward :).

What also keeps the code very short is the fact that we can get an AuctionEvent object with a one-liner by using the Jackson library to deserialize the JSON data. Very nice library!

Note: I decided to manually acknowledge the messages, as this allows us to “not acknowledge” a message, which then stays in the queue and won’t be lost. This is not the ideal solution, as the message will now block any further processing. You need to find the strategy which best fits your use case.
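For completeness: the "auction-events" channel has to be wired to the message broker via MicroProfile Config. A sketch of what this could look like with Helidon’s JMS connector and an Artemis broker; the queue name, broker URL and the exact property layout are assumptions, not taken from the original setup.

# connector configuration (Artemis via JNDI)
mp.messaging.connector.helidon-jms.jndi.jms-factory=ConnectionFactory
mp.messaging.connector.helidon-jms.jndi.env-properties.java.naming.factory.initial=org.apache.activemq.artemis.jndi.ActiveMQInitialContextFactory
mp.messaging.connector.helidon-jms.jndi.env-properties.java.naming.provider.url=tcp://localhost:61616

# channel configuration
mp.messaging.incoming.auction-events.connector=helidon-jms
mp.messaging.incoming.auction-events.destination=AUCTION.EVENTS
mp.messaging.incoming.auction-events.type=queue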

Artemis

The above solution for receiving data would be ideal. But as we all know, we don’t always have the ideal situation in the real world. Mostly it looks a little bit more complicated, and that is also the case here.

If we send the data with the STOMP client for ILE, it will always send a content-length header. Because of this header, Artemis will always interpret the message as a BytesMessage when using JMS, see Artemis - STOMP and JMS interoperability.

But with the MicroProfile spec we are super flexible when it comes to how we receive messages from a queue.

To circumvent our problem we can inspect what type of message we got and then act accordingly.

@Inject
private ObjectMapper mapper;

@Incoming("auction-events")
@Acknowledgment(Acknowledgment.Strategy.MANUAL)
public CompletionStage<Void> receiveMessage(JmsMessage<?> message) {
    String messageText = null;

    try {
        // content-length header available -> BytesMessage, else TextMessage
        if (message instanceof JmsTextMessage) {
            messageText = ((JmsTextMessage) message).getPayload();
        } else if (message instanceof JmsBytesMessage) {
            messageText = new String((byte[]) message.getPayload(), StandardCharsets.UTF_8);
        } else {
            throw new RuntimeException("Unknown JMS message type " + message.getClass().getName());
        }

        AuctionEvent event = mapper.readValue(messageText, AuctionEvent.class);
        handleEvent(event);
        return message.ack();
    } catch (Exception e) {
        logger.log(Level.SEVERE, "Error on processing auction event", e);
        return message.nack(e);
    }
}

In our example we use the well-established Jackson library to convert the data from JSON to an AuctionEvent object.

Logging

If you want to integrate the logs of your Java application into a central logging server, then one way to achieve this is to send your logs directly to the logging server via a socket connection. The Java Util Logging framework supports this via the SocketHandler class.

Note: You may want to reimplement the SocketHandler so that it reconnects in case it loses the connection to the logging server.
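A minimal logging.properties sketch wiring this up could look as follows; the host, port and level are assumptions about your environment, and the formatter is the custom JsonFormatter described in the next section.

handlers = java.util.logging.SocketHandler

# target log server (host and port are assumptions)
java.util.logging.SocketHandler.host = logserver.example.com
java.util.logging.SocketHandler.port = 5170
java.util.logging.SocketHandler.level = INFO

# custom JSON formatter, see the next section
java.util.logging.SocketHandler.formatter = rpgnextgen.log.JsonFormatter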

Formatter

The SocketHandler class will by default send every log entry to the log server in XML format. If you need to transfer the log entries as JSON, this can be done very easily by subclassing java.util.logging.Formatter.

To create the JSON string you can use the ObjectMapper class from the Jackson library. To get a root node you can call ObjectMapper::createObjectNode. For adding values to the root node you can call the corresponding put methods on the ObjectNode class.
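A sketch of such a formatter could look like this; which fields of the log record end up in the JSON object is just an example, not a fixed format.

import java.util.logging.Formatter;
import java.util.logging.LogRecord;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class JsonFormatter extends Formatter {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public String format(LogRecord logRecord) {
        // build a JSON object from the relevant log record data
        ObjectNode root = mapper.createObjectNode();
        root.put("timestamp", logRecord.getInstant().toString());
        root.put("level", logRecord.getLevel().getName());
        root.put("logger", logRecord.getLoggerName());
        root.put("message", formatMessage(logRecord));

        try {
            return mapper.writeValueAsString(root) + System.lineSeparator();
        } catch (JsonProcessingException e) {
            // fall back to the plain message if serialization fails
            return formatMessage(logRecord) + System.lineSeparator();
        }
    }
}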

Custom Data

By default only the data from the log entry is sent to the log server. You may want to transfer additional data to the log server, e.g. the application name, so that you know which application the log entries belong to.

For static data this can be added to the configuration of the formatter.

rpgnextgen.log.JsonFormatter.application = rpgnextgen.helidon.mq

The formatter can query the LogManager for any logging configuration data.

String application = logManager.getProperty("rpgnextgen.log.JsonFormatter.application");

And later add it to the log entry.

ObjectNode root = mapper.createObjectNode();
root.put("app", application);

Dependencies

With most other programming languages it takes a lot of effort to choose the libraries you want to use. The world of JavaScript/TypeScript is probably the best negative example.

In our Java project using Helidon, almost all dependencies come from Apache, Eclipse, JBoss and Oracle. Each of them offers high-quality software packages. The remaining dependencies can easily be looked up (like the Netty project or FasterXML). This makes it much easier to check dependencies and maintain the application.

Happy integrating!

Mihael