The entry point into a Simple Sourcing client is the EventSourcedClient DSL. With this you can:

  • Set the Kafka configuration to specify how to interact with the Kafka cluster
  • Add a set of commands for one or more aggregates
  • Call the build method, which returns a CommandAPISet. A CommandAPISet provides access to a
    CommandAPI for each aggregate

For example:

public static void main(final String[] args) {
    final AggregateSerdes<~> avroAggregateSerdes = ...;

    final CommandAPISet commandApiSet =
            new EventSourcedClient()
                    .withKafkaConfig(builder -> builder
                            ...)
                    .<UserKey, UserCommand>addCommands(builder -> builder
                            ...)
                    .build();

    final CommandAPI<UserKey, UserCommand> commandApi =
            commandApiSet.getCommandAPI("user");
}

The client can now publish commands using the command API.
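The shape of that interaction can be sketched with a self-contained stand-in. Everything below (InMemoryCommandApi, its publishAndQueryCommand method, the record types) is hypothetical and only mimics the publish-a-command, await-a-result flow; it is not the real Simple Sourcing API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.CompletableFuture;

// Hypothetical stand-in for a command API: not Simple Sourcing classes,
// just the publish/await shape a client goes through.
class CommandApiSketch {

    record UserKey(String id) {}
    record UserCommand(String description) {}

    // Applies commands to in-memory per-key sequence counters and completes
    // a future with the new sequence number, the way an event-sourced
    // command handler would acknowledge a successfully processed command.
    static class InMemoryCommandApi {
        private final Map<UserKey, Long> sequences = new HashMap<>();

        CompletableFuture<Long> publishAndQueryCommand(UserKey key, UserCommand command) {
            long nextSequence = sequences.merge(key, 1L, Long::sum);
            return CompletableFuture.completedFuture(nextSequence);
        }
    }

    public static void main(String[] args) throws Exception {
        InMemoryCommandApi commandApi = new InMemoryCommandApi();
        UserKey key = new UserKey("user-1");

        long seq1 = commandApi.publishAndQueryCommand(key, new UserCommand("insert user")).get();
        long seq2 = commandApi.publishAndQueryCommand(key, new UserCommand("update user")).get();

        System.out.println(seq1); // 1
        System.out.println(seq2); // 2
    }
}
```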

The addCommands method on the EventSourcedClient works similarly to the aggregate builder. However, the client only needs to know about the commands and to provide serdes for serialization; it does not need to know about the command and event handlers.

The client ID, specified via withClientId, should be different for each client instance, or at least for each host machine the client runs on. If you have many clients and don't give each one a separate client ID, the result is unnecessary network traffic.
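One way to satisfy this is to derive the client ID from the host name. This naming scheme is purely illustrative (Simple Sourcing does not prescribe one); the prefix "user-client" and the helper name perHostClientId are assumptions:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

class ClientIdExample {

    // Builds a client ID that differs per host machine, e.g. "user-client-myhost".
    static String perHostClientId(String prefix) {
        String host;
        try {
            host = InetAddress.getLocalHost().getHostName();
        } catch (UnknownHostException e) {
            host = "unknown-host"; // fall back rather than fail at startup
        }
        return prefix + "-" + host;
    }

    public static void main(String[] args) {
        String clientId = perHostClientId("user-client");
        System.out.println(clientId);
    }
}
```

The resulting string would then be passed to withClientId.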

Combined server and client

Although it is generally encouraged to separate the client application from the streaming server application, it is possible to create a client from an EventSourcedApp directly. Calling getCommandAPISet on the EventSourcedApp instance returns a CommandAPISet. Note that this method must be called after starting the application with the start method.
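Using only the names above, the combined setup can be sketched as follows. This is Java-like pseudocode, not a complete program: the aggregate, serdes, and Kafka configuration are elided, and only the ordering constraint (start before getCommandAPISet) is the point:

```java
// Sketch only: configuration elided.
final EventSourcedApp app = new EventSourcedApp();
// ... configure Kafka, aggregates, and serdes ...
app.start();

// getCommandAPISet must only be called after start().
final CommandAPISet commandApiSet = app.getCommandAPISet();
final CommandAPI<UserKey, UserCommand> commandApi =
        commandApiSet.getCommandAPI("user");
```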