RPC vs Message Broker


#1

We are planning on using Proteus and RSockets for a new project, but I am not sure if we still need a message broker.

If I have hundreds of clients, is it better for a service to use a message broker to publish updates to the clients or should each client make an rpc call to the service to get a stream of updates?

Thanks for your insights


#2

Hi,

This depends on whether the messages are unique to each client or the same for all clients.

If the messages are the same for all clients then you need shared state - e.g. a topic, a database, etc. Even so, it is still a good idea to front the message broker with an RSocket RPC service. This gives your clients an API to call and decouples them from the message broker.

Messages unique to a client do not require a message broker - just make an RPC call that returns a stream.
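To make "front the broker with an RPC service" concrete, here is a minimal in-memory sketch using only the JDK Flow API. The `UpdateService` interface and the `SubmissionPublisher` standing in for the topic are placeholder names of mine, not RSocket-RPC APIs - a real implementation would subscribe to the broker and republish into the stream:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

public class UpdateServiceSketch {

    // The RPC-facing API: clients program against this, not against the broker.
    interface UpdateService {
        Flow.Publisher<String> updates();
    }

    // In-memory stand-in for the broker topic; a real implementation would
    // subscribe to Kafka/ActiveMQ here and feed the stream from it.
    static class BrokerBackedService implements UpdateService {
        final SubmissionPublisher<String> topic = new SubmissionPublisher<>();

        void publish(String message) { topic.submit(message); }

        @Override public Flow.Publisher<String> updates() { return topic; }

        void close() { topic.close(); }
    }

    // A tiny subscriber that collects items and signals when the stream ends.
    static Flow.Subscriber<String> collector(List<String> sink, CountDownLatch done) {
        return new Flow.Subscriber<>() {
            @Override public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            @Override public void onNext(String item) { sink.add(item); }
            @Override public void onError(Throwable t) { done.countDown(); }
            @Override public void onComplete() { done.countDown(); }
        };
    }

    static List<String> demo() throws InterruptedException {
        BrokerBackedService service = new BrokerBackedService();
        List<String> received = new CopyOnWriteArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        service.updates().subscribe(collector(received, done));  // client makes one "RPC" call
        service.publish("track-1 updated");                      // broker side feeds the stream
        service.publish("track-2 updated");
        service.close();                                         // closing the topic completes the stream

        done.await(5, TimeUnit.SECONDS);
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(demo());
    }
}
```

The point of the indirection is that you can later swap ActiveMQ for Kafka (or anything else) without touching the client, because the client only ever sees the `updates()` stream.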

If you can provide more detail about your use-case we can provide an example…
Are the messages distinct to a client?
Are the messages stateful - or are they only required when the user has subscribed?

Thanks,
Robert


#3

We have two use cases

  1. We receive a live stream of track messages; they are processed through a few algorithms, and then the results are published to all the clients. Currently we are using ActiveMQ
  2. A user makes a change to a data object on a client; we push the change to the server and then publish the change out to all clients. Currently we are using REST and ActiveMQ

Our current system is a large monolith developed in Java, running on JBoss with a Postgres DB

We are planning a new microservice implementation using:

Spring Boot (Reactive Core)
RSocket-RPC
Proteus
MongoDB and/or Postgres with reactive drivers

I am trying to figure out if I need Kafka or Pulsar in the architecture.

How does one front a message broker with an RPC service?

Thanks for your reply.

Tom


#4

Hi Tom,

There are a couple of ways to handle those use-cases. The first is with Kafka. Here is a demo project that fronts Kafka with an RPC call:

This example has three parts:

  1. A Spring Boot command-line application that generates random longs and sends them to Kafka
    https://github.com/netifi/proteus-spring-kafka-example/tree/master/number-generator

  2. A Proteus Spring service that provides a stream of random longs qualified by a Type (all, even, odd, positive, negative)
    https://github.com/netifi/proteus-spring-kafka-example/tree/master/service

  3. A Proteus Spring client that consumes a stream of random longs
    https://github.com/netifi/proteus-spring-kafka-example/tree/master/client

I used reactor-kafka (https://github.com/reactor/reactor-kafka) to communicate with Kafka.

You can start any number of clients and servers, and each client will get the same messages for the type it asked for - i.e. every client that asked for odds will get the same odd numbers, etc.
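If it helps to see the shape of that type-qualified pattern without standing up Kafka, here is a minimal in-memory sketch using only the JDK Flow API. The class and method names are mine, not from the demo project, and the per-type `SubmissionPublisher`s stand in for the Kafka topics:

```java
import java.util.EnumMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

public class NumberStreamSketch {

    enum Type { ALL, EVEN, ODD }

    // One in-memory "topic" per type; the real service would read these from Kafka.
    private final Map<Type, SubmissionPublisher<Long>> topics = new EnumMap<>(Type.class);

    NumberStreamSketch() {
        for (Type t : Type.values()) topics.put(t, new SubmissionPublisher<>());
    }

    // Generator side: route each number to every matching type stream.
    void submit(long n) {
        topics.get(Type.ALL).submit(n);
        topics.get(n % 2 == 0 ? Type.EVEN : Type.ODD).submit(n);
    }

    // Service side: a client asks for a stream qualified by Type.
    Flow.Publisher<Long> stream(Type type) { return topics.get(type); }

    void close() { topics.values().forEach(SubmissionPublisher::close); }

    // A tiny subscriber that collects items and signals when the stream ends.
    static Flow.Subscriber<Long> collector(List<Long> sink, CountDownLatch done) {
        return new Flow.Subscriber<>() {
            @Override public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            @Override public void onNext(Long item) { sink.add(item); }
            @Override public void onError(Throwable t) { done.countDown(); }
            @Override public void onComplete() { done.countDown(); }
        };
    }

    static List<Long> demoOdds() throws InterruptedException {
        NumberStreamSketch service = new NumberStreamSketch();
        List<Long> odds = new CopyOnWriteArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        service.stream(Type.ODD).subscribe(collector(odds, done));
        for (long n = 1; n <= 6; n++) service.submit(n);  // generator produces 1..6
        service.close();

        done.await(5, TimeUnit.SECONDS);
        return odds;  // only the odd numbers reach this client
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(demoOdds());
    }
}
```

In the real demo the routing and retention happen inside Kafka, which is what gives every client the same replayable sequence per type; the sketch only shows the API shape the client sees.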

You will also need Kafka to test this. I have a Mac and followed this Homebrew example:

The other way to handle this is with broadcast. I will post an example of this in a bit.
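In the meantime, the broadcast idea can be sketched in-memory with the JDK's `SubmissionPublisher`, which fans each submitted item out to every current subscriber. This is just an illustration of the fan-out idea, not the Proteus broadcast API:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

public class BroadcastSketch {

    // A tiny subscriber that collects items and signals when the stream ends.
    static Flow.Subscriber<Long> collector(List<Long> sink, CountDownLatch done) {
        return new Flow.Subscriber<>() {
            @Override public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            @Override public void onNext(Long item) { sink.add(item); }
            @Override public void onError(Throwable t) { done.countDown(); }
            @Override public void onComplete() { done.countDown(); }
        };
    }

    static boolean demo() throws InterruptedException {
        SubmissionPublisher<Long> broadcaster = new SubmissionPublisher<>();
        List<Long> clientA = new CopyOnWriteArrayList<>();
        List<Long> clientB = new CopyOnWriteArrayList<>();
        CountDownLatch done = new CountDownLatch(2);

        broadcaster.subscribe(collector(clientA, done));  // two independent clients
        broadcaster.subscribe(collector(clientB, done));

        for (long n = 1; n <= 5; n++) broadcaster.submit(n);  // each item fans out to both
        broadcaster.close();

        done.await(5, TimeUnit.SECONDS);
        return clientA.equals(clientB) && clientA.equals(List.of(1L, 2L, 3L, 4L, 5L));
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(demo() ? "all subscribers saw the same messages" : "mismatch");
    }
}
```

The trade-off versus the Kafka approach is that there is no retention here: a subscriber that connects late, or a process that restarts, misses everything sent before it subscribed.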

Cheers,
Robert


#5

Thanks, this has been very helpful.

Tom


#6

No problem - let me know if you need anything else.