
Near Real Time Data Driven SaaS Integration with Streaming | Part 1: Overview

File based approaches

File-based data exchange has accounted for a large share of the integrations between SaaS and other ERP solutions, both in the past and today. The approach is robust and allows a large number of transactions to be processed in batch without affecting the online systems, but it has a drawback: the information is only as fresh as the last batch run.

Streaming Solutions

Streaming solutions (Kafka, JMS, …) enable loosely coupled systems and provide everything needed to keep information updated in near real time without overloading the source and target systems.

Indeed, with streaming the source system publishes data to an intermediate stream, where it is retained long enough for the target systems to consume it at a pace that suits their own load fluctuations; the source system publishes in the same decoupled way.

The benefits are, among others:
– Destination systems are not overloaded
– Data arrives at the destination much earlier than in batch mode
– No data is lost

Solution example

Oracle OCI Streaming

OCI Streaming is a Kafka-compatible, secure, pay-as-you-use, scalable, and inexpensive streaming service with no lock-in, which serves this purpose with very little effort and is easy to develop against and deploy.
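Because the service speaks the Kafka protocol, a standard Kafka client can point at it with SASL_SSL settings along these lines (the endpoint, tenancy, user, and OCID values below are illustrative placeholders; take the exact values for your stream pool from the OCI console):

```properties
bootstrap.servers=cell-1.streaming.us-phoenix-1.oci.oraclecloud.com:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# username is <tenancy>/<user>/<stream pool OCID>; password is an OCI auth token
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="mytenancy/user@example.com/ocid1.streampool.oc1..." \
  password="AUTH_TOKEN";
```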

Source systems

Source systems, such as Oracle SaaS, can be queried by an intermediate publisher through their REST APIs; the publisher gets the data and puts it as messages into topics using the Streaming API included in the OCI SDKs.
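The core of such a publisher is turning a REST response into keyed messages. Here is a minimal sketch of that step in Go; the `items` array mirrors the common shape of Oracle SaaS REST responses, but the record fields (`InvoiceId`, `Amount`, `LastUpdateDate`) are assumptions for illustration, and the resulting key/value pairs would be handed to whatever Kafka/Streaming producer you use:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// InvoiceEvent is a hypothetical shape for one changed record returned
// by the SaaS REST API (field names are illustrative assumptions).
type InvoiceEvent struct {
	InvoiceID   string  `json:"InvoiceId"`
	Amount      float64 `json:"Amount"`
	LastUpdated string  `json:"LastUpdateDate"`
}

// Message mirrors the key/value pair a Kafka producer would publish.
type Message struct {
	Key   string
	Value []byte
}

// toMessages converts a REST payload (an "items" array) into one message
// per changed record, keyed by the record identifier so all updates for
// the same invoice land in the same partition, preserving their order.
func toMessages(payload []byte) ([]Message, error) {
	var body struct {
		Items []InvoiceEvent `json:"items"`
	}
	if err := json.Unmarshal(payload, &body); err != nil {
		return nil, err
	}
	msgs := make([]Message, 0, len(body.Items))
	for _, it := range body.Items {
		value, err := json.Marshal(it)
		if err != nil {
			return nil, err
		}
		msgs = append(msgs, Message{Key: it.InvoiceID, Value: value})
	}
	return msgs, nil
}

func main() {
	payload := []byte(`{"items":[{"InvoiceId":"INV-1","Amount":120.5,"LastUpdateDate":"2019-01-01T10:00:00Z"}]}`)
	msgs, err := toMessages(payload)
	if err != nil {
		panic(err)
	}
	for _, m := range msgs {
		fmt.Printf("key=%s value=%s\n", m.Key, m.Value)
	}
}
```

Keying by the record identifier matters: Kafka only guarantees ordering within a partition, so it keeps successive updates to the same record in sequence.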

Target Systems

Destination systems receive messages by means of an intermediate subscriber that gets messages from the topics and sends them to the target using its own API.
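On the consuming side, the subscriber's job is to wrap each message into a call the target understands. A minimal sketch in Go, assuming a hypothetical JSON endpoint on the target (the URL and the `X-Message-Key` correlation header are illustrative, not part of any real target API):

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

// buildTargetRequest wraps one consumed message into an HTTP call for the
// destination system. A real subscriber would use whatever API the target
// actually exposes; this shape is an assumption for illustration.
func buildTargetRequest(targetURL, key string, value []byte) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodPost, targetURL, bytes.NewReader(value))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	// Propagate the message key so the target can correlate updates.
	req.Header.Set("X-Message-Key", key)
	return req, nil
}

func main() {
	req, err := buildTargetRequest("https://target.example.com/api/invoices",
		"INV-1", []byte(`{"InvoiceId":"INV-1","Amount":120.5}`))
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String(), req.Header.Get("X-Message-Key"))
}
```

The subscriber would loop over consumed messages, call `buildTargetRequest` for each, and execute it with an `http.Client`, committing the offset only after the target accepts the record so nothing is lost on failure.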

Containerisation and “K8szation”

A good approach for this purpose is to build containerised programs and deploy and run them in a Kubernetes cluster.

Next Post: Near Real Time Data Driven SaaS Integration with Streaming | Part 2: Extracting/Publishing Data Changes

Hope it helps! 🙂

Building Producer and Consumer Clients in Go for Oracle Event Hub Cloud Service


Documentation is available here, or watch the following video:

When the cluster is created, go to the details page and grab the connection URL, which has the format <broker1_ip>:6667,…,<brokern_ip>:6667



I’ve done the following (Mac):

git clone https://github.com/edenhill/librdkafka.git
cd librdkafka
./configure --prefix /usr/local
sudo make install (run sudo make uninstall if you later want to remove the lib)

Alternatively, install it with Homebrew:

brew install librdkafka


I’ve downloaded this consumer and this producer sample. Create producer.go and consumer.go files and put the sample code in each one.

go build producer.go

go build consumer.go

Test your clients:

./producer <broker1_ip>:6667,...,<brokern_ip>:6667 sample

./consumer <broker1_ip>:6667,...,<brokern_ip>:6667 consumer_example sample



That’s all folks!

🙂 Enjoy