oke-cluster-start-stop, a kubectl Plugin for starting/stopping all the Compute Nodes of an OKE Cluster


If you want to start or stop all of the data plane nodes of an OKE cluster, you have to go to each compute instance and execute the action one by one.

This post presents a kubectl plugin created to help you start/stop your OKE clusters.

NOTES: The tool has been developed and tested on macOS. It relies on jq, which is installed automatically if not already present.

Take a look at this GitHub page for instructions and code. Here is an example of the tool at work:

NOTE: The tool will soon be submitted for approval to the krew-index (krew is a sort of package manager for kubectl plugins); meanwhile you can install it locally as follows:

git clone https://github.com/javiermugueta/oke-cluster-start-stop.git
cd oke-cluster-start-stop
kubectl krew install --manifest=oke-cluster-start-stop.yaml --archive=oke-cluster-start-stop-1.0.0.zip
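
Once installed, you can check that krew and kubectl see the new plugin (generic checks, not specific to this tool):

kubectl krew list
kubectl plugin list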

You can also execute the tool as a standalone script.

Hope it helps! 🙂

Links

Kubectl plugins developer guide

Oracle SaaS | Analysing the Structure and Morphology of the REST API Methods


Oracle SaaS APIs have a structure that is the same for all the “business entities” across the different modules (Finance, Loyalty, Supply Chain, …), such as invoices, expenses, loyalty transactions and so on. Let’s analyse the pattern. In this episode we focus on the “Get all xxx” methods, because they follow a pattern that we will use in an upcoming construct we are working on: a tool that extracts and publishes changes in the data in near real time.

Verb

The verb for all of them is, obviously, GET.

Endpoint

The endpoint is <fqdn>/<endpoint>, where

<fqdn> is the server URL, such as https://iiii-eeee.fa.em2.oraclecloud.com
<endpoint> is the API endpoint provided in the documentation, such as /fscmRestApi/resources/11.13.18.05/invoices
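
Putting both pieces together, the full URL of the “Get all invoices” method for this (fictitious) environment would be:

https://iiii-eeee.fa.em2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/invoices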

Data Retrieved

If you execute the API call, data exists, the conditions are valid and the user making the connection has read privileges, a JSON structure is returned containing up to the number of records indicated by the limit parameter (25 by default). If more records exist, this is indicated by the hasMore field in the response.

Parameters

expand: includes child records of the entity; you can specify which children to include, or all, for example ?expand=invoiceLines (tip: if onlyData is false (the default), the links give you the names of the child entities)
fields: if you don't want all the fields in the response, list the names of the fields you expect
limit: maximum number of records returned per call (default 25)
offset: the index of the first record to return, starting at 0. For example, if limit is 5 and you want records 6 to 10, offset must be set to 5
onlyData: by default (onlyData=false) the response JSON includes links to child objects; setting it to true returns only the data, without links
orderBy: fields and sort order for the retrieved data, such as ?orderBy=InvoiceId:asc,SupplierSite:desc
q: the filter condition for the data, such as ?q=InvoiceAmount>1000
totalResults: false by default, so the total number of matching records is not evaluated; if true, they are counted, at the cost of extra computation and probably higher response times and lower performance
totalResults effect

Documentation examples of “Get all” methods:

Get all banks REST API
Get all expense records API
Get all invoices API

Example

In the following example we request invoices with an amount greater than 1000, together with their lines, and indicate that we want to know the total number of invoices that fulfil the criteria:

curl -X GET -k -H 'Content-Type: application/vnd.oracle.adf.resourceitem+json' -u jxvxxr.mxgxxtx@xrxclx.cxm:password "https://iiii-eeee.fa.em2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/invoices?limit=5&totalResults=true&q=InvoiceAmount>1000&expand=invoiceLines&onlyData=true"

Result

{
  "items" : [ {
    "InvoiceId" : 1,
    "InvoiceNumber" : "DFS03",
    "InvoiceCurrency" : "EUR",
    "PaymentCurrency" : "EUR",
    "InvoiceAmount" : 1100,
   …
    "CreationDate" : "2019-07-10T08:33:13+00:00",
    "CreatedBy" : "dxnx.fxrnxndxz@xrxclx.cxm",
    "LastUpdatedBy" : "dxnx.fxrnxndxz@xrxclx.cxm",
    "LastUpdateDate" : "2019-07-16T12:46:37+00:00",
    "LastUpdateLogin" : "8AC9D26BC392042BE053A46F660ACD78",
    "invoiceLines" : [ {
      "LineNumber" : 1,
      "LineAmount" : 1000,
      "AccountingDate" : "2019-07-10",
      …
      "MultiperiodAccountingAccrualAccount" : "",
      "CreatedBy" : "dxnx.fxrnxndxz@xrxclx.cxm",
      "CreationDate" : "2019-07-10T08:33:18+00:00",
      "LastUpdateDate" : "2019-07-10T08:34:56+00:00",
      "LastUpdatedBy" : "dxnx.fxrnxndxz@xrxclx.cxm",
      "LastUpdateLogin" : "8AC9F283AC3E043EE053A46F660A9D93"
    }, {
      "LineNumber" : 2,
      "LineAmount" : 100,
      "AccountingDate" : "2019-07-10",
      …
    } ]
  }, {
    "InvoiceId" : 1001,
    …,
    "invoiceLines" : [ {
      "LineNumber" : 1,
      "LineAmount" : 1000,
      …
    }, {
      "LineNumber" : 2,
      "LineAmount" : 100,
      …
    } ]
  }, …
    } ]
  } ],
  "totalResults" : 155,
  "count" : 5,
  "hasMore" : true,
  "limit" : 5,
  "offset" : 0,
  "links" : [ {
    …
}
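
Note that hasMore is true and totalResults is 155, but only 5 items came back because of limit=5. To retrieve everything, a client simply keeps calling with offset increased by limit until hasMore becomes false. Here is a minimal sketch of that loop with curl and jq, reusing the fictitious host and credentials from the example above:

#!/bin/bash
# keep fetching chunks until hasMore is false
BASE="https://iiii-eeee.fa.em2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/invoices"
AUTH="jxvxxr.mxgxxtx@xrxclx.cxm:password"
LIMIT=25
OFFSET=0
HASMORE=true
while [ "$HASMORE" = "true" ]; do
  PAGE=$(curl -s -k -u "$AUTH" -G "$BASE" \
    --data-urlencode "onlyData=true" \
    --data-urlencode "limit=$LIMIT" \
    --data-urlencode "offset=$OFFSET" \
    --data-urlencode "q=InvoiceAmount>1000")
  echo "$PAGE" | jq -r '.items[].InvoiceId'   # process the chunk here
  HASMORE=$(echo "$PAGE" | jq -r '.hasMore')
  OFFSET=$((OFFSET + LIMIT))
done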

So far, so good! Please note the CreationDate and LastUpdateDate fields highlighted in the records; we'll see the reason in a new post soon, stay tuned!

So that’s all folks for now, hope it helps! 🙂

More Info

Oracle Cloud Applications Documentation Index

REST API for Financials

REST API for Supply Chain & Manufacturing

And so on… locate the REST API on the left panel of the documentation for each module listed in the index

Near Real Time Data Driven SaaS Integration with Streaming | Part 2: Extracting/Publishing Data Changes to Streaming Topics


As we mentioned in a previous post, the idea is to develop a construct that looks for changes in the SaaS data and publishes those changes to a stream for later consumption by other systems.

As we can see in the diagram, there is a block in which we execute a program that runs loops of “get all” requests against a list of API endpoints configured in the Administration user interface. The logic is the following (a sketch in code follows the list):

  • wake up every x seconds
  • for each registered endpoint
  • get its parameters, such as filter conditions, last successful execution time, …
  • execute GET requests in a loop, in chunks of N records per call, until no more data is retrieved
  • put the data in the topic with the Streaming API
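
Below is a minimal sketch of that logic in bash with curl and jq. The endpoint list, the polling interval, the /tmp timestamp files, the LastUpdateDate filter and the publish_to_stream stub are illustrative assumptions, not part of the real construct:

#!/bin/bash
# hypothetical polling loop: paginate each configured endpoint and publish every chunk
BASE="https://iiii-eeee.fa.em2.oraclecloud.com"
AUTH="user@example.com:password"
ENDPOINTS=( "fscmRestApi/resources/11.13.18.05/invoices" )
INTERVAL=60                                        # wake up every x seconds

publish_to_stream() {
  # placeholder: here the chunk would be put in the topic with the Streaming API
  echo "publishing $(echo "$1" | jq '.items | length') records"
}

while true; do
  for EP in "${ENDPOINTS[@]}"; do
    STATE="/tmp/$(echo "$EP" | tr '/' '_').last"   # last successful execution time
    LAST=$(cat "$STATE" 2>/dev/null || echo "1970-01-01T00:00:00+00:00")
    NOW=$(date -u +"%Y-%m-%dT%H:%M:%S+00:00")
    OFFSET=0; HASMORE=true
    while [ "$HASMORE" = "true" ]; do              # chunks of 100 records per call
      CHUNK=$(curl -s -u "$AUTH" -G "$BASE/$EP" \
        --data-urlencode "onlyData=true" \
        --data-urlencode "limit=100" \
        --data-urlencode "offset=$OFFSET" \
        --data-urlencode "q=LastUpdateDate>$LAST")  # filter syntax may need adjusting
      publish_to_stream "$CHUNK"
      HASMORE=$(echo "$CHUNK" | jq -r '.hasMore')
      OFFSET=$((OFFSET + 100))
    done
    echo "$NOW" > "$STATE"
  done
  sleep "$INTERVAL"
done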

As we mentioned in this post, the SaaS REST APIs follow a common pattern, so it's easy to create a program that executes this logic, package it in a container image and deploy it in K8s; we'll show an example in a new post soon.

And that’s all for today, hope it helps! 🙂

Oracle Integration Cloud | Integrating Applications Through Robotics Process Automation


Robotic Process Automation (RPA) is a technology that automates the human-machine dialogue.

How many of you have seen end users working with several User Interface apps “at the same time”?

The lack of integration between our systems and apps has several causes, such as failed IT implementations; niche, legacy or obsolete solutions; no APIs or data-interchange mechanisms available; and so on.

RPA can help us solve the problem, because what it basically does is interact with the application UIs, giving us the ability to capture the data and put it in a central, “API-fied” repository that can then be used by the rest of the ecosystem.

Oracle Integration Cloud provides connectors for commercial RPA solutions such as UiPath and Automation Anywhere, as well as a standard REST connector that allows you to integrate with other open source RPA tools.

So that’s it: you can automate the user interaction with the different applications and capture the data, as long as the robot simulates the human.

Oracle Integration Cloud provides connectors to a bunch of commercial solutions, including RPA ones, so you are not locked in to a specific vendor.

Hope it helps! 🙂

Near Real Time Data Driven SaaS Integration with Streaming | Part 1: Overview


File based approaches

File-based data exchange accounts for a large percentage of the integrations between SaaS and other ERP solutions, both in the past and today. The approach is robust and allows a large number of transactions to be executed in batch without affecting the online systems, but it has a drawback: the information is never completely up to date.

Streaming Solutions

Streaming solutions (Kafka, JMS, …) enable weakly coupled systems and provide everything necessary to keep information updated almost in real time without overloading the source and target systems.


Indeed, the use of streaming allows the source system to publish data to an intermediate flow where it is stored long enough for the target systems to consume it according to the load fluctuations they are subject to, while the source system keeps publishing data in the same way.

The benefits are, among others:
– Destination systems are not overloaded
– Data arrives at the destination much earlier than in batch mode
– No data is lost

Solution example

Oracle OCI Streaming

OCI Streaming is a Kafka-compatible, secure, no-lock-in, pay-as-you-use, scalable and inexpensive streaming solution that serves this purpose with very little effort and is easy to develop against and to deploy.

Source systems

Source systems, such as Oracle SaaS, can be queried by an intermediate publisher through their REST APIs; the publisher gets the data and puts it as messages into topics using the Streaming API included in the OCI SDKs provided.
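
For instance, with the OCI CLI a record could be put into a stream roughly as follows; the stream OCID and the messages endpoint are placeholders, and the exact flags are worth double-checking against your CLI version:

# the value must be base64 encoded; the key can be null
VALUE=$(echo -n '{"InvoiceId":1,"LastUpdateDate":"2019-07-16T12:46:37+00:00"}' | base64)
echo "[ { \"key\": null, \"value\": \"$VALUE\" } ]" > /tmp/messages.json

oci streaming stream message put \
  --stream-id "ocid1.stream.oc1..aaaa" \
  --endpoint "https://cell-1.streaming.eu-frankfurt-1.oci.oraclecloud.com" \
  --messages file:///tmp/messages.json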

Target Systems

Destination systems receive messages by means of an intermediate consumer that gets messages from the topics and sends them to the target using its own API.
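
On the consuming side, a rough equivalent with the OCI CLI is to create a cursor and read messages from it (again, OCIDs and endpoint are placeholders and the flags should be verified):

CURSOR=$(oci streaming stream cursor create-cursor \
  --stream-id "ocid1.stream.oc1..aaaa" \
  --endpoint "https://cell-1.streaming.eu-frankfurt-1.oci.oraclecloud.com" \
  --partition 0 --type TRIM_HORIZON \
  --query 'data.value' --raw-output)

oci streaming stream message get \
  --stream-id "ocid1.stream.oc1..aaaa" \
  --endpoint "https://cell-1.streaming.eu-frankfurt-1.oci.oraclecloud.com" \
  --cursor "$CURSOR"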

Containerisation and “K8szation”

One good approach for this purpose is to create containerised programs to be deployed and run in a Kubernetes cluster.

Next Post: Near Real Time Data Driven SaaS Integration with Streaming | Part 2: Extracting/Publishing Data Changes

Hope it helps! 🙂

“Kool” Kubernetes Client Tools


When working with k8s you typically have several clusters and a bunch of namespaces per cluster; the following tools can help you manage all that stuff with ease.

kubectx + kubens

kubectx allows you to switch between the contexts of your different k8s clusters

kubens allows you to switch between namespaces within the current cluster context

brew install kubectx
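
A couple of typical invocations (the cluster name is just an example):

kubectx                  # list the contexts configured in your kubeconfig
kubectx my-oke-cluster   # switch to that cluster's context
kubectx -                # switch back to the previous context
kubens kube-system       # switch the current namespace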

More info here

Kube-ps1

Allows you to see the current context and namespace in the prompt

brew install kube-ps1
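
After installing it, wire it into your shell prompt; for bash or zsh, something along these lines (the path is an assumption based on the usual Homebrew layout, check the formula's caveats):

# add to ~/.bashrc or ~/.zshrc
source "$(brew --prefix kube-ps1)/share/kube-ps1.sh"
PS1='$(kube_ps1)'$PS1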

More info here

Examples

kubectx, kubens and kube-ps1 at a glance

kubeaudit

Great tool for auditing the security settings on your k8s clusters

brew install kubeaudit
kubeaudit allowpe
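
For example, to run every auditor at once against the current cluster, or against a local manifest (the file path is just an example):

kubeaudit all
kubeaudit all -f ./deployment.yaml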

krew

Krew is a package manager for kubectl plugins (kubectl has an extensibility framework, and krew helps you manage those extensions)
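
Typical krew usage looks like this (ctx is just an example plugin from the index):

kubectl krew search        # search available plugins in the index
kubectl krew install ctx   # install a plugin
kubectl krew list          # show installed plugins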

Hope it helps! 🙂

Set Up an Ingress Controller in OKE Cluster


Here is a recipe to set up an ingress controller.

Step 1: setting up the nginx ingress controller

kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/master/deploy/static/mandatory.yaml

Step 2: exposing the ingress controller as a service of type LoadBalancer (with a public IP)

kubectl apply -f https://raw.githubusercontent.com/javiermugueta/rawcontent/master/cloud-generic.yaml

Step 3: execute this command several times until the EXTERNAL-IP no longer shows <pending>; grab the IP for later testing

kubectl get svc -n ingress-nginx

Step 4: create a self-signed certificate for SSL termination and store it in a TLS secret

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout tls.key -out tls.crt -subj "/CN=nginxsvc/O=nginxsvc"


kubectl create secret tls tls-secret --key tls.key --cert tls.crt

Step 5

kubectl create -f https://raw.githubusercontent.com/javiermugueta/rawcontent/master/hello-world-ingress.yaml

Step 6: A typical hello world deployment with a pod in it and 3 replicas

kubectl create -f https://raw.githubusercontent.com/javiermugueta/rawcontent/master/ingress.yaml

Step 7: test the URL and see what happens…
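
For instance, assuming the ingress routes the root path of the load balancer IP you grabbed in Step 3 to the hello world service (host and path depend on the rules in the ingress manifest), a quick check could be:

curl -k https://<EXTERNAL-IP-FROM-STEP-3>/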

That’s all folks! Hope it helps 🙂