Subscribing Network Insights Kafka Producer to a Kafka topic and streaming events to a Kafka Consumer

Applications frequently need to talk to other applications. For example, Application A gathers some data and Application B needs to do some custom processing on Application A's output. When applications that are split by functionality need to exchange data, there are generally two ways to do this.

  1. The old legacy way was synchronous communication from application to application.  This method is not very reliable, as data can be lost when the receiving application goes down for a period of time.  Further, if Application A needs to send data to multiple other applications, say Application A (buying service) needs to send data to both Application B (shipping service) and Application C (inventory service), then Application A has to send the messages to two different applications, increasing the load on Application A.  Synchronous messaging between applications is also problematic if there are sudden spikes of traffic.
  2. The way most applications handle messaging between them today is through some sort of middleware service.  Middleware services can be broadly categorized into three different models.
    1. Queue model.  The producer sends messages to a queue service, which stores them.  Consumers then poll the queue service at given intervals, pick up messages, and send a command to delete them from the queue so that other consumers don't also pick them up, which would duplicate processing.  This works well when there is a cluster of consumers sharing the processing load.  The queue model is basically a pull model on the consumer side.  In AWS, the equivalent service is SQS (Simple Queue Service).
    2. Pub/sub model.  In this model, the producer application publishes to a topic hosted by the middleware service, pushing its messages to the application hosting the topic (a push model).  Consumer applications then subscribe to that same topic as consumers, and each subscriber receives a copy of every message.  In AWS, the equivalent service is SNS (Simple Notification Service).
    3. Real-time pub/sub model, i.e., streaming messages from the publisher to the middleware service.  RabbitMQ and Apache Kafka fall into this category.  In AWS, the equivalent service is Kinesis.
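The difference between the queue and pub/sub models can be sketched with plain Python data structures. These in-memory queues are just conceptual stand-ins for services like SQS/SNS or Kafka; the function and service names are illustrative, not part of any real API:

```python
from collections import defaultdict
from queue import Queue

# Queue model: one message is processed by exactly one consumer.
work_queue = Queue()
work_queue.put("order-123")     # producer sends a message to the queue
msg = work_queue.get()          # one consumer polls and picks it up
work_queue.task_done()          # "delete" it so no other consumer duplicates it

# Pub/sub model: every subscriber of a topic gets its own copy.
topics = defaultdict(list)      # topic name -> list of subscriber queues

def subscribe(topic):
    q = Queue()
    topics[topic].append(q)
    return q

def publish(topic, message):
    for q in topics[topic]:     # broker pushes a copy to every subscriber
        q.put(message)

shipping = subscribe("orders")     # Application B (shipping service)
inventory = subscribe("orders")    # Application C (inventory service)
publish("orders", "order-123")     # Application A publishes only once

received_b = shipping.get()
received_c = inventory.get()
print(msg, received_b, received_c)
```

Note how in the pub/sub sketch Application A publishes once and the broker fans the message out, which is exactly what removes the extra load on Application A described above.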

Cisco Nexus Insights, from release 5.0.1x, can use the Kafka service that runs on Nexus Dashboard (ND) and publish to a topic that has been created elsewhere on a Kafka service.  You can then have a Kafka consumer subscribe to that topic and receive all the messages.  The messages that NI can export this way are anomalies, advisories, faults, audit logs, and statistics.  You can be selective about what you export (based on your requirements) and then feed the consumed messages into some other application, like Elasticsearch/Kibana, for whatever custom processing you want.

If you want to test this export quickly, I have created a Kafka container that you can run: create a topic, set up NI to publish to the topic, and start a consumer that subscribes to that topic.  You can then watch the streaming messages arrive at the Kafka consumer.

Below is how you can do this in a few minutes to test it out.  In this setup, ND is the Kafka producer and we bring up a Kafka consumer for Cisco Network Insights.

Prerequisites:

Make sure that docker and docker-compose are installed:

Method for Ubuntu 18.04 (bionic) for installing docker and docker-compose:

  "sudo apt update -y"
  "sudo apt upgrade -y"
  "echo net.ipv4.ip_forward=1 | sudo tee -a /etc/sysctl.conf"
  "sudo sysctl -p"
  "sudo sysctl --system"

 Exit out of session and ssh back in: 


  "sudo apt install docker.io -y" 
  "sudo systemctl start docker" 
  "sudo systemctl enable docker" 
  "sudo usermod -aG docker $USER"

Exit out of session and ssh back in: 

  "sudo apt install docker-compose -y"
Once docker and docker-compose are installed, follow the steps below:
  1. cd to some directory in your ubuntu or local mac, then clone the repo.  git clone https://github.com/soumukhe/kafka-docker-compose-NI.git
  2. cd kafka-docker-compose-NI
  3. Do an "ip a" on the Ubuntu box to find your host IP, then run the script "./changeHostIP.sh your_host_ip", for example: "./changeHostIP.sh 10.10.10.50"
  4. docker-compose up --build -d (or docker-compose build then docker-compose up -d)
  5. Do a "docker ps" to make sure that sm-kafka and sm-zookeeper are up
  6. execute this script ./copy_kafka-from-container.sh
  7. cd ACI-Consumer
  8. A topic called test-topic already exists.  Execute "./listTopic.sh" to verify.
  9. If you want a new topic, create it by executing the script "./createTopic.sh my_topic_name" (where my_topic_name is a name of your choosing)
  10. Start the Kafka consumer by executing the script "./startConsumer.sh".  Your terminal will now be waiting for messages from the producer.  If you want to quickly verify that the Kafka consumer can receive messages, open another terminal, go to the ACI-Consumer directory, execute "./startProducer.sh", and type in any message like "hello"; the consumer terminal should now show that message
  11. configure NI to export data to Topic Name: test-topic on base_machine_host:9092
  12. To stop all containers related to Kafka (zookeeper and kafka), make sure you are in the kafka-docker-compose-NI directory and then execute "docker-compose down"
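For context on step 3, the reason changeHostIP.sh needs your host IP is Kafka's advertised-listener setting: NI (the producer) connects from outside Docker, so the broker must advertise an address reachable from the network, not its internal container name. The repo's docker-compose.yml will look broadly like the sketch below — the image names and exact environment variables here are illustrative, not copied from the repo:

```yaml
version: "3"
services:
  sm-zookeeper:
    image: zookeeper            # illustrative image; the repo may pin a different one
    ports:
      - "2181:2181"
  sm-kafka:
    image: wurstmeister/kafka   # illustrative image
    ports:
      - "9092:9092"             # the port you point NI at
    environment:
      KAFKA_ZOOKEEPER_CONNECT: sm-zookeeper:2181
      # The broker must advertise the Docker host's IP so that external
      # clients (NI, your consumer) can reach it -- this is the value
      # that changeHostIP.sh rewrites, e.g. 10.10.10.50 from step 3.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://10.10.10.50:9092
    depends_on:
      - sm-zookeeper
```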

Configuring NI to publish to test-topic

On Nexus Insights, click on the Intent icon, then click on Data Management/Export Data/Add New (as shown below).

Figure 1

On the next screen (Message Bus Configuration), put in the IP of the host where the Kafka container is running and the port number the Kafka container is exposing.  Also give this object a name and enter the name of the topic ("test-topic" in my case) that you want the Kafka publisher to send messages to.

Figure 2

Next, choose the items that you want to export.  You can export Anomalies, Bugs, Advisories, Faults, Audit Logs and Statistics.  In my case I chose all of them, because I just want to see messages arriving at the Kafka consumer I brought up.  In real life, you will probably be more selective and choose based on which application you are sending these Kafka messages to.

Figure 3

Now, go to the Kafka consumer terminal that you brought up.  You will see all the Kafka messages coming in there.  Try logging in and out of your ACI fabric and you will see the audit log messages related to that.
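If you later want to process these messages in code rather than just watch them scroll by in the console consumer, the core of the work is decoding each message value as JSON and routing on a field. The sketch below uses a made-up sample payload and field names ("category", "description") — NI's actual message schema may differ — and in a real consumer the raw bytes would come from a Kafka client library rather than a hard-coded sample:

```python
import json

def handle_message(raw: bytes) -> str:
    """Decode one Kafka message value and route it by category."""
    event = json.loads(raw.decode("utf-8"))
    category = event.get("category", "unknown")
    if category == "auditLogs":
        # e.g. forward audit events to a logging pipeline
        return f"audit: {event.get('description', '')}"
    # everything else: pass through for downstream tools like Elasticsearch
    return f"{category}: {json.dumps(event)}"

# Made-up sample payload standing in for a real NI audit-log message.
sample = json.dumps({"category": "auditLogs",
                     "description": "User admin logged in"}).encode("utf-8")
result = handle_message(sample)
print(result)
```

A routing function like this is also the natural seam for sending selected categories on to Elasticsearch/Kibana, as mentioned earlier.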

Figure 4

Conclusion:  The Kafka messaging bus of ND gives NI the capability to send Kafka messages from the fabrics it's managing.  The messages can then be fed, using Kafka Connect or some other means, into any custom application such as Elasticsearch/Kibana for further analysis and custom reports.

