Kafka log appender

The Kafka log appender is responsible for transferring logs from the Operations server to the Apache Kafka service. The logs are stored in the specified topic.

Creating Kafka log appender in Admin UI

The easiest way to create a Kafka log appender for your application is by using Admin UI.

Creating Kafka log appender with REST API

It is also possible to create a Kafka log appender for your application by using REST API. The following example illustrates how to provision the Kafka log appender for the Cell Monitor demo application available in Kaa Sandbox.

Configuration

The Kafka log appender configuration should match the following Avro schema.


Fields description

Name                    Description
kafkaServers            List of Kafka bootstrap servers (host name and port pairs)
topic                   Destination topic for the logs
useDefaultPartitioner   If false, the appender calculates the partition independently
partitionCount          Number of event partitions
kafkaKeyType            Type of the generated message key
executorThreadPoolSize  Number of threads that can simultaneously perform operations against Kafka
bufferMemorySize        Message buffer size in bytes
kafkaCompression        Built-in message compression type
kafkaAcknowledgement    Number of acknowledgments the producer requires the leader to have received before considering a request complete
retries                 Failover property: number of connection retries on failed message delivery
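The schema itself was collapsed in the original page export. Based on the field list above, its shape is roughly the following — this is a sketch, not the authoritative schema shipped with Kaa: the record/enum names, field types, enum symbols, and defaults are all assumptions.

```json
{
  "type" : "record",
  "name" : "KafkaConfig",
  "namespace" : "org.kaaproject.kaa.server.appenders.kafka.config",
  "fields" : [
    { "name" : "kafkaServers", "type" : { "type" : "array", "items" : {
        "type" : "record", "name" : "KafkaServer", "fields" : [
          { "name" : "host", "type" : "string" },
          { "name" : "port", "type" : "int" } ] } } },
    { "name" : "topic", "type" : "string" },
    { "name" : "useDefaultPartitioner", "type" : "boolean", "default" : true },
    { "name" : "partitionCount", "type" : "int", "default" : 1 },
    { "name" : "kafkaKeyType", "type" : { "type" : "enum", "name" : "KafkaKeyType",
        "symbols" : [ "NULL", "UUID", "HASH" ] } },
    { "name" : "executorThreadPoolSize", "type" : "int", "default" : 1 },
    { "name" : "bufferMemorySize", "type" : "long" },
    { "name" : "kafkaCompression", "type" : { "type" : "enum", "name" : "KafkaCompression",
        "symbols" : [ "NONE", "GZIP", "SNAPPY" ] } },
    { "name" : "kafkaAcknowledgement", "type" : { "type" : "enum", "name" : "KafkaAcknowledgement",
        "symbols" : [ "ZERO", "ONE", "ALL" ] } },
    { "name" : "retries", "type" : "int", "default" : 0 }
  ]
}
```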

 

The following configuration taken from the Cell Monitor demo matches the previous schema.

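The demo configuration itself was also collapsed in the export. A configuration matching the fields described above might look like this — every value here is illustrative, in particular the host, port, and topic name:

```json
{
  "kafkaServers" : [ { "host" : "localhost", "port" : 9092 } ],
  "topic" : "kaa",
  "useDefaultPartitioner" : true,
  "partitionCount" : 1,
  "kafkaKeyType" : "NULL",
  "executorThreadPoolSize" : 1,
  "bufferMemorySize" : 33554432,
  "kafkaCompression" : "NONE",
  "kafkaAcknowledgement" : "ONE",
  "retries" : 0
}
```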

 

Administration

The following REST API call example illustrates how to create a new Kafka log appender.

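The call body was lost in the export. In general terms, creating a log appender means POSTing an appender descriptor (name, application ID, plugin class, and the JSON configuration shown earlier) to the Admin REST API. The sketch below assumes a Kaa node on localhost:8080 and a tenant developer account; the credentials, file name, and endpoint path should be checked against your Kaa version's REST API reference.

```shell
# kafkaLogAppender.json holds the appender descriptor: its name, the target
# application ID, the Kafka appender plugin class name, and the appender's
# JSON configuration. All names here are illustrative assumptions.
curl -v -S -u devuser:devuser123 \
  -X POST -H 'Content-Type: application/json' \
  -d @kafkaLogAppender.json \
  'http://localhost:8080/kaaAdmin/rest/api/logAppender'
```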

Playing with Kafka log appender

To try out the Kafka log appender, you can play with the Data collection demo. Download the Kaa Sandbox, set it up, and go to the Data collection demo application.

 

NOTE

Kafka must be installed, running, and reachable from Kaa to complete this example. For details on installing Kafka, refer to the official Apache documentation.
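If you need a local broker, the stock Kafka quickstart (assuming a Kafka distribution of that era, with its bundled ZooKeeper) boils down to two commands:

```shell
# Run from the Kafka installation directory.
bin/zookeeper-server-start.sh config/zookeeper.properties &   # start ZooKeeper
bin/kafka-server-start.sh config/server.properties &          # start the Kafka broker
```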

The demo uses the following log schema:

{
  "type" : "record",
  "name" : "LogData",
  "namespace" : "org.kaaproject.kaa.schema.sample.logging",
  "fields" : [
    {
      "name" : "level",
      "type" : {
        "type" : "enum",
        "name" : "Level",
        "symbols" : [ "KAA_DEBUG", "KAA_ERROR", "KAA_FATAL", "KAA_INFO", "KAA_TRACE", "KAA_WARN" ]
      }
    },
    {
      "name" : "tag",
      "type" : "string"
    },
    {
      "name" : "message",
      "type" : "string"
    }
  ]
}

The following JSON example matches the previous schema.


{
  "level" : "KAA_INFO",
  "tag" : "TEST_TAG",
  "message" : "My simple message"
}

Go to Data collection demos in Sandbox.

Follow Installation instructions.

Next, in the Admin UI, open the Data collection demo application.

Go to the application's Log appenders configuration and add a new one.

Enter a name for the new appender (in this example, "Kafka").

Select Kafka appender type.

 

Set up the appender Configuration similar to the screenshot.

In this example, the Kafka server is installed in the Sandbox VM.

Now click the Add button at the top of the screen to create and deploy the appender.

Verify that the newly created appender appears in the list.

 

From the Kafka installation directory, run the following command:
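The command itself was lost in the export; with a stock Kafka distribution it is typically the console consumer (the topic name below is an assumption — use the topic configured in the appender):

```shell
# Consume and print every message in the topic, starting from the beginning.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic kaa --from-beginning
```

Older Kafka releases take `--zookeeper localhost:2181` in place of `--bootstrap-server`.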

This brings up a Kafka console consumer, so we can see the logs transferred from Kaa.

Now run the Data collection demo application and verify that the logs have been successfully sent to Kaa.

Make sure that the Kafka consumer receives the logs:


Copyright © 2014-2016, CyberVision, Inc.
