Recently I was assisting a customer who was loading Oracle log files using the logstash 1.4.2 that comes bundled with IBM Operations Analytics – Log Analysis 1.3.1. Everything was working fine. They then upgraded to Log Analysis 1.3.2 and configured the bundled logstash 1.5.3, copying the logstash config file from Log Analysis 1.3.1 over to Log Analysis 1.3.2. After doing this, logstash encountered some problems.

When we debugged the problems, we discovered that the following changes were required in the logstash config parameters. These are specific to the logstash output plugin that comes with Log Analysis 1.3.2 and are shown below:

output {
  scala {
    scala_url => "https://:/Unity/DataCollector"
    scala_user => ""
    scala_password => ""
    scala_keystore_path => ""
    batch_size => 500000
    idle_flush_time => 5
    sequential_flush => true
    num_concurrent_writers => 20
    use_structured_api => false
    disk_cache_path => "/Logstash/cache-dir"
    scala_fields =>
    {
      "host1@path1,host2@path2" => "event_field11,event_field12,...,event_field1N"
      "host3@path3" => "event_field21,event_field22,...,event_field2N"
    }
    date_format_string => "yyyy-MM-dd'T'HH:mm:ssX"
    log_file => "/Logstash/logs/scala_logstash.log"
    log_level => "info"
  }
}
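For clarity, each key in scala_fields is a comma-separated list of data source identifiers in host@path form, and its value is the ordered, comma-separated list of event fields to send to Log Analysis for those sources. A minimal sketch, with a hypothetical host name, log path, and field names (substitute your own data source definitions):

scala_fields =>
{
  "dbserver1@/var/log/oracle/alert.log" => "timestamp,severity,message"
}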

Note: The value of scala_keystore_path is left empty, and scala_password is the unencrypted password.

More details on migrating from Log Analysis 1.3.1 to 1.3.2, and on configuring logstash 1.5.3 with Log Analysis 1.3.2, are available in the Knowledge Center for reference.
