Hi all,
After a long time, I am writing a blog post again. This one is about log management.
Refer to this blog for more tools: http://www.tuicool.com/articles/B7N3qq
Splunk - proprietary software
- One-click solution
- Through the web portal itself you can load data or a folder, and it will analyse and visualize your logs
Graylog2
- A good option
- Processes messages from TCP, UDP and other ports
- Uses rsyslog to communicate with the nodes
- Well suited to bigger organizations
Fluentd:
- Plugin-based: you add a plugin for each task
- in_tail is used for reading from text files
- Everything is in JSON
- Implemented in C and Ruby
Logstash
- Everything is built in
- Works well when combined with Elasticsearch + Kibana
- Written in JRuby, so it runs on the JVM
- Auto-refresh can also be enabled in Kibana 3
The following are the experiments I have done:
Source:
- Eucalyptus - 3.4.2 (the Eucalyptus version does not matter)
- Logstash - 1.4.2
- Elasticsearch - 1.4.2
- Kibana3
I tested this on a single node. After a successful installation of Eucalyptus you will find the logs under the $EUCA_HOME directory.
Configuration:
In 'elasticsearch.yml', set the following (replace IP and "hostname" with your node's address):
network.host: IP
discovery.zen.ping.multicast.enabled: false
discovery.zen.ping.unicast.hosts: ["hostname"]
http.cors.enabled: true
http.cors.allow-origin: "/.*/"
If you install Logstash from the deb package, the configuration files are located under the following directory:
/etc/logstash/conf.d/*
Logstash will collect and filter the logs based on these configuration files. This is where we have to place our configuration file, which contains the following sections:
sample.conf
input {
  # ... locations of the files from which to read the logs
}
filter {
  # ... grok patterns to filter the lines of the log file
}
output {
  # ... redirect to stdout or to Elasticsearch
}
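As a concrete illustration, here is a minimal sketch of what a config such as 'eucalyptus_cloud.conf' might look like in Logstash 1.4 syntax. The log path and the grok pattern are assumptions for illustration only; adjust them to your $EUCA_HOME layout and the actual format of your log lines.

```
# Hypothetical example - adjust the path and pattern to your setup.
input {
  file {
    path => "/var/log/eucalyptus/cloud-output.log"  # assumed log location
    start_position => "beginning"                   # read the file from the start
  }
}
filter {
  grok {
    # Assumes lines that start with a timestamp and a log level,
    # e.g. "2015-01-10 12:00:00 ERROR ..."
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{GREEDYDATA:msg}" ]
  }
}
output {
  elasticsearch { host => "localhost" }  # ship parsed events to Elasticsearch
  stdout { codec => rubydebug }          # also print them for debugging
}
```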
Now Kibana 3 has to be extracted and placed under /var/www/kibana3/, so that Kibana 3 is served by your web server and runs in the browser.
To run your elasticsearch in background:
./elasticsearch &
To run Logstash so it starts collecting the logs:
./logstash -f /etc/logstash/conf.d/eucalyptus_cloud.conf
Then point your browser to http://ip/kibana3; it will show a visualized view of the Eucalyptus logs.
Using the search bar you can filter the logs by patterns such as ERROR, INFO, or a particular file like cc.log.
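For example, Kibana 3's query bar accepts Lucene query syntax. Assuming your grok filter extracted a `loglevel` field (the `path` field is added by Logstash's file input itself), queries might look like:

```
loglevel:ERROR
loglevel:(ERROR OR FATAL)
path:"cc.log" AND loglevel:INFO
```

The field names here are assumptions; use whichever names your own grok pattern assigns.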
It gives a dashboard visualization of the Eucalyptus logs (screenshot not included here).
For more guidelines, please refer to the following links. Credits to:
https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-and-visualize-logs-on-ubuntu-14-04
http://logstash.net/docs/1.4.1/configuration
http://grokconstructor.appspot.com/do/match?example=0
https://www.eucalyptus.com/blog/2013/08/23/extracting-info-euca%E2%80%99s-logs
Regards,
cooldharma06.. :)
.. Always be cool ..
