🎄 Happy Holidays! 🥳
Most of Solace is closed December 24–January 1 so our employees can spend time with their families. We will re-open Thursday, January 2, 2024. Please expect slower response times during this period and open a support ticket for anything needing immediate assistance.
How to push Syslog to ELK?
We need to push Syslog to ELK so that we can build an end-to-end message flow dashboard.
Comments
-
Hey @cgovind! There are a couple of threads about ELK previously posted in the community. Were you able to check them out? Here is a link for the search query: https://solace.community/search?query=ELK&scope=site&Search=ELK.
We also have a blog post by @arih that could be helpful: https://solace.com/blog/integrating-solace-with-logstash/
Let us know if you are not able to find the answers there.
-
Hi @cgovind,
Are you trying to push the Solace brokers' logs via syslog to ELK? If so, these links should help:
https://docs.solace.com/Best-Practices/Monitoring-Using-Syslog.htm
https://docs.solace.com/System-and-Software-Maintenance/Monitoring-Events-Using-Syslog.htm#montoring_and_management_1462237600_256968
And then you'll need to take care of the syslog input for Logstash: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-syslog.html
But let us know more details about what you want to achieve if that is not the case.
-
Hi @cgovind,
Adding to the answers above. On the Solace end, you can configure syslog forwarding to any syslog listener.
As you did for Splunk, the configuration at the Solace brokers is the same; you just need to define the destination (IP, port, TCP/UDP). The syslog format from Solace is RFC 3164. You can find the details here: https://docs.solace.com/Best-Practices/Monitoring-Using-Syslog.htm
At the ELK end, you will need to configure the Logstash component to listen for syslog (input) and send to Elasticsearch (output).
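Before wiring up the broker, it can help to confirm the listener end-to-end by hand-crafting an RFC 3164 message and firing it at the syslog port. The sketch below is a minimal illustration, not Solace's exact log format; the target hostname, the sample event text, and the helper names are assumptions you should adjust.

```python
import socket
import time

def build_rfc3164(facility: int, severity: int, hostname: str, tag: str, msg: str) -> bytes:
    """Build a minimal RFC 3164 syslog packet.

    PRI is facility * 8 + severity; the timestamp uses the
    RFC 3164 "MMM  d HH:MM:SS" layout (day-of-month space-padded).
    """
    pri = facility * 8 + severity
    now = time.localtime()
    timestamp = f"{time.strftime('%b', now)} {now.tm_mday:2d} {time.strftime('%H:%M:%S', now)}"
    return f"<{pri}>{timestamp} {hostname} {tag}: {msg}".encode()

def send_test_event(target: str, port: int = 514) -> None:
    # facility local3 = 19, severity info = 6 -> PRI <158>,
    # matching the local3/local4 facilities filtered for in Logstash
    packet = build_rfc3164(19, 6, "solace-broker-1", "event",
                           "SYSTEM: SYSTEM_CLIENT_CONNECT: - - test event")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (target, port))

if __name__ == "__main__":
    send_test_event("logstash.example.com")  # hypothetical Logstash host
```

If the packet arrives, the event should show up in the Logstash `stdout` output (or in Elasticsearch) within a few seconds, which isolates network and listener problems from broker-side configuration.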
You can find the guide to writing the Logstash configuration in the links provided in the earlier answers, and create the Logstash pipeline according to your needs.
If you need an example, you can try this Logstash configuration:
input {
  syslog {
    ecs_compatibility => "disabled"
    port => 514
    timezone => "Asia/Singapore"
  }
}
filter {
  if [facility_label] == "local3" or [facility_label] == "local4" {
    grok {
      match => { "message" => "%{NOTSPACE:event_type}: %{NOTSPACE:event_name}: (?:%{NOTSPACE:vpn_name}) (?:%{NOTSPACE:client_name}) %{GREEDYDATA:message}" }
      overwrite => [ "message" ]
    }
  }
  date {
    match => [ "timestamp", "MMM dd HH:mm:ss", "MMM d HH:mm:ss" ]
    timezone => "Asia/Singapore"
  }
}
output {
  elasticsearch {
    hosts => [ "http://elasticsearch:9200" ]
    user => "elastic"
    password => "${VARIABLE_OF_YOUR_ELASTIC_PASSWORD}"
  }
  stdout { codec => rubydebug }
}
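If you want to sanity-check the grok pattern above without a running ELK stack, the same match can be emulated with a plain regex (`%{NOTSPACE}` is roughly `\S+`, `%{GREEDYDATA}` is roughly `.*`). The sample event line below is illustrative, not an exact Solace log line.

```python
import re

# Rough Python equivalent of the grok pattern used in the filter above:
# %{NOTSPACE:event_type}: %{NOTSPACE:event_name}: (?:%{NOTSPACE:vpn_name}) (?:%{NOTSPACE:client_name}) %{GREEDYDATA:message}
GROK_RE = re.compile(
    r"(?P<event_type>\S+): (?P<event_name>\S+): "
    r"(?P<vpn_name>\S+) (?P<client_name>\S+) (?P<message>.*)"
)

def parse_event(line: str) -> dict:
    """Split a Solace-style event line into the fields the grok filter extracts."""
    m = GROK_RE.match(line)
    return m.groupdict() if m else {}

# Illustrative event line (made up for this sketch):
# parse_event("CLIENT: CLIENT_CLIENT_CONNECT: default app/123 Client connected from 10.0.0.5")
# yields event_type=CLIENT, event_name=CLIENT_CLIENT_CONNECT,
# vpn_name=default, client_name=app/123, message=Client connected from 10.0.0.5
```

Testing the pattern against a handful of real broker events this way is much faster than iterating through Logstash restarts.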