Solace broker and ELK integration
I am not able to apply the Solace grok pattern (Solace event log) in my Logstash config file:
```
if [type] == "syslog" {
  grok { match => { "message" => "%{SOLACE_EVENT_LOG}" } }
  syslog_pri {}
}
```
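One way to sanity-check what a grok pattern expects, outside of Logstash, is to expand it into a plain regular expression and test it against a sample log line. The sketch below uses simplified, hand-written stand-ins for the grok sub-patterns; the regexes and the sample line are illustrative only, not the official definitions from the integration guide:

```python
import re

# Simplified stand-ins for the grok sub-patterns behind SOLACE_EVENT_LOG
# (illustrative approximations, not the appendix's authoritative definitions).
SYSLOG_PREAMBLE = r"<(?P<syslog_pri>\d+)>(?P<syslog_timestamp>\w{3} +\d+ [\d:]+) (?P<syslog_hostname>\S+)"
EVENT_EPILOGUE = (
    r"(?P<solace_event_log_tag>\w+): (?P<solace_scope>\w+): "
    r"(?P<solace_event_id>\w+): (?P<solace_vpn>\S+) (?P<solace_client>\S+) (?P<solace_message>.*)"
)
SOLACE_EVENT_LOG = SYSLOG_PREAMBLE + " " + EVENT_EPILOGUE

sample = ("<142>Nov 18 21:30:05 demo-tr event: SYSTEM: "
          "SYSTEM_AUTHENTICATION_SESSION_CLOSED: - - SEMP session closed")

m = re.match(SOLACE_EVENT_LOG, sample)
print(m.group("solace_event_id"))  # SYSTEM_AUTHENTICATION_SESSION_CLOSED
```

If the equivalent regex fails on your real broker output, the mismatch is in the pattern, not in Logstash itself.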
Comments
-
Hi @Kinjal Shah !
Here's one of my logstash config files. It might be a bit complicated, but it should give you some pointers. https://github.com/aaron-613/solace-logging-config/blob/master/solace_logstash.conf
Other options are here, along with a bit of a write-up: https://github.com/aaron-613/solace-logging-config/
BTW, did you try Googling for just "solace logstash"? Here are a couple of other resources from our webpage:
Blog: https://solace.com/blog/integrating-solace-with-logstash/
Integration Guide: https://docs.solace.com/Developer-Tools/Integration-Guides/Elk-Stack.htm
-
Double post! Just saw your other one. OK: did you notice the appendices at the end of the PDF document? All the various patterns are defined there, and a full config file is given.
If you've copy-pasted all of those from the appendix, is your Logstash giving any errors? Grok parsing errors, perhaps?
-
I'm getting the error below when I copy the Logstash config file provided in the appendix of the PDF:
```
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: create at org/logstash/execution/ConvergeResultExt.java:129
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: add at org/logstash/execution/ConvergeResultExt.java:57
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: converge_state at /usr/share/logstash/logstash-core/lib/logstash/agent.rb:355
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: [2020-10-28T13:13:47,675][ERROR][logstash.agent ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle Java::JavaLang::IllegalStateException for PipelineAction::Create<main>", :backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in `create'", "org/logstash/execution/ConvergeResultExt.java:57:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:355:in `block in converge_state'"]}
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: [2020-10-28T13:13:47,684][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle Java::JavaLang::IllegalStateException for PipelineAction::Create<main>>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in `create'", "org/logstash/execution/ConvergeResultExt.java:57:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:355:in `block in converge_state'"]}
Oct 28 13:13:47 ip-172-32-2-49.us-east-2.compute.internal logstash[3847]: [2020-10-28T13:13:47,736][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
```
-
```
Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:126)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:80)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1156)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:43)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:82)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:939)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$VARARGS(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:82)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:342)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.lang.Thread.run(Thread.java:748)"]}
```
-
It's unable to configure the pipeline.
-
@arih I believe you were recently using Solace + ELK. Any chance you ran into this issue or know what we might be missing?
It looks like @Kinjal Shah is having an issue getting the Solace grok patterns for event logs (the ones defined in the appendix of the integration guide) to work.
-
Hi @marc ,
My blog post talks about the other side of the integration - where Logstash consumes logs from Solace Queue using JMS. So I didn't play around with config like what @Kinjal Shah and @Aaron did.
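For anyone curious about that consuming side, it roughly looks like the sketch below. This is a hedged illustration, not taken from the blog post: the option names come from the logstash-input-jms plugin, and the file path, YAML section, and queue name are placeholders you would replace with your own Solace JNDI connection settings (see the plugin documentation and the blog post for the authoritative config):

```
input {
  jms {
    # Connection details (JNDI lookup for the Solace connection factory)
    # live in a separate YAML file; these names are placeholders.
    yaml_file    => "/etc/logstash/jms.yml"
    yaml_section => "solace"
    destination  => "logging_queue"   # the Solace queue Logstash consumes from
    pub_sub      => false             # queue semantics, not topic
  }
}
```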
-
Hi @Kinjal Shah..! I have recently been playing around with Logstash again, reacquainting myself with it. Using the Integration PDF document you reference, I copied the two appendices into two separate files: one for patterns, and one for config. I was able to get it working without too much issue. The one thing I changed was to add a specific link to the patterns file inside each grok block in the configuration. I think there's probably a better way to do that, but I'm just trying to get it working. So, for me: I have a patterns file (copied from the appendix):
```
#
# TOP-LEVEL PATTERNS
#
SOLACE_REMOTE_COMMAND %{SYSLOG_PREAMBLE} %{SOLACE_MGMT_REMOTE_USER_INFO}\s*%{SOLACE_MGMT_EPILOGUE}
SOLACE_SHELL_COMMAND %{SYSLOG_PREAMBLE} %{SOLACE_MGMT_SHELL_USER_INFO}\s*%{SOLACE_MGMT_EPILOGUE}
SOLACE_EVENT_LOG %{SYSLOG_PREAMBLE} %{SOLACE_EVENT_EPILOGUE}

# ..... [USERID]\[[PID]\]: [CMDSRC]/[ignored] \s* [CLIENT-ADDRESS]
# ..... devAdmin[14970]: SEMP/mgmt 14.140.217.68
# ..... admin[27647]: CLI/1 69.204.252.14
SOLACE_MGMT_REMOTE_USER_INFO %{SOLACE_MGMT_LOCAL_USER_INFO}/%{WORD}\s*%{IPORHOST:solace_client_address}

# ..... [USERID]\[[PID]\]: [CMDSRC] \s* [ignored] \s*
# ..... support[6528]: SHELL CLI/1
SOLACE_MGMT_SHELL_USER_INFO %{SOLACE_MGMT_LOCAL_USER_INFO}\s*%{NOTSPACE}

# ..... [USERID]\[[PID]\]: [CMDSRC]
# ..... rbc_devAdmin[14970]: SEMP/mgmt
# ..... support[6528]: SHELL
SOLACE_MGMT_LOCAL_USER_INFO %{DATA:syslog_userid}\[%{POSINT:syslog_pid}\]: %{WORD:solace_cmd_source}

# <[PRI]>[TIMESTAMP] [SERVERNAME]
# <142>Nov 18 21:30:05 demo-tr
SYSLOG_PREAMBLE <%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}

# ..... [EVTAG] [SCOPE]: [EVENT_ID] : [MESSAGE]
# ..... event: SYSTEM: SYSTEM_AUTHENTICATION_SESSION_CLOSED: - - SEMP session (gory details etc.)
# ..... heinzvpnINFO: CLIENT: CLIENT_CLIENT_CLOSE_FLOW: kov perf-130-81/31733/#00000001 Client (702) (gory details etc.)
SOLACE_EVENT_EPILOGUE %{WORD:solace_event_log_tag}: %{WORD:solace_scope}: %{WORD:solace_event_id}: %{NOTSPACE:solace_vpn} %{NOTSPACE:solace_client} %{GREEDYDATA:solace_message}

# ..... [IGNORED] \s* [START]\s*[END] \s*[STATUS] [MESSAGE]
# ..... rbc_devAdmin 09:16:57 09:16:57 ok show queue (etc.)
# ..... admin 21:30:00 21:30:05 ok (config)# show syslog
# ..... admin --- --- --- (/usr/sw/jail/logs) tail -f command.log
SOLACE_MGMT_EPILOGUE %{WORD}\s*%{NOTSPACE:solace_cmd_start_time}\s*%{NOTSPACE:solace_cmd_end_time}\s*%{NOTSPACE:solace_cmd_status}\s*%{GREEDYDATA:solace_message}
```
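As a quick cross-check of what the management-command patterns extract, here is a rough Python translation of the SOLACE_MGMT_LOCAL_USER_INFO plus the remote-user suffix, run against one of the sample lines from the comments in the patterns file. The Python regex is my own simplified approximation, not a generated equivalent of the grok definitions:

```python
import re

# Approximate Python equivalent of SOLACE_MGMT_LOCAL_USER_INFO followed by
# the remote-user suffix (command source / ignored / client IP address).
MGMT_REMOTE = (
    r"(?P<syslog_userid>[\w.]+)\[(?P<syslog_pid>\d+)\]: "
    r"(?P<solace_cmd_source>\w+)/\w+\s*(?P<solace_client_address>[\d.]+)"
)

# Sample command.log fragment from the patterns file's comments
line = "devAdmin[14970]: SEMP/mgmt 14.140.217.68"
m = re.match(MGMT_REMOTE, line)
print(m.group("solace_cmd_source"), m.group("solace_client_address"))  # SEMP 14.140.217.68
```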
And then I have the configuration file, which I modified to point to that patterns file:
```
input {
  tcp {
    port => "51420"
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    #### MGMT: command.log SEMP, CLI, SHELL
    if [message] =~ /: CLI\/|: SEMP\/|: SHELL/ {
      grok {
        patterns_dir => ["/etc/logstash/patterns"]
        match => { "message" => "%{SOLACE_REMOTE_COMMAND}" }
        match => { "message" => "%{SOLACE_SHELL_COMMAND}" }
        add_field => { "solace_event_id" => "MGMT_%{solace_cmd_source}" }
        add_field => { "solace_scope" => "MGMT" }
      }
    }
    ### EVENTS: event.log or system.log w/ vpn-specific tagging
    else if [message] =~ / CLIENT:| VPN:| SYSTEM:/ {
      grok {
        patterns_dir => ["/etc/logstash/patterns"]
        match => { "message" => "%{SOLACE_EVENT_LOG}" }
      }
    }
    ### UNKNOWN: just parse the SYSLOG basics and force the rest into the solace_message field
    else {
      grok {
        patterns_dir => ["/etc/logstash/patterns"]
        match => { "message" => "%{SYSLOG_PREAMBLE} %{GREEDYDATA:solace_message}" }
        # Set solace fields so we can search for these cases
        add_field => { "solace_event_id" => "UNKNOWN" }
        add_field => { "solace_scope" => "UNKNOWN" }
      }
    }
    # Does the nasty parsing of the syslog_pri field into facility+severity
    # Have to wait til the SYSLOG_PREAMBLE has been grokked first
    syslog_pri {}
  }
}

output {
  #file { path => "/tmp/log_everything.log" }
  if [type] == "syslog" and "_grokparsefailure" in [tags] {
    file { path => "./logs/failed_syslog_events.log" }
  }
  elasticsearch { host => localhost }
}
```
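The filter section routes each message to a pattern family based on simple substring matches. That routing decision alone can be sketched in Python to predict which branch a given log line will hit (a simplification of the config's conditionals, nothing more; the regexes mirror the two `=~` tests):

```python
import re

def route(message: str) -> str:
    """Mimic the branch selection in the Logstash filter block:
    MGMT for command.log lines, EVENT for event/system logs,
    UNKNOWN for everything else."""
    if re.search(r": CLI/|: SEMP/|: SHELL", message):
        return "MGMT"
    if re.search(r" CLIENT:| VPN:| SYSTEM:", message):
        return "EVENT"
    return "UNKNOWN"

print(route("<142>Nov 18 21:30:05 demo-tr admin[27647]: CLI/1 69.204.252.14 show queue"))  # MGMT
print(route("<142>Nov 18 21:30:05 demo-tr event: SYSTEM: SYSTEM_AUTHENTICATION_SESSION_CLOSED: - - x"))  # EVENT
```

If lines keep landing in the UNKNOWN branch (or in failed_syslog_events.log with a _grokparsefailure tag), the routing substrings are a good first place to look.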
Hopefully that helps! Let me know.