Getting getSplitBacklogBytes() unable to read bytes error after draining Streaming Dataflow job
I am trying to read messages from Solace queues through the Solace I/O connector and write them to a GCS bucket. I was able to read and write the messages to GCS, but since this is a streaming job I need to drain it, and when I drain the streaming Dataflow job I get the error below.
Error:
getSplitBacklogBytes() unable to read bytes from: queue name
Encountered a Parser Exception querying queue depth: java.lang.NullPointerException: Failed to evaluate /rpc-reply/rpc/show/queue/queues/queue/info/current-spool-usage-in-bytes in <?xml version="1.0" encoding="UTF-8" standalone="no"?>
What might be the reason for this error?
Comments
-
Hi @abhitej,
Can you share more info about which Solace I/O connector you're using? Also, out of curiosity, why are you querying queue depth via a SEMP command? In general you want to register a listener in an event-driven paradigm so that your app gets messages pushed to it when they are available; that way you shouldn't need to poll to see whether messages are available (see the sketch at the end of this comment). If for some reason you really do need to poll using that SEMP command, you'll need to enable "show" commands. You'll find more info in this docs section: https://docs.solace.com/SEMP/Using-Legacy-SEMP.htm#Configur
Note the sub-section "Allowing Access to Show Commands".
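To illustrate the listener approach outside of Beam, here is a minimal sketch using the Solace Java (JCSMP) API. The host, VPN, credentials, and queue name are all placeholders, not values from your setup; the point is that messages are pushed to the listener as they arrive, so nothing needs to poll queue depth:

```java
import com.solacesystems.jcsmp.BytesXMLMessage;
import com.solacesystems.jcsmp.ConsumerFlowProperties;
import com.solacesystems.jcsmp.FlowReceiver;
import com.solacesystems.jcsmp.JCSMPException;
import com.solacesystems.jcsmp.JCSMPFactory;
import com.solacesystems.jcsmp.JCSMPProperties;
import com.solacesystems.jcsmp.JCSMPSession;
import com.solacesystems.jcsmp.Queue;
import com.solacesystems.jcsmp.XMLMessageListener;

public class QueueListenerSketch {
    public static void main(String[] args) throws JCSMPException, InterruptedException {
        // Placeholder connection details -- replace with your broker, VPN, and credentials.
        JCSMPProperties props = new JCSMPProperties();
        props.setProperty(JCSMPProperties.HOST, "tcp://localhost:55555");
        props.setProperty(JCSMPProperties.VPN_NAME, "default");
        props.setProperty(JCSMPProperties.USERNAME, "user");
        props.setProperty(JCSMPProperties.PASSWORD, "password");

        JCSMPSession session = JCSMPFactory.onlyInstance().createSession(props);
        session.connect();

        // Bind to the queue; the broker pushes messages to the listener as they arrive.
        Queue queue = JCSMPFactory.onlyInstance().createQueue("my-queue");
        ConsumerFlowProperties flowProps = new ConsumerFlowProperties();
        flowProps.setEndpoint(queue);
        flowProps.setAckMode(JCSMPProperties.SUPPORTED_MESSAGE_ACK_CLIENT);

        FlowReceiver flow = session.createFlow(new XMLMessageListener() {
            @Override
            public void onReceive(BytesXMLMessage msg) {
                System.out.println("Received message on: " + msg.getDestination());
                msg.ackMessage(); // acknowledge after successful processing
            }

            @Override
            public void onException(JCSMPException e) {
                e.printStackTrace();
            }
        }, flowProps);

        flow.start();
        Thread.sleep(60_000); // keep the consumer alive for this sketch
        flow.close();
        session.closeSession();
    }
}
```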
-
Hello @abhitej, are you using this Beam I/O connector? It requires "show" SEMP commands to be enabled (see the excerpt from the documentation below).
https://github.com/SolaceProducts/solace-apache-beam#allow-apache-beam-to-detect-message-backlog-to-scale-workers

Allow Apache Beam to Detect Message Backlog to Scale Workers: Apache Beam uses the message backlog as one of its parameters to determine whether or not to scale its workers. To detect the amount of backlog that exists for a particular queue, the Beam I/O Connector sends a SEMP-over-the-message-bus request to the broker. But for it to be able to do this, show commands for SEMP-over-the-message-bus must be enabled.
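To make the failure concrete: the NullPointerException in the original post is what you would expect when that XPath cannot be resolved in the SEMP reply, which is the case when show commands are not permitted. The snippet below is not the connector's actual code, just a minimal sketch of that parsing step using the JDK's built-in XPath API, with a hard-coded placeholder reply standing in for what a broker might return when the request is denied:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.InputSource;

public class QueueDepthParseSketch {
    // The XPath named in the connector's error message.
    private static final String SPOOL_USAGE_XPATH =
            "/rpc-reply/rpc/show/queue/queues/queue/info/current-spool-usage-in-bytes";

    public static void main(String[] args) throws Exception {
        // Placeholder reply: a stand-in for a broker response when "show" commands
        // are NOT permitted -- there is no <show> subtree, so the XPath matches nothing.
        String deniedReply =
                "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>"
                + "<rpc-reply semp-version=\"soltr/x_x\">"
                + "<permission-error>permission denied</permission-error>"
                + "</rpc-reply>";

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(deniedReply)));

        XPath xpath = XPathFactory.newInstance().newXPath();
        Node node = (Node) xpath.evaluate(SPOOL_USAGE_XPATH, doc, XPathConstants.NODE);

        // node is null because the element is missing from the reply; using it without
        // a null check surfaces as the NullPointerException that getSplitBacklogBytes()
        // reports when it cannot read the queue depth.
        long spoolUsageBytes = Long.parseLong(node.getTextContent()); // NPE here
        System.out.println(spoolUsageBytes);
    }
}
```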
Are you running the broker in your own container environment (e.g. Docker), or are you using a Solace Cloud service?