
Failed to consume a message from destination

I have a queue, QueueA, with many topic subscriptions. We run our microservice on the Spring Cloud Stream stack. QueueA receives messages from different topics and we process them, and then all of a sudden we see the exception stack trace below printing continuously in the logs. I am not sure of the reason; while these logs are pouring out, some messages still come in and are processed in parallel. What could be the reason for the exception below? Can you please help?


"Failed to consume a message from destination QueueA - attempt 1","threadID":"pool-6-thread-2","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 1","threadID":"pool-6-thread-2","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 2","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 2","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 3","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 3","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Processing message 33691b9e-1446-49c9-367b-2dad796dbe1d <failed-message: 65226031-6bb3-a9f8-3be3-00d21996d5d6, source-message: 1, >","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.SolaceErrorMessageHandler"}

"XMLMessage 1: Will be re-queued onto queue QueueA","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.JCSMPAcknowledgementCallbackFactory$JCSMPAcknowledgementCallback"}

"Rebinding flow receiver container c82f7e39-b268-46da-b829-86931723dc90","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.FlowReceiverContainer"}

"Stopping flow receiver container c82f7e39-b268-46da-b829-86931723dc90","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.FlowReceiverContainer"}

"Waiting for 0 items, time remaining: 10000 MILLISECONDS","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.UnsignedCounterBarrier"}

"Unbinding flow receiver container c82f7e39-b268-46da-b829-86931723dc90","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.FlowReceiverContainer"}

"Binding flow receiver container c82f7e39-b268-46da-b829-86931723dc90","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.util.FlowReceiverContainer"}

"Failed to consume a message from destination QueueA - attempt 1","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

"Failed to consume a message from destination QueueA - attempt 1","threadID":"pool-6-thread-3","sourceHost":"Host1","logVersion":"1.5","category":"com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener"}

Comments

  • Kaliappans
    Kaliappans Member Posts: 24

    @TomF @marc Could I get a reply on this post? I am getting a lot of these logs every day.

  • marc
    marc Member, Administrator, Moderator, Employee Posts: 963 admin

    Hi @Kaliappans,

    It looks like your application is failing to process some of the events that it is receiving. Since you have events coming in from different topics, maybe it isn't properly processing messages from one of them? I'd suggest adding some logging to the function handling these events to troubleshoot what is going wrong.
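
    For example, a minimal sketch of what that logging could look like (the class and function names, and the String payload type, are placeholders, not taken from this thread):

    import java.util.function.Consumer;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class EventConsumerConfig {  // hypothetical class name

        private static final Logger log = LoggerFactory.getLogger(EventConsumerConfig.class);

        @Bean
        public Consumer<String> processEvents() {  // use your real payload type instead of String
            return event -> {
                // first-line log: confirms whether the handler is reached at all
                log.info("Entered processEvents with: {}", event);
                try {
                    // ... existing business logic ...
                } catch (RuntimeException e) {
                    // surface the real failure instead of only the binder's "Failed to consume" warning
                    log.error("Processing failed for: {}", event, e);
                    throw e;  // rethrow so the binder's retry/requeue behaviour still applies
                }
            };
        }
    }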

    Also, what version of solace-spring-cloud-bom or the Solace binder are you using? Your logs look a bit different from mine.

    Also, if you haven't seen it yet, check out section 9 of this codelab, which covers the consumer error handling options. It looks like you potentially have all the defaults set: 3 retries and then requeueing the message: https://codelabs.solace.dev/codelabs/spring-cloud-stream-beyond/index.html?index=..%2F..index#8
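
    One of the error handling options covered there is a binding-specific error handler that receives the failure once the binder's retries are exhausted. A rough sketch, assuming the destination QueueA and a hypothetical consumer group myGroup (adjust both to match your binding):

    import org.springframework.integration.annotation.ServiceActivator;
    import org.springframework.messaging.support.ErrorMessage;
    import org.springframework.stereotype.Component;

    @Component
    public class QueueAErrorHandler {  // hypothetical class name

        // Spring Cloud Stream publishes failures to "<destination>.<group>.errors"
        // after the binder's retries (3 attempts by default) are exhausted.
        @ServiceActivator(inputChannel = "QueueA.myGroup.errors")  // destination and group are placeholders
        public void handleError(ErrorMessage errorMessage) {
            // errorMessage.getPayload() is the exception; getOriginalMessage() is the failed message.
            // Decide here whether to log, republish, or drop the failed event.
        }
    }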

  • Kaliappans
    Kaliappans Member Posts: 24

    @marc, we are using the latest version of solace-spring-cloud-bom. The logs keep repeating the same message and printing continuously in the console. One thing I observed from the Solace portal during this exception scenario: some of the messages are moving between the 'Unacknowledged Messages' and 'Messages Redelivered' columns across instances, as below. Do you get any insights from this screenshot?



  • Kaliappans
    Kaliappans Member Posts: 24
    edited March 2023 #5

    I am not sure why some of the messages are not getting acknowledged. These are my configurations, please help:


    connectRetriesPerHost: 1
    reconnectRetries: 20
    connectionTimeout: 4000
    connectRetry: 5
    replyTimeout: 25000
    


  • marc
    marc Member, Administrator, Moderator, Employee Posts: 963 admin
    edited March 2023 #6

    Hi @Kaliappans,

    Unfortunately we'll need more detail to help. This should only happen if message processing is failing, which could be happening in the framework itself or in your message handler. Are the messages making it into your code? What exceptions are you seeing? If none, maybe raise the log level to see what we are missing.

    Note that if you have Solace support you can share your code with them as well and they may be faster to respond :)

    Hope that helps!

  • Kaliappans
    Kaliappans Member Posts: 24

    @marc, as I already mentioned, the exception trace at the top is what keeps printing repeatedly. While messages were being received and acknowledged, all of a sudden the above exception trace started printing and messages are getting requeued. I am not sure of the reason, please help. If you need more details, do let me know, I am ready to share them.

  • marc
    marc Member, Administrator, Moderator, Employee Posts: 963 admin

    Hi @Kaliappans ,

    I definitely need more info to help. Can you change the first line of your method to log something and see if it's getting there? If not, then the problem is likely that Spring can't convert the payload of some messages to the POJO that is expected.
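
    If payload conversion is the suspect, one way to rule it out is to temporarily consume the raw message instead of the POJO and log it before doing anything else. A sketch (class and function names are placeholders; your binding must point at this function):

    import java.nio.charset.StandardCharsets;
    import java.util.function.Consumer;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.Message;

    @Configuration
    public class RawMessageProbe {  // hypothetical class name

        private static final Logger log = LoggerFactory.getLogger(RawMessageProbe.class);

        // Consuming Message<byte[]> sidesteps the POJO conversion step, so the
        // log lines below should fire even when a payload cannot be converted.
        @Bean
        public Consumer<Message<byte[]>> processEvents() {
            return message -> {
                log.info("Headers: {}", message.getHeaders());
                log.info("Raw payload: {}", new String(message.getPayload(), StandardCharsets.UTF_8));
                // ... convert and handle manually once payload and headers look sane ...
            };
        }
    }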

  • Kaliappans
    Kaliappans Member Posts: 24
    edited March 2023 #9

    @marc, as you mentioned, I added a log line at the entry of the method. But when this failure happens, even though I see in the logs that messages are being requeued, control never comes inside the listener method. The logs keep pouring out, messages keep getting requeued, and the failure happens continuously.

  • Kaliappans
    Kaliappans Member Posts: 24
    edited April 2023 #10

    Updating the stack trace again, since we had disabled one package earlier. Please see below and help to find the root cause.


    2023-04-13 16:21:39.086 WARN 27352 --- [pool-3-thread-5] s.b.i.RetryableInboundXMLMessageListener : Failed to consume a message from destination QUE_173707_PLATFORM_EVENT_LOGGING_DEV - attempt 3
    2023-04-13 16:21:39.086 WARN 27352 --- [pool-3-thread-5] s.b.i.RetryableInboundXMLMessageListener : Failed to consume a message from destination QUE_173707_PLATFORM_EVENT_LOGGING_DEV - attempt 3
    2023-04-13 16:21:39.086 WARN 27352 --- [pool-3-thread-5] s.b.i.RetryableInboundXMLMessageListener : Failed to consume a message from destination QUE_173707_PLATFORM_EVENT_LOGGING_DEV - attempt 3
    2023-04-13 16:21:39.086 WARN 27352 --- [pool-3-thread-5] s.b.i.RetryableInboundXMLMessageListener : Failed to consume a message from destination QUE_173707_PLATFORM_EVENT_LOGGING_DEV - attempt 3
    2023-04-13 16:21:39.087 WARN 27352 --- [pool-3-thread-5] s.b.i.RetryableInboundXMLMessageListener : Failed to consume a message from destination QUE_173707_PLATFORM_EVENT_LOGGING_DEV - attempt 3
    2023-04-13 16:21:39.087 ERROR 27352 --- [pool-3-thread-5] o.s.integration.handler.LoggingHandler  : org.springframework.messaging.MessagingException: Incorrect type specified for header 'deliveryAttempt'. Expected [class java.util.concurrent.atomic.AtomicInteger] but actual type is [class java.lang.String]; nested exception is java.lang.IllegalArgumentException: Incorrect type specified for header 'deliveryAttempt'. Expected [class java.util.concurrent.atomic.AtomicInteger] but actual type is [class java.lang.String]
    	at org.springframework.integration.core.ErrorMessagePublisher.determinePayload(ErrorMessagePublisher.java:186)
    	at org.springframework.integration.core.ErrorMessagePublisher.publish(ErrorMessagePublisher.java:162)
    	at org.springframework.integration.handler.advice.ErrorMessageSendingRecoverer.recover(ErrorMessageSendingRecoverer.java:83)
    	at com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener.lambda$handleMessage$14(RetryableInboundXMLMessageListener.java:65)
    	at org.springframework.retry.support.RetryTemplate.handleRetryExhausted(RetryTemplate.java:539)
    	at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:387)
    	at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:225)
    	at com.solace.spring.cloud.stream.binder.inbound.RetryableInboundXMLMessageListener.handleMessage(RetryableInboundXMLMessageListener.java:60)
    	at com.solace.spring.cloud.stream.binder.inbound.InboundXMLMessageListener.receive(InboundXMLMessageListener.java:113)
    	at com.solace.spring.cloud.stream.binder.inbound.InboundXMLMessageListener.run(InboundXMLMessageListener.java:73)
    	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.IllegalArgumentException: Incorrect type specified for header 'deliveryAttempt'. Expected [class java.util.concurrent.atomic.AtomicInteger] but actual type is [class java.lang.String]
    	at org.springframework.messaging.MessageHeaders.get(MessageHeaders.java:216)
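
    The 'Caused by' above points at a reserved Spring Integration header: 'deliveryAttempt' is expected to hold an AtomicInteger, but on this message it arrives as a String, so the error-publishing path fails and the handler's first-line log never fires. One plausible reading is that a publisher is setting a user property named deliveryAttempt on its messages. A minimal publisher-side guard, as a sketch (the helper class is hypothetical, not from this thread), would strip that header before sending:

    import org.springframework.integration.IntegrationMessageHeaderAccessor;
    import org.springframework.integration.support.MessageBuilder;
    import org.springframework.messaging.Message;

    public final class ReservedHeaderGuard {  // hypothetical helper

        // Rebuilds an outbound message without the "deliveryAttempt" header, which
        // Spring Integration reserves for its own retry bookkeeping (an AtomicInteger).
        public static Message<?> withoutReservedHeaders(Message<?> outbound) {
            return MessageBuilder.fromMessage(outbound)
                    .removeHeader(IntegrationMessageHeaderAccessor.DELIVERY_ATTEMPT)
                    .build();
        }
    }

    If the publisher is not a Spring application, the equivalent fix would simply be not to set a message property named deliveryAttempt, or to publish it under a different name.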
    
  • Kaliappans
    Kaliappans Member Posts: 24

    Can I get some response to my query? @marc @Tamimi @giri

  • Aaron
    Aaron Member, Administrator, Moderator, Employee Posts: 644 admin

    Hi @Kaliappans. Our Solace Community is not actively monitored by our Support team. This is a best-effort forum for questions and discussions. If you are a Solace customer and require more immediate action, please email support@solace.com.

    I have replied to your other thread; it appears similar to this one. Without additional detail, there's not too much to go on. My guess is that your publisher (publishers?) is not encoding things properly, and the payload is not exactly what the consumer is expecting. So what does the published message that's sitting in the queue look like?
