Java Client: How to use lookup data
Hi all,
what is the best way in Solace to use non-static, structured lookup data in a Java client, without using an external system?
For example, I receive an ID from a source system, and I want to enrich my data based on a lookup and send it back to Solace.
Comments
-
Hi @Dominik,
Can you supply a bit more detail about what you need help with around the non-static structured data? I'm not sure I'm following you. That said, once you have the data, from a Solace perspective you can add it to the message properties and/or the payload of an outgoing message. And if it's something that subscribers might want to filter on, you can consider adding it to the topic name you publish to.
-
Hi @marc,
sure, I will try :-).
So, we are trying to use as few external systems as possible. That's why we want to try to solve each issue with the Solace broker / cache first.
Regarding the use case: we get two different data sets from different source systems, the main data and the lookup data.
The lookup data is updated by the source system roughly weekly. It consists of records like {"id": "1", "value": "My First Value"}, {"id": "2", "value": "My Second Value"}, {"id": "3", "value": "My Third Value"}.
The main data has a huge structure and includes only the "id" field, but it needs the "value" part from the second data source. Of course, the easiest way to solve this would be to use an external system like Cassandra, Hazelcast or Postgres, but the problem is that at the moment we don't have any other external systems.
That's why we are trying to find a good solution / best practice with Solace itself, for example just using a replay each time.
Hope you are able to follow me?
-
Hi @Dominik, the thing to remember is that the Solace broker doesn't look at the payload. So if we can put the data (or perhaps just the key) in the topic, we can do something.
If you can put the data in the payload, you could use something like SolaceCache to hold the latest value for the lookup value. So, during your weekly lookup run you send a message to topic some/topic/id/1 with payload "My First Value". Then when your other system gets ID 1, it just does a cache request on some/topic/id/1 and gets back a message with "My First Value" in the payload. Lookup complete!
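To make that pattern concrete, here is a tiny, illustrative Java model of the cache behavior (this is not the Solace API: a plain map stands in for PubSub+ Cache, which retains the latest message per topic; the topic and payload strings follow the example above):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative model only: PubSub+ Cache retains the latest message per topic.
// A plain map stands in for the cache; a real client would publish via a
// Solace session and issue a cache request instead of calling these methods.
class TopicCacheModel {
    private final Map<String, String> latestByTopic = new HashMap<>();

    // Weekly lookup run: publish "My First Value" to some/topic/id/1, etc.
    // A newer message on the same topic replaces the cached one.
    void publish(String topic, String payload) {
        latestByTopic.put(topic, payload);
    }

    // Consumer side: a cache request on the topic returns the latest payload,
    // or null when nothing has been cached for that topic.
    String cacheRequest(String topic) {
        return latestByTopic.get(topic);
    }
}
```

The point of the sketch is the lookup shape: the key lives in the topic, the value in the payload, and the broker-side cache answers the request, so no external store is needed.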
-
Hi @Dominik,
So let me make sure I understand. Does the main data come in periodically throughout the week via a message/event over Solace? Then, when it's received, you want your app to take the "id", retrieve the "value" from the "lookup" data for that id, and use that to enrich the main data, which is published back onto the Solace broker?
Also, how large is the lookup data? If it's small enough to hold in a local hashmap within your Java app (and less than 30MB, so it fits in a single Solace message), I'd probably have the lookup data published in a message to a topic and attracted to a last value queue (LVQ). Then, when your app starts, I'd have it browse that queue and get the latest lookup data. This way, if your app ever crashes and needs to restart, it will have the latest config waiting on the LVQ, but it can also subscribe to the topic which receives lookup updates to stay current when changes are made. Do you think that would work?
Of course, if my assumptions above are wrong, we need a different solution 😜
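A minimal Java sketch of the LVQ idea, assuming the lookup payload carries the {"id": ..., "value": ...} records shown earlier (the Solace messaging calls are omitted; only the in-memory map and a deliberately naive parser are shown, where a real app would use a JSON library such as Jackson):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: parse a lookup snapshot (as browsed from an LVQ at
// startup) into an in-memory map, and merge in later update messages.
class LookupStore {
    private final Map<String, String> byId = new HashMap<>();

    // Very small parser for the {"id": "...", "value": "..."} records;
    // illustrative only, a real service would use a JSON library.
    private static final Pattern RECORD = Pattern.compile(
        "\\{\\s*\"id\"\\s*:\\s*\"([^\"]+)\"\\s*,\\s*\"value\"\\s*:\\s*\"([^\"]+)\"\\s*\\}");

    // Called once with the payload browsed from the LVQ at startup, and
    // again for every update received on the lookup topic subscription.
    void refresh(String payload) {
        Matcher m = RECORD.matcher(payload);
        while (m.find()) {
            byId.put(m.group(1), m.group(2));
        }
    }

    // The enrichment step: resolve the "value" for a main-data "id",
    // or null when the id is unknown.
    String valueFor(String id) {
        return byId.get(id);
    }
}
```

Because the map is rebuilt from the LVQ on every restart and patched by live updates, the service stays correct after a crash without any external store.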
@TomF any thoughts on the above?
-
An LVQ to hold all the reference data is a really good idea, if the data is small enough (as you mentioned). Or possibly a couple different LVQs, with the data sharded across them as appropriate.
For much larger data sets, I have seen SolCache (now PubSub+ Cache) used to hold all the reference data, getting rehydrated once a day or once a week as required. That said, for this very simple, straightforward map lookup, Hazelcast seems like a good fit? But if you're trying to keep the number of different types of components down (and you already have Solace Cache), then that could definitely be fine to use.
-
Thanks guys!
I thought about it over the last few days and came up with the solution of building a separate microservice that consumes the lookup data, stores it in a hashmap, and offers an API for other microservices. With the LVQ it's even better because of the persistence.
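As a hedged sketch of that microservice's core (all names are illustrative; the Solace consumer and the API layer other services would call are omitted):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical enrichment microservice core: it consumes lookup updates into
// a hashmap and exposes a lookup/enrich operation other services can call
// (e.g. behind an HTTP endpoint, which is not modeled here).
class EnrichmentService {
    private final Map<String, String> lookup = new HashMap<>();

    // Invoked for each lookup record consumed from the LVQ / lookup topic.
    void onLookupUpdate(String id, String value) {
        lookup.put(id, value);
    }

    // Enrich a main-data record (only the relevant fields are modeled as a
    // map here): copy the record and add the "value" resolved from the map.
    // Records with an unknown id are returned unchanged.
    Map<String, String> enrich(Map<String, String> mainData) {
        Map<String, String> enriched = new HashMap<>(mainData);
        String value = lookup.get(mainData.get("id"));
        if (value != null) {
            enriched.put("value", value);
        }
        return enriched;
    }
}
```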
I will go with this approach, thank you!