Let Solace act like it is Kafka to Kafka Consumers
I would like to connect Apache Druid to Solace. However, it seems Druid doesn't support any of the connection types that Solace offers.
One solution would be if Solace offered Kafka consumers the possibility to connect directly to Solace, where Solace acts like it is Kafka to them. Is this something that's a) possible or b) on the horizon? It would open up a massive amount of possibilities when it comes to connecting applications.
Or is my thinking completely off here?
Comments
-
Wow @marc, that's exactly what I had in mind, except… it's the other way around 😁
I need a Kafka Consumer Proxy, to put it in the same wording.
The goal here is to read data from Solace using Kafka logic (the consumer connects using Kafka settings). Or is that already possible by default with one of the connection options?
0 -
Hey @MrSmart,
You'd need a Kafka broker between the Apache Druid consumer and Solace, but you should be able to do this either with our Kafka Connect Source Connector running on Kafka Connect, or with our integrated Kafka Bridge functionality. The latter is currently in beta, but you can try it out with a release 10.4.1 broker. Would that work?
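In case it helps, here's a rough sketch of what the source connector config could look like. All host names, credentials, and topic names are placeholders, and the property names should be double-checked against the connector's README:

```properties
# Hypothetical config for the Solace PubSub+ Kafka Source Connector.
# Every host, credential, and topic below is a placeholder.
name=solace-source
connector.class=com.solace.connector.kafka.connect.source.SolaceSourceConnector
tasks.max=1
# Kafka topic that Druid's Kafka ingestion would then read from
kafka.topic=druid-ingest
# Solace SMF endpoint and credentials
sol.host=tcp://solace-broker:55555
sol.vpn_name=default
sol.username=connect-user
sol.password=connect-password
# Solace topic subscription to bridge into Kafka
sol.topics=orders/>
```

With that running on a Kafka Connect worker, Druid's standard Kafka ingestion would consume from `druid-ingest` as if the data had originated in Kafka.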
0 -
Also @MrSmart, if they have a feature request to support MQTT, JMS, or AMQP v1, let us know and we'll all go upvote it :)
I assume what you are hoping to use is the "Kafka Emitter"? That could be used as a template for one of the above. I also saw an older page talking about a RabbitMQ option, but I imagine that uses AMQP 0.9.1. If that is open source, it might be easy to leverage as a template for supporting one of the above options as well.
0 -
Well, I'd rather have a direct connection between these applications instead of another broker just to load data into Druid.
I guess I came across a Solace <-> Druid limitation, where either Solace or Druid could deliver the solution.
If a Kafka Consumer Proxy would become a thing, that would be awesome, also towards future projects 🙂
0 -
Hi @marc,
The weird thing is that they supported AMQP:
But they completely removed that support in version 0.16. It's a bit unfortunate.
This model of stream pull ingestion was deprecated for a number of both operational and architectural reasons, and removed completely in Druid 0.16.0. Operationally, realtime nodes were difficult to configure, deploy, and scale because each node required a unique configuration. The design of the stream pull ingestion system for realtime nodes also suffered from limitations which made it not possible to achieve exactly once ingestion.
I guess someone could create an extension to make this work; not really a Java enthusiast myself, though:
BTW, the Kafka Emitter sends data out of Druid. I just want to ingest data into Druid that's currently inside Solace.
0 -
Hey @MrSmart,
This might actually make it easier! If that is your use case, I would imagine you could use a Solace RDP (REST Delivery Point) to post events into Druid using REST. Is that an option they have? A Solace RDP essentially takes messages from a Solace queue and sends them out via REST (basically a webhook). The thing to be cautious of would be your event throughput: if you're expecting hundreds or thousands of events per second, you would now be making hundreds or thousands of REST POSTs into Druid.
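To make that concrete, here's a rough Python sketch of the SEMP v2 objects you'd create for an RDP. The VPN, queue, host, and endpoint names are all made up, and the exact attribute set should be verified against the SEMP v2 reference:

```python
import json

# Hypothetical names; adjust for your broker/VPN. The three objects below
# mirror the SEMP v2 config hierarchy: an RDP, a REST consumer (the remote
# HTTP endpoint), and a queue binding (which queue feeds the RDP, and the
# request path each message gets POSTed to).
VPN = "default"
BASE = f"/SEMP/v2/config/msgVpns/{VPN}"

rdp = {"restDeliveryPointName": "druid-rdp", "enabled": True}

rest_consumer = {
    "restConsumerName": "druid-endpoint",
    "remoteHost": "druid.example.com",  # placeholder host
    "remotePort": 8888,                 # placeholder port
    "enabled": True,
}

queue_binding = {
    "queueBindingName": "druid-ingest-queue",  # queue the RDP drains
    "postRequestTarget": "/ingest",            # path each message is POSTed to
}

# The corresponding SEMP v2 endpoints (POST each JSON body to its URL
# on the broker's management interface):
requests_to_make = [
    (f"{BASE}/restDeliveryPoints", rdp),
    (f"{BASE}/restDeliveryPoints/druid-rdp/restConsumers", rest_consumer),
    (f"{BASE}/restDeliveryPoints/druid-rdp/queueBindings", queue_binding),
]

for url, body in requests_to_make:
    print(url, json.dumps(body))
```

Once all three objects are enabled, every message arriving on the bound queue is POSTed to the configured remote host and path.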
@Aaron explains them more here:
0 -
Hi @marc,
Druid has an API endpoint you can submit INSERT and REPLACE queries to; I guess that would be a semi-ideal solution. I'd still need a transformer script that takes messages from Solace and maps them into queries.
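For what it's worth, a rough sketch of what that transformer could produce, assuming Druid's SQL-based ingestion (POSTing to `/druid/v2/sql/task`) with an inline EXTERN source. The datasource name, column schema, and sample payload are all made up:

```python
import json

def build_insert_task(datasource: str, rows: list[dict]) -> dict:
    """Wrap a batch of Solace message payloads into a Druid SQL INSERT task.

    Uses the MSQ task engine's inline input source, so the rows travel
    inside the query itself. The schema below is a placeholder; adjust
    the column list and types for your real events.
    """
    inline_data = "\n".join(json.dumps(r) for r in rows)
    input_source = json.dumps({"type": "inline", "data": inline_data})
    input_format = json.dumps({"type": "json"})
    signature = json.dumps([
        {"name": "ts", "type": "string"},
        {"name": "value", "type": "double"},
    ])
    sql = (
        f'INSERT INTO "{datasource}"\n'
        f"SELECT TIME_PARSE(ts) AS __time, value\n"
        f"FROM TABLE(EXTERN('{input_source}', '{input_format}', '{signature}'))\n"
        f"PARTITIONED BY DAY"
    )
    # POST this dict as JSON to http://<druid-router>/druid/v2/sql/task
    return {"query": sql}

task = build_insert_task(
    "solace_events",
    [{"ts": "2024-01-01T00:00:00Z", "value": 1.5}],
)
```

Batching messages before submitting matters here: each POST spawns an ingestion task, so sending one query per Solace message would not scale.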
I really hope Solace will support the Kafka protocol natively one day, it makes so much sense to offer it as a communication channel to existing applications.
0