Let Solace act like it is Kafka to Kafka Consumers

MrSmart
MrSmart Member Posts: 29 ✭✭✭
edited July 2023 in Connectors & Integrations #1

I would like to connect Apache Druid to Solace. However, it seems Druid doesn't support any of the connection types that Solace offers:

https://druid.apache.org/docs/latest/ingestion/index.html#ingestion-methods

One of the solutions would be if Solace offered Kafka consumers the possibility to connect to Solace directly, with Solace acting as if it were Kafka to them. Is this something that's a) possible or b) on the horizon? It would open up a massive number of possibilities when it comes to connecting applications.

Or is my thinking completely off here?
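
To make the idea concrete, here is roughly what I'm picturing on the consuming side: a completely standard Kafka consumer whose bootstrap servers simply point at a Solace broker that speaks the Kafka protocol. The host, port, group ID, and topic name below are made-up placeholders; the point is that the consuming application itself would not change at all.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class SolaceAsKafkaConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical: point a plain Kafka consumer at Solace instead of a Kafka cluster.
            props.put("bootstrap.servers", "solace-broker.example.com:9092");
            props.put("group.id", "druid-ingest");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("orders"));   // placeholder topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("%s -> %s%n", record.key(), record.value());
                    }
                }
            }
        }
    }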

Comments

  • marc
    marc Member, Administrator, Moderator, Employee Posts: 921 admin
    edited July 2023 #2

    Hi @MrSmart,

    Indeed it is. Check out @Aaron's recent blog on the Kafka Proxy and see if it gives you what you need: https://solace.com/blog/introducing-kafka-producer-proxy/

    Video & code are linked there as well.

    Hope that helps!

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭

    Wow @marc, that's exactly what I had in mind, except… it's the other way around 😁

    I need a Kafka Consumer Proxy, to put it in the same wording.

    The goal here is to read data from Solace using Kafka logic (the consumer connects using Kafka settings). Or is that already possible by default with one of the existing connection options?

  • rtomkins
    rtomkins Member, Employee Posts: 23 Solace Employee

    Hey @MrSmart,

    You'd need a Kafka broker between the Apache Druid consumer and Solace, but you should be able to do this with either our Kafka Connect Source Connector (running on Kafka Connect) or our Integrated Kafka Bridge functionality. The latter is currently in Beta, but you can try it out with a release 10.4.1 broker. Would that work?
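
    For reference, the Kafka Connect route is mostly configuration. Below is a rough sketch of registering the Solace source connector through the Kafka Connect REST API; the connector class and sol.* property names are written from memory of the connector's README (please verify them against the current docs), and the hosts, credentials, and topic names are placeholders.

        import java.net.URI;
        import java.net.http.HttpClient;
        import java.net.http.HttpRequest;
        import java.net.http.HttpResponse;

        public class RegisterSolaceSourceConnector {
            public static void main(String[] args) throws Exception {
                // Connector config: read from Solace topics and publish onto a Kafka topic
                // that Druid's Kafka indexing service can then ingest from.
                // Property names are recalled from the connector README -- verify them.
                String connectorJson = """
                        {
                          "name": "solace-source",
                          "config": {
                            "connector.class": "com.solace.connector.kafka.connect.source.SolaceSourceConnector",
                            "tasks.max": "1",
                            "kafka.topic": "druid-input",
                            "sol.host": "tcp://solace-broker.example.com:55555",
                            "sol.vpn_name": "default",
                            "sol.username": "default",
                            "sol.password": "default",
                            "sol.topics": "orders/>"
                          }
                        }
                        """;

                HttpRequest request = HttpRequest.newBuilder()
                        .uri(URI.create("http://kafka-connect.example.com:8083/connectors"))
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(connectorJson))
                        .build();

                HttpResponse<String> response = HttpClient.newHttpClient()
                        .send(request, HttpResponse.BodyHandlers.ofString());
                System.out.println(response.statusCode() + " " + response.body());
            }
        }

    Once the events land in a real Kafka topic this way, Druid's standard Kafka ingestion can consume them.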

  • marc
    marc Member, Administrator, Moderator, Employee Posts: 921 admin
    edited July 2023 #5

    Also @MrSmart, if they have a feature request to support MQTT, JMS or AMQP v1.0, let us know and we'll all go upvote it :)

    I assume what you are hoping to use is the "Kafka Emitter"? That could be used as a template for one of the above. I also saw an older page talking about a RabbitMQ integration, but I imagine that uses AMQP 0.9.1. If it is open source, it might be easy to leverage as a template for supporting one of the above options as well.

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭

    Well, I'd rather have a direct connection between these applications than add another broker just to load data into Druid.

    I guess I came across a Solace <-> Druid limitation, where either Solace or Druid could deliver the solution.

    If a Kafka Consumer Proxy becomes a thing, that would be awesome, also for future projects 🙂

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭
    edited July 2023 #7

    Hi @marc,

    The weird thing is that they used to support AMQP (via a RabbitMQ extension): https://druid.apache.org/docs//0.14.1-incubating/development/extensions-contrib/rabbitmq.html

    But they completely removed that support in version 0.16.0. It's a bit unfortunate. https://druid.apache.org/docs/latest/ingestion/standalone-realtime.html

    This model of stream pull ingestion was deprecated for a number of both operational and architectural reasons, and removed completely in Druid 0.16.0. Operationally, realtime nodes were difficult to configure, deploy, and scale because each node required a unique configuration. The design of the stream pull ingestion system for realtime nodes also suffered from limitations which made it not possible to achieve exactly once ingestion.

    I guess someone could create an extension to make this work; I'm not really a Java enthusiast myself though:

    https://druid.apache.org/docs/latest/development/modules.html

    BTW, the Kafka Emitter sends data out of Druid; I just want to ingest data into Druid that's currently inside Solace.

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭

    Well I guess I wasn't the only one thinking this would be a good idea…

    https://streamnative.io/blog/kafka-on-pulsar-bring-native-kafka-protocol-support-to-apache-pulsar

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭

    @Aaron Maybe a bit of a bold question, but do you have plans to turn your Kafka Producer Proxy around and make it work for consumers so it becomes a Kafka Consumer Proxy?

  • marc
    marc Member, Administrator, Moderator, Employee Posts: 921 admin
    edited July 2023 #10

    Hey @MrSmart,

    "BTW, the Kafka Emitter sends data out of Druid; I just want to ingest data into Druid that's currently inside Solace."

    This might actually make it easier! If that is your use case, I would imagine you could use a Solace RDP (REST Delivery Point) to post events into Druid using REST. Is that an option they have? A Solace RDP essentially takes the messages from a Solace queue and sends them out via REST (basically a webhook). The thing to be cautious of would be your event throughput: if you're expecting hundreds or thousands of events per second, you would now be making hundreds or thousands of REST POSTs into Druid.

    @Aaron explains them more here: https://www.youtube.com/live/TeKgwMz1pZY?feature=share&t=505

  • MrSmart
    MrSmart Member Posts: 29 ✭✭✭

    Hi @marc,

    Druid has an API endpoint you can submit INSERT and REPLACE queries to; I guess that would be a semi-ideal solution. I would still need a transformer script that takes messages from Solace and maps them into queries.
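
    Something along these lines is what I have in mind for that transformer, as a rough sketch only: bind to a Solace queue with JCSMP and post each message as a SQL-based ingestion task to Druid (I'm assuming the /druid/v2/sql/task endpoint here). The queue name, hosts, credentials, and especially the buildInsertSql() mapping are placeholders; the real SQL depends entirely on the datasource schema.

        import java.net.URI;
        import java.net.http.HttpClient;
        import java.net.http.HttpRequest;
        import java.net.http.HttpResponse;
        import com.solacesystems.jcsmp.*;

        public class SolaceToDruid {
            private static final HttpClient HTTP = HttpClient.newHttpClient();

            public static void main(String[] args) throws Exception {
                JCSMPProperties props = new JCSMPProperties();
                props.setProperty(JCSMPProperties.HOST, "tcp://solace-broker.example.com:55555"); // placeholder
                props.setProperty(JCSMPProperties.VPN_NAME, "default");
                props.setProperty(JCSMPProperties.USERNAME, "default");
                props.setProperty(JCSMPProperties.PASSWORD, "default");

                JCSMPSession session = JCSMPFactory.onlyInstance().createSession(props);
                session.connect();

                // Bind to the queue holding the events destined for Druid (placeholder name).
                Queue queue = JCSMPFactory.onlyInstance().createQueue("druid-ingest");
                ConsumerFlowProperties flowProps = new ConsumerFlowProperties();
                flowProps.setEndpoint(queue);
                flowProps.setAckMode(JCSMPProperties.SUPPORTED_MESSAGE_ACK_CLIENT);

                FlowReceiver flow = session.createFlow(new XMLMessageListener() {
                    @Override
                    public void onReceive(BytesXMLMessage msg) {
                        if (!(msg instanceof TextMessage)) {
                            msg.ackMessage(); // this sketch only handles text payloads
                            return;
                        }
                        try {
                            postToDruid(buildInsertSql(((TextMessage) msg).getText()));
                            msg.ackMessage(); // ack only once Druid accepted the task
                        } catch (Exception e) {
                            e.printStackTrace(); // leave unacked so the broker redelivers
                        }
                    }

                    @Override
                    public void onException(JCSMPException e) {
                        e.printStackTrace();
                    }
                }, flowProps);

                flow.start();
                Thread.currentThread().join(); // keep consuming until the process is stopped
            }

            // Hypothetical mapping step: turn a message payload into a Druid SQL ingestion
            // statement (INSERT/REPLACE ... SELECT ... FROM TABLE(EXTERN(...))) for your schema.
            private static String buildInsertSql(String payload) {
                throw new UnsupportedOperationException("map payload to SQL for your schema");
            }

            private static void postToDruid(String sql) throws Exception {
                String body = "{\"query\": \"" + sql.replace("\\", "\\\\").replace("\"", "\\\"") + "\"}";
                HttpRequest req = HttpRequest.newBuilder()
                        .uri(URI.create("http://druid-router.example.com:8888/druid/v2/sql/task")) // placeholder
                        .header("Content-Type", "application/json")
                        .POST(HttpRequest.BodyPublishers.ofString(body))
                        .build();
                HttpResponse<String> resp = HTTP.send(req, HttpResponse.BodyHandlers.ofString());
                if (resp.statusCode() / 100 != 2) {
                    throw new RuntimeException("Druid rejected ingestion task: " + resp.body());
                }
            }
        }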

    I really hope Solace will support the Kafka protocol natively one day; it makes so much sense to offer it as a communication channel to existing applications.