Building an open source Scala gRPC/REST HTTP Proxy for Kafka (I)
Day 0 of my journey building my first Scala GraalVM ZIO application
Caveat: Normally I write posts about how I tackled a problem and present the solution on a silver platter.
In this series I’d like to take you on a journey of exploration. I have no idea how fast it will progress or where it will end up. I will try to work on it at least one day a week.
There are far better tutorials and write-ups for each of these tools (ZIO, Kafka, gRPC/REST APIs, GraalVM), but I want to show that even people who have worked in software engineering for years still don’t know everything, still have to learn, and are beginners again when broaching new terrain.
Follow along at your own risk ;-)
I’ve been working with Scala for quite some time, but as most Scala devs know there is this separation between the Scala Spark crowd (California School) vs the Scala Functional Programming crowd (Glasgow School).
Even as a member of the former, I’ve always tried to write my code in as FP a style as possible. Truth be told, though, the Spark API does not require that level of strictness, and it can even feel like you are being punished for trying to do things the pure, typed way.
In my work as a data engineer, I do come across problems that don’t require a distributed data solution (Spark, Flink, Hadoop, Kafka Streams, etc.) but are more mundane: getting data into Kafka in the first place.
All the tools I use have great connectors for this, but for a running application that has to send data, connecting to Kafka directly can be cumbersome. That’s why we need an easy-to-use API endpoint that we can POST messages to via REST or gRPC, and that handles the interaction with Kafka for us, including validating the messages against a schema registry before publishing them.
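To make the idea concrete, here is a minimal sketch of that flow in plain Scala: validate an incoming message, then publish it. Every name here (`ProxyRequest`, `ProxySketch`, etc.) is invented for illustration, the "schema registry" is a hard-coded map, and the "producer" is a stub; the real thing would parse the payload against a registered Avro/JSON schema and call an actual Kafka producer.

```scala
// Hypothetical sketch of the proxy's core flow. All names are invented
// for illustration; none of this is real ZIO or Kafka client code.
final case class ProxyRequest(topic: String, payload: String)

sealed trait ProxyResult
case object Published extends ProxyResult
final case class Rejected(reason: String) extends ProxyResult

object ProxySketch {
  // Stand-in for a schema-registry lookup: topic -> required field names.
  private val schemas: Map[String, Set[String]] =
    Map("clicks" -> Set("user", "ts"))

  // Naive "validation": check that every required field name appears in
  // the payload. A real proxy would validate against the actual schema.
  def validate(req: ProxyRequest): Either[String, ProxyRequest] =
    schemas.get(req.topic) match {
      case None => Left(s"no schema registered for topic ${req.topic}")
      case Some(fields) =>
        val missing = fields.filterNot(req.payload.contains)
        if (missing.isEmpty) Right(req)
        else Left(s"missing fields: ${missing.mkString(", ")}")
    }

  // Stand-in for the Kafka producer call.
  def publish(req: ProxyRequest): ProxyResult = Published

  // The whole request path: validate, then publish or reject.
  def handle(req: ProxyRequest): ProxyResult =
    validate(req).fold(Rejected(_), publish)
}
```

The point of the sketch is just the shape: the HTTP/gRPC layer, the validation step, and the producer are three separable concerns, which is the structure I hope to reproduce with ZIO later in the series.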
Of course this already exists, most notably kafka-pixy, but I’d like to build my own using Scala libraries I don’t use often. My wish is to build it for both Scala 2 and 3, using the ZIO framework, and to run it on GraalVM.