How to Leverage the Apache Kafka Ecosystem to Productionize Machine Learning
This talk shows how to productionize Machine Learning models in mission-critical, scalable real-time applications by leveraging Apache Kafka as a streaming platform. It discusses the relationship between Machine Learning frameworks such as TensorFlow, DeepLearning4J, or H2O and the Apache Kafka ecosystem. A live demo shows how to build a Machine Learning environment from different Kafka components: Kafka messaging and Kafka Connect for data movement, Kafka Streams for model deployment and real-time inference, and KSQL for real-time analytics of predictions, accuracy, and alerts.
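As a minimal sketch of the KSQL piece described above, a continuous query could filter a stream of model predictions and emit low-confidence results as alerts. The stream and column names (`predictions`, `model_name`, `prediction`, `confidence`) are hypothetical placeholders, not names from the talk:

```sql
-- Hypothetical KSQL query: continuously watch the predictions stream
-- and surface low-confidence predictions as a new alerts stream.
-- Stream and column names are assumed for illustration only.
CREATE STREAM anomaly_alerts AS
  SELECT model_name, prediction, confidence
  FROM predictions
  WHERE confidence < 0.5;
```

Because the query runs continuously, every new record arriving on the predictions stream is evaluated in real time, with matching records appended to the derived alerts stream.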
Kai Wähner works as a Technology Evangelist at Confluent. Kai's main areas of expertise are Big Data Analytics, Machine Learning, Integration, Microservices, Internet of Things, Stream Processing, and Blockchain. He is a regular speaker at international conferences such as JavaOne, O'Reilly Software Architecture, and ApacheCon, writes articles for professional journals, and shares his experiences with new technologies on his blog (www.kai-waehner.de/blog). Contact and references: firstname.lastname@example.org / @KaiWaehner / www.kai-waehner.de