Abstract

As continuous big data processing gains popularity, there is a growing need to move much of the distributed machine learning functionality to a streaming backend. The most common use case is serving streaming predictions from a model learnt in batch, but in some cases it is also beneficial to update the model on the fly. Streaming learners often require different algorithms than their batch counterparts. The talk discusses the common use cases and the pitfalls of the streaming ML transition through the example of recommender systems. It also offers a dive into the implementation of a Scala library augmenting FlinkML with streaming predictors.
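To make the "train in batch, predict on the stream" pattern concrete, below is a minimal Scala sketch using the recommender example. It is not the API of the library discussed in the talk: the user and item factors are hard-coded stand-ins for the output of a batch matrix factorization job (for instance FlinkML's ALS), and scoring is a plain dot product applied on Flink's DataStream API.

```scala
import org.apache.flink.streaming.api.scala._

object StreamingRecommendations {

  // Hypothetical factors that a batch ALS job would produce;
  // hard-coded here so the sketch is self-contained.
  val userFactors: Map[Int, Array[Double]] = Map(
    1 -> Array(0.3, 0.8),
    2 -> Array(0.5, 0.1))
  val itemFactors: Map[Int, Array[Double]] = Map(
    42 -> Array(0.7, 0.2),
    17 -> Array(0.9, 0.4))

  case class Score(user: Int, item: Int, prediction: Double)

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // In production the (user, item) requests would come from a source
    // such as Kafka; a bounded collection keeps the example runnable.
    val requests: DataStream[(Int, Int)] = env.fromElements((1, 42), (2, 17), (3, 42))

    // Score each request with the batch-trained factors: the dot product of
    // the user and item factor vectors, with a fallback for unseen keys.
    val predictions: DataStream[Score] = requests.map { case (user, item) =>
      val score = (userFactors.get(user), itemFactors.get(item)) match {
        case (Some(uf), Some(itf)) => uf.zip(itf).map { case (a, b) => a * b }.sum
        case _                     => 0.0 // cold start: no factors learnt in batch
      }
      Score(user, item, score)
    }

    predictions.print()
    env.execute("Streaming predictions from a batch-trained model")
  }
}
```

The sketch also hints at one of the pitfalls the talk covers: the streaming job can only score what the batch job has already learnt, so new users and items need either a fallback (as above) or on-the-fly model updates.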

Slides: Marton Balassi Streaming ML with Flink

Video on YouTube

Speaker

Márton Balassi
Solutions Architect, Cloudera

Details