Using Redis as your Online Feature Store: 2021 highlights & 2022 directions

apply(meetup) - Feb '22 - 30 minutes

With the growing business demand for real-time predictions, companies are investing in modernizing their data architectures to support online inference. When companies need to deliver real-time ML applications serving large volumes of online traffic, Redis is most often selected as the foundation for the online feature store because of its ability to deliver ultra-low latency with high throughput at scale. 2021 was a year of significant growth in customers building their online feature stores with Redis; 2022 will see an increase in customers buying commercial off-the-shelf (COTS) feature store software that supports low-latency, high-throughput online inference requirements.

In this talk, Redis will share key observations about the customers, architectural patterns, use cases, and industries adopting Redis as an online feature store. Along the way, Redis will also highlight its integrations with key partners in the feature store and MLOps ecosystem, including Feast, Microsoft Azure, and Tecton.
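As a concrete illustration of one such integration, Feast can be configured to serve features from Redis at inference time. The sketch below is a minimal, hypothetical `feature_store.yaml`; the project name and connection string are placeholders, not from the talk itself:

```yaml
project: fraud_detection          # placeholder project name
registry: data/registry.db        # local file-based feature registry
provider: local
online_store:
  type: redis                     # use Redis as the low-latency online store
  connection_string: "localhost:6379"
```

With a configuration along these lines, offline feature values are materialized into Redis, and the model-serving path reads them back with the sub-millisecond lookups the talk describes.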

Ed Sandoval

Senior Product Manager


Ed Sandoval is a Senior Product Manager at Redis focused on AI/ML. Based in the UK, Ed joined Redis from Salesforce, where he spent 8+ years working on a number of AI/ML products on the Einstein platform. Prior to that, he was an Enterprise Architect at HP Enterprise Services. He holds a Master's in Software Engineering from the University of Oxford.