INSPIRE. CONNECT. EMPOWER!
Whether you’re an individual, a tiny team or a growing startup, you don’t have to take on the world alone. Join us at any of the following events and meet an inspired community of people who are passionate about working together to build a better Houston for all!
Operationalizing Machine Learning Pipeline at Scale for Mission-Critical Apps
February 2 @ 11:30 am - 12:30 pm CST
The significant effort required to bring Machine Learning (ML) models into a usefully deployable form seems to be the main obstacle inhibiting the Cambrian explosion and adoption of ML products across industry verticals right now. Without a standardized approach for training and serving models at scale, it usually takes much longer (approximately 6–11 months) to get ML models developed by Data Scientists into a deployable form that can run continuously in production without glitches. This is not only unacceptable, it inhibits the ability to iterate quickly and course-correct when models are not performing or behaving as expected in the production pipeline.
As a Google Cloud partner, our company MavenCode specializes in building ML pipelines that accelerate getting customers' large-scale systems like this into a production-ready state, leveraging battle-tested approaches and frameworks provided by Google. In the past few months, we have invested heavily in the Kubeflow open-source project and have built a process around the platform for orchestrating and deploying ML and AI models at scale.
In this presentation, I will walk you through the process of bootstrapping a Kubernetes cluster in the cloud for training and serving models at scale using Kubeflow, and also discuss the lessons we have learned along the way.
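To give a flavor of what "orchestrating" means here: a Kubeflow pipeline is essentially a directed acyclic graph (DAG) of containerized steps, where each step runs only after its upstream dependencies finish. The sketch below illustrates that orchestration idea in plain Python using the standard library; the step names are illustrative, and this is not the Kubeflow API itself.

```python
# Conceptual sketch of what a pipeline orchestrator such as Kubeflow does:
# steps form a DAG, and each step runs only after its dependencies complete.
# Step names here are hypothetical examples, not part of any Kubeflow API.
from graphlib import TopologicalSorter

# Each key is a step; its value lists the steps it depends on.
# In Kubeflow, each of these would typically be a container image.
steps = {
    "ingest":     [],              # pull raw data
    "preprocess": ["ingest"],      # clean and feature-engineer
    "train":      ["preprocess"],  # fit the model
    "evaluate":   ["train"],       # validate before release
    "serve":      ["evaluate"],    # deploy for online inference
}

def run_pipeline(dag):
    """Execute steps in dependency order and return the run log."""
    order = TopologicalSorter(dag).static_order()
    log = []
    for step in order:
        # A real orchestrator would launch a pod here; we just record the step.
        log.append(step)
    return log

print(run_pipeline(steps))
```

The value of a platform like Kubeflow is that it handles, at scale, everything this toy executor ignores: scheduling steps onto cluster nodes, retrying failures, passing artifacts between steps, and serving the resulting model.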
We will be using the main conference room downstairs for this meetup. We are limited to a one-hour window, so please be on time.