Model Deployment at Scale on Kubernetes
Yatai (屋台, food cart) lets you deploy, operate, and scale Machine Learning services on Kubernetes.
It supports deploying any ML model via BentoML, the unified model serving framework.
🍱 Made for BentoML, deploy at scale
Scale BentoML to its full potential on a distributed system, optimized for cost saving and performance.
Manage deployment lifecycle to deploy, update, or roll back via API or Web UI.
Centralized registry providing the foundation for CI/CD via artifact management APIs, labeling, and WebHooks for custom integration.
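As one way to picture the Kubernetes-native workflow, here is a hypothetical sketch of declaring a deployment as a custom resource; the field names and API version shown are illustrative assumptions, not the exact CRD schema — consult the Yatai documentation for the real definition.

```yaml
# Illustrative sketch only: field names and apiVersion are assumptions,
# not the authoritative BentoDeployment CRD schema.
apiVersion: serving.yatai.ai/v2alpha1
kind: BentoDeployment
metadata:
  name: iris-classifier
spec:
  # Reference to a bento in the centralized registry (name:tag)
  bento: iris_classifier:latest
  autoscaling:
    minReplicas: 1
    maxReplicas: 4
```

A resource like this could be applied with `kubectl`, versioned in Git for CI/CD, or managed through the API and Web UI described above.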
🚅 Cloud native & DevOps friendly
The BentoML Blog and @bentomlai on Twitter are the official sources for updates from the BentoML team. Anything important, including major releases and announcements, will be posted there. We also frequently share tutorials, case studies, and community updates.
Yatai has a thriving open-source community where hundreds of ML practitioners contribute to the project, help other users, and discuss all things MLOps. 👉 Join us on Slack today!