
Seldon Core

Model serving

Seldon Core is an open-source platform for deploying machine learning models on Kubernetes at massive scale.

Use it when

  • You want a serving framework that supports a wide range of ML frameworks.
  • You want a serving framework that supports models written in multiple languages (Python, R, Julia, C++, and Java).
  • You want a Kubernetes-native model serving platform.
  • You want a cloud-agnostic framework.
  • You want pre-built Docker images to get models into production (see the deployment sketch after this list).
  • You want built-in model monitoring features.
  • You need minimal code editing to start serving the model.
  • You want integrated NVIDIA Triton Inference Server support.
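
As a rough illustration of the pre-built servers and minimal-code workflow, the sketch below deploys a scikit-learn model with the pre-packaged SKLEARN_SERVER, assuming the Seldon Core operator is already installed (see Installation below). The namespace, deployment name, and modelUri are placeholders, not values from this page; point modelUri at your own stored model artifacts.

kubectl create namespace seldon

kubectl apply -f - <<EOF
apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  name: iris-model            # example name
  namespace: seldon           # example namespace for deployed models
spec:
  name: iris
  predictors:
    - name: default
      replicas: 1
      graph:
        name: classifier
        implementation: SKLEARN_SERVER            # pre-packaged scikit-learn server image
        modelUri: gs://your-bucket/sklearn/iris   # placeholder: path to your saved model
EOF

Because the pre-packaged server handles loading and exposing the model, no custom serving code is required for this kind of deployment.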

Watch out

  • It is not suited for edge-based or IoT model serving.
  • Running and maintaining Kubernetes clusters just for model serving may not always be worth the operational overhead.

Example stacks

Airflow + MLflow stack

Installation

kubectl create namespace seldon-system
helm install seldon-core seldon-core-operator \
    --repo https://storage.googleapis.com/seldon-charts \
    --set usageMetrics.enabled=true \
    --namespace seldon-system \
    --set istio.enabled=true
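
Once the operator is running, a deployed model (such as the iris-model sketch above) can be queried through Seldon's REST API, here exposed via the Istio ingress gateway. The ingress lookup and payload below are illustrative and depend on your cluster setup.

# Find the Istio ingress address (illustrative; depends on your cluster)
export INGRESS_HOST=$(kubectl -n istio-system get service istio-ingressgateway \
    -o jsonpath='{.status.loadBalancer.ingress[0].ip}')

# Send a prediction request using the Seldon protocol
curl -X POST "http://$INGRESS_HOST/seldon/seldon/iris-model/api/v1.0/predictions" \
    -H "Content-Type: application/json" \
    -d '{"data": {"ndarray": [[5.1, 3.5, 1.4, 0.2]]}}'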
