Serverless API with Cerebrium
Let's deploy our ML model as a REST API using Cerebrium, a serverless deployment platform.
Forget about Docker, IAM roles, and EC2 instances. Serverless ML is about focusing on what differentiates your ML product, not setting up and maintaining infrastructure.
Your REST API endpoint needs to preload the ML model from the CometML Model Registry. For that, set the following secrets in your Cerebrium Dashboard (a sketch of how the endpoint can consume them follows the list):
COMET_ML_WORKSPACE
COMET_ML_API_KEY
COMET_ML_MODEL_NAME
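How these secrets are consumed is up to your endpoint code. Below is a minimal sketch, assuming Cerebrium exposes the secrets as environment variables and that the CometML Python SDK is used to pull a registered model version at startup; the download_model helper, the ./model path, and the version-ordering assumption are illustrative, not part of the course code.

```python
import os

from comet_ml import API

# Cerebrium secrets, assumed here to be exposed as environment variables.
COMET_WORKSPACE = os.environ["COMET_ML_WORKSPACE"]
COMET_API_KEY = os.environ["COMET_ML_API_KEY"]
COMET_MODEL_NAME = os.environ["COMET_ML_MODEL_NAME"]


def download_model(output_dir: str = "./model") -> str:
    """Download a registered model version from the CometML Model Registry."""
    api = API(api_key=COMET_API_KEY)
    model = api.get_model(workspace=COMET_WORKSPACE, model_name=COMET_MODEL_NAME)

    # Assumed newest-first ordering; adjust if your registry orders versions differently.
    latest_version = model.find_versions()[0]
    model.download(version=latest_version, output_folder=output_dir, expand=True)
    return output_dir


# Run at import time so the model is already on disk before the first request.
MODEL_DIR = download_model()
```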
Then run
$ make deploy
[PENDING VIDEO 🎬]
Once the deployment finishes, test the endpoint with
$ make test-endpoint
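Under the hood, the test target calls the deployed REST endpoint. A rough Python equivalent is sketched below, assuming the endpoint URL and auth token that Cerebrium gives you after a successful deploy are exported as environment variables; the variable names, request body shape, and Bearer auth header are assumptions, not the course's actual test code.

```python
import os

import requests

# Placeholders: Cerebrium provides the real endpoint URL and auth token
# after a successful deploy; export them before running this script.
ENDPOINT_URL = os.environ["CEREBRIUM_ENDPOINT_URL"]
AUTH_TOKEN = os.environ["CEREBRIUM_AUTH_TOKEN"]

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    json={"text": "example payload"},  # request schema is an assumption
    timeout=30,
)
response.raise_for_status()
print(response.json())
```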