I tried to dockerize a streaming model by following the example streaming server. I built and ran it with something like `mlserver build streaming_model/ -t stream_ml_service`, then `docker run -it --rm -p 8080:8080 stream_ml_service`.
But it raised:
```
Traceback (most recent call last):
  File "/opt/conda/bin/mlserver", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 269, in main
    root()
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 24, in wrapper
    return asyncio.run(f(*args, **kwargs))
  File "/opt/conda/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/opt/conda/lib/python3.10/site-packages/mlserver/cli/main.py", line 47, in start
    server = MLServer(settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/server.py", line 32, in __init__
    self._metrics_server = MetricsServer(self._settings)
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 26, in __init__
    self._app = self._get_app()
  File "/opt/conda/lib/python3.10/site-packages/mlserver/metrics/server.py", line 30, in _get_app
    app.add_route(self._settings.metrics_endpoint, self._endpoint.handle_metrics)
  File "/opt/conda/lib/python3.10/site-packages/starlette/applications.py", line 166, in add_route
    self.router.add_route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 833, in add_route
    route = Route(
  File "/opt/conda/lib/python3.10/site-packages/starlette/routing.py", line 226, in __init__
    assert path.startswith("/"), "Routed paths must start with '/'"
AssertionError: Routed paths must start with '/'
```
Does anyone know how to dockerize a streaming model? Thanks.
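The assertion itself comes from Starlette: it rejects a route whose path does not start with `/`, and the failing call passes `self._settings.metrics_endpoint` as that path. So a plausible cause (an assumption, not confirmed against the image) is that `metrics_endpoint` resolves to an empty or relative value inside the built container, e.g. via an unset or blank `MLSERVER_METRICS_ENDPOINT` environment variable. A minimal sketch of pinning it to an absolute path in the model's `settings.json` (`metrics_endpoint` is an MLServer server-level setting; the value shown is illustrative):

```json
{
    "metrics_endpoint": "/metrics"
}
```

Alternatively, MLServer's settings allow the metrics endpoint to be disabled by setting this field to `null`; verify either option against the settings reference for your MLServer version before relying on it.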