Prepare Redis Service

Backend.AI uses Redis as the message queue (event bus) and the cache backend. To launch the service with docker compose, create the file $HOME/halfstack/redis-cluster-default/docker-compose.yaml and populate it with the following YAML. Feel free to adjust the volume paths and port settings. If needed, refer to the latest configuration (it is a symbolic link, so follow the file it points to).
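
Before writing the file, you may create the directories it references (a small preparation sketch; the data directory matches the volumes entry in the YAML below):

$ mkdir -p ${HOME}/halfstack/redis-cluster-default
$ mkdir -p ${HOME}/.data/backend.ai/redis-data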

x-base: &base
   logging:
      driver: "json-file"
      options:
         max-file: "5"
         max-size: "10m"

services:
   backendai-half-redis:
      <<: *base
      container_name: backendai-halfstack-redis
      image: redis:6.2-alpine
      restart: unless-stopped
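      # Run redis-server with a password and append-only (AOF) persistence enabled.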
      command: >
         redis-server
         --requirepass develove
         --appendonly yes
      volumes:
         - "${HOME}/.data/backend.ai/redis-data:/data:rw"
      healthcheck:
         test: ["CMD", "redis-cli", "--raw", "incr", "ping"]
         interval: 10s
         timeout: 3s
         retries: 10
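      # Publish Redis's default port 6379 on host port 8110.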
      ports:
         - "8110:6379"
      networks:
         - half_stack
      cpu_count: 1
      mem_limit: "2g"

networks:
   half_stack:

Execute the following commands to start the service container. The project name ${USER} is added for operational convenience.

$ cd ${HOME}/halfstack/redis-cluster-default
$ docker compose -p "${USER}" up -d
$ # -- To terminate the container:
$ # docker compose -p "${USER}" down
$ # -- To see the container logs:
$ # docker compose -p "${USER}" logs -f
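
To confirm the service is running, check the compose status and, if the redis-cli client is installed on the host, send a PING to the mapped port (a quick verification sketch; port 8110 and the password develove come from the compose file above):

$ docker compose -p "${USER}" ps
$ redis-cli -h 127.0.0.1 -p 8110 -a develove ping
$ # The expected reply is PONG.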