System Prerequisites

Install the necessary software for your platform.

Create Directories to Store Models

  1. Choose a location to store AI models and output files from ComfyUI. For this example, we will use:
  • Models: ~/models/ComfyUI--models
  • Output: ~/models/ComfyUI--output
  2. Download the Docker image:
docker pull livepeer/comfyui-base:server

Run the container

  1. Run the container using the paths you created above:

If you are using Windows, make sure Docker Desktop is running first.
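A minimal invocation might look like the sketch below. The container-side mount points (`/workspace/ComfyUI/...`) and the published port numbers are assumptions and may differ for your image version; adjust them to match the image's documentation. The host paths are the ones created above.

```shell
# Sketch only: mount the host model/output directories into the container,
# expose the UI ports, and pass the first-run flags to the entrypoint.
docker run --gpus all -it \
  -v ~/models/ComfyUI--models:/workspace/ComfyUI/models \
  -v ~/models/ComfyUI--output:/workspace/ComfyUI/output \
  -p 8188:8188 -p 8889:8889 -p 3000:3000 \
  livepeer/comfyui-base:server --download-models --build-engines --server
```

On subsequent runs you can drop `--download-models` and `--build-engines`, since the models and engines persist in the mounted host directories.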

The --download-models and --build-engines flags download the required models and build TensorRT engines. This process may take some time. You only need these flags on the first run of the container, or when adding new models.

Engine files must be compiled on the same GPU hardware/architecture on which they will be used.

Access ComfyUI and ComfyStream

The --server flag starts ComfyUI, ComfyStream, and the UI automatically. You can access ComfyUI and ComfyStream at: