Getting Started
Install ComfyStream
A quick start guide to running your first AI video workflow with ComfyStream
System Prerequisites
Install the necessary software depending on your platform
Create Directories to Store Models
- Choose a location to store AI models and output files from ComfyUI. For example, we will use:
- Models:
~/models/ComfyUI--models
- Output:
~/models/ComfyUI--output
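The directory setup above can be done in one step from a terminal (on Windows, run this inside WSL or adjust the paths accordingly):

```shell
# Create the model and output directories used throughout this guide
mkdir -p ~/models/ComfyUI--models   # AI model checkpoints
mkdir -p ~/models/ComfyUI--output   # generated output files
```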
- Download the Docker image:
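The exact image name is not shown here; assuming the image is published on Docker Hub under the project's organization (e.g. `livepeer/comfystream`, a hypothetical name for illustration), the pull would look like:

```shell
# Image name is an assumption -- check the project's README for the published image
docker pull livepeer/comfystream:latest
```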
Run the container
- Run the container using the paths you created above:
If you are using Windows, make sure Docker Desktop is running first.
The --download-models and --build-engines flags will download the required models and build TensorRT engines. This process may take some time. You will only need these flags on the first run of the container, or when adding additional models.
Engine files must be compiled on the same GPU hardware/architecture they will be used on.
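Putting the pieces together, a first run might look like the sketch below. The image name, container-side mount points, and port mappings are assumptions for illustration; only the `--download-models`, `--build-engines`, and `--server` flags come from this guide, so check the project's README for the exact invocation:

```shell
# Image name and container paths below are assumptions, not the official invocation
docker run -it --gpus all \
  -p 8188:8188 -p 8888:8888 -p 3000:3000 \
  -v ~/models/ComfyUI--models:/workspace/ComfyUI/models \
  -v ~/models/ComfyUI--output:/workspace/ComfyUI/output \
  livepeer/comfystream:latest \
  --download-models --build-engines --server
```

On later runs you can drop `--download-models --build-engines`, since the models and engine files already exist in the mounted directories.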
Access ComfyUI and ComfyStream
The --server flag will start ComfyUI, ComfyStream, and the UI automatically. You can access each service at:
- ComfyUI: http://localhost:8188
- ComfyStream: http://localhost:8888
- UI: https://localhost:3000
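Once the container is up, you can smoke-test each endpoint from another terminal (a quick check, not part of the official guide; `-k` skips certificate verification in case the UI uses a self-signed certificate):

```shell
curl -sf  http://localhost:8188  > /dev/null && echo "ComfyUI is up"
curl -sf  http://localhost:8888  > /dev/null && echo "ComfyStream is up"
curl -skf https://localhost:3000 > /dev/null && echo "UI is up"
```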