Using containers on a Windows computer is, of course, optional for software that can run natively, but one of the main reasons to use the container version of CPAI on a Windows machine is to avoid the problems that sometimes appear when folks upgrade CPAI. CPAI has many components, and upgrades can occasionally go wrong. With Docker you can test a new version or roll back to an older one without any concerns: all of the CPAI changes live in the container rather than the host OS, so you can run it, test it, break it, and delete it with reckless abandon, then pull and run it again in minutes as if nothing ever happened. To me, that is the key and why I use it. It also allows CPAI to be run on a Linux box, a free OS that can be installed in minutes. This versatility of deployment is something the CPAI folks appear to be striving for, and I'd say they have achieved it. You can even run CPAI on a Raspberry Pi 4 (although it still needs Coral TPU integration to be useful).
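As a rough sketch of that throwaway workflow (the container name "codeproject-ai" and the port mapping are just assumptions for illustration; adjust them to match your own setup):
docker stop codeproject-ai (stop the running container)
docker rm codeproject-ai (delete it; nothing on the host OS changes)
docker pull codeproject/ai-server (grab a fresh copy of the image)
docker run -d --name codeproject-ai -p 32168:32168 codeproject/ai-server (and you are back in business in minutes)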
You can run various versions by appending a version tag to the docker command used to pull the CPAI image, as per these examples.
For CPU versions:
docker pull codeproject/ai-server (no version tag pulls the most recent version pushed to Docker Hub by CPAI, also referred to as "latest")
docker pull codeproject/ai-server:cpu-2.0.7
docker pull codeproject/ai-server:cpu-2.0.6
docker pull codeproject/ai-server:cpu-1.6.8
For the GPU-enabled versions:
docker pull codeproject/ai-server:gpu (no version tag pulls the most recent GPU version pushed to Docker Hub by CPAI)
docker pull codeproject/ai-server:gpu-2.0.7
docker pull codeproject/ai-server:gpu-2.0.6
docker pull codeproject/ai-server:gpu-1.6.8
You can find all of the available releases on the CPAI Docker Hub "Tags" page here. Change the tag at the end of the pull command and you can test any version you want; a sketch of a full run command follows below.
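Once you have pulled a specific tag, you launch it the same way as any other version; only the tag at the end changes. A minimal sketch, assuming a container name, port mapping, and data-folder path that you should adjust to your own setup (mapping a host folder keeps your settings and downloaded modules outside the disposable container):
docker run -d --name codeproject-ai -p 32168:32168 -v c:\ProgramData\CodeProject\AI:/etc/codeproject/ai codeproject/ai-server:cpu-2.0.6
Swap the tag at the end (cpu-2.0.7, gpu-2.0.7, and so on) to launch whichever version you pulled.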