harleyl7
Quote: "Looks like you're still using DeepStack. This integration only works with CPAI."
I'm running CPAI on 192.168.1.2:5000 in a Docker container.
Quote: "I'm running CPAI on 192.168.1.2:5000 in a Docker container."
CPAI runs on port 32168.
Quote: "It is grayed out because you are using DeepStack; it only works with CodeProject.AI."
I'm using CPAI in a Docker container on 192.168.1.2:5000.
Quote: "CPAI runs on port 32168."
It runs on whatever port you set it to. Mine is on 4999, for example.
Quote: "CPAI runs on port 32168."
I can change the port to whatever I want. It's actively running on it.
Quote: "I'm running CPAI on 192.168.1.2:5000 in a Docker container."
If that is the case, send Blue Iris support the issue you are having with CP.AI in a Docker container; he should be able to fix it.
Quote: "If that is the case, send Blue Iris support the issue you are having with CP.AI in a Docker container; he should be able to fix it."
Ahh, okay. I might just install it locally on the Blue Iris server. I was trying to offload some CPU work from my little i5-6500.
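As an aside for anyone following this port discussion: a quick way to confirm the container really is answering on a remapped port is to hit the DeepStack-compatible detection route directly. This is only a sketch; the 192.168.1.2:5000 address is the one mentioned in the posts above, and test.jpg is a placeholder for any local image.
Code:
# Quick reachability test against a CodeProject.AI container on a non-default port.
# 192.168.1.2:5000 is the host:port from this thread; test.jpg is any local image.
import requests

url = "http://192.168.1.2:5000/v1/vision/detection"

with open("test.jpg", "rb") as image_file:
    response = requests.post(
        url,
        files={"image": image_file},
        data={"min_confidence": 0.4},
        timeout=30,
    )

response.raise_for_status()
result = response.json()
print("success:", result.get("success"))
for prediction in result.get("predictions", []):
    print(prediction["label"], round(prediction["confidence"], 2))
If this returns predictions, the host port mapping (for example 5000 on the host to 32168 inside the container) is working, regardless of the default port.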
So: Blue Iris doesn't currently change CodeProject.AI Server's settings, which means it doesn't give you a way to change the custom model folder location from within Blue Iris.
Blue Iris will still use the contents of this folder to determine the calls it makes. If you don't specify a model to use in the Custom Models textbox, then Blue Iris will use all models in the custom models folder that it knows about.
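To make the "use all models it knows about" behaviour concrete, here is a rough Python sketch of the kind of calls Blue Iris ends up making: one request per custom model, against CodeProject.AI's /v1/vision/custom/<model> route. The folder path, model names, and host/port below are placeholders, not anyone's actual setup.
Code:
# Rough sketch of the per-model calls made when no model is named in the
# Custom Models box: one request per file found in the custom models folder.
# Folder path and model names are placeholders.
from pathlib import Path
import requests

CPAI = "http://127.0.0.1:32168"                    # adjust host/port to your server
MODEL_DIR = Path(r"C:\path\to\custom-models")      # placeholder custom model folder

with open("snapshot.jpg", "rb") as f:
    image_bytes = f.read()

for model_file in MODEL_DIR.glob("*.pt"):
    model = model_file.stem                        # e.g. "ipcam-general"
    response = requests.post(
        f"{CPAI}/v1/vision/custom/{model}",
        files={"image": ("snapshot.jpg", image_bytes)},
        data={"min_confidence": 0.4},
        timeout=60,
    )
    response.raise_for_status()
    for p in response.json().get("predictions", []):
        print(model, p["label"], round(p["confidence"], 2))
Naming a model in the Custom Models textbox effectively narrows this loop down to a single request.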
15:06:00: Performing Install on module ALPR
15:06:00: Preparing to install module 'ALPR'
15:06:00: Downloading module 'ALPR'
15:06:00: (using cached download for 'ALPR')
15:06:00: Unable to unpack module 'ALPR'
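The "(using cached download for 'ALPR')" line above suggests the installer is re-using a previously downloaded package that no longer unpacks. As a hedged aside, the sketch below locates any cached ALPR archive so it can be deleted and re-downloaded. The search roots are an assumption on my part (they are simply the two CodeProject.AI directories named later in this thread, and the cache location varies by version), so confirm where your install actually keeps module downloads before deleting anything.
Code:
# Find cached ALPR module packages so they can be removed and re-downloaded.
# NOTE: the search roots below are assumptions; verify where your install keeps
# its module downloads before deleting anything.
from pathlib import Path

search_roots = [
    Path(r"C:\Program Files\CodeProject\AI"),
    Path(r"C:\ProgramData\CodeProject\AI"),
]

for root in search_roots:
    if not root.exists():
        continue
    for cached in root.rglob("*ALPR*.zip"):
        print("Cached package found:", cached)
        # cached.unlink()   # uncomment to delete and force a fresh download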
Quote: "Sorry, the picture did not attach properly. View attachment 151965"
Use the below link to download.
Quote:
@MikeLud1 I finished upgrading CPAI from 1.6.8 to 2.0.6. The upgrade reported success. I tried going to the "Install Modules" tab on the server to install the ALPR module, but when I click on Install the following entries are logged in the log file:
Code:
15:06:00: Performing Install on module ALPR
15:06:00: Preparing to install module 'ALPR'
15:06:00: Downloading module 'ALPR'
15:06:00: (using cached download for 'ALPR')
15:06:00: Unable to unpack module 'ALPR'
Any suggestions on how I can force it to download the ALPR module rather than try to use the cached copy?
Try the below to do a clean install.
Uninstall the CodeProject.AI Server app
Delete the C:\ProgramData\CodeProject\AI directory if it exists
Delete the C:\Program Files\CodeProject\AI directory if it exists
Install CodeProject.AI 2.0.6
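Before running the 2.0.6 installer again, it can be worth confirming that the two leftover directories from the steps above really are gone. A minimal sketch, using exactly the paths listed:
Code:
# Sanity check before reinstalling: confirm the directories from the clean-install
# steps above no longer exist.
from pathlib import Path

for path in (Path(r"C:\ProgramData\CodeProject\AI"), Path(r"C:\Program Files\CodeProject\AI")):
    print(path, "still present" if path.exists() else "removed")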
I was afraid you were going to say that. With my connection speed the install takes ~90 minutes.
Quote: "Yeah, but on a 12 Mb DSL connection it still takes 2.0.6 an hour and a half to download and install. Watching >2GB of PyTorch come down at 1.2 MB/sec is sort of like watching paint dry. Sure wish there was an offline installer available. In 6 months I shouldn't care, as I ought to have 1Gb service. There are 2 competing ISPs claiming they'll have fiber to the home at my location in the next 3-6 months."
I have the same problem: a long copper overhead line. Openreach intended to upgrade it to fibre over 18 months ago for free, but when they hit their 80% target in Congleton they moved on. Now it's 2025. Are the ISPs offering to install fibre for you for a price? If so, what is the price? Starlink would be too expensive for me.