prsmith777
Getting comfortable
Thanks Mike. I tried the fix, but it didn't work, unfortunately. I guess I'll wait till the next release. Is your LPR still only good on CUDA 11.x and not on CUDA 12?
The current version does work with CUDA 12.x.
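(For anyone who wants to confirm what their own install sees: a quick sketch, assuming a PyTorch-backed module and run from the same Python environment CodeProject.AI launches the module with, that prints which CUDA build PyTorch was compiled against and whether the GPU is visible.)

```python
# Sanity-check the CUDA stack as seen from Python.
# Assumption: the module is PyTorch-backed; run this inside the same
# Python environment/venv that CodeProject.AI uses for the module.
import torch

print("PyTorch version:    ", torch.__version__)
print("Built against CUDA: ", torch.version.cuda)       # e.g. "11.8" or "12.1"
print("CUDA available:     ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```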
So I moved my CodeProject from my Ubuntu box to my Orange Pi and noticed I stopped getting MQTT messages... I moved back to the Linux box and I am still not getting MQTT messages. The only thing I changed was which CodeProject server it points to under settings. Let me post my settings...
I have a debug window watching a plate, and it seems to read the plate just fine. I'm guessing I need to remove some of the "REQUIRED" items for it to post the plate; I don't see them in the debug window. Both the Orange Pi and Ubuntu seem to report the same thing...
Weird that changing it back didn't work.
[Attachments (settings screenshots): 179467, 179466, 179464, 179463, 179462]
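(One way to narrow this down is to rule out the broker side first. The sketch below is a bare-bones MQTT listener that prints everything it receives, so you can see whether Blue Iris is publishing at all when a plate is read; the broker address and topic filter are placeholders for whatever you have configured.)

```python
# Minimal MQTT listener for debugging: prints every message the broker relays.
# Placeholder values: BROKER and TOPIC must match your own setup.
# Written against paho-mqtt 1.x; paho-mqtt 2.x additionally requires a
# CallbackAPIVersion argument when constructing the Client.
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"   # assumption: your broker's address
TOPIC = "#"               # wildcard: listen to everything while debugging

def on_connect(client, userdata, flags, rc):
    print("Connected to broker, result code", rc)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode(errors="replace"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, 60)
client.loop_forever()
```

If nothing prints when a plate is read, the alert/action side in Blue Iris is not publishing; if messages do show up, the problem is on whatever is consuming them.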
Try these settings; the settings below will only run the ALPR module.
[Attachment 179470: settings screenshot]
Does the LPR plugin get trained with training data? I can try to put together some night time shots with known plates...
Let me look into this. Can you post some images that did not work?
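(While gathering examples, it can also help to send a problem snapshot straight to the CodeProject.AI server and look at the raw response, so you can tell whether the module itself misses the plate or something downstream drops it. A rough sketch, assuming the default port 32168 and an image/alpr route; the exact route and form-field name can differ between versions, so check the server's API reference page.)

```python
# Post a saved snapshot to the CodeProject.AI server and print the raw
# ALPR response. Assumptions: server on localhost:32168, route v1/image/alpr,
# and form field "image" -- verify all three against your server's API page.
import requests

SERVER = "http://localhost:32168"    # assumption: default install on this machine
IMAGE = "night_plate.jpg"            # hypothetical example snapshot

with open(IMAGE, "rb") as f:
    resp = requests.post(f"{SERVER}/v1/image/alpr", files={"image": f})

print(resp.status_code)
print(resp.json())                   # predictions, confidence values, timings
```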
Sorry, did you want a series of training data?
Not at this time.
What's the best GPU to use with this AI? Or will a powerful CPU be enough?
Depending on your budget, an RTX 3060 12GB is an excellent GPU for AI. If this GPU is out of your price range, let me know what your budget is.
I can get a 4090/A6000 if that's what it takes to have the fastest and most reliable results. But not sure if these high-end GPUs will make a difference, to be honest.
What about an 8GB 4060 Ti? It's supposed to be more power efficient. Is that true? Got one coming from a Cyber Monday deal. What would be better between the two?
The 8GB 4060 Ti will work fine.
@MikeLud1 I'm glad I might not need to spend more money as I have a 3060 lying around. Btw, what about VRAM? Does it make a difference the more you have? Like for having more channels?
The more VRAM you have, the more AI models you can use, and it also helps when training AI models.
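(For a rough picture of how much of the card a given setup actually uses, nvidia-smi is the usual tool; from Python, a PyTorch-based sketch like the one below reports the card's total VRAM and what the current process has allocated.)

```python
# Report total VRAM and this process's current allocation via PyTorch.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    used_gb = torch.cuda.memory_allocated(0) / 1024**3   # only this process
    print(f"{props.name}: {total_gb:.1f} GB total, {used_gb:.2f} GB allocated here")
else:
    print("No CUDA device visible to PyTorch")
```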