Thought I'd see what was new in the latest Coral CPAI module.
I'm trying to reduce the power consumption of my servers, and the old GTX 970 (solid and fast) draws quite a bit running 24/7 just for this.
Last year I had zero success with the TPU/CPAI and BI combo.
Last night I managed to get it working with the medium and large models at fast speeds (though of course not as accurate as the GPU models).
For now I'm running the default MobileNet SSD, which is fast, and I'm still in testing mode. Current module stats:
"inferenceDevice": null,
"inferenceLibrary": "TF-Lite",
"canUseGPU": "false",
"successfulInferences": 472,
"failedInferences": 101,
"numInferences": 573,
"averageInferenceMs": 8.201271186440678
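For what it's worth, those counters can be cross-checked with a couple of lines of Python. The field names and values are copied from the stats above; nothing here is CPAI-specific, it's just arithmetic on the reported numbers:

```python
# Sanity-check the module's inference counters (values copied from the stats above).
stats = {
    "successfulInferences": 472,
    "failedInferences": 101,
    "numInferences": 573,
    "averageInferenceMs": 8.201271186440678,
}

total = stats["successfulInferences"] + stats["failedInferences"]
assert total == stats["numInferences"]  # counters are consistent: 472 + 101 = 573

failure_rate = stats["failedInferences"] / stats["numInferences"]
print(f"failure rate: {failure_rate:.1%}")                      # roughly 17.6% failed
print(f"avg latency:  {stats['averageInferenceMs']:.1f} ms")    # ~8.2 ms per inference
```

So about one in six inferences is failing, but the ones that succeed are very quick compared to typical GPU model times.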
No custom models are built in. I've seen a few around GitHub but haven't tried any yet.