Alright, looks like the MobileNetSSD model can handle 90 objects. According to coral.ai, it's trained on the COCO dataset:
https://tech.amikelive.com/node-718/what-object-categories-labels-are-in-coco-dataset/
https://cocodataset.org/#home
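As a side note, here's a small sketch of how a Coral-style label file (one `<id> <name>` per line, like coral.ai's coco_labels.txt) could be parsed. The IDs can be sparse, which the first link above gets into: that's why counts like 80, 90, and 91 all show up when people talk about "the COCO classes". The file format and sample labels here are just illustrative.

```python
def parse_labels(text):
    """Parse 'id  name' lines into a dict; ids may be sparse (gaps allowed)."""
    labels = {}
    for line in text.strip().splitlines():
        idx, name = line.split(maxsplit=1)  # split on first whitespace run
        labels[int(idx)] = name.strip()
    return labels

# Tiny illustrative sample, not the full label file:
sample = """0  person
1  bicycle
2  car
3  motorcycle"""
print(parse_labels(sample))
```

The point of keeping it a dict keyed by ID (rather than a list) is that a detection result's class index still maps to the right name even when some IDs are unused.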
One interesting difference is the custom model field. With the USB Coral that field in BI was blank, but with the mini PCIe Coral it shows MobileNetSSD as a custom model.
Just wanted to share that I switched to the mini PCIe Coral yesterday, and my average inference speed dropped from ~200ms on the USB Coral (medium model size) to ~40ms. Pretty impressive. I'm on the latest CodeProject.AI and BI.
For comparison, my GPU inference speed is ~150ms.
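For anyone who wants to reproduce these averages outside of BI, here's a minimal timing harness sketch. It times any zero-argument callable; you'd swap the stand-in workload for your real detector call (e.g. a pycoral interpreter invoke). The warmup count, run count, and the `time.sleep` stand-in are all just assumptions for illustration.

```python
import statistics
import time

def average_latency_ms(run_inference, n_runs=50, warmup=5):
    """Return the mean wall-clock latency of run_inference() in milliseconds."""
    for _ in range(warmup):          # discard cold-start / first-load runs
        run_inference()
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()
        run_inference()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples)

# Stand-in workload: ~5 ms of sleep instead of a real model invoke.
print(f"{average_latency_ms(lambda: time.sleep(0.005)):.1f} ms")
```

Averaging over many runs (after a few warmup calls) matters here, since single-shot timings on the Coral can be skewed by the first inference loading the model onto the TPU.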