8-9 ms inference speeds with OpenVINO on a 7th-gen Intel CPU!!

Pentagano
Dec 11, 2020
Well, after using Coral TPU devices (USB and M.2) with Frigate, I came across the OpenVINO detector in Frigate.
Decided to give it a try. Very simple setup, just a few lines in the config.
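For anyone curious, those "few lines" look roughly like the sketch below. This follows Frigate's documented OpenVINO detector example; the model paths are the ones bundled in the Frigate container image, so verify them against the docs for your Frigate version:

```yaml
# Sketch of a Frigate OpenVINO detector config.
# Model paths are the defaults shipped in the Frigate container;
# check the docs for your version before copying.
detectors:
  ov:
    type: openvino
    device: CPU          # or GPU / AUTO on supported Intel hardware

model:
  width: 300
  height: 300
  input_tensor: nhwc
  input_pixel_format: bgr
  path: /openvino-model/ssdlite_mobilenet_v2.xml
  labelmap_path: /openvino-model/coco_91cl_bkgr.txt
```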

Fired it up on my Dell with a 7th-gen Intel CPU, no GPU or TPU.

Well I'll be dipped!

Faster than the Coral USB, and it hardly touches the CPU. Not sure how they have done that.
Detects people perfectly. If you don't need it for small animals etc., it is perfect.

Even tried it on an old i5-3470T Lenovo mini. 45 ms, not too shabby at all.
 
Those are great numbers. I wonder if there's a way to incorporate it into BI.
 
Suggest it to the CodeProject developers. At the very least they can take a look at how it's functioning.
 
Version 14.1

Testing on an old i5-3470T (which I thought wasn't supported by OpenVINO on CPU).
This old underpowered CPU still manages 18 ms inference speeds!! On an Ubuntu distro.
My 6th-gen i7-6700: 4 ms!! OpenVINO may not be as accurate, but if you don't have a TPU then this is a good option.


Mind sharing the config to make it simpler for your forum pals @Pentagano?
With the collapse of CPAI it looks about time to give Frigate another run. I understand the newer version gave it a much-needed refresh.
Thanks!

It hasn't collapsed yet, it simply moved where the files are located. It has had at least two updates since moving off the website to GitHub.
 
It hasn't collapsed yet, it simply moved where the files are located. It has had at least two updates since moving off the website to GitHub.
True, and I am running a Docker container of one of the newer builds from GitHub. But it seems the writing is on the wall. The crux of my comment, though, was to see if he would share the related config.
 
Mind sharing the config to make it simpler for your forum pals @Pentagano?
With the collapse of CPAI it looks about time to give Frigate another run. I understand the newer version gave it a much-needed refresh.
Thanks!
CPAI is still here and works very well; it doesn't rely on any cloud service. You can still use DeepStack if you want.

Let me see if I can get it to you later. Very easy and well documented.