Playback painfully jerky, slow and stuttering.

Flintstone61

Known around here
Joined
Feb 4, 2020
Messages
6,687
Reaction score
11,077
Location
Minnesota USA
1920x1080 on a 40" Sony Bravia, and also 1080p on the Dell monitor at the other BI installation.
No remote desktop stuff.
4K monitors will obviously take more horsepower to run, I'd imagine.
And I don't know if remote desktops transmit and receive at the selected monitor resolution or what?
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
And I don't know if remote desktops transmit and receive at the selected monitor resolution or what?
Evidently, they do. I didn't really think it would cause that much processing overhead on the server end, but apparently it does.

Running 1280x1024 now and my CPU usage is only in single digits with the console open.

Could this really have been my problem all along @wittaj?
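
For a rough sense of why the drop is so big, here's the pixel math (a back-of-envelope sketch; the remote-desktop encoder's load won't scale perfectly linearly with pixel count):

```python
# Back-of-envelope pixel math for why the RDP session resolution matters.
# Treat this as a rough sketch, not a benchmark.

def pixels(width: int, height: int) -> int:
    return width * height

uhd = pixels(3840, 2160)    # 4K desktop: 8,294,400 px per frame
sxga = pixels(1280, 1024)   # 1280x1024 desktop: 1,310,720 px per frame

print(f"4K frame:        {uhd:>10,} px")
print(f"1280x1024 frame: {sxga:>10,} px")
print(f"Ratio: {uhd / sxga:.1f}x more pixels to render and encode per frame")
```

That works out to roughly 6.3x more pixels per frame at 4K than at 1280x1024, which lines up with the CPU usage falling into single digits.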
 

TVille

Getting comfortable
Joined
Apr 26, 2014
Messages
672
Reaction score
1,639
Location
Virginia
So you're saying you replaced an aging system with a new system. Ok, not sure why that's a surprising conclusion. Am I missing something? How many cameras are you running on it? And what's the performance like at idle with the console window open - and when playing back a main stream recording?
I replaced it because it kept freezing. Yes, it was old, but my other install is a 9-year-old AMD machine, bought used, and it works fine. I have no issue with older equipment that is reliable.


I don't think this is chasing ghosts. I'm very concerned that I've got a late-gen system that can barely keep up with 7 cameras, because I want to add a few more. Not to mention that it's behaving extremely unpredictably.
Yesterday I was seeing barely any performance hit with RDP; now look at the difference in performance between RDP and Parsec.

View attachment 117658 View attachment 117661
Parsec - console window open vs. Parsec - console window closed.

View attachment 117659 View attachment 117660
RDP - console window open vs. RDP - console window closed.
I am running a Dell Precision 3620 Tower, i7-6700, 16 GB RAM, SSD boot, and a pair of WD Purples for video storage, on Win10 Pro. When I got the machine I did a clean install of Win 10 Pro from the media creation tool, as recommended on here: pulled the old 1 TB HD, installed the SSD, ran the tool, and voilà - clean Win 10 Pro.

I have 16 cameras active: 11 @ 4 MP and 5 @ 2 MP. Most are running 15 fps, all with substreams. I have Rekor Scout running on one camera and DeepStack running on probably six cameras or so. I also have Backblaze doing off-site backups (though not of my BI video), as well as Cumulus MX, a weather-uploading program. I have ZeroTier running for VPN access when I'm remote.

I find viewing through BI is a little jerky compared to exported playback - it's like the frame rate jumps a little. For me it is not objectionable at all; I don't miss anything, it's just not movie quality. The machine is headless, running Win RDP. My viewing machine is the i7-3770 sitting next to it. They both connect to a gigabit Netgear switch.

Here are two screenshots of the Task Manager performance tab. It is still dark outside, so the scenes are a little simpler than in daylight. The first is just the BI console open with all cameras on screen. The second is with a daylight clip running.
 

TVille

Getting comfortable
Joined
Apr 26, 2014
Messages
672
Reaction score
1,639
Location
Virginia
Evidently, they do. I didn't really think it would cause that much processing overhead on the server end, but apparently it does.

Running 1280x1024 now and my CPU usage is only in single digits with the console open.

Could this really have been my problem all along @wittaj?
Possibly. I'm running 1920x1200 on my Linux viewing machine with dual monitors. I don't know what the capabilities of the Intel video are. It may struggle with higher resolutions.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
Hooking a 4K monitor to mine results in lower CPU compared to when I RDP into it on my older laptop. YMMV.
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
Hooking a 4K monitor to mine results in lower CPU compared to when I RDP into it on my older laptop

It's not the resolution of the monitor; it's the resolution you use when you remote into it. The higher the resolution of the remote window, the more CPU and GPU resources it takes from the BI machine.

In other words, if you change the desktop resolution of the BI machine to 4K and then RDP into it, the usage will be much higher than if you set the resolution to 1080p and then remote into it.

Can you check and compare please?
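
If it helps with checking, here's a minimal sketch (Windows only, using the Win32 GetSystemMetrics call via Python's standard ctypes) that prints the desktop resolution of whatever session you run it in - inside an RDP session it reports the session's resolution rather than the physical monitor's:

```python
# Print the desktop resolution of the current Windows session.
# Run it inside an RDP session and it reports the session's resolution,
# not the physical monitor's. Windows only.
import ctypes

SM_CXSCREEN, SM_CYSCREEN = 0, 1  # Win32 system-metric indices

user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)
print(f"Current desktop resolution: {width}x{height}")
```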
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
My BI machine is set to 4K resolution. I have never noticed big swings in CPU remoting in from different devices at different resolutions, although I do not remote into it with a 4K device either. But maybe remoting in with a 4K device is the reason for the increase?
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
So you control your BI machine from the machine itself with a monitor connected?

It's not about different resolutions between the client and host. It's all about the resolution the host is running while you're controlling it remotely. The higher the desktop resolution of the host, the more resources it chews up on the host.

I feel like this should be in the tweaking wiki or something, since it's easy to miss and has a devastating impact on performance if missed.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
No, I run it headless and RDP into it. I lugged a 4K monitor to it just for you, to see what happened and to see what resolution the computer was running at. With RDP, those display settings are blocked off and can't be changed, so maybe it auto-changes to the resolution of the RDP monitor?
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
My point is - when you're running the machine from a 4K monitor sitting in front of it, with a desktop resolution of 4K, it's fine.
But as soon as that monitor is disconnected, the display adapter 'falls back' to a lower resolution. This is why, when you RDP into it, there's no performance hit: you're seeing it at that lower resolution.

In my case, I had a 4K monitor plugged into it, which meant I was RDP'ing into it at the full 4K resolution.
So it had to render each 4K frame (60 per second) to send to me, regardless of what 'scale' option I set.

You can see this for yourself if you have your monitor connected to it and turned on - at the full 4K desktop resolution - and then try to RDP into it while the console is open.

It became immediately obvious: I happened to already be logged in via RDP when I disconnected that HDMI cable, and I saw the RDP feed change resolution and the CPU/GPU usage plummet.

To test this, I reconnected the HDMI cable and connected via Parsec this time, because that app lets you actually change the rendering resolution of the server. I selected a lower desktop resolution for it, and voilà - single-digit usage.
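
To put rough numbers on what the host has to push per second before the remote-desktop codec even compresses anything (a sketch assuming 32-bit colour at 60 fps; actual encoder cost depends heavily on scene content):

```python
# Raw (uncompressed) pixel throughput the host must render per second,
# assuming 32-bit colour at 60 fps. A rough sketch of the workload the
# remote-desktop encoder starts from; real cost varies with scene content.

BYTES_PER_PIXEL = 4   # 32-bit BGRA
FPS = 60

def raw_rate_gb(width: int, height: int) -> float:
    return width * height * BYTES_PER_PIXEL * FPS / 1e9

for name, w, h in [("4K", 3840, 2160), ("1080p", 1920, 1080)]:
    print(f"{name}: {raw_rate_gb(w, h):.2f} GB/s of raw frames")
```

That's about 2 GB/s of raw frames at 4K60 versus about 0.5 GB/s at 1080p60 - a 4x difference in what the host has to render and feed the encoder.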
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
I will try that later. What is your CPU with the 4K monitor and no remoting in? It should be single digits.
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
What is your CPU with the 4K monitor and no remoting in? It should be single digits.
With the console open, it's 15% (and no motion/DeepStack being triggered).

With the console closed, it's 6%.

Out of curiosity, are you running H264 or H265 cameras?
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
That is still high for that machine and # of cameras.

I run H264.
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
By the way, I hope this is allowed. I opened another thread on this, but I'll post it here since this is getting more participation.

Part of my ongoing saga to get this setup 'just right' has me breaking stuff all over the place.

Now, for some reason, I'm not able to just click any camera when viewing 'All cameras' within UI3 and get a 'solo' view of it. Double-clicking doesn't do anything either.
This is regardless of which browser, machine, device, user, etc.

I used to be able to single-click on any camera in the main view and get a solo live view, and if I double-clicked on that, I could go full-screen.

The funny thing is, this behaviour works as normal when I select any other group. Why doesn't it work from the 'All cameras' view?

How can I restore this behaviour?
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
As long as it is performing, that is what is important - although an issue now could creep up and bite you at some point.

Are you on the latest 5.5.5.0? If so, roll back and see if UI3 is fixed.
 

Corvus85

Getting the hang of it
Joined
Aug 18, 2021
Messages
498
Reaction score
79
Location
Australia
OK, I figured it out. I rolled back to 5.5.4.5, but now the layout that I see in UI3 is completely different to the layout that I painstakingly arranged (and which is still visible) in the BI console.

How do I get it to display the same layout again?

Edit: OK, weird - it turns out it's only displaying incorrectly on one device. Is there a setting within UI3 that might affect this?
 

Freubel

Getting the hang of it
Joined
Apr 20, 2018
Messages
79
Reaction score
40
Location
Volendam
My point is - when you're running the machine from a 4K monitor sitting in front of it, with a desktop resolution of 4K, it's fine.
But as soon as that monitor is disconnected, the display adapter 'falls back' to a lower resolution. This is why, when you RDP into it, there's no performance hit: you're seeing it at that lower resolution.

In my case, I had a 4K monitor plugged into it, which meant I was RDP'ing into it at the full 4K resolution.
So it had to render each 4K frame (60 per second) to send to me, regardless of what 'scale' option I set.

You can see this for yourself if you have your monitor connected to it and turned on - at the full 4K desktop resolution - and then try to RDP into it while the console is open.

It became immediately obvious: I happened to already be logged in via RDP when I disconnected that HDMI cable, and I saw the RDP feed change resolution and the CPU/GPU usage plummet.

To test this, I reconnected the HDMI cable and connected via Parsec this time, because that app lets you actually change the rendering resolution of the server. I selected a lower desktop resolution for it, and voilà - single-digit usage.
This is what I was trying to say.
 

wittaj

IPCT Contributor
Joined
Apr 28, 2019
Messages
25,376
Reaction score
49,425
Location
USA
My point is - when you're running the machine from a 4K monitor sitting in front of it, with a desktop resolution of 4K, it's fine.
But as soon as that monitor is disconnected, the display adapter 'falls back' to a lower resolution. This is why, when you RDP into it, there's no performance hit: you're seeing it at that lower resolution.

In my case, I had a 4K monitor plugged into it, which meant I was RDP'ing into it at the full 4K resolution.
So it had to render each 4K frame (60 per second) to send to me, regardless of what 'scale' option I set.

You can see this for yourself if you have your monitor connected to it and turned on - at the full 4K desktop resolution - and then try to RDP into it while the console is open.

It became immediately obvious: I happened to already be logged in via RDP when I disconnected that HDMI cable, and I saw the RDP feed change resolution and the CPU/GPU usage plummet.

To test this, I reconnected the HDMI cable and connected via Parsec this time, because that app lets you actually change the rendering resolution of the server. I selected a lower desktop resolution for it, and voilà - single-digit usage.
OK, so I hooked a 4K monitor to it, had the BI console open on the monitor, and RDP'd into it, and the CPU didn't increase. However, it doesn't let me see Blue Iris on the 4K monitor and RDP at the same time - are you able to do so? Maybe that is the issue?

I then closed BI on the 4K monitor and RDP'd again, and the CPU was the same.

I then turned the monitor off but left it connected, RDP'd again, and the CPU was still the same.
 