Any FFMPEG gurus in here?

kmax1940

Young grasshopper
Sep 27, 2016
Is anyone in here good at FFMPEG?

I have a command that takes RTSP and streams it to HLS but...
The CPU performance is horrible.
I wonder if that is just the way it is...
or if it is possible to tune it and make it perform well.

Trying to stream multiple cams via HLS on a VPS...
 
Thanks for the replies! Here is the command I am running.
This is running on a small VPS.
Latest versions of Ubuntu, Apache, & FFMPEG.

This command takes up 60-70% of the CPU!
This is just streaming one camera from RTSP to HLS.
Any tips?


ffmpeg -i "rtsp:/testuser:testpass@555.555.555.555:554/unicast/c4/s1/live" -y -s 854x480 -c:v libx264 -b:v 10000 -an -hls_time 2 -hls_list_size 5 -hls_delete_threshold 1 -hls_flags "delete_segments+program_date_time+temp_file" mystream.m3u8

The camera is set to:
Bitrate = 1024
Resolution = 704 x (can't remember the second number)

I know the bitrate in the command seems high.
When I set it to 1000 in that command... the picture is just a huge blob of blocks.

The command above works great.
The only problem is that it takes up 60-70% of the CPU.
My goal is to be able to run 16 streams like this on a VPS.

Thanks
 
-c:v libx264 is the CPU-intensive part. That is telling ffmpeg to re-encode all the video frames as H.264. Change it to -c:v copy, and delete the -s 854x480 and the -b:v 10000 since those are no longer applicable once you're copying the video stream as-is.
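With your original arguments that would look something like this (untested, it just swaps the codec and drops the scaling and bitrate options):

ffmpeg -i "rtsp://testuser:testpass@555.555.555.555:554/unicast/c4/s1/live" -y -c:v copy -an -hls_time 2 -hls_list_size 5 -hls_delete_threshold 1 -hls_flags "delete_segments+program_date_time+temp_file" mystream.m3u8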

Your -hls_time 2 argument is probably telling it to produce video segments that are 2 seconds long, so ideally your video source should provide an i-frame at least every 2 seconds. It is probably safest to have the i-frame interval exactly match the HLS segment length, which for a typical IP camera source means setting the i-frame interval to exactly double the frame rate so that there is one i-frame every 2 seconds. You could also try an i-frame interval equal to the frame rate; that might work better. Larger values of -hls_time are another option to experiment with, but they increase the delay of the video stream, so I would treat that as a last resort.
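To make the arithmetic concrete: at 25 fps, an i-frame interval of 50 gives one i-frame every 50 / 25 = 2 seconds, which lines up exactly with -hls_time 2, while an interval of 25 would give one i-frame every second.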

I know the bitrate in the command seems high.
When I set it to 1000 in that command... the picture is just a huge blob of blocks.

This is probably down to libx264's default preset and rate-control settings. Suffice it to say, there is more than just the bit rate that determines video quality. In any case it won't be a concern once you stop re-encoding the video frames.
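Side note, in case you ever do need to re-encode again: ffmpeg reads a bare -b:v value as bits per second, so -b:v 1000 only asks for 1 kb/s. The more usual way to spell out a sensible target is with a k suffix and an explicit preset, roughly like this (just a sketch, not tested, using the same placeholder URL):

ffmpeg -i "rtsp://testuser:testpass@555.555.555.555:554/unicast/c4/s1/live" -c:v libx264 -preset veryfast -b:v 1000k -an -hls_time 2 -hls_list_size 5 -hls_flags delete_segments mystream.m3u8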
 
I expect that without re-encoding you will be able to handle 16 streams on the VPS, no problem. At that point the hassle will be dealing with stream interruptions, because at that kind of scale they are likely to happen frequently, and you won't want to fix things manually every time a stream breaks.
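One low-tech way to handle that (just a sketch; the URL and output path are placeholders, adjust for your setup) is to wrap each ffmpeg run in a shell loop so it reconnects automatically whenever the RTSP session drops:

#!/bin/sh
# Restart ffmpeg whenever it exits (e.g. the RTSP connection drops).
while true; do
    ffmpeg -rtsp_transport tcp \
        -i "rtsp://testuser:testpass@555.555.555.555:554/unicast/c4/s1/live" \
        -c:v copy -an \
        -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
        /var/www/html/cam1/mystream.m3u8
    sleep 2   # short pause before reconnecting
done

A systemd unit with Restart=always per camera would do the same thing more cleanly.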
 
Thank you all so much!
I am going to study these posts and make some changes.

Just FYI for anyone who happens on to this thread in the future...
I am attaching an image of the camera settings.

[Attached: substream-settings.PNG]
 
Yup, so definitely set the i-frame interval to either 15 or 30. The current setting of 50 will not be optimal for HLS streaming with a 2-second segment size because there is only one i-frame every 3.33 seconds.
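Quick arithmetic, assuming the attached settings show 15 fps: 50 / 15 ≈ 3.33 seconds between i-frames, whereas 30 / 15 = 2 seconds lines up exactly with -hls_time 2, and 15 / 15 = 1 second gives two i-frames per segment.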
 
Wow! Ok, thank you guys so much!
I think you may have helped us on the right path to resolving an issue that has been a pain for a long time!

We have a platform that restreams RTSP to HLS...

The only problem is Uniview cameras won't work on iOS.
It just flashes the first picture then goes away.
Even devs we hired have struggled with it.

Here is what I just learned from the commands you guys showed me.

I had actually gotten it to work myself on iOS using the command I pasted in the thread earlier.
But of course the CPU performance was too bad.

SOOO...

When I changed my command from -c:v libx264 to -c:v copy,
it no longer works on iOS.

So now at least I have a lead as to why it is not working on iOS.

Now, to figure out how to get these Uniview cameras to work on iOS using -c:v copy instead of -c:v libx264 :)
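First step will probably be checking exactly what the camera is sending. Something like this ffprobe call (a sketch, same placeholder URL as before) should show the codec, profile, and level of the RTSP video stream:

ffprobe -rtsp_transport tcp -select_streams v -show_streams "rtsp://testuser:testpass@555.555.555.555:554/unicast/c4/s1/live"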

Thanks again for your time.

It has been a huge help.
 
Yep, I have confirmed it a few times.

If I use
"-c:v libx264" it works on iOS.

If I use
"-c:v copy" it does not play on iOS. It just flashes the first picture or just plays a black screen.

Any ideas?
 
I expect that without re-encoding you will be able to handle 16 streams on the VPS, no problem. At that point the hassle will be dealing with stream interruptions, because at that kind of scale they are likely to happen frequently, and you won't want to fix things manually every time a stream breaks.

Thank you so much!
You are right, the encoding was the problem!
 
Thank you! I really appreciate your time!
Thanks for writing all that out.
When I replaced libx264 with copy it made the CPU performance amazing!

I think it also helped get me on the right path toward solving an issue we have with Uniview RTSP streams not playing on iOS.

Thanks again.
 