[FFmpeg-user] Average N frames of RTSP video for better snapshot quality?
Steven Kan
steven at kan.org
Tue Mar 8 21:36:27 EET 2022
> On Mar 8, 2022, at 11:02 AM, Steven Kan <steven at kan.org> wrote:
>
>
>> On Mar 8, 2022, at 10:32 AM, Michael Koch <astroelectronic at t-online.de> wrote:
>>
>> Am 08.03.2022 um 19:09 schrieb Steven Kan:
>>> After 7.5 years of waiting, my banana plant is finally flowering! I want to do a time-lapse capture of the flowering and fruiting process. Due to its location, the easiest way for me to get a camera out there is to use a little WyzeCam v3 with the RTSP firmware and the Wyze lamp socket. Unfortunately the WyzeCam doesn’t (yet) have an externally accessible JPG snapshot feature, so I have a cron job set up to run:
>>>
>>> ./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@$IPAddress/live -frames:v 1 $outfile
>>>
>>> every hour. The results are OK, but not fantastic:
>>>
>>> https://www.kan.org/pictures/BananaTimeLapseFirstImage.jpg
>>>
>>> Is there a way to tell ffmpeg to collect N frames of video and output one single averaged image to improve the SNR? Even if there’s some wind, the flower stalk shouldn’t be moving much.
>>>
>>> I tried:
>>>
>>> ./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@192.168.1.39/live -frames:v 10 ~/BananaLapse/MultiFrame%03d.jpg
>>>
>>> and that results in N JPGs. I suppose I could have a second ffmpeg command that averages those 10 JPGs, but can this all be done in one pass? Thanks!
>>
>> You can use the "tmix" filter before you extract the images from the video.
>>
>> Michael
>
> Thanks! Can I get a little help on the syntax? Right now it’s still expecting to output multiple images:
>
> ./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@192.168.1.39/live -frames:v 10 -vf tmix=frames=10:weights="1" ~/BananaLapse/MultiFrame.jpg
Ah, I think I figured it out. This works:
./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@192.168.1.39/live -vf tmix=frames=10:weights="1" -frames:v 1 ~/BananaLapse/MultiFrame.jpg
I now have the -vf first, so tmix averages 10 frames into 1, and then -frames:v 1 writes just that single frame, correct? The output appears to be what I expect, with various values of N:
https://www.kan.org/pictures/MultiFrame1.jpg
https://www.kan.org/pictures/MultiFrame10.jpg
https://www.kan.org/pictures/MultiFrame128.jpg
And after all that I’m not sure it improves the image that much. I’ll check again at night, when the SNR will get worse.
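For the night test I may try a variant with more frames and the JPEG quality pinned up. This is an untested sketch: 64 is an arbitrary frame count, and -q:v 2 just asks the MJPEG encoder for near-best quality so compression doesn’t swamp whatever noise reduction tmix provides:

./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@192.168.1.39/live -vf tmix=frames=64 -frames:v 1 -q:v 2 ~/BananaLapse/NightFrame.jpg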
Thanks for the help!
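P.S. For reference, the two-pass approach I mentioned earlier also looks doable with a single second command, assuming the ten snapshots are still named MultiFrame001.jpg through MultiFrame010.jpg (a sketch, not something I’ve run yet):

./ffmpeg -i ~/BananaLapse/MultiFrame%03d.jpg -vf tmix=frames=10 -frames:v 1 ~/BananaLapse/Averaged.jpg

tmix’s default weights are all 1, so that should be a plain average of the ten stills.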