[FFmpeg-user] Average N frames of RTSP video for better snapshot quality?
astroelectronic at t-online.de
Tue Mar 8 20:32:21 EET 2022
Am 08.03.2022 um 19:09 schrieb Steven Kan:
> After 7.5 years of waiting, my banana plant is finally flowering! I want to do a time-lapse capture of the flowering and fruiting process. Due to its location, the easiest way for me to get a camera out there is to use a little WyzeCam v3 with the RTSP firmware and the Wyze lamp socket. Unfortunately the WyzeCam doesn’t (yet) have an externally accessible JPG snapshot feature, so I have a cron job set up to run:
> ./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:password@$IPAddress/live -frames:v 1 $outfile
> every hour. The results are OK, but not fantastic:
> https://www.kan.org/pictures/BananaTimeLapseFirstImage.jpg
> Is there a way to tell ffmpeg to collect N frames of video and output one single averaged image to improve the SNR? Even if there’s some wind, the flower stalk shouldn’t be moving much.
> I tried:
> ./ffmpeg -rtsp_transport tcp -i rtsp://anonymous:firstname.lastname@example.org/live -frames:v 10 ~/BananaLapse/MultiFrame%03d.jpg
> and that results in N separate JPGs. I suppose I could have a second ffmpeg command that averages those 10 JPGs, but can this all be done in one pass? Thanks!
You can use the "tmix" filter, which mixes N successive video frames into one, before you extract the image from the video.
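A minimal sketch of what that could look like, building on the command from the original post (the `$IPAddress` and `$outfile` variables and the credentials are placeholders from that post, not tested against this camera):

```shell
# Average 10 successive frames into a single snapshot with the tmix filter.
# With the default weights, tmix gives every frame equal weight, so the
# result is a plain average, which should reduce sensor noise roughly
# in proportion to the number of frames mixed.
./ffmpeg -rtsp_transport tcp \
    -i rtsp://anonymous:password@$IPAddress/live \
    -vf "tmix=frames=10" \
    -frames:v 1 "$outfile"
```

Note that tmix operates as a sliding window over the decoded frames, so with `-frames:v 1` the single emitted image should be the mix of the first frames the filter has accumulated; if the subject moves during those 10 frames, the averaged image will show motion blur.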