[FFmpeg-devel] [CLOSED] movie Filter reload Option

TalkVideo at TalkVideo.net
Wed May 15 20:38:48 EEST 2019


This is now marked as tested, per the details below.


On Wed, May 15, 2019 at 01:35:24PM -0400, TalkVideo at TalkVideo.net wrote:
> The ultimate goal behind the question in this email thread was to enable
> spontaneous, random injection of media into an existing RTMP stream. That has
> been accomplished. The feature is generally referred to as "Media Play" on
> YouTube and other sites, and it allows the audience to interact with the stream
> in real time and modify its content.
> 
> It is being tested live while this message is being posted, here:
> 
> https://www.youtube.com/watch?v=xJgyle00A8k
> 
> A previous Live Stream demonstrating this is here:
> 
> https://www.youtube.com/watch?v=wOULTQ5dzls 
> 
> YouTube may have fixed audio sync issues after ingestion.
> 
> 
> It has been tested as follows:
> 
> 
> 1) Script 1 -- Updates the files used by the "drawtext" and PNG "overlay" filters:
> 
> # Each file is written to a temp name and renamed, so the "drawtext" (reload=1) and overlay image inputs never see a partially written file.
> while :; do
>   date +%s.%N > live.tmp; mv live.tmp live.txt
>   cp 1px-Alpha.png overlay.tmp; mv overlay.tmp overlay.png
>   sleep 1
>   cp large-red-circle-emoji.png overlay.tmp; mv overlay.tmp overlay.png
>   date +%s.%N > live.tmp; mv live.tmp live.txt
>   free
>   sleep 1
> done
> 
> 
> 2) Script 2 -- On a looping basis, streams video from whatever source into the main FFMPEG process
> as $INPUT_2. This stream, consumed by the "overlay" filter, is generated from PNG, FLV, or possibly RTMP inputs.
>  
> while :; do
> vid=`find ./videos -maxdepth 1 -type f -size +100M -print0 | xargs -0 ls | sort -R | tail -n1 `;  echo $vid;  
> # Get a random video from a directory, and print its name.
> 
> /usr/local/bin/ffmpeg -re -y  -i $vid -r 30 -c:v libx264 -x264-params "nal-hrd=cbr" -qmin 1 -qmax 15 -b:v 8M -maxrate 40M -bufsize 20M -g 15  -preset ultrafast -s 640x360 -t 10 -f flv -an -r 30   "rtmp://localhost:1935/video_overlay/key"; usleep 33333;
> # Stream that video into INPUT_2 of the main FFMPEG process.
> 
> ffmpeg -y -re  -f image2 -loop 1 -r 30 -i 640x360-Alpha.png  -c:v libx264 -x264-params "nal-hrd=cbr" -qmin 1 -qmax 15 -b:v 8M -maxrate 40M -bufsize 20M -g 15  -preset ultrafast  -s 64x36 -f flv -t 3 "rtmp://localhost:1935/video_overlay/key"; usleep 33333; 
> # Generates a stream from a PNG image into the main stream. FLV does not support alpha,
> # so the image appears as a black 64x36 area in the upper right corner. It could possibly
> # be reduced to a single pixel, to make it "disappear".
> 
> ffmpeg -y -re  -f image2 -loop 1 -r 30 -i 640x360-Alpha.png  -c:v libx264 -x264-params "nal-hrd=cbr" -qmin 1 -qmax 15 -b:v 8M -maxrate 40M -bufsize 20M -g 15  -preset ultrafast  -s 640x360 -f flv -t 3 "rtmp://localhost:1935/video_overlay/key"; usleep 33333; 
> # Same as the line above, but now the area is 640x360. This creates the effect of a window
> # appearing in the upper right corner, from the smaller black area created by the line above.
> 
> ffmpeg -y -re -f flv  -r 30 -i "rtmp://localhost:1935/inbound_rtmp/key?listen&listen_timeout=15000"  -s 640x360 -r 30  -c:v libx264 -x264-params "nal-hrd=cbr" -qmin 1 -qmax 15 -b:v 8M -maxrate 40M -bufsize 20M -g 15  -preset ultrafast  -f flv  -filter_complex "crop=640:360:in_w-out_w:in_h-in_h" -t 7   "rtmp://localhost:1935/video_overlay/key" ;  usleep 33333;
> # Crops an area of the stream that is INPUT_1 to the main FFMPEG process, and sends it
> # back into that process as an overlay. Can be used when no other overlay is desired,
> # but it must maintain frame sync with the main process.
> 
> done
> 
> 3) Script 3 -- The main FFMPEG process. This is the command that would be called by an
> "exec_push" directive to the nginx-rtmp module (a minimal config sketch follows the command
> below). It can be run directly at the command line, or by some kind of calling script, and
> will receive input via RTMP and output it to YouTube or some other endpoint.
> 
> /usr/local/bin/ffmpeg -y -re -r 30  $INPUT_1 -re -r 30 -an $INPUT_2 -f image2 -loop 1 -r 30 -i /home/nginx/overlay.png  -c:v libx264 -x264-params "nal-hrd=cbr" -filter_complex 'overlay=main_w-overlay_w:main_h-main_h:repeatlast=0:eof_action=pass, overlay=100:main_h-overlay_h-600, drawtext=fontfile=/home/nginx/fonts/Live.ttf:textfile=/home/nginx/live.txt:reload=1:x=main_w/33:y=main_h/24:fontsize=h/20:fontcolor=white:borderw=3' -qmin 1 -qmax 15  -c:a aac -b:a 128k -b:v 8M -maxrate 40M -bufsize 20M -g 15  -preset ultrafast -s 1920x1080  -r 30 $OUTPUT
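> 
> For reference, the "exec_push" hookup mentioned above might look roughly like the sketch below
> in nginx.conf. This is an assumption, not part of the tested setup: the application names,
> script path, and arguments are placeholders, and when nginx itself owns port 1935, $INPUT_1
> would pull from nginx (e.g. rtmp://localhost/inbound_rtmp/$name) rather than use the "?listen" form.
> 
> rtmp {
>     server {
>         listen 1935;
>         application inbound_rtmp {
>             live on;
>             # Hypothetical wrapper holding the Script 3 command; $app and $name are nginx-rtmp variables.
>             exec_push /home/nginx/main_stream.sh $app $name;
>         }
>         application video_overlay {
>             live on;
>         }
>     }
> }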
> 
> The Inputs and Outputs of this process can be switched as follows, for testing, etc.
> 
> export INPUT_1='-i Input.flv' # Testing Main Stream From Stored File
> export INPUT_1='-i rtmp://localhost:1935/inbound_rtmp/key?listen&listen_timeout=15000' # Live Source
> 
> export INPUT_2='-f image2 -loop 1 -i 640x360-Alpha.png' # Image Overlay From File 
> export INPUT_2='-i  rtmp://localhost:1935/video_overlay/key?listen&listen_timeout=15000' # Live or other Video overlay
> 
> export OUTPUT="-f flv /dev/null" # Output for testing / debugging.
> export OUTPUT="-f mp4 out.mp4" # Save as file Check For Output quality, sync, etc.
> export OUTPUT="-f flv rtmp://a.rtmp.youtube.com/live2/<YouTube Stream Key>" # Live!
> 
> 
> 
> Tweakable Parameters:
> -s 640x360                         # Image Size. Adjust "-b:v", "-maxrate" and "-bufsize" with this.
> -r 30                              # Frame Rate. Probably best to match source if possible.
> -c:v libx264                       # Codec For YouTube
> -x264-params "nal-hrd=cbr"         # CBR
> -qmin 1 -qmax 15                   # A smaller "q" range gives better quality video but a higher bitrate, or possibly a lower frame rate.
> -b:v 8M -maxrate 40M -bufsize 20M  # This combination is for CBR. Increase "-bufsize" if you see VBV underflow errors;
>                                    # increase "-maxrate" when narrowing the "qmin"-"qmax" range.
> -g 15                              # Keyframe interval. Two per second @ 30 FPS.
> -preset ultrafast                  # Least CPU usage. Use a slower preset if your machine has enough headroom, or set the equivalent parameters manually.
> 
> 
> Notes:
> 1) The script that sends the stream to the "overlay" filter blocks when it reaches the line that crops
> a portion of the main stream, if the main stream is not active. This may be favorable, as it will pick
> up once the main stream starts coming in; otherwise, it just waits.
> 
> 2) The main stream blocks if there is no incoming RTMP stream. Again, this may be the desired behavior.
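> 
> If the wait should be retried whenever the listen times out or the source drops, a wrapper in
> the same style as the other scripts could be used (a sketch, not part of the tested setup;
> main_stream.sh is again the hypothetical file holding the Script 3 command):
> 
> while :; do
>   ./main_stream.sh   # Blocks while listening, streams while a source is connected, then exits.
>   sleep 1            # Brief pause before listening again.
> done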
> 
> 3) The "movie" filter is not able to accomplish this kind of dynamic or random overlaying, 
> as the movie to be overlayed is loaded at inception time, and there is no "reload" option. 
> Hence the Subject of this Thread. However, after testing the setup used, it appears that using
> RTMP inputs as a way to create Real-Time injection of media into an existing RTMP stream is actually
> a pretty good way to go. The only failure in this scheme is the inability of FLV to do Alpha Overlays.
> FFMPEG itself worked as expected. But for inability of FLV to handle Alpha, this setup appears that
> it would work just fine.
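> 
> One possible mitigation, not part of the setup tested above, would be to key the black
> background out of the overlay feed inside the main process using the "colorkey" filter.
> A minimal sketch with plain file inputs (file names, thresholds, and placement are assumptions):
> 
> # Hypothetical: treat near-black pixels of the overlay input as transparent before compositing.
> # This also knocks out legitimate black content in the overlay, so it is only a partial workaround.
> ffmpeg -i main.flv -i overlay.flv \
>   -filter_complex "[1:v]colorkey=0x000000:0.05:0.0[ovr];[0:v][ovr]overlay=main_w-overlay_w:0" \
>   -c:v libx264 -preset ultrafast -f flv keyed.flv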
> 
> 4) The trick of cropping an area of the incoming stream, and re-overlaying it on top of the
> outgoing stream, between overlays of other material, is pretty slick. If sync can be maintained
> between the separate FFMPEG processes, it would work. This is more likely to be achievable in a
> standalone application built with the various libav* libraries.
> 
> 
> This thread will be marked as [CLOSED] by the Author, because it is no longer related to
> the "movie" Filter. Additional discussion on this Subject will be posted in a new Thread
> with a relevant Subject Line. This Thread will be updated with a Link to that thread.

