[FFmpeg-user] ffmpeg muxing raw video with audio file
Anatol
anatol2002 at gmail.com
Wed Sep 24 18:59:27 CEST 2014
Putting '-async 1' on the input side (before '-i') might help.
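For illustration, the original command with the suggested flag inserted might look like this (a sketch only; whether '-async' takes effect when placed on the input side depends on the ffmpeg version, and newer builds recommend the aresample filter instead):

```shell
# Sketch: Steffen's command with '-async 1' placed before the audio input
# it should apply to. Paths and rates are taken from the original post.
ffmpeg.exe -f rawvideo -pix_fmt rgb24 -r 30 -s 1280x720 -i \\.\pipe\pipename \
  -async 1 -i audiofile.wav \
  -c:a ac3 -ab 128k -ar 44100 \
  -c:v libx264 -preset slow -crf 20 -pix_fmt yuv420p output.mp4
```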
On Wed, Sep 24, 2014 at 4:00 PM, Steffen Richter - m.objects <
info at mobjects.com> wrote:
> I am trying to let ffmpeg.exe encode and mux a raw video stream (input) and a
> PCM audio file (input) into an MP4 file (output).
> The raw video stream is supplied by a named pipe as it is generated in
> realtime.
>
> The command line is like:
> ffmpeg.exe -f rawvideo -pix_fmt rgb24 -r 30 -s 1280x720 -i
> \\.\pipe\pipename -i audiofile.wav -c:a ac3 -ab 128k -ar 44100 -c:v libx264
> -preset slow -crf 20 -pix_fmt yuv420p output.mp4
>
> All seems to work fine, there is no error message, and the resulting file
> plays without problems in Windows Media Player and QuickTime Player.
> But with VLC player, the sound seems to be messed up, packets are mixed
> during playback.
> A closer look at the file, or rather at the ffmpeg logfile (options -report
> -debug_ts), tells me that all audio packets of the whole audio input file
> are processed within the first few video frames. This means that the last
> audio packet (timestamp 300s) is already written before video frame 30 (1s)
> is processed. This probably leads to an improperly muxed result file.
> When I simply try to write the same to an MTS or MOV container, other
> problems occur with all players (seeking does not work properly),
> so I think this is a general muxing problem.
>
> How can I get audio and video processing synchronized? Is it necessary to
> supply video and audio at a balanced data rate, i.e. to control the audio
> stream through a pipe as well as the video stream?
> If so: how much audio preload would be useful/necessary/common?
>
> Thank you!
> _______________________________________________
> ffmpeg-user mailing list
> ffmpeg-user at ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
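On the preload question quoted above, a rough back-of-the-envelope sketch of how much data each input consumes per frame at the rates in the original command (16-bit stereo PCM is an assumption about audiofile.wav; the real format may differ):

```shell
# Bytes of PCM audio that cover one video frame's duration.
# Rates from the original command; 16-bit stereo is an assumption.
FPS=30
RATE=44100            # audio sample rate (Hz)
CHANNELS=2            # assumed stereo
BYTES_PER_SAMPLE=2    # assumed 16-bit PCM
AUDIO_BYTES_PER_FRAME=$(( RATE / FPS * CHANNELS * BYTES_PER_SAMPLE ))

# One raw rgb24 1280x720 frame on the video pipe (3 bytes per pixel).
VIDEO_BYTES_PER_FRAME=$(( 1280 * 720 * 3 ))

echo "audio: $AUDIO_BYTES_PER_FRAME  video: $VIDEO_BYTES_PER_FRAME"
```

So under these assumptions roughly 5880 bytes of audio correspond to each ~2.7 MB raw video frame; feeding the audio through a pipe at that ratio, with a preload of a few frames' worth, would keep the two inputs balanced. How much preload is actually needed is a judgment call.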