[FFmpeg-user] Use ONVIF 1.0 NTP timestamp for synchronization
jan.koppe at wwu.de
Sun Feb 12 11:03:16 EET 2017
First off - I'm new here, so a short introduction to what I'm trying to do:
I work for a university doing lecture capture. Our system does automatic
lecture recording using ffmpeg, Magewell capture cards for the projector
video (via HDMI) and audio, and Axis P1428-E security network cameras to
film the teachers. Because the video of the teacher and the audio signal
are essentially recorded on two separate machines (the camera runs Linux
and provides an RTP stream), we are having severe issues synchronizing
the two: the start-up latency of the camera stream changes slightly with
every capture.
I've dug a bit into the RTP protocol (very basic understanding only for
now) and saw that our cameras support the ONVIF 1.0 standard, which
specifies that a wallclock NTP timestamp is sent within the RTP header
extension. As far as I can see this would be ideal: the university runs
a dedicated NTP server to which all devices synchronize, so if we could
leverage this timestamp to sync up audio and video, our issues would be
solved (there will probably still be a small offset, but that should be
much more consistent and easy to correct).
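To illustrate what I mean, here is a minimal sketch of pulling that wallclock timestamp out of a raw RTP packet. It assumes the extension layout from the ONVIF Streaming Specification (profile identifier 0xABAC, followed by a 64-bit NTP timestamp in 32.32 fixed-point format); the function name and structure are my own, not anything from ffmpeg:

```python
import struct

# Seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01)
NTP_EPOCH_OFFSET = 2208988800

def parse_onvif_ntp(rtp_packet: bytes):
    """Return the ONVIF NTP wallclock timestamp of an RTP packet as Unix
    seconds (float), or None if the packet carries no ONVIF extension.

    Sketch only - assumes the 0xABAC header-extension profile defined in
    the ONVIF Streaming Specification.
    """
    if len(rtp_packet) < 12:
        return None
    first_byte = rtp_packet[0]
    if (first_byte >> 6) != 2:          # RTP version must be 2
        return None
    if not (first_byte & 0x10):         # X bit clear: no header extension
        return None
    csrc_count = first_byte & 0x0F
    ext_offset = 12 + 4 * csrc_count    # extension follows header + CSRC list
    if len(rtp_packet) < ext_offset + 4:
        return None
    profile, length_words = struct.unpack_from("!HH", rtp_packet, ext_offset)
    if profile != 0xABAC:               # not the ONVIF-defined extension
        return None
    if length_words < 2 or len(rtp_packet) < ext_offset + 4 + 8:
        return None
    # 64-bit NTP timestamp: 32 bits of seconds, 32 bits of fraction
    ntp_sec, ntp_frac = struct.unpack_from("!II", rtp_packet, ext_offset + 4)
    return ntp_sec - NTP_EPOCH_OFFSET + ntp_frac / 2**32
```

If two machines both read this value against the same NTP server, the difference between their first timestamps should give the constant offset I'd need to apply.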
At this point I am completely stuck: I have tried capturing in separate
processes, in a single process with multiple inputs, and with different
-vsync parameters, but have not got it to work at all. It seems that
ffmpeg does not pick up these timestamps.
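For reference, the closest workaround I could come up with is tagging each input with the capture machine's own (NTP-disciplined) wallclock instead of the ONVIF timestamps. This is only a sketch - the camera URL and ALSA device are placeholders for our setup, and it relies on both machines' clocks being synced, not on the in-band RTP timestamp:

```shell
# Stamp both inputs with the local wallclock so they share one
# NTP-disciplined timebase; any residual constant offset would still
# have to be measured once and corrected with -itsoffset.
ffmpeg \
  -use_wallclock_as_timestamps 1 -rtsp_transport tcp \
  -i rtsp://camera.example/axis-media/media.amp \
  -use_wallclock_as_timestamps 1 -f alsa -i hw:0 \
  -map 0:v -map 1:a -c:v copy -c:a aac out.mkv
```

This avoids the variable stream start-up latency in principle, but it obviously ignores the network/encoding delay inside the camera itself, which is why the ONVIF timestamp still looks like the cleaner solution to me.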
Before I dig into ffmpeg's code and try to understand what's going on -
can anybody tell me whether this idea is even worth chasing? Am I
missing something? Better suggestions?
Any help or ideas are much appreciated!
eLectures / LearnWeb
Georgskommende 25 - Room 310
48143 Münster/Westf. - Germany
Tel. + 49 (0) 251 - 83 29295
E-mail: jan.koppe at wwu.de