[FFmpeg-devel] (sponsored) feature request

Edit B bouke at editb.nl
Thu Jul 3 13:05:06 CEST 2014


----- Original Message ----- 
From: "Andrey Utkin" <andrey.krieger.utkin at gmail.com>
To: "FFmpeg development discussions and patches" <ffmpeg-devel at ffmpeg.org>
Sent: Thursday, July 03, 2014 11:57 AM
Subject: Re: [FFmpeg-devel] (sponsored) feature request


> 2014-07-03 11:56 GMT+03:00 Edit B <bouke at editb.nl>:
>> Hi all,
>> Feature request / want to hire a coder for it; IMHO it's not so
>> difficult. (Well, the devil will be in the details of course / famous
>> last words...)
>>
>> I would like to have proper timecode on capturing live streams.
>> As it currently stands, I can start the recording with a timecode
>> value, but there is no fixed relation between the moment FFmpeg is
>> started and the moment the first frame is captured, so there is always
>> an unpredictable offset.
>>
>> What I think 'could' work is adding something that holds FFmpeg at
>> the point just before it starts buffering. (So start up FFmpeg with
>> '-timecode hold_horses' or whatever.)
>> Then, ask for a timecode value as input.
>> On receiving it, start filling the buffer and encoding. (Taking the
>> time between that and initializing the buffer into account, if
>> needed.)
>>
>> That 'should' result in +/- one frame accuracy, as the start time
>> could be just after the beginning of an incoming frame, or very close
>> to the end of one, so there can be an offset of slightly less than
>> one frame duration.
>> No idea how to cope with that at the moment. Perhaps I could fire the
>> values at fixed time intervals of N*frameDur so the offset would
>> always be the same. (All sources may be genlocked.)
>>
>> Instead of taking the timecode from the user, it could also use the
>> system time. My main concern is getting multiple recordings that
>> start at different moments in sync.
>>
>> Does this make sense?
>> Doable?
>
> Sorry for my ignorance, but what exactly is your use case, and what
> inconvenience do you suffer from with current FFmpeg?

FFmpeg takes time to start up, and that time is not always the same 
(between some 200 and 450 ms on my system).
Thus, by feeding a TC on the command line, it's impossible to get a 
correct timecode on the output when encoding a live stream, as by the 
time FFmpeg has the first frame to work on, a couple of frames have 
already passed.
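
For illustration, a capture command along these lines (the DeckLink 
input and ProRes codec here are just placeholders for whatever the 
actual source and codec are):

  ffmpeg -f decklink -i "DeckLink Mini Recorder" \
         -timecode 10:00:00:00 -c:v prores output.mov

stamps 10:00:00:00 on whichever frame FFmpeg happens to receive first, 
and at 25 fps a 200-450 ms start-up delay puts that frame already some 
5 to 11 frames behind the house clock.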
My use case is to record from several inputs at the same time, and I 
need matching TC on all streams (thus NOT the same start TC value on 
all, as the streams never start at exactly the same time).
Is that clearer now?
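
To make the idea concrete, here is a rough sketch of the behaviour I am 
after (plain C, not existing FFmpeg code, and every name in it is made 
up): once the first frame has actually been read, derive the start 
timecode from the system clock, so every recorder gets a timecode that 
matches the house clock no matter how long its own start-up took.

  /* sketch.c -- illustrative only, not part of FFmpeg */
  #include <stdio.h>
  #include <time.h>

  /* Turn a wall-clock instant into an HH:MM:SS:FF timecode string,
   * assuming an integer frame rate (e.g. 25 fps, genlocked sources). */
  static void wallclock_to_timecode(struct timespec ts, int fps, char out[16])
  {
      struct tm tm;
      localtime_r(&ts.tv_sec, &tm);
      int frame = (int)((double)ts.tv_nsec / 1e9 * fps); /* sub-second part */
      snprintf(out, 16, "%02d:%02d:%02d:%02d",
               tm.tm_hour, tm.tm_min, tm.tm_sec, frame);
  }

  int main(void)
  {
      struct timespec now;
      /* In the real feature this would be sampled at the moment the
       * first captured frame arrives, not when the process starts. */
      clock_gettime(CLOCK_REALTIME, &now);
      char tc[16];
      wallclock_to_timecode(now, 25, tc);
      printf("start timecode: %s\n", tc);
      return 0;
  }

With something like that inside FFmpeg, each instance could stamp its 
first frame from the shared clock, and the recordings would line up 
within the +/- one frame uncertainty described above.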

Bouke

> -- 
> Andrey Utkin
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel at ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-devel 

