[FFmpeg-devel] [RFC] Move ffplay engine to separate library

Lukasz M lukasz.m.luki at gmail.com
Tue Oct 1 18:03:23 CEST 2013


On 1 October 2013 12:53, Stefano Sabatini <stefasab at gmail.com> wrote:

> On date Monday 2013-09-30 23:29:33 +0200, Lukasz M encoded:
> > Hi,
> >
> > I'd like to propose a new library for the ffmpeg project that
> > provides a middle layer between the current ffmpeg tools and a
> > player built on top of them.
> > There is a lot of work needed to make an ffmpeg-based player behave
> > correctly, and most of it is probably common to all players.
> > It would be good to make this part of the ffmpeg project so it can
> > be maintained in one place and people can contribute to it.
> >
> > Some time ago I made an iOS player for my own purposes, based on
> > ffmpeg, and I ported ffplay as its core.
> > Unfortunately it is hard to keep it up to date with git master, and
> > I believe there are many people who would appreciate this kind of
> > library inside ffmpeg.
> >
> > I did some work as a kind of "proof of concept" and pushed it to
> > https://github.com/lukaszmluki/ffmpeg/tree/libavengine
> >
> > This is not ready to merge; on the contrary, a lot of things are
> > unfinished. It is just a vision of, more or less, how it would look.
> >
> > In general I wanted to make a library that is platform independent
> > (it depends only on ffmpeg). As it requires threads, synchronization,
> > etc., those would be provided as callbacks by the player (with ffplay
> > acting as the internal player).
> > These callbacks would be global (initialized once).
> >
> > There would also be callbacks for player-side tasks such as audio
> > initialization / closing, picture rendering, etc.
> > The player would need to call video_refresh periodically and
> > sdl_audio_callback whenever the audio system needs data (the names
> > are kept from ffplay).
> >
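
(To make the callback idea a bit more concrete, here is a rough sketch of
how such global callbacks might be registered; the struct and function
names below are hypothetical illustrations, not the actual API from the
libavengine branch.)

#include <libavutil/frame.h>

/* Hypothetical sketch: platform facilities the engine cannot provide
 * itself (threads, locking) are passed in once by the player, together
 * with player-side callbacks for audio output and picture rendering. */
typedef struct AVEngineCallbacks {
    /* platform layer, registered once and used globally */
    void *(*thread_create)(void *(*func)(void *arg), void *arg);
    void  (*thread_join)(void *thread);
    void *(*mutex_create)(void);
    void  (*mutex_lock)(void *mutex);
    void  (*mutex_unlock)(void *mutex);
    void  (*mutex_destroy)(void *mutex);

    /* player layer: audio device handling and picture rendering */
    int   (*audio_open)(void *opaque, int sample_rate, int channels);
    void  (*audio_close)(void *opaque);
    void  (*render_picture)(void *opaque, const AVFrame *frame);
} AVEngineCallbacks;

/* Called once by the player before any other engine function. */
int avengine_set_callbacks(const AVEngineCallbacks *cb);

The player would then keep driving the engine itself, e.g. calling the
equivalent of video_refresh() from its display loop and the equivalent
of sdl_audio_callback() from its audio thread, as described above.
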
>
> > At this moment I don't want to go into details; I just want to hear
> > your opinion about this idea.
>
> One problem with your approach is that it depends on SDL. Ideally a
> player should be able to select the output device depending on
> availability/preference, and a generic framework such as FFmpeg should
> be able to abstract away from any specific implementation.
>
> As I envision FFmpeg, a player should be built from the basic blocks
> provided by the libraries, in particular by composing filters and
> input / output devices.
>
> For example you could have something like this:
>
> [input device] -> [movie source] -> [filters] -> [output device]
> |--------------------------------------------------------------|
>                             ^
>                             |
>                             v
>                     [control device]
>
> and provide some way to send interactive commands, e.g. to adjust
> filtering and/or to seek back.
>
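
Regarding sending interactive commands to adjust filtering: libavfilter
already exposes avfilter_graph_send_command() for this. A minimal
sketch, assuming the running graph contains a hue filter instance that
accepts runtime commands (the wrapping function is just an
illustration):

#include <libavfilter/avfilter.h>

/* Ask every hue filter instance in the graph to change its saturation.
 * Returns >= 0 on success or a negative AVERROR code. */
static int set_saturation(AVFilterGraph *graph, const char *value)
{
    char response[256];

    return avfilter_graph_send_command(graph, "hue", "s", value,
                                       response, sizeof(response), 0);
}

A control block like the one in the diagram above could forward user
input (key presses, slider changes) to calls like this while the engine
keeps pushing frames through the graph.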

By output device, do you mean what is available in libavdevice?
I'm also not sure what you mean by input device and movie source.
I understand the general idea, but not what you mean by these
particular blocks.

