[FFmpeg-devel] [RFC] avpriv cleanup
michael at niedermayer.cc
Mon Mar 26 01:06:11 EEST 2018
On Sun, Mar 25, 2018 at 10:24:31PM +0200, Nicolas George wrote:
> > If they used distro packages then they wouldn't have any power over what
> > gets built or shipped. Distro packages are the "Enable everything" kind
> > of build, so I'm of course talking about projects shipping their own
> > builds of the libraries.
> I have nothing to add to that. But you only answered half the question.
> The other half was: do they use static or dynamic linking?
> I am not sure how familiar you are with the precise workings of both
> solutions, so let me summarize a bit.
> With static linking, libraries are just archives or object files (*.o),
> with an index of symbols. The linker will examine each archive in turn
> and select the object files that contain the required symbols.
> With dynamic linking, the linker takes note of all the libraries,
> checks that all symbols are resolved, and registers them as dependencies of
> the executable file.
> What is remarkable about static linking is that it only takes the object
> files that are necessary, nothing more. It does not matter whether there
> is one library, eight, or two thousand (one for each object file in the
> project); the linker will take what it needs from all of that.
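The selection behaviour described above can be reproduced with a toy archive (a sketch assuming a POSIX/ELF toolchain with cc, ar, and nm; all file and symbol names here are made up for illustration):

```shell
# Two translation units in one archive; only one is referenced.
cat > used.c <<'EOF'
int used_fn(void) { return 1; }
EOF
cat > unused.c <<'EOF'
int unused_fn(void) { return 2; }
EOF
cat > main.c <<'EOF'
int used_fn(void);
int main(void) { return used_fn(); }
EOF
cc -c used.c unused.c main.c
ar rcs libdemo.a used.o unused.o
cc -o demo main.o -L. -ldemo
# The linker selected only the object file containing the needed symbol:
nm demo | grep -c 'T used_fn'      # 1: the referenced object was pulled in
nm demo | grep -c 'T unused_fn'    # 0: unused.o was never selected
```

The same selection happens whether used.o and unused.o sit in one archive or in two; the linker's unit of granularity is the object file, not the library.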
at the risk of stating the obvious, and talking about something you
assumed would be reverted before all this ...
This largely stopped working with the new iteration APIs, which have arrays of
every component in each lib, referenced by mandatory functions,
thus referencing everything and pulling every symbol in.
It still kind of works, I guess, if you never use any codec, any filter, any
muxer or demuxer, and only some component from libavutil. But that is a far
cry from working well.
> Therefore, these project who insist on small libraries could achieve
> their goal with the simplest possible solution: they should be using
> static linking. It would work automatically, because linkers have been
> designed that way.
> It is possible that for some reason, they need a shared library. In that
> case, the best solution is to make their own: put all the functions that
> use lav* together somewhere (most projects will abstract the lav* APIs
> to suit their own design choices anyway), and link that statically with
> FFmpeg, making a shared library perfectly tailored to the needs of the
> application.
> And if you think about it, you will realize that what I propose has the
> consequence of making that simpler for them.
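The tailored-shared-library idea quoted above can be sketched with a toy library standing in for FFmpeg (ELF/GNU toolchain assumed; an anonymous version script hides everything except the wrapper's own API):

```shell
cat > inner.c <<'EOF'
int inner_decode(int x) { return x + 1; }       /* stands in for lavc */
EOF
cat > wrapper.c <<'EOF'
int inner_decode(int x);
int app_decode(int x) { return inner_decode(x); }  /* the app's own API */
EOF
cat > wrapper.map <<'EOF'
{ global: app_decode; local: *; };
EOF
cc -c -fPIC inner.c wrapper.c
ar rcs libinner.a inner.o
# Link the static library into the wrapper .so, exporting only app_decode:
cc -shared -o libapp.so wrapper.o -L. -linner -Wl,--version-script=wrapper.map
nm -D libapp.so | grep -c 'T app_decode'   # 1: the wrapper API is exported
```

With the real libraries, the `-linner` step would instead pull in static builds of the needed lav* libs, and static selection then trims them to what the wrapper actually references.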
> > Steam, Foobar2000, Firefox, Chromium, only four examples of projects
> > where the av*.dll files they ship have long configure lines where they
> > manually disable all kinds of things beyond the standard
> > "--disable-everything --enable-decoder=foo" use case, including entire
> > libraries and frameworks, because they only need a handful of software
> > decoders.
> This is true, but misleading. First of all, I would like to emphasize
> that the framework code is very small compared to the code of the
> components.
> But most importantly, the granularity of the modules is not the correct
> one. Consider an application that needs to decode from a few files. It
> requires the lavc decoder framework and a few decoders, and the lavf
> demuxing framework and a few demuxers. All encoders and all muxers are
> disabled. Yet, it will include the encoding framework and the muxing
> framework.
> Now, you may say that it calls for more modularity, like you advocate
> below, and you would be right. But this is missing something, see below.
> > And Chromium, as I said, even goes the extra mile by manually stripping
> > libavutil of virtually everything except the core modules, something our
> > build system currently doesn't support.
> And with my full proposal, our build system should actually support it.
> > Building a single monolithic library will force the presence of a lot of
> > public symbols that currently can be avoided by simply building ffmpeg
> > for one decoder and effectively require just avcodec and avutil.
> > The direction we should head towards is making the libraries even more
> > modular and independent, starting with libavutil.
> You are entirely right here, but it is actually not really related to
> the number of libraries.
> Modularity has a cost: tracking the dependencies between components. If
> you implement modularity with many libraries, you push that complexity
> into the build system. This is a waste, since the linker already has all
> the algorithms to track dependencies.
> As I have explained above, static linking already achieves maximum
> modularity automatically, regardless of whether there is one
> library or many.
It still needs some care not to introduce unneeded references.
No matter whether it is one lib or 10 libs, if symbols are referenced, they
and everything they reference are pulled in.
You just need one function with a conditional call to some help printout
that iterates over all components (as the code currently does), and the
linker must include everything.
I think, independent of 1 vs 10 libs, some tests to check that static linking
of a few individual components doesn't regress size-wise would be a good
idea.
Michael
GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
The bravest are surely those who have the clearest vision
of what is before them, glory and danger alike, and yet
notwithstanding go out to meet it. -- Thucydides