[FFmpeg-devel] MediaCodec support

Matthieu Bouron matthieu.bouron at gmail.com
Tue Feb 16 14:47:51 CET 2016


On Tue, Feb 16, 2016 at 12:36 PM, wm4 <nfxjfg at googlemail.com> wrote:

> On Tue, 16 Feb 2016 12:09:58 +0100
> Matthieu Bouron <matthieu.bouron at gmail.com> wrote:
>
> > On Tue, Feb 16, 2016 at 10:41 AM, wm4 <nfxjfg at googlemail.com> wrote:
> >
> > > On Mon, 15 Feb 2016 18:52:25 +0100
> > > Matthieu Bouron <matthieu.bouron at gmail.com> wrote:
> > >
> > > > Hello,
> > > >
> > > > The following patchset adds basic MediaCodec support to libavcodec,
> > > > i.e. only h264 is supported and the HWAccel part (Surface output) is
> > > > missing.
> > > >
> > > > JNI comes as a dependency. The JNI support is based on the same
> > > > patchset I sent some time ago, with some improvements.
> > > >
> > > > I originally developed the patch against the NDK API (Android >= 5.0)
> > > > but then changed my mind and went with the JNI version for two main
> > > > reasons:
> > > >
> > > >   * there are still too many Android 4 devices
> > > >   * there is still a need for some JNI bits, as the MediaCodec NDK API
> > > >   does not provide a way to know the codec name, which is mandatory
> > > >   to work around or blacklist some implementations (i.e. do not use
> > > >   known software decoders, work around OMX.SEC.avc.dec as it returns
> > > >   invalid stride and slice-height values, ...)
> > > >
> > >
> > > I guess there's no way around it.
> > >
> > > > I decided to mimic the NDK API minus a few differences (see
> > > > mediacodec_wrapper.h) so it can be ported more easily to the C API in
> > > > the future. The other reason is to completely hide the JNI code.
> > > >
> > > > The HWAccel part is on my todo list, but I wanted a real use case to
> > > > develop the API against.
> > > >
> > > > The development branch can be found here:
> > > > https://github.com/mbouron/FFmpeg/tree/feature/mediacodec-support
> > > >
> > > > --enable-jni and --enable-mediacodec are required to build the
> > > > h264_mediacodec decoder.
> > > >
> > > > av_jni_register_java_vm(vm) must be called before lavc is used.
> > >
> > > Wasn't there some sort of trick that could avoid this?
> > >
> >
> > The workaround for this is to call a *private* C++ API, in particular
> > checking that the variable jni_invocation_ is initialized, and then
> > calling the GetCreatedJavaVMs function from:
> >
> > https://android.googlesource.com/platform/libnativehelper/+/master/JniInvocation.cpp
> >
> >
>
> If I read this right, the host can somehow initialize the JNI, and then
> JNI_GetCreatedJavaVMs() will work?
>

AFAIK, this is initialized when the VM (Dalvik or ART) is initialized.

>
> Though it looks like that function will just crash if the jni stuff is
> not initialized. With no way to ensure or verify initialization using
> public (or just C++?) API. Well, I guess this is Android quality code...
>

Well, it is... this API is not public.
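
For reference, a rough, untested sketch of what the explicit registration
looks like from an application's JNI_OnLoad, next to what the
JNI_GetCreatedJavaVMs route would be (the header path and exact declaration
of av_jni_register_java_vm() are assumptions based on the branch, not a
final API):

#include <jni.h>
#include "libavcodec/jni.h"   /* assumed header location */

/* Sketch only: register the VM from the application's JNI_OnLoad. */
JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved)
{
    av_jni_register_java_vm(vm);
    return JNI_VERSION_1_6;
}

/* The alternative discussed above: ask the process for an already created
 * VM. This only works if the VM (Dalvik/ART) has already been brought up,
 * and there is no public way to check that beforehand. */
static JavaVM *get_existing_vm(void)
{
    JavaVM *vm   = NULL;
    jsize nb_vms = 0;

    if (JNI_GetCreatedJavaVMs(&vm, 1, &nb_vms) == JNI_OK && nb_vms > 0)
        return vm;
    return NULL;
}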


>
> > > > The patchset also includes support for Android content URIs, which is
> > > > not mandatory for the MediaCodec support but helps deal with them more
> > > > seamlessly.
> > >
> > > I'm still not convinced that this is necessary (custom I/O allows any
> > > application to provide its own I/O callbacks). This would also avoid
> > > the need for an avpriv JNI API, since it'd be confined to libavcodec.
> > >
> >
> > Content URIs are the proper way to deal with media on Android since
> > version 5.0.
>
> What exactly does this mean? What are content URIs anyway? Some crazy
> Android-specific crap to make URLs harder? Is it not possible to turn
> this into something reasonable on a higher level? Does MediaCodec have
> to access it? (Would be strange since the demuxer would be between
> that.)
>

Content uris are a "protocol" used to access files from other apps (meaning
the accessed files can be private).
Every application can create its own content uris and attaches temporary or
definitive permissions (r/w), so other applications can acces its data.

Example:
>From your application (A), the user choose to grab some content from
application (B) (let's say a photos gallery), it opens application (B) and
let the user choose a collection of medias. When the selection is done,
Application (A) get the selection of medias as a collection of content uris
(which have generally temporary read permissions). At this point you can't
have a file path because the content could be on the app (B) private
storage but you can have access to a filestream (java) or a regular file
descriptor.
Note: It is considered really bad practice to copy the content of the uri
locally to have access to a regular file path.
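
To give an idea, here is a rough, untested C/JNI sketch (not the patchset
code; naming and error handling simplified) of how a content URI gets
resolved to a plain file descriptor through the content resolver:

#include <jni.h>

/* Sketch: resolve a content:// URI to a raw fd. Assumes a valid JNIEnv and
 * the application context; error and local-reference handling omitted. */
static int fd_from_content_uri(JNIEnv *env, jobject app_context,
                               const char *uri_str)
{
    /* Uri uri = Uri.parse(uri_str); */
    jclass uri_class  = (*env)->FindClass(env, "android/net/Uri");
    jmethodID parse   = (*env)->GetStaticMethodID(env, uri_class, "parse",
        "(Ljava/lang/String;)Landroid/net/Uri;");
    jobject uri       = (*env)->CallStaticObjectMethod(env, uri_class, parse,
        (*env)->NewStringUTF(env, uri_str));

    /* ContentResolver resolver = context.getContentResolver(); */
    jclass ctx_class  = (*env)->GetObjectClass(env, app_context);
    jmethodID get_res = (*env)->GetMethodID(env, ctx_class,
        "getContentResolver", "()Landroid/content/ContentResolver;");
    jobject resolver  = (*env)->CallObjectMethod(env, app_context, get_res);

    /* ParcelFileDescriptor pfd = resolver.openFileDescriptor(uri, "r"); */
    jclass res_class  = (*env)->GetObjectClass(env, resolver);
    jmethodID open_fd = (*env)->GetMethodID(env, res_class,
        "openFileDescriptor",
        "(Landroid/net/Uri;Ljava/lang/String;)Landroid/os/ParcelFileDescriptor;");
    jobject pfd       = (*env)->CallObjectMethod(env, resolver, open_fd, uri,
        (*env)->NewStringUTF(env, "r"));

    /* return pfd.detachFd(); the raw fd can then be fed to the file protocol
     * functions. */
    jclass pfd_class  = (*env)->GetObjectClass(env, pfd);
    jmethodID detach  = (*env)->GetMethodID(env, pfd_class, "detachFd", "()I");
    return (*env)->CallIntMethod(env, pfd, detach);
}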


> > Having it in lavf as a protocol would spare anyone who wants to support
> > it in their application from redoing a custom I/O wrapper around it.
> > IMHO, it's like the other protocols we already support (samba, ssh,
> > gopher, icecast, ...) and the code that adds its support is not intrusive
> > (it just returns a fd that is then used by the file protocol functions).
>
> Can't judge it, but we don't like all these "extra" protocols much,
> simply because lavf does not do a very good job at abstracting them
> well while still exposing their full capabilities. But I guess it's
> open to discussion.
>
> Why not just add a way to make lavf use an existing FD?
>

This is something that can be done too.
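
For what it's worth, custom I/O already covers the "existing fd" case; an
untested sketch of that (buffer size arbitrary, error handling omitted)
would look like this:

#include <stdint.h>
#include <unistd.h>
#include "libavformat/avformat.h"
#include "libavutil/mem.h"

/* Sketch: wrap an already opened fd in an AVIOContext. */
static int fd_read(void *opaque, uint8_t *buf, int buf_size)
{
    int n = read((int)(intptr_t)opaque, buf, buf_size);
    return n > 0 ? n : AVERROR_EOF;
}

static int64_t fd_seek(void *opaque, int64_t offset, int whence)
{
    if (whence == AVSEEK_SIZE)
        return -1; /* size not reported in this sketch */
    return lseek((int)(intptr_t)opaque, offset, whence & ~AVSEEK_FORCE);
}

static int open_input_from_fd(AVFormatContext **fmt_ctx, int fd)
{
    uint8_t *buffer   = av_malloc(4096);
    AVIOContext *avio = avio_alloc_context(buffer, 4096, 0,
                                           (void *)(intptr_t)fd,
                                           fd_read, NULL, fd_seek);

    *fmt_ctx       = avformat_alloc_context();
    (*fmt_ctx)->pb = avio;
    return avformat_open_input(fmt_ctx, NULL, NULL, NULL);
}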


>
> > The issue here is its JNI dependency, right?
>
> I guess so.
>
> >
> > >
> > > > In order to use this support, av_jni_register_application_context(env,
> > > > context) must be called before lavf/lavu is used.
> > >
> > > For "content URIs"?
> > >
> >
> > Yes, for content URI usage.
>
> So what's this application context?
>

The application context (reference:
http://developer.android.com/reference/android/content/Context.html#getApplicationContext%28%29)
can access various pieces of information about the application, like the
file directories, cache directories, ...
You can also access the content resolver of the application (this is what
we are interested in here), which is the interface responsible for resolving
content URIs and, in this case, getting fds out of them.
The application context also lets you access the application class loader,
which can load classes bundled with the application (as opposed to the
system ones, which are part of Java or the Android SDK).
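
To illustrate that last point (a rough, untested sketch, not the patchset
code), this is roughly what fetching the application class loader through
JNI looks like once the application context is available:

#include <jni.h>

/* Sketch: fetch the application class loader from the registered context.
 * Its loadClass() can then resolve classes bundled with the application,
 * where FindClass() from a native thread would only see system classes. */
static jobject get_app_class_loader(JNIEnv *env, jobject app_context)
{
    jclass ctx_class     = (*env)->GetObjectClass(env, app_context);
    jmethodID get_loader = (*env)->GetMethodID(env, ctx_class,
        "getClassLoader", "()Ljava/lang/ClassLoader;");
    return (*env)->CallObjectMethod(env, app_context, get_loader);
}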

