[FFmpeg-devel] [PATCH 4/4] avfilter/vf_v360: refactor (i)flat_range for fisheye

Daniel Playfair Cal daniel.playfair.cal at gmail.com
Tue Mar 23 08:31:50 EET 2021


Do you agree with my definition or not? And which code are you referring to
- the master branch or my patches?

I'd like to get these patches to a point where they can be applied, but
it's going to be difficult if we can't agree on the goal.

On Tue, Mar 23, 2021 at 5:15 PM Paul B Mahol <onemda at gmail.com> wrote:

>
>
> On Tue, Mar 23, 2021 at 5:00 AM Daniel Playfair Cal <
> daniel.playfair.cal at gmail.com> wrote:
>
>> What exactly is your definition of fisheye?
>>
>
> Take a look at the source code. I do not see how your definition matches the
> one in the source code.
>
>
>>
>> The definition I'm working with is the equidistant fisheye projection as
>> described here: https://wiki.panotools.org/Fisheye_Projection, i.e.
>> r = f * theta.
>>
>> That mapping works for any theta: you can have a circular image with a
>> field of view of up to 360 degrees before anything repeats and the inverse
>> mapping becomes ambiguous. Hence my assumption that a rectangular output
>> image with a 180 degree horizontal/vertical field of view should still
>> contain areas near the corners where theta > 90 degrees (because the
>> diagonal FoV is > 180 degrees), and that these areas should still be
>> mapped from such an image to an equirectangular projection.
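>>
>> To make that concrete, here is a minimal sketch of the equidistant
>> unprojection I mean (illustrative only - the function and parameter
>> names are mine, not code from vf_v360.c):
>>
>>     #include <math.h>
>>
>>     /* Equidistant fisheye, r = f * theta: the angle theta from the
>>      * optical axis maps linearly to the radius in the image plane.
>>      * uf, vf are image plane coordinates scaled so that
>>      * hypot(uf, vf) == 1 at the edge of the fisheye circle, and
>>      * max_theta is half the fisheye FoV in radians (up to pi for a
>>      * 360 degree FoV). The result is a unit direction vector with
>>      * the optical axis along +z. */
>>     static void equidistant_to_dir(float uf, float vf, float max_theta,
>>                                    float vec[3])
>>     {
>>         const float theta = max_theta * hypotf(uf, vf);
>>         const float phi   = atan2f(vf, uf);
>>
>>         vec[0] = sinf(theta) * cosf(phi);
>>         vec[1] = sinf(theta) * sinf(phi);
>>         vec[2] = cosf(theta);
>>     }
>>
>> Nothing in that mapping breaks down at theta = 90 degrees, which is why
>> I expect the corners of a 180x180 rectangular fisheye image to still be
>> usable.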
>>
>> Do you prefer, for some reason, to limit the fisheye projection to 180
>> degrees on any axis, i.e. to impose the constraint that theta <= 90? If
>> that's the case, I could patch xyz_to_fisheye and fisheye_to_xyz so that
>> such areas are marked as invisible; that makes your example filtergraph
>> work as before.
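>>
>> For concreteness, the kind of change I have in mind (hypothetical, not
>> part of this patch set) would be to extend the existing bounds check in
>> xyz_to_fisheye, with a corresponding restriction in fisheye_to_xyz, so
>> that points more than 90 degrees from the optical axis, i.e. with
>> vec[2] < 0, are rejected as well:
>>
>>     /* hypothetical extra constraint: reject theta > 90 degrees */
>>     const int visible = vec[2] >= 0.f &&
>>                         -0.5f < uf && uf < 0.5f &&
>>                         -0.5f < vf && vf < 0.5f;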
>>
>
>> On Tue, Mar 23, 2021 at 3:46 AM Paul B Mahol <onemda at gmail.com> wrote:
>>
>>>
>>>
>>> On Mon, Mar 22, 2021 at 1:35 PM Daniel Playfair Cal <
>>> daniel.playfair.cal at gmail.com> wrote:
>>>
>>>> > I disagree; if I use 180 hfov and 180 vfov it should not have extra
>>>> > areas, only half of the previous input.
>>>>
>>>> Not sure I follow - the ih_fov and iv_fov options refer to the input (i.e.
>>>> the fisheye image). If you wanted to restrict the FoV of the output, surely
>>>> the way to do that would be to implement and use the FoV settings for the
>>>> equirectangular projection? It doesn't seem right that the code for the
>>>> input projection is responsible for deciding what appears in the output. My
>>>> understanding was that the FoV settings simply describe the focal length of
>>>> the input or output camera so that points in the images can be mapped
>>>> to/from 3D coordinates.
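>>>>
>>>> (For example, for an equidistant fisheye input the horizontal FoV just
>>>> fixes the focal length in pixels, roughly
>>>>
>>>>     f = (width / 2) / (ih_fov / 2, in radians)
>>>>
>>>> and that f is then used in r = f * theta for every pixel - it says
>>>> nothing about which parts of the scene should survive into the output.)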
>>>>
>>>>
>>> Take any equirectangular input, convert it to fisheye and then back
>>> to equirectangular.
>>> Or just take pure fisheye input with 180 h & v fov and convert it to
>>> equirectangular. There is plenty of such video content on the ESA website.
>>>
>>>> To give you an idea of what I am trying to fix, here is an example
>>>> input: https://photos.app.goo.gl/o51NfY6aqWn3unPG6
>>>> This is a 1920x1440 image taken on a GoPro Hero 5 Black with the 4:3
>>>> Wide FoV setting and stabilisation disabled.
>>>>
>>>>
>>> That is a flat take of something else, not real fisheye input.
>>>
>>>
>>>> The following filtergraph demonstrates the issues:
>>>> 'v360=input=fisheye:ih_fov=116.66:iv_fov=87.50:output=flat:d_fov=145.8'
>>>>  1. the dfov_from_hfov issue is worked around by the use of ih_fov and
>>>> iv_fov instead of id_fov (see the worked numbers below), although you can
>>>> try with id_fov=145.8 to see that problem too
>>>>  2. by default the output has double the aspect ratio of the input,
>>>> even though the fisheye -> rectilinear transformation doesn't change the
>>>> aspect ratio (assuming the entire input image is included as it is in this
>>>> example)
>>>>  3. much of the input is not visible in the output even though there is
>>>> a mapping between the chosen projections (changed in the visibility test
>>>> patch)
>>>>
>>>> Issue 3 in particular I don't think can be solved by changing the
>>>> settings: the input field of view needs to match the FoV of the input
>>>> camera, otherwise the mapping is wrong, and yet there seems to be no
>>>> other way to include the entire input from a fisheye image.
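>>>>
>>>> For reference, the worked numbers behind point 1: assuming the
>>>> equidistant model with the same scale on both axes, the diagonal FoV
>>>> should follow from the horizontal (or vertical) FoV via the ratio of
>>>> the image diagonal to its width (or height):
>>>>
>>>>     diag   = sqrt(1920^2 + 1440^2) = 2400
>>>>     id_fov = ih_fov * diag / width  = 116.66 * 2400 / 1920 ~ 145.8
>>>>            = iv_fov * diag / height =  87.50 * 2400 / 1440 ~ 145.8
>>>>
>>>> i.e. id_fov=145.8 and the ih_fov/iv_fov pair above ought to describe
>>>> the same camera.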
>>>>
>>>> On Mon, Mar 22, 2021 at 5:59 PM Paul B Mahol <onemda at gmail.com> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Mon, Mar 22, 2021 at 5:09 AM Daniel Playfair Cal <
>>>>> daniel.playfair.cal at gmail.com> wrote:
>>>>>
>>>>>> I've tried that filtergraph and a few other similar ones and I'm not
>>>>>> sure what you mean - what exactly is the regression?
>>>>>>
>>>>>> I tried it on this image with an equirectangular projection:
>>>>>> https://wiki.panotools.org/images/0/01/Big_ben_equirectangular.jpg
>>>>>>
>>>>>> The only difference I can see is that there are fewer unmapped areas
>>>>>> in the output with the patches, because the final mapping from the output
>>>>>> equirectangular image to the intermediate fisheye image no longer fails to
>>>>>> map some areas which are present in the fisheye image. I would describe
>>>>>> this as an improvement?
>>>>>>
>>>>>
>>>>> I disagree; if I use 180 hfov and 180 vfov it should not have extra
>>>>> areas, only half of the previous input.
>>>>>
>>>>>
>>>>>>
>>>>>> On Mon, Mar 22, 2021 at 3:30 AM Paul B Mahol <onemda at gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Sorry, but I cannot apply this set as is; it introduces at least one
>>>>>>> serious regression.
>>>>>>>
>>>>>>> For example try this filtergraph:
>>>>>>>
>>>>>>>
>>>>>>> v360=input=e:output=fisheye:h_fov=180:v_fov=180,v360=input=fisheye:output=e:ih_fov=180:iv_fov=180
>>>>>>>
>>>>>>> On Sun, Mar 21, 2021 at 1:45 PM Daniel Playfair Cal <
>>>>>>> daniel.playfair.cal at gmail.com> wrote:
>>>>>>>
>>>>>>>> This changes the iflat_range and flat_range values for the fisheye
>>>>>>>> projection to match their meaning for the flat/rectilinear
>>>>>>>> projection.
>>>>>>>> That is, the range spans between the two x (or two y) coordinates of
>>>>>>>> the outermost points to the left/right (or above/below) of the center,
>>>>>>>> as in the flat/rectilinear projection.
>>>>>>>>
>>>>>>>> Signed-off-by: Daniel Playfair Cal <daniel.playfair.cal at gmail.com>
>>>>>>>> ---
>>>>>>>>  libavfilter/vf_v360.c | 19 +++++++++----------
>>>>>>>>  1 file changed, 9 insertions(+), 10 deletions(-)
>>>>>>>>
>>>>>>>> diff --git a/libavfilter/vf_v360.c b/libavfilter/vf_v360.c
>>>>>>>> index 68bb2f7b0f..3158451963 100644
>>>>>>>> --- a/libavfilter/vf_v360.c
>>>>>>>> +++ b/libavfilter/vf_v360.c
>>>>>>>> @@ -2807,9 +2807,8 @@ static int prepare_fisheye_out(AVFilterContext *ctx)
>>>>>>>>  {
>>>>>>>>      V360Context *s = ctx->priv;
>>>>>>>>
>>>>>>>> -    s->flat_range[0] = s->h_fov / 180.f;
>>>>>>>> -    s->flat_range[1] = s->v_fov / 180.f;
>>>>>>>> -
>>>>>>>> +    s->flat_range[0] = 0.5f * s->h_fov * M_PI / 180.f;
>>>>>>>> +    s->flat_range[1] = 0.5f * s->v_fov * M_PI / 180.f;
>>>>>>>>      return 0;
>>>>>>>>  }
>>>>>>>>
>>>>>>>> @@ -2827,8 +2826,8 @@ static int fisheye_to_xyz(const V360Context *s,
>>>>>>>>                            int i, int j, int width, int height,
>>>>>>>>                            float *vec)
>>>>>>>>  {
>>>>>>>> -    const float uf = s->flat_range[0] * ((2.f * i) / width  - 1.f);
>>>>>>>> -    const float vf = s->flat_range[1] * ((2.f * j + 1.f) / height - 1.f);
>>>>>>>> +    const float uf = 2.f * s->flat_range[0] / M_PI * ((2.f * i) / width  - 1.f);
>>>>>>>> +    const float vf = 2.f * s->flat_range[1] / M_PI * ((2.f * j + 1.f) / height - 1.f);
>>>>>>>>
>>>>>>>>      const float phi   = atan2f(vf, uf);
>>>>>>>>      const float theta = M_PI_2 * (1.f - hypotf(uf, vf));
>>>>>>>> @@ -2858,8 +2857,8 @@ static int prepare_fisheye_in(AVFilterContext *ctx)
>>>>>>>>  {
>>>>>>>>      V360Context *s = ctx->priv;
>>>>>>>>
>>>>>>>> -    s->iflat_range[0] = s->ih_fov / 180.f;
>>>>>>>> -    s->iflat_range[1] = s->iv_fov / 180.f;
>>>>>>>> +    s->iflat_range[0] = 0.5f * s->ih_fov * M_PI / 180.f;
>>>>>>>> +    s->iflat_range[1] = 0.5f * s->iv_fov * M_PI / 180.f;
>>>>>>>>
>>>>>>>>      return 0;
>>>>>>>>  }
>>>>>>>> @@ -2882,10 +2881,10 @@ static int xyz_to_fisheye(const V360Context *s,
>>>>>>>>  {
>>>>>>>>      const float h   = hypotf(vec[0], vec[1]);
>>>>>>>>      const float lh  = h > 0.f ? h : 1.f;
>>>>>>>> -    const float phi = atan2f(h, vec[2]) / M_PI;
>>>>>>>> +    const float phi = atan2f(h, vec[2]);
>>>>>>>>
>>>>>>>> -    float uf = vec[0] / lh * phi / s->iflat_range[0];
>>>>>>>> -    float vf = vec[1] / lh * phi / s->iflat_range[1];
>>>>>>>> +    float uf = 0.5f * vec[0] / lh * phi / s->iflat_range[0];
>>>>>>>> +    float vf = 0.5f * vec[1] / lh * phi / s->iflat_range[1];
>>>>>>>>
>>>>>>>>      const int visible = -0.5f < uf && uf < 0.5f && -0.5f < vf && vf < 0.5f;
>>>>>>>>      int ui, vi;
>>>>>>>> --
>>>>>>>> 2.31.0
>>>>>>>>

