[FFmpeg-user] Does atadenoise work with 10bit HDR video?
Oliver Fromme
oliver at fromme.com
Thu Jun 27 15:01:47 EEST 2024
Paul B Mahol wrote:
> On Wed, Jun 26, 2024 at 11:43 PM Oliver Fromme <oliver at fromme.com> wrote:
> > I've been using ffmpeg for a long time, but I've only recently started
> > to use it with 10bit HDR (HDR10) video.
> >
> > So far I found out how to re-encode HDR10 video with libx265 and
> > retain the HDR metadata. It's a bit complicated, but it seems to
> > work fine. However, I'm unsure about video filters.
> >
> > In particular, I often use the atadenoise filter. It works really
> > well for 8bit SDR video. But can it be used with HDR10 video, too?
> > Can I simply add `-vf atadenoise` to the command and it'll work,
> > or will it clip the data down to 8bit? Is there anything else that
> > I need to take into account?
> >
>
> You could already try it via several ways...
Well, of course I tried simply using `-vf atadenoise` with a
sample HDR10 video. There were no error messages, and I couldn't
see a problem, but eyes can deceive you. I may have missed a
potential problem.
For example, I wondered whether the atadenoise filter is aware of the
color space and transfer function. Does it need to be? Or do I have
to convert the 10bit pixel data to bt709 or linear light (maybe even
16bit), run atadenoise, and then convert back to the actual HDR10
color space (bt2020nc, smpte2084)? Apparently I don't have to, but
I'm not 100% sure.
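For what it's worth, this is the kind of command line I'm using. It's only
a sketch: input.mkv/output.mkv are placeholders, and the color tags assume
the usual HDR10 combination (bt2020 primaries, PQ transfer, bt2020nc
matrix) -- they have to match whatever your actual source uses:

```shell
# Denoise a 10bit HDR10 file and keep the stream tagged as HDR10.
# The -color_* options tag the output stream; the x265-params repeat
# the same values in the encoder so they end up in the bitstream VUI.
ffmpeg -i input.mkv \
  -vf atadenoise \
  -c:v libx265 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params "hdr10=1:repeat-headers=1:colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc" \
  -c:a copy output.mkv
```

(Static metadata such as master-display/max-cll would additionally have to
be read from the source and passed through, which I've left out here.)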
> The filter supports >8bit pixel formats too.
Thank you very much for the confirmation!
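For anyone who wants to double-check this on their own material: chaining
the showinfo filter after atadenoise makes ffmpeg log the negotiated pixel
format, so you can see whether the 10bit format survives the filter
(input.mkv is a placeholder):

```shell
# showinfo prints one log line per frame including "fmt:<pixel format>".
# For a 10bit source it should show e.g. yuv420p10le, not an 8bit format.
ffmpeg -i input.mkv -vf atadenoise,showinfo -frames:v 1 -f null - 2>&1 | grep fmt:
```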
By the way ...
I think it would be desirable for the documentation to mention which bit
depths (more precisely: which pixel formats) are supported by the
various filters. Right now, that piece of information seems to be
missing for most of the filters, although it is quite important,
especially now that HDR content is becoming more and more common.
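Until then, the only reliable place to look seems to be the source: each
filter declares its supported pixel formats in its source file. Assuming
you have an FFmpeg source tree checked out, something like this shows the
list for atadenoise:

```shell
# The pix_fmts array in the filter source enumerates what it accepts.
grep AV_PIX_FMT libavfilter/vf_atadenoise.c
```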
Best regards,
Oliver