[FFmpeg-devel] [PATCH v2] avcodec/nvenc: High bit depth encoding for HEVC

Timo Rothenpieler timo at rothenpieler.org
Thu Apr 25 01:43:12 EEST 2024


On 19.04.2024 10:38, Diego Felix de Souza via ffmpeg-devel wrote:
> From: Diego Felix de Souza <ddesouza at nvidia.com>
> 
> Add 10-bit encoding support for HEVC when the input is 8-bit. For
> 8-bit input content, NVENC performs an internal CUDA 8-to-10-bit
> conversion of the input prior to encoding. Until now, only AV1
> supported encoding 8-bit content as 10 bit.
> 
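For anyone wanting to try this, a minimal usage sketch (file names are
placeholders, and it assumes FFmpeg was built against an SDK new enough
that NVENC_HAVE_NEW_BIT_DEPTH_API is defined, since the option is only
compiled in then):

    ffmpeg -i input.mp4 -c:v hevc_nvenc -highbitdepth 1 output.mp4

The main10 profile is then forced automatically, so no explicit
-profile:v is needed.
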
> Signed-off-by: Diego Felix de Souza <ddesouza at nvidia.com>
> ---
>   libavcodec/nvenc.c      | 10 +++++-----
>   libavcodec/nvenc_hevc.c |  3 +++
>   2 files changed, 8 insertions(+), 5 deletions(-)
> 
> diff --git a/libavcodec/nvenc.c b/libavcodec/nvenc.c
> index 794174a53f..e48224347d 100644
> --- a/libavcodec/nvenc.c
> +++ b/libavcodec/nvenc.c
> @@ -514,7 +514,7 @@ static int nvenc_check_capabilities(AVCodecContext *avctx)
>       }
> 
>       ret = nvenc_check_cap(avctx, NV_ENC_CAPS_SUPPORT_10BIT_ENCODE);
> -    if (IS_10BIT(ctx->data_pix_fmt) && ret <= 0) {
> +    if ((IS_10BIT(ctx->data_pix_fmt) || ctx->highbitdepth) && ret <= 0) {
>           av_log(avctx, AV_LOG_WARNING, "10 bit encode not supported\n");
>           return AVERROR(ENOSYS);
>       }
> @@ -1420,8 +1420,8 @@ static av_cold int nvenc_setup_hevc_config(AVCodecContext *avctx)
>           break;
>       }
> 
> -    // force setting profile as main10 if input is 10 bit
> -    if (IS_10BIT(ctx->data_pix_fmt)) {
> +    // force setting profile as main10 if input is 10 bit or if it should be encoded as 10 bit
> +    if (IS_10BIT(ctx->data_pix_fmt) || ctx->highbitdepth) {
>           cc->profileGUID = NV_ENC_HEVC_PROFILE_MAIN10_GUID;
>           avctx->profile = AV_PROFILE_HEVC_MAIN_10;
>       }
> @@ -1435,8 +1435,8 @@ static av_cold int nvenc_setup_hevc_config(AVCodecContext *avctx)
>       hevc->chromaFormatIDC = IS_YUV444(ctx->data_pix_fmt) ? 3 : 1;
> 
>   #ifdef NVENC_HAVE_NEW_BIT_DEPTH_API
> -    hevc->inputBitDepth = hevc->outputBitDepth =
> -        IS_10BIT(ctx->data_pix_fmt) ? NV_ENC_BIT_DEPTH_10 : NV_ENC_BIT_DEPTH_8;
> +    hevc->inputBitDepth = IS_10BIT(ctx->data_pix_fmt) ? NV_ENC_BIT_DEPTH_10 : NV_ENC_BIT_DEPTH_8;
> +    hevc->outputBitDepth = (IS_10BIT(ctx->data_pix_fmt) || ctx->highbitdepth) ? NV_ENC_BIT_DEPTH_10 : NV_ENC_BIT_DEPTH_8;
>   #else
>       hevc->pixelBitDepthMinus8 = IS_10BIT(ctx->data_pix_fmt) ? 2 : 0;
>   #endif
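
Just to spell out the resulting behaviour of this hunk, as I read it:

    input format         highbitdepth   inputBitDepth        outputBitDepth
    8 bit  (e.g. nv12)   0              NV_ENC_BIT_DEPTH_8   NV_ENC_BIT_DEPTH_8
    8 bit  (e.g. nv12)   1              NV_ENC_BIT_DEPTH_8   NV_ENC_BIT_DEPTH_10
    10 bit (e.g. p010)   0 or 1         NV_ENC_BIT_DEPTH_10  NV_ENC_BIT_DEPTH_10

i.e. with 8-bit input and the option enabled, the 8-to-10 bit conversion
happens inside NVENC rather than in a format filter beforehand.
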
> diff --git a/libavcodec/nvenc_hevc.c b/libavcodec/nvenc_hevc.c
> index b949cb1bd7..d54e5f2512 100644
> --- a/libavcodec/nvenc_hevc.c
> +++ b/libavcodec/nvenc_hevc.c
> @@ -183,6 +183,9 @@ static const AVOption options[] = {
>       { "fullres",      "Two Pass encoding is enabled where first Pass is full resolution",
>                                                               0,                    AV_OPT_TYPE_CONST, { .i64 = NV_ENC_TWO_PASS_FULL_RESOLUTION },    0,                          0,                               VE, .unit = "multipass" },
>   #endif
> +#ifdef NVENC_HAVE_NEW_BIT_DEPTH_API
> +    { "highbitdepth", "Enable 10 bit encode for 8 bit input", OFFSET(highbitdepth), AV_OPT_TYPE_BOOL,  { .i64 = 0 }, 0, 1, VE },
> +#endif
>   #ifdef NVENC_HAVE_LDKFS
>       { "ldkfs",        "Low delay key frame scale; Specifies the Scene Change frame size increase allowed in case of single frame VBV and CBR",
>                                                               OFFSET(ldkfs),        AV_OPT_TYPE_INT,   { .i64 = 0 }, 0, UCHAR_MAX, VE },
> --
> 2.34.1
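
The new option can also be set through the normal libavcodec options
mechanism. A minimal sketch for reference (error handling trimmed,
resolution and time base are just placeholders):

    #include <libavcodec/avcodec.h>
    #include <libavutil/dict.h>

    /* Open hevc_nvenc with 8-bit input frames but request a 10-bit encode. */
    static int open_hbd_encoder(AVCodecContext **penc)
    {
        const AVCodec *codec = avcodec_find_encoder_by_name("hevc_nvenc");
        AVCodecContext *enc  = avcodec_alloc_context3(codec);
        AVDictionary *opts   = NULL;
        int ret;

        enc->width     = 1920;
        enc->height    = 1080;
        enc->time_base = (AVRational){1, 25};
        enc->pix_fmt   = AV_PIX_FMT_YUV420P;        /* 8-bit input */

        av_dict_set(&opts, "highbitdepth", "1", 0); /* option added by this patch */

        ret = avcodec_open2(enc, codec, &opts);     /* profile is forced to main10 */
        av_dict_free(&opts);

        *penc = enc;
        return ret;
    }

The resulting bitstream is Main 10 even though the frames handed in are
8 bit; the 8-to-10 bit conversion is done by NVENC on the GPU, as
described in the commit message.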

applied, thanks

