[Libav-user] H.264 decoding on Haswell CPU/Xubuntu 14.04
Prashanth Bhat
prashanth.bhat at yahoo.com
Mon Aug 11 13:10:02 CEST 2014
I'm using libavcodec to decode H.264 frames. I'm on Linux (Xubuntu 14.04) with an Intel Haswell (Pentium-grade) CPU. My program decodes the frames without rendering them on screen. With 4 simultaneous 1080p decodes at 15 fps, the CPU utilization is around 90% (not bad), but the load average shown by 'top' is 20+, which looks excessive. I don't think I'm using the hardware decoding ability of the Haswell CPU. Could someone please advise how to find out whether I'm actually using the hardware decoder? The API calls I'm making are pretty standard -
During initialization:

    avcodec_register_all();
    avcodec_alloc_context3(NULL);
    avcodec_find_decoder(AV_CODEC_ID_H264);
    av_parser_init(AV_CODEC_ID_H264);
    avcodec_open2();

On each frame:

    av_parser_parse2();
    avcodec_decode_video2();
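Put together, the decode path looks roughly like this (a simplified sketch of my actual code, with most error handling stripped; the function names are just placeholders):

    #include <libavcodec/avcodec.h>

    static AVCodecContext       *ctx;
    static AVCodecParserContext *parser;

    /* Initialization, done once */
    static int init_decoder(void)
    {
        AVCodec *codec;

        avcodec_register_all();

        ctx    = avcodec_alloc_context3(NULL);
        codec  = avcodec_find_decoder(AV_CODEC_ID_H264);
        parser = av_parser_init(AV_CODEC_ID_H264);
        if (!ctx || !codec || !parser)
            return -1;

        return avcodec_open2(ctx, codec, NULL);
    }

    /* Called with the raw bytes of each frame */
    static void decode_bytes(const uint8_t *data, int size)
    {
        while (size > 0) {
            uint8_t *out;
            int out_size;
            int len = av_parser_parse2(parser, ctx, &out, &out_size,
                                       data, size, AV_NOPTS_VALUE,
                                       AV_NOPTS_VALUE, 0);
            data += len;
            size -= len;

            if (out_size > 0) {
                AVFrame *frame = av_frame_alloc();
                AVPacket pkt;
                int got_frame = 0;

                av_init_packet(&pkt);
                pkt.data = out;
                pkt.size = out_size;

                avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
                /* when got_frame != 0, 'frame' holds a decoded picture */
                av_frame_free(&frame);
            }
        }
    }

The following points are probably relevant -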
a) The default context allocated by avcodec_alloc_context3() does not have any hwaccel associated with it. I allocated an h264_vaapi accelerator, but this doesn't lead to any improvement (the snippet after this list shows how I'm checking what's available).
b) The codec's capabilities do not have the HW_ACCEL bit set.
c) The same code worked better on Ubuntu 11.10 with a Sandy Bridge Pentium-grade CPU and FFmpeg 0.9.
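For reference, this is what I'm using to check which hwaccels are compiled into my libavcodec build (my understanding, possibly wrong, is that even then the decoder only engages VA-API if the get_format callback returns the VA-API pixel format and hwaccel_context points to a filled-in struct vaapi_context from libavcodec/vaapi.h):

    #include <stdio.h>
    #include <libavcodec/avcodec.h>

    /* Print every hwaccel compiled into this libavcodec build; if
     * "h264_vaapi" is not listed here, no amount of context setup
     * will enable it. */
    static void list_hwaccels(void)
    {
        AVHWAccel *hw = NULL;

        avcodec_register_all();          /* also registers hwaccels */
        while ((hw = av_hwaccel_next(hw)))
            printf("hwaccel: %s (codec id %d, pix_fmt %d)\n",
                   hw->name, (int)hw->id, (int)hw->pix_fmt);
    }

On the system side, I believe 'vainfo' can confirm whether the VA-API driver itself is set up correctly.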
Any help would be appreciated.
Thanks,
Prashanth