[FFmpeg-user] How does ffmpeg calculate bitrate?

Robert Krüger krueger at lesspain.de
Mon Jul 14 17:11:29 CEST 2014


When specifying bitrate limits using the minrate/maxrate arguments, how does
ffmpeg measure the bitrate in its rate-control algorithm? Is it the
bitrate as a GOP average, i.e. sum(packet sizes of GOP) / GOP duration,
or simply packet size / packet duration, that ffmpeg attempts to keep
within the limits? The latter would seem odd for non-I-frame-only
material, as an I-frame of a certain quality is typically (depending on
the scene content, I know) much larger than the following P- or B-frames
of the same quality. If the answer is "it depends on the codec", then I
would like to know how it works for the MPEG-style codecs.
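For concreteness, the two interpretations can be contrasted with a small sketch (plain Python, made-up packet sizes; this only illustrates the two measurements named in the question, it is not ffmpeg's actual rate-control code):

```python
# Hypothetical GOP: one large I-frame followed by smaller P-frames,
# each packet given as (size_bytes, duration_seconds, is_keyframe).
# Sizes and frame rate (25 fps) are invented for illustration.
packets = [
    (60000, 1 / 25, True),   # I-frame: much larger at the same quality
    (8000,  1 / 25, False),  # P-frame
    (8000,  1 / 25, False),
    (8000,  1 / 25, False),
    (8000,  1 / 25, False),
]

def per_packet_bitrates(packets):
    """Interpretation 1: packet size / packet duration, in bits/s."""
    return [size * 8 / dur for size, dur, _ in packets]

def gop_average_bitrate(packets):
    """Interpretation 2: sum(packet sizes of GOP) / GOP duration, in bits/s."""
    total_bits = sum(size * 8 for size, _, _ in packets)
    total_dur = sum(dur for _, dur, _ in packets)
    return total_bits / total_dur

rates = per_packet_bitrates(packets)
print(f"I-frame instantaneous rate: {rates[0]:,.0f} bit/s")   # ~12 Mbit/s
print(f"P-frame instantaneous rate: {rates[1]:,.0f} bit/s")   # ~1.6 Mbit/s
print(f"GOP-average rate:           {gop_average_bitrate(packets):,.0f} bit/s")
```

The numbers show why the per-packet reading would seem odd: the I-frame's instantaneous rate is several times the GOP average, so a maxrate enforced packet-by-packet would punish I-frames of perfectly reasonable size.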

Thanks for any insights into this,
