[FFmpeg-user] How does av_read_frame() work?

Keestu Pillo get2jils at gmail.com
Wed Feb 12 09:36:57 CET 2014


Hi, I am trying FFmpeg 2.1.3 with SDL2 on Android and want to make sure I
understand a few things.
Kindly correct me if I am wrong.

NOTE: The device is sending frames in yuv420p format over the RTSP protocol.



1)  av_read_frame(): I understand that it reads one packet, and that a frame
is a collection of packets.

2) We cannot get the picture to the UI until a complete frame has been
received, which is indicated by the third parameter (got_picture_ptr) of
avcodec_decode_video2().

3) Once I see that an entire frame has been received, I do not do any
scaling [i.e. converting into a different format], as in SDL I created the
texture with
   SDL_CreateTexture(renderer, SDL_PIXELFORMAT_IYUV,
SDL_TEXTUREACCESS_STREAMING, 1280, 720);
(see the loop sketch after this list).


4) The frames are now displayed on the Android phone, but the picture
flickers and is not clear.
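
To make points 1) to 4) concrete, this is roughly the read/decode/display
loop I have in mind (a simplified sketch; fmt_ctx, codec_ctx,
video_stream_index, renderer and texture are assumed to be opened and
created elsewhere, and error handling is left out):

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include "SDL.h"

    /* Assumed to exist already:
     *   fmt_ctx            - AVFormatContext opened on the rtsp:// URL
     *   codec_ctx          - AVCodecContext of the video stream
     *   video_stream_index - index of the video stream in fmt_ctx
     *   renderer, texture  - SDL renderer and the IYUV streaming texture (1280x720)
     */
    static void decode_and_display(AVFormatContext *fmt_ctx,
                                   AVCodecContext *codec_ctx,
                                   int video_stream_index,
                                   SDL_Renderer *renderer, SDL_Texture *texture)
    {
        AVPacket packet;
        AVFrame *frame = av_frame_alloc();
        int got_picture = 0;

        /* av_read_frame() returns one demuxed packet per call. */
        while (av_read_frame(fmt_ctx, &packet) >= 0) {
            if (packet.stream_index == video_stream_index) {
                /* The third argument is set when a complete picture is ready. */
                avcodec_decode_video2(codec_ctx, frame, &got_picture, &packet);
                if (got_picture) {
                    /* yuv420p maps directly onto the IYUV texture, so no sws_scale. */
                    SDL_UpdateYUVTexture(texture, NULL,
                                         frame->data[0], frame->linesize[0],
                                         frame->data[1], frame->linesize[1],
                                         frame->data[2], frame->linesize[2]);
                    SDL_RenderClear(renderer);
                    SDL_RenderCopy(renderer, texture, NULL, NULL);
                    SDL_RenderPresent(renderer);
                }
            }
            av_free_packet(&packet);
        }
        av_frame_free(&frame);
    }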


Does av_read_frame() read from the server, or does it read data that the
server has already pushed?

For example, if the RTSP server is currently playing its 100th frame, do we
also get the 100th frame when we invoke av_read_frame(), even if we delayed
for some time after reading the 10th frame with av_read_frame()?
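
To check this myself, I was thinking of logging the packet timestamps
against the wall clock while deliberately reading slower than real time, so
I can see whether packets keep arriving at the stream's own pace or only
when I call av_read_frame() (a rough sketch, using the same fmt_ctx and
video_stream_index as above):

    #include <libavformat/avformat.h>
    #include <libavutil/time.h>

    static void probe_read_pacing(AVFormatContext *fmt_ctx, int video_stream_index)
    {
        AVPacket packet;
        int64_t start = av_gettime(); /* wall clock in microseconds */

        while (av_read_frame(fmt_ctx, &packet) >= 0) {
            if (packet.stream_index == video_stream_index &&
                packet.pts != AV_NOPTS_VALUE) {
                AVRational tb = fmt_ctx->streams[video_stream_index]->time_base;
                double stream_sec = packet.pts * av_q2d(tb);
                double wall_sec   = (av_gettime() - start) / 1000000.0;
                av_log(NULL, AV_LOG_INFO, "packet pts %.3f s, wall clock %.3f s\n",
                       stream_sec, wall_sec);
            }
            av_free_packet(&packet);
            av_usleep(100000); /* deliberately read slower than real time */
        }
    }

My expectation is that if the pts values fall further and further behind the
wall clock, the packets are being queued up rather than skipped, but I am
not sure.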


What do I need to take care of? Kindly provide a hint, as I don't see much
information on the internet.

