[Libav-user] Request for Help: How to Retrieve RTP and NTP Timestamps from RTCP SR Reports in FFmpeg
Fusang
s1711880582 at gmail.com
Tue Feb 11 13:31:01 EET 2025
Dear FFmpeg Development Team,
Hello!
I am currently using FFmpeg to parse RTSP streams, and I have run into some
issues I would like your advice on.
I want to extract the presentation timestamp (PTS) of each frame. I have
successfully obtained the RTP packet timestamp (rtptime), but the RTP
timestamp is relative (i.e., the time since the stream started) and is not
mapped to absolute (NTP) time. I therefore need to convert the RTP
timestamp into an absolute NTP timestamp for further processing.
According to the specifications in *RFC 2326* and *RFC 3550*, the *RTCP SR
(Sender Report)* carries a pairing of an *NTP timestamp* with the
corresponding *RTP timestamp*. From that pair (ntp_time, rtp_time) one can
compute the offset needed to convert any RTP timestamp into an absolute NTP
timestamp, which is what I hope to do.
My questions:
1. *How can I retrieve the ntp_time and rtp_time from the RTCP SR report in
FFmpeg?*
   - How can I access these timestamps from the RTCP SR report through
   FFmpeg's AVFormatContext?
   - I understand that the internal RTPDemuxContext (libavformat/rtpdec.h)
   handles RTP and RTCP data, but this structure is not exposed in FFmpeg's
   public API. Is there another way to access this data?
2. *How can I compute the offset between the RTP timestamp and the NTP
timestamp?*
   - I understand that the offset can be calculated from the SR report's
   ntp_time and rtp_time. Is there a way to do this with FFmpeg's existing
   API?
3. *How does FFmpeg handle timestamps when there are no RTCP SR reports?*
   - If the RTSP stream does not contain RTCP SR reports, how does FFmpeg
   handle the RTP timestamps? Since they are relative, how can they be
   converted into absolute time?
Current Implementation:
I am currently using av_read_frame() in FFmpeg to read each RTP packet and
calculate the PTS of each frame (using the stream's time base time_base).
However, since the RTP timestamps in the RTSP stream are relative, I need
to convert them into absolute time. Therefore, I hope to use the NTP
timestamp and RTP timestamp from the RTCP SR report to calculate the offset.
Request:
I would appreciate your help in explaining how to retrieve the timestamps (
ntp_time and rtp_time) from the RTCP SR report and how to perform the RTP
timestamp to NTP timestamp conversion using FFmpeg.
Thank you very much for your time and attention. I look forward to your
response.
Best regards,
Fusang
------------------------------
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/time.h>
}

#include <iostream>
#include <memory>
#include <queue>

class Packet {
public:
    typedef std::shared_ptr<Packet> Ptr;

    Packet() {
        m_packet = av_packet_alloc();
        if (!m_packet) {
            std::cerr << "av_packet_alloc failed" << std::endl;
        }
    }

    ~Packet() {
        // av_packet_free() unreferences the packet and nulls the pointer.
        av_packet_free(&m_packet);
    }

    AVPacket *m_packet = nullptr;
};

class BInterface {
public:
    void read_avpacket_fun();

private:
    std::queue<Packet::Ptr> m_avpacket_list;
};

void BInterface::read_avpacket_fun() {
    // RTSP stream URL
    const char *rtsp_url =
        "rtsp://admin:123456@192.168.30.64:554/Streaming/Channels/101";

    AVFormatContext *format_ctx = nullptr;
    if (avformat_open_input(&format_ctx, rtsp_url, nullptr, nullptr) < 0) {
        std::cerr << "Failed to open RTSP stream." << std::endl;
        return;
    }
    if (avformat_find_stream_info(format_ctx, nullptr) < 0) {
        std::cerr << "Failed to find stream info." << std::endl;
        avformat_close_input(&format_ctx);
        return;
    }

    // Find the video stream
    int video_stream_index = -1;
    for (unsigned int i = 0; i < format_ctx->nb_streams; ++i) {
        if (format_ctx->streams[i]->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) {
            video_stream_index = i;
            break;
        }
    }
    if (video_stream_index == -1) {
        std::cerr << "No video stream found." << std::endl;
        avformat_close_input(&format_ctx);
        return;
    }

    AVStream *video_stream = format_ctx->streams[video_stream_index];
    AVRational time_base = video_stream->time_base; // time base of the video stream

    while (true) {
        Packet::Ptr packet = std::make_shared<Packet>();
        int ret = av_read_frame(format_ctx, packet->m_packet);
        if (ret < 0) {
            std::cerr << "av_read_frame failed, ret(" << ret << ")" << std::endl;
            break;
        }
        // Only handle packets belonging to the video stream
        if (packet->m_packet->stream_index == video_stream_index) {
            int64_t pts = packet->m_packet->pts; // PTS in time_base units
            if (pts != AV_NOPTS_VALUE) {
                double pts_seconds = av_q2d(time_base) * pts;
                std::cout << "PTS: " << pts << std::endl;
                std::cout << "PTS (seconds): " << pts_seconds << std::endl << std::endl;
            } else {
                std::cerr << "Invalid PTS value!" << std::endl;
            }
            m_avpacket_list.push(packet);
        }
    }

    avformat_close_input(&format_ctx);
}

int main() {
    // Initialize FFmpeg networking (needed for RTSP)
    avformat_network_init();

    BInterface bInterface;
    bInterface.read_avpacket_fun();
    return 0;
}