[FFmpeg-user] Encoding and streaming a raw pixel stream via WiFi
Ralf Ramsauer
ralf.ramsauer at oth-regensburg.de
Wed May 17 20:29:52 EEST 2017
Hi,
I'd like to use ffmpeg/ffserver to encode and stream a raw pixel stream.
My intention is to capture a v4l2 cam device, do some magic with opencv,
and stream the result over WiFi with ffserver and RTSP, as the device is
a flying microcopter platform with an unstable WiFi connection.
This is the reason why I'd like to use RTSP over UDP. I decided to pass
the raw pixel stream to ffmpeg via an unnamed pipe. Tricky, but this
keeps all the encoding/streaming stuff out of my C++ code. ffmpeg encodes
the stream and sends it to an ffserver instance running on the same
platform. ffserver binds to the WiFi interface; other computers on the
same network are allowed to watch the RTSP stream.
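Clients would then watch something like this (the IP is just an example;
the port and stream name come from the ffserver.conf below):

mplayer rtsp://192.168.1.1:5554/foo.sdp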
I'm open to better solutions. :)
However, the pixel stream has no fixed frame rate; the app produces as
many frames as possible. IOW, a new frame is sent to stdout as soon as it
is available. Effectively, this results in a frame rate of ~28 fps.
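For scale: one gray frame is 640 * 480 = 307200 bytes, so ~28 fps means
roughly 8.6 MB/s of raw data going through the pipe. The actual rate can
be eyeballed with pv, if you have it installed:

./my_opencv_app | pv > /dev/null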
This is how I invoke ffmpeg:
./my_opencv_app | ffmpeg -f rawvideo -r 30 -pixel_format gray \
    -video_size 640x480 -tune zerolatency -i - http://localhost:8081/foo.ffm
Please find the corresponding ffserver.conf below.
Everything *somehow* works, but not as intended. The video delay
constantly increases, and the stream causes high network load (~1.4 MB/s)
for rather simple, grayscale pictures.
- My opencv app currently produces as many frames as possible. Should it
produce frames at a (more or less) constant frame rate?
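One idea I had (completely untested) is to let ffmpeg do the pacing
instead: time-stamp the frames as they arrive and resample to a constant
rate on output, something like:

./my_opencv_app | ffmpeg -use_wallclock_as_timestamps 1 -f rawvideo \
    -pixel_format gray -video_size 640x480 -i - \
    -r 30 -vsync cfr http://localhost:8081/foo.ffm

But I don't know whether that is a sane approach.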
- Apparently, this approach makes ffmpeg use the mpeg4 codec by
default. This results in high network load, even though my frames are
only grayscale.
So I tried to switch to h26[45]. This broke everything. I get a bunch
of errors and no video stream when I try to watch it with mplayer.
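By "switching" I mean selecting the encoder explicitly, roughly along
these lines (from memory, so the exact flags may be off):

./my_opencv_app | ffmpeg -f rawvideo -r 30 -pixel_format gray \
    -video_size 640x480 -i - -c:v libx264 -preset ultrafast \
    -tune zerolatency http://localhost:8081/foo.ffm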
I also tried to replace my opencv app with a direct webcam stream from
ffmpeg. Same issues.
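(By "direct stream" I mean grabbing the cam with ffmpeg itself, e.g.:

ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 \
    http://localhost:8081/foo.ffm

where /dev/video0 would be my cam.)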
- What are 'recommended' codec settings for my use case? Which codec
would probably be the best for me? (Sorry, I'm not an A/V guy ;) )
- I have to specify the frame rate as an ffmpeg parameter ("-r 30") and
again in ffserver.conf ("VideoFrameRate 30"). Why do I have to specify
it twice? Why do I have to specify a frame rate at all?
- I'd like to keep latency as low as possible. Are there some special
tweaks for achieving low latency with ffmpeg?
- Is h26[45] suitable for streaming with an unstable connection? Is it
robust against random failures?
If it helps, I can push my stuff to some repository.
Anything helps!
Thanks
Ralf
ffserver.conf:
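# ffmpeg delivers the feed via the HTTP port;
# clients pull the stream via RTSP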
HttpPort 8081
RtspPort 5554
HttpBindAddress 0.0.0.0
RTSPBindAddress 0.0.0.0
MaxClients 5
MaxBandwidth 100000
CustomLog -
NoDefaults
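# Feed: where the local ffmpeg instance delivers its output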
<Feed foo.ffm>
File /tmp/foo.ffm
FileMaxSize 1M
ACL allow 127.0.0.1
</Feed>
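# Stream: what clients receive, packetized as RTP and announced via RTSP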
<Stream foo.sdp>
Feed foo.ffm
Format rtp
Noaudio
VideoSize 640x480
VideoFrameRate 30
ACL allow 0.0.0.0
</Stream>