[FFmpeg-user] Issues Live Streaming Audio Only

Brett Garrett brett at edgewaterbroadcasting.com
Thu Nov 1 01:04:38 EET 2018


We actually have been using Icecast for public access on personal computers
and/or mobiles (via an app), just not for our internal private remote
servers. I was under the impression that we were avoiding it because of
bandwidth: that it caused choppy audio and was undesirable, and that
UDP-based distribution was the best/only option. Icecast streams over HTTP,
which is TCP-based, so it didn't seem like an option to even consider. I
asked about Icecast and using it internally, and after discussing it further
I was reminded of the story about why grandma always cut the ends off the
ham. Basically, we are working with old technology that actually breaks when
we try to push an HTTP-based stream through it. I'm trying to modernize
things, and with the new software the TCP-based streaming doesn't just
break. So we will be exploring the new path with Icecast.
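
For reference, here is roughly what I expect that path to look like once we
start testing. This is an untested sketch: the host, port, mount name and
password are placeholders, and I picked libmp3lame only because our Debian
ffmpeg build does not include libfdk_aac.

Relevant bits of /etc/icecast2/icecast.xml on the box that will serve the
stream (the rest of the stock Debian config stays as shipped):

<icecast>
  <authentication>
    <source-password>hackme</source-password>
  </authentication>
  <hostname>192.168.1.119</hostname>
  <listen-socket>
    <port>8000</port>
  </listen-socket>
</icecast>

Push from the JACK capture on the source machine (the same -f jack -i ffmpeg
input I use in the tests quoted below):

/usr/bin/dbus-run-session ffmpeg -f jack -i ffmpeg -ac 2 -c:a libmp3lame \
  -b:a 128k -content_type audio/mpeg -f mp3 \
  icecast://source:hackme@localhost:8000/stream.mp3

Any client on the LAN should then be able to pull it over plain HTTP, e.g.:

ffplay -nodisp http://192.168.1.119:8000/stream.mp3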

Thank you for your suggestion and help.

On Wed, Oct 31, 2018 at 3:46 PM DopeLabs <dopelabs at dubstep.fm> wrote:

> have you considered using something other than ffserver (which is no longer
> supported, if I'm not mistaken), something developed specifically for audio
> transfer over a network?
>
> such as icecast?
>
> ffmpeg can output to an icecast server running on the local machine or
> running on the remote machine
>
>
> source machine:
> ffmpeg -i input -ac 2 -c:a libfdk_aac -ar 48k -b:a 64k -ice_public 0 \
>   -ice_genre Genre -ice_url http://url.tld -ice_description Description \
>   -ice_name Name -content_type audio/aacp -f adts \
>   icecast://source:pass@remote.host.com/mount
>
> remote machine:
> ffmpeg -i http://remote.host/mount
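>
> to actually hear it on the remote machine you would give that ffmpeg an
> output, or just use ffplay instead. a rough sketch, assuming the default
> alsa output device:
>
> ffmpeg -i http://remote.host/mount -f alsa default
>
> or
>
> ffplay -nodisp http://remote.host/mount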
>
>
> > On Oct 31, 2018, at 2:23:15 PM, Brett Garrett
> > <brett at edgewaterbroadcasting.com> wrote:
> >
> > I've been trying to stream an audio feed from a live source using the RTSP
> > protocol. The main goal is to eventually utilize jitter buffers or other
> > techniques to reduce or remove stutters and skips, although at the moment I
> > can't seem to get any audio to transfer over the network at all. I do have
> > a somewhat unique setup, and therefore don't have many other options to
> > work with.
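> >
> > For reference, the receive-side knobs I have in mind for the jitter
> > buffering are roughly these (just a sketch; the values are arbitrary, I
> > haven't gotten far enough to actually test them, and the URL is the stream
> > I set up further down):
> >
> > ffplay -nodisp -max_delay 500000 -reorder_queue_size 500 \
> >     rtsp://192.168.1.119:15151/test1-rtsp.ogg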
> >
> > I'm currently using Debian 9 with no GUI for both the server and the
> > client(s). I would say the biggest hangup is having to use JACK (which sits
> > on top of an ALSA device) for my output, as it is used by several other
> > applications that interact with the audio output. None of those are running
> > for the time being, because I'm still just trying to get the stream from
> > one computer to another across my local network. Here's what I hope is
> > enough information without vomiting too much text.
> >
> > A little info on our audio card specs...
> >
> > root at test-9:~# *aplay -l*
> > **** List of PLAYBACK Hardware Devices ****
> > card 0: M44 [M Audio Delta 44], device 0: ICE1712 multi [ICE1712 multi]
> >  Subdevices: 1/1
> >  Subdevice #0: subdevice #0
> >
> > root at test-9:~# *arecord --dump-hw-params -D hw:0,0*
> > Recording WAVE 'stdin' : Unsigned 8 bit, Rate 8000 Hz, Mono
> > HW Params of device "hw:0,0":
> > --------------------
> > ACCESS:  MMAP_INTERLEAVED RW_INTERLEAVED
> > FORMAT:  S32_LE
> > SUBFORMAT:  STD
> > SAMPLE_BITS: 32
> > FRAME_BITS: 384
> > CHANNELS: 12
> > RATE: [8000 96000]
> > PERIOD_TIME: (20 341250]
> > PERIOD_SIZE: [2 2730]
> > PERIOD_BYTES: [96 131040]
> > PERIODS: [1 1024]
> > BUFFER_TIME: (20 682625]
> > BUFFER_SIZE: [2 5461]
> > BUFFER_BYTES: [96 262128]
> > TICK_TIME: ALL
> > --------------------
> > arecord: set_params:1299: Sample format non available
> > Available formats:
> > - S32_LE
> >
> > Audio comes in through an analog XLR connection. I verified that the audio
> > input is good using the following commands...
> >
> > root at test-9:~# /usr/bin/dbus-run-session ffmpeg -f jack -i ffmpeg -y output.wav
> > root at test-9:~# jack_connect system:capture_1 ffmpeg:input_1 && jack_connect system:capture_2 ffmpeg:input_2
> >
> > I waited a while, then killed the tasks. I grabbed the resulting output.wav
> > and it played back with good audio. However, I need to get the audio from
> > one computer to another as a live feed, not by recording to a single file
> > that will grow to a crazy size. So here's my server config, with the
> > comments stripped out.
> >
> > root at test-9:~# *grep ^[^#] /etc/ffserver.conf*
> > HTTPPort 8585
> > RTSPPort 15151
> > HTTPBindAddress 0.0.0.0
> > MaxHTTPConnections 2000
> > MaxClients 1000
> > MaxBandwidth 1000
> > CustomLog -
> > <Feed feed1.ffm>
> > File /tmp/feed1.ffm
> > FileMaxSize 200K
> > ACL allow 127.0.0.1
> > </Feed>
> > <Stream test1-rtsp.ogg>
> > Format rtp
> > Feed feed1.ffm
> > NoVideo
> > AudioCodec aac
> > AudioChannels 2
> > AudioBitRate 64
> > AudioSampleRate 48000
> > AVOptionAudio flags +global_header
> > </Stream>
> > <Stream stat.html>
> > Format status
> > ACL allow localhost
> > ACL allow 192.168.0.0 192.168.255.255
> > </Stream>
> > <Redirect index.html>
> > URL http://www.ffmpeg.org/
> > </Redirect>
> >
> > root at test-9:~# *ffserver -loglevel debug &*
> > [1] 11631
> > root at test-9:~# ffserver version 3.2.10-1~deb9u1 Copyright (c) 2000-2018
> the
> > FFmpeg developers
> >  built with gcc 6.3.0 (Debian 6.3.0-18) 20170516
> >  configuration: --prefix=/usr --extra-version='1~deb9u1'
> > --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu
> > --incdir=/usr/include/i386-linux-gnu --enable-gpl --disable-stripping
> > --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa
> > --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca
> > --enable-libcdio --enable-libebur128 --enable-libflite
> > --enable-libfontconfig --enable-libfreetype --enable-libfribidi
> > --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg
> > --enable-libopenmpt --enable-libopus --enable-libpulse
> > --enable-librubberband --enable-libshine --enable-libsnappy
> > --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora
> > --enable-libtwolame --enable-libvorbis --enable-libvpx
> --enable-libwavpack
> > --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq
> > --enable-libzvbi --enable-omx --enable-openal --enable-opengl
> --enable-sdl2
> > --enable-libdc1394 --enable-libiec61883 --enable-chromaprint
> > --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
> >  libavutil      55. 34.101 / 55. 34.101
> >  libavcodec     57. 64.101 / 57. 64.101
> >  libavformat    57. 56.101 / 57. 56.101
> >  libavdevice    57.  1.100 / 57.  1.100
> >  libavfilter     6. 65.100 /  6. 65.100
> >  libavresample   3.  1.  0 /  3.  1.  0
> >  libswscale      4.  2.100 /  4.  2.100
> >  libswresample   2.  3.100 /  2.  3.100
> >  libpostproc    54.  1.100 / 54.  1.100
> > Tue Oct 30 13:23:01 2018 [file @ 0xfe65c0]Setting default whitelist
> > 'file,crypto'
> > Tue Oct 30 13:23:01 2018 [ffm @ 0xfe39e0]Using AVStream.codec to pass
> codec
> > parameters to muxers is deprecated, use AVStream.codecpar instead.
> > Tue Oct 30 13:23:01 2018 writing recommended configuration:
> > ac=2,b=64000,ar=48000,flags=+global_header
> > Tue Oct 30 13:23:01 2018 [AVIOContext @ 0xfe6660]Statistics: 0 seeks, 1
> > writeouts
> > Tue Oct 30 13:23:01 2018 FFserver started.
> >
> >
> > root at test-9:~# */usr/bin/dbus-run-session ffmpeg -f jack -i ffmpeg http://127.0.0.1:8585/feed1.ffm*
> > ffmpeg version 3.2.10-1~deb9u1 Copyright (c) 2000-2018 the FFmpeg
> developers
> >  built with gcc 6.3.0 (Debian 6.3.0-18) 20170516
> >  configuration: --prefix=/usr --extra-version='1~deb9u1'
> > --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu
> > --incdir=/usr/include/i386-linux-gnu --enable-gpl --disable-stripping
> > --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa
> > --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca
> > --enable-libcdio --enable-libebur128 --enable-libflite
> > --enable-libfontconfig --enable-libfreetype --enable-libfribidi
> > --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg
> > --enable-libopenmpt --enable-libopus --enable-libpulse
> > --enable-librubberband --enable-libshine --enable-libsnappy
> > --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora
> > --enable-libtwolame --enable-libvorbis --enable-libvpx
> --enable-libwavpack
> > --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq
> > --enable-libzvbi --enable-omx --enable-openal --enable-opengl
> --enable-sdl2
> > --enable-libdc1394 --enable-libiec61883 --enable-chromaprint
> > --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
> >  libavutil      55. 34.101 / 55. 34.101
> >  libavcodec     57. 64.101 / 57. 64.101
> >  libavformat    57. 56.101 / 57. 56.101
> >  libavdevice    57.  1.100 / 57.  1.100
> >  libavfilter     6. 65.100 /  6. 65.100
> >  libavresample   3.  1.  0 /  3.  1.  0
> >  libswscale      4.  2.100 /  4.  2.100
> >  libswresample   2.  3.100 /  2.  3.100
> >  libpostproc    54.  1.100 / 54.  1.100
> > Cannot connect to server socket err = No such file or directory
> > Cannot connect to server request channel
> > jackdmp 1.9.11
> > Copyright 2001-2005 Paul Davis and others.
> > Copyright 2004-2014 Grame.
> > jackdmp comes with ABSOLUTELY NO WARRANTY
> > This is free software, and you are welcome to redistribute it
> > under certain conditions; see the file COPYING for details
> > no message buffer overruns
> > no message buffer overruns
> > no message buffer overruns
> > JACK server starting in realtime mode with priority 10
> > self-connect-mode is "Don't restrict self connect requests"
> > audio_reservation_init
> > Acquire audio card Audio0
> > creating alsa driver ... hw:0|hw:0|1024|2|48000|0|0|nomon|swmeter|-|32bit
> > configuring for 48000Hz, period = 1024 frames (21.3 ms), buffer = 2
> periods
> > ALSA: final selected sample format for capture: 32bit integer
> little-endian
> > ALSA: use 2 periods for capture
> > ALSA: final selected sample format for playback: 32bit integer
> little-endian
> > ALSA: use 2 periods for playback
> > [jack @ 0x2494700] JACK client registered and activated (rate=48000Hz,
> > buffer_size=1024 frames)
> > Guessed Channel Layout for Input Stream #0.0 : stereo
> > Input #0, jack, from 'ffmpeg':
> >  Duration: N/A, start: 1540927425.470458, bitrate: 3072 kb/s
> >    Stream #0:0: Audio: pcm_f32le, 48000 Hz, stereo, flt, 3072 kb/s
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xfe39e0]Opening '/tmp/feed1.ffm' for
> > reading
> > Tue Oct 30 13:23:45 2018 [file @ 0xfe46c0]Setting default whitelist
> > 'file,crypto'
> > Tue Oct 30 13:23:45 2018 [ffm @ 0xfe39e0]Format ffm probed with size=2048
> > and score=101
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'ac' to
> > value '2'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'b' to
> > value '64000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'ar' to
> > value '48000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'flags'
> to
> > value '+global_header'
> > Tue Oct 30 13:23:45 2018 writing recommended configuration:
> > ac=2,b=64000,ar=48000,flags=+global_header
> > Tue Oct 30 13:23:45 2018 127.0.0.1 - - [GET] "/feed1.ffm HTTP/1.1" 200
> 4175
> > Tue Oct 30 13:23:45 2018 [AVIOContext @ 0xfe6720]Statistics: 4096 bytes
> > read, 0 seeks
> > Output #0, ffm, to 'http://127.0.0.1:8585/feed1.ffm':
> >  Metadata:
> >    creation_time   : now
> >    encoder         : Lavf57.56.101
> >    Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 64 kb/s
> >    Metadata:
> >      encoder         : Lavc57.64.101 aac
> > Stream mapping:
> >  Stream #0:0 -> #0:0 (pcm_f32le (native) -> aac (native))
> > Press [q] to stop, [?] for help
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'b' to
> > value '64000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'ab' to
> > value '64000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'flags'
> to
> > value '0x00400000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'ar' to
> > value '48000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'ac' to
> > value '2'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key
> > 'frame_size' to value '1024'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key
> 'profile'
> > to value '1'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key
> > 'channel_layout' to value '3'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key
> > 'time_base' to value '1/48000'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key 'delay'
> to
> > value '1024'
> > Tue Oct 30 13:23:45 2018 [NULL @ 0xff0f20]Setting entry with key
> > 'pkt_timebase' to value '1/1000000'
> >
> > The changing text at the bottom makes it appear like something's happening.
> > There is only a slow bit transfer because I have not yet connected the jack
> > inputs/outputs.
> >
> > size=      72kB time=00:01:07.08 bitrate=   8.8kbits/s speed=   1x
> >
> > root at test-9:~# *jack_connect system:capture_1 ffmpeg:input_1 &&
> > jack_connect system:capture_2 ffmpeg:input_2 && jack_lsp -c*
> > system:capture_1
> >   ffmpeg:input_1
> > system:capture_2
> >   ffmpeg:input_2
> > system:capture_3
> > system:capture_4
> > system:capture_5
> > system:capture_6
> > system:capture_7
> > system:capture_8
> > system:capture_9
> > system:capture_10
> > system:capture_11
> > system:capture_12
> > system:playback_1
> > system:playback_2
> > system:playback_3
> > system:playback_4
> > system:playback_5
> > system:playback_6
> > system:playback_7
> > system:playback_8
> > system:playback_9
> > system:playback_10
> > ffmpeg:input_1
> >   system:capture_1
> > ffmpeg:input_2
> >   system:capture_2
> > size=     800kB time=00:03:31.24 bitrate=  31.0kbits/s speed=   1x
> >
> > The bitrate goes up, so I'm assuming this means it is receiving the audio
> > alright and that everything's in place. When I go to
> > http://192.168.1.119:8585/stat.html, the page comes up with...
> >
> > [see attached picture]
> >
> > Time to see if we can get the stream from another computer...
> >
> > root at test-17:~# ffplay -nodisp rtsp://192.168.1.119:15151/test1-rtsp.ogg
> > ffplay version 3.2.12-1~deb9u1 Copyright (c) 2003-2018 the FFmpeg
> developers
> >  built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
> >  configuration: --prefix=/usr --extra-version='1~deb9u1'
> > --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu
> > --incdir=/usr/include/i386-linux-gnu --enable-gpl --disable-stripping
> > --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa
> > --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca
> > --enable-libcdio --enable-libebur128 --enable-libflite
> > --enable-libfontconfig --enable-libfreetype --enable-libfribidi
> > --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg
> > --enable-libopenmpt --enable-libopus --enable-libpulse
> > --enable-librubberband --enable-libshine --enable-libsnappy
> > --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora
> > --enable-libtwolame --enable-libvorbis --enable-libvpx
> --enable-libwavpack
> > --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq
> > --enable-libzvbi --enable-omx --enable-openal --enable-opengl
> --enable-sdl2
> > --enable-libdc1394 --enable-libiec61883 --enable-chromaprint
> > --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
> >  libavutil      55. 34.101 / 55. 34.101
> >  libavcodec     57. 64.101 / 57. 64.101
> >  libavformat    57. 56.101 / 57. 56.101
> >  libavdevice    57.  1.100 / 57.  1.100
> >  libavfilter     6. 65.100 /  6. 65.100
> >  libavresample   3.  1.  0 /  3.  1.  0
> >  libswscale      4.  2.100 /  4.  2.100
> >  libswresample   2.  3.100 /  2.  3.100
> >  libpostproc    54.  1.100 / 54.  1.100
> >    nan    :  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0
> >
> > Then it just sits there. No audio, but that's not surprising, since
> > normally I'd have to pipe the audio out through JACK (not doable with
> > ffmpeg, to my knowledge). So I re-enabled the integrated audio to see if it
> > might automatically send the audio out to that device, but I get the same
> > output noted above and still no audio. When I run the same command on the
> > server itself, I get the exact same output on the new ssh prompt I opened
> > up, even if I change the IP address to 'localhost'. However, on the ssh
> > prompt where I started ffserver, I get the following output along with the
> > constantly updating bitrate...
> >
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128cfe0]Opening '/tmp/feed1.ffm' for
> > reading
> > Wed Oct 30 14:37:35 2018 [file @ 0x127be20]Setting default whitelist
> > 'file,crypto'
> > Wed Oct 30 14:37:35 2018 [ffm @ 0x128cfe0]Format ffm probed with
> size=2048
> > and score=101
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128e8a0]Setting entry with key
> 'strict'
> > to value '-2'
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128e8a0]Setting entry with key 'ac' to
> > value '2'
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128e8a0]Setting entry with key 'b' to
> > value '128000'
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128e8a0]Setting entry with key 'ar' to
> > value '48000'
> > Wed Oct 30 14:37:35 2018 [NULL @ 0x128e8a0]Setting entry with key 'flags'
> > to value '+global_header'
> > Wed Oct 30 14:37:35 2018 [rtp @ 0x128f440]No default whitelist set
> > Wed Oct 30 14:37:35 2018 [udp @ 0x1290c20]No default whitelist set
> > Wed Oct 30 14:37:35 2018 [udp @ 0x1290d60]No default whitelist set
> > Wed Oct 30 14:37:35 2018 192.168.1.119:31830 - - "PLAY
> > test1-rtsp.ogg/streamid=0 RTP/UDP"
> > Wed Oct 30 14:37:35 2018 Failed to parse interval end specification ''
> >
> > I couldn't find much about that final line, "Failed to parse interval end
> > specification ''", or about a way to fix it. I've tried adding
> > -analyzeduration and -probesize both on the ffmpeg command that uploads the
> > audio to the feed and on the ffplay command that should play the feed. They
> > did absolutely nothing to change anything.
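> >
> > For illustration, the way I was passing them looked roughly like this (the
> > values were more or less arbitrary):
> >
> > ffplay -nodisp -probesize 32768 -analyzeduration 1000000 \
> >     rtsp://192.168.1.119:15151/test1-rtsp.ogg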
> >
> > I'm pretty sure it's not firewall related. ufw and iptables are not
> > installed.
> > root at test-9:~# iptables
> > -su: iptables: command not found
> > root at test-9:~# ufw
> > -su: ufw: command not found
> >
> > It feels like I'm missing some config option somewhere, but I can't figure
> > out where. Almost everything I find is an example of someone setting up a
> > stream with a webcam or some other video-related format, and very little
> > deals with audio only; nothing has worked for me so far. I'm open to any
> > and all suggestions. Thank you for your time.
> >
> > --
> > Brett
> > <rtsp server status.PNG>
>
> _______________________________________________
> ffmpeg-user mailing list
> ffmpeg-user at ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-user
>
> To unsubscribe, visit link above, or email
> ffmpeg-user-request at ffmpeg.org with subject "unsubscribe".



-- 
Brett

