[Ffmpeg-devel-irc] ffmpeg.log.20191102

burek burek at teamnet.rs
Sun Nov 3 03:05:02 EET 2019


[00:00:59 CET] <phobosoph> hi
[00:01:04 CET] <phobosoph> "Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 1920x1080, 30 fps, 30 tbr, 90k tbn, 60 tbc"
[00:01:08 CET] <phobosoph> looks nice + youtube live friendly
[00:01:12 CET] <phobosoph> except the yuvj420p thing
[00:01:30 CET] <phobosoph> I could use a raspberry (with good cooling) and hardware accelerated ffmpeg to re-encode to other colour format
[00:01:38 CET] <phobosoph> hm, or does youtube live do this already for all its users?
[00:09:33 CET] <nicolas17> youtube video upload will reencode whatever you send even if it's already compatible
[00:09:39 CET] <nicolas17> not sure about live
[00:17:35 CET] <phobosoph> nicolas17: pro question: is it possible to ffprobe the youtube live stream, too? or another way for finding out whether it is yuvj420p?
[00:17:38 CET] <phobosoph> :)
[00:17:40 CET] <phobosoph> that would be so cool
[00:18:07 CET] <nicolas17> youtube-dl --get-url https://youtube.com/...
[00:18:12 CET] <nicolas17> then pass the result to ffprobe
[00:18:21 CET] <nicolas17> I think I never tried it on live ... might work
[00:18:34 CET] <phobosoph> oh cool
[00:18:40 CET] <phobosoph> pass result to ffprobe, so the downloaded file
[00:18:40 CET] <phobosoph> ok
[00:19:09 CET] <nicolas17> oh with --get-url it gives you the video URL instead of downloading anything
[00:19:54 CET] <nicolas17> I guess downloading and passing the file to ffprobe would work too ^^
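A rough sketch of that two-step check (the watch URL is a placeholder, and youtube-dl may return more than one URL, in which case its -f option can pin a single format down):
    url=$(youtube-dl --get-url "https://www.youtube.com/watch?v=XXXXXXXXXXX")
    ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 "$url"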
[00:25:10 CET] <phobosoph> nicolas17: alright, installing youtube-dl from source (master) because apparently older versions have issues with the latest youtube
[00:25:20 CET] <furq> pretty sure live won't work with yuvj
[00:25:29 CET] <furq> there's an easy way to find out though
[00:27:26 CET] <phobosoph> furq: how? the stream runs currently, but I don't know what yuvj thing it got
[00:29:40 CET] <phobosoph> furq: is there another way except using youtube-dl?
[00:30:12 CET] <nicolas17> stream yuvj live and see if it works on the browser? :)
[00:30:35 CET] <furq> yeah that
[00:35:21 CET] <phobosoph> it works nicely in chrome and on my smartphone
[00:36:54 CET] <phobosoph> so youtube-dl can download that stream thing
[00:36:55 CET] <phobosoph> hm
[00:37:07 CET] <phobosoph> but ffprobe is unhappy because youtube-dl can't finish because well, the stream doesn't finish
[00:37:35 CET] <phobosoph> ah, I found it
[00:37:50 CET] <phobosoph> Stream #0:1: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 30 fps, 30 tbr, 90k tbn, 60 tbc
[00:37:56 CET] <phobosoph> hurra
[00:38:03 CET] <phobosoph> youtube live converts it to yuv420p!!!
[00:38:18 CET] <phobosoph> no need for re-encoding it (though I plan to do this with hardware acceleration on my raspberry, just for being able to)
[00:38:20 CET] <phobosoph> good
[00:38:35 CET] <furq> the hwenc on the rpi is really bad
[00:38:45 CET] <furq> so you probably shouldn't do that
[00:39:51 CET] <phobosoph> furq: it isn't necessary :D
[00:39:51 CET] <furq> at least the 0-3 quality is terrible, idk what the 4 is like
[00:40:08 CET] <phobosoph> furq: 0-3 quality from the raspy hw encoder?
[00:40:18 CET] <Banana51> I am looking at this class in python that is using ffmpeg with a source file specified (https://github.com/imayhaveborkedit/discord.py/blob/voice-recv-mk2/discord/player.py#L163) - How can I use ffmpeg to fetch audio from the system microphone? I have tried with "ffmpeg -f alsa -i hw:0 -ar 48000 -ac 2 -loglevel warning", but it wants me to specify an output file. However, I just want it to be streamed
[00:40:20 CET] <Banana51> like an input file (see link)
[00:40:27 CET] <furq> the 4 has a new hwenc block which does hevc
[00:40:37 CET] <phobosoph> raspberry 4?
[00:40:39 CET] <furq> yeah
[00:40:41 CET] <phobosoph> ah
[00:40:44 CET] <furq> no idea if the h264 quality got any better
[00:40:47 CET] <furq> it can't have got any worse
[00:41:10 CET] <phobosoph> mine is a raspby 3 :/  well
[00:41:23 CET] <phobosoph> currently I use a vps without traffic issues for now
[00:41:32 CET] <phobosoph> but a raspby could remove the need for it
[00:42:29 CET] <phobosoph> hm, nvidia jetson (120 bucks) could do it, too
[00:42:34 CET] <phobosoph> well, if it supports it
[00:42:37 CET] <phobosoph> but that would be awesomely fast
[00:43:35 CET] <JEEB> IIRC nvidia decided with that stuff that using standard APIs in a standard way is not fun
[00:43:50 CET] <JEEB> so it has v4l2 hwdec for stuff, but you have to utilize some custom nvidia lib in the middle?
[00:44:02 CET] <JEEB> which would be less bad if it was open source. it isn't
[00:45:02 CET] <Banana51> Seems like I have to use pipe:1
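A minimal sketch of that approach: capture the default ALSA device and write raw PCM to stdout (pipe:1) instead of a file; hw:0 and the s16le output format are assumptions that may need adjusting:
    ffmpeg -f alsa -i hw:0 -ar 48000 -ac 2 -f s16le -loglevel warning pipe:1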
[00:45:18 CET] <JEEB> I think someone was interested in implementing the jetson stuff on -devel, but I had to stop his description of it after I heard "linking against nvidia lib"
[04:22:50 CET] <Adcock> What ffmpeg command will show me what hardware acceleration capabilities I have on my computer?
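For reference, ffmpeg can list the hardware acceleration methods it was built with, and grepping the encoder list is a rough complementary check (the grep pattern is just an example):
    ffmpeg -hwaccels
    ffmpeg -encoders | grep -i -E 'nvenc|qsv|vaapi|amf|videotoolbox'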
[05:19:25 CET] <TheDcoder> Hi, is it possible to process multiple files with ffprobe?
[05:20:01 CET] <TheDcoder> I tried passing multiple files in the command but I got an error about another file already being specified
[05:21:11 CET] <TheDcoder> Currently I am using a script which uses ffprobe to process a few hundred video files in a loop, this process is very slow and it is just supposed to get the dimensions of each video
[05:21:56 CET] <TheDcoder> So I was thinking I can speed this up by passing all of those files to ffprobe in a single command and get the output of all files at once
[05:25:53 CET] <furq> you can't and it wouldn't speed it up anyway
[05:25:58 CET] <TheDcoder> Actually it took me about 20 minutes to process ~700 files (mp4, h265)
[05:26:10 CET] <TheDcoder> furq: oh... why is that? :(
[05:26:34 CET] <TheDcoder> I imagine just getting the dimensions of a video is a simple enough operation?
[05:26:39 CET] <furq> the bottleneck is disk speed
[05:27:00 CET] <furq> you could try running it with xargs -P or something but i doubt it'd be significantly faster
[05:27:01 CET] <TheDcoder> oh, does ffprobe read all of the file to determine the dimensions?
[05:27:11 CET] <furq> no but it doesn't actually decode anything
[05:27:20 CET] <furq> it just reads the first 5MB or so of the file
[05:27:45 CET] <furq> you could try reducing -probesize
[05:27:48 CET] <TheDcoder> I see... but isn't it sufficient to read just the metadata to determine the video dimensions?
[05:28:20 CET] <furq> i don't think ffprobe is optimised to only read as much as necessary to get the entries you selected
[05:28:34 CET] <furq> afaik it just reads 5MB, gets all the properties and then displays the ones you asked for
[05:28:55 CET] <TheDcoder> hmmm...
[05:29:00 CET] <furq> but all those properties are stored in headers so it's not doing anything cpu intensive
[05:29:23 CET] <TheDcoder> furq: I cannot find the -probesize option here: https://ffmpeg.org/ffprobe.html
[05:29:47 CET] <furq> it's a libavformat option
[05:29:57 CET] <furq> it works for all the ff* tools
[05:30:14 CET] <furq> https://www.ffmpeg.org/ffmpeg-formats.html#Format-Options
[05:30:59 CET] <furq> -probesize 1M or something
[05:31:14 CET] <TheDcoder> ah, I see
[05:31:27 CET] <TheDcoder> would -probesize 1024 suffice?
[05:31:35 CET] <furq> probably not
[05:31:37 CET] <furq> depends on the file
[05:31:40 CET] <furq> hence it being configurable
[05:31:48 CET] <TheDcoder> okay, will try it, thanks
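A sketch of that suggestion, assuming the files are .mp4 in the current directory; -of csv=p=0:s=x prints just WIDTHxHEIGHT, and the loop body could also be fed through xargs -P for parallelism as furq mentioned:
    for f in *.mp4; do
      printf '%s: ' "$f"
      ffprobe -v error -probesize 1M -select_streams v:0 \
              -show_entries stream=width,height -of csv=p=0:s=x "$f"
    done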
[05:32:53 CET] <TheDcoder> I have another question, is it possible for several constant-frame-rate input videos to produce a variable-frame-rate output when I am using the copy option?
[05:34:07 CET] <furq> maybe
[05:34:19 CET] <furq> the input files need the same timebase but afaik they don't need to be the same framerate
[05:34:25 CET] <furq> you'll need -vsync vfr anyway
[05:38:18 CET] <TheDcoder> what if the input have the same fps with constant frame rate?
[05:38:24 CET] <TheDcoder> *input files
[05:38:49 CET] <TheDcoder> Is it possible to merge them into a single file with the same constant fps in that case?
[05:40:08 CET] <furq> if they have the same timebase, sure
[05:41:02 CET] <TheDcoder> I am not sure that I really understand what "timebase" means...
[05:41:36 CET] <TheDcoder> but the input files are from the same source, so I imagine there is a pretty good chance that they have the same timebase
[05:41:48 CET] <furq> yeah they will do
[05:42:18 CET] <TheDcoder> assuming that this is true, do I need to use any special options to specify constant frame-rate output?
[05:42:21 CET] <furq> no
[05:43:26 CET] <TheDcoder> so if the output has vfr, that means some of the videos don't have the same timebase, am I right?
[05:43:46 CET] <furq> it won't be vfr unless you set -vsync vfr
[05:44:01 CET] <furq> and they have the same timebase or else the concat demuxer will throw an error
[05:44:36 CET] <TheDcoder> very strange, I am not using -vsync vfr
[05:44:49 CET] <TheDcoder> but MediaInfo says the output has VFR
[05:45:18 CET] <TheDcoder> and the concat demuxer is not throwing any errors as far as I know
[05:45:22 CET] <furq> is this a 23.97/29.97fps mkv
[05:45:32 CET] <furq> mediainfo always reports those as vfr unless you muxed them with merge
[05:45:33 CET] <furq> mkvmerge
[05:45:58 CET] <furq> or 59.94 or whatever
[05:46:24 CET] <TheDcoder> it is not a mkv
[05:46:29 CET] <TheDcoder> the output is an mpv
[05:46:31 CET] <TheDcoder> *mp4
[05:46:57 CET] <furq> actually nvm i forgot -vsync auto is the default
[05:47:02 CET] <TheDcoder> ...
[05:47:06 CET] <TheDcoder> :)
[05:47:20 CET] <TheDcoder> so, what should I set -vsync to in order to turn it off?
[05:47:22 CET] <TheDcoder> :)
[05:47:39 CET] <furq> try -vsync cfr but i'm not sure that'll work with copy
[05:47:46 CET] <furq> since it'd have to drop frames, which it can't do
[05:48:02 CET] <furq> or -vsync drop
[05:48:13 CET] <TheDcoder> how about -vsync passthrough
[05:48:36 CET] <furq> yeah maybe
[05:48:42 CET] <TheDcoder> sounds good
[05:48:58 CET] <furq> i assume the concat demuxer corrects the timestamps
[05:49:05 CET] <furq> since that obviously won't work otherwise
[05:49:54 CET] <TheDcoder> as long as it doesn't drop any frames or make the output vfr, that is not a problem
[05:50:15 CET] <furq> copying will never drop frames
[05:50:25 CET] <TheDcoder> got it
[05:50:28 CET] <TheDcoder> thanks
[05:50:29 CET] <furq> you can't just drop frames out of most video codecs without breaking it
[05:50:41 CET] <TheDcoder> makes sense
[05:50:59 CET] <TheDcoder> by the way, any idea what the -safe option is for?
[05:51:37 CET] <furq> https://www.ffmpeg.org/ffmpeg-formats.html#Options
[05:51:40 CET] <TheDcoder> I can't find it anywhere in the documentation
[05:51:49 CET] <TheDcoder> oops
[05:51:50 CET] <furq> it's specific to the concat demuxer
[05:52:09 CET] <furq> if the concat list has 'ffconcat version 1.0' at the top then it auto-enables it
[05:52:35 CET] <TheDcoder> thanks, was just curious
[05:52:37 CET] <furq> or disables it, rather
[05:52:47 CET] <TheDcoder> I understand
[05:56:24 CET] <TheDcoder> -vsync passthrough seems to have done the trick
[05:56:46 CET] <TheDcoder> I tested with a small sample and it shows constant frame rate in MediaInfo...
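For reference, a minimal sketch of the concat-plus-copy setup being discussed (filenames are placeholders; -safe 0 is only needed when the list contains paths the demuxer considers unsafe):
    printf "file '%s'\n" clip1.mp4 clip2.mp4 clip3.mp4 > list.txt
    ffmpeg -f concat -safe 0 -i list.txt -c copy -vsync passthrough merged.mp4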
[06:00:22 CET] <TheDcoder> Now if only there was a faster way to get the dimensions of the videos... actually, -probesize 1024 works
[06:00:36 CET] <TheDcoder> so that should in theory ease the disk speed bottleneck
[06:01:18 CET] <TheDcoder> but I think the real bottleneck is the overhead of starting a new process for every video
[06:01:41 CET] <TheDcoder> and not to mention ffmpeg will perform some common initialization actions every time
[13:13:22 CET] <snooky> hi all
[14:03:22 CET] <pagios> -i plyalist.m3u8  -live_start_index -3 -t 10 -acodec copy -vcodec copy out.mp4 <<-- is this how i convert from hls to mp4? it is not working as expected. I am not seeing any video playing on mp4
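One thing that stands out in that command, though it may not be the whole story: -live_start_index is an HLS demuxer (input) option, so it has to come before -i to take effect. A sketch with it moved:
    ffmpeg -live_start_index -3 -i playlist.m3u8 -t 10 -c:a copy -c:v copy out.mp4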
[14:15:50 CET] <pagios> JEEB, hi
[15:14:01 CET] <pagios> any idea how to extract 1 frame image from a livestream?
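A possible approach, with the stream URL as a placeholder: decode until the first video frame and write it out as a JPEG:
    ffmpeg -i rtmp://example/live/stream -frames:v 1 -q:v 2 snapshot.jpg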
[16:13:33 CET] <snooky> Is there a way to calculate the required power for ffmpeg?
[16:13:47 CET] <JEEB> not really
[16:14:13 CET] <JEEB> depends on the speed you require, what modules you utilize and the drivers even
[16:16:52 CET] <snooky> So I wrote a little script, and depending on the input material ffmpeg breaks down, or it stutters, or there is a huge delay
[16:52:17 CET] <snooky> https://imgur.com/a/nLJMaD3
[16:53:32 CET] <snooky> can ffmpeg manage that? live, that is. no 2-pass or similar. directly, live.
[17:13:08 CET] <ddubya> I'm using nv-codec-headers git master and the version supplied is not supported anymore. I'll look elsewhere for the header I guess
[17:16:09 CET] <ddubya> nvm, I may have another issue
[17:21:01 CET] <ddubya> multiple versions installed
[18:08:22 CET] <pagios> i am transcoding a source RTMP stream in HD (1280x720) to 480p and 320p and i used -vf scale=.. and -vb.. params, are those enough or should i ideally set a preset too?
[18:15:07 CET] <DHE> if you're using a non-standard preset, you will want to set that on all outputs
[18:15:14 CET] <DHE> *non-default
[18:44:18 CET] <pagios> DHE, thanks. any other parameters that would ease the cpu load and optimize stuff?
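A rough sketch of a two-rendition transcode where scale, bitrate and preset are set per output (URLs, bitrates and the preset are placeholders to adjust):
    ffmpeg -i rtmp://source/app/stream \
      -vf scale=-2:480 -c:v libx264 -preset veryfast -b:v 1200k -c:a aac -f flv rtmp://dest/app/480p \
      -vf scale=-2:320 -c:v libx264 -preset veryfast -b:v 600k -c:a aac -f flv rtmp://dest/app/320p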
[19:02:43 CET] <analogical> what is wrong with this command? "ffmpeg -i audio.wav -b 320 audio.mp3"  ??
[19:03:45 CET] <furq> 320k
[19:04:14 CET] <pagios> that's 320kbps not kB
[19:06:53 CET] <analogical> when I type "ffmpeg -i audio.wav -b 320k audio.mp3" it still creates a file that is 128kbps :(
[19:07:34 CET] <furq> did you try reading the warning it throws
[19:09:15 CET] <analogical> ffmpeg is an extraordinary tool but it has the worst syntax of any program I've ever used
[19:10:28 CET] <Reinhilde> ffmpeg -i input.wav -codec:a libmp3lame -b:a 320k output.mp3
[19:10:41 CET] <furq> how do you suggest the syntax should be improved
[19:20:10 CET] <klaxa> when compared with most other programs with long command line arguments it seems rather logical and sane
[19:20:38 CET] <klaxa> then again, i don't use many cli programs with as many arguments as ffmpeg
[19:29:38 CET] <another> analogical: -b:a 320k
[19:30:39 CET] <analogical> another, :D
[19:30:55 CET] <another> there is even a warning that informs you of -b being ambiguous
[19:33:59 CET] <analogical> what a stupid program hehe
[19:37:37 CET] <analogical> the problem with https://www.ffmpeg.org/ffmpeg-codecs.html is that there are too few examples
[19:37:48 CET] <analogical> examples make everything MUCH easier
[19:43:11 CET] <analogical> for example https://www.ffmpeg.org/ffmpeg-codecs.html#Options-7 doesn't tell you what syntax to use when choosing the compression level
[19:45:07 CET] <analogical> like I said before ffmpeg is an extraordinary tool but obviously the developers don't want it to be easy to use
[19:50:44 CET] <another> i don't believe that to be true
[19:51:38 CET] <another> what syntax help are you missing? there's the option and the allowed values it can have
[19:56:35 CET] <klaxa> i think there's plenty of examples on the internet, most stuff has examples, everything is documented
[19:57:02 CET] <klaxa> once you "grok" it, it's very intuitive
[19:57:27 CET] <another> hmm.. well, actually compression_level could use a bit more info
[19:57:46 CET] <analogical> the correct syntax for flac was "-compression_level" but on ffmpeg.org they typed "compression_level" which doesn't work
[19:57:58 CET] <another> like which values are outside of the subset
[19:58:02 CET] <TheWild> hello
[19:58:22 CET] <analogical> they need to offer examples!!
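For what it's worth, a minimal example of the FLAC option being discussed (filenames are placeholders; on the command line the private option takes a leading dash):
    ffmpeg -i input.wav -c:a flac -compression_level 8 output.flac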
[20:00:20 CET] <TheWild> I have an 11 GB movie which has been copied multiple times over many computers. One computer had slightly damaged RAM. That was just one bit, but enough to cause a mess during copying.
[20:00:21 CET] <TheWild> The movie plays seemingly fine, but I'd like to check the consistency of the file. I don't have any checksum to check against.
[20:00:21 CET] <TheWild> Is there a way to use ffmpeg to simply go through every frame and tell the errors?
[20:00:46 CET] <another> https://ffmpeg.org/ffmpeg-all.html#Examples-1
[20:03:47 CET] <TheWild> okay, I'll just try ffmpeg -i video.mkv -f null -
[20:16:10 CET] <kepstin> note that ffmpeg can only tell you if there's an error that causes something like a bad reference or syntax issue - it's possible (depending on the format/codec) that there are errors left that still decode fine but cause visible differences.
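A small variation of TheWild's command that keeps only error-level messages and saves them for review (the log filename is arbitrary):
    ffmpeg -v error -i video.mkv -f null - 2> decode_errors.log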
[20:16:32 CET] <carltoso23> hi
[20:19:11 CET] <ddubya> is there a way to see what pixel formats a filter supports
[20:19:29 CET] <carltoso23> I tried to make a mosaic video with 4 videos all mkv videos, but the audio does not follow when playing the new mosaic mkv video
[20:19:41 CET] <carltoso23> it doesn't have any audo
[20:19:43 CET] <carltoso23> audio*
[20:20:37 CET] <ddubya> so the file has no audio at all then?
[20:21:02 CET] <carltoso23> yes no audio at all
[20:21:39 CET] <ddubya> try a simpler example and see if you get audio.. like -vf unsharp or something
[20:22:15 CET] <kepstin> carltoso23: when you're using -filter_complex with video, you have to explicitly say what to do with the audio too, or you'll just get no audio
[20:22:28 CET] <kepstin> what do you want to do? take the audio from one video? mix them all together?
[20:22:54 CET] <carltoso23> I want audio from all of the 4 videos mixed into one
[20:23:06 CET] <carltoso23> the final mosaic video
[20:23:21 CET] <kepstin> carltoso23: then you have to add an "amix" filter to your filter chain to mix them together.
[20:23:29 CET] <carltoso23> amix
[20:23:34 CET] <carltoso23> i will look that up!
[20:24:44 CET] <kepstin> it should work similarly to the filter you're using to make the video mosaic; you put multiple sets of filters into a single -filter_complex option by separating them with ;
[20:25:23 CET] <carltoso23> "ffmpeg -i 1.mkv -i 2.mkv -i 3.mkv -i 4.mkv -filter_complex: -filter_amix "nullsrc=size=640x480 [base]; [0:v] setpts=PTS-STARTPTS, scale=320x240 [upperleft]; [1:v] setpts=PTS-STARTPTS, scale=320x240 [upperright]; [2:v] setpts=PTS-STARTPTS, scale=320x240 [lowerleft]; [3:v] setpts=PTS-STARTPTS, scale=320x240 [lowerright]; [base][upperleft]
[20:25:23 CET] <carltoso23> overlay=shortest=1 [tmp1]; [tmp1][upperright] overlay=shortest=1:x=320 [tmp2]; [tmp2][lowerleft] overlay=shortest=1:y=240 [tmp3]; [tmp3][lowerright] overlay=shortest=1:x=320:y=240" -c:v libx264 outpu2t.mkv"
[20:25:25 CET] <carltoso23> ?
[20:27:25 CET] <kepstin> uh, i have no idea what you just did
[20:27:59 CET] <kepstin> you want to add ";[0:a][1:a][2:a][3:a]amix=inputs=4" to the end of the existing -filter_complex argument
[20:28:38 CET] <carltoso23> ah!
[20:28:41 CET] <carltoso23> wonderful
[20:28:55 CET] <carltoso23> i just added "-filter_complex: (-filter_amix) "
[20:29:39 CET] <kepstin> (newer ffmpeg versions have filters that can do the video mosaic in fewer steps, but that'll work fine)
[20:33:17 CET] <carltoso23> ah xstack?
[20:33:22 CET] <carltoso23> -filter_complex;[0:a][1:a][2:a][3:a]amix=inputs=4
[20:33:25 CET] <carltoso23> correct ?
[20:34:05 CET] <another> replace the semicolon after filter_complex with a space
[20:37:26 CET] <carltoso23> ah thanks
[20:37:39 CET] <carltoso23> " Unable to find a suitable output format for 'nullsrc=size=4608x2880 [base]; [0:v] setpts=PTS-STARTPTS, scale=1152x720 [upperleft]; [1:v] setpts=PTS-STARTPTS, scale=1152x720 [upperright]; [2:v] setpts=PTS-STARTPTS, scale=1152x720[lowerleft]; [3:v] setpts=PTS-STARTPTS, scale=1152x720 [lowerright]; [base][upperleft] overlay=shortest=1 [tmp1];
[20:37:39 CET] <carltoso23> [tmp1][upperright] overlay=shortest=1:x=320 [tmp2]; [tmp2][lowerleft] overlay=shortest=1:y=240 [tmp3]; [tmp3][lowerright] overlay=shortest=1:x=320:y=240'nullsrc=size=4608x2880 [base]; [0:v] setpts=PTS-STARTPTS, scale=1152x720 [upperleft]; [1:v] setpts=PTS-STARTPTS, scale=1152x720 [upperright]; [2:v] setpts=PTS-STARTPTS, scale=1152x720[lowerleft];
[20:37:40 CET] <carltoso23> [3:v] setpts=PTS-STARTPTS, scale=1152x720 [lowerright]; [base][upperleft] overlay=shortest=1 [tmp1]; [tmp1][upperright] overlay=shortest=1:x=320 [tmp2]; [tmp2][lowerleft] overlay=shortest=1:y=240 [tmp3]; [tmp3][lowerright] overlay=shortest=1:x=320:y=240: Invalid argument"
[20:37:44 CET] <carltoso23> ran into this error
[20:37:56 CET] <carltoso23> ah I have to edit the last part
[20:38:00 CET] <snooky> how can I insert a video device as an overlay in a video?
[20:39:02 CET] <carltoso23> "ffmpeg -i 1.mkv -i 2.mkv -i 3.mkv -i 4.mkv -filter_complex [0:a][1:a][2:a][3:a]amix=inputs=4 "nullsrc=size=4608x2880 [base]; [0:v] setpts=PTS-STARTPTS, scale=1152x720 [upperleft]; [1:v] setpts=PTS-STARTPTS, scale=1152x720 [upperright]; [2:v] setpts=PTS-STARTPTS, scale=1152x720[lowerleft]; [3:v] setpts=PTS-STARTPTS, scale=1152x720 [lowerright];
[20:39:03 CET] <carltoso23> [base][upperleft] overlay=shortest=1 [tmp1]; [tmp1][upperright] overlay=shortest=1:x=1152 [tmp2]; [tmp2][lowerleft] overlay=shortest=1:y=720 [tmp3]; [tmp3][lowerright] overlay=shortest=1:x=1152:y=720" -c:v crf 23 -preset medium -movflags +faststart -c:a aac libx264 outpu2t.mp4"
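Putting kepstin's suggestion together with the original mosaic graph, the whole command might look roughly like this; the [v]/[a] labels, the -map options and the encoder settings are one possible arrangement rather than the exact command from this log:
    ffmpeg -i 1.mkv -i 2.mkv -i 3.mkv -i 4.mkv -filter_complex "
      nullsrc=size=640x480 [base];
      [0:v] setpts=PTS-STARTPTS, scale=320x240 [ul];
      [1:v] setpts=PTS-STARTPTS, scale=320x240 [ur];
      [2:v] setpts=PTS-STARTPTS, scale=320x240 [ll];
      [3:v] setpts=PTS-STARTPTS, scale=320x240 [lr];
      [base][ul] overlay=shortest=1 [t1];
      [t1][ur] overlay=shortest=1:x=320 [t2];
      [t2][ll] overlay=shortest=1:y=240 [t3];
      [t3][lr] overlay=shortest=1:x=320:y=240 [v];
      [0:a][1:a][2:a][3:a] amix=inputs=4 [a]
    " -map "[v]" -map "[a]" -c:v libx264 -c:a aac output.mkv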
[20:40:23 CET] <snooky> https://nopaste.linux-dev.org/?1270917
[20:41:47 CET] <snooky> and there i would add /dev/video2 as "overlay"
[20:41:56 CET] <furq> you can't use -vf twice and you can't use filters with -c:v copy
[20:42:04 CET] <furq> and you can't use overlay with -vf
[20:43:09 CET] <snooky> with this script
[20:43:19 CET] <snooky> i get an rtsp stream WITH logo as overlay
[20:46:12 CET] <snooky> can I then add /dev/video1 and /dev/video2 together and then put the overlay over it?
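A rough sketch of combining two V4L2 devices that way, with the second one overlaid in a corner; device names, overlay position, encoder and output are all placeholders to adapt to the existing script:
    ffmpeg -f v4l2 -i /dev/video1 -f v4l2 -i /dev/video2 \
      -filter_complex "[0:v][1:v] overlay=x=W-w-10:y=H-h-10 [v]" \
      -map "[v]" -c:v libx264 -preset veryfast output.mkv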
[00:00:00 CET] --- Sun Nov  3 2019


More information about the Ffmpeg-devel-irc mailing list