[Ffmpeg-devel-irc] ffmpeg.log.20130524

burek burek021 at gmail.com
Sat May 25 02:05:02 CEST 2013


[01:17] <vade> is it possible to set the SPS-ID of an h.264 video stream when using stream copy?
[01:17] <vade> I am attempting to solve some concatenation problems, and I would like to avoid a re-transcode
[01:47] <vade> interesting
[01:47] <vade> MP4Box appears to be able to cat the files together in a way that does not result in issues; the same files concatenated with FFmpeg produce an issue
[02:53] <matamus> hi
[02:55] <matamus> i'm trying to make screenshots with ffmpeg
[02:55] <matamus> i use
[02:55] <matamus> ffmpeg -ss 00:30:01.01 -i input.mkv -y -f image2 -vframes 1 output.png
[02:56] <matamus> but it seems it makes the screenshot with the PAR and i want screenshots with the DAR
[02:56] <matamus> is it possible?
[03:17] <matamus> http://pastebin.com/RVubSqW4
[03:41] <matamus> thanks ubitux but it doesnt help
[03:46] <ubitux> matamus: the point is that we don't really support the tool you're using, so you likely won't get much help
[03:48] <matamus> you just support avconv?
[03:48] <ubitux> no
[03:48] <sacarasc> avconv is what you're using.
[03:48] <ubitux> we support ffmpeg
[03:49] <matamus> ah, ok
[03:50] <matamus> its confusing
[03:53] <sacarasc> If you want to use FFmpeg but not install anything, try one of them.
[03:53] <sacarasc> Otherwise, the nice people in #libav should be able to help.
[03:55] <Keshl> Hopefully third time's the charm.. http://pastebin.com/Jckai6Qi
[03:57] <Keshl> llogan: I did, didn't I?
[03:58] <Keshl> Oh, forgot the output, my bad.
[04:00] <Keshl> llogan: http://pastebin.com/t3VDAJXp -- The error at the end is spawned by me feeding it a broken clip. I just lopped the file off for demonstration purposes, but even with the unlopped file it's weird as crap.
[04:02] <llogan> it's hard to follow what the issue is exactly.
[04:03] <Keshl> Sec, lemme get the not-lopped-off video..
[04:04] <matamus> so i hope i'm using ffmpeg now.. same problem
[04:04] <matamus> http://pastebin.com/Qmjd0bN2
[04:06] <ubitux> you might need to use -vf scale
[04:08] <ubitux> https://ffmpeg.org/ffmpeg-filters.html#scale try playing with the sar and dar in the expression
[04:08] <ubitux> i don't know if there is a simpler solution
[04:10] <llogan> Keshl: i have to go now, but you can always try ffmpeg-user mailing list if you don't get an answer here
[04:10] <Keshl> Dang D:
[04:11] <Keshl> And I have no idea how to use those. Tried, never worked. x.x
[04:21] <Keshl> http://pastebin.com/ixQhzW3G in case anyone's around. xwx
[04:23] <ubitux> matamus: -vf scale=w=sar*iw seems to do the trick here, but that's not very generic
[04:37] <matamus> oh yes! thanks ubitux ./ffmpeg -ss 00:10:01.01 -i /media/music/summer.mkv -y -f image2 -vframes 1 -vf scale=w=sar*iw:h=ih screenshot6.png
[04:37] <matamus> is the solution :D
[04:37] <ubitux> i'm not sure it will work as expected with sar < 1
[04:38] <ubitux> you might need to use min/max utils
[04:39] <ubitux> something like scale=max(sar,1)*iw:max(1/sar,1)*ih (completely untested but you get the point)
[04:39] <ubitux> matamus: btw, if you're using ffmpeg to generate thumbnails, you might be interested in the scene detection
[04:39] <ubitux> also, -f image2 is not necessary in your command line
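A minimal sketch of the scene-detection idea just mentioned, using the select filter's scene score (file names and the 0.4 threshold are arbitrary, untested):
    ffmpeg -i input.mkv -vf "select='gt(scene,0.4)'" -vsync vfr -frames:v 5 thumb%03d.png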
[04:42] <matamus> hmm now i see it's not the solution: i made a screenshot at 1012x572 and it should be 1016x572.. i will check it with the max utils
[04:44] <matamus> bash: syntax error near unexpected token `('
[04:44] <ubitux> -vf "scale='max(sar,1)*iw':'max(1/sar,1)*ih'"
[04:44] <ubitux> (still untested)
[04:47] <matamus> it works but same problem: the screenshot is 1012x572 and should be 1016x572
[04:49] <ubitux> check the doc then and adapt it to your needs
[05:06] <matamus> oh.. i guess it's all fine.. i made a screenshot with vlc and it also shows 1012x572.. so it seems my assumption that it should be 1016x572 was wrong
[05:06] <matamus> ok, so everything is fine
[05:06] <matamus> ubitux, many thanks!
[05:27] <bigmac> i fail to log error output from ffmpeg
[05:27] <bigmac> ffmpeg #{input} #{output} -v error &> error.log
[05:30] <bigmac> i thought that was what was said last time i asked this
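For reference, a sketch of capturing only error output to a file: ffmpeg logs to stderr, -i introduces the input, and the -v/-loglevel option should come before the files it affects (placeholder file names):
    ffmpeg -v error -i input.mp4 output.mp4 2> error.log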
[06:06] <ubitux> matamus: you should set sar to 1 after this btw
[06:36] <matamus> ok
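Putting the scale and setsar suggestions together, a sketch of the square-pixel screenshot command (untested, placeholder file names):
    ffmpeg -ss 00:10:01.01 -i input.mkv -vframes 1 -vf "scale='max(sar,1)*iw':'max(1/sar,1)*ih',setsar=1" screenshot.png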
[08:51] <wlritchi> How might I go about making a mkv with a slideshow of a series of jpgs, pngs, anigifs, from a dumb script?
[08:52] <wlritchi> I'm inclined to convert all to png using imagemagick (which handles framing on anigifs) and then make symlinks to extend each image to n frames (as seen by ffmpeg)
[08:52] <wlritchi> That would work and would even handle the animation properly without too much difficulty, but is there a better way you can think of to do it?
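A sketch of a simpler route for a plain slideshow, feeding the pre-converted PNGs to the image2 demuxer at a low input frame rate (the glob pattern, two seconds per image, and file names are assumptions; untested):
    ffmpeg -framerate 1/2 -pattern_type glob -i 'slides/*.png' -c:v libx264 -r 30 -pix_fmt yuv420p slideshow.mkv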
[09:15] <zeeflo> hey guys
[09:15] <zeeflo> I need to convert a file to ogv
[09:15] <zeeflo> for html5 playback
[09:16] <zeeflo> If I show you my command to convert a file to MP4, can you help me create a similar command to convert to ogv ?
[09:21] <wlritchi> I thought HTML5 preferred webm (but that's beside the point)
[09:22] <zeeflo> webm is too immature
[09:23] <wlritchi> I suppose I don't need the output of the command when we're not troubleshooting
[09:23] <zeeflo> and converting bluray sources to 720p webm takes... 10+ hours for a feature length movie.. even though we use xeon series 5 dual cpu setups
[09:23] <zeeflo> only 5-6 fps conversion time
[09:23] <zeeflo> wlritchi: ill get it now :)
[09:24] <wlritchi> Ouch, is VP8 encoding really that hard on the processor?
[09:25] <wlritchi> And is theora any better?
[09:26] <zeeflo> http://pastebin.com/ACjLkaAK
[09:26] <zeeflo> yes it is really that hard on the cpus
[09:27] <zeeflo> and also, google is currently developing webm9
[09:27] <zeeflo> i actually think you can get betas of their binary
[09:32] <wlritchi> I forget what -strict experimental does, but I rather think you won't need it for ogv
[09:32] <zeeflo> i probably wont
[09:33] <wlritchi> I think you can just use this command http://pastebin.com/4x3NqrYq (changed the codecs, removed mp4-specific flags), minus strict experimental if you don't think it's necessary
[09:34] <zeeflo> thank you
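Not the pastebin contents, just a generic sketch of an ogv encode of the kind being discussed, assuming ffmpeg was built with libtheora and libvorbis (quality values are arbitrary):
    ffmpeg -i input.mp4 -c:v libtheora -q:v 7 -c:a libvorbis -q:a 5 output.ogv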
[09:34] <wlritchi> Keep in mind that IE, ever the standards-compliant, doesn't support ogv or webm
[09:34] <wlritchi> So you'll want an mp4 file listed in the video tag as well
[09:35] <zeeflo> to hell with ie
[09:35] <zeeflo> but yes
[09:35] <zeeflo> everything we have is mp4
[09:35] <zeeflo> but we want ogv support for html5 playback as well
[09:35] <wlritchi> Out of curiosity, (feel free not to answer), what's the application? Streaming of full-length movies sounds like a netflix competitor :P
[09:36] <zeeflo> so were thinking about converting all our mp4's to ogv as well
[09:36] <zeeflo> yes, but were desktop based
[09:36] <zeeflo> we dont do ps3 crap
[09:36] <zeeflo> pure html5 and flash
[09:37] <wlritchi> So what's the advantage to your service over netflix? Better international selection (he asked hopefully)?
[09:38] <zeeflo> our big advantage is that you can bring your account with you overseas.
[09:38] <zeeflo> we dont lock to country
[09:39] <zeeflo> and no, we dont have a better international selection per se; we're streaming with closed captioning
[09:39] <zeeflo> so you have to be from specific parts of europe to "legally" use our service
[09:40] <zeeflo> but, like I said, one of our big advantages is that you can bring your device with you to.... korea, and still access our content.
[09:41] <wlritchi> Neat
[09:41] <wlritchi> In any case, the selection can hardly be worse than canadian netflix
[09:44] <zeeflo> well..
[09:44] <zeeflo> We lease our rights from epix.
[09:45] <zeeflo> as do netflix
[09:45] <zeeflo> but netflix sucks in general.. yes
[09:45] <zeeflo> were not streaming series..
[09:45] <zeeflo> we lease rights to movies only
[09:46] <zeeflo> so we have got a lot of new stuff, as we do not have expenditures on series
[10:16] <highgod> Huemac:how can I test whether the bug https://ffmpeg.org/trac/ffmpeg/ticket/2603 is fixed? I use mingw
[10:36] <Huemac> highgod: well use the --enable-opencl flag and get some opencl win library from somewhere
[10:36] <wlritchi> So I managed to get an FPS of 2 while encoding on a quad-core i7
[10:36] <wlritchi> Is there an achievement for that?
[10:37] <Huemac> highgod: well most importantly i think: first try with any compiler whether it works after you have fixed it
[10:37] <Huemac> then if it works, it works also with mingw32
[10:38] <JEEB> wlritchi, no -- this is on a 16core http://up-cat.net/p/ff265f5b
[10:39] <wlritchi> Darn, I thought 1080p x264 veryslow -qp 0 was bad, that must be at least 4K
[10:39] <JEEB> that's 720p
[10:39] <JEEB> but not libx264 of course :)
[10:39] <wlritchi> Ah, I see
[10:39] <wlritchi> What codec is it?
[10:39] <JEEB> the HEVC reference encoder
[10:39] <JEEB> HM
[10:40] <wlritchi> oh dear
[10:40] <JEEB> gives pretty good results, but is dumb as a brick and slow as a mule
[10:40] <JEEB> was encoding some samples with it for later testing
[10:40] <wlritchi> I think mules go faster than that, actually
[10:41] <wlritchi> I really shouldn't be doing this with x264
[10:41] <wlritchi> I should use a png delta codec
[10:42] <wlritchi> Is such a thing currently in existence?
[10:43] <highgod> Huemac:"get some opencl win library from somewhere", sorry, don't get it
[10:44] <wlritchi> I just remembered I have access to 64 cores on a watercooled supercomputer and I haven't been using them for the last hour and a half of encoding
[10:44] <zeeflo> seriously i dont get it..
[10:44] <Huemac> highgod: hard to explain. but how exactly do you want to test it?
[10:45] <zeeflo> why is libtheora almost as slow as webm/vp8?
[10:45] <Huemac> highgod: with the win32 opencl library i mean like AMD-APP-SDK
[10:45] <zeeflo> 16fps with a dual CPU xeon 5320 @ 2.4 ghz 8 core?
[10:45] <zeeflo> wtf?
[10:46] <ubitux> Huemac: can't you just send a patch? also, what is the reason it doesn't fail for highgod?
[10:46] <JEEB> zeeflo, nothing else is as optimized as libx264
[10:46] <wlritchi> How fast was libx264 going?
[10:46] <wlritchi> And what JEEB said
[10:46] <JEEB> libvpx and libtheora will be slow in comparison
[10:46] <highgod> OK, wait a moment
[10:47] <zeeflo> wlritchi: around 100fps
[10:48] <JEEB> I think libvpx at least had some kind of pre-set configs ("realtime", "good", "best"), don't remember how exactly to poke at them from ffmpeg tho
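If memory serves, ffmpeg's libvpx wrapper exposes those presets through the deadline option, with cpu-used trading speed for quality; a rough, untested sketch with placeholder names:
    ffmpeg -i input.mp4 -c:v libvpx -deadline good -cpu-used 2 -b:v 2M -c:a libvorbis output.webm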
[10:48] <zeeflo> why make a webstreaming format standard if its not even optimized?
[10:48] <zeeflo> forget this!! People will have to settle with mp4 and use chrome!
[10:48] <zeeflo> to hell with it
[10:48] <Huemac> ubitux: sending a patch for such a simple bug seems like too much effort.
[10:48] <Huemac> im just lazy
[10:48] <JEEB> Well, the standard and the implementations are separate, that you have to remember. But hey, big G was telling at the other IETF talk that "x264 is not optimized for live streaming" ;)
[10:49] <wlritchi> pfft
[10:49] <Huemac> there are no good instructions on how to send a patch using git
[10:49] <zeeflo> JEEB: maybe so. But it does work, and it works very well..
[10:49] <Huemac> i tried one i found from google yesterday but it didn't work
[10:49] <zeeflo> atleast if you use flash
[10:50] <wlritchi> Okay, now I've truly derped. I tried to make a slideshow script supporting anigifs
[10:50] <JEEB> zeeflo, the ";)" was kind of meaning that the stuff that G person said was not true
[10:50] <zeeflo> ah
[10:50] <zeeflo> hehe
[10:50] <JEEB> I think one of the x264-related folk went and sent an angry letter to them for it
[10:50] <wlritchi> But I forgot to take into account that not all gifs are 30fps
[10:51] <zeeflo> JEEB: well, it doesnt surprise me they would say something like that.. Theyll soon have their own web9..
[10:51] <JEEB> it's vp9, just like the previous was vp8 :P
[10:51] <zeeflo> And I'll bet you this! When they have it, they will most likely drop mp4 support in their browser
[10:51] <wlritchi> Nah, they can't do that
[10:51] <JEEB> webm is the naming for the "package" (subset of matroska etc.)
[10:51] <zeeflo> yea
[10:51] <zeeflo> vp9
[10:51] <wlritchi> not until they have closer to a monopoly, anyway
[10:52] <JEEB> and vp9 is "alright", but just sounds like a rip-off of HEVC in many parts IMHO :P
[10:52] <JEEB> it will be better than AVC/H.264 if the encoder gets psy optimizations (libvpx never got those for vp8 tho, lol)
[10:53] <JEEB> but a good HEVC implementation most probably will be better than a good VP9 implementation, although we're still a year or so away from seeing that
[10:53] <JEEB> (if not some more)
[10:53] <wlritchi> Okay, I have a question
[10:54] <wlritchi> And I've just lost it
[10:54] <wlritchi> It's too late...
[10:54] <gagan_> hello everybody
[10:54] <gagan_> i have one question
[10:55] <wlritchi> Oh, I remember. I have an aiff with a few mjpeg streams from album art attachments
[10:55] <wlritchi> How do I get those attached to an mkv file?
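One possible route, as an untested sketch with placeholder names: dump the mjpeg stream to a jpeg, then use ffmpeg's -attach option on the mkv, which needs a mimetype set on the attachment stream:
    ffmpeg -i input.aiff -map 0:v:0 -frames:v 1 cover.jpg
    ffmpeg -i video.mkv -attach cover.jpg -metadata:s:t:0 mimetype=image/jpeg -c copy output.mkv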
[10:56] <gagan_> i am developing a social networking site and i need to use ffmpeg for video encoding.. how can i start using it..??
[10:57] <t4nk095> does anyone know whether issue 855 (http://roundup.libav.org/issue855) is fixed or not?
[10:57] <t4nk095> because I also have the same problem currently,
[10:58] <wlritchi> Are you having issues with ffmpeg's ffserver, or libav's ffserver?
[10:58] <t4nk095> in ffmdec.c, ffm_read_packet, it will not get the correct data of ffm->header
[10:58] <t4nk095> yes
[10:58] <t4nk095> ffm->header is corrupt
[10:59] <wlritchi> I need to learn to stop asking OR questions, and start asking WHICH OF questions
[10:59] <gagan_> I have a web application where users are going to be uploading video files in different formats (e.g. WOV, AVI, MPEG...). i need to convert those files into flash files..
[10:59] <gagan_> how can i do this task..?
[11:00] <relaxed> gagan_: you need h264 video and aac audio in the mp4 container.
[11:01] <gagan_> will it automatically convert when users upload videos..??
[11:03] <relaxed> ffmpeg is a tool that could be used to do it along with some scripting.
[11:03] <gagan_> relaxed: i am new to this ffmpeg and i m interested to use this tool for my web application
[11:04] <relaxed> you'll have to feed ffmpeg input somehow
[11:04] <gagan_> relaxed: i have a plan to develop this using java libraries
[11:04] <gagan_> where can i start from
[11:06] <relaxed> first read the docs about ffmpeg and learn how to use it.
[11:06] <gagan_> relaxed:  https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuideQuantal  can i use this steps for installation on ubuntu 12.04
[11:06] <gagan_> yeah sure i will do it
[11:07] <relaxed> yes, it says 12.04 in the title.
[11:07] <wlritchi> Someone should update that guide actually
[11:08] <gagan_> okay sure i will use this.. thank you
[11:08] <relaxed> "Last modified 40 hours ago"
[11:08] <relaxed> update what?
[11:08] <wlritchi> After the first step from the quantal one, the raring instructions work just fine on quantal
[11:09] <wlritchi> But they don't require that you uninstall everything avconv-based
[11:09] <ubitux> Huemac: can you test a patch for me?
[11:09] <wlritchi> Read: Half the damn system (yay ubuntu -.-)
[11:10] <ubitux> Huemac: http://b.pkh.me/0001-lavu-opencl-remove-semi-colon-from-macro.patch
[11:10] <ubitux> please try this
[11:11] <JEEB> also which compiler is it that is giving the trouble?
[11:13] <ubitux> JEEB: any i suppose.
[11:14] <ubitux> a label with no statement will fail afaict
[11:14] <JEEB> but... the macro has a ; there, no?
[11:15] <JEEB> although yes, most macros I've seen have no ; inside them and then there's a ; outside the macro
[11:15] <ubitux> JEEB: not the empty one
[11:15] <ubitux> it's either #define FOOBAR bla(); or #define FOOBAR
[11:15] <Huemac> ubitux: wait a second
[11:19] <Huemac> ubitux: it will take about 15min to build the whole crap because my build machine is so slow :D
[11:20] <ubitux> ccache is your friend
[11:22] <Huemac> ubitux: cloned a fresh repo
[11:22] <Huemac> applied your patch
[11:22] <Huemac> now starting build...
[11:23] <Huemac> ./configure ok
[11:23] <Huemac> and now we will have to wait about 15min
[11:26] <ubitux> if it works ;)
[11:33] <t4nk095> how do ffserver and ffmpeg work together?
[11:33] <t4nk095> ffmpeg will keep writing video data to the feed and ffserver keeps reading the data from the feed
[11:34] <t4nk095> if they write and read at the same time
[11:34] <t4nk095> ffserver may get a wrong ffm header and stop the output stream
[11:34] <t4nk095> does anyone have ideas?
[11:35] <t4nk095> has anyone else hit this problem: ffserver gets wrong information from the ffm header and then stops outputting the video stream
[11:37] <wlritchi> How do I scale a video up to fit 1080p, padding with black bars to get the right resolution (never cropping)?
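A sketch of one common answer, untested: scale by whichever ratio fits inside 1920x1080 (truncated to even dimensions), then pad the remainder with black (placeholder file names):
    ffmpeg -i input.mp4 -vf "scale='trunc(min(1920/iw,1080/ih)*iw/2)*2':'trunc(min(1920/iw,1080/ih)*ih/2)*2',pad=1920:1080:(ow-iw)/2:(oh-ih)/2" -c:a copy output.mp4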
[11:45] <Huemac> ubitux: works like a charm!
[11:45] <Huemac> ubitux: thank you!
[11:45] <ubitux> great
[11:45] <Huemac> can you make a pull request for it?
[11:45] <ubitux> i'll push later
[11:45] <Huemac> okay
[11:45] <Huemac> could you inform me when you have pushed it?
[11:45] <ubitux> the patch is on the ml, if i get no review i'll push in a few hours
[11:45] <ubitux> ok
[12:01] <ubitux> Huemac: applied
[12:26] <Phase4> is there an alternative to x11grab? on my machine it lags compiz
[12:27] <gmag> hello. I wonder if I can convert a video container to yuv file.
[12:27] <gmag> and also specifying the yuv format
[12:28] <gmag> because I can actually do the conversion, but I dont know exactly how to specify YUV format
[12:29] <gmag> I am using this: ffmpeg -i inputFile.mp4 -f rawvideo outputFile.yuv
[12:30] <JEEB> uhh, the log should tell you what exactly it did, and if your output file ends with dot-yuv it should be doing the right thing :P If the log tells you it used a colorspace you didn't want it to use, then you can add -pix_fmt after -i as well
[12:31] <gmag> JEEB, it is generating yuv420p, and I would like it to be yuv420 semiplanar (NV21)
[12:32] <JEEB> see if ffmpeg -pix_fmts gives you something like that
[12:32] <JEEB> if it does, then use -pix_fmt to convert to it
[12:34] <JEEB> I can see nv12/nv21 on it at least there
[12:34] <gmag> JEEB, yes!
[12:34] <gmag> JEEB, I can see it too
[12:34] <gmag> let me try with that
[12:38] <gmag> JEEB, sorry, is this the correct way of calling it? "ffmpeg -i x_video.mp4 -f rawvideo x_video.yuv -pix_fmt nv21" It still tells me it used yuv420p
[12:39] <JEEB> you don't need the -f rawvideo there if the output file name ends with dot-yuv, and naturally the -pix_fmt won't work if it's after the output. It just has to be after the input
[12:39] <JEEB> "ffmpeg -i welp.input -pix_fmt nv21 out.yuv" should work
[12:40] <gmag> JEEB, shame on me
[12:40] <gmag> JEEB, that worked fine, thanks a lot
[15:01] <Huemac> ubitux: when is it going to be on the master branch?
[15:03] <ubitux> Huemac: it's in since 12:01:24 <@ubitux> Huemac: applied
[15:03] <ubitux> git reset --hard HEAD^ to remove your local patch and git pull
[15:03] <Huemac> hmmm why can't i see the change on github :O
[15:03] <ubitux> github is a mirror
[15:04] <ubitux> use source.ffmpeg.org
[15:04] <ubitux> github is not updated at every commit
[15:04] <Huemac> ah
[15:04] <Huemac> ok
[15:04] <Huemac> nice
[15:08] <Huemac> wtf :O
[15:08] <Huemac> Unknown option "--enable-opencl".
[15:11] <ubitux> if it doesn't appear in ./configure --help|grep opencl, then you cloned the wrong repository
[15:16] <Huemac> yes i think it pulled from the 1.2 branch somehow mysteriously
[15:16] <Huemac> cause 1.2 doesn't have the opencl option
[15:17] <ubitux> git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
[15:18] <ubitux> see https://www.ffmpeg.org/download.html
[15:20] <Huemac> yes yes i know
[15:20] <Huemac> just my stupid build machine tries to pull only the 0.6 branch if it's not specified :D
[15:21] <ubitux> svn?
[15:21] <Huemac> its gone bonkers somehow
[15:22] <Huemac> yep, now it works
[16:05] <fabreg> Hello, I'm trying to concatenate two flv files into an mp4 file. I see the <output>.mp4 file but inside there is just one video. I'm using this command "ffmpeg -f concat -i input.txt -c copy output.mp4" and in input.txt I have two lines with "file <filename>"
[16:49] <mpfundstein_work> fabreg: ffmpeg -isync -i "concat:file1.flv,file2.flv"
[16:49] <mpfundstein_work> fabreg: dont know if flv's work well. I always use mpeg2video files for concat
[16:51] <fabreg> mpfundstein_work, maybe the problem is the files have different resolutions (800x600 and 320x180)
[16:51] <fabreg> mpfundstein_work, so maybe I have to re-encode them to the same resolution
[16:52] <mpfundstein_work> fabreg: that shouldnt be a problem. but indeed, I always make sure that they have the same res
[16:52] <fabreg> mpfundstein_work, and after try to concat them again
[16:52] <mpfundstein_work> fabreg: try a) my syntax b) mpg instead of flv
[16:52] <fabreg> mpfundstein_work, ok, I'm trying
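A sketch of the re-encode-then-concat route being discussed (untested, placeholder names, arbitrary target resolution): bring both clips to the same codec and parameters first, then the concat demuxer can stream-copy them (-strict experimental was needed for the built-in aac encoder at the time):
    ffmpeg -i first.flv -vf scale=800:600 -c:v libx264 -c:a aac -strict experimental part1.mp4
    ffmpeg -i second.flv -vf scale=800:600 -c:v libx264 -c:a aac -strict experimental part2.mp4
    printf "file part1.mp4\nfile part2.mp4\n" > input.txt
    ffmpeg -f concat -i input.txt -c copy output.mp4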
[16:54] <xlinkz0> durandal11707: if i take a file with the moov atom at the top of the file and transcode it, will it retain the moov atom position?
[16:58] <fabreg> mpfundstein_work, concat:first.flv,second.flv: No such file or directory
[16:59] <fabreg> but the files are there
[16:59] <mpfundstein_work> put the "concat:..." in " "
[17:01] <fabreg> opt/ffmpeg/bin/ffmpeg -isync -i "concat:first.flv" "second.flv" -c copy fab.flv
[17:01] <fabreg> seems to want to rewrite "second.flv"
[17:01] <fabreg> File 'second.flv' already exists. Overwrite ? [y/N]
[17:03] <mpfundstein_work> no no
[17:03] <mpfundstein_work> ffmpeg -isync -i "concat:first.flv,second.flv"
[17:03] <mpfundstein_work> and why -c copy?
[17:03] <mpfundstein_work> you need to reencode it
[17:03] <MadTBone> I have a project using ffmpeg's libav to encode video from a capture device to a file.  It's working well.  However, when I encode to h264 using a constant quantizer higher than around 20, the difference between I frames and the previous P frame is too dramatic, so I get a visual pulse (higher clarity for a few frames) at each I frame.  This happens even when I set i_quant_factor to 1.0 and i_quant_offset to 0.0 (making I frames use the same quantizer
[17:03] <MadTBone> as the previous P frame).  Anyone know how to mitigate this (without lowering the quantizer / increasing the bitrate)?
[17:06] <fabreg> mpfundstein_work, concat:first.flv,second.flv: Protocol not found
[17:06] <mpfundstein_work> did you put it in " " ?
[17:06] <mpfundstein_work> and what version do you have? does it support concat input
[17:10] <fabreg> mpfundstein_work, I'm getting the latest from git
[17:10] <fabreg> ffmpeg -isync -i "concat:first.flv,second.flv"
[17:11] <mpfundstein_work> that should work
[17:11] <mpfundstein_work> :-)
[17:11] <mpfundstein_work> i do it nearly everyday. but as I said, with mpg's
[17:14] <fabreg> concat:first.flv,second.flv: No such file or directory -> this is with latest version
[17:14] <fabreg> the error before was because I used an old version, sorry
[17:20] <durandal11707> xlinkz0: no
[17:20] <xlinkz0> thanks
[17:20] <durandal11707> you always need to give the same options if you want the same output
[17:22] <fabreg> mpfundstein_work, could you please do a "ffmpeg -version" ?
[17:22] <fabreg> ffmpeg version N-53403-g8d4e969 -> I'm using this
[17:22] <mpfundstein_work> moment
[17:23] <mpfundstein_work> ffmpeg version N-48645-gf3c9d8d
[17:23] <mpfundstein_work> i give you my script
[17:25] <mpfundstein_work> fabreg: https://gist.github.com/MarkusXite/8e56c48bc6cb48b6cf04
[17:25] <mpfundstein_work> fabreg: figure the important stuff out by yourself :-)
[17:30] <mpfundstein_work> fabreg: AHHH
[17:30] <mpfundstein_work> fabreg: You have to separate the concat files by a pipe |
[17:30] <mpfundstein_work> fabreg: -i "concat:file1.flv|file2.flv"
[17:33] <fabreg> mpfundstein_work, hehe now it works but I get this error: At least one output file must be specified
[17:34] <fabreg> maybe some output file needs to be specified?
[17:36] <mpfundstein_work> pretty logical :-)
[17:42] <fabreg> mpfundstein_work, same problem as before: I see one mp4 file but it contains just one video.
[17:43] <mpfundstein_work> try mpg
[17:43] <fabreg> mpfundstein_work, but the original files are flv
[17:43] <fabreg> mpfundstein_work, do I need to convert them to mpg?
[17:44] <mpfundstein_work> try it
[17:44] <mpfundstein_work> :-)
[17:44] <mpfundstein_work> my source files are h264 in mpg for containers
[17:44] <mpfundstein_work> mp4 containers
[17:44] <mpfundstein_work> the only ones I could successfully concat were mkv and mpeg2
[17:54] <towolf> hi, if i use "-vf ass=subtitle.ass" can I use the blend filter on top of that to blend the subtitles in (multiply, screen, or something)?
[18:04] <ubitux> blend the subtitles in?
[18:04] <ubitux> ass and subtitles filter are doing the blending
[18:05] <towolf> ubitux, thanks, but the only effect they do is alpha blending. if i want to multiply the subtitles in, can a -vf blend filter take the output of the -vf ass filter and blend that into the video stream?
[18:22] <ubitux> i don't really understand TBH
[18:22] <ubitux> by "multiply the subtitles in" you mean blending it multiple times at different places?
[18:28] <towolf> blending as in http://en.wikipedia.org/wiki/Blend_modes
[18:31] <ubitux> ah, you want to make the blend not overlay but multiply the two?
[18:37] <towolf> i think ive figured it out.  something like this? -filter_complex "ass=/tmp/timelapse.ass [ass]; [ass][0:0] blend=all_mode=screen"
[18:37] <ubitux> that's not a multiply
[18:38] <ubitux> this is just normal blending
[18:38] <ubitux> should be the default behaviour
[18:38] <towolf> yeah, that's screen. somehow in the first 10 seconds i encoded, everything got green ..
[18:38] <towolf> so im trying screen now
[18:38] <ubitux> no i mean
[18:38] <ubitux> -vf ass=foobar.ass is that
[18:38] <ubitux> ass and subtitles filters should do the blending properly honoring alpha
[18:38] <ubitux> if not, there is a bug
[18:39] <towolf> yeah i know. i want to blend it, to try to make it look like its part of the picture
[18:39] <towolf> not floating on top
[18:39] <ubitux> ??
[18:39] <ubitux> the filter blends
[18:39] <ubitux> it's not muxing
[18:39] <towolf> overlay is just alpha compositing. i want to multiply the color values of the subtitles with the video colors
[18:40] <ubitux> can you show me visually an example? i belive i'm missing something
[18:41] <ubitux> i'm assuming you want something crazy like "color=black, ass=foobar[s]; [0:0][s] blend=..."
[18:42] <towolf> ubitux, http://imgur.com/AN1MTi7,ArAwn6R
[18:43] <towolf> at night its too bright, so i want to use screen or multiply, not just alpha compositing.
[18:45] <towolf> ubitux, what is color=black in your last line for?
[18:46] <ubitux> a source filter
[18:46] <ubitux> try ffplay -f lavfi 'color=blue, ass=foobar.ass'
[18:46] <towolf> so i have to composite the ass filter output on top of black and then blend that with the video?
[18:46] <towolf> because i get very funny colors now
[18:46] <ubitux> that's an idea
[18:47] <ubitux> i'm just trying to figure out the purpose
[18:47] <ubitux> since you can just adjust the alpha and color value into the .ass
[18:47] <towolf> ubitux, do you know how in photoshop and gimp you can blend layers with blend mode?
[18:47] <ubitux> nope, not my domain at all
[18:47] <ubitux> but surely some ppl here might have a hint
[18:47] <ubitux> i have to leave, have fun
[18:48] <towolf> ubitux, thanks though. black is a good idea
[18:55] <durandal11707> the subtitles have alpha or?
[18:57] <towolf> durandal11707, yes, but alpha is not enough
[18:57] <towolf> now with the line i gave its all green. why?
[18:58] <durandal11707> the thing you actually want is to blend only subtitle stuff and not other stuff
[18:59] <towolf> what other stuff?
[18:59] <durandal11707> all other pixels not part of subtitle
[19:00] <durandal11707> you get green stuff usually if you use yuv colorspace
[19:11] <durandal11707> and most blend modes are for rgb colorspace
[19:13] <towolf> i think i've worked it out. i only apply it to the first component, that's luma, right? and then with multiply it gives a pretty good result.
[19:16] <durandal11707> well, if you don't want to modify hue/saturation, but just brightness/luma then yes
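For reference, a rough sketch of the variant towolf describes: render the subtitles onto a copy of the frame and blend it back onto the original with multiply on the first (luma) plane only (untested, placeholder file names):
    ffmpeg -i input.mp4 -filter_complex "[0:v]split[a][b];[a]ass=timelapse.ass[s];[s][b]blend=c0_mode=multiply" output.mp4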
[21:13] <first-time-here> hello
[21:13] <Mavrik> olleh.
[21:13] <first-time-here> i cant run this command
[21:14] <first-time-here> ffmpeg -f v4l2 -i /dev/video0 -vcodec rawvideo -delay 20 -f mpegts udp://127.0.0.1:1234
[21:16] <first-time-here> i get this error
[21:16] <first-time-here> http://pastebin.com/aGctH1Uh
[21:19] <first-time-here> Mavrik: you can help me?
[21:21] <Mavrik> 1.) You're not using ffmpeg. You're using the libav project
[21:21] <Mavrik> 2.) Your build has no support for V4L2
[21:22] <first-time-here> but the command is "ffmpeg"
[21:22] <Hans_Henrik> yeah, and your distro maintainer likes to lie/troll you
[21:23] <first-time-here> what do you mean?
[21:23] <first-time-here> i'm trying to work with this guide
[21:23] <first-time-here> http://ffmpeg.org/trac/ffmpeg/wiki/How%20to%20capture%20a%20lightning%20%28thunderbolt%29%20with%20FFmpeg
[21:23] <Hans_Henrik> first-time-here, ah i know at least in debian, they use some libav-binary, and call it "ffmpeg"
[21:24] <first-time-here> i'm working with ubuntu 12.04 LTS
[21:24] <durandal11707> and that old libav's avconv does not have support for v4l2
[21:25] <first-time-here> and what can i do to solve the problem?
[22:36] <yo_mama> help!
[22:55] <vade> does FFMPEG's concat protocol do something that MP4Box doesn't, with respect to VBR files that are not of a constant FPS? I ask because MP4Box apparently has some issues concatenating some files in some orders (and the same files may work in different orders), whereas FFMPEG appears to be ok with concatenation.
[22:55] <anddam> hi, what codec can I use in an avi container to replace AC-3?
[22:55] <anddam> audio codec*
[23:12] <anddam> mp3, aac
[23:12] <anddam> bye
[23:12] <pirea> is ffmpeg supporting the raspberry pi gpu?
[23:22] <p4plus2> Would anybody have a guess as to why "ffmpeg -an -f x11grab -r 30 ...." records at 32 FPS and eventually after a minute or two settles to 30FPS rather than being a solid 30FPS?
[23:22] <p4plus2> I mean it's not a big deal, but I would be curious why it happens when, as far as I know, -r 30 should cap ffmpeg to 30 FPS.
[23:42] <maep> hi. is it possible to have avformat_open_input reject files with a low detection score?
[23:48] <Mavrik> hmm
[23:54] <maep> i thought about redirecting stderr and parsing the output :)
[23:58] <durandal11707> maep: isn't score reported anyway, you would just need to ignore such files...
[00:00] --- Sat May 25 2013

