[FFmpeg-user] Create video frame by frame using pipes - Can't get FFmpeg to finish up
Günther Schrenk
g.schrenk89 at googlemail.com
Thu Feb 1 02:41:28 EET 2018
Hey. I'm trying to build a video frame by frame by passing the frames to FFmpeg
via a pipe (in this case an anonymous pipe).
Operating System: Windows 10
My language is C++. Trying to follow this Python tutorial :
http://zulko.github.io/blog/2013/09/27/read-and-write-video-frames-in-python-using-ffmpeg/
Everything is working as expected, except the closing of the pipe. I would
assume that when closing my write end of the pipe, FFmpeg should finish the
video and exit. But it doesn't.
I first tried using the command line and feeding STDIN by hand (just typing
some junk values), but I can't get the pipe to close properly with EOF
(Ctrl+Z). Typing EOF multiple times finally closes it and I get an output
video.
My target is to achieve this from inside UnrealEngine. But I can't make FFmpeg
finish up.
So I wrote my own little program that creates gray values for the frames.
The result is the same: FFmpeg does not finish up.
Here is the whole program:
#include "stdafx.h"
//// stdafx.h
//#include "targetver.h"
//#include <stdio.h>
//#include <tchar.h>
//#include <iostream>
#include "Windows.h"
#include <cstdlib>

using namespace std;

bool WritePipe(void* hPipe, const UINT8 *const Buffer, const UINT32 Length)
{
    if (hPipe == nullptr || Buffer == nullptr || Length == 0)
    {
        cout << __FUNCTION__ << ": Some input is useless";
        return false;
    }
    // Write to pipe
    UINT32 BytesWritten = 0;
    UINT8 newline = '\n';
    bool bIsWritten = WriteFile(hPipe, Buffer, Length, (::DWORD*)&BytesWritten, nullptr);
    cout << __FUNCTION__ << " Bytes written to pipe " << BytesWritten << endl;
    //bIsWritten = WriteFile(hPipe, &newline, 1, (::DWORD*)&BytesWritten, nullptr); // Do we need this? Actually this should destroy the image.
    FlushFileBuffers(hPipe); // Do we need this?
    return bIsWritten;
}

#define PIXEL 80 // must be a multiple of 8, otherwise we get the warning: bytes are not aligned

int main()
{
    HANDLE PipeWriteEnd = nullptr;
    HANDLE PipeReadEnd = nullptr;
    {
        // create a pipe for inter-process communication
        SECURITY_ATTRIBUTES Attr = { sizeof(SECURITY_ATTRIBUTES), NULL, true };
        if (!CreatePipe(&PipeReadEnd, &PipeWriteEnd, &Attr, 0))
        {
            cout << "Could not create pipes " << ::GetLastError() << endl;
            system("pause");
            return 0;
        }
    }

    // Set up the variables needed for CreateProcess
    // initialize process attributes
    SECURITY_ATTRIBUTES Attr;
    Attr.nLength = sizeof(SECURITY_ATTRIBUTES);
    Attr.lpSecurityDescriptor = NULL;
    Attr.bInheritHandle = true;

    // initialize process creation flags
    UINT32 CreateFlags = NORMAL_PRIORITY_CLASS;
    CreateFlags |= CREATE_NEW_CONSOLE;

    // initialize window flags
    UINT32 dwFlags = 0;
    UINT16 ShowWindowFlags = SW_HIDE;
    if (PipeWriteEnd != nullptr || PipeReadEnd != nullptr)
    {
        dwFlags |= STARTF_USESTDHANDLES;
    }

    // initialize startup info
    STARTUPINFOA StartupInfo = {
        sizeof(STARTUPINFO),
        NULL, NULL, NULL,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)CW_USEDEFAULT,
        (::DWORD)0, (::DWORD)0, (::DWORD)0,
        (::DWORD)dwFlags,
        ShowWindowFlags,
        0, NULL,
        HANDLE(PipeReadEnd), // hStdInput: FFmpeg reads its input from our pipe
        HANDLE(nullptr),
        HANDLE(nullptr)
    };

    // CreateProcessA may modify the command line, so use a writable buffer
    // (not a pointer to a string literal), and escape the backslashes
    char ffmpegURL[] = "\"FFMPEGDIRECTORY\\ffmpeg.exe\" -y -loglevel verbose -f rawvideo -vcodec rawvideo -framerate 1 -video_size 80x80 -pixel_format rgb24 -i - -vcodec mjpeg -framerate 1/4 -an \"OUTPUTDIR\\sometest.mp4\"";

    // Finally create the process
    PROCESS_INFORMATION ProcInfo;
    if (!CreateProcessA(NULL, ffmpegURL, &Attr, &Attr, true, (::DWORD)CreateFlags, NULL, NULL, &StartupInfo, &ProcInfo))
    {
        cout << "CreateProcess failed " << ::GetLastError() << endl;
    }
    CloseHandle(ProcInfo.hThread); // not sure if this is needed

    system("pause");

    {
        // Create images and write them to the pipe
#define MYARRAYSIZE (PIXEL*PIXEL*3) // each pixel has 3 bytes
        UINT8* Bitmap = new UINT8[MYARRAYSIZE];
        for (INT32 outerLoopIndex = 50; outerLoopIndex >= 0; --outerLoopIndex) // frame loop
        {
            for (INT32 innerLoopIndex = MYARRAYSIZE - 1; innerLoopIndex >= 0; --innerLoopIndex) // create the pixels for each frame
            {
                Bitmap[innerLoopIndex] = (UINT8)(outerLoopIndex * 4); // some gray color
            }
            system("pause");
            if (!WritePipe(PipeWriteEnd, Bitmap, MYARRAYSIZE))
            {
                cout << "Failed writing to pipe" << endl;
            }
        }
        // Done sending images. Tell the other process. IS THIS NEEDED? HOW TO TELL FFmpeg WE ARE DONE?
        UINT8 endOfFile = 0xFF; // EOF = -1 == 1111 1111 for uint8
        if (!WritePipe(PipeWriteEnd, &endOfFile, 1))
        {
            cout << "Failed writing to pipe" << endl;
        }
        delete[] Bitmap; // allocated with new[], so use delete[]
    }

    system("pause");

    // clean stuff up
    // We do not want to destroy the read end of the pipe as that belongs to FFmpeg
    //if (PipeReadEnd != NULL && PipeReadEnd != INVALID_HANDLE_VALUE)
    //{
    //    ::CloseHandle(PipeReadEnd);
    //}
    if (PipeWriteEnd != NULL && PipeWriteEnd != INVALID_HANDLE_VALUE)
    {
        ::CloseHandle(PipeWriteEnd);
    }
    return 0;
}
And here is the whole output from FFmpeg:
ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2
--enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv
--enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame
--enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy
--enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx
--enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265
--enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp
--enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid
--enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth
--enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
[rawvideo @ 000002207d1220a0] max_analyze_duration 5000000 reached at
5000000 microseconds st:0
Input #0, rawvideo, from 'pipe:':
Duration: N/A, start: 0.000000, bitrate: 153 kb/s
Stream #0:0: Video: rawvideo, 1 reference frame (RGB[24] / 0x18424752),
rgb24, 80x80, 153 kb/s, 1 fps, 1 tbr, 1 tbn, 1 tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
[graph 0 input from stream 0:0 @ 000002207d1966e0] w:80 h:80 pixfmt:rgb24
tb:1/1 fr:1/1 sar:0/1 sws_param:flags=2
[auto_scaler_0 @ 000002207d130280] w:iw h:ih flags:'bicubic' interl:0
[format @ 000002207d12fd80] auto-inserting filter 'auto_scaler_0' between
the filter 'Parsed_null_0' and the filter 'format'
[swscaler @ 000002207d19a000] deprecated pixel format used, make sure you
did set range correctly
[auto_scaler_0 @ 000002207d130280] w:80 h:80 fmt:rgb24 sar:0/1 -> w:80 h:80
fmt:yuvj444p sar:0/1 flags:0x4
Output #0, mp4, to 'c:/users/vr3/Documents/Guenni/sometest.mp4':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: mjpeg, 1 reference frame (mp4v / 0x7634706D),
yuvj444p(pc), 80x80, q=2-31, 200 kb/s, 1 fps, 16384 tbn, 1 tbc
Metadata:
encoder : Lavc57.107.100 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 50 fps= 14 q=1.6 size= 0kB time=00:00:49.00 bitrate=
0.0kbits/s speed=13.3x
As you can see from the FFmpeg output, 50 frames are written. Everything is
as it should be. But how do I tell FFmpeg that the pipe is closed and no
more frames will come?
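For completeness, here is how I think the shutdown should look once FFmpeg is the only one holding the read end. The 0xFF sentinel byte in my program is probably unnecessary, since rawvideo has no in-band EOF marker; closing the last write handle should itself be the signal. This is just a fragment against the variables from my program above, and WaitForSingleObject on the process handle is my assumption for how to wait for FFmpeg:

```cpp
// Sketch: signal "no more frames" by closing the parent's write end,
// then wait for FFmpeg to finalize the output file and exit.
#include <Windows.h>

void FinishFFmpeg(HANDLE& PipeWriteEnd, PROCESS_INFORMATION& ProcInfo)
{
    // Closing the LAST write handle makes FFmpeg's read on stdin hit EOF.
    CloseHandle(PipeWriteEnd);
    PipeWriteEnd = nullptr;

    // Give FFmpeg time to flush its buffers and write the container trailer.
    WaitForSingleObject(ProcInfo.hProcess, INFINITE);
    CloseHandle(ProcInfo.hProcess);
}
```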