FFmpeg is a great Swiss-army tool for video manipulation. I recently needed to use ffmpeg and ffserver to pass a video feed, constructed from static images, to a remote server for wider redistribution. Searching Google, the top hit for “piping rgb data to ffmpeg” is this article from Kyle Cordes, which is several years old. In it, Kyle details his efforts in piping raw data into ffmpeg, and the requisite conversion from RGB to the yuv4mpegpipe format.
There has been considerable progress on ffmpeg since that article was written, and piping data into ffmpeg is now relatively easy. First, create a named pipe:
$ mkfifo image_pipe
Then, using your own code, write the data to the pipe in RGB tuples:
...
std::ofstream f("image_pipe", std::ios::binary);
for (std::vector<image>::iterator it = images.begin(); it != images.end(); ++it)
    f.write(it->m_data, it->width * it->height * 3);  // 3 bytes per RGB pixel
...
Then, invoke ffmpeg with the command:
$ ffmpeg -f rawvideo -pix_fmt rgb24 -s 512x384 -y -an -i ./image_pipe http://localhost:8090/image_feed.ffm
which will stream the images directly to your server for redistribution.