FFmpeg Documentation
FFmpeg is a very fast video and audio converter. It can also grab from a live audio/video source. The command line interface is designed to be intuitive, in the sense that ffmpeg tries to figure out all the parameters when possible. You usually only have to give the target bitrate you want.
FFmpeg can also convert from any sample rate to any other, and resize video on the fly with a high quality polyphase filter.
FFmpeg can use a video4linux compatible video source and any Open Sound System audio source:
ffmpeg /tmp/out.mpg
Note that you must activate the right video source and channel before launching ffmpeg. You can use any TV viewer, such as xawtv by Gerd Knorr, which I find very good. You must also set the audio recording levels correctly with a standard mixer.
* ffmpeg can use any supported file format and protocol as input:
Examples:
* You can input from YUV files:
ffmpeg -i /tmp/test%d.Y /tmp/out.mpg
It will use the files:
/tmp/test0.Y, /tmp/test0.U, /tmp/test0.V, /tmp/test1.Y, /tmp/test1.U, /tmp/test1.V, etc...
The Y files have twice the resolution of the U and V files. They are raw files, without headers. They can be generated by all decent video decoders. You must specify the size of the image with the '-s' option if ffmpeg cannot guess it.
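The '%d' in the pattern stands for the frame number. As a small sketch (plain shell, not ffmpeg itself), this is how such a pattern expands into the per-frame, per-plane file names listed above:

```shell
# Expand the /tmp/test%d pattern into one .Y, .U and .V file name
# per frame number, the way the image sequence above is laid out.
pattern="/tmp/test%d"
for frame in 0 1 2; do
    base=$(printf "$pattern" "$frame")
    for plane in Y U V; do
        echo "$base.$plane"
    done
done
```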
* You can input from a RAW YUV420P file:
ffmpeg -i /tmp/test.yuv /tmp/out.avi
A raw YUV420P file contains raw planar YUV data: for each frame, the Y plane comes first, followed by the U and V planes, which are at half the horizontal and vertical resolution.
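Because U and V are subsampled by two in both directions, each YUV420P frame occupies width*height bytes for Y plus a quarter of that for each chroma plane, i.e. width*height*3/2 bytes in total. A quick shell check (the 352x288 CIF size here is just an example, not something ffmpeg requires):

```shell
# Bytes per YUV420P frame: Y is full resolution, U and V are each
# a quarter of the pixel count (half width, half height).
width=352 height=288                      # CIF, as an example
y_bytes=$((width * height))
uv_bytes=$((width / 2 * height / 2))      # per chroma plane
frame_bytes=$((y_bytes + 2 * uv_bytes))
echo "$frame_bytes"                       # 352*288*3/2 = 152064
```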
* You can output to a RAW YUV420P file:
ffmpeg -i mydivx.avi hugefile.yuv
* You can set several input files and output files:
ffmpeg -i /tmp/a.wav -s 640x480 -i /tmp/a.yuv /tmp/a.mpg
Converts the audio file a.wav and the raw YUV video file a.yuv to the MPEG file a.mpg.
* You can also do audio and video conversions at the same time:
ffmpeg -i /tmp/a.wav -ar 22050 /tmp/a.mp2
Convert the sample rate of a.wav to 22050 Hz and encode it to MPEG audio.
* You can encode to several formats at the same time and define a mapping from input stream to output streams:
ffmpeg -i /tmp/a.wav -ab 64 /tmp/a.mp2 -ab 128 /tmp/b.mp2 -map 0:0 -map 0:0
Converts a.wav to a.mp2 at 64 kbit/s and to b.mp2 at 128 kbit/s. '-map file:index' specifies which input stream is used for each output stream, in the order the output streams are defined.
* You can transcode decrypted VOBs:
ffmpeg -i snatch_1.vob -f avi -vcodec mpeg4 -b 800 -g 300 -bf 2 -acodec mp3 -ab 128 snatch.avi
This is a typical DVD ripping example: input from a VOB file, output to an AVI file with MPEG-4 video and MP3 audio. Note that in this command we use B frames, so the MPEG-4 stream is DivX5 compatible. The GOP size is 300, which means one INTRA frame every 10 seconds for 29.97 fps input video. Also, the audio stream is MP3 encoded, so you need LAME support, which is enabled by passing --enable-mp3lame to configure. The mapping is particularly useful for DVD transcoding to get the desired audio language.
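The "every 10 seconds" figure above is just the GOP size divided by the frame rate; a quick check (using awk for the floating-point division):

```shell
# Interval between INTRA frames = GOP size / frame rate.
gop=300
fps=29.97
interval=$(awk -v g="$gop" -v r="$fps" 'BEGIN { printf "%.2f", g / r }')
echo "$interval seconds between INTRA frames"   # 300 / 29.97 = 10.01
```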
NOTE: to see the supported input formats, use ffmpeg -formats.
The generic syntax is:
ffmpeg [[options][-i input_file]]... {[options] output_file}...
If no input file is given, audio/video grabbing is done.
As a general rule, options are applied to the next specified file. For example, if you give the '-b 64' option, it sets the video bitrate of the next file. The format option may be needed for raw input files.
By default, ffmpeg tries to convert as losslessly as possible: it uses the same audio and video parameters for the outputs as the ones specified for the inputs.
The hh:mm:ss[.xxx] syntax is also supported for specifying durations.
The output file can be "-" to output to a pipe. This is only possible with mpeg1 and h263 formats.
ffmpeg also handles many protocols specified with the URL syntax.
Use 'ffmpeg -formats' to have a list of the supported protocols.
The http: protocol is currently used only to communicate with ffserver (see the ffserver documentation). When ffmpeg becomes a video player, it will also be used for streaming :-)
For example, the following grabs 10 seconds of QCIF video at 3 frames per second and 50 kbit/s into a RealVideo file:
ffmpeg -g 3 -r 3 -t 10 -b 50 -s qcif -f rv10 /tmp/b.rm
You can use the -formats option to have an exhaustive list.
FFmpeg supports the following file formats through the libavformat library.
Supported File Format | Encoding | Decoding | Comments |
MPEG audio | X | X | |
MPEG1 systems | X | X | muxed audio and video |
MPEG2 PS | X | X | also known as VOB file |
MPEG2 TS | | X | also known as DVB Transport Stream |
ASF | X | X | |
AVI | X | X | |
WAV | X | X | |
Macromedia Flash | X | X | Only embedded audio is decoded |
Real Audio and Video | X | X | |
PGM, YUV, PPM, JPEG images | X | X | |
Animated GIF | X | | Only uncompressed GIFs are generated |
Raw AC3 | X | X | |
Raw MJPEG | X | X | |
Raw MPEG video | X | X | |
Raw PCM8/16 bits, mulaw/Alaw | X | X | |
SUN AU format | X | X | |
Quicktime | | X | |
MPEG4 | | X | MPEG4 is a variant of Quicktime |
Raw MPEG4 video | | X | Only small files are supported. |
DV | | X | Only the video track is decoded. |
X means that the encoding (resp. decoding) is supported.
Supported Codec | Encoding | Decoding | Comments |
MPEG1 video | X | X | |
MPEG2 video | | X | |
MPEG4 | X | X | Also known as DIVX4/5 |
MSMPEG4 V1 | X | X | |
MSMPEG4 V2 | X | X | |
MSMPEG4 V3 | X | X | Also known as DIVX3 |
WMV7 | X | X | |
H263(+) | X | X | Also known as Real Video 1.0 |
MJPEG | X | X | |
DV | | X | |
X means that the encoding (resp. decoding) is supported.
Supported Codec | Encoding | Decoding | Comments |
MPEG audio layer 2 | IX | IX | |
MPEG audio layer 1/3 | IX | IX | MP3 encoding is supported through the external library LAME |
AC3 | IX | X | liba52 is used internally for decoding. |
Vorbis | X | | Encoding is supported through the external library libvorbis. |
X means that the encoding (resp. decoding) is supported.
I means that an integer-only version is also available (ensures highest performance on systems without hardware floating point support).
You can integrate all the source code of the libraries to link them statically, to avoid any version problem. All you need to do is provide a 'config.mak' and a 'config.h' in the parent directory. See the defines generated by ./configure to understand what is needed.
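As a purely hypothetical sketch (the exact variables and defines depend on your ./configure output, and these names are illustrative, not prescribed by ffmpeg), such a 'config.mak' might look like:

```make
# config.mak (sketch): build variables the library Makefiles expect.
CC=gcc
CFLAGS=-O2 -Wall
# ...plus whatever CONFIG_*/HAVE_* settings ./configure would have
# generated for your platform; copy them from a configured tree.
```

The matching 'config.h' would carry the corresponding preprocessor defines.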
You can use libavcodec or libavformat in your commercial program, but any patch you make must be published. The best way to proceed is to send your patches to the ffmpeg mailing list.
ffmpeg is written in ANSI C. GCC extensions are tolerated. The indent size is 4. The TAB character should not be used.
The presentation is the one specified by 'indent -i4 -kr'.
The main priority in ffmpeg is simplicity and small code size (= fewer bugs).
Comments: for functions visible from other modules, use the JavaDoc format (see examples in `libav/utils.c') so that documentation can be generated automatically.
When you submit your patch, try to send a unified diff (diff '-u' option). I cannot read other diffs :-)
Run the regression tests before submitting a patch so that you can verify that there are no big problems.
Unless your patch is really big and adds an important feature, by submitting it to me you implicitly agree to put it under my copyright. I prefer to do this to avoid potential problems if the licensing of ffmpeg changes.
Patches should be posted as base64 encoded attachments (or any other encoding which ensures that the patch won't be trashed during transmission) to the ffmpeg-devel mailing list, see http://lists.sourceforge.net/lists/listinfo/ffmpeg-devel
Before submitting a patch (or committing with CVS), you should at least test that you did not break anything.
The regression tests build a synthetic video stream and a synthetic audio stream. These are then encoded and decoded with all codecs and formats. The CRC (or MD5) of each generated file is recorded in a result file, and a 'diff' is run against the reference results.
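A minimal sketch of this compare-against-reference scheme, with made-up file names (the real tests and their reference files live in the ffmpeg source tree, not here):

```shell
# Record a checksum for each generated file, then diff against the
# stored reference results; any difference means a regression.
tmpdir=$(mktemp -d)
printf 'fake encoded stream' > "$tmpdir/a.mpg"       # stand-in output file
md5sum "$tmpdir/a.mpg" | awk '{ print $1 }' > "$tmpdir/result"
cp "$tmpdir/result" "$tmpdir/reference"              # pretend it was stored earlier
if diff "$tmpdir/result" "$tmpdir/reference" > /dev/null; then
    echo "regression test OK"
fi
```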
Run 'make test' to test all the codecs.
Run 'make libavtest' to test all the formats.
[Of course, some patches may change the regression test results. In this case, the reference results shall be updated accordingly.]
This document was generated on 27 October 2002 using texi2html 1.56k.