Name                Date         Size       #Lines   LOC
../                 03-May-2022  -          -        -
aenc/               03-May-2022  -          9,422    7,467
debian/             03-May-2022  -          1,160    964
docs/               03-May-2022  -          12,342   10,711
lavtools/           03-May-2022  -          33,125   23,786
mpeg2enc/           03-May-2022  -          23,107   14,202
mplex/              03-May-2022  -          11,971   7,994
scripts/            03-May-2022  -          4,369    3,060
utils/              03-May-2022  -          25,483   16,505
y4mdenoise/         03-May-2022  -          21,651   12,155
y4mscaler/          03-May-2022  -          8,656    6,416
y4munsharp/         03-May-2022  -          1,271    1,009
y4mutils/           03-May-2022  -          5,323    4,130
yuvcorrect/         03-May-2022  -          3,891    3,108
yuvdeinterlace/     03-May-2022  -          1,733    1,384
yuvdenoise/         03-May-2022  -          2,342    1,825
yuvfilters/         03-May-2022  -          2,789    2,340
yuvscaler/          03-May-2022  -          4,780    3,612
AUTHORS             02-Aug-2013  647        17       16
BUGS                14-May-2011  1.8 KiB    41       31
CHANGES             28-Sep-2000  1.5 KiB    56       33
COPYING             25-Jul-2004  17.6 KiB   341      281
ChangeLog           14-May-2011  6.1 KiB    215      131
HINTS               15-Nov-2003  7 KiB      170      126
INSTALL             21-Sep-2013  15.4 KiB   371      289
INSTALL.real        29-Jul-2006  3 KiB      80       61
Makefile.am         18-Nov-2012  1.8 KiB    93       70
Makefile.in         03-May-2022  29.7 KiB   982      866
NEWS                08-Sep-2006  62         2        1
PLANS               13-Aug-2004  1.3 KiB    46       28
README              19-Sep-2013  6.3 KiB    152      112
README.AltiVec      09-Dec-2003  1.7 KiB    46       31
README.DV           24-Apr-2002  4.7 KiB    111      84
README.avilib       26-Dec-2004  6.7 KiB    221      138
README.glav         05-Feb-2002  1.5 KiB    47       43
README.lavpipe      09-Aug-2004  10.2 KiB   258      211
README.transist     15-Nov-2003  4 KiB      100      84
TODO                18-Apr-2007  1.6 KiB    49       32
acinclude.m4        08-Aug-2004  36.3 KiB   994      883
aclocal.m4          21-Sep-2013  346.2 KiB  9,854    8,863
autogen.sh          30-Aug-2004  83         5        1
compile             21-Sep-2013  7.1 KiB    343      253
config.guess        21-Sep-2013  44.1 KiB   1,541    1,330
config.h.in         21-Sep-2013  5.3 KiB    218      156
config.sub          21-Sep-2013  34.7 KiB   1,780    1,637
configure           03-May-2022  659.7 KiB  22,075   18,773
configure.ac        21-Sep-2013  21.5 KiB   634      567
cpuinfo.sh          31-Aug-2009  7.9 KiB    347      292
depcomp             21-Sep-2013  20.4 KiB   708      460
install-sh          21-Sep-2013  13.7 KiB   528      351
ltmain.sh           21-Sep-2013  276.8 KiB  9,656    7,304
missing             21-Sep-2013  9.9 KiB    331      243
mjpegtools.pc       21-Sep-2013  316        11       9
mjpegtools.pc.in    15-Aug-2005  324        11       9
mjpegtools.spec     21-Sep-2013  4.9 KiB    168      134
mjpegtools.spec.in  21-Sep-2013  4.9 KiB    168      134

README

Documentation Files
===================

INSTALL - How to install this package

lavtools: Linux Audio and Video TOOLS for Motion JPEG and MPEG
==============================================================

Programs for MJPEG recording and playback and simple cut-and-paste
editing and MPEG compression of audio and video under Linux.

N.b. Only the "lav" programs have been written wholly from scratch.
The rest are from diverse open source originals, modified to work with
the lav tools edit lists and AVI and Quicktime files.  Some
(especially the MPEG tools) have also had major performance and
functionality enhancements.

* The lavtools, xlav and utils directories

The latest and greatest versions of Rainer Johanni's original
lavtools.  Plus some extra goodies to link them to the MPEG
compression tools.

- "lavrec" is a program to record AVI and Quicktime MJPEG files with
  the Iomega BUZ etc., including sound recording from the soundcard.

- "lavplay" plays back AVI and Quicktime MJPEG files or "xlav" produced
  edit files with the Iomega BUZ etc. (including sound).

- "xlav" in the directory xlav is a GUI for lavplay which permits
  fast forward/reverse, single frame stepping and simple non-destructive
  editing.  Instead of writing new AVI or Quicktime files, xlav creates an
  "edit list" file with pointers to the relevant bits of the original files.
  The lav tools and MPEG encoding tools can all work directly with these files
  as well as Quicktime and AVI files.  Since these files have a simple plain-text
  format they can easily be edited by hand or used by other tools.

- "lavtrans" (in directory utils) converts a mixture of AVI files,
  Quicktime files or edit lists into a single AVI or Quicktime file,
  into single images or into WAV files.

- "lavaddwav" (in directory utils) lets you add another soundtrack
  (a WAV file) to an AVI/Quicktime file.

- "lavvideo" is a small test application for the video overlay
  capabilities of your V4L hardware device.

- "lav2yuv" decodes AVI and Quicktime MJPEG files and edit lists and
  writes the video to standard output in the simple raw YUV format
  expected by the MPEG compressor "mpeg2enc" (see below).

- "lav2wav" decodes the audio from AVI and Quicktime MJPEG files and
  edit lists and writes it to standard output in the WAV format expected
  by the MPEG layer 2 audio compressor "mp2enc" (see below).  If you
  output to a pipe it simply sticks a maximum length in the header.
  Most programs processing streams will work fine with this, ending
  cleanly when they reach the actual end.

- "lav2dfilter" is a median filter for noise reduction.  Useful when MPEG
  encoding TV material.

- "v4l-conf" is used in X to set up your video framebuffer address.

- "yuv2lav", "ypipe" and "transist.flt" are used to first decode an MJPEG
  input file into raw YUV data (lav2yuv), put the data of more than
  one YUV stream together (ypipe), make a transition between the two
  input streams (transist.flt), and build an MJPEG file out of the
  resulting YUV data (yuv2lav).  Read README.transist for more info on
  how to use this.

* The scripts directory

Turning a video stream captured as MJPEG AVIs or Quicktime files
using a capture card into MPEG, or transcoding an MPEG-2 file,
involves several steps and multiple programs.  This directory
contains several useful shell scripts that automagically
do the right thing for common encoding tasks.  They're also
intended as examples of typical uses.


* The mpeg2enc directory

- "mpeg2enc" for encoding the output of lav2yuv or suitable
  output plugins in mpeg2dec to MPEG-1/2 video streams.

See the README for further details.
In the current beta release MPEG-2 is untested and variable
bit-rate probably won't work.  HOWEVER: speed and quality for
MPEG-1 are really rather good.  Encoding 352x288 PAL runs at 18
fps on a 700MHz Duron or 11 fps on a 450MHz P-III.

* The aenc directory

Contains the source files for "mp2enc", the MPEG-1 layer 2 audio
compressor.  It is not particularly good as encoders go, but is
included for simplicity and completeness.  It also has the virtue
that for transcoding applications (e.g. AC3 to mp2) it can do
sampling rate conversions.  You need this to compress audio.


In general, however, you'd be better off with the faster and more
capable "toolame" encoder.  For transcoding applications the good old
stand-by "sox" will convert sampling rates very nicely (albeit it'll
print a few warnings about incorrect headers if you use it in a
pipeline).

You may wish to try using your favourite MPEG-1 layer 3 "MP3"
encoder for better quality results.  However, make sure you
use constant bit-rate and turn off any extensions to the format.
I have no idea how many players actually cope with layer 3.  Drop
andrew.stevens@nexgo.de an email if you get it to work!

* The mplex directory

Contains the source files for the "mplex" multiplexer.
This multiplexes (interleaves) MPEG-1/2 video and audio streams
into a combined "system stream" that can be played back smoothly.
This is *not* quite as trivial a task as it might seem (see the
original author's paper on the subject - a copy is in the
documentation).  Note that the program has been pretty heavily
modified since then.
MPEG-2 multiplexing is implemented but is currently untested.
DVD VOB multiplexing (AC3 audio in addition to MPEG) is not implemented.


See the READMEs of the various programs for further details
of authorship, usage, and implementation/compilation details.


Attention: lavplay is mainly intended to play back files created by
lavrec and should also be able to play back MJPEG AVI files created
from the Iomega BUZ under Win98 or the xawtv capture tool under Linux.
The vast majority of AVI/Quicktime files will not be played by
lavplay!!!  (see http://dix.euro.ru for such codecs)  The reason is
that lavplay only handles AVIs that use an MJPG codec; all other files
don't use that codec and can therefore not be played back by the BUZ.

* Contact

You can reach us by email.

If you have questions, remarks, problems or you just want to contact
the developers, the main mailing list for the MJPEG-tools is:
  mjpeg-users@lists.sourceforge.net

Although little bits have been done by everyone, the main work was
roughly as follows:

lav* : Rainer Johanni and Gernot Ziegler <gz@lysator.liu.se>
mpeg2enc mplex : Andrew.Stevens@nexgo.de
libmjpeg: Gernot Ziegler <gz@lysator.liu.se>

README.AltiVec

AltiVec optimized library for MJPEG tools MPEG-1/2 Video Encoder
-----------------------------------------------------------------------------

Copyright (C) 2002  James Klicman <james@klicman.org>

Performance statistics for these optimizations are available at
http://klicman.org/altivec/.


Platform specific comments:

    Linux:

        AltiVec enabled compilers are now widely distributed.  However,
        the AltiVec support in the GNU GCC 3.1, 3.2 and even the latest
        3.3.1 release all have known bugs which cause invalid code or
        compile failures.  It is likely that release GCC 3.3.2 will
        be a functional compiler for this AltiVec code.  Until then,
        gcc 2.96.4 with the Motorola AltiVec patches works just fine
        and may even provide better AltiVec instruction scheduling.
        There are AltiVec enabled GCC compilers maintained by Kaoru
        Fukui at ftp://ppc.linux.or.jp/pub/users/fukui/.

        You will also need a recent binutils package.  I've verified that
        binutils-2.12.90.0.7 works.  It is also the recommended version
        for GCC 3.1.  There may also be suitable binutils packages at
        Kaoru's FTP site.

    Mac OS X:

        If you have the developer tools, you're all set.


General comments:

    Additional performance can be gained by setting the -mcpu option.  For
    GCC 2.95.* use -mcpu=750, for GCC 3.1 and higher try -mcpu=7400 or
    -mcpu=7450.

    This option can be added during configuration by setting the CFLAGS
    environment variable.  Since the encoder (mpeg2enc) and multiplexor
    (mplex) are now written in C++ the CXXFLAGS environment variable should
    also be set.  For example:

    env CFLAGS="-O2 -mcpu=750" CXXFLAGS="-O2 -mcpu=750" ./configure

README.DV


    MJPEGtools v1.6.0 and onwards, when compiled with libdv, support DV video
    streams, stored as "Type 2 AVI" files.


Building and installation (if you want to compile from source):

    1. Download and install libdv from libdv.sourceforge.net.  (v0.9 and up
       are known to work.)

    2. Download and unpack the v1.6.0 source tarball, or get a clean CVS
       checkout of the development branch of mjpeg_play.

    3. configure && make && make install in mjpeg_play
       This should automatically detect libdv if it is installed.


Grabbing DV input:

    0. You should already have a DV camcorder and an IEEE-1394 board and
       everything set up for 1394 capture, otherwise have a look at:
           http://linux1394.sourceforge.net/
           http://www.schirmacher.de/arne/dvgrab/

       Two programs that I found very useful for resolving problems and
       controlling the camcorder through IEEE 1394 are:
            dvcont - http://www.spectsoft.com/idi/dvcont/
          gscanbus - http://www.ivistar.de/0500opensource.php3?lang=en

       If your camcorder allows you to record either 12bit or 16bit audio you
       should make sure to set it to 16bit recording since that is the sound
       format expected by the MJPEGtools.  Some camcorders have 12bit sound
       as the default factory setting, so check that *before* you start
       recording your videos.

    1. Start replay on the camcorder.

    2. Use dvgrab with the option "--format dv2" to capture the DV data in AVI
       'Type 2' format.  Only this format is supported by lav2yuv; capturing
       with the default '--format dv1' won't work.

    3. If you want to preview and edit (simple cut and paste editing) the raw
       DV files you can use kino from:
           http://www.schirmacher.de/arne/kino/

       Make sure to set dv2 format for saving in the preferences.


Using lav2yuv with DV:

    Simply follow all other documentation for "lav2yuv"; your Type 2 DV AVI
    file will be treated similarly to an MJPEG AVI file.

    For example:

          lav2yuv my-dv-file.avi | yuvplay

    * lav2yuv accepts DV AVI files directly or as parts of editlists.
    * Both PAL and NTSC streams are handled.
    * Sample aspect ratio is automatically detected.  Interlacing is always
      set to bottom-field-first (because DV is always bottom-field-first).
    * You can extract the sound from the DV files using either "lav2wav" or
      "lavtrans -f w".  Once extracted to a WAV file, you can encode it to
      MPEG Layer II audio with "mp2enc", which is capable of resampling the
      48kHz stream to 44.1kHz.  (Some hardware VCD and SVCD players will
      accept 48kHz audio, however this is non-standard.)




------------------------------------------------------------------------------

Additional Notes (of purely HISTORICAL value)
    1. The recent version of lav2yuv can also read raw YUV data from
       quicktime files that were written in planar YUV 4:2:0 format.  This is
       one of the output file formats offered by bcast2000 for rendering
       scenes.
    2. This is a short description of the de-interlacing algorithm taken from
       the source file DI_TwoFrame.c from deinterlace.sourceforge.net.  This
       algorithm is a combination of simpler algorithms found in Windows
       programs like flaskMPEG or AVIsynth.

///////////////////////////////////////////////////////////////////////////////
// Deinterlace the latest field, attempting to weave wherever it won't cause
// visible artifacts.
//
// The data from the most recently captured field is always copied to the
// overlay verbatim.  For the data from the previous field, the following
// algorithm is applied to each pixel.
//
// We use the following notation for the top, middle, and bottom pixels
// of concern:
//
// Field 1 | Field 2 | Field 3 | Field 4 |
//         |   T0    |         |   T1    | scanline we copied in last iteration
//   M0    |         |    M1   |         | intermediate scanline from alternate field
//         |   B0    |         |   B1    | scanline we just copied
//
// We will weave M1 into the image if any of the following is true:
//   - M1 is similar to either B1 or T1.  This indicates that no weave
//     artifacts would be visible.  The SpatialTolerance setting controls
//     how far apart the luminances can be before pixels are considered
//     non-similar.
//   - T1 and B1 and M1 are old.  In that case any weave artifact that
//     appears isn't due to fast motion, since it was there in the previous
//     frame too.  By "old" I mean similar to their counterparts in the
//     previous frame; TemporalTolerance controls the maximum squared
//     luminance difference above which a pixel is considered "new".
//
///////////////////////////////////////////////////////////////////////////////
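For readers who prefer code to prose, the per-pixel decision described in the
comment above can be sketched roughly as follows.  This is an illustrative
reconstruction, not code taken from DI_TwoFrame.c; spatial_tolerance and
temporal_tolerance stand in for the SpatialTolerance and TemporalTolerance
settings mentioned there.

   /* Illustrative sketch only - not taken from DI_TwoFrame.c. */
   #include <stdlib.h>

   static int similar(int a, int b, int spatial_tolerance)
   {
       return abs(a - b) <= spatial_tolerance;
   }

   static int is_old(int cur, int prev, int temporal_tolerance)
   {
       int d = cur - prev;                  /* squared luminance difference */
       return d * d <= temporal_tolerance;
   }

   /* Returns non-zero if M1 should be woven in; otherwise the caller
      interpolates between T1 and B1 instead. */
   static int weave_pixel(int t0, int m0, int b0,
                          int t1, int m1, int b1,
                          int spatial_tolerance, int temporal_tolerance)
   {
       if (similar(m1, b1, spatial_tolerance) || similar(m1, t1, spatial_tolerance))
           return 1;   /* no visible weave artifact */
       if (is_old(t1, t0, temporal_tolerance) &&
           is_old(b1, b0, temporal_tolerance) &&
           is_old(m1, m0, temporal_tolerance))
           return 1;   /* any artifact was already in the previous frame */
       return 0;
   }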

README.avilib

avilib: Reading and writing avi files
=====================================

Copyright (C) 1999 Rainer Johanni <Rainer@Johanni.de>

avilib is an open source library for dealing with AVI
files under Linux or other UNIX operating systems.

It provides a framework for extracting or adding raw
audio and single raw (= compressed) frames from/to AVI files.

It does not deal with any compression issues, which have to be
handled on a higher level by the user of avilib.

AVI files may have several video and audio tracks.

avilib writes only one video track and (optionally) one
audio track and also extracts only the first video and audio
track (but input files may contain more than one track, the others
just being ignored).

The interface to avilib is kept similar to the quicktime4linux interface
(by Adam Williams) with the following important differences:

- since only the first track of video and audio is considered,
  there is no track argument in any of the routines.

- audio is generally considered as a byte stream and therefore
  all size arguments used in reading/writing audio are in bytes
  and not in samples.

- as mentioned above, there are no routines dealing with compression issues.


Compiling:
==========

Since the library consists of only one C source file, I have not provided
a Makefile or similar, just compile with

cc -c <your favorite options> avilib.c


Portability:
============

AVI files use little endian numbers throughout the file.  I have tried
to read/write these numbers in a way which doesn't depend on endianness,
so this library should also be usable on big endian machines.
This feature is not so heavily tested, however.


Usage:
======

Basics, opening, closing
------------------------

Include "avilib.h" in your source and declare a pointer:

   avi_t *avifile;

Open the AVI file with:

   avifile = AVI_open_input_file("xxx.avi",1);

or

   avifile = AVI_open_output_file("xxx.avi");

You may either only read from the input file (leaving it unchanged)
or create a completely new AVI file.  There is no editing or append
mode available.

Both routines will either return a pointer to avi_t or a zero pointer
in the case of an error.

For closing the file, use:

   int  AVI_close(avi_t *AVI);

Files you have written MUST be closed (the header is written at close time),
else they will not be readable by any other software.

Files opened for reading should be closed to free the file descriptor
and some data (unless your program is finishing anyway).


Error handling:
---------------

Most routines (besides open/close) will return 0 or a useful number if successful
and -1 in the case of an error.  If an error occurred, the external variable
AVI_errno is set.  See avilib.h for the meaning of the error codes in AVI_errno.

There is also a routine (which acts like strerror) to retrieve a string
description of the last error (which can then be logged or printed):

AVI_strerror(char *str)


Reading from an AVI file:
-------------------------

After opening the file, you can obtain the parameters of the AVI
with the following routines:

long AVI_video_frames(avi_t *AVI);
   number of video frames in the file

int  AVI_video_width(avi_t *AVI);
int  AVI_video_height(avi_t *AVI);
   width and height of the video in pixels

double AVI_frame_rate(avi_t *AVI);
   frame rate in frames per second, notice that this is a double value!

char* AVI_video_compressor(avi_t *AVI);
   string describing the compressor

int  AVI_audio_channels(avi_t *AVI);
   number of audio channels, 1 for mono, 2 for stereo, 0 if no audio present

int  AVI_audio_bits(avi_t *AVI);
   audio bits, usually 8 or 16

int  AVI_audio_format(avi_t *AVI);
   audio format, most common is 1 for raw PCM, look into avilib.h for others

long AVI_audio_rate(avi_t *AVI);
   audio rate in samples/second

long AVI_audio_bytes(avi_t *AVI);
   total number of audio bytes in the file


In order to read the video frame by frame, use
(frame numbers are starting from 0 !!!!!)

long AVI_frame_size(avi_t *AVI, long frame);
   to get the size of the frame with number "frame"

long AVI_read_frame(avi_t *AVI, char *vidbuf);
   to read the next frame (the frame position is advanced by 1 after the read)

int  AVI_seek_start(avi_t *AVI);
int  AVI_set_video_position(avi_t *AVI, long frame);
   to position in the AVI file
   (for reading the frames out of order)


Read audio with

int  AVI_set_audio_position(avi_t *AVI, long byte);
   to position to an arbitrary byte position within the audio stream

long AVI_read_audio(avi_t *AVI, char *audbuf, long bytes);
   to actually read "bytes" number of audio bytes.
   The audio position is advanced by "bytes", so there is no
   need to reposition before every call when reading in order.

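As a small worked example, the following program (a sketch, not part of
avilib itself; error handling is kept to a minimum) opens a file, prints its
parameters and reads every frame, using only the calls described above:

   #include <stdio.h>
   #include <stdlib.h>
   #include "avilib.h"

   int main(int argc, char **argv)
   {
      avi_t *avi;
      long nframes, i;

      if (argc < 2) { fprintf(stderr, "usage: %s file.avi\n", argv[0]); return 1; }

      avi = AVI_open_input_file(argv[1], 1);   /* 1 = read/build the index */
      if (avi == 0) { fprintf(stderr, "cannot open %s\n", argv[1]); return 1; }

      printf("%dx%d, %.3f fps, %ld frames, compressor %s\n",
             AVI_video_width(avi), AVI_video_height(avi),
             AVI_frame_rate(avi), AVI_video_frames(avi),
             AVI_video_compressor(avi));

      nframes = AVI_video_frames(avi);
      for (i = 0; i < nframes; i++) {
         long size = AVI_frame_size(avi, i);      /* size of frame i in bytes */
         char *vidbuf = malloc(size > 0 ? size : 1);
         if (AVI_read_frame(avi, vidbuf) < 0) {   /* reads the frame at the current position */
            free(vidbuf);
            break;
         }
         /* ... hand the (compressed) frame data to a decoder here ... */
         free(vidbuf);
      }

      AVI_close(avi);
      return 0;
   }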
Avoiding lengthy index searches:
--------------------------------

When opening the AVI file, avilib looks if the file has an index attached,
and if this is not the case, it creates one by reading through the whole file.

If you only want to read through the file once, creating an index is
not necessary.  You may use AVI_open_input_file with the second
argument set to 0 and then use AVI_read_data for reading through the file.

Look at the source for the arguments of AVI_read_data.


Writing to an AVI file:
-----------------------

After you have opened the file, use the following routines to set
the properties of the AVI file:

void AVI_set_video(avi_t *AVI, int width, int height, double fps, char *compressor);
void AVI_set_audio(avi_t *AVI, int channels, long rate, int bits, int format);

with:

width, height   width and height of the video in pixels

fps             frame rate in frames per second, notice that this is a double value!

compressor      string describing the compressor

channels        number of audio channels, 1 for mono, 2 for stereo, 0 if no audio present

rate            audio rate in samples/second

bits            audio bits, usually 8 or 16, 0 if no audio present

format          audio format, most common is 1 for raw PCM, look into avilib.h for others


To write video frames or audio, use:

int  AVI_write_frame(avi_t *AVI, char *data, long bytes);
int  AVI_write_audio(avi_t *AVI, char *data, long bytes);

There is also a feature to duplicate the index entry of the last
frame without writing the data again to the file.  This should be
used with care since I don't know if all AVI players can handle
the resulting file (xanim can do it!):

int  AVI_dup_frame(avi_t *AVI);

AVI files have a 2 GB limit (as has the Linux ext2 file system), so
avilib will return an error if you try to add more data to the file
(and it makes sure that the file can still be correctly closed).
If you want to check yourself how far you are away from that limit
(for example to synchronize the amount of audio and video data) use:

long AVI_bytes_remain(avi_t *AVI);

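And a corresponding sketch for the writing side (again not part of avilib
itself): it simply copies the video track of an existing MJPEG AVI into a
new file, without writing an audio track.

   #include <stdio.h>
   #include <stdlib.h>
   #include "avilib.h"

   int main(int argc, char **argv)
   {
      avi_t *in, *out;
      long i, n;

      if (argc < 3) { fprintf(stderr, "usage: %s in.avi out.avi\n", argv[0]); return 1; }

      in  = AVI_open_input_file(argv[1], 1);
      out = AVI_open_output_file(argv[2]);
      if (in == 0 || out == 0) { fprintf(stderr, "open failed\n"); return 1; }

      /* Describe the output video track; AVI_set_audio is skipped because
         no audio track is written in this example. */
      AVI_set_video(out, AVI_video_width(in), AVI_video_height(in),
                    AVI_frame_rate(in), AVI_video_compressor(in));

      n = AVI_video_frames(in);
      for (i = 0; i < n; i++) {
         long size = AVI_frame_size(in, i);
         char *buf = malloc(size > 0 ? size : 1);
         if (AVI_read_frame(in, buf) < 0 ||
             AVI_write_frame(out, buf, size) < 0) {  /* fails near the 2 GB limit */
            free(buf);
            break;
         }
         free(buf);
      }

      AVI_close(out);   /* files you have written MUST be closed */
      AVI_close(in);
      return 0;
   }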

README.glav

You should be able to just "make".

There are some hacks so I didn't have to include anything from the mjpeg utils
directory.  I hope to get it into the build tree sometime (soon...).

Seconds are NTSC seconds (30 frames).
Key Bindings:  (loosely based on vi key bindings)
   (            set Selection Start
   Home         "
   )            set Selection End
   End          "
   0            go to beginning
   h,<- + CTRL  "
   9            go to end
   l,-> + CTRL  "
   l,->         forward single frame
   l,-> + SHIFT forward 10 frames
   l,-> + ALT   forward 50 frames
   w            forward 1/2 second (15 frames)
   W            forward 1 second (30 frames)
   h,<-         reverse single frame
   h,<- + SHIFT reverse 10 frames
   h,<- + ALT   reverse 50 frames
   1            forward 5 seconds
   2            forward 10 seconds
   3            forward 15 seconds
   4            forward 20 seconds
   5            forward 25 seconds
   6            forward 30 seconds
   b            reverse 1/2 second (15 frames)
   B            reverse 1 second (30 frames)
   !            reverse 5 seconds
   @            reverse 10 seconds
   #            reverse 15 seconds
   $            reverse 20 seconds
   %            reverse 25 seconds
   ^            reverse 30 seconds
   x,del        cut selection
   y            yank (copy) selection
   p,ins        paste selection
   s,S          stop, pause
   f            play (forward)
   F            fast forward
   r            reverse
   R            fast reverse

README.lavpipe


This README describes the lavpipe tools: how to use them,
and how they work (partially).
The current implementation still is not tested very well, so be
warned (and please report any errors you encounter).

At the moment, there are only two filters for lavpipe,
transist.flt and matteblend.flt, which means that you can
use it only to make simple blending transitions between two
movies or to blend one video over another using a predefined
matte (yuv image or lav movie).  But it is very easy to code
new filters or extend existing ones.  Read on.

Contents:
1. What are the tools?
2. What can we do with it?
3. What else can we do with it?
4. That's all?! How can we make them do more?
_____________________________________________________________________________
                                                                             \
1. What are the tools?                                                       /
____________________________________________________________________________/

The tools that are involved so far are:
 - lav2yuv, which decompresses the given input files (or any portions
            of them) and gives us raw yuv frame streams that can be
            piped through the several tools.
 - lavpipe, which reads a "LAV Pipe List" (e.g. *.pli) file, a
            "recipe" which tells it how to combine several input
            movies using several yuv stream filters.
 - transist.flt and matteblend.flt, all the filters that already exist.
 - yuv2lav or mpeg2enc, which compress the resulting yuv stream
                        into an mjpeg or mpeg file.

_____________________________________________________________________________
                                                                             \
2. So what can we do with it?                                                /
____________________________________________________________________________/

Example one: Make a transition from one movie to another one.

Let's assume that we have got two 352x288 PAL files:
intro.avi (1040 frames) and epilogue.qt (3920 frames).  Now we want
to make a transition between them that lasts for two seconds
(2 sec = 50 frames, as they are PAL movies).
We also have to take care that both share the same dimensions - if
they are of different sizes, we can use lavscale once it is finished.

Our task now is to write a "recipe" for lavpipe and thus tell it how
to do the work.  If we store our work as trans.pli, the final call will
simply be: "lavpipe trans.pli | yuv2lav -o result.avi" (lavpipe
writes a yuv stream to stdout as lav2yuv does).

The first line of trans.pli must be "LAV Pipe List".

The second line contains one single number, the number of input
streams that we will use.  (In our case: "2")

Now for each of the two input streams a line containing the
command that produces the stream is added.  First for intro.avi:
"lav2yuv -o $o -f $n -n 1 intro.avi"
The -o $o and -f $n parameters are necessary as lavpipe somehow
has to inform lav2yuv which frames it should output.  $o will
be replaced by the offset and $n will be replaced by the number
of frames that lavpipe wants lav2yuv to output.  The -n 1
parameter is of course optional, and any other parameters to
lav2yuv could be added.  The second line for epilogue.qt might
look like this: "lav2yuv -o $o -f $n epilogue.qt"

Now follow all the sequences of the Pipe List, each of which
consists of a listing of the input streams used and a command
line for the filter program.

The first sequence will simply reproduce all but the last 50
frames of intro.avi (that is 1040 - 50 = 990 frames).  Its
first line only contains "990", the number of frames.  The
second line is "1", the number of streams used.
The next line contains the index of the stream to use and the
offset (how many frames to skip in the beginning).
In our case both index and offset are 0, so the line would be:
"0 0"
Now we would add the command line of the filter program, but
as we don't want to invoke any filter here, this line only
contains "-", which causes lavpipe to simply output the contents
of the stream.

The second sequence is the actual transition.  So the first
line is "50" (two seconds), the second one "2" (we use both streams).
The following line will be "0 990" (intro.avi will be continued
at frame 990) and then "1 0" follows (epilogue.qt starts with
frame 0).
The next line is the filter command, in our case
"transist.flt -s $o -n $n -o 0 -O 255 -d 50"
The -s $o -n $n parameters correspond to the -o $o -f $n parameters
of lav2yuv, -o 0 means that at the beginning of the transition,
only intro.avi is visible (opacity of epilogue.qt = 0).
As you would have expected, -O 255 means that at the end of
the transition, only epilogue.qt is visible (opacity of
epilogue.qt = 255 = 100%) - the opacity will be linearly
interpolated in between.  And finally -d 50 is the duration
of the transition in frames and should be equal to the
first line (duration in frames) of the sequence in most cases.

The last sequence continues with only epilogue.qt (the last
3870 frames), thus the first line is "3870".  The second line
is "1" (only one stream), then "1 50" follows (epilogue.qt,
beginning with frame 50).  The filter command line is "-" again.

Finally, our Pipe List file should look like this:

--------------------< trans.pli >--------------------------
LAV Pipe List
2
lav2yuv -o $o -f $n -n 1 intro.avi
lav2yuv -o $o -f $n epilogue.qt
990
1
0 0
-
50
2
0 990
1 0
transist.flt -s $o -n $n -o 0 -O 255 -d 50
3870
1
1 50
-
--------------------< end of file >------------------------

Remember the call?  "lavpipe trans.pli | yuv2lav -o result.avi"
should now produce a nice avi file with a nice transition.

_____________________________________________________________________________
                                                                             \
3. And what else can we do with it?                                          /
____________________________________________________________________________/

Example two: Blend one movie over another one, using a third one's luminance
             channel as a matte (alpha channel) for the second one.

matteblend.flt has no parameters until now, as its output is independent of
the actual position in the stream (it only depends on the input frames it is fed).

If you read the first example and have understood the pipe list format, it
will be easy to write a pipe list for this task.  As there still is no
bluescreen.flt filter, and it is very time consuming to build an animated
matte channel for a given input movie by hand, I will only describe how to
blend a static picture (a title or static logo) over a movie.

For this you need your input.avi and a picture with an alpha channel.  Use for
example the GIMP to save the image as plain yuv (pic.yuv) and save its
alpha channel as a grayscale plain yuv (matte.yuv - its chrominance channels
will be ignored).  Of course they must be of the right size.
Now create this simple shell script that will output an infinite yuv stream
that only contains the given plain yuv picture:

--------------------< foreveryuv >-------------------------
#!/bin/sh
echo "YUV4MPEG 352 288 3"
while true
do
       echo "FRAME"
       cat $1
done
--------------------< end of file >------------------------

And write the pipe list:

--------------------< title.pli >--------------------------
LAV Pipe List
3
lav2yuv -o $o -f $n input.avi
foreveryuv pic.yuv
foreveryuv matte.yuv
75
3
0 0
1 0
2 0
matteblend.flt
1000000
1
0 75
-
--------------------< end of file >------------------------

As long as your input.avi is shorter than 1000076 frames,
"lavpipe title.pli | yuv2lav -o result.avi" will output
the whole movie with the given picture blended over it
for the first three seconds.

_____________________________________________________________________________
                                                                             \
4. That's all?! How can we make them do more?                                /
____________________________________________________________________________/

The solution is of course to code new filter programs.  And of course this
is very easy.  I want to note here that the whole system is not very
foolproof at the moment.  So if you feed matteblend.flt the wrong number
of input streams via lavpipe, you will get funny results, if you get
any results at all (without any hint from the programs).  Perhaps this
could be improved by adding additional (optional) parameters to the
YUV4MPEG header line.

A filter program consists of only 4 parts (a complete minimal filter
assembled from these pieces is sketched after the list):

1. Read input parameters (especially -o and -n, if the output is not only
   dependent on the input frames but also on some variable parameters that
   change over time) - optional.

2. Read in and write out the YUV headers, which could look like this:

   int fd_in = 0, fd_out = 1; /* stdin, stdout */
   y4m_stream_info_t istream, ostream;
   y4m_frame_info_t iframe;

   y4m_init_stream_info(&istream);
   y4m_init_frame_info(&iframe);

   if (y4m_read_stream_header (fd_in, &istream) != Y4M_OK)
      exit (1);

   y4m_init_stream_info(&ostream);
   y4m_copy_stream_info(&ostream, &istream);
   y4m_write_stream_header (fd_out, &ostream);

3. Allocate the YUV buffer(s) - one for each input stream and perhaps one
   for the output or an arbitrary number of temporary buffers (no bloated
   code, please ;-) )

   uint8_t *yuv_buffer[3]; /* this is one yuv buffer */
   yuv_buffer[0] = malloc(y4m_si_get_plane_length(&istream, 0)); /* Y' */
   yuv_buffer[1] = malloc(y4m_si_get_plane_length(&istream, 1)); /* Cb */
   yuv_buffer[2] = malloc(y4m_si_get_plane_length(&istream, 2)); /* Cr */

4. The loop - while (number of frames processed) < (-n parameter)

4.1. Read the input frames, one of those for each input stream (e.g. yuv_buffer[123])

   while (y4m_read_frame (fd_in, &istream, &iframe, yuv_buffer) == Y4M_OK)
         {
4.2. Process the input buffers in any way you want.

5. Write out the result:

         y4m_write_frame(fd_out, &ostream, &iframe, yuv_buffer);
         }

6. Clean up:
     y4m_fini_frame_info(&iframe);
     y4m_fini_stream_info(&istream);
     y4m_fini_stream_info(&ostream);

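Putting these pieces together, a complete minimal filter - an identity filter
that simply copies its input stream to its output - might look like the
sketch below.  It is not one of the shipped filters; it assumes the
mjpegtools yuv4mpeg.h header and linking with -lmjpegutils.

   #include <stdlib.h>
   #include <yuv4mpeg.h>

   int main(void)
   {
      int fd_in = 0, fd_out = 1;          /* stdin, stdout */
      y4m_stream_info_t istream, ostream;
      y4m_frame_info_t iframe;
      uint8_t *yuv[3];
      int i;

      y4m_init_stream_info(&istream);
      y4m_init_stream_info(&ostream);
      y4m_init_frame_info(&iframe);

      if (y4m_read_stream_header(fd_in, &istream) != Y4M_OK)
         exit(1);
      y4m_copy_stream_info(&ostream, &istream);
      y4m_write_stream_header(fd_out, &ostream);

      for (i = 0; i < 3; i++)
         yuv[i] = malloc(y4m_si_get_plane_length(&istream, i));

      /* Step 4.2 would go here: modify the planes in yuv[] before writing. */
      while (y4m_read_frame(fd_in, &istream, &iframe, yuv) == Y4M_OK)
         y4m_write_frame(fd_out, &ostream, &iframe, yuv);

      for (i = 0; i < 3; i++)
         free(yuv[i]);
      y4m_fini_frame_info(&iframe);
      y4m_fini_stream_info(&istream);
      y4m_fini_stream_info(&ostream);
      return 0;
   }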
That's all.  You should in any case have a look at the existing filters,
transist.flt.c and matteblend.flt.c.

- pHilipp Zabel <pzabel@gmx.de>

README.transist

This file describes how to make a transition from one video file
to another using the current sample implementations.
Especially ypipe will be replaced (hopefully) soon by a more
capable program.

The programs involved in this process are:

 - lav2yuv, which decompresses the given input files (or any portions
            of them) and gives us raw yuv frame streams that can be
            piped through the several tools.
 - ypipe, which starts two instances of lav2yuv and combines the two
          input streams into one, in which every even frame is from
          the first and every odd frame from the second input.
 - transist.flt, which takes the frame-interlaced stream from
                 ypipe and writes out the final transition.
 - yuv2lav or mpeg2enc, which compress the resulting yuv stream
                        into an mjpeg or mpeg file.

Let's assume this simple scenario: we have two input videos, intro.avi
and epilogue.qt, and want to make intro.avi transition into epilogue.qt
with a duration of one second (that is 25 frames for PAL or 30 frames
for NTSC).

intro.avi and epilogue.qt have to be of the same format regarding
frame rate and image resolution, at the moment.
In this example they are both 352x288 PAL files.  intro.avi contains
250 frames and epilogue.qt is 1000 frames long.

Therefore our output file will contain:
 - the first 225 frames of intro.avi
 - a 25 frame transition containing the last 25 frames of intro.avi
   and the first 25 frames of epilogue.qt
 - the last 975 frames of epilogue.qt

We could get the last 25 frames of intro.avi by calling:
  lav2yuv -o 225 -f 25 intro.avi
(-o 225, the offset, tells lav2yuv to begin with frame # 225
 and -f 25 makes it output 25 frames from there on)
Another possibility is:
  lav2yuv -o -25 intro.avi
(negative offsets are counted from the end)

And the first 25 frames of epilogue.qt:
  lav2yuv -f 25 epilogue.qt
(-o defaults to an offset of zero)

But we need to combine the two streams with ypipe.  So the call would be:
  ypipe "lav2yuv -o 225 -f 25 intro.avi" "lav2yuv -f 25 epilogue.qt"
The output of this is a raw yuv stream that can be fed into
transist.flt.

transist.flt needs to be informed about the duration of the transition
and the opacity of the second stream at the beginning and at the end
of the transition:
 -o num   opacity of second input at the beginning [0-255]
 -O num   opacity of second input at the end [0-255]
 -d num   duration of transition in frames
An opacity of 0 means that the second stream is fully transparent
(only stream one visible), at 255 stream two is fully opaque.
In our case the correct call (transition from stream 1 to stream 2)
would be:
  transist.flt -o 0 -O 255 -d 25
The -s and -n parameters correspond to the -o and -f parameters of lav2yuv
and are only needed if anybody wants to render only a portion of the
transition for whatever reason.  Please note that this only affects
the weighting calculations - none of the input is really skipped, so
that if you pass the skip parameter (-s 30, for example), you also
need to skip the first 30 frames in lav2yuv (-o 30) in order to get
the expected result.  If you didn't understand this, send an email to
the authors or simply ignore -s and -n.
The whole procedure will be automated later, anyway.

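In case the weighting is unclear, here is roughly what the filter computes
per frame and per sample (an illustrative sketch, not the actual transist.flt
source; opacity_start, opacity_end and duration correspond to the -o, -O and
-d options described above):

   #include <stdint.h>

   /* Opacity of stream 2 for frame i of the transition, interpolated
      linearly from opacity_start (-o) to opacity_end (-O) over -d frames. */
   static int opacity_at(int i, int duration, int opacity_start, int opacity_end)
   {
      if (duration <= 1)
         return opacity_end;
      return opacity_start + ((opacity_end - opacity_start) * i) / (duration - 1);
   }

   /* Blend one sample: op = 0 gives pure stream 1, op = 255 pure stream 2. */
   static uint8_t blend(uint8_t a, uint8_t b, int op)
   {
      return (uint8_t)((a * (255 - op) + b * op) / 255);
   }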
Now we want to compress the yuv stream with yuv2lav:
  yuv2lav -f a -q 80 -o transistion.avi
This reads the yuv stream from stdin and outputs an avi file (-f a)
with compressed jpeg frames of quality 80.

Now we have the whole command for creating a transition:

ypipe "lav2yuv -o 225 -f 25 intro.avi" "lav2yuv -f 25 epilogue.qt" | \
transist.flt -o 0 -O 255 -d 25 | yuv2lav -f a -q 80 -o transistion.avi

(This is one line.)  The resulting video can be written as a LAV Edit List,
a plain text file containing the following lines:

LAV Edit List
PAL
3
intro.avi
transistion.avi
epilogue.qt
0 0 224
1 0 24
2 25 999

This file can be fed into xlav or lavplay, or you
can pipe it into mpeg2enc with lav2yuv or combine
the whole stuff into one single mjpeg file with
lavtrans or lav2yuv|yuv2lav.