README.LTU
This is version 2.5 of Statistics::LTU, a module for Linear Threshold Units.

A linear threshold unit is a 1-layer neural network, also called a
perceptron. LTUs are used to learn classifications from examples. An LTU
learns to distinguish between two classes based on the data given to it.
After training on a number of examples, the LTU can then be used to
classify new (unseen) examples.

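To make the idea concrete, here is a minimal sketch of perceptron training
and classification. It is written in Python purely for illustration; the
function names are invented for this example and are independent of this
module's Perl API:

```python
def classify(weights, x):
    """Return +1 or -1 depending on which side of the hyperplane x falls.
    An implicit 1 is appended to x so the last weight acts as a threshold."""
    s = sum(w * xi for w, xi in zip(weights, x + [1.0]))
    return 1 if s >= 0 else -1

def train(examples, n_features, epochs=100):
    """examples: list of (features, label) pairs, with label in {+1, -1}.
    Classic perceptron rule: nudge the weights on every misclassification."""
    weights = [0.0] * (n_features + 1)          # +1 for the threshold weight
    for _ in range(epochs):
        errors = 0
        for x, label in examples:
            if classify(weights, x) != label:   # wrong side of the boundary
                weights = [w + label * xi
                           for w, xi in zip(weights, x + [1.0])]
                errors += 1
        if errors == 0:                         # all examples separated
            break
    return weights

# Two classes separated by the line x0 = x1:
data = [([2.0, 1.0], 1), ([1.0, 3.0], -1), ([3.0, 0.0], 1), ([0.0, 2.0], -1)]
w = train(data, n_features=2)
print([classify(w, x) for x, _ in data])   # reproduces the training labels
```

After training, classify can be applied to feature vectors the LTU has
never seen, which is how the classes in this module are meant to be used.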
LTU.pm defines an (uninstantiable) base class, LTU, and four instantiable
classes built on top of it. Each of the four classes uses a different
training method: ACR (the absolute correction rule), TACR (a thermally
annealed version of the absolute correction rule), LMS (the
least-mean-squares rule) and RLS (the recursive least-squares rule). See
ltu.doc for further information on these; you can use LTUs without
understanding exactly how they work.
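The authoritative definitions of these rules are in LTU.pm and ltu.doc; the
following Python fragment sketches the textbook forms of two of them (the
function names and the small step-size constant are invented for the
example):

```python
def acr_update(w, x, y):
    """Absolute correction rule: when example x (label y, +1 or -1) is
    misclassified, step just far enough along x that it ends up on the
    correct side of the hyperplane."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    if (1 if s >= 0 else -1) == y:
        return w                                        # already correct
    step = abs(s) / sum(xi * xi for xi in x) + 1e-6     # just past the boundary
    return [wi + y * step * xi for wi, xi in zip(w, x)]

def lms_update(w, x, y, mu=0.1):
    """Least-mean-squares (Widrow-Hoff) rule: take a small gradient step
    that moves the output w.x toward the target y."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + mu * (y - s) * xi for wi, xi in zip(w, x)]
```

Roughly speaking, TACR scales the ACR correction by an annealed temperature
factor, and RLS maintains a running least-squares solution rather than
taking individual gradient steps.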


REQUIREMENTS

Statistics::LTU needs Perl version 5, as it is object-oriented and uses
references extensively.


INSTALLATION

Run the following:
    perl Makefile.PL
    make
    make install

LTU.pm has some tests at the end; try running "perl LTU.pm". Note that
this creates four LTU files with ".saved" extensions, which can be deleted
after the tests.

Note: Depending on the version of ExtUtils::MakeMaker you have, you may
get an error from ld when you run make. I don't know how to prevent this.
The Makefile is just used to copy LTU.pm, LTU.pod and weather.pl into the
Statistics subdirectory of your Perl library directory.

Then you probably want to:
    make clean


FILES

ltu.doc has some useful information on LTUs, though it was written for the
    C version.

weather.perl is a simple demo showing how examples are created, and how
    LTUs are trained and tested. It should be more instructive than
    LTU.man. Run "perl weather.perl | more".

The code itself is heavily documented.


AUTHOR / SUGGESTIONS / BUG MAGNET

Tom Fawcett (fawcett@nynexst.com).

LTU.pm is based on LTU.C, an implementation of LTUs written by James Callan
at the University of Massachusetts. I've tried to Perlify and objectify
the code completely, but some awkwardness remains.
64