31X7/M.Sc Neural Computation: Introduction to Neural Networks.

Using a simple perceptron program.

A simple C program implementing a perceptron is available on the WWW; it is linked from the exercises page.

There are two programs:

perc.h, perc.c
the perceptron training program (learns the weights)
runperc.h, runperc.c
a program for testing the trained perceptron.

These are simple C programs.

To compile these programs on the SUN machines, use

cc -o perc perc.c

cc -o runperc runperc.c

and on the HPUX machines (note the use of gcc):

gcc -o perc perc.c

gcc -o runperc runperc.c

The programs implement a simple perceptron with a fixed number of inputs. This number may be altered by editing the line in both perc.h and runperc.h

/* no of input units (not including bias) */

#define NINPUT 3

to reflect the number of input units desired.

The maximum number of patterns is also fixed inside the program:

/* maximum no of patterns */

#define NPATTERN 10

and may also need to be altered.
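For example, the 6-input mapping in exercise (iii) below requires the line in both perc.h and runperc.h to be changed to

#define NINPUT 6

while its eight patterns fit within the default NPATTERN of 10.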

The programs read two files when they are run: a parameters file and a data file. The parameters file sets up some initial values, and the data file supplies the training data. The parameters file has the format

ep n

et m

er 0.002

w 1.2 3.1 6.0 ...

where ep is the maximum number of epochs allowed, et is the learning rate, and er is the error limit. w introduces the weight vector to be used. Any or all of these lines can be omitted, in which case defaults are used: 10 epochs, a learning rate of 1, and initial weights of 0. An example parameters file is

ep 100

et 1

er 0
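This example relies on the default initial weights of 0. A parameters file that sets the initial weights explicitly adds a w line with one value per input plus a final value for the bias weight; for a 3-input network it might look like (the weight values here are purely illustrative)

ep 100

et 1

er 0

w 0.5 -0.5 0.5 0.0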

The data file consists of alternating lines of input vector values and output unit values. Input vector values are terminated by -100, and output unit values by -200. Thus, for a 3-input network the data file might be

0 0 0 -100

0 -200

0 0 1 -100

0 -200

0 1 0 -100

0 -200

0 1 1 -100

0 -200

1 0 0 -100

0 -200

1 0 1 -100

0 -200

1 1 0 -100

1 -200

1 1 1 -100

1 -200

If the parameters file is called params.1, and the data file data.1, then one runs the perceptron training program by typing (at a command-line prompt)

perc params.1 data.1

The test program, runperc, is invoked in the same way, using the set of weights given in the parameters file:

runperc params.1 data.1

The training program produces output of the form

w 0.0 0.0 0.0 0.0

w 2.0 2.0 1.0 2.0

w 0.0 0.0 -2.0 -6.0

w 2.0 2.0 -1.0 -4.0

w 4.0 4.0 0.0 -2.0

w 2.0 2.0 -2.0 -6.0

w 4.0 4.0 -1.0 -4.0

w 4.0 4.0 -1.0 -4.0

Each line is a weight vector, with the last element being the bias weight. The training program stops when the dataset has been learned, or when the number of epochs has reached the limit set by ep.
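To make the weight vectors in this output easier to interpret, the sketch below shows the standard perceptron learning rule applied to the example training set above. It is only an illustration, assuming a hard threshold at zero and an update of eta * (target - output) per misclassified pattern; perc.c itself may differ in detail.

#include <stdio.h>

#define NINPUT 3      /* no of input units (not including bias) */
#define NPATTERN 8

int main(void)
{
    /* the 3-input training set from the example data file above */
    double x[NPATTERN][NINPUT] = {
        {0,0,0}, {0,0,1}, {0,1,0}, {0,1,1},
        {1,0,0}, {1,0,1}, {1,1,0}, {1,1,1}
    };
    double t[NPATTERN] = {0, 0, 0, 0, 0, 0, 1, 1};
    double w[NINPUT + 1] = {0.0};  /* last element is the bias weight */
    double eta = 1.0;              /* learning rate ("et") */
    int maxepochs = 100;           /* epoch limit ("ep") */

    for (int epoch = 0; epoch < maxepochs; epoch++) {
        int errors = 0;
        for (int p = 0; p < NPATTERN; p++) {
            double net = w[NINPUT];               /* bias input fixed at 1 */
            for (int i = 0; i < NINPUT; i++)
                net += w[i] * x[p][i];
            double y = (net > 0.0) ? 1.0 : 0.0;   /* assumed threshold convention */
            double delta = t[p] - y;
            if (delta != 0.0) {
                errors++;
                for (int i = 0; i < NINPUT; i++)
                    w[i] += eta * delta * x[p][i];
                w[NINPUT] += eta * delta;         /* bias weight update */
            }
        }
        printf("w");                              /* print the weight vector once per epoch */
        for (int i = 0; i <= NINPUT; i++)
            printf(" %.1f", w[i]);
        printf("\n");
        if (errors == 0)                          /* dataset learned */
            break;
    }
    return 0;
}

Use the supplied perc.c for the exercises; this sketch is only there to show the kind of update behind each line of the output.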

To see the effect of a particular set of weights, copy the line containing that weight vector into a copy of the parameters file saved under a new name (here params.2). Then run runperc:

runperc params.2 data.1
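Here params.2 might, for instance, contain just the single line copied from the end of the training run above (any of the other parameter lines may be omitted, as noted earlier):

w 4.0 4.0 -1.0 -4.0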

This will give output of the form

actual desired

0.0000 0.0000

0.0000 0.0000

0.0000 0.0000

0.0000 0.0000

0.0000 0.0000

0.0000 0.0000

1.0000 1.0000

1.0000 1.0000

0.0000 0.0000

0.0000 0.0000

in which the left column gives the actual output, and the right column gives the desired (target) output.
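For instance, if the copied weight vector were w 4.0 4.0 -1.0 -4.0 (the final line of the training output above), the input 1 1 0 would give a weighted sum of 4.0*1 + 4.0*1 + (-1.0)*0 + (-4.0)*1 = 4.0, the last term being the bias weight multiplied by a fixed bias input of 1. Assuming the unit outputs 1 when this sum is positive and 0 otherwise, the actual output is 1, matching the desired output; similarly, the input 0 1 1 gives 0 + 4.0 - 1.0 - 4.0 = -1.0 and hence output 0.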

You should

(i) Copy the files into your own directory, and compile the programs. Please tell me at once if you have problems making them compile (but remember to use gcc on HPUX!)

(ii) Show that the perceptron can learn to do AND and OR on a 3-bit input. Take the weight vector and use it in running runperc. Look at the decision surface produced by the system. Try it from a different starting point in weight space. Does it produce the same decision surface?
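As an illustration of how such a training set is encoded, a data file for 3-input OR (output 1 whenever at least one input is 1) would, in the format described above, be:

0 0 0 -100
0 -200
0 0 1 -100
1 -200
0 1 0 -100
1 -200
0 1 1 -100
1 -200
1 0 0 -100
1 -200
1 0 1 -100
1 -200
1 1 0 -100
1 -200
1 1 1 -100
1 -200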

(iii) Can the following mapping be learned?

INPUT OUTPUT

0 1 0 1 0 1 -> 0

0 1 0 1 1 0 -> 1

1 0 1 0 0 1 -> 0

1 0 1 0 1 0 -> 1

0 1 1 0 0 1 -> 0

0 1 1 0 1 0 -> 1

1 0 0 1 1 0 -> 1

1 0 0 1 0 1 -> 1
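To try this with perc, set NINPUT to 6 as described above and encode the mapping as a data file in the usual format, e.g.:

0 1 0 1 0 1 -100
0 -200
0 1 0 1 1 0 -100
1 -200
1 0 1 0 0 1 -100
0 -200
1 0 1 0 1 0 -100
1 -200
0 1 1 0 0 1 -100
0 -200
0 1 1 0 1 0 -100
1 -200
1 0 0 1 1 0 -100
1 -200
1 0 0 1 0 1 -100
1 -200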

(iv) Experiment with other training sets.
