Cognity

A neural network library

Cookbook

  1. Creating and using a neural network
  2. Using data sets and data examples
  3. Importing and exporting data sets
  4. Initializing a neural network
  5. Training a neural network
  6. Training with custom stop conditions
  7. Serializing a neural network

Creating and using a neural network

All neural networks can be created via the NeuralNetFactory object, which provides a builder for each specific type of network. To obtain the neural network object, specify the network's details and call the build() method. Of course, different builders require different details to be specified; see the concrete builder's documentation for more info. It's also possible to instantiate a builder object directly.

This example code creates two multilayer perceptrons with one hidden layer:

NeuralNet myNetwork = NeuralNetFactory.mlp()
                        .setInputSize(2)
                        .addTanh(3)
                        .addLinear(2)
                        .build();
 
MlpNet.Builder builder = new MlpNet.Builder();
builder.setInputSize(2);
builder.addTanh(3);
builder.addLinear(2);
NeuralNet mySecondNetwork = builder.build();

Once you have your neural network, you probably want to calculate something with it. All you need to do is call the compute method. It accepts an array of doubles containing your input data, and it returns an array of doubles containing the network's response. Note that the input array's size must be equal to the input layer's size. Here's a little example:

double[] input = new double[] {1.0, 2.0};
double[] output = myNetwork.compute(input);

You can do whatever you want with the output array; modifying it won't alter the network's state.

Using data sets and data examples

A data example is the basic data unit used during neural network training. Every data example consists of one or two arrays of doubles, i.e. vectors; they're called "unsupervised" and "supervised" data examples, respectively. Both types are represented by the DataExample class.

DataExample unsupervisedExample = new DataExample(new double[] {1.0, 2.0});
                                     
unsupervisedExample.isSupervised();     // returns false
unsupervisedExample.getTarget();        // returns an array of length 0
 
DataExample supervisedExample = new DataExample(
                                    new double[] {1.0, 2.0},
                                    new double[] {3.0});
                                     
supervisedExample.isSupervised();       // returns true
supervisedExample.getTarget();          // returns {3.0}

Please note that the getInput and getTarget methods return references to internal arrays, so changing these arrays will change the data example.
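Because of this aliasing, it's safest to copy the returned array before modifying it if you want to keep the data example intact. Here's a minimal sketch of the defensive copy, using only the standard library (a stand-in array plays the role of what getTarget would return):

```java
import java.util.Arrays;

public class DefensiveCopyDemo {
    public static void main(String[] args) {
        // Stand-in for the internal array a DataExample would hand back
        double[] internal = {3.0};

        // Copy before modifying, so the original stays untouched
        double[] copy = Arrays.copyOf(internal, internal.length);
        copy[0] = 9.0;

        System.out.println(internal[0]);  // prints 3.0 -- original intact
        System.out.println(copy[0]);      // prints 9.0
    }
}
```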

A data set is simply a list of data examples. All data examples in a set must have the same input vector size and the same target vector size.

// Create a data set with input size 2 and target size 1.
DataSet set = new ListDataSet(2, 1);
 
DataExample sup = new DataExample(
                    new double[]{1.0, 2.0},
                    new double[]{3.0});
DataExample sup2 = new DataExample(
                    new double[]{1.0, 2.0},
                    new double[]{3.0, 4.0});
DataExample unsup = new DataExample(new double[]{1.0, 2.0});
 
set.add(sup);           // added successfully
 
set.add(unsup);         // error, size mismatch
set.add(sup2);          // error, size mismatch

Importing and exporting data sets

At the moment, data sets can be read and written as Matlab M-files. The following example saves a data set to a Matlab file and then reads it back:

DataSet set = new ListDataSet(2, 1);
set.add(new DataExample(new double[]{0.0, 0.0}, new double[]{0.0}));
set.add(new DataExample(new double[]{1.0, 0.0}, new double[]{1.0}));
 
MatlabExporter exporter = new MatlabExporter("myDataset.m");
exporter.setOrientation(MatlabExporter.COLUMN);
exporter.setInputMatrixName("myCoolInputs");
exporter.setTargetMatrixName("myCoolTargets");
exporter.exportData(set);
 
MatlabImporter importer = new MatlabImporter("myDataset.m",
                                             "myCoolInputs",
                                             "myCoolTargets");
importer.setOrientation(MatlabExporter.COLUMN);
DataSet setFromFile = importer.importData();

Initializing a neural network

During the initialization of a neural network, all of the network's parameters are randomized in some way. In Cognity, the initialization of a network is managed by NeuralNetInitializer objects. Here's a simple example showing how to use such an object:

NeuralNet net = NeuralNetFactory.mlp()
                    .addSigmoid(2)
                    .build();
 
// Initialize the network's parameters with random numbers
// from the range [-2.0, 2.0).
new UniformInitializer(-2.0, 2.0).init(net);

Note that the network returned by the builder object will usually already be initialized using some fixed method.
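The half-open range [-2.0, 2.0) used above follows the usual convention for uniform sampling. As a rough sketch of that convention (the standard mapping from Random.nextDouble()'s [0.0, 1.0) range, not necessarily Cognity's internal code), a uniform initializer can draw each parameter like this:

```java
import java.util.Random;

public class UniformDraw {
    // Map Random.nextDouble()'s [0.0, 1.0) range onto [min, max)
    static double uniform(Random rng, double min, double max) {
        return min + (max - min) * rng.nextDouble();
    }

    public static void main(String[] args) {
        Random rng = new Random(42);  // fixed seed for reproducibility
        for (int i = 0; i < 5; i++) {
            double w = uniform(rng, -2.0, 2.0);
            System.out.println(w);    // always in [-2.0, 2.0)
        }
    }
}
```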

Training a neural network

Neural network training is done using TrainingRule objects. You can think of them as "teachers" for neural networks. A training rule is usually designed for a specific type of neural network, and you should always make sure that the chosen rule can be applied to your network; otherwise you may encounter an error. Moreover, the data set passed to the rule must also meet the rule's requirements. For more info, see the concrete rule's documentation. Here's a simple example of multilayer perceptron training with the backpropagation algorithm:

// Create a data set
DataSet set = new ListDataSet();
set.add(new DataExample(new double[]{1.0, 2.0}, new double[]{3.0}));
 
// Create a network
NeuralNet net = NeuralNetFactory.mlp()
                    .setInputSize(2)
                    .addLinear(1)
                    .build();
                     
// Train the network
Backpropagation bp = new Backpropagation(net);
bp.setLearningRate(0.25);
bp.setMomentum(0.1);
bp.train(set);
 
// Print the training details:
// After which iteration the training has stopped
System.out.println("Iteration = " + bp.getCurrentIteration());
 
// The final mean square error on entire data set
System.out.println("Error = " + bp.getCurrentError());             
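The mean square error reported above is the textbook MSE; here's a plain-Java sketch of how it is typically computed over a data set (the exact averaging convention in Cognity may differ, so treat this as illustrative, not as the library's implementation):

```java
public class MseDemo {
    // Mean square error: average of squared differences between network
    // outputs and targets, over all components of all examples
    static double mse(double[][] outputs, double[][] targets) {
        double sum = 0.0;
        int count = 0;
        for (int i = 0; i < outputs.length; i++) {
            for (int j = 0; j < outputs[i].length; j++) {
                double diff = outputs[i][j] - targets[i][j];
                sum += diff * diff;
                count++;
            }
        }
        return sum / count;
    }

    public static void main(String[] args) {
        double[][] outputs = {{0.5}, {1.5}};
        double[][] targets = {{0.0}, {1.0}};
        System.out.println(mse(outputs, targets));  // prints 0.25
    }
}
```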

Note that sometimes the training may not converge (e.g. if you set the learning rate too high). In such a case, the error value will usually be equal to NaN. Therefore you should always check whether the training process has really succeeded.
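A simple way to guard against a diverged run is to test the value returned by getCurrentError() with Double.isNaN before trusting the trained network. The helper below is a hypothetical convenience, not part of Cognity:

```java
public class TrainingCheck {
    // Returns true only if the final error is a real number
    // at or below the given threshold (hypothetical helper)
    static boolean succeeded(double finalError, double maxAcceptableError) {
        return !Double.isNaN(finalError) && finalError <= maxAcceptableError;
    }

    public static void main(String[] args) {
        System.out.println(succeeded(0.0005, 0.001));      // prints true
        System.out.println(succeeded(Double.NaN, 0.001));  // prints false -- diverged
    }
}
```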

Training with custom stop conditions

Many training algorithms are iterative in nature: they repeat a certain operation until specified stop conditions are met. Here we'll show how to specify various stop conditions for your training algorithm. Every stop condition is represented by the StopCondition interface. There are a few implementations of this interface that check concrete conditions (e.g. whether the iteration limit has been exceeded). There are also stop conditions that combine other conditions; for example, the OR condition checks all underlying conditions and is satisfied when at least one of them is satisfied. Here's an example that demonstrates setting a custom stop condition:

// Create a network
NeuralNet net = NeuralNetFactory.mlp()
                    .setInputSize(2)
                    .addLinear(1)
                    .build();
                     
Backpropagation bp = new Backpropagation(net);
 
// Train no longer than 100 iterations
StopCondition iterCondition = new MaxIterationCondition(bp, 100);
 
// Train until the error is less or equal to 0.001
StopCondition errorCondition = new MaxErrorCondition(bp, 0.001);
 
// Combine both conditions
StopCondition condition = new OrCondition(iterCondition, errorCondition);
 
bp.setStopCondition(condition);
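The OrCondition used above boils down to an OR over boolean checks. Here's a minimal sketch of that semantics in plain Java, using java.util.function.BooleanSupplier rather than Cognity's actual StopCondition interface:

```java
import java.util.function.BooleanSupplier;

public class OrConditionSketch {
    // Satisfied when at least one of the underlying conditions is satisfied
    static BooleanSupplier or(BooleanSupplier a, BooleanSupplier b) {
        return () -> a.getAsBoolean() || b.getAsBoolean();
    }

    public static void main(String[] args) {
        BooleanSupplier iterLimitReached = () -> false;  // e.g. iteration >= 100
        BooleanSupplier errorSmallEnough = () -> true;   // e.g. error <= 0.001
        BooleanSupplier combined = or(iterLimitReached, errorSmallEnough);
        System.out.println(combined.getAsBoolean());     // prints true
    }
}
```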

Serializing a neural network

Neural networks implement the Serializable interface, so they can be serialized in the usual Java way. To make this process easier, the NeuralNetIO helper class is provided. With its help, saving a neural network to a file takes just one line of code:

NeuralNet net = NeuralNetFactory.mlp()
                    .setInputSize(3)
                    .addTanh(2)
                    .build();
 
// Save network to the file
NeuralNetIO.write("myNetwork", net);
 
// Read network from the file
NeuralNet netFromFile = NeuralNetIO.read("myNetwork");