Hot questions for using neural networks in pylearn2

Question:

I'm training a convolutional neural network using the pylearn2 library, and across all the epochs my validation error is consistently higher than the testing error. Is that possible? If so, in what kind of situations?


Answer:

Moving the comment to an answer; modifying my previous answer seemed wrong.

The full dataset may not have been properly shuffled, so the examples in the test set may be easier to classify.

Repeating the experiment with the examples reshuffled and redistributed among the train / valid / test subsets would show whether this is the case.
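
A minimal sketch of how such a re-split could be done with plain NumPy, assuming the full data is available as arrays X and y (the function name and the split fractions are just illustrative choices):

import numpy as np

def reshuffle_split(X, y, train_frac=0.8, valid_frac=0.1, seed=0):
    # Shuffle the example indices so each subset gets a random mix of examples.
    rng = np.random.RandomState(seed)
    idx = rng.permutation(len(X))
    n_train = int(train_frac * len(X))
    n_valid = int(valid_frac * len(X))
    train_idx = idx[:n_train]
    valid_idx = idx[n_train:n_train + n_valid]
    test_idx = idx[n_train + n_valid:]
    return ((X[train_idx], y[train_idx]),
            (X[valid_idx], y[valid_idx]),
            (X[test_idx], y[test_idx]))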

Question:

Pylearn2 is usually suggested as a Python resource for neural networks.

I would like to create a single-hidden-layer neural network and train it with the backpropagation algorithm.

This should be something basic, but I do not understand how to do it with pylearn2. I have found this tutorial on the multilayer perceptron, but despite that I am still lost. (http://nbviewer.ipython.org/github/lisa-lab/pylearn2/blob/master/pylearn2/scripts/tutorials/multilayer_perceptron/multilayer_perceptron.ipynb)

n = 200
p = 20
X = np.random.normal(0, 1, (n, p))
y = X[:, 0] * X[:, 1] + np.random.normal(0, .1, n)

I would like to create a single-hidden-layer neural network with 40 hidden nodes and a sigmoid activation function.

Can someone help me?

EDIT:

I have been able to write this code, but it is still not working:

ds = DenseDesignMatrix(X=X, y=y)

hidden_layer = mlp.Sigmoid(layer_name='hidden', dim=10, irange=.1, init_bias=1.)
output_layer = mlp.Linear(1, 'output', irange=.1)
trainer = sgd.SGD(learning_rate=.05, batch_size=10, 
                  termination_criterion=EpochCounter(200))

layers = [hidden_layer, output_layer]
ann = mlp.MLP(layers, nvis=1)
trainer.setup(ann, ds)

while True:
    trainer.train(dataset=ds)
    ann.monitor.report_epoch()
    ann.monitor()
    if not trainer.continue_learning(ann):
        break

Answer:

This is my current solution:

import numpy as np
import theano
from pylearn2.datasets.dense_design_matrix import DenseDesignMatrix
from pylearn2.models import mlp
from pylearn2.training_algorithms import sgd
from pylearn2.termination_criteria import EpochCounter

n = 200
p = 2
X = np.random.normal(0, 1, (n, p))
y = X[:, 0] * X[:, 1] + np.random.normal(0, .1, n)
y.shape = (n, 1)    # targets must be a 2D (n, 1) array for DenseDesignMatrix

ds = DenseDesignMatrix(X=X, y=y)


hidden_layer = mlp.Sigmoid(layer_name='hidden', dim=10, irange=.1, init_bias=1.)
output_layer = mlp.Linear(dim=1, layer_name='y', irange=.1)
trainer = sgd.SGD(learning_rate=.05, batch_size=10,
                  termination_criterion=EpochCounter(200))
layers = [hidden_layer, output_layer]
ann = mlp.MLP(layers, nvis=2)   # nvis must match the number of input features (p = 2)
trainer.setup(ann, ds)

# Train until the EpochCounter termination criterion is met.
while True:
    trainer.train(dataset=ds)
    ann.monitor.report_epoch()
    ann.monitor()
    if not trainer.continue_learning(ann):
        break

# Forward-propagate the inputs through the trained network to get predictions.
inputs = X
y_est = ann.fprop(theano.shared(inputs, name='inputs')).eval()
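
As a quick sanity check (not part of the original answer), y_est can be compared against the noisy targets, for example with the mean squared error:

mse = np.mean((y_est - y) ** 2)
print('training MSE: %f' % mse)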

Question:

I'm training a simple convolutional neural network using pylearn2. I have my RGB image data stored in an npy file. Is there any way to convert that data to grayscale directly from the npy file?


Answer:

If this is a standalone file, load it with numpy.load and then convert the content using something like this:

def rgb2gray(rgb):
    # Weighted sum of the R, G, B channels using the standard luma coefficients.
    return np.dot(rgb[..., :3], [0.299, 0.587, 0.114])
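
For instance (the file name images.npy and the (num_images, rows, cols, 3) layout are assumptions about how the data was saved):

import numpy as np

rgb = np.load('images.npy')       # assumed shape: (num_images, rows, cols, 3)
gray = rgb2gray(rgb)              # resulting shape: (num_images, rows, cols)
np.save('images_gray.npy', gray)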

If the file is part of a pylearn2 dataset (e.g. one saved with use_design_loc()), then load the dataset

from pylearn2.utils import serial
dataset = serial.load("file.pkl")

and apply the rgb2gray() function to its X member (I assume it is a DenseDesignMatrix).
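
A rough sketch of that last step, assuming the rows of X are flattened rows x cols x 3 RGB images (the image dimensions below are placeholders, and reassigning X directly is only meant to illustrate the idea):

rows, cols = 32, 32                               # placeholder image dimensions
rgb = dataset.X.reshape(-1, rows, cols, 3)        # back to image-shaped arrays
gray = rgb2gray(rgb)                              # (num_examples, rows, cols)
dataset.X = gray.reshape(gray.shape[0], -1)       # flatten back into a design matrix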