
Philip O'Brien February 2016
### Slightly differing output from Pybrain neural network despite consistent initialisation?

I am working on a feed-forward network in PyBrain. To let me compare the effects of varying certain parameters, I have initialised the network weights myself, on the assumption that if the weights are always the same then the output should always be the same. Is this assumption incorrect? Below is the code used to set up the network:

```
from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection
from pybrain.supervised.trainers import BackpropTrainer

n = FeedForwardNetwork()
inLayer = LinearLayer(7, name="in")
hiddenLayer = SigmoidLayer(1, name="hidden")
outLayer = LinearLayer(1, name="out")
n.addInputModule(inLayer)
n.addModule(hiddenLayer)
n.addOutputModule(outLayer)
in_to_hidden = FullConnection(inLayer, hiddenLayer, name="in-to-hidden")
hidden_to_out = FullConnection(hiddenLayer, outLayer, name="hidden-to-out")
n.addConnection(in_to_hidden)
n.addConnection(hidden_to_out)
n.sortModules()

# Fixed weights: 7 input-to-hidden parameters, 1 hidden-to-output parameter
in_to_hidden_params = [
    0.27160018, -0.30659429, 0.13443352, 0.4509613,
    0.2539234, -0.8756649, 1.25660715
]
hidden_to_out_params = [0.89784474]
net_params = in_to_hidden_params + hidden_to_out_params
n._setParameters(net_params)

# ds is a SupervisedDataSet defined elsewhere
trainer = BackpropTrainer(n, ds, learningrate=0.01, momentum=0.8)
```
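To make the assumption behind the question concrete, here is a plain-NumPy sketch (not PyBrain itself) of the same 7→1→1 architecture with the weights above: with fixed weights and no training, repeated forward passes are fully deterministic. The input vector of ones is an arbitrary illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Weights copied from the network above: 7 inputs -> 1 sigmoid hidden unit -> 1 linear output
in_to_hidden = np.array([0.27160018, -0.30659429, 0.13443352, 0.4509613,
                         0.2539234, -0.8756649, 1.25660715])
hidden_to_out = np.array([0.89784474])

def forward(x):
    hidden = sigmoid(in_to_hidden @ x)  # single sigmoid hidden unit
    return hidden_to_out * hidden       # linear output layer

x = np.ones(7)
# Two activations with identical weights give bit-identical output
assert np.array_equal(forward(x), forward(x))
```

So the non-determinism in the original code cannot come from the forward pass itself; it has to come from something that changes the weights.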

**UPDATE:** It looks like even after seeding the random number generator, reproducibility is still an issue; see the GitHub issue here.

> I have done this under the assumption that if the weights are always the same then the output should always be the same

The assumption is correct, but your code does not satisfy it: you are **training** the weights, so they do not stay the same. Stochastic training methods often **permute** the training samples, and that permutation leads to different results. In particular, BackpropTrainer does exactly this:

```
def train(self):
    """Train the associated module for one epoch."""
    assert len(self.ds) > 0, "Dataset cannot be empty."
    self.module.resetDerivatives()
    errors = 0
    ponderation = 0.
    shuffledSequences = []
    for seq in self.ds._provideSequences():
        shuffledSequences.append(seq)
    shuffle(shuffledSequences)
```

If you want repeatable results, **seed your random number generators**.
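As a sketch of that advice (the seed value and toy sequence below are illustrative, not from the original post): the `shuffle` in BackpropTrainer's `train` is Python's `random.shuffle`, so seeding the `random` module before each run fixes the epoch ordering; seeding NumPy as well covers the case where you let PyBrain draw the initial weights instead of setting them yourself.

```python
import random
import numpy as np

# Seed both RNGs before building and training the network
random.seed(42)
np.random.seed(42)

# With the seed fixed, the shuffle order is repeatable across runs:
seqs = list(range(10))
random.seed(42)
random.shuffle(seqs)
first = list(seqs)

seqs = list(range(10))
random.seed(42)
random.shuffle(seqs)
assert seqs == first  # identical permutation both times
```

Note that seeding must happen before *every* source of randomness you want pinned down: weight initialisation, sample shuffling, and anything else the library draws from the global RNGs.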

Asked February 2016 · Viewed 3,737 times · 11 votes · 1 answer
