#include <PerceptronNeuron.h>
Public Member Functions

    PerceptronNeuron (unsigned int in_layer_num)
    double getDelta (void)
    void setDelta (double newval)
    double getTheta (void)
    void setTheta (double newval)
    double getThetaDiff (void)
    void initializeWeightings (unsigned int succ_count)
    void resetWeights (void)
    void resetWeightDiffs (void)
    void resetDiffs (void)
    void postprocessWeight (PerceptronNeuron *succ, double epsilon, double weight_decay, double momterm)
    void postprocessTheta (double epsilon, double weight_decay, double momterm)
    void update (void)
Public Attributes

    unsigned int num
    double input
    double output
    vector< double > weight
    vector< double > weight_diff

Protected Attributes

    double delta
    double theta
    double theta_diff
    double theta_diff_last
    vector< double > weight_diff_last
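As a usage illustration (not code from the library), the following sketch wires up a small layer by hand. The layer sizes, the raw-pointer ownership, and the assumption that initializeWeightings(succ_count) sizes the weight and weight_diff vectors for succ_count successor neurons are made up for this example; in the library this wiring is presumably handled by PerceptronLayer and PerceptronNetwork.

    #include <vector>
    #include <PerceptronNeuron.h>

    int main (void)
    {
        // Hypothetical sizes: three neurons feeding two successor neurons.
        std::vector<PerceptronNeuron *> layer, successors;

        for (unsigned int n = 0; n < 3; ++n)
            layer.push_back (new PerceptronNeuron (n));
        for (unsigned int n = 0; n < 2; ++n)
            successors.push_back (new PerceptronNeuron (n));

        // Each neuron keeps one weighting per successor neuron, so the
        // weight/weight_diff vectors are sized to the successor layer.
        for (unsigned int n = 0; n < layer.size (); ++n)
            layer[n]->initializeWeightings ((unsigned int) successors.size ());

        // ... feed-forward, backpropagation and updates would follow here ...

        for (unsigned int n = 0; n < layer.size (); ++n)
            delete layer[n];
        for (unsigned int n = 0; n < successors.size (); ++n)
            delete successors[n];
        return 0;
    }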
PerceptronNeuron(): Main constructor.

getDelta(): Getter function for the delta variable.

getTheta(): Getter function for the theta parameter.

getThetaDiff(): Getter for the difference calculated for the theta parameter.

initializeWeightings(): Initialize the weighting vectors used for each neuron to zero.

postprocessTheta(): Theta postprocess algorithm. Assigns the theta difference to this neuron.

postprocessWeight(): Weighting postprocess algorithm. Assigns the weight differences to this neuron. Note that the assignment adds the new delta value to the current one, so both batch and online learning can use this function; for online learning the values must be reset after each update using the resetWeightDiffs method. Also, only one weighting difference is calculated at a time. The computation of all differences is scheduled by PerceptronLayer::postprocess.
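The exact arithmetic behind postprocessWeight and postprocessTheta is not given here. Purely as an illustration, a conventional backpropagation step with learning rate epsilon, weight decay, and a momentum term, expressed through the class's public members, could look like the sketch below. The helper function exampleWeightStep, the sign conventions, and the gradient formula are assumptions, not the library's code; prev_diff stands in for the protected weight_diff_last entry.

    #include <PerceptronNeuron.h>

    // Illustrative only: one weighting difference between a neuron and one
    // of its successor neurons, in the style of gradient descent with
    // weight decay and momentum.
    void exampleWeightStep (PerceptronNeuron &pred, PerceptronNeuron &succ,
                            double epsilon, double weight_decay, double momterm,
                            double prev_diff)
    {
        // Standard backprop gradient: successor error signal times predecessor output.
        double grad = succ.getDelta () * pred.output;

        double diff = epsilon * grad
                      - weight_decay * pred.weight[succ.num]
                      + momterm * prev_diff;

        // Added to (not overwriting) the stored difference, so batch learning
        // can accumulate over several patterns, as described above.
        pred.weight_diff[succ.num] += diff;
    }

postprocessTheta would presumably form an analogous difference for the theta parameter from the neuron's own delta.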
resetDiffs(): Reset all learned differences.

resetWeightDiffs(): Reset all weight deltas to zero.

resetWeights(): Reset all weightings to zero.

setDelta(): Setter function for the delta variable.

setTheta(): Setter function for the theta parameter.

update(): Update algorithm. Updates the weightings and the theta parameter by the values calculated in the postprocess step.
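Putting the pieces together, an online-learning pass over one neuron might look like the following sketch. The function onlineStep and its parameters are hypothetical, and in the library the scheduling over successors is presumably done by PerceptronLayer::postprocess rather than by hand.

    #include <vector>
    #include <PerceptronNeuron.h>

    // Hypothetical driver for one neuron during online learning.
    void onlineStep (PerceptronNeuron &neuron,
                     std::vector<PerceptronNeuron *> &successors,
                     double epsilon, double weight_decay, double momterm)
    {
        // Only one weighting difference is calculated per call, so iterate
        // over all successor neurons explicitly.
        for (unsigned int i = 0; i < successors.size (); ++i)
            neuron.postprocessWeight (successors[i], epsilon, weight_decay, momterm);
        neuron.postprocessTheta (epsilon, weight_decay, momterm);

        // Apply the accumulated differences, then clear them: online learning
        // must reset after each update (resetWeightDiffs for the weightings;
        // resetDiffs presumably also clears the theta difference).
        neuron.update ();
        neuron.resetDiffs ();
    }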
delta: Error signal, computed by the backpropagation algorithm. Used to compute the deltas for both the weightings and the sensitivity parameter.
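For orientation, the usual backpropagation definition of this error signal is reproduced below; whether this class includes the activation derivative and which sign convention it uses is not stated in this documentation, so read the formulas as an assumption.

    \[
      \delta_j = f'(\mathrm{net}_j)\,(t_j - o_j)
      \quad\text{(output-layer neuron)}, \qquad
      \delta_j = f'(\mathrm{net}_j)\sum_k w_{jk}\,\delta_k
      \quad\text{(hidden neuron, sum over successors } k\text{)}.
    \]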
input: Input signal level for this node.

num: Counter within the current layer (top = 0).

output: Output signal level for this node.

theta: Sensitivity parameter of the activation function.
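How theta enters the activation function is not spelled out here; a "sensitivity parameter" usually denotes either a threshold/bias or a steepness (temperature) term. Both common readings are shown below purely as assumptions:

    \[
      o_j = \frac{1}{1 + e^{-(\mathrm{net}_j - \theta_j)}}
      \quad\text{(\(\theta\) as a threshold)}, \qquad
      o_j = \frac{1}{1 + e^{-\mathrm{net}_j/\theta_j}}
      \quad\text{(\(\theta\) as a steepness/sensitivity)}.
    \]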
theta_diff: Delta value for the sensitivity parameter, calculated in the postprocess step and used to update theta within the update algorithm.

theta_diff_last: The previous theta difference, used for the momentum term calculation.

weight: Weightings to the neurons in the next layer; hence, the size of the vector must equal the number of neurons in the next layer.

weight_diff: Delta values for the individual weightings the node has to its successor nodes, computed in the postprocess step. Note that this member is public because the from-file constructor of PerceptronNetwork push_back's zeroes into it.

weight_diff_last: The previous weight differences, used for the momentum term.