A Layer-by-Layer Levenberg-Marquardt Algorithm for Feedforward Multilayer Perceptron

PP: 505S-511S

Author(s)
Young-Tae Kwak,
Heeseung Jo

Abstract
This paper proposes an improved Levenberg-Marquardt (LM) algorithm that trains the weights of a feedforward multilayer perceptron (FMLP) layer by layer. Because an FMLP has no connections among its output nodes or among its hidden nodes, the Jacobian matrix decomposes into blocks; our algorithm therefore updates the output weights using a Jacobian matrix reduced to its block diagonal form. We then define a new error function for the hidden layer, derived from the output layer's error signals, and use it to update the hidden weights with the hidden layer's block diagonal Jacobian matrix. By working with these downsized Jacobian matrices, the proposed method saves both the memory and the expensive matrix operations of the standard LM algorithm. We evaluated the method on an iris classification task and a handwritten digit recognition task. The results show that our method improved training speed and reduced the memory required for the Jacobian matrix by 30% in the classification task and by 10% in the recognition task.