2009 IEEE International Conference on
Systems, Man, and Cybernetics
Abstract
This paper presents a pruning algorithm with an adaptive pruning interval for system identification with general dynamic neural networks (GDNN). GDNNs are artificial neural networks with internal dynamics: all layers have time-delayed feedback connections to themselves and to all other layers. The parameters are trained with the Levenberg-Marquardt (LM) optimization algorithm, which requires the Jacobian matrix; the Jacobian is calculated by real-time recurrent learning (RTRL). Pruning is performed with Optimal Brain Surgeon (OBS). Since both LM and OBS need Hessian information, computing time can be saved if OBS reuses the scaled inverse Hessian already calculated for the LM algorithm. This paper discusses the effect of using the scaled Hessian instead of the true Hessian in the OBS pruning approach. In addition, an adaptive pruning interval is presented. During the identification process the structure of the model changes drastically, so the parameter optimization task between pruning steps varies in complexity. To guarantee that the parameter optimization algorithm has enough time to cope with the structural changes in the GDNN model, it is suggested to adapt the pruning interval during the identification process. The proposed algorithm is tested on two identification examples.
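The Hessian-reuse idea in the abstract can be illustrated with a minimal sketch. LM approximates the Hessian as JᵀJ + μI and inverts it for the parameter update; OBS ranks weights by the saliency S_q = w_q² / (2·[H⁻¹]_qq), so the same inverse can serve both steps. The function name, the toy dimensions, and the random data below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def lm_step_and_obs_saliency(J, e, w, mu):
    """Sketch (not the paper's code): one LM update plus OBS saliencies,
    both reusing the same scaled inverse Hessian.

    J  -- Jacobian of the residuals w.r.t. the weights, shape (n, p)
    e  -- residual vector, shape (n,)
    w  -- current weight vector, shape (p,)
    mu -- LM damping factor (the "scaling" of the Hessian)
    """
    H = J.T @ J + mu * np.eye(w.size)   # scaled Hessian approximation used by LM
    H_inv = np.linalg.inv(H)            # inverted once ...
    dw = -H_inv @ (J.T @ e)             # ... for the LM parameter update
    # ... and reused for the OBS saliency of each weight q:
    # S_q = w_q^2 / (2 * [H^-1]_qq)
    saliency = w**2 / (2.0 * np.diag(H_inv))
    return w + dw, saliency

# Toy usage with random data (illustration only).
rng = np.random.default_rng(0)
J = rng.normal(size=(20, 5))
e = rng.normal(size=20)
w = rng.normal(size=5)
w_new, S = lm_step_and_obs_saliency(J, e, w, mu=0.1)
prune_idx = int(np.argmin(S))  # OBS removes the weight with the smallest saliency
```

Since H = JᵀJ + μI is positive definite, the diagonal of H⁻¹ is positive and every saliency is well defined; the cost saved is exactly the second matrix inversion that a standalone OBS step would otherwise perform.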