Training Neural Networks with Many Layers: More Coming Soon

Use of Vertical Knowledge Sharing Links

Training a very deep neural network may be facilitated by adding vertical node-to-node knowledge sharing links that skip many layers, creating implicit coordination in the training of the linked layers.
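The text does not spell out the exact mechanism of these links, so the following is only a minimal PyTorch sketch of one plausible reading: a link is realized as an auxiliary penalty that ties the activations of an early layer to the activations of a corresponding layer many levels above it. The class name DeepNetWithSharingLinks, the layer sizes, and the choice of an MSE penalty with a detached source are illustrative assumptions, not the authors' specification.

```python
import torch
import torch.nn as nn

class DeepNetWithSharingLinks(nn.Module):
    """Deep stack of layers with one vertical knowledge sharing link:
    an auxiliary penalty ties activations at an early layer to the
    corresponding activations at a much later layer."""

    def __init__(self, width=64, depth=20, link=(2, 17), num_classes=10):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Sequential(nn.Linear(width, width), nn.ReLU())
             for _ in range(depth)]
        )
        self.link = link  # (source layer, target layer), skipping many layers
        self.head = nn.Linear(width, num_classes)

    def forward(self, x):
        src_act = tgt_act = None
        for i, layer in enumerate(self.layers):
            x = layer(x)
            if i == self.link[0]:
                src_act = x
            elif i == self.link[1]:
                tgt_act = x
        # Sharing penalty: encourages the linked layers to learn
        # coordinated representations. Detaching the source is a design
        # choice here, so knowledge flows from the earlier layer upward.
        share_loss = nn.functional.mse_loss(tgt_act, src_act.detach())
        return self.head(x), share_loss
```

In training, the sharing penalty would be added to the ordinary task loss with a weighting coefficient, e.g. loss = task_loss + lam * share_loss.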

Step-by-Step Incremental Growth and Imitation Learning

The training process may be further enhanced by incrementally growing the deep neural network, creating a new, deeper network at each step. If at each step the nodes of the new network are a superset of copies of the nodes of the previous network, the new network may be initialized by Transfer Learning. Knowledge transfer from the previous network may then be continued during training by selective Imitation Learning. Imitation Learning may be applied even to data on which the previous network was not trained; in fact, the data does not even need to be labeled. Furthermore, node-to-node Knowledge Sharing may be used to link any node in the previous network to the corresponding node in the new network, helping the new network retain knowledge acquired by the previous network even when the new network is trained on different data.
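To make the growth-plus-imitation recipe concrete, here is a minimal PyTorch sketch under simplifying assumptions: the network is a plain MLP, and "imitation learning" is realized as a distillation-style KL loss against the frozen previous network, which is one common instantiation of the idea. The helper names grow_network and imitation_step are hypothetical; the text describes the ideas more generally.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def grow_network(prev_body, prev_head, extra_layers, width=64):
    """Create a deeper network whose nodes include copies of all nodes
    of the previous network (transfer-learning initialization), plus
    newly added layers."""
    layers = list(copy.deepcopy(prev_body))           # copied nodes
    for _ in range(extra_layers):                     # new nodes
        layers.append(nn.Sequential(nn.Linear(width, width), nn.ReLU()))
    return nn.Sequential(*layers, copy.deepcopy(prev_head))

def imitation_step(new_net, prev_net, x, optimizer):
    """One step of imitation learning: the frozen previous network acts
    as a teacher. Note that x may be unlabeled, and may even be data
    the previous network was never trained on."""
    with torch.no_grad():
        teacher_logits = prev_net(x)
    student_logits = new_net(x)
    loss = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice the imitation loss would typically be mixed with the ordinary task loss on whatever labeled data is available, and node-to-node Knowledge Sharing links between a node and its copy could be added as activation-matching penalties like the one sketched earlier.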

Growing the Network Layer-by-Layer

More detail on the process of incrementally growing a deep neural network is given for the particular case of growing the network layer by layer. A similar process may be applied when many layers are added at once. Additional ideas are given in Easy Ways to Build Deep Networks.
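One standard way to realize layer-by-layer growth (used, for example, in Net2Net-style approaches) is to insert each new layer initialized to the identity map, so the deeper network starts out computing the same function as its predecessor. The sketch below assumes this scheme; the procedure described in the referenced sections may differ, and insert_layer is a hypothetical helper.

```python
import copy
import torch.nn as nn

def insert_layer(prev_net, position, width=64):
    """Grow the network by one layer: copy the previous network and
    insert a new linear layer initialized to the identity, so the
    deeper network initially computes the same function as before."""
    layers = list(copy.deepcopy(prev_net))
    new_layer = nn.Linear(width, width)
    nn.init.eye_(new_layer.weight)    # identity weight matrix
    nn.init.zeros_(new_layer.bias)    # zero bias
    layers.insert(position, new_layer)
    return nn.Sequential(*layers)
```

If the inserted layer is followed by a ReLU, the identity behavior is still preserved whenever its inputs are non-negative, as they are immediately after a preceding ReLU layer.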


by James K Baker and Bradley J Baker

© D5AI LLC, 2020

The text in this work is licensed under a Creative Commons Attribution 4.0 International License.
Some of the ideas presented here are covered by issued or pending patents. No license to such patents is created or implied by this publication or by any reference herein.