dc.description.abstract |
Transfer learning is widely used for many applications, but it is difficult to adapt intermediate layers of
a network to a new learning task when less data is available. Learning the importance of key connections
can help models achieve better performance on new tasks. This paper presents a novel deep learning
architecture, termed DRCNet, along with an unconventional technique of “connection-finetuning”. Dense
residual connection-finetuning is achieved through a strength parameter learned via backpropagation. In
this research, we show that some connections are redundant and that removing them, based on their
learned strength, can improve overall performance. Experiments on multiple databases demonstrate the
effectiveness of the proposed DRCNet architecture with this novel technique. Owing to its easy integration,
we also show improved performance for other existing ResNet variants. Results on both intra-database
(same) and inter-database (cross) experiments showcase the effectiveness of the proposed algorithm. The
cross-dataset experiments show an improvement of 10-20% in classification accuracy compared to the
traditional ResNet architecture. Further, experiments demonstrate that the proposed method outperforms
existing approaches when limited training data is available, thus reinforcing the claim. |
en_US |
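
As a rough illustration of the connection-finetuning idea described in the abstract (not the authors' released DRCNet implementation), the sketch below attaches a learnable scalar strength to a residual connection; the scalar is updated by backpropagation like any other weight, and connections whose learned strength stays near zero can be treated as redundant and removed. All names here (StrengthGatedBlock, strength, prune_threshold) are hypothetical.

# Hypothetical sketch of "connection-finetuning": the skip connection is
# scaled by a learnable strength tuned via backpropagation; connections
# with negligible learned strength can later be pruned.
# Illustration of the abstract's idea only, not the DRCNet code.
import torch
import torch.nn as nn


class StrengthGatedBlock(nn.Module):
    def __init__(self, channels: int, prune_threshold: float = 0.05):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        # Learnable strength of the skip connection (initialised to 1).
        self.strength = nn.Parameter(torch.ones(1))
        self.prune_threshold = prune_threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # At inference, a connection with negligible learned strength is
        # treated as removed; otherwise the skip is scaled by its strength.
        if not self.training and self.strength.abs().item() < self.prune_threshold:
            return torch.relu(self.body(x))
        return torch.relu(self.body(x) + self.strength * x)

In this sketch the strength parameters receive gradients during training; afterwards, blocks whose strength magnitude falls below the (assumed) threshold are candidates for removal, mirroring the pruning of weak connections described above.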