New data processing module makes deep neural networks smarter

Artificial intelligence researchers have improved the performance of deep neural networks by combining feature normalization and feature attention modules into a single module they call attentive normalization. The hybrid module significantly improves the accuracy of the networks while adding negligible computational overhead.
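The article does not spell out the mechanism. In work under this name, attentive normalization replaces the single learned affine transform of standard normalization with a weighted mixture of several learned affine transforms, where the mixture weights are predicted from the input itself. A minimal NumPy sketch under that assumption (function name, shapes, and the sigmoid attention are illustrative, not taken from the article):

```python
import numpy as np

def attentive_norm(x, gammas, betas, w_attn, b_attn, eps=1e-5):
    """Illustrative sketch of attentive normalization (hypothetical API).

    x:              (N, C, H, W) feature maps
    gammas, betas:  (K, C) -- K learned affine components
    w_attn:         (C, K) attention projection; b_attn: (K,) bias
    """
    # 1) Standard feature normalization (batch-norm style, per channel).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # 2) Input-dependent attention weights over the K affine components,
    #    computed from globally pooled features (sigmoid gating assumed).
    pooled = x.mean(axis=(2, 3))            # (N, C)
    logits = pooled @ w_attn + b_attn       # (N, K)
    attn = 1.0 / (1.0 + np.exp(-logits))    # (N, K)

    # 3) Mix the K affine transforms with the attention weights, so each
    #    sample gets its own effective scale (gamma) and shift (beta).
    gamma = attn @ gammas                   # (N, C)
    beta = attn @ betas                     # (N, C)
    return gamma[:, :, None, None] * x_hat + beta[:, :, None, None]
```

The extra cost is only the pooled projection and the K-component mixture, which is tiny next to the convolutions it sits between, consistent with the "negligible extra computational power" claim.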