Deep Residual Learning With Dilated Causal Convolution Extreme Learning Machine

A feedforward neural network with random weights (RW-FFNN) uses a randomized feature-map layer. This randomization enables the optimization problem to be replaced by a standard linear least-squares problem, which offers a major advantage in terms of training speed. An extreme learning machine (ELM) is a well-known RW-FFNN that can be implemented as a single-hidden-layer feedforward neural network.
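To make the least-squares picture concrete, here is a minimal ELM training sketch in NumPy. The sigmoid activation, hidden-layer size, and ridge regularization term are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, T, n_hidden=256, reg=1e-3):
    """Fit a single-hidden-layer ELM by regularized linear least squares.

    X: (n_samples, n_inputs) inputs, T: (n_samples, n_targets) targets.
    The hidden weights W and biases b are drawn at random and never
    trained; only the output weights beta are solved for.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # random feature map (sigmoid)
    # beta = argmin ||H beta - T||^2 + reg ||beta||^2, in closed form
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Because the random feature map is fixed, training reduces to a single closed-form linear solve for the output weights, with no iterative gradient descent at all; this is where the speed advantage comes from.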

However, for a large dataset, owing to its shallow architecture, such an ELM typically requires a very large number of nodes in the single hidden layer to achieve a sufficient level of accuracy. In this paper, we propose a deep residual learning method with a dilated causal convolution ELM (DRLDCC-ELM). The baseline layer performs feature mapping to predict the target features from the input features, and the subsequent residual-compensation layers then iteratively remodel the prediction errors left uncaptured by the previous layer.
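As a rough sketch of this layered scheme, reusing the elm_fit and elm_predict helpers from the sketch above (the layer count is an arbitrary illustrative choice, and the paper's exact formulation may differ): the baseline layer is fit to the targets, and each residual-compensation layer is fit to whatever error the stack so far has left behind.

```python
def residual_elm_fit(X, T, n_layers=3, n_hidden=256, reg=1e-3):
    """Baseline layer plus residual-compensation layers.

    Layer 0 predicts T from X; layer k > 0 predicts the residual
    T minus the summed predictions of layers 0..k-1.
    """
    layers, residual = [], T.copy()
    for _ in range(n_layers):
        W, b, beta = elm_fit(X, residual, n_hidden, reg)
        layers.append((W, b, beta))
        residual = residual - elm_predict(X, W, b, beta)  # what is still uncaptured
    return layers

def residual_elm_predict(X, layers):
    # The final prediction is the sum of all layer outputs.
    return sum(elm_predict(X, W, b, beta) for (W, b, beta) in layers)
```

Since each layer is trained only on the residual of its predecessors, every added layer can only refine what the earlier layers missed, which is the sense in which this builds depth out of shallow ELMs.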

The proposed network architecture also adopts dilated causal convolution based on the ELM in each layer, which effectively expands the receptive field of the multilayer network.
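To illustrate the dilated causal convolution component on its own (a toy NumPy sketch, separate from the ELM machinery; the kernel size and dilation schedule are assumptions for illustration): the kernel taps are spaced `dilation` steps apart, and the input is padded only on the left, so output step t never sees future inputs.

```python
import numpy as np

def dilated_causal_conv1d(x, kernel, dilation):
    """Dilated causal 1-D convolution.

    x: (T, C_in) sequence; kernel: (K, C_in, C_out), where tap j
    applies to the input at lag j * dilation.
    """
    K, C_in, C_out = kernel.shape
    pad = dilation * (K - 1)
    xp = np.concatenate([np.zeros((pad, C_in)), x])  # left-pad only => causal
    y = np.zeros((len(x), C_out))
    for t in range(len(x)):
        for j in range(K):
            y[t] += xp[t + pad - dilation * j] @ kernel[j]
    return y
```

With kernel size K and L stacked layers at dilations 1, 2, ..., 2^(L-1), the receptive field grows to 1 + (K - 1)(2^L - 1) time steps, exponentially with depth; this is the sense in which dilation expands the receptive field of the multilayer network.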

The results of experiments involving acoustic scene classification of daily activities in a home environment confirmed that the proposed DRLDCC-ELM outperforms the previously proposed residual-compensation ELM and deep-residual-compensation ELM methods. We also confirmed that the generalization capability of the proposed DRLDCC-ELM tends to be superior to that of convolutional neural network-based models, especially for large numbers of parameters.
